Category Archives: Software

Affinity Photo Review

There’s a new image processor on my computers. Recently the chief developer of one of my favorite image editors, Picture Window Pro (PWP), sent out a sad email letting all PWP users know that he is stopping PWP development. He thanked us for over twenty years of support and, as a last gift, converted the final version of PWP to freeware. You can now download and run Picture Window Pro without a key. PWP is a superb program! It’s still the fastest and meanest image editor I have ever used, and I am constantly trying image editors. If you’re interested in getting a free copy of PWP, download the program before the distribution website shuts down.

I was saddened by this news but all good things eventually end. With PWP going away I decided, for the nth time, to look for alternatives. I reconsidered Photoshop. I’ve used full-blown Photoshop but frankly, I’ve never been impressed. It’s expensive and slow! I use an old copy of Photoshop Elements, mainly to remove blemishes on film scans, but in my opinion, the only Adobe image processing product worth paying for is Lightroom. Adobe is the evil image processing empire. They squeeze you with sluggish performance and abusive subscription payment models and then act like they’re doing you a favor. It didn’t take me very long to abandon Photoshop (again) and keep looking for PWP alternatives. Lucky for me there’s this thing called DuckDuckGo that quickly led me to Affinity Photo.

Affinity Photo is a relatively new image editor that got started in the Mac world and, as of November 2016, is also available for 64 bit Windows systems. Affinity has snagged dozens of positive reviews and, unlike Photoshop, is reasonably priced. I decided to give myself a little Christmas present and bought Affinity Photo.

The Affinity Windows download file is large: over 200 megabytes. Affinity Windows depends on .NET 4.6.2, which is also installed if it’s not already on your machine. It took a few minutes to suck down and install all the required bytes, but things went smoothly and I eagerly started the program.

Before relating my impressions of Affinity Photo I will describe my binary image format philosophy. Image editors typically create and manipulate vendor-specific proprietary binary image files. Binary image files like PSDs, NEFs, DNGs, and now Affinity’s AFPHOTOs, have a nasty tendency to evolve on vendor whim. This poses fundamental long-term image storage problems. Even if you conscientiously back up and archive your original image files you may discover, a decade hence, that you can no longer load them with current software. I hate this! Photography is for the ages, not the marketing cycles of software and camera companies! If you have ever wondered why the lowly JPG image format still reigns supreme despite its abundant technical deficiencies, stop wondering. The JPG format is an open and well-described eight-bit channel format. Any competent programmer can write software to read and write JPGs. The same holds for TIFs, PNGs and a few other open formats. This is not true for vendor-dominated formats. The specifications, even if disclosed, can change on a moment’s notice.
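This openness is concrete, not abstract. The TIFF 6.0 specification is public, so even the file’s eight-byte header can be parsed in a few lines of Python. Here’s a quick sketch of my own (not production code) to make the point:

```python
# Parse the 8-byte TIFF header described in the public TIFF 6.0 spec:
# a 2-byte byte-order mark, the magic number 42, and the offset of the
# first image file directory (IFD).
import struct

def read_tiff_header(raw):
    order = {b"II": "<", b"MM": ">"}[raw[:2]]      # little- or big-endian
    magic, ifd_offset = struct.unpack(order + "HI", raw[2:8])
    if magic != 42:
        raise ValueError("not a TIFF file")
    return ("little" if order == "<" else "big", ifd_offset)

# A minimal little-endian header: "II", magic 42, first IFD at byte 8.
print(read_tiff_header(b"II" + struct.pack("<HI", 42, 8)))  # ('little', 8)
```

A vendor-dominated format gives you no such foothold; you read what the vendor’s library deigns to expose, for as long as the vendor ships that library.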

How can photographers deal with transient binary image formats? There are two basic approaches. You can convert all your images to an open image format. Some photographers convert camera RAW files to high bit TIFs. Converting large numbers of image files is a tedious and resource hungry process but it’s probably the best bet for long-term storage. I use the second lazier approach: maintain at least two independent image programs that can read and write the binary image formats you work with. I use Nikon cameras; they crank out proprietary NEF binaries. Currently, I have four programs on this machine that can read NEFs: PWP, Lightroom, Affinity and ThumbsPlus. I will tolerate proprietary binaries if and only if I have options. Don’t let software and camera companies box you in.

I started using image editors about fifteen years ago. My first editor came with my first digital camera: a one-megapixel HP. I cannot remember the name of this program; I only used it long enough to discover its appalling deficiencies. Within a week I had purchased my first version of Photoshop Elements (PE). I was happy with PE until I encountered posterization (read the link for the gory technical details). Posterization wrecks prints and it’s easy to posterize eight-bit channel images. The answer then, as it is now, is to increase your working bit depth. Adobe recommended upgrading from PE to full Photoshop. Why fork over $70 when you can fork over $500? Photoshop Elements has a long history of half-assed support for sixteen-bit channel images and the reason is painfully obvious. If Photoshop Elements fully embraced sixteen-bit channel images there would be very little need for full Photoshop. You could save yourself hundreds of dollars. Adobe decided not to compete with themselves and adopted the time-tested pseudo-monopolistic practice of sodomizing the customer. I did not embrace the butthurt! I started looking for low-cost programs that properly handled sixteen-bit images. It didn’t take me long to find PWP.
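A rough Python sketch of what posterization is (my own illustration, not Adobe internals): apply the same heavy-handed tone adjustment, compressing the tonal range and stretching it back, to an eight-bit and a sixteen-bit channel, then count the surviving tones.

```python
# Simulate an aggressive tone adjustment: compress the range to 25%
# and re-expand it. Rounding in the compressed range destroys levels.

def stretch(values, levels):
    """Compress the tonal range to 25% and re-expand it."""
    compressed = [round(v * 0.25) for v in values]
    return [min(levels - 1, round(c / 0.25)) for c in compressed]

eight_bit = stretch(range(256), 256)
print(len(set(eight_bit)))          # 65: only 65 distinct tones survive of 256

# The same abuse of a 16-bit channel, downsampled to 8 bits for printing,
# still fills every one of the 256 output levels.
sixteen_bit = stretch(range(65536), 65536)
print(len({v >> 8 for v in sixteen_bit}))   # 256
```

The gaps between the surviving eight-bit levels are what print as visible banding; working in sixteen bits leaves enough headroom that the final eight-bit print never sees them.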

This early experience shaped my entire image processing approach. Instead of adopting a single monolithic “industry standard” program and joining the nerd herd I decided to go my way and use many small programs. Instead of a Goliath, I went with many Davids.1 When you take the David approach interoperability moves to the top of the stack. The output of one program must effortlessly become the input of another. How programs play together matters. Additionally, when you apply the David approach, you never look to completely replace your tool set. General purpose tools like Affinity may be able to do all the things more specialized tools can do but probably not as efficiently or as well.

So, before adding Affinity to my trusted tools I asked:

  1. Does Affinity play nice with others?
  2. Is Affinity’s user interface (UI) tolerable?
  3. Does Affinity streamline common tedious tasks?
  4. What new capabilities does Affinity offer?

With these points in mind let’s look at Affinity Photo.

Does Affinity play nice with others?

One of the first things I look for in image processors is tolerable loading times. Part of the reason I’ve never been able to stick with full Photoshop is that it takes forever to get the damn thing up. Affinity on Windows easily beats full Photoshop but it’s still slower than good old C-coded PWP. PWP comes up in a flash. It’s one of the many reasons I stuck with it for over a decade. Affinity’s loading speed is comparable to GIMP, Photoshop Elements and Lightroom: fast enough to not drive me crazy.

After loading Affinity, I immediately started testing the program’s ability to read and write sixteen-bit TIF files. The basic single layer sixteen-bit TIF file format is one of the best supported lossless image formats. It’s often the only way to move information from one program to another without trashing bits. JPGs are universal, but every time you write a JPG you lose data: that’s the lossy part of JPG compression. Lossy image formats are fine for the web and final presentation but are a total disaster for image editing. Affinity can read and write sixteen-bit TIFs. It can also read and write a number of other important formats like Photoshop Elements PSDs. Affinity converts PSD layers to AFPHOTO layers. It also handles JPGs, PNGs and many RAW formats like Nikon’s NEFs. Affinity plays well with others.

Is Affinity’s user interface (UI) tolerable?

Once I had satisfied myself that I could slosh bits through Affinity I started evaluating the program’s user interface. UIs have ruined many editors. I’m immediately suspicious when reviewers start lauding a program’s UI before spending a few hundred hours using it. UIs either help or hinder. Affinity’s UI is decent. If you have ever used a layer oriented image editor you will quickly adjust to how Affinity works. I strongly recommend watching the Affinity tutorial videos; they are among the best video tutorials I’ve seen and quickly show what the program can do.

Once Affinity is loaded it’s pretty zippy. Common image handling operations are fast enough to fly under my annoyance radar. Image processing can be very demanding. Don’t expect to stitch 500-megabyte panoramas from original RAW files instantly. With current hardware and software, some things will take time. It’s fair to say that Affinity’s performance compares favorably to other image processors. I can put up with Affinity’s user interface.

Does Affinity streamline common tedious tasks?

After playing with Affinity for a few days I used the program to help restore some old scanned slides. Old pictures are always damaged; they all need a bit, or a lot, of retouching. The problems most people associate with old pictures (tears, color changes, and loss of tone) are usually easily fixed in most editors. The biggest job is removing thousands of scratches, spots, and stains. Most restorers give up and crop or blur away such defects but I’m with Lady Macbeth: “out, out damn spots.” Any tool that helps me hunt down and exterminate spotty pests will be lovingly embraced.

The Affinity inpainting brush works a lot better than the corresponding Photoshop Elements healing brush. In particular, it crosses linear backgrounds, buildings, fabric patterns, and so forth without unduly destroying detail. Removing long linear scratches that cross regular structured detail is a soul-draining chore. Whenever I see such defects I typically give up and find another picture to restore; I have a big backlog of scans awaiting restoration! This slide of the southern end of Beirut, Lebanon, taken by my father in 1968, is typical of many images in my backlog. There were dozens of long linear scratches running through the buildings. It would have taken hours to fix them with PE. All it took was a few passes with Affinity’s inpainting brush to remove them. I was impressed. This single tool significantly speeds up restoring scratched and spotted images and justifies Affinity’s purchase price all by itself.

I am still exploring the Affinity Photo image editor. I used it to restore this scan of a Kodachrome slide my father shot from a hotel window of the south coast of Beirut, Lebanon in 1968. The Continental Hotel is visible in the lower left corner of this image. My mother often stayed in the Continental when she visited me in Beirut. I fondly remember having continental breakfast in the Continental. The original slide was overexposed and covered with splotches and fingerprints in the sky. The retouching tools in Affinity Photo are better than the corresponding tools in Photoshop Elements. I particularly like the Affinity inpainting brush; it works well on textured and linear subjects. I was able to remove long scratches cutting through the buildings in this image without unduly wrecking building detail. I also used the inpainting tool to remove a cutoff street light and a car it was shading at the bottom of the image. It’s easier to remove objects with Affinity Photo than it is in Photoshop Elements.

Another Affinity tool that streamlines common image processing tasks is the Affinity panorama tool. Most modern image editors have fairly decent panorama tools and building a panorama is easier than it used to be. In the image editing Dark Age, you had to manually select control points and master blend masks to build decent panoramas. It could take hours to align a single image. Current editors use effective automatic control point detection and advanced blending algorithms. In most cases, it’s a simple matter of using good capture technique followed by loading the individual images into a panorama tool to generate decent to excellent results.

We are living in a panoramic golden age but there are still problems. I shoot entirely in RAW because RAW preserves the most information and affords the greatest post processing options. Panoramas often encompass scenes of high contrast. Image tones will vary from extremely bright to very dark. It’s not uncommon for ordinary panoramas to span twelve or more stops of dynamic range. When processing high dynamic range pictures it’s extremely advantageous to do all your work on sixteen-bit or thirty-two-bit channel images. Blending eight-bit panoramas can release the posterization Kraken; trust me, you don’t want that monster savaging your scenes.
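Some quick arithmetic shows why twelve stops and eight bits don’t mix. This is a back-of-the-envelope sketch of my own, assuming a linear encoding (which is what blending math works in): each stop down halves the signal, so the deep stops of an eight-bit image are left with almost no distinct levels.

```python
# How many code values land in each successive stop of a linear encoding?
# Each stop down halves the signal, so the brightest stop hogs half the
# available levels and the deepest stops get essentially none.

def levels_per_stop(bit_depth, stops):
    top = 2 ** bit_depth - 1
    out = []
    for s in range(stops):
        hi = top / (2 ** s)        # brightest value in this stop
        lo = top / (2 ** (s + 1))  # darkest value in this stop
        out.append(int(hi) - int(lo))
    return out

print(levels_per_stop(8, 12))   # [128, 64, 32, 16, 8, 4, 2, 1, 0, 0, 0, 0]
print(levels_per_stop(16, 12))  # every stop still gets whole levels
```

The eight-bit shadows of a twelve-stop scene literally have zero levels to blend with; smear them around during stitching and the posterization Kraken surfaces.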

Unfortunately, the Photoshop Elements panorama tool is inherently eight-bit. This means I must do all my major tone adjustments in Lightroom before panorama stitching. Adjusting the tones of a single image is tedious; doing it for many panorama frames is cruel and unusual punishment. Adobe’s answer is always the same: give us a lot of money and we’ll release you from eight-bit Hell! Lucky for us, Affinity gets us out of eight-bit panorama Hell for a lot less.

The following panorama of the mountains near the eastern entrance of Glacier National Park was directly generated from Nikon NEF RAW files. All feature detection and blending calculations were high-bit. Tone adjustments were aided by regular tone mapping. Tone mapping is like an automatic Zone System. Compared to what I used to put up with ten years ago Affinity panorama building is almost as easy as scanning scenes with an iPhone. Affinity significantly streamlines routine retouching and panorama building.

Looking west from just outside the eastern side of Glacier National Park near Saint Mary. The weather was grim and dark, just the way I like it, when I braved the rain to snap the frames that went into this panorama. I built this panorama directly from Nikon NEF files in the Windows version of Affinity Photo. My favorite image processor, Picture Window Pro, is being retired and I am exploring alternatives. I rather like this result.

What new capabilities does Affinity offer?

So far, the features I’ve discussed are common to most image editors. Does Affinity offer anything new or special? There is one Affinity feature, the FFT (Fast Fourier Transform) Denoise filter, that greatly mitigates one of my long-standing retouching nightmares: regular patterns.

Many old portraits were printed on patterned paper. The following is a crop of an old (1935) baby picture of my mother.


My mother as a seven-week-old baby. This 1935 picture was printed on patterned paper. Patterned paper often adds luster and depth to photographs but it also makes it more difficult to retouch them. The Affinity Photo FFT (Fast Fourier Transform) Denoise filter can remove regular patterns without unduly softening images.

As you can see the entire picture is covered with tiny regular hexagons. Patterned prints were popular in the early and mid-20th century. The pattern adds depth and luster and has the nice side effect of making prints difficult to copy. Patterns also make retouching difficult. Retouching spots and scratches on patterned backgrounds tends to make them more, not less conspicuous. If only there was some way to remove the damn pattern before retouching. Affinity’s FFT Denoise filter does just that.
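To see why a frequency-domain filter can do this, here is a minimal one-dimensional sketch (my own illustration of the principle, not Affinity’s implementation): a strictly regular pattern piles all its energy into a single frequency bin, and zeroing that bin removes the pattern while leaving the underlying tones nearly untouched.

```python
# A smooth tonal gradient plus a regular "paper texture". The texture
# repeats every 2 samples, so it lives entirely in frequency bin n/2.
# Notching that one bin removes the pattern, not the image.
import cmath, math

def dft(signal):
    """Discrete Fourier transform, O(n^2) for clarity."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * cmath.pi * f * t / n)
                for t in range(n)) for f in range(n)]

def idft(spectrum):
    """Inverse transform; returns the real part of each sample."""
    n = len(spectrum)
    return [sum(spectrum[f] * cmath.exp(2j * cmath.pi * f * t / n)
                for f in range(n)).real / n for t in range(n)]

n = 64
smooth = [0.5 + 0.5 * math.cos(2 * math.pi * t / n) for t in range(n)]
pattern = [0.2 * (-1) ** t for t in range(n)]   # one cycle every 2 samples
scan = [s + p for s, p in zip(smooth, pattern)]

spectrum = dft(scan)
spectrum[n // 2] = 0            # notch out the single bin holding the pattern
restored = idft(spectrum)
print(max(abs(r - s) for r, s in zip(restored, smooth)))  # tiny float noise
```

Real paper textures smear over a few bins rather than one, but the idea holds: kill the pattern’s spikes in the frequency domain and the rest of the image barely notices.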

I applied the FFT Denoise filter to my mother’s baby picture and then ran through my regular retouching regime: see the following before and after diptych.

For some restorations, the ones that please or annoy me, I create a before and after diptych. I want to convince myself that my restoration work was worthwhile. Most of the time the restored image is better but in more cases than I would like the original scan is superior. And, every now and then, I cannot decide which one I like the best. This rendering of an old faded patterned print of my mother as a baby is one of those images. The original print is on patterned paper. The pattern imparts a quality that the restoration lacks. Ansel Adams once wrote that the negative is the score and the print is the performance. For restorers, the scan is the score and the restoration is the performance. Sometimes the music is glorious and clear and sometimes it’s rap – rhymes with crap!

Affinity Conclusion

Affinity, like PWP, is a great value. The first Windows version is already superior to every version of Photoshop Elements I’ve ever used. It’s not as comprehensive as full Photoshop, but if you subtract the marginal features of Photoshop and keep the essential core elements, you’ve basically described Affinity’s feature set. Affinity is a layer editor but it’s not a Photoshop clone. The UI is completely modern, RAW development is built in, sixteen-bit layers are the default, and useful stack operations like automatic alignment are a click away. Affinity also supports thirty-two-bit HDR file formats and high dynamic range composites. There’s a lot of bang for the buck: strongly recommended!

  1. If you remember what happened to Goliath siding with David doesn’t seem like much of a risk.

Falling Colors Technology, a BHSD Crony that Needs Competition

BHSD (Behavioral Health Services Division) is a New Mexico state agency that doles out federal and state funds to a variety of small, ostensibly health-related, programs. For example, in the state of New Mexico, BHSD runs a program called Synar1 that attempts to cut down on merchants selling cigarettes to minors. One Falling Colors employee characterized the program as “mostly stick and no carrots.” Synar funds a stable of ambush inspectors that descend on merchants hoping to catch them selling to minors. It’s a standard bit of well-intentioned government coercion. If you are wondering what’s in it for the merchants, stop wondering: it’s all stick. They lose sales and face fines. If you are wondering why the state of New Mexico supports Synar, follow the money. Depending on the dubious statistics compiled by Synar administrators, the state could lose millions of dollars of federal grants if the percentage of offending merchants exceeds an arbitrary threshold. Synar, in the twisted minds of state bureaucrats, “generates revenue.”

In addition to saving the state from the unconscionable scourge of teenage smoking BHSD also funds a mishmash of programs to prevent drug addiction, help the mentally ill, subsidize methadone treatments and reimburse psychologists, psychiatrists and other health professionals for counseling and other services. BHSD’s budget for all these operations is, according to Mindy Hale, roughly fifty million dollars per year. In the greater wasteful schemes of government this is small beans, even for New Mexico, but it’s still fifty million public dollars, so it’s not out of bounds to ask: what are the taxpayers getting for their money?

If you are naïve enough to think the intended clients of BHSD’s largesse, the teenage smokers, the mentally ill, and the drug addicts, garner the lion’s share of that fifty million dollars you’re probably a statist or a moron, but I repeat myself. Many years ago a wise old wag, when badgered about the high cost of landing a man on the moon, chirped, “None of that money was spent on the moon!” While some of BHSD’s fifty million is directed to clients, the moon, the lion’s share goes to contractors, service providers, and BHSD internals. Whether the state of New Mexico and the federal government are getting good value for their money is debatable; what’s not debatable is that some IT service providers are doing very well for themselves.

Two IT providers consume a significant share of BHSD IT funds: Optum New Mexico and Falling Colors Technology. The founders of Falling Colors, Mindy Hale and Pamela Koster2, claim Optum bills the state of New Mexico roughly four million dollars per year for the onerous job of cutting checks. It’s important to understand that Optum is not dispensing their own money. They are simply managing a pool of funds that is replenished by state and federal tax dollars. Yes, it takes money to manage money. You have to pay auditors, comptrollers, and other financial professionals to make sure the funds are not redirected into questionable pockets. Surely you don’t think New Mexico’s corruption-free government would abscond with unwatched dollars?


The Falling Colors Technology Logo. This logo was designed by a competent graphic designer. I’ve observed an inverse relationship between the quality of company logos and the products and services they offer.

Still, four million seems a bit steep for providing a routine service that any experienced financial entity like a bank could do, and to the state’s credit, they have recognized this and are in the process of renegotiating Optum’s four million dollar fee. Optum has responded with a “this isn’t worth the damn hassle” attitude. If they cannot get their four million they’re threatening to pull out of the state and cede the check cutting business to others. How much of this is hardball negotiating, corporate whining, or even the truth, is hard to determine. The only thing that seems certain is that there is a business opportunity for an IT provider if Optum makes good on their threat and leaves New Mexico.

Falling Colors Technology, a little company that is already extracting about one million dollars per year from the state, is angling to take over Optum’s fund disbursement role. In standard insider crony fashion, they hope to keep this transfer quiet and elude potential competitors. Why go through all that messy inefficient public bidding? There’s only one problem with their business plan. Falling Colors has absolutely no experience managing funds. There is nobody on their staff that could be considered a financial professional. They are planning to hire staff, but I have to wonder why BHSD, and the state of New Mexico, are considering flushing millions of dollars through an entity that has no financial expertise and has already received a formal letter of warning for shoddy IT work.

Instead of branching out into lines of business that they have no experience with, Falling Colors’ efforts would be better invested in fixing their core problems, and they have lots of core problems. Let’s look at what nearly one million dollars of public funds per year buys from Falling Colors Technology.

Your one million is buying a few unreliable, crash prone, insecure, low volume websites geared towards BHSD staff and service providers. When I first ran the following SQL query on the database that backs many Falling Colors websites, I was alarmed at the results.

SELECT   iq.WeekNumber ,
         AVG(iq.ErrorCount) AS AvgWeekErrors ,
         MIN(iq.ErrorCount) AS MinWeekErrors ,
         MAX(iq.ErrorCount) AS MaxWeekErrors ,
         STDEV(iq.ErrorCount) AS StdDevWeekErrors
FROM     ( -- daily error totals from the ELMAH log, tagged with ISO week
           SELECT    COUNT(1) AS ErrorCount ,
                     MIN(DATEPART(iso_week, TimeUtc)) AS WeekNumber
           FROM      dbo.ELMAH_Error
           GROUP BY  CONVERT(date, TimeUtc)
         ) iq
GROUP BY iq.WeekNumber

Falling Colors websites were crashing about twenty times per day. On some days the crash count exceeded fifty. I thought to myself, “If this doesn’t dramatically improve this little company is doomed.” I’ve worked with lots of bug infested software over my long career but twenty to fifty crashes per day, distributed over a few dozen users, was an entirely new level of unreliability.

Why is it so bad? The developers at Falling Colors, like developers everywhere, bitched about “inherited code.” Basically, this means they’re working with code that they didn’t entirely write themselves. Developers complaining about inherited code is so common that software managers rightly label it whining. Software developers bitching about inherited code is like civil engineers griping about inherited bridges. The world is not created fresh every day. The inherited code base is a source of problems but the main reason Falling Colors exhibits such a high crash rate is simply a lack of formal quality control.

Testing at Falling Colors is mostly performed by one beleaguered Business Analyst. She runs through a series of basic web page checks after significant new releases. This is a very low standard of testing for modern software development. Falling Colors does not practice many common quality control techniques. For example, most development environments support a variety of internal testing tools. Falling Colors is a Visual Studio shop and Visual Studio has built-in unit testing tools and supports a host of third-party add-ons. Developers focused on quality spend as much time implementing internal unit tests as they do writing production code. There is an entire coding regime known as TDD (test-driven development) that strongly promotes writing tests before you write software to pass the tests. At the end of June 2016, there were precisely zero internal unit tests in Falling Colors’ code base. In addition to missing internal unit tests, there were no repeatable or scripted tests, no large case tests, and no stress tests. Lack of formal testing combined with misplaced developer optimism is a recipe for high error rates and Falling Colors is really boiling that pot.
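For readers who haven’t seen one, an internal unit test is nothing exotic. Here is a minimal sketch in Python; the helper `week_number` is hypothetical, not Falling Colors code, and their Visual Studio shop would use MSTest or NUnit rather than Python.

```python
# A tiny repeatable unit test: it pins one behavior down so a regression
# gets caught by a test runner before release, not by a user after it.
from datetime import date

def week_number(iso_date):
    """Hypothetical helper: ISO week of a YYYY-MM-DD date string."""
    return date.fromisoformat(iso_date).isocalendar()[1]

def test_week_number_year_boundary():
    # 2017-01-01 was a Sunday, so ISO 8601 assigns it to week 52 of 2016.
    assert week_number("2016-12-31") == 52
    assert week_number("2017-01-01") == 52

test_week_number_year_boundary()   # a test runner would discover and run this
```

Multiply this by a few hundred and you have the safety net that keeps a crash count from hitting fifty a day.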

Buggy, insecure, low volume websites are a dime a dozen. There’s a lot of crap out there. If Falling Colors cranked out standard public websites we would click away and ignore their rubbish. Unfortunately, being intertwined with BHSD, the users of Falling Colors websites do not have the option of clicking away. Making things worse, Falling Colors hosts a substantial amount of HIPAA-protected information.

HIPAA is a set of federal guidelines that outline how health providers and their contractors must protect information that might be used to uniquely identify people. HIPAA penalties, for both providers and individuals, are severe if protected information is either accidentally or willfully disclosed. You can go to jail for exposing HIPAA protected information.

HIPAA guidelines list common data elements that must be protected. There is only one way to properly protect these elements: full element encryption. Every single data element should be encrypted and the keys should be rigorously guarded by a small number of individuals. Even developers, especially developers, should never see the unencrypted information. This is the way things should work, but, if you have followed the news about an unending stream of website hacks and data breaches, you’re probably aware that this is not how it works in the big nasty world.

It’s certainly not the way things are working at Falling Colors. With the exception of website passwords, which were only hashed in the last year,3 HIPAA data is stored in plain, ready-to-hack text. If I were an IT-savvy methadone user in the state of New Mexico I would be reluctant to disclose personal information to CareLink, TreatFirst, Prevention, or any of the Falling Colors managed programs. One HIPAA breach and your methadone habit is on Facebook.
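For contrast, here is roughly what stored passwords should look like: a salted, deliberately slow hash instead of plain text. This is a stdlib-only Python sketch of the standard technique, not Falling Colors’ actual fix, and a real deployment would tune the iteration count to its hardware.

```python
# Salted PBKDF2 password hashing: the database stores (salt, digest),
# never the password, and verification is a constant-time comparison.
import hashlib, hmac, os

ITERATIONS = 100_000   # deliberately slow to blunt brute-force attacks

def hash_password(password, salt=None):
    salt = salt or os.urandom(16)   # unique salt defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)   # constant-time compare

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("guess", salt, digest))                         # False
```

Hashing passwords is the bare minimum; the HIPAA data elements themselves need reversible encryption with tightly held keys, which is a bigger job than a dozen lines.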

Falling Colors is fully cognizant of their shabby security and is planning to eventually fix it. They’re taking steps to harden their websites and tighten up their loose databases but they are not, as of the end of June 2016, pursuing a full element encryption regime. Anything short of full element encryption is just putting lipstick on the security pig. Currently, Falling Colors is a HIPAA breach in waiting. BHSD would be well advised to insist on an immediate and independent full security audit of Falling Colors systems!

BHSD should also demand a fair and public RFP (Request for Proposal) process when seeking IT contracting services. Currently, some individuals in BHSD, in connivance with Falling Colors, are delicately crafting RFPs that are designed to exclude Falling Colors competitors. This is a blatant abuse of the public RFP process and the perpetrators should be ashamed of themselves. Crony state contracting may be business as usual in New Mexico but it is not in the interests of the public, BHSD, or even Falling Colors. Cronies without competition invariably turn into parasites and BHSD, which recently suffered a bedbug outbreak in their Santa Fe offices, has enough of those.

  1. The Synar program is named after Congressman Mike Synar of Oklahoma. How many tax dollars would be saved if it was illegal to name things after politicians?
  2. The founders of Falling Colors are questionable sources; their claims should be subjected to a high standard of scrutiny.
  3. Yes, incredibly, user passwords were stored as plain text for years. This is monumentally inept.

JOD Update: J 8.02 QT/JHS/64 bit Systems

I have pushed out a JOD update that makes it possible to run the addon on J 8.02 systems. In the last eight months a Qt-based J IDE has been developed that runs on Linux, Windows and Mac platforms. To maintain JOD’s compatibility across all versions of J from 6.02 on I had to tweak a few verbs.

The only significant changes are how JOD interacts with various J system editors. I have tested the system on Windows J 6.02, J 7.01, J 8.02, and Mac J 7.01 systems. I expect it will behave on 32 and 64 bit Linux systems from J 7.01 on, but I have yet to test these setups. My hardware budget limits my ability to run common variants of Windows, Linux and Mac systems.

JOD is still not complete; that’s why the version number has not been bumped past 1.0.0. The missing features are noted in the table of contents of jod.pdf (also available in the joddocument addon) with the suffix “NIMP,” which means “not implemented.” I will fill in these blanks as I need them. Most of the time JOD meets my needs so don’t hold your breath.

If you want to make your own additions to JOD the program and documentation source is available on GitHub. Just follow the links and enjoy.

As a last note: I will be at the J Conference in Toronto (July 24 and 25, 2014) where I will be giving a short presentation and handing out a few hardcopy versions of the JOD manual to one or two JOD fans.

Twitter is not Trivial

I enjoy being wrong because it doesn’t happen very often. When Twitter first reared its itty-bitty head I thought it was one of the dumbest ideas ever. Who wants to “tweet”, in an utterly disorganized stream of consciousness way, 140 character messages to total strangers? What could emerge from such inane chatter? Isn’t this like looking for meaning in the splash patterns of tossed primate poop? Twitter may still end up being a bad business but it’s clearly shown that brevity really is the soul of wit.

Twitter teaches that:

The much maligned common man, the hoi polloi, the rabble, the unwashed masses, the lumpenproletariat are collectively a hell of a lot smarter than their delusional masters. Take any media hot topic of the last few years and compare snarky Twitter traffic to the foul self-serving emissions of the drooling, (they only think they rule), classes and you’ll quickly see that the Twitter’atti not only see through multiple layers of bullshit but usually arrive at viable solutions long before our self-declared “elites” even perceive a problem.

The public has a collective genius for image captions. The best political cartoons, captioned images and irreverent posters are no longer found in newspapers or “sanctioned” outlets. The best of the best are posted on Twitter. I waste far more time than I should thumbing through captioned Twitter images. I usually see at least one pants peeing funny image every day. Thank you Twitter: laughter is the only medicine that’s not encumbered by pharmaceutical patents, co-pays and Obamacare coverage fines.

The 140 character message limit brings people together. This is one of the more surprising Twitter revelations. My Twitter contacts are far more diverse than my email buddies, blog readers, Facebook friends or LinkedIn references. On Twitter I deal with “entities” I wouldn’t even cross virtual streets to pee on. I regularly hear from religious nuts, schizophrenic 9/11 troofers, Big Foot hunters, Austrian and Keynesian economists, far left and right bloggers, porn stars, government agencies, (the IRS is on Twitter), academic organizations, space-faring robots, mathematical theorems, gun nuts, anti-gun nuts, global warming alarmists and deniers, Justin Trudeau fans and, the most delusional of the lot: Obama supporters. You can take anything in 140 character doses!

Not only is Twitter fun it’s already logged an impressive public service resume by:

Providing an irresistible honey trap for narcissistic class A assholes. Without Twitter Anthony Weiner would probably still be a corrupt lying New York Democrat congressman or worse, the mayor of New York. Alec Baldwin would still be hurling feces on MSNBC and Dane Deutsch would be happily ensconced in the Wisconsin state legislature. Twitter makes it easy for our, in their own heads only, “elites” to self-destruct in public. Self-humiliation is the best kind.

Serving as another canary in the liberty coal mine. Distinctions like democratic and totalitarian are almost without merit. One man’s democratic republic is another’s Gulag. A more meaningful metric is: allows Twitter, bans Twitter. Here’s a list of Twitter banners. Only one country on this list surprised me. If your country is on this list you might want to consider emigration, or fleeing, because you are living in a regime that cannot tolerate even 140 characters of criticism.

Yes Virginia, Twitter is far from trivial.

The New SmugMug

Websites compete in a brutal Darwinian struggle for eyeballs and clicks: adapt or die is an understatement. Every few years users get “upgraded” whether they want it or not. Generally things move in a better direction. Even twenty-something web programmers aren’t completely stupid but setbacks and complete disasters are not uncommon.

My new SmugMug layout – click to view.

My relationship with SmugMug started about five years ago when my Flickr account was suddenly declared “mature” by some faceless administrative ape that couldn’t tell the difference between innocent nudes and pornography. I was so incensed I sent Flickr administrators a message they couldn’t ignore. I painstakingly deleted all my images, mass deletion was oddly not supported by the Yahoo’s that ran the joint, and dropped my account. How’s that for maturity? I consider it my sacred duty to punish companies that screw customers. After dumping Flickr I searched around and found SmugMug.

There were things I liked and didn’t like about SmugMug. SmugMug did a better job of displaying images than Flickr and you could select your own damn background color. On the downside, SmugMug has more of an empty art museum vibe than Flickr’s busy social whorehouse milieu. For a few days I missed complete morons dropping snide asides on my pictures. The only comments I get on SmugMug come from family members and energetic strangers that find something they like enough to breach SmugMug’s spam fortifications. The silence is welcome. This is an art museum after all.

SmugMug offers a number of account types. I am a “power” user. A power account falls between basic and pro accounts. This account differentiation makes sense. SmugMug users fall into three classes.

  1. Basic: plain folks that want a no fuss picture website.
  2. Power: nondelusional keen photographers.
  3. Pro: delusional “serious” photographers.

Only paparazzi, porn, wedding, fashion, sports and National Geographic photographers are making any money taking pictures these days. If you don’t fall into any of these classes my guess is that you are spending more on photography than you are making. SmugMug harbors many photographers attempting to sell their pictures. I would never buy a picture nor would I expect to sell one. I see many shots on sale that aren’t as good as many I’ve made for free. There are billions of cameras in the wild these days. Photography is not exempt from the law of supply and demand. When the supply is nearly infinite what do you expect the price to be? This is why newspapers are laying off staff photographers and small photography studios have mostly gone out of business. As a keen amateur photographer this saddens me but as a right-wing libertarian death beast it warms my dark evil heart that most of the “photographers” being discarded are rent seeking nitwits playing Henri Cartier-Bresson. Still photography is no longer a viable personal business! We’re deep into another age people. Now that you see where I’m coming from let’s get on with what’s good and bad about the new SmugMug.

Let’s start with the good stuff:

  1. Stretchy layouts: The new website does a better job of automatically adapting to a variety of display devices. I’ve browsed my own site on phones, tablets, laptops and giant desktop displays. It looks OK on all of them and great on tablets and laptops.
  2. Easy customization and layout tweaking: It took me all of ten minutes to figure out the new layout controls. Programming easy nontrivial customization is hard. Here the SmugMug programmers have brilliantly succeeded. I know enough about JavaScript, CSS and HTML to roll my own designs but photography is a hobby; I’d rather spend my time taking and refining pictures than writing JavaScript to display them.
  3. Better overall site organization: The new site organizer is a significant improvement on older methods, and support for deeper directory paths will be appreciated by many.
  4. An improved and better integrated mapping facility: Displaying geotagged images on old SmugMug was somewhat jarring. The control was clunky and it didn’t match your site design. The new control adapts to your layout and “circle” area browsing is slick and intuitive.

Now for the dark side:

  1. Website migration has hiccups: My site has over two thousand images. I didn’t expect the migration to the new layout to be without problems and it wasn’t. For me the biggest problem was the handling of keywords. The new site does not properly display keyword strings if the “;” delimiter is used. I have thousands of pictures with “keywords” like:

        5x5;capillary;glass;microscope;polywater;ultra

    The “;” character delimits separate words. The string should be displayed like:

        5x5 | capillary | glass | microscope | polywater | ultra

    When you click on the “;” string it is interpreted as a “find all images with all these keywords” request, which usually returns the very image you are browsing. This is mostly a display problem: the individual keywords were properly parsed and loaded.

  2. Custom API applications break: I use a custom application I wrote to synchronize my SmugMug online galleries with my offline ThumbsPlus databases. This application issues SmugMug REST API calls to collect and update image metadata. When I migrated I expected this application to stop working and it did. It looks like I need to authenticate my application with the new SmugMug site. There are no easily found instructions on how to do this. I hope this is just an oversight and that power users can still run custom API applications.
  3. The new map control is limited to one hundred images: The slick circle browser map control will only display one hundred images and there is no easy way to set it to map pictures in a particular gallery. The old clunky control allowed two hundred pictures and it could display arbitrary galleries.

I could go on but I program for a living. I know users always whine about change and seldom express gratitude for all the hard work the code monkeys of the world do to keep the lights on. Overall the new SmugMug is better than the old. There are problems but for a first production cut this is fine work. It certainly merits one prestigious Analyze the Data not the Drivel attaboy award. See the following to print your award.

Dilbert almost gets an Attaboy

Dilbert almost gets an Attaboy. What would we do without Dilbert? For more click and enjoy.

JHS meets MathJax

With the release of J 7.01 Jsoftware “deprecated” COM. J 6.02 can run as a COM automation server but J 7.01 cannot.[1] Throwing COM under the bus is hardly radical. Microsoft, COM’s creator, has been holding COM’s head underwater for years. Many .Net programmers cringe when they hear the word “COM” and the greater nonwindows[2] world never really accepted it. COM is a complex, over-engineered, proprietary dead-end. Yet despite its bloated deficiencies a lot of useful software is COM based. So with COM going away, at least for J programmers, the hunt is on for viable replacements and while we’re hunting let’s rethink our entire approach to building J GUI applications.

J GUI applications are traditional desktop applications. They’re built on native GUIs like Windows Forms and GTK and when done well they look and act like GUI applications coded in other languages. This is all good but there is a fundamental problem with desktop GUIs. There are many desktop GUIs and they do not travel well. Programmers have spent many dollars and days creating so-called cross-platform GUIs but, if you wander off the Windows, Mac and Linux reservation, the results are not particularly portable. And, as portable GUIs rarely outperform native alternatives, programmers tend to stick in their tribal silos. GUI programming is a bitch, has always been a bitch and will always be a bitch. It’s time to divorce the desktop GUI bitch.

All divorces, even the geeky GUI variety, are hard. When you finally cut the knot you’re not entirely sure what you’re doing or where you’ll end up. All you know is that there is a better way and with respect to J GUI applications I believe that JHS is that way. JHS leverages the large CSS, HTML and JavaScript (CHJ) world and in recent years some impressive browser-based applications have emerged from that world. The application that changed my mind about JavaScript and browser-based applications in general is something called MathJax.

MathJax typesets mathematics. It renders both LaTeX and MathML using fully scalable browser fonts. This is better than what WordPress does. The following Ramanujan identity taken from MathJax examples renders on WordPress as an image.

\frac{1}{\Bigl(\sqrt{\phi \sqrt{5}}-\phi\Bigr) e^{\frac25 \pi}} =  1+\frac{e^{-2\pi}} {1+\frac{e^{-4\pi}} {1+\frac{e^{-6\pi}}  {1+\frac{e^{-8\pi}} {1+\ldots} } } }

MathJax renders the same expression with scalable fonts and supports downloading the expression as LaTeX or MathML text. This is pretty impressive for browser JavaScript. I wondered how hard it would be to use MathJax with JHS and was pleased to find it’s easy peasy.

Writing a basic JHS application is a straightforward task of setting up three JHS J nouns: HBS, CSS and JS.[3] HBS is a sequence of J sentences where each sentence yields valid HTML when executed. JHS generates a simple web page from HBS and returns it to the browser. MathJaxDemo HBS is:

 HBS=: 0 : 0
 '<hr>','treset' jhb 'Reset'

 '<hr>',jhh1 'Typeset with MathJax and J'
 '<hr>',jhh1 'Typeset Random Expression Tables'
 '<br/>','ttable' jhb 'Typeset Random Expression Array'
 '<br/>','restable' jhspan ''
 )
CSS is exactly what you expect: CSS style definitions. Finally, JS is application specific JavaScript. MathJaxDemo JS matches HBS page events with corresponding JHS server handlers. This demo uses ajax for all event handlers.

JS=: 0 : 0

 function ev_ttable_click(){jdoajax([],"");}
 function ev_tquad_click(){jdoajax([],"");}
 function ev_tmaxwell_click(){jdoajax([],"");}
 function ev_tramaujan_click(){jdoajax([],"");}
 function ev_tcrossprod_click(){jdoajax([],"");}
 function ev_treset_click(){jdoajax([],"");}

 function ev_ttable_click_ajax(ts){jbyid("restable").innerHTML=ts[0]; MathJax.Hub.Typeset();}
 function ev_tquad_click_ajax(ts){jbyid("resquad").innerHTML=ts[0]; MathJax.Hub.Typeset();}
 function ev_tmaxwell_click_ajax(ts){jbyid("resmaxwell").innerHTML=ts[0]; MathJax.Hub.Typeset();}
 function ev_tramaujan_click_ajax(ts){jbyid("resramaujan").innerHTML=ts[0]; MathJax.Hub.Typeset();}
 function ev_tcrossprod_click_ajax(ts){jbyid("rescrossprod").innerHTML=ts[0]; MathJax.Hub.Typeset();}

 function ev_treset_click_ajax(ts){
   jbyid("restable").innerHTML=ts[0]; jbyid("resquad").innerHTML=ts[0];
   jbyid("resmaxwell").innerHTML=ts[0]; jbyid("resramaujan").innerHTML=ts[0];
   jbyid("rescrossprod").innerHTML=ts[0];
 }
)

Running the JHS MathJaxDemo is a simple matter of:

  1. Downloading the J scripts in the MathJaxDemo folder to a local directory.
  2. Editing J’s ~config/folders.cfg file and pointing to your download directory with the name MathJaxDemo. You are set up correctly if jpath '~MathJaxDemo' returns your path.
  3. Loading the demo: load '~MathJaxDemo/MathJaxDemo.ijs'
  4. Browsing to the site:

It’s not hard to use JHS as a general application web server. JHS provides many common controls right out of the box but to compete with desktop applications it’s necessary to supplement JHS with JavaScript libraries like MathJax. In coming posts I will explore how to use industrial strength JavaScript grids and graphics with JHS.

MathJaxDemo Screen Shot

Screen shot of JHS MathJaxDemo running on Ubuntu.

  1. Bill Lam has pointed out that J 7.01 can function as a COM client. The JAL addon tables/wdooo controls OpenOffice using Ole Automation which is one of the many manifestations of COM.
  2. On a purely numerical basis there is no greater nonwindows world.
  3. To learn about JHS programming study the JHS demos and the JHS browser application.

Git me a Hub’bery

Some time ago I crossed my machine synchronization threshold. I routinely work on four operating systems, three laptops, a few servers, a bunch of phones and so on. I synchronized the directories I cared about while forming deep and rewarding relationships with file sharing services like Dropbox. Dropbox is great but its success has attracted the attention of paranoid IT micromanagers and it’s now frequently blocked on internal corporate networks. The beSOXed imbeciles that set IT policies will not rest until it’s impossible to do useful work on corporate assets, and people wonder why there is so little return on IT investment.

Living without Dropbox and its many peers is annoying but tolerable provided humble USB ports are still available, but restricting plug-in drives is now standard not-operating procedure in most companies. So if you cannot share files or use USB what’s left? Would you believe GitHub?

GitHub is close to total global dominance in the geeky code sharing world. I would not have expected a version control system to attract a fiercely loyal and dedicated cult following but it has. Linus Torvalds, the Linux demigod, started Git when he correctly observed that all pre-Git version control systems were fundamentally flawed because they enshrine the overlord peon hierarchy. The overlord, usually some IT nitwit, manages the entire code submission, review and backup process and the peons, that would be us, bend over and take it. Until Git appeared the majority of programmers despised and detested version control. It was just more IT management bullshit. We put up with it because version control is a necessary evil and paychecks are an even more necessary evil. We only wished things could be less evil and then Git appeared.

Git dumped the overlord peon hierarchy and adopted the radical notion that all repositories are created equal. When you synchronize Git repositories everything is synchronized. Your local copy contains everything the source does. This makes Git a superb peer-to-peer file distribution tool. Not only are you distributing files you’re also distributing their complete histories. This almost biological replication model has resulted in an explosion of Git repositories and the rise of hub sites like GitHub. Git’s dominance would not be possible if it was centrally managed. It’s succeeded because it’s harnessed market chaos.
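Git’s everything-is-a-peer model is easy to see from the command line; a small sketch (repository and commit names are illustrative):

```shell
# create a tiny repository with one commit ...
git init -q original
git -C original -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "first commit"

# ... then clone it: the copy carries the complete history,
# not a mere checkout of the latest files
git clone -q original copy
git -C copy log --oneline
```

The clone is a full repository in its own right; you could delete the original and lose nothing.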

I’ve used Git for over a year and I have often thought about pushing JOD source to public repositories. This weekend I bit the bullet and set up public repositories. Now it’s easy to Git me a Hub’bery!

Turn your iPhone into a jPhone

Jsoftware recently released a free J app for the iPhone. Search for “jsoftware” in Apple’s app store and you will land right on it. There are many excellent free iPhone apps, I have half-a-dozen on my iPhone, but this little jewel sets a new standard for power in your palm.

Let’s start with the good news; this is not a crippled version of J. It’s the same high-caliber interpreter that J programmers have used on Windows, Linux and Mac machines for years. Anything this app’s big desktop brothers can do this little app can do. You won’t see the same blistering desktop speed but I think you’ll be pleasantly surprised at how fast this app munches numbers. I have run desktop J since the 1990’s and J on the iPhone 4 matches or beats what I was seeing on laptops ten years ago. This is the real J deal.

Now for a few caveats — this J app is not a complete J development environment. To meet Apple’s restrictive app store rules C-style APIs, JAL addons, GUI tools and third-party libraries like opengl and lapack are not included. With this app you get the J interpreter and a few of the most useful J addons like plot. Despite these limitations this J app is probably the most powerful, purely local, calculator available for the iPhone.

Don’t believe me? Turn off Wi-Fi and set your iPhone to Airplane mode: Airplane mode cuts off phone and internet access.  Now try computing the following on your favorite iPhone calculator.

NB. generate one million random numbers and average them
(+/ % #) ? 1000000#10000

NB. generate a 50 50 random matrix, invert it and multiply with the
NB. original - rounding to the nearest 0.0001 to form an identity matrix
round=: [ * [: <. 0.5 + %~
matrix=: ? 50 50 $ 10000
invmat=: %. matrix
identity=: 0.0001 round matrix +/ . * invmat

NB. sum of the diagonal elements of matrix (identity) is 50
50 = +/ (<0 1) |: identity

NB. multiply two polynomials with complex number coefficients
polyprod=:  +//.@(*/)

NB. complex number coefficients - AjB is J's notation for A + Bi
poly0=: 2j5 3j7 0 1
poly1=: 1j2 0 3j7 0 0 2
poly0 polyprod poly1

NB. prime factorization table of 50 random integers less than one billion
(<"0 nums) ,: -.&0 &.> <"1 q: nums=.50?1e9

The J app blows through these examples on my iPhone 4 in a few seconds. Of course this is only a tiny taste of what J on the iPhone is capable of. I have managed to run 1000+ line J scripts on this app. The only desktop J code I have not been able to run depends on external compiled libraries like regex.

Cell phones are powerful little computers and it’s gratifying to finally see software that can focus that power on something other than Angry Birds.  If you’re interested in learning an array oriented functional programming language or if you’re already familiar with J and want to pack serious computational heat then this app is for you!

J iPhone app

J running on the iPhone

Turn your Blog into an eBook

If you have worked through the exhausting procedure of converting your blog to LaTeX: see posts (1), (2) and (3), you will be glad to hear that turning your blog into an image free eBook is almost effortless. In this post I will describe how I convert my blog into EPUB and MOBI eBooks.

eBooks: how the cool kids are reading

eBook readers like Kindles, Nooks, iPads and many cell phones are optimized for plain old prose. They excel at displaying reflowable text in a variety of fonts, sizes and styles. One eBook reader feature, dear to my old fart eyes, is the ability to increase the size of text. All eBooks are potentially large print editions. There are other advantages: most readers can store hundreds, if not thousands, of books, making them portable libraries. It’s now technically possible to hand a kindergarten student a little tablet that holds every single book he will use from preschool to graduate school. The only obstacle is the rapacious textbook industry and their equally rapacious eBook publishing enablers. But fear not: open source man will save the day. The days of overpriced digital goods are over! I will never pay more than a few bucks for an eBook because I can make my own and so can you! Let’s get together and kill off another industry that so has it coming!


Native eBook file formats like EPUB and MOBI do not handle complex page layouts well. If your document contains a lot of mathematics, figures and well-placed illustrations, stick with PDF workflows.[1] You will save yourself and your readers a lot of grief. But, if your document is a prose masterpiece, a veritable great American novel, then “publishing” it as an EPUB or MOBI is a great way to target eBook readers. EPUBs and MOBIs can be compiled from many sources. I start with the LaTeX files I created for the PDF version of this blog because I hate doing the same boring task twice. By far the most time-consuming part of converting WordPress export XML to LaTeX is editing the pandoc generated *.tex files to resolve figures and fix odd run-together words and paragraphs. To preserve these edits I use pandoc to convert my edited *.tex to *.markdown files.


Markdown is a very simple text oriented format. A markdown file is completely readable exactly the way it is. All you need is a text editor. Even text editors are overkill. You could compose markdown with early 20th century mechanical typewriters; it’s a low tech format for the ages: perfect for prose.

The J verb MarkdownFrLatex [2] calls pandoc and converts my *.tex files to *.markdown. I place my markdown in the directory


and to track changes to my markdown files I put this directory under Git. MarkdownFrLatex strips out image inclusions and removes typographic flourishes. When it succeeds it writes a simple markdown file and when it fails it writes a *.baddown file. Baddown files are *.tex files that contain lstlisting and complex figure environments that are best resolved with manual edits. After removing such problematic LaTeX environments the J verb FixBaddown calls pandoc and turns baddown files into markdown files.
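The conversion itself amounts to shelling out to pandoc for each edited file. A stripped-down sketch of the idea in J (the real MarkdownFrLatex in TeXfrWpxml.ijs does considerably more cleanup; pandoc must be on your PATH):

```
NB. hedged sketch: convert an edited post.tex to post.markdown
NB. y is the file name without extension, e.g. 'post'
tex2md=: 3 : 0
cmd=. 'pandoc -f latex -t markdown -o ',y,'.markdown ',y,'.tex'
2!:0 cmd  NB. run pandoc via the host command foreign
)
```

Running `tex2md 'post'` leaves post.markdown beside post.tex, ready for consolidation.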

Generating EPUB and MOBI files

When the conversion to markdown is complete I run MainMarkdown to mash all my files into one large markdown file with an eBook header. The eBook header for this blog is:

% Analyze the Data not the Drivel
% John D. Baker

The first few lines of the consolidated bm.markdown file are:

% Analyze the Data not the Drivel
% John D. Baker

# What’s In it for Facebook?


*Posted: 05 Sep 2009 22:44:50*

[Facebook]( is huge: they brag about a user
count well north of one hundred million. If only 0.5% of their users are
active that’s 500,000 *concurrent users.* How many expensive servers
does it take to support such a load? .....

Generating an EPUB from bm.markdown is a simple matter of opening up your favorite command line shell and issuing the pandoc command:

pandoc -S --epub-cover-image=bmcover.jpg -o bm.epub bm.markdown

You can read the resulting EPUB file bm.epub on any EPUB eBook reader. Here’s a screen shot of bm.epub on my iPhone.

iPhone loaded with my blog


The last step converts bm.epub to MOBI, a native Kindle format. Pandoc can generate MOBI from bm.markdown but it inexplicably omits a table of contents. No problemo: I use Calibre to convert bm.epub to bm.mobi. Calibre properly converts the embedded EPUB table of contents to MOBI. Here’s bm.mobi on a Kindle.

Kindle loaded with my blog

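Calibre ships a command-line converter, so the EPUB to MOBI step can be scripted instead of done through the GUI; a sketch, assuming Calibre’s ebook-convert tool is on your PATH:

```
# convert the EPUB to a Kindle MOBI, carrying over the table of contents
ebook-convert bm.epub bm.mobi
```

This slots neatly after the pandoc command in a build script.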

All the “published” versions of this blog are available on the Download this Blog page so please help yourself!

[1] LaTeX is usually compiled to PDF making it one of hundreds of PDF workflows.

[2] All the J verbs referenced in this post are in the script TeXfrWpxml.ijs

WordPress to LaTeX with Pandoc and J: Using TeXfrWpxml.ijs (Part 3)

WordPress to LaTeX


In this post I will describe how to use the J script TeXfrWpxml.ijs to generate LaTeX source from WordPress export XML.  I am assuming you have worked through (Part 1) and (Part 2) and have:

  1. Successfully installed and tested Pandoc.
  2. Installed and tested a version of J.
  3. Set up appropriate directories (Part 2).
  4. Know how to use LaTeX.

Item #4 is a big if.  Inexperienced LaTeX users will probably not enjoy a lot of success with this procedure as the source generated by TeXfrWpxml.ijs requires manual edits to produce good results.  However, if you’re not a LaTeX guru, do not get discouraged. It’s not difficult to create blog documents like bm.pdf.

Step 1: download WordPress Export XML

How to download WordPress export XML is described here.  Basically you go to your blog’s dashboard, select Tools, choose Export  and select the All content option.

Tools > Export > All Content


When you press the Download Export File  button your browser will download a single XML file that contains all your posts and comments. Remember where you save this file. I put my export XML here.


Step 2: download TeXfrWpxml.ijs

Download TeXfrWpxml.ijs and remember where you save it.  I put this script here.


Step 3: start J and load TeXfrWpxml.ijs

TeXfrWpxml.ijs was generated from JOD dictionaries. With JOD it’s easy to capture root word dependencies and produce complete standalone scripts. TeXfrWpxml.ijs needs only the standard J load profile to run.  It does not require any libraries or external references and should run on all Windows and Linux versions of J after 6.01.  Loading this script is a simple matter of executing:

load 'c:/pd/blog/TeXfrWpxml.ijs'

The following shows this script running in a J 7.01 console. The console is the most stripped down J runtime.

Step 4: review directories and necessary LaTeX files

The conversion script assumes proper directories are set up: see Part 2. The first time you run TeXfrWpxml.ijs it’s a good idea to check that the directories and files the script is expecting are the ones you want to process. You can verify the settings by displaying TEXFRWPDIR, TEXINCLUSIONS, TEXROOTFILE and TEXPREAMBLE.
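These four nouns are defined by TeXfrWpxml.ijs, so a one-liner in the J session shows them all at once (the boxing is just for tidy display; the values will be whatever your copy of the script sets):

```
NB. box the conversion settings for a quick visual check
TEXFRWPDIR ; TEXINCLUSIONS ; TEXROOTFILE ; TEXPREAMBLE
```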


If all these directories and files exist go to step (5).

Step 5: make sure you are online

The first time you run the converter it will attempt to download all the images referenced in your blog. This is where wget.exe gets executed.  Obviously to download anything you must be connected to the Internet.

Step 6: run LatexFrWordpress

Run the verb LatexFrWordpress.  The monadic version of this verb takes a single argument: the complete path and file name of the export XML file you downloaded in step (1).

xml=: 'c:/pd/blog/wordpress/analyzethedatanotthedrivel.wordpress.xml'

LatexFrWordpress xml

As the verb runs you will see output like:

   LatexFrWordpress xml
What's In it for Facebook?
downloading: c:/pd/blog/wp2latex/inclusions/demotivational-posters-facebook-you.jpg
1 downloaded; 0 not downloaded; 0 skipped
Fake Programming
downloading: c:/pd/blog/wp2latex/inclusions/672169130_vajvn-M.png
1 downloaded; 0 not downloaded; 0 skipped
Laws or Suggestions
downloading: c:/pd/blog/wp2latex/inclusions/i-B5mfdRF-M.jpg
1 downloaded; 0 not downloaded; 0 skipped
Lens Lust

... many lines omitted ...

downloading: c:/pd/blog/wp2latex/inclusions/i-mNK4RHL-M.png
1 downloaded; 0 not downloaded; 0 skipped
WordPress to LaTeX with Pandoc and J: LaTeX Directories (Part 2)
0 downloaded; 0 not downloaded; 1 skipped

When the verb terminates you should have a directory c:/pd/blog/wp2latex full of *.tex files:  one file for each blog post. Now the hard work starts.

Step 7: editing LaTeX posts

The conversion from WordPress XML to LaTeX produces files that require manual edits. The more images, video, tables and other elements in your posts the more demanding these edits will become.  My blog has about one image per post.  Most of these images are wrapped by text. LaTeX has a mind of its own when it comes to floating figures and getting illustrations to behave requires far more parameter tweaking than it should. This is a longstanding weakness of LaTeX that pretty much everyone bitches about. My advice is start at the front of your document and work through it post by post. The files generated by LatexFrWordpress do not attempt to place figures for you but they do bolt in ready-made figure templates as comments that you can experiment with.  Each post file is also set up for separate LaTeX compilation. You don’t have to compile your entire blog to tweak one post. The one good thing about this edit step is once you have sorted out your old posts you do not have to revisit them unless you make major global document changes. The next time you run LatexFrWordpress it will only bring down new posts and images.
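For what it’s worth, the kind of hand-tuned wrapped figure I end up with looks like the following (the wrapfig package and the particular width and placement values are my choices for illustration, not something LatexFrWordpress emits):

```latex
% in the preamble: \usepackage{wrapfig}
% text-wrapped figure: tweak the two widths until LaTeX cooperates
\begin{wrapfigure}{r}{0.45\textwidth}
  \centering
  \includegraphics[width=0.43\textwidth]{inclusions/i-B5mfdRF-M.jpg}
  \caption{A typical post illustration.}
\end{wrapfigure}
```

Expect several compile-tweak cycles per figure; LaTeX float placement rewards patience.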

Step 8: compile your LaTeX blog

I use batch files and shell scripts to drive LaTeX compilations.  I processed my blog with this batch file.

echo off
rem process blog posting (bm.tex) root file
title Running Blog Master/LaTeX ...

rem first pass for aux file needed by bibtex
lualatex bm

rem generate/reset bbl file
bibtex bm
makeindex bm

rem resolve all internal references - may
rem comment out when debugging entire document
lualatex bm
lualatex bm

rem display pdf - point to preferred PDF reader
title Blog Master/LaTeX complete displaying PDF ...
"C:\Program Files\SumatraPDF\SumatraPDF.exe" bm.pdf

The presence of Unicode APL, see this post, forced me to use lualatex. I needed some very nonstandard APL fonts.  See bm.pdf — also available on the Download this Blog page — to judge the effectiveness of my edits. Producing nice figure laden typeset blog documents is work but, as I will describe in the next post, producing image free eBooks is a simple and far less laborious variation on this process.