Math For An Aging Brain

A Personal Essay

By Gene Wilburn

In my 76th year I decided to re-engage in the study of math. It’s important for seniors to keep their brains active, and entire books have been written about the plasticity of the brain and the desirability of keeping it stimulated. Yet one tires of a daily round of crosswords, sudokus, and Solitaire.

Even so, why math? Partly it’s because math fascinates me with its precision, logic, elegance, and beauty. Plus, math is hard. Hard, but not unreasonably hard. It requires effort, which is the point of studying it. Math is also open ended—you can grow with it, taking it as far as your ability allows. This makes math a progressively stimulating brain exercise. There’s always something new to challenge your thinking.

To put my relationship with math in perspective, some background is in order. Before immigrating to Canada and becoming a Canadian citizen, I grew up in the U.S. at the time the former Soviet Union shocked the world by launching Sputnik I, the first man-made orbiting satellite, in 1957. This triggered a call for a new generation of scientists and engineers.

President Ike and the Republican Party (which in those days was pro-science) strongly encouraged students to study science and engineering — today called S.T.E.M. studies — to catch up with the Russians and to usher in a new scientific and technological era. The transistor had been invented, television was transitioning from black-and-white to colour, rockets were sitting, and sometimes exploding, on launch pads, and exciting things were afoot. I wanted to be part of it.

Being an impressionable teen who, though tall, played basketball rather poorly and baseball even worse, I decided I’d become an engineer rather than pursue my childhood fantasy of becoming a sports star. I wasn’t sure, exactly, what engineers did for a living, but I owned a pair of engineering boots and thought them grand. In my naïveté I imagined that any profession where wearing them was considered de rigueur was the profession for me.

And so I began the math studies of a typical university-bound high schooler of the day: algebra 1, plane geometry, algebra 2, analytic geometry, and a mishmash of precalculus, including trigonometry, functions, slopes of lines, and limits. I was a solid B+ student, decent but plodding.

Off I went to university, slide rule dangling from my belt in a leather holster. I spent my first year flailing away at calculus, chemistry, engineering drawing, and vector analysis, and doing poorly in all of them. The only class I liked, and excelled at, was English Composition.

In a life-changing epiphany, it occurred to me that I could switch majors to English, which I did, and for which I’ve been grateful my entire life. I changed abruptly from S.T.E.M. studies to the Humanities, and I couldn’t have been happier. Literature, philosophy, history, French language, art, and music became my focus. I didn’t miss mathematics at all — then.

But life is strange, taking unexpected turns. In the end, against any reasonable probability, I became a kind of engineer after all. Propelled by a deep fascination with personal computers, I self-studied my way into the field of Information Technology, learning the skills needed to undertake a new profession: programming, database design, networking, web design, and infrastructure management. Over the years I’ve worked in IT, variously, in a cultural agency, a government department, a small business operation, and a large financial organization, all in Toronto. It has been an engaging and satisfying career.

Now, as I enter what are sometimes referred to as the “Twilight Years,” I find myself re-attracted to the study of math, though not with any specific goal in mind other than keeping my brain supple by challenging myself with problem solving that transcends crosswords and sudokus.

With this goal in mind, I ordered a widely-used university textbook on precalculus. My god, I thought, can a math textbook really cost $170? How do today’s students afford them? The sticker price caused my senior’s fixed income to stutter. At least the graphing calculator app I selected for my iPad was free.

Ever since the book arrived, I have attempted to learn at least one new concept each day, or to work on an existing set of math problems until I get them right. Speed is not an issue. Slow and steady is the path.

Not that math study has so far helped me remember why I’m standing in the pantry, staring at the food shelves, or trying to recall what day of the week it is, but gradually, almost imperceptibly, I sense that my memory is improving. This is anecdotal, to be sure, but when I mentioned to my family doctor that I was reviewing math, he was delighted to hear it and encouraged me to continue, saying that he wished more of his senior patients would engage in a similar pursuit.

I’m aware that the study of math is the last thing in the world most seniors would want to undertake. Too many of them have had unpleasant experiences with it during their school years. Nor is the study of math something you can readily share with family or friends. People are impressed if you tell them you made a hole-in-one playing golf yesterday, or even if you finished under par, but they’re not perceptibly eager to hear that you successfully solved and graphed a dozen polynomial nonlinear inequalities.

Studying math is a solitary pursuit, almost a meditation on the nature of numbers. What distinguishes my study today from my studies as a young student is that there are no deadlines or exams. I can take my time. But I no longer wear engineering boots; they’ve been swapped out for fuzzy house slippers.

Greppy: A Lightweight Perl/PHP Website Search Engine Based on Grep

GNU Grep

My friend and colleague Mark mentioned to me recently that one of his clients was interested in having a search engine on their website and did I have any ideas?

The scenario was this: the site is an informational site, with monthly updates and is hosted in the AWS cloud. It runs in a minimal instance of Linux, with only 1 GB RAM and very tight storage. It’s not an e-commerce site. Was there something small and lean enough to serve?

Mark and I had once worked together on a project for a different client where we installed Apache Solr to build a sophisticated search engine for large amounts of data, but Solr would be massive overkill for the site in question.

GNU Grep to the Rescue

As I thought about a solution for this small site, I immediately thought of grep, the open-source search utility with a long Unix heritage that can absolutely rip through text files to search for words or phrases and show them in context. All it needs are some text files to aim at.

The site in question has a large number of PDF files and HTML files. What, I thought, if copies of these were converted to plain text files and placed in a data directory where grep could rapidly search through them? Text files could substitute for the usual inverted index of a search engine and, at the same time, have a much smaller footprint on the system. The client wasn’t looking for fancy searches.

Similarly, grep doesn’t need much memory to run. Furthermore, a lightweight website search engine based on grep could be built with a few days’ programming and testing. After getting the go-ahead to start programming, I invoked vim and began building a simple system.

Building the Text Database, or Index

I knew I’d use Pandoc to convert HTML files to plain text, but I needed something to convert the PDFs. I discovered the command-line utility pdftotext that is part of xpdf-tools in Linux. (For MacOS, Homebrew installs the utility when you install xpdf.) Between these two, pandoc and pdftotext, I had the tools for building a text database.

To that end, I wrote a batch-processing script in Perl that takes the results of a find command selecting all the PDF and HTML files on the site and processes them through pandoc or pdftotext, putting the resulting text files into a collective data directory called textdata. The script also checks an exclude.txt file that can be used to exclude directories containing private information.

Embedded Filename Metadata

GNU Grep is not a fully-featured search engine, but with a little help from the GNU ls command I was able to prepend each file’s last-modified date (its mtime) to the filename, so that search results could later be sorted with the most recently updated documents displayed at the top.

The batch script populates the textdata directory with files that look like this:

1645031852dot5724170650dot_99_news_99_2021_August_Newsletterdotpdf.txt
1645031853dot9364470210dot_99_news_99_2021_February_Newsletterdotpdf.txt
1645031855dot7164861080dot_99_news_99_2021_January_Newsletterdotpdf.txt
1645031857dot1805182550dot_99_news_99_2021_July_Newsletterdotpdf.txt

Breaking this down, the initial part of the saved text file — 1645031852dot5724170650 — is the date in mtime format.

Each occurrence of the word dot stands in for a dot (.) in the encoded string, including the leading dot of the relative pathname, and every _99_ represents a forward slash (/) in the original pathname. Finally, the new file extension .txt is appended.

This metadata allows the search program to quickly reconstruct the path back to the original document, and to replace .txt with the original extension name.
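In outline, the whole scheme reduces to a single substitution pass: prepend the mtime to the relative path, swap slashes and dots for their encoded forms, and append .txt last so its own dot survives. Here is a minimal shell sketch of my own (the production version is Perl), assuming POSIX sed:

```shell
# encode_name MTIME RELPATH -> flattened index filename
# Slashes are replaced first, then dots, so the _99_ markers
# are not themselves re-encoded.
encode_name() {
  enc=$(printf '%s%s' "$1" "$2" | sed -e 's,/,_99_,g' -e 's/\./dot/g')
  printf '%s.txt\n' "$enc"
}

# e.g. encode_name 1645031852.5724170650 ./news/2021_August_Newsletter.pdf
# -> 1645031852dot5724170650dot_99_news_99_2021_August_Newsletterdotpdf.txt
```

Reversing the substitutions (and restoring the original extension) is all the search side needs to rebuild a working link.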

The Search Module

The client’s website is powered by PHP, so that is the language I used for the search module.

A search form module, searchform.php, prompts for a search term or phrase, which is then passed to the main search program, search.php. The search.php script, in turn, calls on grep to do the search and stores the results in an array that is then reverse sorted. Looping through the array, the search script reconstructs the full path and original extension of the filename, turning it into an <a href> HTML link.

To make the results easier to read, search terms are highlighted in red so they stand out in context. Overall appearance is controlled in HTML with an embedded CSS style sheet.
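In shell terms, the pipeline search.php implements looks roughly like this. This is a sketch of my own, not the production PHP, and it prints bare links rather than a styled page:

```shell
# search_site TERM: grep lists the matching index files, the mtime prefix
# makes a reverse sort newest-first, and the filename encoding is reversed
# to rebuild each original path as an HTML link.
search_site() {
  grep -r -l -i -- "$1" textdata/ | sort -r | while IFS= read -r f; do
    base=${f#textdata/}
    base=${base%.txt}
    # strip the "<secs>dot<frac>" prefix, then decode _99_ -> / and dot -> .
    path=$(printf '%s\n' "$base" |
      sed -e 's/^[0-9]*dot[0-9]*//' -e 's,_99_,/,g' -e 's/dot/./g')
    printf '<a href="%s">%s</a>\n' "$path" "$path"
  done
}
```

Given an index file named 1645031852dot5724170650dot_99_news_99_2021_August_Newsletterdotpdf.txt, this emits a link back to ./news/2021_August_Newsletter.pdf.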

The results, reflecting the song lyrics in my test site, look like this:

Context and Word Boundaries

To refine the search somewhat, the searchform.php file offers two checkboxes. The first allows the searcher to search on whole words and phrases, or do stem searching. In a whole word search, the default setting, the word “train” for example would find instances of “train”, “Train”, or “TRAIN” as a whole word surrounded by spaces or by punctuation. A stem search on “train” would find “train”, “Train”, or “TRAIN” as well, but also things like “trains,” “training,” and “restrain.” This is sometimes useful as an option.

The second checkbox specifies the amount of context surrounding the search term. The default is up to 90 characters on either side of the term. Unchecking the box results in a context of three lines of text: the line before the search term is found, the line it’s in, and the line following.
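Roughly, these options map onto grep flags as follows. This is a sketch of mine, not necessarily the exact invocation Greppy uses:

```shell
# search_demo DIR TERM: demonstrate the two option axes described above
search_demo() {
  dir=$1 term=$2
  # whole-word match, with one line of context either side (box unchecked):
  grep -r -i -w -C 1 -- "$term" "$dir"
  # stem match, with up to 90 characters either side (the default):
  grep -r -i -o -E -- ".{0,90}${term}.{0,90}" "$dir"
}
```

With -w, "train" matches only as a whole word; without it, the pattern also catches "trains", "training", and "restrain". The -C 1 flag prints the line before and after a match, while -o -E with the .{0,90} pattern prints up to 90 characters of surrounding context instead.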

Batch Processing

To keep the search index, or text data directory, in sync with the information on the site, the buildindex.pl script uses brute force: it deletes everything in textdata/ and rebuilds it from scratch. What this lacks in sophistication it makes up for in speed and simplicity. Rebuilding the index for the entire site takes no more than five minutes, and the script can be run manually when needed or as a cron job at desired times.
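For example, a nightly rebuild could be scheduled with a crontab entry like this (the paths here are hypothetical; only the five schedule fields come from cron itself):

```
# rebuild the Greppy text index every night at 3:15 a.m.
15 3 * * * /usr/bin/perl /home/site/bin/buildindex.pl >/dev/null 2>&1
```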

Bottom Line

To our delight, this lightweight, batch-oriented search engine is speedy, and is well suited to the needs of the client. In honour of grep, we named the search system Greppy.

Greppy follows the Unix philosophy of using existing discrete utilities combined together to process text files. There is no need to reinvent the wheel.

To make this engine available to others who might have a use for it, Greppy is posted on GitHub.

Gene Wilburn is a Canadian IT specialist and technical writer.

antiX Linux

antiX Linux: A Lightweight Speedster

antiX Linux Screenshot

antiX (pronounced “antics” not “anti-X”) is a lightweight Linux distro designed to run on minimal PC hardware, or “live” from a USB stick, similar to LXLE, Lubuntu, Linux Lite and others. Based on Debian, antiX proudly describes itself as free of systemd — a newer and widely used init system that some Linux gurus dislike.

Two things immediately caught my attention about this distro. Despite being lightweight, it includes major software such as LibreOffice and Firefox, and it uses the snappy IceWM as its default window manager along with the Rox file manager. I’ve always liked IceWM, so I decided to give antiX a whirl on my sandbox computer: an older Lenovo Thinkpad laptop with an Intel i3 processor, 4GB of memory, and a 400GB HD — not exactly minimal, but not particularly fast either.


Installation is easy. The installation process has a somewhat different look and feel compared to Ubuntu-derived distros but it asks most of the same questions.

During installation it prompts for two different passwords, the user password and the root password. It’s possible to create a user with no password at all, which may be convenient for some users. I chose to create a user password to keep it in line with my Ubuntu and Mac systems.

Once installed, antiX presents an attractive IceWM environment with a bold wallpaper and an app called Conky that displays a number of runtime stats including current time, uptime, CPU, disk, and connection usage. Conky can be switched off via the Desktop menu if you find it distracting.

The IceWM menu is invoked from the start button in the lower-left corner of the screen, à la Windows, but it can also be invoked from anywhere on the desktop with a right-click of the mouse. IceWM has dozens of contributed schemes you can try out to subtly alter the appearance of the desktop environment. I particularly liked the metal 2 look.

The only app I found wanting in the initial setup is the Rox file manager. Finding it too primitive for my taste, I installed PCManFM ($ sudo apt install pcmanfm) and added it to the menu’s Personal tab for quick access.

The default terminal app is Roxterm. It seemed quite decent but it crashed on me when I moved my .profile file to .bashrc. It was also not exporting my customized $PATH statement. So I installed LXTerminal ($ sudo apt install lxterminal), which is also lightweight and fast, and configured it to be my preferred terminal application. LXTerminal interpreted my .bashrc file perfectly.

Next up was Dropbox, which I use to share my writing and scripting files across my computers. It requires installing the Dropbox daemon via the Synaptic package manager, a simple task. You choose Nautilus-Dropbox from Synaptic. Rest assured it doesn’t install Nautilus dependencies.

All my writing is done in Markdown plain-text format so I installed Ghostwriter, a dedicated Markdown editor. In line with Markdown, I installed Pandoc for converting Markdown files to other file formats. I use TeX/LaTeX for typesetting which required the installation of Texlive and, in my case, LyX, a graphical document editor to accompany LaTeX.

Likewise, I installed Sigil, an Epub creator and editor for producing nice-looking ebooks, and, not finding Gimp on the system, I installed that too.

Missing Utilities

One thing that never works well for me in Linux is a laptop’s trackpad. Unlike the slick trackpad drivers on a MacBook, Linux trackpadding ends up shooting me all over the screen so I end up typing things in the wrong paragraph — not something that makes a writer happy. To fix this problem I prefer to switch off the trackpad entirely and use a wireless USB mouse instead.

There is no simple way to do this in antiX. You have to issue the command synclient TouchpadOff=1 to switch the trackpad off. Because I usually forget how to invoke this command I created two .bashrc aliases:

alias padoff='synclient TouchpadOff=1'
alias padon='synclient TouchpadOff=0'

allowing me to switch off the trackpad by typing padoff, a command I can remember, at a terminal prompt.

The antiX Control Panel offers no visual support for a laptop’s power management. An Internet search tipped me off that some antiX users install xfce4-power-manager to set power levels for both plugged-in and battery options. It brings in very little of the xfce4 environment, keeping the distribution light. Using XFCE Power Manager I was able to easily adjust my Thinkpad to switch off the screen when the lid is closed, and to go to sleep after a certain timeout. This greatly improved the Thinkpad’s battery life.

Okay, But What is This?

I’m impressed at the way antiX Linux adds new programs to the IceWM menu. Painless. Except for one weird exception.

I’ve lately been using an open-source, Markdown-based note-taking app called Joplin across all my computing platforms — MacOS, iOS, and, of course, Linux. I hoped that I could type $ sudo apt install joplin, but this wasn’t in the repositories for antiX.

This took me to the Joplin site, where I downloaded the Linux file Joplin-2.5.10.appimage. Neither antiX nor I had ever seen this file extension before. An Internet query explained that an AppImage is a self-contained Linux program with all its dependencies included. After setting an AppImage file’s permissions to executable, you can double-click it in a file manager to launch it. antiX, however, had no built-in way to deal with an AppImage package, nor any way to add it to the menus.

To make it simpler for me to use, I placed the Joplin AppImage file in my $HOME/bin directory and created a symbolic link to it called joplin. Since I nearly always have a terminal open, this allowed me to launch the program simply by typing “$ joplin &”.
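Those steps can be wrapped in a small helper function. This is my own convenience wrapper, not something antiX provides; the Joplin filename is the one downloaded above:

```shell
# install_appimage FILE NAME: make an AppImage executable, move it into
# $HOME/bin, and symlink it under a short, memorable name.
install_appimage() {
  chmod +x "$1"
  mkdir -p "$HOME/bin"
  mv "$1" "$HOME/bin/"
  ln -sf "$HOME/bin/$(basename "$1")" "$HOME/bin/$2"
}

# e.g. install_appimage ~/Downloads/Joplin-2.5.10.appimage joplin
# then launch it from any terminal with:  joplin &
```

This assumes $HOME/bin is on your PATH, which most shells set up by default when the directory exists.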

Bottom Line

To be honest, antiX Linux made my day. It’s not often I find myself highly attracted to a new distro, but I enjoy antiX so much I’m going to keep it as the default Linux on my Thinkpad laptop. Due to its speed and lightweight interface, it’s easily one of the top distributions to consider for aging computers. In fact, my Thinkpad has never run better. It’s made a believer of me.

Gene Wilburn is a tech writer and essayist with more curiosity than time.

Back to Bash

How to Make MacOS More Linux-like

Up-to-date Bash shell in MacOS

One of the great things about MacOS is its command line, a terminal onto a Unix-derived set of utilities that are available for free. All you have to do is issue the following command in a Mac’s Terminal application to get a full set of them.

% xcode-select --install

Apple’s official name for these is Xcode Command Line Tools. If you’re used to a Unix-style command line, you’ll feel right at home. Well, almost.

If you come from a Linux background, you will find the tools a bit … lacking. The problem is that many of them are out of date; Apple doesn’t put a high priority on keeping them current. Another problem is that many of the utilities derive from BSD rather than from the GNU versions used by Linux distributions, which means they’re not as feature-rich as their Linux counterparts.

For years, Apple made Bash (the Bourne-Again Shell) its primary CLI, but the version of Bash it installs is badly out of date. The reason is licensing: newer Bash releases are covered by the GPLv3 license, which Apple, being a proprietary company, prefers to avoid. As a result, Apple has recently switched to Z Shell (zsh) as its default shell, leaving an obsolescent Bash in place for scripts that might not work in zsh.

There is nothing wrong with Z Shell. It has a few features that are better than Bash, such as slick directory changes, but Bash users who prefer to stick with Bash and prefer a more Linux-like command-line environment may wish to return to Bash.

Fortunately there’s a way to enjoy the best of both worlds — the lovely MacOS graphical interface that can run programs like Office 365 and Adobe products, as well as having the latest utilities from the Linux side of things.

Meet Homebrew

The secret is Homebrew, which calls itself “the missing package manager for MacOS.” If you know how to use the apt-get utilities from any derivative of Debian Linux, such as Ubuntu or Linux Mint, you will be very comfortable with Homebrew.

You first need to install Homebrew into MacOS, which you do using the Mac’s existing command-line tools by typing

% /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

If you’re running an Intel Mac, Homebrew will put up-to-date open-source utilities in /usr/local/bin. If you’re running one of the new Apple M processors, the utilities will be placed in /opt/homebrew/bin.

Homebrew can then install the latest version of Bash ($ brew install bash). On my M1 MacBook Air, Homebrew’s Bash is version 5.1.8; the version Apple ships is 3.2.57.

One final step is required to use the tools conveniently: the directory holding the new utilities must be added to your PATH environment variable. You do this by updating your ~/.bash_profile settings. Open .bash_profile in the editor of your choice and add the following line for Intel Macs

export PATH=/usr/local/bin:$PATH

or for M-based Macs

export PATH=/opt/homebrew/bin:$PATH

Type $ source ~/.bash_profile and your PATH will be set; from then on, the command line finds the Homebrew versions of the utilities first.
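A quick way to confirm that the shell is now finding Homebrew's tools first (the path shown in the comment is the Apple-silicon case; version numbers will vary):

```shell
# The shell searches PATH left to right, so the Homebrew directory must
# come before /bin and /usr/bin for its tools to win. Verify what runs:
command -v bash              # e.g. /opt/homebrew/bin/bash on Apple silicon
bash --version | head -n 1   # Homebrew's 5.x rather than Apple's 3.2.57
```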

Switching to Bash as Default Shell

Setting Homebrew’s Bash Shell as the MacOS Default

To make Bash your default shell, open Terminal -> Preferences and enter the path to the new Bash: /usr/local/bin/bash for an Intel Mac, or /opt/homebrew/bin/bash for an Apple silicon Mac. (This changes the shell that Terminal launches; to change your login shell system-wide, add the same path to /etc/shells and run chsh -s with it.)

Adding and Removing Applications

Homebrew doesn’t stop with just the utilities. Many of the applications that are available in Linux are also available to the Mac. For instance, if you needed to do some PHP development on your Mac, you could type

$ brew install httpd php

or

$ brew install nginx php

You then have the latest general release of both the web server and PHP (PHP 8 by default).

You can remove applications just as easily, e.g.,

$ brew uninstall nginx php

And you can keep all your apps and utilities up to date by occasionally typing brew update followed by brew upgrade if there are updates to be applied. Notice how parallel the usage is to Linux apt.

If you run Apple silicon, most of the Homebrew utilities have been recompiled for the M-series processors. Those that haven’t run under Rosetta 2, at about the same speed as they do on Intel Macs. The M-compiled ones really zip along.

With your Mac now more Linux-like, you can add your favorite languages in the same way, e.g.,

$ brew install python3

Homebrew takes care of all the dependencies.

Using Brew

Brew Usage

Anyone who has used apt-get in Linux will feel right at home with these parallel Homebrew commands.

One last tip. If you would like a really attractive Bash interface with color coding and built-in aliases like ll for ls -l and la for ls -a, grab the contents of the default .bashrc file from Ubuntu or Linux Mint (or probably any distro that uses Bash as its default shell) and paste it into the top of ~/.bash_profile on MacOS. It transforms your environment from plain jane to Linux cool.

The synergy the Mac gets from these ‘Linux’ utilities and applications will take your command-line computing to a higher level.

Gene Wilburn is a computer generalist, tech writer, essayist, and photographer.

Full-Frame Photography on a Budget: Buy Used

Used Nikon D610 and 3 used lenses

By Gene Wilburn

For someone who cut their photographic teeth shooting with 35mm film cameras and lenses, full-frame digital cameras feel like a homecoming.

It’s not just the impressive image quality that comes from a full-frame sensor—it’s the instincts for the focal lengths you grew up with. A 24mm lens is 24mm with no crop factor. Not 36mm equivalent as with APS-C sensors, nor 48mm equivalent as with M43 sensors. 24mm, the real thing. The same for 50mm, or 85mm, etc. It’s a familiar world with a long and deep history that echoes back from the days of the earliest Leicas.

For someone on a restrained budget, however, full-frame can be a stretch financially. The latest cameras are dazzling things, most often mirrorless, frequently with image stabilization built into the camera body, and superb features, including tilt and swivel LCD panels for convenience when composing shots. Yet if you’re willing to move away from the leading edge of digital photography, you’ll discover that there are interesting deals to be found at the trailing edge.

The Trailing Edge

Think of the camera market as a kind of comet moving through time. The latest and greatest products form the bright head of the comet, its leading edge. The comet is then followed by a long tail, the trailing edge of discontinued models from the most recently discontinued to older models, stretching all the way back to film gear from the last century.

All along the way are deals to be had, for cameras don’t stop working when they’re discontinued. There are full-frame models from a relatively short while ago that were great cameras in their day but are now discontinued. Used ones in good shape have lots of service left in them, and they sell far below the price of the latest models.

Take my own case. I wanted to get back into full frame after having once owned a Nikon D610 DSLR, an intermediate-level body. I sold it when I was downsizing but soon had seller’s regret. So I scoured eBay and found another D610 advertised as being in “mint” condition. You have to be careful buying a camera unseen, but if you check the seller’s feedback you get some idea of how reliable they are.

I ordered my used D610 body from a camera seller in Japan. It meant a bit more spent on shipping and duties, but when the camera arrived it was immaculate. Cost: $675 USD. For $200-400 more I could have purchased a fine D800 body, but as an amateur photographer, I didn’t need all its features.

I knew I wanted a macro lens for it and for years I’ve been a distant admirer of the legendary Tamron Di 90mm macro lens. I found a “mint” one in Nikon mount from another Japanese vendor, for $200 USD.

I already had a 50mm f/1.8 Nikon lens so I had the middle ground covered. What I needed was a wide angle. From my experience with my iPhone (28mm equivalent) I knew I wanted something a little wider, so I located a good-condition Nikon 24mm AF-D lens for $150 USD.

My total base cost was just over $1000 USD for a full-frame camera plus two “new” lenses. Not bad considering a Sony a7C sells new, body only, for nearly $2000 USD and its lenses are expensive.

Recycling at its Best

I’m not a professional photographer and don’t need all the (lovely) bells and whistles of the latest models. And as a retiree on a fixed income, I have to be careful with my spending, so the prospect of recycling some of the slightly older, used gear is financially appealing.

It’s also a nice feeling to give a good used camera a new home rather than allowing it to sit idle on a shelf.

Buying used, as long as you’re careful about it, represents recycling at its best. As you can see from some of the photos I’ve added to this article, all taken from plants in our gardens, I’m a happy camper, very pleased to be photographing with my used gear.

Are You a Dark Mode or a Light Mode Person?

Are you attracted to the Dark Side?

Recently many apps and desktop backgrounds have sprouted a “dark mode” option or theme, reducing the amount of white light that strikes your eyes. It’s a welcome option in Kindle Reader, Apple Books, and Overdrive Media, giving respite to tired eyes especially in the evening when the greatest eye fatigue sets in and the ambient light is more subdued.

Dark mode is now making its way into writing apps, such as my go-to editor, iA Writer, and I, for one, am delighted (pun intended). For long-haul ebook reading and for writing, I prefer dark mode, finding it causes me less eyestrain than light mode.

Dark mode isn’t something new, though. It was a previous age’s standard.

A Little History

If your memory goes back to the 1980s or earlier, you may remember that computer monitors, and terminals before that, had a black background with white, green, or amber characters. This was in the age of command-line MS-DOS, CP/M, Commodore, TRS-80, and other PCs of the era. When colour computing became an option, screens were often dark blue, with white letters — classic WordPerfect colours. Dark mode has a long history.

Things changed abruptly in 1984, when the original Macintosh computer (preceded by graphical Unix workstations) introduced a windowed environment to a mass audience. Soon Microsoft Windows followed, and the new white background with dark letters became de rigueur, proving ideal for desktop publishing and word processing.

We’re now used to seeing our work as a kind of virtual paper with black letters on a white background. This has been the standard for so long that dark mode had largely been forgotten, except by command-line users, many of whom adjust their terminal emulation colours to white on black.

The Dark Way

Some changes are the result of fads, and it has suddenly become stylish to sport dark-mode backgrounds. Apple has taken this a step further by introducing wallpapers that shift from light to dark depending on the time of day. Whatever the reasons, dark mode has again become popular, especially among computer geeks.

Gizmo China ran a poll in 2020 asking which do you prefer: Light Mode or Dark Mode? Approximately 78% of the 562 participants preferred dark mode, 11% preferred light, and 11% preferred “scheduled mode” — light mode during the day and dark mode in the evening.

Which is Better?

“Better” is a subjective term, of course, and for many of us “better” is simply what we’re used to. There have been some studies on this and the answer seems to be “it depends.” When ambient light is high, as it often is during the day in a well-lit room, light mode is easier on the eyes because the pupils are contracted and black on white is easier to see. In the evening, though, when the light begins to fade and the ambient light is less strong, our pupils dilate more and white on black is easier to read for many users. “Scheduled mode,” which you can set up in Apple’s Books app, for instance, is an ideal balance.

As far as I know, no one has done a formal study on whether dark mode conserves battery life on a laptop. It should on OLED screens, where a black pixel is switched off entirely. On a conventional backlit LCD the savings are smaller, because the backlight stays lit regardless of what the pixels display, and powering the screen is where much of a laptop’s battery goes.

With these factors in mind, it might be worthwhile for you to “visit” the dark side to see if it works for you. The only right answer to the question of which is better is this: “Your eyes, your call.”

Create Beautiful Self-Published Books With Free Software Tools

By Gene Wilburn

There are many tools that can be used to write a book and prepare it for self-publication. Microsoft Word, of course, is commonly used for this purpose, and LibreOffice Writer, a free, open-source alternative to Word, is rock solid.

Word processors, however, are not true typesetting systems, though they do a decent job if “good enough” is your aim. If your aim is a little higher, you need to move up a level.

The next level up from word processors is the tier of publishing systems that do a more nuanced, attractive, and professional-looking job of kerning and leading, especially for print books and PDFs with fully justified lines. Adobe InDesign, Adobe Framemaker, QuarkXPress, and Affinity Publisher are commercial products that offer this kind of quality. The open source world offers Scribus, as well as traditional text-based typesetting systems such as troff and LaTeX, two systems frequently lauded for their ability to produce beautiful typesetting. It’s your choice which typesetting program or system to use, but they all have a non-trivial learning curve.

As a self-published independent author myself, as well as being a retiree on a tight budget for software, I’m going to outline a way to produce great-looking books and ebooks using a combination of free tools that work in Windows, MacOS, or Linux. These products and systems are not as widely used as Microsoft Word, and they have a reputation for being “techie,” but I think they’re accessible to anyone who is willing to take on a modest amount of learning.

The tools covered in this article are:

  • Markdown text markup notation
  • Google Docs
  • Pandoc
  • LyX and LaTeX typesetting systems
  • Sigil ebook editor
  • Zotero bibliographic manager

These are the tools I used to produce Shift Happens: Essays on Technology, co-authored with my wife, Marion Turner Wilburn, in 2020, during the first wave of the Covid-19 pandemic. Shift Happens is an overview of many of the technologies of the past century that have shifted our lives, environment, and perceptions. We made it available in ebook, PDF, and printed book formats. Given the subject matter, we wanted to include a “Further Reading” bibliography. We wanted the references in the ebook format to provide hotlinks to the cited sources so that readers could simply tap or click on a link to jump to it. The tools we used made achieving this an easy task.

Markdown Notation

One input, multiple outputs

Markdown is an example of what are called text markup schemes — methods that allow you to add attributes and structure to plain text files, then run them through a document converter that translates the Markdown files into another format, such as HTML. The goal of Markdown is to create one set of master input files, such as chapters of your book, and from those create multiple outputs, whether ebooks, PDF documents, HTML pages, or printed books or reports.

Markdown is simple and easy to learn. For example, surrounding a word or phrase with asterisks, e.g., *italic* produces italic text. Double asterisks around **boldface** produce boldface. Other features follow similar patterns.
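To make that concrete, here is a tiny, hypothetical Markdown chapter written to a file from the shell (the filename sample.md is invented for illustration):

```shell
# Write a minimal Markdown sample to a file (sample.md is a made-up name)
cat > sample.md <<'EOF'
# Chapter One

This sentence has an *italic* phrase and a **boldface** phrase.

- A bulleted item
- Another bulleted item
EOF
```

A converter turns the # line into a chapter heading, the asterisk notation into italic and bold text, and the hyphenated lines into a bulleted list.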

Furthermore, there are text editors that are designed specifically to help you with Markdown. Four of the best known are iA Writer, Byword, Ghostwriter, and Typora. They make italicizing a word or phrase as simple as pressing Ctrl-I, as in a word processor. All four have preview modes.

Although Markdown files are usually created in a text editor, you can use Markdown notation in a word processor such as Google Docs, then export your work as a plain-text Markdown file. We used Markdown this way when we wrote our book.

Google Docs

Work from anywhere

Google Docs is a brilliant collaboration tool. As we were writing the content of our book, we had many writing sessions where we both sat in the same room, each with a laptop on our laps, working back and forth through rough passages. We could see in real time the changes the other was making to the text and we would then decide whether to keep it or modify it further.

What made this possible is that Google Docs is a Cloud-based product that you can access from any browser or Google Docs app. The writing and editing of our book was done from a mix of Windows, Mac, Linux, Chromebook, and iPad computers. Having the content in the Cloud also protects it from computer hard disk failure or any other local calamity. There is comfort in knowing that content remains safe on the Web.

Another thing Google Docs is brilliant at is versioning, which it does automatically. We sometimes decided that we preferred an earlier version of what we had written, and we could go into a file’s document history and recover previous passages easily and painlessly.

We used Markdown inside Google Docs as our master documents for the project and any changes to our chapters were done there and nowhere else. Google Docs can export its files as plain text, and we exported them back as plain-text Markdown files once the chapters were finished.


Pandoc

From anywhere to anywhere

Pandoc is an open-source, command-line utility that is an impressive document converter — a Swiss-army knife that can convert a large number of document formats into other formats. We used it to convert plain-text Markdown files to LaTeX files and HTML files in preparation for final book production. It can even be used to convert Markdown files into other formats, such as Microsoft Word. Pandoc is available for Linux, MacOS, and Windows.

In use, Pandoc is invoked from the command line, as in

$ pandoc chapter1.md -o chapter1.html

which converts a Markdown file to an HTML file, or

$ pandoc -f markdown -t latex chapter1.md -o chapter1.tex

which converts a Markdown file to a LaTeX file with the same base filename.
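For a multi-chapter book, the conversion can be scripted. This sketch uses invented chapter filenames and only echoes the Pandoc command for each chapter rather than running it; remove the echo to perform the actual conversions:

```shell
# Create stand-in chapter files for the sketch
touch ch01.md ch02.md

# Echo the pandoc command that would convert each chapter to LaTeX
for f in ch*.md; do
  echo pandoc -f markdown -t latex "$f" -o "${f%.md}.tex"
done
```

The ${f%.md}.tex expansion strips the .md suffix and replaces it with .tex, so each chapter keeps its base filename.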

LyX and LaTeX

Pretty printing

LaTeX (pronounced LAY-tek) is a rich typesetting system available for all major operating systems. There are distributions of LaTeX available for easy installation in Windows (MikTeX), MacOS (MacTeX), and Linux (TeX Live). Often used for formal academic books, reports, conference proceedings, and theses, it has hooks for creating footnotes, end notes, bibliographical entries, and mathematical equations. Its output is gorgeous.

To be honest, though, LaTeX can be bewildering to a newcomer, and for this reason I highly recommend using LyX, a front-end word-processing-like editor that uses LaTeX as the back end for final output. LyX is easier to use than straight LaTeX — if you can use Word, you can use LyX, which comes with excellent help file documents. LyX, too, is available for Windows, Mac, and Linux computers. LyX and LaTeX were used to typeset the PDF and on-demand print versions of Shift Happens, and using them proved no more difficult than using a graphical DTP package.

In practice I used Pandoc to convert our Markdown text files to plain .tex files, and imported those into LyX. From inside LyX I adjusted margins, spacing, justification, chapter and section numbering, page size, gutter margin, kerning level, bibliography, and table of contents to create a 6×9″ format trade book. I exported the result as a PDF file, ready to read, and also ready to upload to our on-demand book publisher Blurb. You may choose a different publisher such as Kindle Direct Publishing (KDP).


Sigil

Getting it right

An Epub file can best be described as a zip file containing a miniature website. The contents of the zip container are HTML files, maybe some CSS files and some images in its /img directory, plus a manifest that lists all the files and graphics in the publication, as well as containing Epub metadata. If you want to create an Epub, or modify an existing one, you could scarcely do better than turn to Sigil, a terrific, free Epub editor.
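To picture that miniature website, here is a sketch of the kind of file layout found inside the zip container. The book and file names are invented for illustration, and a real Epub has a few more required pieces:

```shell
# Build a mock Epub directory tree (illustrative names, not a valid Epub)
mkdir -p mybook/META-INF mybook/OEBPS/img
touch mybook/mimetype                  # declares the media type
touch mybook/META-INF/container.xml    # points to the package manifest
touch mybook/OEBPS/content.opf         # manifest of all files, plus metadata
touch mybook/OEBPS/chapter1.xhtml      # the actual content, as HTML
touch mybook/OEBPS/styles.css          # optional stylesheet

# List the files as they would appear inside the zip container
find mybook -type f | sort
```

Zipping a tree like this, with the manifest filled in, is essentially what Sigil does for you behind the scenes.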

One part of Sigil is an HTML editor displaying HTML code on the left, and live rendered output on the right, for comparison and direct editing. High-level menu options can be used to adjust heading levels, and attributes such as bold and italic. It is easy to create and test Internet links, and to include graphics. Sigil also makes it easy to split or combine chapters and sections. For Shift Happens we simply imported our individual chapter files in the HTML format created from our Markdown files by Pandoc, made a few adjustments where needed, and saved the results to Epub format. For the Amazon Kindle store, we used KDP to upload the finished Epub to convert to Amazon’s proprietary ebook format. The conversion was perfect.


Zotero

By the way …

If you plan to publish a non-fiction book and intend to include a bibliography or “Further Reading” appendix, it’s useful to use some kind of bibliographic software that will store your references and format them according to one of the bibliographic style sheets that are used in the sciences, social sciences, and humanities. Zotero to the rescue.

Zotero describes itself as a “personal research assistant — a free, easy-to-use tool to help you collect, organize, cite, and share research.” It does a great job at this, allowing you to grow your references as you research topics. It has data entry screens, but best of all it can automatically create entries from a website and format them correctly. It is used directly on a computer and the results are synchronized with the Web version. It is also available as an add-on to most major browsers, and it can be integrated with Word, Google Docs, and LibreOffice.

Zotero is a professional-grade package, up to the task of organizing and exporting references in accepted academic bibliographic citation styles. If all you need to do is create a simple bibliography and don’t need all of Zotero’s bells and whistles, you can use its simplified Web-based sister product, ZoteroBib.

Your Turn

The devil is in the details

Assembling these tools in Windows, MacOS, and Linux — downloading them for use on your book project — is as straightforward as any software installation. Although it may sound complicated to use several packages instead of just one or two, like Word and InDesign, the workflow smooths out as your familiarity with the software grows. Nonetheless, there is a learning curve involved and all the packages require attention to detail. The shift, for many users, is to see your book as a logical structure rather than a visual one. Once you get the structure of your book down, the software produces visually beautiful output. The beauty of these products is that they work with anything from a simply structured novel to a complex academic book. Best of all, the products are free.

Gene Wilburn is the author of Northern Journey: A Guide to Canadian Folk Music, Recreational Writing, Markdown for Writers, as well as co-author of Shift Happens. He has also written dozens of articles, essays, and reviews, primarily on computer technology.

F. Scott Fitzgerald vs. S.S. Van Dine: A Vocabulary Bakeoff

An Exploration in Natural Language Processing

By Gene Wilburn

F. Scott Fitzgerald / “S.S. Van Dine”

Just as I began learning basic techniques for natural language processing (sometimes called “computational linguistics”) in the Python programming language, I read that F. Scott Fitzgerald’s Great Gatsby had been released into the public domain. As an English major (B.A., M.A.) who had pivoted into a career in IT, this attracted me like a folksinger to a new acoustic guitar. I knew I had to try out my new licks on Gatsby, so I downloaded the plain text version of the novel from Project Gutenberg.

As is the case with most lit majors, I’m a word addict, so I thought I’d pull out some of Fitzgerald’s vocabulary to see if it was in any way exceptional — meaning the words he used, not the deft way he put them together in his classic American novel. This was not intended as a form of literary criticism. It’s more of a word-watcher’s curiosity about how a gifted writer used his word hoard.

By extracting all the individual words from the plain text form of the novel, then running them through a stop list of words like a, an, and, or, but, the, etc., I had a working list of relevant words. Using the resources of the excellent NLTK (natural language toolkit) module (NLTK Book), I was able to prepare a frequency distribution list that could be used to highlight the most frequently used words as well as those used only once or a few times.

It turned out that the words Fitzgerald used most were not particularly interesting or enlightening. Examining the words used 50 or more times, one sees common words like I, she, he, said, Gatsby, Tom, Daisy, you, house, car, get, and something. Nothing particularly inspiring.

It then occurred to me that it might be much more interesting to look at Fitzgerald’s least used words, hoping to find some fancier words he used only occasionally. In addition to its .FreqDist() method, NLTK also has a method called .hapaxes(). This derives from the Greek expression Hapax legomenon meaning, literally, “something said only once.” The method, appropriately, flags all words used only once in the novel.

This immediately produced more interesting results, as varied as adventitious, amorphous, aquaplanes, vestibules, and wall-scaling, along with common words used only once. By experimenting with selecting various degrees of frequency, I found that the most interesting all-around list was obtained by including all words used three times or fewer.
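The essay's procedure used Python and NLTK, but the underlying idea, counting word frequencies and keeping only the rare words, can be sketched with standard Unix tools. The input file here is a toy stand-in, not the actual novel:

```shell
# Toy input standing in for the novel's plain text
printf 'old sport old sport old sport old sport meretricious beluga\n' > gatsby.txt

# Split into one word per line, lowercase, count, keep words used 3x or fewer
tr -cs 'A-Za-z' '\n' < gatsby.txt \
  | tr 'A-Z' 'a-z' \
  | sort \
  | uniq -c \
  | awk '$1 <= 3 { print $2 }' \
  | sort > rare-words.txt

cat rare-words.txt
# → beluga, meretricious (old and sport appear four times, so they drop out)
```

Changing the threshold in the awk condition is the equivalent of experimenting with different degrees of frequency.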

Although this was interesting, it seemed to me that it would be doubly interesting to a word hound to compare The Great Gatsby with another novel of the same period. I thought of Hemingway, since Fitzgerald and Hemingway were friends, but Hemingway’s use of simple vocabulary makes him less interesting in terms of the actual words he employed.

Gatsby was published in 1925. By coincidence I had just finished reading a 1926 murder mystery novel called The Benson Murder Case, by S.S. Van Dine, the pseudonym of American art critic and writer Willard Huntington Wright, an erudite writer whose amateur detective, Philo Vance, was at one time highly popular with readers and was featured in several Hollywood films. Vance, a kind of American version of the British sleuth Lord Peter Wimsey, was played in films by actors William Powell (before his Nick Charles period), Basil Rathbone, and Edmund Lowe (Wikipedia, “S.S. Van Dine”). Nick Carraway rubbed shoulders with the rich; Vance was a member of wealthy New York society and an art collector, and, as such, had a remarkably sophisticated, at times foppish, vocabulary. The novel sent me to the dictionary several times to look up new words.

I purchased an Epub edition of S.S. VAN DINE Premier Collection: Thriller Classics, Murder Mysteries, Detective Tales & More, extracted the text of The Benson Murder Case, and put it through the same lexical procedures as Gatsby, likewise limiting the word list to words used three times or fewer. I then converted all the words to lower case, alphabetized both lists, and compared the two lists using a Unix/Linux word utility called comm, which put the results in three columns: words used only by Fitzgerald, words used only by Van Dine, and words used by both. The full list is here.
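comm expects two sorted files, one item per line, and is simple to demonstrate. In this sketch the two word lists are tiny stand-ins for the real Fitzgerald and Van Dine lists:

```shell
# Toy alphabetized word lists, one word per line
printf 'asunder\nbeluga\nvestibule\n' > fitzgerald.txt
printf 'argot\nbadinage\nvestibule\n' > vandine.txt

# Column 1: only in Fitzgerald; column 2: only in Van Dine; column 3: both
comm fitzgerald.txt vandine.txt

# The -12 flag suppresses the first two columns, leaving only shared words
comm -12 fitzgerald.txt vandine.txt
# → vestibule
```

With these toy lists, vestibule lands in the third column because both authors use it, mirroring the way the word appears in both lists below.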

I then imported the list into Google Docs and exported it as an Epub file that I loaded into Apple Books on my iPad. This allowed me to do a leisurely read through the list and highlight words from each author that struck me as being “interesting” and at least slightly out of the ordinary. When I had finished scanning, I manually copied the results for each author into the listings below:

Great Gatsby (1925)

abortive, adventitious, aluminium, amorphous, aquaplanes, araby, asunder, beluga, cahoots, caravansary, caterwauling, chartreuse, coney, convivial, crêpe-de-chine, debauchee, demoniac, dilatory, distraught, divot, dog-days, duckweed, echolalia, ectoplasm, euphemisms, expostulation, fishguards, flounced, foxtrot, fractiousness, grail, harlequin, holocaust, hornbeams, hors-d’oeuvre, humidor, inconsequence, inessential, jonquils, juxtaposition, knickerbockers, lustreless, meretricious, nonolfactory, obstetrical, pasquinade, petrol-pumps, plagiaristic, platonic, pneumatic, portentous, postern, prig, probity, rot-gut, rotogravure, sea-change, sheik, somnambulatory, staid, substantiality, subterfuges, teutonic, vestibule, wall-scaling, whitebait

Benson Murder Case (1926)

a-flutter, a-kimbo, acerbities, adipose, amasis, animadversions, approbation, aquiline, argot, arrentine, astigmatic, badinage, ballyrag, bezique, bisonic, brachycephalic, bunjinga, burglarious, casuistic, champêtre, chef-d’œuvre, cinquecento, cloisonné, confab, contretemps, craniological, darwinian, davenport, deltoids, derring-do, déshabillé, diatonic, discommode, disharmonious, dissolution, dolichocephalic, dulcet, dyspnœa, ebullition, embayed, emulsification, endocrines, factitious, factotum, flâneur, flummery, forensic, garrulous, gewgaws, halcyon, hauteur, hedonist, helixometer, hirsute, impecunious, imputation, inamorato, infinitesimal, ingress, inspissated, joss-sticks, juxtaposition, lambrequin, leptorhine, lèse-majesté, lineaments, loquacious, lugubriously, mandragora, mêlée, mellifluously, mock-turtle, modish, moue, myrmidons, obduracy, orthognathous, oubliettes, palaver, peccadilloes, perfeccionados, perspicacious, phrenologist, platitudinarian, plebeian, polychrome, popinjay, prognathous, protasis, puerility, quavering, quixotic, rapprochement, ratiocination, redolent, remonstrances, repine, reproche, rubicund, sabreur, sardonic, sententiously, sequester, smouldering, sobriquet, soirée, somnolently, soupçon, stertorous, suave, sybarite, sycophant, syllogism, tenter-hooks, tête-à-tête, teutonic, tonneau, totemistic, triptych, truculent, tutelary, twitted, ventral, vestibule, viscid, vitiated, vituperation, vortices, what-for, whirlin’-dervish

There are a couple of things one might conclude from comparing the lists. The first is that F. Scott Fitzgerald did not use an especially challenging vocabulary for The Great Gatsby. This makes the novel suitable for readers of younger ages, say high school or first-year university students. The second is that you don’t need fancy words to create a masterpiece. Gatsby has stood the test of time.

S.S. Van Dine, though once highly popular, has faded into relative obscurity. Part of that may be attributed to his more challenging vocabulary and part to the writing itself, which is slow-paced for a detective novel.

Those of us who are addicted to detective fiction are used to authors with large vocabularies and, in fact, if a work of detective fiction doesn’t offer some word challenges, it’s disappointing to the reader.

The bottom line: S.S. Van Dine walks away with the prize for most interesting vocabulary, while F. Scott Fitzgerald walks away with a literary masterpiece. A generous reader can enjoy both.

Minimalist Writing Devices, #3: Raspberry Pi 400

By Gene Wilburn

My Covid-era 2020 Christmas present to myself was an eye-catching red and white keyboard with a computer inside: a Raspberry Pi 400. Like a 1980s-vintage Commodore 64, all it needed was a cable connection to my monitor and I was sitting in front of a fully operational Linux computer. Cost: $70 US for the unit alone, or $100 for a complete kit that includes the keyboard/computer, color-coordinated mouse, HDMI video cable, and a book, The Raspberry Pi Beginner’s Guide.

As a writer, I’m fascinated by low-cost, minimalist writing devices, and the Raspberry Pi 400 (RPi 400) delivers more power per dollar than any computing device I’ve yet encountered. Let’s take a look.

Introducing the Raspberry Pi 400

What you get in a Raspberry Pi 400 is not just an attractive keyboard, but a full 64-bit ARM CPU computer inside, with 4GB RAM, a microSD slot to store the operating system and local data, 2 micro-HDMI ports, 1 USB-2 port, 2 USB-3 ports, a USB-C port for power, a Gigabit Ethernet port, built-in WiFi and Bluetooth, and a GPIO (general purpose input output) 40-pin port.

The GPIO port is for makers and experimenters — those who create things such as robots and robotic structures, specialty electronic circuit boards, art and light installations, and much more. To this crowd the Raspberry Pi is at the heart of many a specialty project. For them Raspberry Pi is as common a brand name as Dell, HP, Lenovo, Acer, or Asus to most home computer users. Chances are you’ve not heard the Raspberry Pi name bandied about much in writing circles … yet.

With the RPi 400 that may be about to change. This is the first Raspberry Pi model that is a ready-to-boot-and-use Linux computer with appeal beyond its usual user base. I can see parents picking up one or two of these for their kids. It’s an inexpensive and great way for anyone who has heard of Linux, but may have been shy about trying it, to get a hands-on introduction. The purpose of this review is to examine this device as a potential minimalist writing tool that could be used by someone with no previous experience with a Linux computer.

Setting Up the Unit

The RPi 400 arrives with a 16GB microSD card inserted, ready to boot up Raspberry Pi OS as soon as you add a monitor or TV, and a USB mouse for convenience. The first time you boot the system it prompts you for your country, language, time zone, and a new password. The RPi then scans for a WiFi connection and prompts for its password. 

Once set up, the interface looks similar to Windows or MacOS, with the task bar at the top instead of the bottom. Navigation is simple: click on the red raspberry icon in the top left corner to display a menu from which you may launch any of the included programs or apps. The RPi 400 comes loaded with programming editors, text editors, and the LibreOffice suite, which includes a Word-like word processor. The default browser is Chromium, the open-source version of Chrome. A file manager allows you to browse through your folders to copy, move, delete, or select files. The operations are intuitive and familiar to any Windows or Mac user.

And that’s it! You’re ready to write.

The RPi 400 as Writing Device

Because I use Google Docs for much of my writing, I fired up Docs for this review and found the RPi a very comfortable device to work with. The keyboard is full size, minus a numeric keypad. Because it’s weighted with a computer inside, it has enough heft to feel solid as you work. The keys are well spaced and the layout is normal, with well-positioned arrow keys in the lower right-hand corner.

At this price you don’t get a first-class keyboard, but it’s completely serviceable. The one caution with the keyboard is that you occasionally get keyboard bounce — two characters appearing with one press of a key. The bounce is infrequent enough that it’s not a show stopper, but you need to keep an eye on the output for occasional misbehaviour. Some of the bounce may be determined by your touch on the keys. I’m a heavy-handed typist, raised on upright typewriters and the original IBM PC keyboards.

The RPi 400 is not a speed demon. It has enough zip that it doesn’t lag while you type but it’s not a sports car. It’s more like a cute VW Beetle with rear engine. Fun to use and it gets you there.

Who is the Raspberry Pi 400 for?

The RPi 400 is a variant of the small Raspberry Pi 4 used in maker projects. As such it will certainly be of interest to makers and experimenters, but putting the computer inside the keyboard opens the device to a much wider audience.

Parents can purchase this unit for their kids as a way to learn programming, or just for general use. It’s a little sluggish on websites that include heavy graphic material but that’s to be expected.

Writers may be interested in this unit if they’re in need of a cheap computer and already have a monitor or HD TV it can attach to. At this price, it could serve as a complementary machine to a laptop or tablet, or even a unit you might want to leave at a site you visit regularly, such as a cottage or other external location.

Overall, the Raspberry Pi 400 is cute, highly usable, and cheap. For most writers I would recommend the $100 kit over the $70 standalone model. The kit comes with a matching USB mouse plus the critical HDMI video cable.

The Command Line

Although you don’t need to know much about the included terminal app, which is similar to the Windows Command Prompt and nearly identical to the Mac Terminal program, you will need to use the command line occasionally to make certain your software is up to date. This is done by starting the terminal and typing the following two lines at the command prompt:

$ sudo apt update

$ sudo apt upgrade

Running this once a week or so will keep the Raspberry Pi 400 software and operating system up to date with the latest upgrades and security updates.

Bottom Line

As you can tell, I’m enthusiastic about the Raspberry Pi 400 as an inexpensive, minimalist writing device. The bang for the buck is incredible and there’s nothing difficult about using a Linux computer for writing. All the usual amenities are here, packed inside a keyboard. The unit, while easy enough to carry to other locations, is not a portable. This is a small desktop computer waiting for you when you’re ready to create the next best seller. Happy typing!