Pierre MacKay

[done in early 2007]

Pierre MacKay (1933–2015) was a classics professor who was a TUG board member from 1983 to 1991, TUG president from 1983 to 1985, and Unix site coordinator from 1983 to 1992. He was with TUG from the start.

Unfortunately, long-term commitments prevented him from completing this interview. Barbara Beeton wrote an additional note at the time of his death.


Dave Walden, interviewer:     Please tell me about yourself independent of the world of TeX.

Pierre MacKay, interviewee:     My training and my official teaching career are those of a humanist, but I had become intrigued by the notion of the computer even before I joined the faculty of the University of Washington to teach Classical Greek, Arabic, and Ottoman Turkish. In those days (1966) the only thing ever suggested to a humanist with an interest in computing was a KWIC concordance, and I was led to consider such a project, but in Arabic, not in English. I learned a little Snobol, in hopes that a string-based language might offer better capabilities than assembler or Fortran, but I had not proceeded very far when I realized that there was no available form of output that would be intelligible to the kind of scholar who might find a concordance of Hariri useful. The programming consultants for the CDC 6400, which was then used as the general-purpose academic research machine, assured me that its six-bit character set would never be able to handle Arabic and, by good fortune, I listened to them.

On the advice of a student, I stepped across the hall to the newly established Computer Science laboratory with its brand new Sigma 5 machine — an entire 16,000 32-bit words to play with; I remember the excitement when memory was increased to 24,000 words. Over the course of about six years, I put together two major programs, learning each technique directly from the graduate students I would sometimes have to evict from the Sigma 5 because I needed almost the entire memory, in addition to overlays called in from one of the earliest disk storage systems. One program (HATTAT) described font characters in the format used by the VideoComp phototypesetter, an American version of the Hell Digiset made by RCA and later taken over by Information International Inc. (III).

It was clear after the very first experiments that the context-sensitive fluidity of Arabic character shapes could best be provided for by making up a limited number of stroke components and assembling them into fully-formed characters on the fly. This approach has been adopted by many later systems, and was also used in the 1960s by the Caldwell Sinotype for Chinese ideographs.
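[To make the stroke-assembly idea concrete, here is a minimal Python sketch of composing a full character from a small inventory of reusable stroke components placed at offsets. The component names, shapes, and bitmap grid are invented for illustration; HATTAT itself described characters in the VideoComp format, and its code did not survive the Sigma 5.]

```python
# A minimal sketch of assembling full characters from a small inventory
# of stroke components, in the spirit of HATTAT.  The component names,
# shapes, and offsets here are invented for illustration.

# Each component is a set of filled cells in a tiny bitmap grid.
COMPONENTS = {
    "base":  {(x, 0) for x in range(5)},   # horizontal baseline stroke
    "tooth": {(0, 1), (0, 2)},             # short vertical rise
    "dot":   {(0, 4)},                     # diacritical dot
}

def assemble(placements):
    """Compose a glyph bitmap from (component, x_offset, y_offset) triples."""
    glyph = set()
    for name, dx, dy in placements:
        glyph |= {(x + dx, y + dy) for (x, y) in COMPONENTS[name]}
    return glyph

# An invented "letter": baseline, one tooth near the start, a dot above it.
letter = assemble([("base", 0, 0), ("tooth", 1, 0), ("dot", 1, 0)])

# Render the composed bitmap, top row first.
for y in range(4, -1, -1):
    print("".join("#" if (x, y) in letter else "." for x in range(5)))
```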

Since it was obvious in 1967 that no existing composition system could possibly manage the task of assembling the stroke components into letters, I developed a second program (KATIB) to take in a stream of quasi-alphabetic character codes (including many digraphs to extend the Latin letter punch card alphabet) and to replace them with strings of font glyphs and command sequences for positioning adjustment. In the most extreme instances it proved necessary to remember as many as eight preceding input characters and to look three characters forward in order to shape the desired Arabic character correctly. This led me to write what I have since characterized as an ad-hoc, poorly structured, kindergarten version of TeX. It even had a macro interpreter.
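[A hedged sketch of the kind of context-sensitive replacement KATIB performed: each input letter is mapped to one of four positional glyph forms by inspecting its neighbours. KATIB's real rules needed as many as eight characters of memory and three of lookahead; the Python below consults only the immediate neighbours, and its letters and joining table are stand-ins, not KATIB's rule set.]

```python
# One-neighbour contextual shaping: choose an isolated / initial /
# medial / final form from joining behaviour.  The letters and the
# joining table are stand-ins, not KATIB's actual rules.

# Letters that never join to the letter after them.
NON_JOINING_AFTER = {"a", "d", "r", "w"}   # stand-ins for alef, dal, ra, waw

def shape(word):
    """Return (letter, positional form) pairs for one word."""
    shaped = []
    for i, ch in enumerate(word):
        # Does the previous letter connect forward into this one?
        joins_before = i > 0 and word[i - 1] not in NON_JOINING_AFTER
        # Does this letter connect forward into the next one?
        joins_after = i + 1 < len(word) and ch not in NON_JOINING_AFTER
        if joins_before and joins_after:
            form = "medial"
        elif joins_before:
            form = "final"
        elif joins_after:
            form = "initial"
        else:
            form = "isolated"
        shaped.append((ch, form))
    return shaped

print(shape("bksa"))
# [('b', 'initial'), ('k', 'medial'), ('s', 'medial'), ('a', 'final')]
```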

This program, which, so far as I have been able to determine, was the first fully automated digital composition system for Arabic script, succeeded in the production of a single book, Diocles, On Burning Mirrors, but it had necessarily been written very close to the architecture of the Sigma 5, and when that machine was decommissioned, so was my program.

DW:     How did you first become involved with TeX?

PM:     After the disappearance of the Sigma 5, I spent some years of ineffectual floundering (including a foredoomed attempt to return to the CDC 6400) until I learned from Rick Furuta about TeX. (I was flattered to discover, when I wrote to Don Knuth, that he knew about HATTAT and KATIB.) It was obvious at once that TeX and METAFONT were the well-designed replacements for my haphazard programs, and Rick and I set to work on introducing TeX to the University of Washington. We worked first on the VAX version, and also on the original SAIL version of METAFONT, which we ultimately used to drive one of the three (or was it four?) Alphatype phototypesetters that ever produced TeX output. We watched, although we never participated in, the brief development of the XDS printing system, and we wrote drivers for a wet toner 200 dpi device. (I still use surplus rolls of the paper from this device as a very superior shelf paper.) I was out of the loop in Greece for the academic year 1978–79, but when I returned, I enthusiastically rejoined the TeX community.

DW:     Did you start using TeX78 and then phase over to use of the “final” version? If so, can you tell me a bit about this transition in the TeX community, or at least at the University of Washington?

PM:     When Rick Furuta and I began our association with TeX, TeX78 was the only working program available. We were fortunate enough to have access to a good-sized DEC-20, so we were able to import a SAIL compiler, and to run both TeX and METAFONT. The only alternative I can remember was PascalTeX, which was a moderately successful attempt to translate TeX78 to a more accessible platform. In addition to the DEC-20, we had access to several different sizes of VAXen, which were being earned by the Computer Science department through a commitment to develop a Pascal compiler for VMS.

TeX78 did not make many converts at the University of Washington, owing less to the characteristics of the program itself than to the lack of acceptable output devices. These were the days of strict vertical integration in the printing industry. At the low end there were impact printers with more or less proprietary print-balls and print-wheels, and at the high end the digital expression of font characters was still in its infancy. The main University printing department was still using photo-plates, which cost horrendous amounts for any new character set and were neither accessible nor of interest to an academic research department. Research papers were produced with Scribe and similar formatters on various impact printers, and refinements such as variable-width characters were generally regarded as frivolous extravagance. High-resolution digital output was available only on colossal image-setters whose font format was a precious trade secret. There do not seem to have been many of these in Seattle anyway until well into the 1980s, and they would not have been made accessible to a program like TeX, which made a speciality of providing “non-standard” characters. To give an idea of the climate of the time, consider that the VideoComp font formats had somehow slipped into the public domain, but neither RCA nor III could ever resign themselves to it. At one time I tried to work on a METAFONT to VideoFont converter, and would send down occasional tapes to III in Los Angeles. I got reasonably good images back, but the company invariably erased all traces of the font I had created from my tapes.

Image-setter companies such as Compugraphic claimed, truthfully, I am sure, that they regularly sold their machines at a loss so as to create an appetite for fonts. If one happened to need a character not in the existing repertory, they could consider producing it, but no one in the academic world could possibly afford it. Movable lead type had been rather a democratic technology, if you could afford the labor and the lead. For about five decades, as lead was disappearing, typesetting moved backward into a feudal world.

The development of PostScript as a machine-independent way of managing both fonts and layout re-democratized the industry, especially after Adobe somewhat unwillingly released PostScript specifications to the public. The first 300 dpi LaserWriter (Apple's largest computer at the time) changed many minds about the possibilities of TeX, and the discovery that the Canon print engine in the LaserWriter was actually a 600 dpi machine made it possible to produce near book-quality output. I used to produce 600 dpi output with TeX at \magstep1, which could then be photoreduced to an effective 720 dpi, a resolution used by at least one low-end digital image-setter.
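[The arithmetic behind that trick, for readers who have not met \magstep1: each magstep scales all dimensions by a factor of 1.2, so a page set at \magstep1 on a 600 dpi engine and then photographically reduced back to true size packs 1.2 times as many dots into each final inch:]

```latex
% \magstep1 corresponds to a magnification of 1.2;
% photoreduction by 1/1.2 restores true size.
\[
  600\ \mathrm{dpi} \times 1.2 = 720\ \mathrm{dpi}
  \quad\text{(effective resolution after reduction)}
\]
```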

None of this development was obvious in 1979, and we certainly had no notion how fast it would come about, so the potentials of TeX were poorly understood. Knuth managed to talk Alphatype Corporation into allowing him a restrictive license to code directly for the Alphatype, and he once said that writing Alphatype programming was the worst task he ever undertook. Unlike other image-setters, this Rube Goldberg device traced a single, notionally 5500 dpi character at a time on a small television screen and moved lenses, mirrors, and the photosensitive medium itself back and forth and up and down, by means of a complex assortment of worm gears that whined rhythmically, playing what David Fuchs called the [Alphatype] beguine. You had to perforate each tabloid-sized sheet, fasten it accurately onto a pin register, and wait around in the darkroom, while the characters of one page, or anything up to 16 pages, were individually flashed onto the paper, then pull it out and shove it into the developer tank at once. The small television screen was unable to image a single character much larger than 16pt. For larger characters Knuth devised a way of slicing them into quadrants to be reassembled precisely by the machine. When I finally got to run one of the four or five Alphatypes that ever produced TeX output, I was unaware that if I used this feature for titles, I had to turn it off explicitly, so I ended up setting an entire book in broken 11pt type. This was interesting, but not beautiful, and it increased the length of each “beguine” by a factor that felt like 10. I think only one person other than myself ever used the Alphatype.

Rick and I wrote one of the low-end output drivers for the 200 dpi liquid toner Versatec, and we won a few converts to typesetting with that device, but it was the LaserWriter that changed most attitudes to TeX. By the time it was available, the translation to TeX82 and METAFONT84 was complete.

