Joe Weening used TeX78 when he was first at Stanford, participated in the transition to TeX82, and continues to use (La)TeX today.
Dave Walden, interviewer: Please tell me about your life before you arrived at Stanford and how you came to be at Stanford.
Joe Weening, interviewee: Growing up in the 1960s and 70s, I did well in math and science, and taught myself computer programming in high school. As an undergraduate at Princeton, I majored in math but also took lots of computer science courses and decided that my career should be in that field. I was accepted to Stanford's PhD program in computer science in 1980.
DW: How did you come to be involved with TeX at Stanford?
JW: Don Knuth created the first version of TeX in 1978, two years before I arrived. By 1980, many professors and grad students in the department were using it to write their papers. Don also taught an intensive computer programming class for first-year PhD students, and told us about TeX, although it wasn't a major aspect of the class.
I've always been interested in typography. (In junior high school, one of my favorite classes was “print shop”, where we learned to set metal type by hand.) So it was natural for me to learn TeX during my first year at Stanford. My research advisor was John McCarthy (a pioneer in artificial intelligence and the inventor of the Lisp programming language), and as a member of his group I used the SAIL computer (a DEC PDP-10 running the WAITS operating system) and learned the SAIL programming language, in which TeX78 was written.
While learning TeX I became a bit frustrated that TeX macros were so cumbersome as a programming language. I had been learning Lisp, so I decided to use it as an extension language for TeX, and called the resulting system Lisp-Augmented TeX: LATeX. Several years later Leslie Lamport heard about this and asked if I minded that he'd chosen the same name. My system had not caught on, so of course I said there was no problem.
DW: When I Google on “LISP TeX LaTeX” I see that people are still working on combinations of Lisp and TeX. Can you say a word about how you combined Lisp with TeX in your LATeX?
JW: If I recall correctly, my system let you put @(Lisp code) anywhere in your document. The Lisp code would output a string that would become part of the TeX input. The Lisp interpreter would maintain its state, so you could define functions, change global variables, etc.
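[The original LATeX code is not available, but the mechanism Weening describes can be sketched. The following is a hypothetical illustration in Python standing in for Maclisp: each @(...) escape is evaluated by an interpreter that keeps its state between escapes, and the resulting string is spliced into the TeX input. The class name and handling of definitions are this sketch's inventions, not Weening's design.]

```python
import re

class LispAugmentedTeX:
    """Toy preprocessor in the spirit of Weening's LATeX: Python
    stands in for the Maclisp interpreter of the 1981 system."""

    def __init__(self):
        self.env = {}  # interpreter state persists across @(...) escapes

    def expand(self, source):
        def replace(match):
            code = match.group(1)
            try:
                # An expression: its value becomes part of the TeX input.
                return str(eval(code, self.env))
            except SyntaxError:
                # A statement (e.g. a definition): run it for effect only.
                exec(code, self.env)
                return ""
        # Note: this simple pattern does not handle nested parentheses.
        return re.sub(r"@\((.*?)\)", replace, source)

pre = LispAugmentedTeX()
print(pre.expand(r"@(n = 3)The answer is $2^{@(2**n)} = 8$."))
# -> The answer is $2^{8} = 8$.
```

The key point of the design is the persistent interpreter state: an earlier escape can define variables or functions that later escapes use, which is exactly what TeX78's macro language made cumbersome.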
I hadn't known about the newer systems you mentioned, but I see that some are based on Emacs Lisp. Emacs Lisp is a great piece of software technology but it didn't exist in 1981, so my system used Maclisp. (Common Lisp didn't exist yet either.)
DW: Please excuse my interruption.
JW: No problem; I think that about covers my first year at Stanford. I spent the summer of 1981 at IDA in Princeton, New Jersey, where a lot of math papers are written. I knew they didn't use TeX yet, so I brought along with me a magnetic tape (one of those big spools you now see in museums!) containing PTEX, a portable version of TeX written by Ignacio Zabala using DOC, the precursor to the WEB literate programming system [and unrelated to the modern pTeX in Japan]. The output of DOC was a Pascal program, which was portable to many more systems than the SAIL version of TeX.
Almost no one had desktop workstations yet. Everything ran on timesharing systems, and the system at IDA was a Cray-1: an unusual machine to run TeX on! There was one problem: the Pascal compiler they had for the Cray-1 had some bugs, so even after I got TeX to compile, it still didn't run correctly! I spent most of the summer debugging the compiler, but got TeX working before I left. I also wrote a DVI driver for a printer they had, which made efficient use of the Cray-1's vector instructions.
When I returned to Stanford that fall, I began my next TeX-related project: a DVI driver for SAIL's DataDisc displays, called DVIDD. Using a fair amount of PDP-10 assembly code, this program could render DVI files almost as fast as a modern workstation. I believe this was the world's first DVI previewer, and it became quite popular. Many people kept using it into the late 1980s, even after xdvi and similar programs appeared.
That fall, as is well known, Don Knuth began the redesign of TeX, now known as TeX82. He convened an advisory group of mostly grad students. We'd meet about once a week. Don would tell us what he was thinking about or working on, and ask for our comments.
DW: I remember being told in another interview that Frank Yellin and Howard Trickey also participated in that group.
JW: It was quite an experience to witness his creative process. At some point, when enough decisions had been made, he began writing code. He was also developing WEB at the same time, based on Ignacio Zabala's experience with DOC and PTEX. I don't quite remember the order of events, but at some time in the spring of 1982 Don told us he'd finished writing TeX82 in WEB, gotten it to compile, and removed as many bugs as he could find. He told us he was going to take a week of vacation and asked us to test it while he was gone.
There was only one problem: we had no macro packages! For TeX78 there was BASIC.TEX, and several extensions of it such as ARKTEX by Arthur Keller, but these wouldn't work because of the many incompatible changes made in TeX82. So, to avoid disappointing Don, I spent several intense days translating BASIC.TEX into TeX82. He used this as the starting point for TeX's PLAIN format, and there are still a few lines of my code in PLAIN.TEX.
DW: Can you remember an example of the kind of thing Knuth might describe to his group of grad students and the kind of feedback the students gave back to him?
JW: Here's one example: a \mathchar in TeX78 is 9 bits, but in TeX82 it became 15 bits. Before finalizing that change, Don described it to our group, and we commented on whether it met the requirements that led to the change, and whether we could foresee any problems that might result from the new definition.
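[For readers unfamiliar with the encoding: in TeX82 the 15 bits of a \mathchar pack three fields, class (3 bits), family (4 bits), and character code (8 bits), so the value is class*4096 + family*256 + code, conveniently written as four hexadecimal digits. If I am reading plain.tex correctly, it uses this encoding throughout; for example:]

```latex
% 15-bit \mathchar in TeX82: "CFNN = class C, family F, char code "NN
\mathchardef\sum="1350   % class 1 (large operator), family 3, char "50
\mathcode`\<="313C       % class 3 (relation), family 1, char "3C
```

TeX78's 9-bit \mathchar had no room for fields like these, which is the kind of requirement that drove the change the group discussed.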
DW: You mention witnessing Knuth's creative process…
JW: The thing I remember most was how focused he was on one project at a time. He's not a multi-tasker like many of us are. And for programming projects he used a very “top-down” style of design and implementation. I believe he wrote down all of TeX with pencil and paper before typing it in, and finished typing it in before trying to compile it.
Although he likes to get feedback on his ideas from his colleagues, I think Don's actual creative moments are mostly solitary. That's a very personal issue — some people thrive on interaction, while others get their best ideas by concentrating.
I was also Don's teaching assistant in the fall of 1981 for the programming course for new PhD students. This had little to do with TeX (except for the course notes, which of course we wrote in TeX), but it gave me another exceptional opportunity to know Don and learn from him as a teacher and writer.
DW: Please tell me about your activities since leaving Stanford.
JW: I've worked at IDA's center in La Jolla, California, since 1990, mostly as a researcher but also as deputy director from 2005 to 2009. Being a manager was interesting, but eventually I decided to return to research work, which I find more satisfying. The work at IDA is fascinating and suits me well, since it involves both math and computer science.
DW: Do you still use (La)TeX? In any case, what is your thinking about it 30 years after you were part of its creation at Stanford?
JW: All of my center's papers are typeset with LaTeX; it's a critical part of our environment. In addition to using it as an author, I maintain some local macros and keep our TeX software up-to-date.
When I left Stanford in 1990 my belief was that by 2000, someone would develop a new system that would make TeX obsolete. Clearly that hasn't happened. It seems that every five years or so, TeX is adapted to work with a major new technology, such as PostScript, PDF, Unicode, HTML, and XML, and I'm sure this trend will continue.
I'm still hoping for a system that will better integrate these technologies, be easier to use than TeX and LaTeX, and still produce high-quality output. Maybe that's asking too much, but I think it's a worthy goal to work toward.
DW: Thank you, Joe, for participating in our interview series. I look forward to meeting you in person at the TUG 2010 conference in San Francisco.