Christopher Allan Webber

Why is GNU not a lisp system?


Sometimes I wish GNU really weren't Unix, more so than it isn't currently. Specifically, I wonder what the alternate reality would be where GNU was a lisp machine instead. After all, lisp machines could run a variety of languages, RMS had experience developing them, and Emacs is practically a mini lisp machine fit to run under other operating systems anyway.

I remember reading RMS saying he regretted that the operating systems he had worked on became obsolete when the systems they ran on were deprecated. At the time, lisps weren't portable (though they would become more so later), and so IIRC he said he had heard Unix was portable; he had not run Unix yet, but thought it looked like a reasonable enough design.

So, as a distraction while hacking on soci.el tonight, I was reading through the first GNU bulletin, and here's a relevant excerpt:

         Why a Unix-Like System?

It is necessary to be compatible with some widely used system to give our system an immediate base of trained users who could switch to it easily and an immediate base of application software that can run on it. (Eventually we will provide free replacements for proprietary application software as well, but that is some years in the future.)

We chose Unix because it is a fairly clean design which is already known to be portable, yet whose popularity is still rising. The disadvantages of Unix seem to be things we can fix without removing what is good in Unix.

Why not imitate MSDOS or CPM? They are more widely used, true, but they are also very weak systems, designed for tiny machines. Unix is much more powerful and interesting. When a system takes years to implement, it is important to write it for the machines that will become available in the future; not to let it be limited by the capabilities of the machines that are in widest use at the moment but will be obsolete when the new system is finished.

Why not aim for a new, more advanced system, such as a Lisp Machine? Mainly because that is still more of a research effort; there is a sizeable chance that the wrong choices will be made and the system will turn out not very good. In addition, such systems are often tied to special hardware. Being tied to one manufacturer's machine would make it hard to remain independent of that manufacturer and get broad community support.

Blaise Alleyne, der.hans, GNUstav Huarcaya, Claes Wallin (韋嘉誠) and 2 others like this.

GNUstav Huarcaya, Claes Wallin (韋嘉誠) shared this.


(Not to mention that lisp systems had a sizable number of users in their heyday!)

That said, RMS makes the very point you're making in the quote above...

It is necessary to be compatible with some widely used system to give our system an immediate base of trained users who could switch to it easily and an immediate base of application software that can run on it...

And:

Why not aim for a new, more advanced system, such as a Lisp Machine? Mainly because that is still more of a research effort; there is a sizeable chance that the wrong choices will be made and the system will turn out not very good.

I think that does support your argument above: the lisp curse means that even users who are generally familiar with lisp may be thrown off, to some degree, by the differences between lisps. You still see plenty of arguments about lisp-1 vs lisp-2 (that is, whether functions and variables share one namespace or get separate ones).
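
(To make the namespace point concrete, here's a rough Emacs Lisp sketch; the `items' symbol is just something I made up for illustration:)

    ;; Emacs Lisp is a lisp-2: functions and variables live in separate
    ;; namespaces, so a single symbol can name both at once.
    (defun items () '(1 2 3))   ; sets the function cell of `items'
    (setq items '(4 5 6))       ; sets the value cell of `items'
    (items)                     ; => (1 2 3), looked up as a function
    items                       ; => (4 5 6), looked up as a variable
    ;; In a lisp-1 such as Scheme there is only one namespace, so a later
    ;; (define items ...) would simply replace the earlier definition.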

Though I think the "lisp curse" also applies to Unix anyway: most unixes were incompatible-ish to some degree. See the large amount of work (autoconf and friends) spent just getting software to build across different unixes. See the recent LWN article: 25 Years of Linux — so far

The computing industry in 1991 looked a little different than it does now. A whole set of Unix-based vendors had succeeded in displacing much of the minicomputer market but, in the process, they had turned Unix into numerous incompatible proprietary systems, each of which had its own problems and none of which could be fixed by its users. Unix, in moving down from minicomputers, had become much more widespread, but it also lost the code-sharing culture that had helped to make it Unix in the first place. The consequences of the Unix wars were already being felt, and we were being told by the trade press that the upcoming Windows NT release would be the end of Unix altogether. Unix vendors were developing NT-based systems, and the industry was being prepared for a Microsoft-only future.

GNU/Linux saved Unix by unifying it. It's possible it would have gone badly if a lisp machine had been done instead, but it's also possible that it could have unified the world of lisp as well.

That said, what counts as a "good lisp" has changed over time... when RMS started GNU Emacs, it used dynamic scope rather than lexical scope, because he believed lexical scope would be too slow.

I asked RMS when he was implementing emacs lisp why it was dynamically scoped and his exact reply was that lexical scope was too inefficient. So my point here is that even to people who were experts in the area of lisp implementation, in 1982 (and for years afterward, actually), Scheme was a radical, not-at-all-accepted notion. And outside the Lisp/AI community... well, languages with GC were definitely not acceptable. (Contrast with the perl & Java era in which we live. It is no exaggeration, thanks to perl, to say in 2001 that billions of dollars of services have been rolled out to the world on top of GC'd languages.)

Flash forward many years, and lexical scope has universally won out. But it's easier in retrospect to see how to do that right. So maybe GASOLM (GASOLM's A Sort Of Lisp Machine) would not get everything right.
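
(For what it's worth, here's roughly what that scoping difference looks like in today's Emacs Lisp; the `make-counter' example is mine, not anything from the thread:)

    ;; -*- lexical-binding: t; -*-
    ;; With lexical scope the lambda closes over the `counter' binding
    ;; created by the `let', so each counter keeps its own private state.
    (defun make-counter ()
      (let ((counter 0))
        (lambda () (setq counter (1+ counter)))))

    (let ((c (make-counter)))
      (list (funcall c) (funcall c)))   ; => (1 2)

    ;; Under the old dynamic scope, `counter' would instead be looked up
    ;; when the closure is called; the `let' binding is gone by then, so
    ;; the call would signal a void-variable error.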

But GNU didn't get everything "right" initially either; in 1990 GNU was implementing long options for the first time, and even then it used +arg rather than --arg style arguments. And even today, C, the very foundation of Unix and Not-Unix, is no longer considered "safe" by the security world!

So I think it's pretty hard to anticipate all that, but I have some optimism for the lisp machine alternate reality's fate. ;)

Christopher Allan Webber at 2016-09-06T18:18:16Z

Claes Wallin (韋嘉誠), der.hans like this.

But the success of *nix has clearly been largely influenced by the success of GNU. GNU probably kept unix-like systems alive.

If you define "UNIX-like" as "involves processes and stdout and stdin and has a filesystem", which covers the basics of what's needed for different coreutils programs to be developed independently without communication overhead, then DOS is also close enough. I perceive an implication that "UNIX-like" systems were on the way out pre-GNU, but I would instead say DOS-esque systems count as UNIX-like enough for my purposes. So, the way I see it, UNIX-like systems weren't really on the way out, and it wasn't GNU that kept them alive; it was DOS.

I know there's more to UNIX than "stdout & stdin & files" but for the purpose of creating a volunteer community that doesn't need to absorb high communication overhead, it's the most salient property IMHO.

asheeshlaroia at 2016-09-06T18:24:34Z

Claes Wallin (韋嘉誠), Christopher Allan Webber like this.

Very interesting that RMS did perceive this, at least! :D

asheeshlaroia at 2016-09-06T18:25:00Z

Christopher Allan Webber likes this.

Just suddenly thought about what @joeyh said...

If nothing were compiled (scripting all the way down to and including the kernel), then access to the source code would probably not need to have been mentioned in freedom 1. It would be at most a footnote.

I guess that means that the term "open source" wouldn't have made sense / caught on then either.

Christopher Allan Webber at 2016-09-07T01:26:34Z

Claes Wallin (韋嘉誠) likes this.