Christopher Allan Webber

(Not to mention that lisp systems had a sizable number of users in their heyday!)

That said, RMS makes the very point you're making in the quote above...

It is necessary to be compatible with some widely used system to give our system an immediate base of trained users who could switch to it easily and an immediate base of application software that can run on it...

And:

Why not aim for a new, more advanced system, such as a Lisp Machine? Mainly because that is still more of a research effort; there is a sizeable chance that the wrong choices will be made and the system will turn out not very good.

I think that does support your argument above: the lisp curse means that even users familiar with lisp in general may be thrown off, to a small degree, by the differences between particular lisps. You still see plenty of arguments about lisp-1 vs lisp-2.
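
To make the lisp-1 vs lisp-2 distinction concrete, here's a minimal sketch (Scheme is a lisp-1, with a single namespace; Emacs Lisp and Common Lisp are lisp-2s, with separate function and variable namespaces):

    ;; lisp-2 (Emacs Lisp / Common Lisp): `list' can name a variable
    ;; and a function at the same time, because the two live in
    ;; separate namespaces.
    (defvar list '(1 2 3))   ; binds the variable cell of `list'
    (list 4 5 6)             ; => (4 5 6); the function cell is untouched
    (symbol-value 'list)     ; => (1 2 3)

    ;; lisp-1 (Scheme): one namespace, so the same definition shadows
    ;; the built-in procedure.
    (define list '(1 2 3))
    (list 4 5 6)             ; error: (1 2 3) is not a procedure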

Though I think the "lisp curse" also applies to Unix anyway: most Unixes were incompatible with each other to some degree. See the large amount of work spent just getting software to build across systems. See the recent LWN article, 25 Years of Linux — so far:

The computing industry in 1991 looked a little different than it does now. A whole set of Unix-based vendors had succeeded in displacing much of the minicomputer market but, in the process, they had turned Unix into numerous incompatible proprietary systems, each of which had its own problems and none of which could be fixed by its users. Unix, in moving down from minicomputers, had become much more widespread, but it also lost the code-sharing culture that had helped to make it Unix in the first place. The consequences of the Unix wars were already being felt, and we were being told by the trade press that the upcoming Windows NT release would be the end of Unix altogether. Unix vendors were developing NT-based systems, and the industry was being prepared for a Microsoft-only future.

GNU/Linux saved Unix by unifying it. It's possible things would have gone badly if a lisp machine had been built instead, but it's also possible that it could have unified the world of lisp.

That said, the idea of what makes a "good lisp" has changed over time... when RMS started GNU Emacs, he gave it dynamic scope rather than lexical scope because he believed lexical scope would be too slow.

I asked RMS when he was implementing emacs lisp why it was dynamically scoped and his exact reply was that lexical scope was too inefficient. So my point here is that even to people who were experts in the area of lisp implementation, in 1982 (and for years afterward, actually), Scheme was a radical, not-at-all-accepted notion. And outside the Lisp/AI community... well, languages with GC were definitely not acceptable. (Contrast with the perl & Java era in which we live. It is no exaggeration, thanks to perl, to say in 2001 that billions of dollars of services have been rolled out to the world on top of GC'd languages.)
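For the curious, here's a small Emacs Lisp sketch of what was at stake (lexical-binding only arrived in Emacs 24, decades later):

    ;; With dynamic scope (Emacs Lisp's historical default), a callee
    ;; sees whatever binding of `x' is live in its caller:
    (defun show-x () x)        ; `x' is a free variable here
    (let ((x 2)) (show-x))     ; => 2

    ;; With lexical scope (lexical-binding: t, Emacs 24+), a lambda
    ;; instead closes over the binding in force where it was written:
    (defun make-adder (n)
      (lambda (m) (+ m n)))
    (funcall (make-adder 3) 4) ; => 7; the closure remembers n = 3.
                               ; Under dynamic scope this errors,
                               ; since `n' is unbound by call time.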

Flash forward many years, and lexical scope has universally won out. But it's easier to see in retrospect how to do that right. So maybe GASOLM (GASOLM's A Sort Of Lisp Machine) would not have gotten everything right.

But GNU didn't get everything "right" initially either; in 1990 GNU was implementing long options for the first time, and even then it used +arg rather than --arg style arguments. And even today, C, the very foundation of Unix and Not-Unix, is no longer considered "safe" by the security world!

So I think it's pretty hard to anticipate all of that, but I have some optimism about the fate of the lisp machine alternate reality. ;)
