Christopher Allan Webber email@example.com
Madison, United States
GNU MediaGoblin founder, former Creative Commons software engineer, Python hacker, free software enthusiast, maker of weird drawings. See: http://dustycloud.org/ (Any pronouns are okay.)
Lispers: the Cassandras of CS?
A slide from my talk tomorrow at LibrePlanet. Hope to see you there!
Finishing up "slides" for my talk at LibrePlanet, "The Lisp Machine and GNU". Hope to see you there!
Creating bundles with guix pack
Cool blogpost on the Guix blog about "guix pack" which is able to make standalone, binary releases that run on any GNU/Linux distribution. It can also be used to generate reproducible Docker images!
I hate the term "killer app", but Guix is definitely Guile's "killer app".
Claes Wallin (韋嘉誠) shared this.
identi.ca is back!
now to return to your regularly scheduled programming of cwebber complaining about themselves
Claes Wallin (韋嘉誠) shared this.
"You can do anything you put your mind to!"
Unfortunately I can only seem to put my mind to the things that don't matter right now.
Downloading https://mirror.hydra.gnu.org/nar/5ym03w0sba95i6l5bdgr1ybm30iy4dq0-guile-next-2.2.0 (44.4MiB installed)... guile-next-2.2.0 1.6MiB/s 00:05 | 7.7MiB transferred
Guile 2.2 is out!
After six years in the making, Guile 2.2 is out! New register machine based VM, optimizations make it faster than ever, and concurrent programming is a dream to hack on in it.
This is a huge release in Guile's history. Guile is now a very fast Scheme (not the fastest, but very fast), thanks to many compiler optimizations. It also comes with a powerful asynchronous programming layer... I'm biased, but I think that Guile is soon to be one of the most pleasant places to write modern concurrent applications. The last six years have also seen growth for Guile, partly because of the rise of Guix (which is written in Guile, and has in many ways become its de-facto package manager). I hope, and believe, that we're going to see it grow a lot more, and Guile 2.2 opens a lot of doors to make that possible.
For those who are interested in reading more, I recommend reading Andy Wingo's personal blogpost on the subject, as well as the release announcement sent to the mailing list, which has many more details.
Onward and upward! Things are already looking good for Guile, and I suspect they're going to get better.
David Thompson talks about his first compiler addition in Guile, which made it into this release (unboxed floating point numbers!), and encourages interested hackers to check out Wingo's list of ways people can get involved in compiler hacking. Guile's a great place to do it!
Claes Wallin (韋嘉誠) likes this.
Guile 2.2, tomorrow
Tomorrow will be a tremendous day for Guile, as Guile 2.2, a HUGE release in the history of Guile, will come out.
In the meanwhile, you can read Andy Wingo's blogpost about it!
An anecdote on the lead-up to Common Lisp as a standard
In preparing for my "The Lisp Machine and GNU" talk, I've been reading The Evolution of Lisp. It's interesting.
In particular, I found this bit interesting, on the lead-up to Common Lisp. For context: today Scheme and Common Lisp are the "two big lisps" (as in, having multiple implementations... though Scheme arguably has the same problems today as are joked about below), but at the time Interlisp and MacLisp were the two big lisps, and each actually had tons of fractured, incompatible implementations.
2.10 Early Common Lisp
If there were no consolidation in the Lisp community at this point, Lisp might have died. ARPA was not interested in funding a variety of needlessly competing and gratuitously different Lisp projects. And there was no commercial arena--yet.
In April 1981, ARPA called a "Lisp Community Meeting", in which the implementation groups got together to discuss the future of Lisp. ARPA sponsored a lot of AI research, and their goal was to see what could be done to stem the tide of an increasingly diverse set of Lisp dialects in its research community.
The day before the ARPA meeting, part of the Interlisp community got together to discuss how to present a situation of a healthy Interlisp community on a variety of machines. The idea was to push the view of a standard language (Interlisp) and a standard environment existing on an ever-increasing number of different types of computers.
The MacLisp-descended groups came off in a way that can be best demonstrated with an anecdote. Each group stood up and presented where they were heading and why. Some questions arose about the ill-defined direction of the MacLisp community in contrast to the Interlisp community. Scott Fahlman said, "the MacLisp community is not in a state of chaos. It consists of four well-defined groups going in four well-defined directions." There was a moment's pause for the laughter to subside [Steele, 1982].
Oof! Reminds me a bit too much of trying to corral different federation projects for federation standards...
Maybe a ray of hope though: Several people from the MacLisp side did manage to coordinate, and Common Lisp did result from it, which may be one of the most impressive language standardization efforts in history...
Another bit, this one revolving around how to handle extensibility and standards:
One issue that came up early on is worth mentioning, because it is at the heart of one of the major attacks on Common Lisp, which was mounted during the ISO work on Lisp (see section 2.12). This is the issue of modularization, which had two aspects: (1) whether Common Lisp should be divided into a core language plus modules and (2) whether there should be a division into the so-called white, yellow, and red pages. These topics appear to have been blended in the discussion.
"White pages" refers to the manual proper, and anything that is in the white pages must be implemented somehow by a Lisp whose developers claim it is a Common Lisp. "Yellow pages" refers to implementation-independent packages that can be loaded in, for example, TRACE and scientific subroutine packages. The "red pages" were intended to describe implementation-dependent routines, such as device drivers.
Nevertheless, the first question is brought up by a direct reading of the issue: Division of Common Lisp into a core plus modules.
If this were taken to mean a proposal that would have partitioned the language into layers with a central layer and outer layers that depend on the inner ones, then Common Lisp could have been more easily subsetted, which would have led to obvious implementations on smaller machines. This would have satisfied the need for cheap, prolific implementations. This would also have made providing educational versions of the language more readily available. It also would have prevented the strong attack during the ISO meetings by Europe and by, to a lesser degree, Japan.
The response from influential members is revealing: "This seems weird. Motivate it. Maybe these modules are optional at the implementation's choice?" "Keeping things modular is a good goal, but don't expect to succeed completely." "The division only makes a little sense." [?; ?] The group focussed too much on the funny white-yellow-red distinction and not on the core-language/extended-language distinction. Had this gone differently, so would have the future of Common Lisp.
Note, this sounds not too unlike the decision to break r7rs into r7rs-small and r7rs-large, for anyone who knows/cares about that in Scheme-land...
Gosh, more good quotes
Carefully deferred was the decision regarding whether () was a symbol. Even though this decision was left until nearly the end of the decision process--causing people to emotionally accept Common Lisp and attach part of their egos to it--when the discussion came up, it was divisive. Symbolics threatened to withdraw from the group unless their position was accepted, and so it was. The salient paragraph from their message is as follows:
We have had some internal discussions about the T and NIL issues. If we were designing a completely new language, we would certainly rethink these, as well as the many other warts (or beauty marks) in Lisp. (We might not necessarily change them, but we would certainly rethink them.) However, the advantages to be gained by changing T and NIL now are quite small compared to the costs of conversion.
The only resolution to these issues that Symbolics can accept is to retain the status quo.
This shows that there are some issues, apparently trivial, that can have a profound influence on people.
FSF licensing team on GitHub ToS change
tl;dr: while there are other reasons to not host on GitHub, it doesn't seem incompatible with copyleft (though more clarification would be useful)
A problem I have: I can follow the mathematical concepts in papers, but not the notation. I don't know how to get past this. It means that I sometimes read programming papers and I'm like right, right, makes sense [EQUATION] mind blanks.
Has anyone else overcome this? How? I have a "my brain learns best by experimenting" mode, maybe I need to play more with... something?
Craig Maloney likes this.
I found certain mathematical notations easier to follow once I learned Haskell. In many cases, I only have to squint to ascii-ize symbols into their Haskell equivalents. Also, mathy variable naming like a' is idiomatic in Haskell.
Christopher Allan Webber likes this.
Put the meaning of each variable in a column near the equation.
Then use different colors to highlight each variable (both in the explanation column and in the equations).
Draw circles/strikes around parts of equations that make sense by themselves, or represent a concept, or are transformed into something else in the next step (in a similar way to what a teacher does at the blackboard while explaining things).
Maybe look for videos explaining the theorems etc. (since they visually show all these techniques, I guess).
HTH

I have the same "thought laziness" problem, with my mind starting to turn to white noise when reading scientific equations, or musical scores too (which I spent years learning at academy and school...).
I think what worked was to use them not only in a passive way (reading) but in an active way (writing)... and I think it is the same with code.
So maybe trying to put some concept of yours into equations with LaTeX could help?
Christopher Allan Webber likes this.
Emacs running wireworld
Don't ask me what I should have been working on today though...
Okay... it happened. I implemented Wireworld in GNU Emacs. Ascii art circuits ahoy!
Diane Trout likes this.
@George Standish Well, luckily I just wrote about what wireworld is! But to reduce clicks: here it is on wikipedia, and here's a whole computer built in wireworld. It's a cellular automaton, like Conway's Game of Life, but more circuit'y.
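(The Emacs implementation isn't shown here, but the Wireworld rules themselves fit in a few lines. A minimal Python sketch of the automaton's update rule — cell characters and function names are mine, not from the Emacs code:)

```python
# Wireworld rules sketch (hypothetical names, not the Emacs implementation):
# '.' empty, '#' conductor, 'H' electron head, 'T' electron tail.

def step(grid):
    """Advance a Wireworld grid (a list of equal-length strings) one generation."""
    rows, cols = len(grid), len(grid[0])

    def heads_around(r, c):
        # Count electron heads among the 8 Moore neighbors, in the OLD grid.
        return sum(
            grid[r + dr][c + dc] == 'H'
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)
            and 0 <= r + dr < rows and 0 <= c + dc < cols
        )

    new = []
    for r in range(rows):
        row = []
        for c in range(cols):
            cell = grid[r][c]
            if cell == 'H':      # electron head becomes a tail
                row.append('T')
            elif cell == 'T':    # electron tail becomes a conductor
                row.append('#')
            elif cell == '#':    # conductor fires if exactly 1 or 2 neighbors are heads
                row.append('H' if heads_around(r, c) in (1, 2) else '#')
            else:                # empty stays empty
                row.append('.')
        new.append(''.join(row))
    return new

# A signal travels down a straight wire:
print(step(['TH###']))  # → ['#TH##']
```

Because conductors fire on 1 *or* 2 neighboring heads, two-cell-wide signals survive diodes and gates, which is what makes whole computers buildable out of ascii wires.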
Luckily, if you're vi-accustomed, there's Spacemacs, which I hear is excellent.
George Standish likes this.
Me IRL

Why get things done on the computer I have when I could get nothing done on an imaginary computer?
Dana likes this.
(ASCII art drawing)
uıɐɾ ʞ ʇɐɯɐs shared this.
Lisp Machine and GNU @ LP 2017
I'm talking at LibrePlanet 2017 on "The Lisp Machine and GNU":
You may have heard of Stallman and the printer, but much of free software's genesis involves the battle over the soul of the lisp machine. We'll trace Lisp and the Lisp Machine's roots, from its genesis in early hacker culture and the AI labs, to the split that (largely) pushed RMS to found GNU, through its role within and without the free software community. Why did GNU become a "Not Unix", and why not a lisp machine? What about the role of Lisp within GNU, with projects like Emacs, Guile, and Guix? For those who are new to Lisp, there will be a mini-tutorial.