Richard Feynman and the Connection Machine

by way of lemonodor

An entertaining article by Danny Hillis about Richard Feynman's work at Thinking Machines Corporation on the Connection Machine.

We've mentioned the Connection Machine's data-parallel programming style on LtU before, and Connection Machine Lisp remains my all-time favourite paper in computer science.

Broken link?

The link to the Connection Machine Lisp paper is broken...

Fixed

Nice catch.

I am pretty sure we mentioned

I am pretty sure we mentioned this paper before.

Haven't we?

You mentioned it in this comm

You mentioned it in this comment.

Connection Machine LISP

I knew the paper, but am delighted to discover it is now available electronically.

Hillis' book is also excellent, and well worth the effort of getting hold of. Since every other comment I make here seems to lead to Alan Bawden's work on linear graph grammars, here is another link: linear graph grammars arose as a more radically connectionist way of programming the Connection Machine than Connection Machine LISP (they are briefly mentioned in Hillis' book), which is why they were first called connection graphs.

Easy to get Hillis' book

A great many of the classic out-of-print computer science books can be had for very little effort and money nowadays. I got my copy of The Connection Machine from the online second-hand bookseller Alibris, and they have more copies available right now, starting from $4.25.

My latest Alibris batch arrived recently, including Concurrent Programming in Ada (interesting language!), Iverson's A Programming Language, and How To Write Parallel Programs (the Linda tuplespaces book). The only trouble is that we can't post dead-tree books on LtU.

Danny's thesis vs. Thinking Machines

The book (Danny's thesis) is indeed very interesting. Anyone reading it should keep in mind that the computer systems actually built and sold by Thinking Machines have little to do with the concepts described in the thesis.

When Danny and many others were working on the original concepts at the AI lab, they would always say, mantra-like, "It's NOT a SIMD machine!" Indeed, Danny's thesis does not describe a SIMD-like usage at all.

However, my understanding is that most, if not all, of the useful applications to which customers put the actual Thinking Machines systems were, in fact, SIMD.

None of this takes away from Danny's thesis, of course.

Those who do not know history...

...are doomed to reimplement it, poorly.

This post is relevant to the discussion of the new super-parallel IBM chip. We have been there and done that, and we have a nice picture of Richard wearing the t-shirt.

So what's different now vs. then, remembering that then wasn't all that long ago?

Lessons?

This makes me wonder what conclusions were drawn from the Connection Machine experience. Most of the literature talks about the early days and doesn't say much about what they found out later on.

I went fishing for anecdotes on the ll1-discuss mailing list but just got one bite from Guy Steele.

According to The Rise and Fall of Thinking Machines it was a bit of a rocky ride, so perhaps they don't like to dwell on it. That saddens me a bit since it sounds like such an incredible undertaking, and as you say it wasn't all that long ago.

Also, a friend who's used a CM told me that it was commonly programmed in a language called *Lisp ("Star Lisp"). You can find a manual for *Lisp on the web, and there is an implementation (a simulator in Common Lisp) in the CMU AI Repository.

Lessons

Luke, I don't think I've ever read a "lessons" paper, and I would love to. The "Rise and Fall" paper, while fascinating, is too shallow and non-technical for that purpose. I'll ask around.

*lisp

It was 1986 or so when I went to Thinking Machines for a job interview and had to wrestle with that fancy Coke machine they rigged up there. My job was to learn *lisp well enough to hand-translate the code to C*. Unfortunately it was all about crunching Dow Jones industrial data.

I spent a brief time playing around with the CM. It was a beautiful-looking black cube with lots of pretty red LED lights blinking inside. To use the CM, you had to use a Sun workstation (a Sun-3); the CM did not even have an OS, just a bunch of raw processors.

*lisp was built on top of Common Lisp. You could use *defvar to allocate variables in the processors and compute on them in parallel using analogues of the normal Common Lisp functions. For example, the parallel version of + was +!!, which would add across all the variables allocated among the processors... up to 64K of them! Actually, you could virtually allocate even more than that. Of course, the whole idea was that things would work well for a certain class of problems requiring massively parallel data processing and computation. At every clock tick, each processor of the CM ran the same instruction over different data.
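To give a feel for the programming model described above, here is a tiny Python sketch — an illustration only, not *Lisp — of the data-parallel idea: a parallel variable holds one value per (virtual) processor, and an operation like +!! applies the same instruction to every processor's data at once. The names pvar and plus_bang_bang are made up for this sketch.

```python
# Toy model of *Lisp-style data parallelism (not real *Lisp):
# each "pvar" is a list with one slot per virtual processor, and
# parallel operations apply elementwise, mimicking the SIMD step
# where every processor runs the same instruction on its own data.

NUM_PROCESSORS = 8  # a real CM had up to 64K physical processors

def pvar(value):
    """Allocate a parallel variable: one copy of `value` per processor."""
    return [value] * NUM_PROCESSORS

def plus_bang_bang(a, b):
    """Toy analogue of *Lisp's +!!: elementwise parallel addition."""
    return [x + y for x, y in zip(a, b)]

xs = pvar(1)                              # the same value everywhere
ys = list(range(NUM_PROCESSORS))          # distinct data per processor
print(plus_bang_bang(xs, ys))             # → [1, 2, 3, 4, 5, 6, 7, 8]
```

The point of the sketch is only the shape of the model: one logical operation, issued once, acting on every processor's local data simultaneously.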

I never saw or used "Connection Machine LISP". I think it might have been vaporware.

[Random] Karl Sims

Karl Sims did some pretty cool evolutionary computing things on Connection Machines in the mid-'90s.

Article has moved

It is now here.

Did Feynman's ideas get implemented?

I love Danny's essay. It's consonant with other things I've read about Richard Feynman and, to me, it further reinforces my incredibly high opinion of him. (Read James Gleick's "Genius", one of the best biographies I've ever read.)

The essay talks about many cool ideas that Feynman had about using the CM, e.g. for QCD. Did these ever get implemented?