## Common Lisp: The Untold Story

Common Lisp: The Untold Story, by Kent Pitman. A nice paper about the history of my favorite lightweight dynamic language.

This paper summarizes a talk given at “Lisp50@OOPSLA,” the 50th Anniversary of Lisp workshop, Monday, October 20, 2008, an event co-located with OOPSLA’08 in Nashville, TN, in which I offered my personal, subjective account of how I came to be involved with Common Lisp and the Common Lisp standard, and of what I learned from the process.

Some of my favorite parts are:

• How CL was viewed as competition to C++. (Really, what were they thinking?)
• How CL was a reaction to the threat of Interlisp, and how "CLOS was the price of getting the Xerox/Interlisp community folded back into Lisp community as a whole" (link).
• How individuals shaped the processes of standardization. MIT Sloan did an analysis of these processes.
• How the two- to three-day roundtrip time for UUCP emails to Europe may be responsible for the creation of the separate EuLisp.

I have a soft spot for CL, so I am biased, but I think Greenspun's Tenth Rule (and Robert Morris's corollary) still holds: CL is the language that newer dynamic languages, such as Perl 6, JavaScript, and Racket, are asymptotically approaching (and exceeding in some cases, which is why I view CL as a lightweight language today).

## Comments

### why CL didn't "win big" a long time ago

VCs killed it.

They tried to monetize it in a bunch of dumb ways and then distracted the hacker community by insisting that it was up to the hackers to make the VCs whole on dumb bets (and too many hackers thought they could play along and win). This is mythologized as the so-called "AI winter" (wherein the VCs try to blame academics).

Stallman was right.

But what do I know? It bears only a deep and tragic resemblance to what happened to GNU(/Linux).

### Can I read about [vc's killing CL] in more detail somewhere?

Almost anything on the AI winter or the history of the Lisp machine companies will do. A lighter, more "social" account from late in the period is Gabriel's book about Lucid's demise.

It's helpful to look at lots of scattered (but not hard to find) materials that talk about interactions between grad programs and investors at MIT, Stanford, etc. Personal essays and recollections around the net -- that kind of thing.

I was a Lisp programmer at the time, who became a C++ programmer in 1989. I've never read any details of the AI Winter. I lived through it, and experienced the marketing and managerial hype. I never would have thought to pin it on the programmers or academics.

I too would be interested in any accounts of how this unravelled.

### AI winter

There is a Wikipedia article which gives a good summary and does place some of the blame on academic hype. FWIW, I think it mainly came down to hardware.

Computer languages live in a cultural ecosystem. The leading LISP vendors bet on accelerated hardware, while OO vendors bet on generic hardware. Academics played some role in that choice, but it was a community decision. From the mid-80s to the early 2000s generic hardware was the right play, though I can't fault people for making the opposite bet. It is much easier to see how things did work out than to see how they will work out.

As an aside, as the focus of design moves from computational speed per dollar to computational speed per watt (to extend battery life), specialized hardware has made a huge comeback. Hardware-optimized dynamic computing languages might be on the cusp of a surge in popularity, and already are if you count video codecs as a language. JavaScript and Ruby seem to be the two leaders right now. So we might actually see the "what could have been" play out in the smartphone market.

### I think the AI winter mainly

I think the AI winter was mainly a problem with software. Connectionism was replaced with symbolic reasoning, which we have found to be pretty limited (elephants don't play chess...). Connectionism made a comeback with neural networks, but that was much later, and of course machine learning is very big right now.

Lisp machines are lumped together with the AI winter, but I think this is sort of misguided: commodity hardware could just run Lisp faster, regardless of whether the software was useful or not.

Today even GPUs are becoming more and more generic; I don't see the benefit of specialized language hardware anytime soon.

### Perhaps in a couple of years

Perhaps in a couple of years (or decades) we will start to use FPGAs extensively. I wonder what changes that will bring to high level languages?

### FPGA

I would suspect that if FPGAs become very common we might see a situation where non-customized high level languages outperform non-customized low level languages, or at least come much closer. That is, I think the FPGA optimization will just get absorbed into the runtime library for high level languages, so the author can be oblivious and just notice a speedup. Low level code, because the "library" code is intermixed with "program" code, would have to be substantially modified to get decent performance.

Adobe is actually a good example of this. When they were trying to make a Flash plugin for mobile, because the hardware was so different, the same hardware ran different OSes, and there were different browsers, they had to handle browser/platform/hardware specific versions, often with opposite optimizations. The work was completely overwhelming and they failed terribly. Even with the limited success they had, Flash programmers had all the browser/platform/hardware details abstracted away. As an aside relevant to LtU: had they used a functional language to write the original PC Flash plugin, they might very well have been successful, since the algorithms and the low level details of implementation would have been separable in the code.

Where I think the effect might be cool is on mid level languages, where separability will matter. In that case, C++-like languages will be close enough to the metal to prevent the compiler from abstracting and far enough away to prevent the code from running well. The JVM might be able to compensate for Java.

But I kind of imagine functions that evaluate to something like a FORTH stack of code and data, which then gets processed on the FPGA. Functions that evaluate to programs are a rather natural fit for FP, so I'll cross my fingers.

### C lost

Many Common Lisp programmers used to pronounce CLOS as "C lost" in the mid 1980s. They truly believed it. Indeed, I had people from TI join me at Rice in the late 80s, when it was all over, who insisted on this pronunciation. Even the T compiler people thought they had to fight Pascal compilers, because C was still slow.

I am also sure there are people around who remember LFP 1984 in Austin (I wasn't there; so I was told), which must have been a version of OOPSLA back then, with all the Lisp machine vendors and Lisp software companies exhibiting their wares. American Express had warehouses full of Lisp machines checking credit card purchases. Exxon and friends used them by the dozens to check for drill sites in the Gulf. The conference made enough profit to pay for the next two, but by 1986 (my first attendance) it was all over.

Blame? There's plenty to go around: Lisp machiners, academics, industrial managers, Darpa program officers, you name it. Don't point fingers at others; look in the mirror first.

p.s. What amazes me is how little the FP community is willing to learn from this experience of industrial usage. But as they say, those who don't know history are doomed to repeat its mistakes.

### Common Lisp competition with C and C++

I don't think I ever heard "C-lost" (with a "t") for CLOS, but I did often hear "C-loss".

Re C speed -- I have a little example. In the mid 80s, it was not hard to find cases where Franz Lisp was faster than C on a VAX-750 (running Berkeley Unix). There just had to be a high enough density of procedure calls. C always used the 'calls' instruction, but with the right declarations, Franz would use a faster calling sequence in some cases.

Re the top post's "what were they thinking?" re CL being "viewed as competition to C++" -- I think it makes more sense if you see it the other way around: not can CL take business away from C++, but can CL survive the C++ onslaught? What Kent's paper says can fit either point of view: 'the relationship between the various Lisp vendors had improved and it was becoming clear to all that they had a common “enemy”: C++.'

That certainly fits my experience. I was working on an AI planning system, O-Plan, written in Common Lisp, and there was talk of moving to a more mainstream language, with C++ as the obvious choice. Fortunately, Java came along just in time to save me.

(I think that many of the reasons why Lisp "failed" are the mirror images of reasons why Java didn't.)

### Apollo Pascal

Pascal was the target, I'd bet, because they were using Apollo workstations. Apollo Domain OS used Pascal as the systems programming language, and had a pretty good compiler. I used T for a bit on Apollos. It was not bad, but a pain in the butt to bootstrap.

### AAAI and LFP Austin 1984

I was there and also attended many OOPSLAs until they too became over-commercialized. You are spot on. The feel was very much the same.

### who's who

> Blame? There's plenty to go around. I think Lisp machiners, academics, industrial managers, Darpa program officers, you name it. Don't point fingers at others; look in the mirror first.

I don't see any sharp division between the VC community and "academics, industrial managers, Darpa program officers" in this context. The academics get busy selling off copyrights and patents, trying to get a piece of the action; Darpa program officers come from that same community but generally have ties also to policy makers... but again, commercialization through new investment becomes a (the?) major focus. The VCs broker the deals. They are of a piece.

If you like, I guess saying "it's the VCs' fault" could be seen as shorthand for that.

### What are the mistakes?

So, what were the mistakes? What should the FP world be doing differently?

To me, the current interest in FP feels pretty different from what I've heard about the Lisp Machine heyday. For one thing, industrial uptake seems quite slow, coming only in dribs and drabs. There seems to be no FP gold rush in sight, just a slow growth of technical people who are more and more interested in what FP can provide. All told, that feels healthier, if a little slow...

### >mistakes

I'm not a language designer; however, I started programming circa 1979 or so, not as a profession but as part of my work as an engineer.

Mistakes, I don't know. I think the problem domain shifted in a way that was unfavorable to Lisp, and to interpreted languages in general.

My impression was that there was a large disconnect between corporate/academic programming, where the typical computer was a time-shared mainframe or minicomputer, and users of the then-new personal computers. Personal computers had very limited resources and primitive operating systems, yet their numbers quickly dwarfed their larger cousins. What that meant was you suddenly had a much larger ratio of users to developers, which favors programming that makes the most efficient use of the hardware, at the expense of programmer productivity. I think WordPerfect was written in 8086 assembly, and that was not uncommon.

Programs written for PCs also often had to deal directly with the hardware: writing directly to video memory and making direct operating system calls. I've done this with assembler, Basic, and C on those old machines. Assembler was fast, but a pain. Basic was way too slow. C, on the other hand, wasn't as fast as hand-crafted assembly, and had a larger memory footprint, larger than Basic even. Yet it was a lot faster to work in.

Worse (though I'm sure this isn't totally true), the early implementations of Pascal and Lisp I tried were walled gardens. No pointers, no calls to assembly, thus no escape. You were stuck with the libraries provided, which were oriented around file I/O and terminals.

### Intepreted?

Lisp has had a compiler for quite some time; according to Wikipedia, it was the first language with a self-hosting compiler, circa 1962. reference

Also, the lack of direct access to the underlying hardware is hardly uncommon these days; many languages now run on top of some form of hardware abstraction layer. I understand you meant that in the past it was an issue, but by the time CL started to lose the interest of business and academics, things like Java were coming into existence and gaining significant acceptance.

Maybe you mean that Lisp was a victim of coming too soon.

### delay

I think you have too many years of a gap there. There never really was a surge of PC / Windows LISP software. It is a different community.

In the 1980s, when programmers were moving from Assembly / Pascal / Basic into C and then C++, LISP mostly didn't run on PCs. By the time Java came around, it was for the Visual Basic and C++ crowd. There was no LISP crowd, because LISP had never really gone after the PC market (yes, I know about packages like Corel LISP, but they had low market share).

The early versions of Java that were popular on PCs, like Microsoft Java / J++, either had low level calls to the browser or low level calls to the OS framework. Once Sun's Java became the norm, Java became a language for enterprise applications and not the desktop.

So I think the GP was right about low level access mattering to the PC crowd, at least until well into the 1990s.

### Lisp for PCs

There were some decent Lisps for PCs in the late 1980s. Gold Hill had a good one, but it cost (cf. AI Winter).

Go to any Byte magazine from the time and you'll see one or two ads toward the back for reasonably priced Lisps. I don't remember when xlisp made its debut; Wikipedia probably knows. But that was the Lisp vehicle for most PC programmers around or soon after that time.

### Ratio of users to developers and limited resources

I don't quite agree with the points about the ratio of users to developers and limited resources. In the longer term, the ratio might well increase, but that would be because more people were using computers, rather than because of the difference between time-sharing and PCs. (Though PCs might also mean a longer-term increase in the number of developers.)

There wouldn't suddenly be a much larger ratio of users to developers. Consider a university that had a mainframe time-sharing system that all students and staff could use. It's only when more people were using PCs than had been using the time-sharing system that the ratio would increase, and that wouldn't happen overnight.

Re resources, if a mainframe time-sharing system supported, say, over 200 simultaneous users (which was certainly possible, Dartmouth's DTSS being an example), then it would have to make very efficient use of resources.

### How have dynamic languages advanced beyond CL?

Manuel,

Which languages do you think have advanced beyond what CL can offer, and what exactly do they offer which could not be provided by a reasonably authored library?

It's interesting, from admittedly anecdotal evidence, that at the time CL was considered a bloated standard, yet now it seems quite average in size when compared to some of the main contenders in the field of programming languages, e.g. C++ and Java.

### Racket

Racket has quite a few interesting things, including a declarative macro system that provides readable error messages in case of syntax errors, Typed Racket (admittedly that could be written as a library for CL, as that is what was done for PLT, but it would require an unreasonable amount of work, just as it did there), and an interesting system for contract programming, including parametric contracts.

I must admit I forgot a lot of the details about Common Lisp. Does it feature true separate compilation? I'm not sure, but I suspect it does not; that is also a very important distinguishing feature.

(In the rest of the Lispish world, Clojure has interesting community dynamics and focus on immutability and concurrency, and Qi/Shen is trying to be relevant, but I must admit I'm dubious. Kernel is also interesting but it's probably still too early.)

### Separate compilation

Fare's XCVB is one CL build system that features separate compilation similar to PLT's You Want it When?.

### Separate compilation

CL certainly does not have separate compilation, and cannot have it. It relies heavily on smashing all times (runtime, compile-time, read-time) into a single dynamic environment, which is why this can't work. Fare's system is very directly an attempt to get the separate compilation aspect of Racket (it actually started when it was PLT Scheme), after Fare faced the horrible mess that the single dynamic environment can get you into. The way xcvb tries to get you back into a sane world is via brute-force restarts after each compilation. (Roughly.)

### There are a couple of

There are a couple of different models for how Lisp is used. In one, you tend to stay in the Lisp world all the time; and then of course restarting after each compilation seems artificial at best. This is the model that was encouraged by Lisp machines.

But that's not the only way to use Lisp, or even Common Lisp. With Franz Lisp, for example, it was normal to run the compiler separately for each file that was to be compiled and to use the Franz compiler (liszt) with Makefiles, much as people used the C compiler. Common Lisp compilers can be used in the same way, and that is essentially what xcvb is doing.

By "Franz Lisp" I mean the Maclisp-like Lisp that was distributed with Berkeley Unix, and was later sold by Franz Inc, rather than Franz Inc's Common Lisp. However, Franz Lisp macros are the same as in Common Lisp, and so present the same problem for separate compilation: namely, that macros are written in Lisp, so you have to be able to call Lisp code while compiling in order to expand macros.

There may be a technical sense in which using the Lisp compiler in that way is still not "separate compilation". For example, it requires some discipline from the programmer that is not enforced by the language, and it's up to the programmer to put the dependencies into the Makefile (or equivalent). But I think it's close enough as a practical matter.

The main way people get into a "horrible mess" is when they use the interpreter and ignore compilation issues until a relatively late stage. But when the compiler is used in the way the Franz Lisp one was used, and programmers start using the compiler at an early stage, and (if necessary) are taught how to structure their code, such problems are not hard to avoid, at least in my experience.

(This is not to say that greater and more enforced separation isn't a good thing, but there does seem to be a cost as well. I'd be interested in hearing of any language that's done it while still letting the same language be used for computing macro expansions.)

### ...

• Yes, I know how lisp machines were used -- it's a workflow that kind of works when you're using an image based environment but otherwise is very limited (interactions are much more a tool for exploration & testing than a tool for development). Even with images it's easy to end up in some messy state where some code is part of the image but not editable (or not easy to find).
• Yes, the CL way is -- currently -- something that requires discipline. But the fact that the programmer needs to know more means that you reach your mental capacity quicker, and you're therefore limited in the kind of towers you can build. If the environment does the bookkeeping for you, you're free to build any kind of tower, since you know that it will just work. BTW, I'm coming from a CL world, so I *KNOW* that no CLers would believe this. When I switched to PLT, the kind of freedom I got was absolutely amazing -- on several occasions I've written code that CLers (and Schemers from implementations without a module system) explicitly said it was crazy to do such tricks, but it's not even an issue with PLT, since the environment makes it work. Yes, that's by separating phases -- more restricted => the machine can do more => I can build more complex software.

To put this differently, you can get into the horrible messes by mistake. If you consider Fare's motivation: working on a system that has several million lines of source code developed over decades means that even a tiny mistake of this kind can end up in exactly this kind of horror story.
• As for a system "that's done it" -- one that properly separates the worlds *and* has a macro system (and a configurable reader)... I should have been clearer: Racket is very much doing all of that.

### Ok, but I wasn't looking for

Ok, but I wasn't looking for a system that separates worlds and has a macro system (with or without a configurable reader), but for a system that separates worlds and has macros written in the same language as ordinary program code (as Common Lisp macros are written in Common Lisp). Does Racket do that? It looked to me like it used a special language for writing macros, rather like Scheme does.

I don't quite agree that the programmer needs to know more to use Common Lisp in the Franz/C-like way I described. It's pretty simple, really.

That's not to say there's no advantage to having separation enforced by the language. Of course there are, especially when a large system is developed by many people or over a long period of time. I just don't think the CL situation is as bad as it's often said to be.

Could you say something about the tricks CLers (and Schemers without module systems) thought were crazy? I'm curious now.

### Racket's expansion phases are equally expressive

A Racket macro is just a function from syntax to syntax. The whole language is available to write such things. But they're generally written using `syntax-case` and `syntax-parse`, because matching syntax is easier to do right than deconstructing it by hand.

### I'm curious now about which

I'm also curious about which languages have true separate compilation. Does C or Java?

### Separate compilation

Off the top of my head: e.g. OCaml, Modula, and arguably C support proper separate compilation. Java only allows incremental compilation, same for Haskell and C#, AFAICT. With C++, you do not really have separate compilation in any serious sense of the word, because too many things have to be shared in header files (e.g., implementation details of classes, inline functions, templates, etc.).

### Is there some easy to

Is there some easy to state and understand way to distinguish between

• Incremental compilation.
• Separate compilation.
• True / proper separate compilation.

When I look on the web, I see conflicting claims about whether Java has separate compilation or not.

I've also seen some people distinguishing between separate and independent compilation.

Anyway, what seems to be missing in Java as compared to C is .h files; but couldn't the Java implementation extract the required "header" info automatically?

Also, does Racket have (true) separate compilation?

### Separate compilation a red herring?

Isn't what we want fast compilation of large programs, and modularity in the sense that type correctness of module clients only depends on the module's interface, not the implementation? Separate compilation doesn't imply and isn't implied by these.

Java, for example, has both properties. C++ has neither. If you consider macros part of the interface, then Racket does qualify (of course you might argue it's unfair to consider macros part of the interface but not consider templates part of the interface in C++...). There was a discussion about BitC where it was proposed that the compiler just type-check and do basic transformations, and that it then be the linker's job to inline all type class dictionaries and optimize further. That is technically separate compilation, but it's not fast.

### Ambiguous

"Separate compilation" is a bit of an ambiguous term. In the strict sense ("true"/"proper"), it is the ability to compile a module (or unit or class or whatever) without having to compile any of the modules it depends on (except maybe their interface description, see OCaml or Modula), nor any of the modules that depend on it.

"Incremental" means that only the latter half holds. For example, you cannot compile a Java class without having first compiled the classes it depends on. But a lot of people call this separate compilation, too.

Separate compilation in the strict sense implies that changing the implementation of one module only requires recompiling that single module and then relinking the program. With incremental compilation, you generally need to recompile everything that depends on the changed module. If programs get large, the difference between a constant and a linear number of recompiles can have quite a severe effect on turn-around times.

### The Java case doesn't seem that different

With Java, you have to have compiled the depended-on classes once, at some time in the past. You don't (of course) have to recompile them every time you compile a dependent class. So I'd see those past compilations as equivalent to automatically extracting .h files (or some other module interface descriptions), rather than as something that makes a longer-term difference.

Normally, when I change a Java class definition, I only have to recompile that one class. I think it's only when I change something in the externally visible "interface" / signature (such as the number of arguments a non-private method expects, or their types) that I have to recompile the classes that depend on it.

Java implementations don't necessarily give me any help in determining just what classes are affected by such a change, but that is arguably an implementation / programming environment issue rather than a language issue.

### Not sure about the Java details

It's true that not recompiling dependencies, or even faking out depended-on classes, usually works in Java. I'm not so sure whether you can actually rely on that in all cases, though. But my Java knowledge has become rather rusty...

### Re. Java

I am not familiar with Java, but here are two properties that I think are desirable and that your description does not offer.

1. There is a difference between the ability for the compiler to "extract the interface", as you describe, and the ability for the programmer to specify the interface. The latter option means that there is some well-defined language to describe the interface exposed by a software component. I think this is an important property for a language to have, because it acts as a design tool, and a positive design pressure.

And of course this is quite hard, so in practice most languages are not doing as well as they could: their interface language is too restrictive to capture some semantic phenomena, which are therefore forced to be local (e.g. contractiveness of parametrized type definitions, allowing one to build well-defined recursive types from them). Such completeness would sometimes require too much sophistication, and it may only be present in a core, rather than user-visible, language.

2. Having the ability to describe an interface allows the programmer to program a component against a dependency that does not yet exist (or, of course, against several different components with the same interface). Admittedly, this kind of blackbox design is not very frequent and should not be (designing the API first, then implementing components on it, is a recipe for disaster if there is no margin left for experimentation and interface changes), but it is still an occasionally important property to have. I'm thinking for example of a professor asking students to write some code against an evaluation component, giving them only the interface for that component, not the implementation or compiled object. With an "interface extraction" tool, the teacher could extract the interface from the implementation (hidden but hopefully pre-existing), but if you really had no properly specified interface language, how would he/she explain the expected interface to the student?

### Specifying interfaces in Java

Arguably, the Java programmer does specify the interface, not using a distinct interface-definition language, but by what's declared public, protected, package (the default), or private.

In some ways, and at some times, that isn't as good as explicitly writing an interface definition separate from the implementation code, but if type inference is considered a good thing (so that programmers don't have to write out the types explicitly), can't interface inference be a good thing too?

And when you want an interface spec that's separate from any implementation (and can have more than one implementation), then to a large extent a Java interface fits the bill. (A Java interface definition looks much like a class definition but without method bodies and with all methods implicitly public.)

Some (many?) Java programmers make extensive use of interfaces, with classes used only as interface-implementations, and "factory methods" that return an instance of a suitable implementation class (rather than have "client" code explicitly instantiate classes using new).

I think that's a sign that Java does have something wrong, at least for cases where that's the best approach, or the approach the programmer(s) want to pursue, because it's a mere convention of use, rather than something the language directly supports and makes easy and nice-looking. But it's not the only reasonable way to use Java.
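As a minimal sketch of the convention described above (all names here are invented for illustration, not taken from any real API), the interface-plus-factory style looks something like:

```java
// Hypothetical example of the interfaces-plus-factory convention.
// Client code depends only on Parser and Parsers.newParser(); the
// implementation class can be swapped without touching client code.

interface Parser {
    int parse(String s);
}

// Package-private implementation, never named directly by clients.
class DecimalParser implements Parser {
    public int parse(String s) {
        return Integer.parseInt(s);
    }
}

public class Parsers {
    // Factory method: clients write Parsers.newParser(),
    // not `new DecimalParser()`.
    public static Parser newParser() {
        return new DecimalParser();
    }

    public static void main(String[] args) {
        Parser p = Parsers.newParser();
        System.out.println(p.parse("42")); // prints 42
    }
}
```

The complaint in the comment above is precisely that all of this is convention: nothing in the language stops a client from writing `new DecimalParser()` directly.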

### In some ways, and at some

> In some ways, and at some times, that isn't as good as explicitly writing an interface definition separate from the implementation code, but if type inference is considered a good thing (so that programmers don't have to write out the types explicitly), can't interface inference be a good thing too?

Having the ability to infer interfaces is fine. The problem is when you do not have the ability to specify one explicitly, and would not even know how to do that. Once you have a comprehensive interface language designed, it's very reasonable to provide a tool to infer such interfaces from the implementation. Such inference tools are of course even better if the user can see the result, because it is expressed in an interface language that she can understand.

(Inference can be difficult if you want to guarantee that the inferred result is the "best", most general one, but if you assume that the user will check the result and is able to modify it you can allow yourself a bit of approximation.)

That said, component boundaries are very often a very good place to drop inference and explicitly specify the types/interfaces of your system. I would expect interface inference to be a non-prominent aspect of the system -- though that could certainly change when component granularity gets small enough.

> And when you want an interface spec that's separate from any implementation (and can have more than one implementation), then to a large extent a Java interface fits the bill.

There certainly are pre-existing concepts in the language that are very close to such a notion of interface. Types in general are a very natural notion of interface, and are expected to be the main (or unique, depending on how you look at it) component of an interface definition -- you also need to give "types" (specifications) to type definitions if your language supports them, etc.

If you can write a bunch of class interfaces somewhere, and then write a class/source-file/component against those interfaces *as if* they were class definitions (unknown, but concrete) -- that is, write code against definitions for which you only have the interface, check and compile your code, and then later combine that with code actually implementing those classes -- you may have a form of separate compilation.

(As noted in the thread, "compilation" is not an absolute concept, and notions of "separation" become different when you have sufficiently rich link-time transformations, etc. I actually think that there is value in a rather strict interpretation of separation: instead of seeing those link-time optimizations as whole-program transformations, is it possible to see them as local transformations depending on finer *interfaces* for the other module, encompassing low-level invariants/aspects (e.g. memory representation, boxing choices...)? Is it possible to selectively disable the transformations that really are global, to leave a choice in the efficiency/separation tradeoff?)
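As a rough sketch of programming against an interface that is separate from any implementation, Python's `typing.Protocol` serves: it is a structural interface, so a checker could in principle also *infer* conformance, which connects the two halves of this discussion. All names below are illustrative, not from the thread.

```python
from typing import Protocol

# An explicit interface: client code is written (and type-checked)
# against this alone, with no implementation in sight.
class Stack(Protocol):
    def push(self, item: int) -> None: ...
    def pop(self) -> int: ...

def drain(s: Stack, items: list[int]) -> list[int]:
    """Client code: depends only on the Stack interface."""
    for i in items:
        s.push(i)
    return [s.pop() for _ in items]

# An implementation supplied separately; it satisfies Stack
# structurally, without ever naming it.
class ListStack:
    def __init__(self) -> None:
        self._data: list[int] = []
    def push(self, item: int) -> None:
        self._data.append(item)
    def pop(self) -> int:
        return self._data.pop()

print(drain(ListStack(), [1, 2, 3]))  # [3, 2, 1]
```

The client and the implementation can be written, checked, and even compiled to bytecode independently, which is a (weak) form of the separate compilation discussed above.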

### Java interfaces

Just a quick note on Java interfaces: they cannot really be used to describe the complete interface of a class. They can only abstract instance methods (such as accessors and mutators), but neither constructors nor ambient functionality (i.e. static methods).

### You can fake it to some

You can fake it to some degree: for static methods and constructors, you define a class with a parameterless constructor plus an interface for that class. That way you only depend on the parameterless constructor and the interface. You can use some reflection tricks to eliminate the dependency on the constructor, so that you can completely swap one module for another.

### Or you don't provide an

Or you don't provide an externally accessible constructor at all but only factory methods.
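A sketch of that factory-method approach in Python (all names are illustrative): the client depends only on abstract types, so one module's factory can be swapped for another's without the client ever touching a constructor.

```python
from abc import ABC, abstractmethod

class Connection(ABC):
    @abstractmethod
    def send(self, msg: str) -> str: ...

class ConnectionFactory(ABC):
    """Abstracts the constructor: clients ask the factory, never __init__."""
    @abstractmethod
    def create(self) -> Connection: ...

# One module's implementation; a different module could supply another
# factory and be swapped in without changing client code.
class EchoConnection(Connection):
    def send(self, msg: str) -> str:
        return f"echo: {msg}"

class EchoFactory(ConnectionFactory):
    def create(self) -> Connection:
        return EchoConnection()

def client(factory: ConnectionFactory) -> str:
    conn = factory.create()  # no direct constructor dependency
    return conn.send("hi")

print(client(EchoFactory()))  # echo: hi
```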

Just adding to Andreas' comment regarding C: I think it is worth commenting on Make. C not only has separate compilation in theory, but also decades of experience using it in a mainstream way, as part of the infrastructure of how C programs (at least on Unix) are commonly written and distributed. Everyone who installs Unix software knows the incantation "configure, make, make test, make install" without even consciously thinking of the second step as separate compilation depending on fairly complex configurations determined in the first step.

And in general, because C and Unix go hand in hand, lots of languages use Make to build their modules and thus often allow for separate compilation, including cross-compilation. And once your build routines are written in Make, programmers naturally migrate towards designing their modules for separate compilation.

So I'd say it's less the language than the build system.

### Make and incremental

Note however that Make also works fine for incremental compilation. That merely induces more dependencies, but Make itself couldn't care less. The only difference is more recompiles when something changed (and reduced potential for e.g. parallel compilation).

### make -j

Agreed on incremental. Not sure I follow the problem with parallel: make -j lets make run independent jobs in parallel wherever the dependency graph allows. Apple, for example, includes a fully worked-out parallel-compilation XCode project at /Developer/Examples/Xgrid . What are you saying goes wrong?

### Parallel requires independent

Nothing goes wrong, but you can only parallelise independent compiles. Hence, the more dependencies you have, the less potential parallelism.

### Things that other languages

Things that other languages offer that CL doesn't: simplicity and a well-designed standard library. For example, if we look at collections support in CL, there are cons cells and various operations on them with weird names, plus verbose usage of arrays and hash tables. Contrast this with modern languages: collections are based on generic protocols (for example, iterating over a linked list is the same as iterating over an array), there is first-class support for hash tables, lazy sequences/enumerables, etc. Of course you can throw out basically everything in CL and build your own standard library, but this theoretical possibility will not satisfy practitioners.
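For illustration, here is the generic-protocol point in Python, where a single iteration protocol covers lists, tuples, hash tables, and lazy generators alike:

```python
def total(xs):
    """Works on anything iterable: code is written against the
    iteration protocol, not against a concrete collection type."""
    acc = 0
    for x in xs:
        acc += x
    return acc

print(total([1, 2, 3]))                # list -> 6
print(total((1, 2, 3)))                # tuple -> 6
print(total({1: "a", 2: "b"}))         # dict iterates over its keys -> 3
print(total(x * x for x in range(4)))  # lazy generator -> 14
```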

### I would argue that Java

I would argue that Java became successful when it lacked a well-designed standard library, and that it was more complex and verbose than Common Lisp. Even now, although Java code can iterate over a list and an array in the same way, arrays are not well integrated with collections in other areas, and generics have made Java even more verbose.

Historically, Lisp emphasised lists as its main data structure and in effect followed a philosophy summed up by Alan Perlis in the Foreword to Structure and Interpretation of Computer Programs as "It is better to have 100 functions operate on one data structure than to have 10 functions operate on 10 data structures."

However, in Common Lisp, lists, strings and vectors (1-d arrays) are united as sequences, and at least most of the sequence functions don't have weird names. Using map, iterating over a list is the same as iterating over a (1-d) array.

Common Lisp does suffer from the late addition of CLOS which meant that many things that should have been done with generic protocols aren't. But I don't think the situation is quite as bad as you suggest.

(I can't work out, BTW, why it might be thought that hash tables aren't first class in Common Lisp.)

### Translate the following to

Translate the following to CL, and you'll see:

    a = {}
    a["foo"] = 4
    for k, v in a.items(): ...


### Yes?

    (let ((a (make-hash-table)))
      (setf (gethash "foo" a) 4)
      (maphash (lambda (k v)
                 ...)
               a))

### Or, if you prefer loop

Or, if you prefer loop

    (let ((a (make-hash-table :test #'equal))) ; You probably don't want a pointer identity test
      (setf (gethash "foo" a) 4)
      (loop for k being the hash-keys of a
            using (hash-value v)
            do (format t "~A => ~A~%" k v)))



### A nice illustration of "How

A nice illustration of "How have dynamic languages advanced beyond CL?".

### I'm not sure what it's

I'm not sure what it's supposed to show.

That hash tables aren't first class in CL? I don't think it does show that. (It's true that CL doesn't have a nice and re-readable printed representation for hash tables. I think that was a mistake, but a fairly minor one, and not one that stops hash tables from being first class.)

That some languages can manipulate hash tables more succinctly than in CL? Well, yes, but that's not contrary to anything I've said.

That dynamic languages have advanced beyond CL? It shows that some languages have a syntax that many prefer to CL's, and that is more succinct in many cases. But that's pretty much always been true, right back to M-expressions at Lisp's origin. So I don't see it as an advance, just as a different part of "language space" that some people prefer and that's always been available.

If we're looking for reasons why CL wasn't more successful, the syntax is of course a reason. Many people are put off by the density of parens and by some unfamiliar terminology (such as car and cdr).

However, if the problem is supposed to be that the syntax is too verbose, or that not enough data types have a notation that can be written in source code, then that theory faces the problem that Java was much more successful despite being worse in both respects. (I've spent a lot of time programming in CL and related Lisps and a lot of time programming in Java, and Java's verbosity is probably what I like least about it.)

And if we look at the fate of Lisp alternatives that have a more conventional syntax, they haven't always done any better. Consider the various Pops for example (Pop-2, Pop-11, Poplog), or even Dylan.

Anyway, if we go back to Michael Compton's question of advances "which could not be provided by a reasonably authored library", I'd say threads.

### Why CL wasn't successful is

Why CL wasn't successful is a difficult question, and I don't know the answer. But there are compelling technical reasons for choosing a language like Python over CL, the most important being the relative simplicity, the standard library, and the concise syntax that goes hand in hand with it. That doesn't mean those are the reasons CL wasn't successful, just that people are not necessarily irrational for not using CL (as some Lisp advocates would have you believe).

### I have no problem with

I don't have any problem with people picking a language other than Common Lisp. Python seems a good language to me, and it seems to have a much greater range of usable libraries than Common Lisp. (There have been some efforts to collect libraries for Common Lisp, but it's been a while since I looked at them and I'm not sure how successful they were.)

Nor do I have any problem with people preferring Python's syntax, or with the idea that it's more concise. (Indeed, I usually prefer that sort of syntax myself, these days.)

However, I would question the idea that Python is simpler, as a language, and the idea that the concise syntax goes with the standard library. An equally concise syntax could be provided for Common Lisp, it seems to me.

Language simplicity is a trickier issue, since it depends on what's considered to be part of "the language" and what isn't, and CL is notorious for not having a clear language-vs-library distinction. There's also the question of whether the language's syntax is included when considering complexity. (Lisp, even common Lisp, arguably has a very simple syntax, as programming languages go.) Then there's the question of whether complexity is really the right thing to look at. Perhaps what really matters is, instead, the learning curve programmers have to follow. So I think complexity is probably too big a topic for this thread.

However, I will say that Common Lisp now seems simpler than it did at first, back when the primary comparisons were to other Lisps and to C.

### Quicklisp

Zach Beane's Quicklisp seems to be pulling CL libraries in the right direction; not only does it make getting said libraries a lot easier, I think it may also encourage library authors to meet at least a minimal standard of quality as they aim to get into Quicklisp and so become available to a wider audience.

It is quite simple to hide any of the warts behind a macro in CL.

As for hash tables, there is a very well known and commonly used utility library, Alexandria, which provides hash-table functions.

I would certainly second the above sentiment that, compared to many 'popular' languages today, CL really doesn't seem that complicated in the main; there are definitely some very advanced aspects, but generally these are not required for the vast majority.

### Many people are put off...

...by the density of parens and by some unfamiliar terminology (such as car and cdr).

Two personal observations: 1) after 20 years of code walk-throughs in many languages from ASM and C to Java and Haskell, I find those parens and not-always-so-terse symbol names quite pleasant to read, at length, and in depth; and 2) after 35 years of learning, playing, and assisting piano students, I have yet to hear one bitch and moan about the dozens of symbols and Italian, French, and German terms used in music.

First off, LISP isn't really a language so much as a language-design tool for creating DSLs. You don't write LISP; you write the language you want in LISP. But anyway, in terms of the language itself, I'll use Steve Yegge's list from 6 years ago for the comparison with the dynamic languages.

1) A single standardized implementation.
2) Specs under the control of the active community; no legacy stakeholders.
3) Virtually everything modern is outside the spec and thus specific to the implementation: threads, filesystem access, GUI, Unicode...
4) Not object oriented.
5) Unsophisticated debuggers. For example, macros aren't well supported.
6) Type system issues (I don't understand this complaint, but I'm sure others do).

### Which languages do you

Which languages do you think have advanced beyond what CL can offer, and what exactly do they offer which could not be provided by a reasonably authored library.
I think it's worth mentioning that LISP, as Alan Kay puts it, "eats its own children"; i.e. any interesting or useful feature/paradigm which appears separately from LISP, or as a "next generation" of LISP, tends to get ported to an existing implementation at some point. Once this happens, it kills any momentum the other language had. He says Smalltalk does this too.

I'm not saying whether this is a good or bad thing in this context. The context that Kay mentions it is Smalltalk's original cycle of self-replacement, which stagnated once it became standardised and commercialised.

### Hello, I'm Joe the Plumber

. . . by that I mean I'm your typical journeyman programmer. There are millions of us out there chugging along writing apps in (insert your meat-headed language here). Back in the late 70's when I was in college, you had to be a math major with a 3.5 average in math courses your frosh year just to apply for the CS emphasis. And then you got to stand in line with your deck of cards. So when PCs came along, us Joes got a break. Remember "Turbo C++"? Millions of us got a bootleg copy of TC++ on our PCs, bought/xeroxed the book, and learned. Same with Charles Petzold WinC. Same with Visual Basic. Et cetera, et cetera. Why CL didn't win big is, to me, obvious: no books, no big promotion of a PC-based CL, no compelling PR campaign aimed at all the Joes and Janes of the world. . . . And there still isn't. Computing has morphed into its true home: the tabloid. Just look at any website. It's trying to look like an eye-arresting, 3-ring circus, i.e., a tabloid. Programming to produce such a thing is farther from "The Reasoned Schemer" than ever, and so too are the Joes and Janes who code for the dominant Web.

And so at age 56, after a career in C/C++/Java/Oracle, and then the mish-mash, kludge-hell of Web programming, I'm finally following the breadcrumbs to FP. After umpteen attempts with umpteen books, I'm finally slogging through "The Little Schemer." I'm on page 88 and I've just made it through my first look at (what is for me) serious recursion. I use a post-it to cover up the "answers," then try to do it myself on MIT-Scheme. I'm doing all right. I'm getting it. But I can say with 100% assurance that, as a beginner, recursion where you can't really hold in your head everything that's going on SCARES THE HELL out of the Joes and Janes of the world. Sure, other languages and their arcane methodologies can also make your brain hurt, but there's always a pot of gold at the end of the rainbow. But with FP and its recursion ... why am I doing this? Is it because lambda calculus is so cool? Again, a PR dead-end.

To sum this up, I'd say I've never seen any FP treatment that was truly meant for Joes and Janes. Something is always missing. Mostly, authors assume too much. Some are too tedious. And NOBODY seems to be able to explain why -- soup to nuts -- I would want to fool with FP in the first place. I know many will want to rebut this last point. Yes, show me this or that ref . . . and I'm sure I can tell you fairly quickly why the "masses" won't understand it.

So, is FP elitist, a secret, private club? Isn't that the real question here? I've spent a lot of time and effort investigating FP, trying to grok why I need it. Most people wouldn't take so much time and effort when other toolsets are bending over backwards to appeal to them.

### Recursion makes a very elegant solution

Recursion is a very elegant and straightforward solution for quite a few problems, the obvious one being traversing a tree structure.

I am certainly no FP purist but I have a deep interest in programming and try to learn as much as I can mainly for the interest but also for the side effect of helping me be a better software engineer.

No reflection on yourself, but the enterprise programming environment, mainly around C# and Java, is really geared towards the "bums on seats" corporate development environment.

### I'm not really Joe. . . .

No, I'm not really Joe the Plumber, i.e., crude, ignorant, and proud thereof. But I did spend lots of time in the "corporate development environment." Probably the worst was at an accounting software firm in Kansas that wrote VB apps. Probably the best was at the U of Missouri campus IT support, where literally any system/tool combo was cool as long as you could maybe show you were doing something. I put a sign on my office door: "Will work for pay." To this day I can't really tell you what I was doing with that SGI workstation.

The old argument of why you study comp-sci really doesn't fly in the world of corporate IT. You study the theory and fundamentals so as to better ride the winds and waves of change. But realistically, MS, Sun-Oracle, etc. will never bring out new stuff that their "basis" could not grok. These big players are in a symbiotic relationship with their developers and, yes, new tools/methods might be a challenge -- but nothing like the FP challenge.

Hey, maybe I'm wrong. Maybe there'll be a "Head First FP" from O'Reilly some day. O'Reilly -- sort of an industry bellwether -- does have a Lisp book ("Land of Lisp"), two Haskell books, an Erlang book, and some Clojure books. But after a quick glance, none of these 5#ers make the connection to lambda calc, which I would consider critical. What I like about "The Little Schemer" is that from day one I'm seeing lambda calc in action -- even though I won't really get an explanation of the connection.

IMHO, the whole FP paradigm is dying for the "perfect book." Not tedious, not flighty-cutesy, not lost in theory. (I hope to understand "Lisp in Small Pieces" someday. Honest, I do.) A perfect first chapter could be "Lisp: A Language for Stratified Design" from Abelson and Sussman, but explained in the most (for you LtU crowd) painful detail, so that a reasonably bright high school student could understand it.

Another example: Right now I'm on p. 92 of TLS. It's eqlist? time. The second attempt gives you a set of cond cases that are lovely-elegant. Yet in my meatheaded coder past, if someone had given me this problem (compare two lists for equality, considering also sub-lists), I would have blasted away at it with Java/Perl/etc. spaghetti code until it worked. And so would have millions of other Joes.

I guess FP needs to find better spokespeople, better books, better arguments for why the FP way is best, why elegance is best, why FP wins the abstraction game. Sure, hearing stories about Haskell winning coding contests piques interest, but then there has to be a path for the reasonably able and motivated to follow.

Maybe no form of real FP will make it to Kansas. Or maybe some consultant will come in and do some FP in javascript or Java for your project, then leave. But I hope for a better future.

### FP

But realistically, MS, Sun-Oracle, etc. will never bring out new stuff that their "basis" could not grok

The .NET compiler is functional. Sun uses the theory of primes over number systems defined on elliptic curves; I'm not sure what percentage of their "base" groks that -- I'd gather close to 0. Companies use technology all the time that their end users don't understand.

If you mean that Microsoft, Oracle ... will never release functional languages: F#, Fortress

_____

As for lambda calculus you can't not use lambda calculus in functional programming. It is like worrying that C language books don't have an explicit mention of the uses of punctuation in computer languages.

### recursion and FP

Why do you want recursion? Of course, books like The Little Schemer are probably showing you that loops and recursion are the same thing. But I think what you are asking is why write in a recursive style, and the reason is that you don't want mutable variables, i.e.

    for i = 1 to 10 {
        stuff
        i = x;
        stuff
    }

you want to absolutely forbid. Because if you can forbid that sort of thing -- that is, know for sure that it is not happening -- then you can execute the entire loop in any order you want, across multiple CPUs. You can execute part of the loop, etc... You no longer create accidental orderings of computation throughout your code.
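In Python terms (a sketch, using a language from elsewhere in the thread rather than the poster's own), the contrast looks like this: the loop version threads everything through reassignment, which implicitly orders the iterations, while the map and recursive versions compute each element's result independently, so nothing constrains the order of the work.

```python
# Mutable-loop style: the accumulator is reassigned on every step,
# so the iterations are implicitly ordered.
def squares_loop(xs):
    out = []
    for x in xs:
        out.append(x * x)
    return out

# Mutation-free style: each element's result depends only on that
# element, so the work could in principle run in any order (or on
# multiple CPUs) without changing the answer.
def squares_map(xs):
    return list(map(lambda x: x * x, xs))

def squares_rec(xs):
    if not xs:
        return []
    return [xs[0] * xs[0]] + squares_rec(xs[1:])

print(squares_loop([1, 2, 3]))  # [1, 4, 9]
print(squares_map([1, 2, 3]))   # [1, 4, 9]
print(squares_rec([1, 2, 3]))   # [1, 4, 9]
```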

____

As for why functional programming: because it forces you to break those bad habits you picked up from C/C++/Java. Your head should hurt; you are going to be learning a lot of very complex stuff you haven't considered before. Oracle (SQL) is semi-functional. As for an easy book: Programming in Haskell by Graham Hutton. The Little Schemer is good as well; if you like it, keep going.

____

Finally, the most popular programming language in the world, Excel, is a functional language. The masses do fine.

### Not programming

Finally, the most popular programming language in the world, Excel, is a functional language. The masses do fine.

It's a pretty big stretch to call what the masses are doing in Excel "programming."

### Excel

    A1 = 32
    A2 = 16
    ...

    A56 = Sum(A1..A54)
    B17 = Graph ....

How is that not a functional programming language (non-extensible, granted) with an extremely intuitive GUI?

### Photoshop

Visual programming?

### If Excel is meant to be an

If Excel is meant to be an FP success story, why do Excel programs get rewritten in VB after a certain level of complexity?

### No easy loops or simple

No easy loops or simple recursion.

### Why rewrite Excel?

Because the code is concealed behind the grid of cells, and can be easily destroyed by replacing the computed content of a cell with a fixed value.

The Improv/Quantrix style of spreadsheet provides rows and columns with meaningful names and rules that are displayed separately from the grid of cells. Computed cells cannot be overridden: only input cells (those whose value is not defined by rule) can be changed.
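The Improv/Quantrix distinction between input cells and rule-defined cells can be sketched in Python (a hypothetical toy, not how those products are implemented): computed cells are defined by rules and reject direct assignment, while input cells accept it, so the code behind the grid cannot be destroyed by overwriting a result.

```python
class Sheet:
    """Toy model: cells are either inputs (plain values) or
    defined by a rule; rule-defined cells cannot be overridden."""

    def __init__(self):
        self.inputs = {}  # name -> value
        self.rules = {}   # name -> function of the sheet

    def set_input(self, name, value):
        if name in self.rules:
            raise ValueError(f"{name} is computed by a rule; cannot override")
        self.inputs[name] = value

    def define(self, name, rule):
        self.rules[name] = rule

    def value(self, name):
        # Recompute on demand, so changes to inputs propagate.
        if name in self.rules:
            return self.rules[name](self)
        return self.inputs[name]

s = Sheet()
s.set_input("a1", 32)
s.set_input("a2", 16)
s.define("total", lambda sh: sh.value("a1") + sh.value("a2"))
print(s.value("total"))  # 48
s.set_input("a1", 10)
print(s.value("total"))  # 26
```

Trying `s.set_input("total", 0)` raises an error, which is exactly the protection the comment describes.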

### Nit Pick

In your example, I'm not sure pure FP will help a whole lot while maintaining out-of-order execution. You are short-circuiting the iteration in some way that may add just as much complexity to a recursive solution.

But it's probably not fair to pick it apart.

My takeaway from FP is to keep control of state as much as possible, but I'm not so inclined to rush to the purest FP language for purity's sake.

### iteration

That's the point of purity: you generally don't need an iterator, and this way you can tell.

1) If you just have an elementwise dependency, say f is a function applied to each element i of list i1, then just use map f i1. If you want it parallel, parMap strategy f i1.

2) If you just have a pairwise dependency, and it is associative (they generally are), then a fold will work.

etc...
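The two cases can be sketched in Python, with functools.reduce standing in for the fold (the parMap from the comment is Haskell's parallel map, assumed rather than shown here):

```python
from functools import reduce

xs = [1, 2, 3, 4]

# Case 1: elementwise dependency -> map. Each application of f is
# independent, so a parallel map (e.g. multiprocessing.Pool.map)
# could be substituted without changing the result.
doubled = list(map(lambda x: 2 * x, xs))
print(doubled)  # [2, 4, 6, 8]

# Case 2: pairwise, associative dependency -> fold (reduce).
# Associativity is what lets a parallel fold regroup the work.
total = reduce(lambda a, b: a + b, xs, 0)
print(total)    # 10
```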

The advantage of purity is that you don't have any kind of accidental impurities. You aren't going to have to add complexity to the recursive solution, since you write everything that way.