## Lisp is sin

People are discussing this blog post all over, so we might as well mention it here.

The discussion is quite balanced, though you are likely to disagree about the specifics. One issue, discussed here repeatedly, is that code=data doesn't require S-expressions. Lisp expressiveness runs deeper than that.
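The point that code=data doesn't require S-expressions can be illustrated outside the Lisp family: Python, for instance, exposes its own syntax trees as ordinary data via the standard `ast` module. A toy sketch (the rewrite rule is purely illustrative):

```python
import ast

# Parse source text into a data structure, rewrite it, then compile and run
# the result: code as data, without S-expressions.
tree = ast.parse("result = 6 * 7")

for node in ast.walk(tree):
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
        node.op = ast.Add()  # rewrite every multiplication into an addition

namespace = {}
exec(compile(tree, "<demo>", "exec"), namespace)
print(namespace["result"])  # 6 + 7 = 13
```

Whether this is as convenient as Lisp's uniform syntax is exactly what the thread below argues about.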

Our discussion of Spolsky's Java Schools essay is here, by the way.

## Comment viewing options

### Lisp is hackerish, but it has got some pretty good ideas.

Although Lisp is quite hackerish, it has some pretty good ideas, the most notable of which is macros/code=data.

Personally, I do not think that such a hackerish programming language is ready for use in IT environments, because it lacks a certain formality.

Java has formality, and it covers most of the needs of a project, as do Python, Perl, and others. But the real problem with those languages is that they do not allow extension with domain-specific languages. The Java tax is that everything must be a class, an interface, an API; but some things would be much simpler if there were a DSL, i.e. a declarative way to do them, which Lisp can offer.
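As a small illustration of the "declarative way" being argued for, here is a toy internal DSL in Python: validation rules stated as plain data and run by a generic interpreter. The rule table and the `validate` helper are hypothetical names invented for this sketch, not from any framework:

```python
# A declarative mini-DSL: each rule is (field, predicate, error message).
# Adding a rule means adding a line of data, not writing a new class.
RULES = [
    ("age",   lambda v: isinstance(v, int) and 0 <= v < 150, "age out of range"),
    ("email", lambda v: "@" in v,                            "email missing @"),
]

def validate(record, rules=RULES):
    """Return the error messages for every rule the record violates."""
    return [msg for field, ok, msg in rules
            if field in record and not ok(record[field])]

print(validate({"age": 200, "email": "alice@example.com"}))  # ['age out of range']
```

Lisp macros would let the rule syntax itself be extended; this data-driven version is the closest sketch available without them.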

### Double-edged macros

Macros are powerful, but they introduce risks when it comes to code maintenance, since they are easily abused. A couple of people I know swore off Lisp-like languages after having to maintain code written in such languages by other people.

### Of course...

The same thing could be said in regards to any language. Many (most? all?) languages have constructs that can be abused. Unfortunately, it seems the more powerful the construct, the more likely it is to be abused. I've seen Java and C# code abused too, but I don't blame that abuse on the language, I blame it on the programmer. The best way to avoid abuse from a language design standpoint is to cripple the language so much that "there is only one way to do it" (even if other ways are easier to understand in particular instances) and that way panders to the lowest common denominator. But even this approach won't guarantee abuse won't occur. (I was hesitant to use the term "lowest common denominator" as that makes me sound like an elitist. I don't mean lowest common denominator as "simple for simple people"; rather, I mean the most likely to be understood. Many times using a "lowest common denominator" approach is the correct approach, but many times it is not.)

### The best way to avoid abuse f

The best way to avoid abuse from a language design standpoint is to cripple the language so much that "there is only one way to do it" ... and that way panders to the lowest common denominator.

You could easily claim that this sort of approach was taken by Ada and Eiffel. I wouldn't say either language panders to the lowest common denominator, or, at least, they do provide a sufficient environment to get a lot done. You certainly can't count either language as "crippled".

### I'm really not familiar enoug

I'm really not familiar enough with either Ada or Eiffel to specifically comment on either language, so I'll comment on a language that takes this approach that I am familiar with: C#.

Now, C# is improving, and it's not that bad. When Microsoft introduced the C# language, their goal was to compete with Java, a language that was threatening their market; hence they went after a language like Java but (at least in Microsoft's eyes) better. Now, as you may know, Java was created to be a language that panders to the lowest common denominator. (When I say "lowest common denominator" I don't mean that Java is easy; I mean that it initially avoided concepts that were unknown to many in the programming community, and hence would be easy to pick up by most programmers.) Thus C# naturally follows suit. This isn't a bad thing most of the time, but sometimes it is a very bad thing. Some algorithms are more difficult to express in Java or C# than in other languages because they lack closures. Granted, C# 2.0 does have anonymous methods, but those are gross to look at (and hard to parse). C# 3.0 is supposed to get true lambda expressions (concise, and elegant!), but I have to wait x years for that.
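For readers unfamiliar with the closures being discussed, a minimal sketch in Python (the `make_accumulator` name is illustrative, not from any library):

```python
# A closure: the inner function captures and updates a variable from its
# enclosing scope -- the kind of thing the comment says was awkward in
# Java and pre-3.0 C#.
def make_accumulator(total=0):
    def add(amount):
        nonlocal total
        total += amount
        return total
    return add

acc = make_accumulator(100)
print(acc(10))  # 110
print(acc(10))  # 120
```

Each call to `make_accumulator` produces an independent piece of state plus the code that manipulates it, without declaring a class.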

Remember: there is a whole world out there just waiting to be discovered! The languages we use to reason about that world have a great impact on how we perceive that world. Just think of how the language of calculus affected mathematics. Physics has undergone many great discoveries just because physicists had a better tool with which to express their thoughts.

### Macro Languages

I think it's a problem unique to constructs that let you redefine the language. Things like operator overloading sometimes get criticised for this, but I'd always want it in my language because often there is an "intuitive" meaning for an operator overload, e.g. two strings being concatenated together.
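A small sketch of that "intuitive overload" point, using Python's operator overloading (the `Vec2` class is a made-up example):

```python
# Overloading + where it has an obvious reading: component-wise addition
# on 2-D vectors, just as + on strings means concatenation.
class Vec2:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __add__(self, other):
        return Vec2(self.x + other.x, self.y + other.y)

    def __repr__(self):
        return f"Vec2({self.x}, {self.y})"

print("foo" + "bar")            # foobar (the built-in overload)
print(Vec2(1, 2) + Vec2(3, 4))  # Vec2(4, 6)
```

The abuse case the thread worries about is when + is given a meaning no reader would guess.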

Things get less clear when a language introduces a way to redefine its meaning at will ... the legacy C/C++ macro preprocessor is an obvious example of this, and Lisp macros are kind of the ultimate form.

The problem is that they let you basically invent a new language that happens to look exactly like the existing one. If I spoke in English but had a tendency to make up words all the time then people would treat me as weird and I'd certainly be abusing the language ... people would find it harder to understand what I meant. Obviously, I can "abuse" English in ways that make my meaning clearer occasionally, but I've got to be careful as I can equally end up talking total crap :)

In the case of Lisp macros, judging from many of the examples I've seen, the things they're used for could often be done with other language features in more modern languages, or are quite esoteric anyway. So maybe it isn't worth it.

### subtle differences

The problem is that they let you basically invent a new language that happens to look exactly like the existing one.

Lisp is the champion in this regard, but Haskell monads are almost as crazy. The C++ equivalent would be overloading operator;.

If I spoke in English but had a tendency to make up words all the time then people would treat me as weird and I'd certainly be abusing the language...

Maybe; but I have been told this is pretty fluval in German.

### Synthetic vs analytical languages

English, being what linguists call an analytic language, uses a largely fixed set of morphemes (words) to express ideas, and complex ideas are expressed in English by composing phrases with multiple such words. Other than the occasional prefix or suffix for things like plurals, verb tense, and the like, English isn't big on cramming arbitrary words together to make new ones. (Compound words do exist in English, but there is a finite and well-known set of them.)

Other languages--the synthetic languages--cram words together to make bigger words (which express more complicated ideas) all the time. German is an example of a synthetic Western European language; other synthetic languages (many of which are more synthetic than German) include Welsh, Turkish, Japanese, and numerous native American tongues.

German is hardly unusual in this respect.

### strange though

that english is a germanic language.

### Um, no.

You are systematically confusing things in two domains: (a) language vs. writing, (b) morphological inflection vs. word formation (derivation and compounding). Really, English and German are almost the same when it comes to the morphological rules for compounding; the difference that leads people to make claims like yours is that in German orthography, compounds are written without spaces between their parts, while in English, people do put spaces. But whether an orthography for a language puts spaces or not has little bearing on the grammar of the language.

That's your language vs. writing confusion. I have less evidence to go on the next claim, but I think you're also confusing the notion of an "analytic" language with that of an "isolating" one, and correspondingly, the notion of a "synthetic" language with that of a "fusional" or "agglutinative" one. The first terms have to do with a language's preference for using word-formation or compositional phrase formation; the second ones with inflectional morphology.

Haskell monads are nowhere near as crazy, for the simple reason that you can easily find out when one is being used, and finding out which one is usually not much harder. Also, unlike C++, Haskell doesn't use an equivalent of operator; in normal code.

### Don't be so cromulent.

Don't be so cromulent.

### Redefining the language

If I spoke in English but had a tendency to make up words all the time then people would treat me as weird and I'd certainly be abusing the language

But you'd be weird if you programmed without making up words, i.e. naming functions, classes, variables, macros, etc.

### ...therefore all programming

...therefore all programming is language design. :)

### ...and therefore macros are u

...and therefore macros are useful, since they are really good at designing embedded languages.

For some reason, I feel like we're going in circles. :)

### But...

The problem is that they let you basically invent a new language that happens to look exactly like the existing one. If I spoke in English but had a tendency to make up words all the time then people would treat me as weird and I'd certainly be abusing the language ... people would find it harder to understand what I meant.

But this sort of thing happens all of the time in English. Read an advanced book on economics and there will be all sorts of words you don't understand. The book is written in a language that looks an awful lot like English. The words look like English words, in fact they may even be words you've used before, but the context and meaning inferred by those words will be completely foreign. Books on philosophy, religion, and sociology are all written in languages that look almost, but not quite, like each other. The English we use right here on this forum looks almost, but not quite, like the English used in these books, and the media at large. However, you place an economics student who has never studied sociology in a room full of sociologists discussing sociology, and you might as well have placed him in a room full of people speaking a foreign language. (Well, not quite, they can understand a subset of each others' languages.)

In this forum, the word "function" has a specific meaning that can be inferred from the context by others who are knowledgeable about programming languages. Likewise the word "type" has specific meanings that can be inferred from their context. A person ignorant of programming languages and programming in general will have a difficult time understanding what "function" and "type" mean from their contexts, even though those two words look exactly like words they know and use in their everyday language. Of course, there are other words used in our domain (and in others) that are completely made up (not simply redefined). For example, in mathematics (and computer science) you might see the word "homomorphism," a word that has no meaning outside the realm of mathematics (at least as far as I am aware).

Yes, if you enter a room full of sociologists, stand on a chair in the center of the room, and start discussing the details of type systems, they might look at you like you're quite mad. You shouldn't expect them to understand you; they're sociologists. However, I would be surprised if you thought they were mad simply because they were discussing sociology with each other in a language you didn't understand. Sociologists are not mad because they have created a dialect of English to express the concepts and ideas of sociology. Likewise, it isn't crazy for a programmer to create a dialect for expressing a particular set of ideas of a certain domain, when you're working on solving problems in that domain. We expect programmers, sociologists, mathematicians, and other professionals to learn the domain-specific languages spoken in each domain, because those languages make expressing ideas in those domains much more concise and simple. Likewise, why should we not expect a programmer to learn the domain-specific language created for expressing concepts, problems, and solutions in the domain of the program? The end result is better, is it not? We now have a simple solution that is abstracted away from the machine, and into the domain of the problem at hand, making communication in that particular domain simple and concise.

### minor OT

"For example, in mathematics (and computer science) you might see the word 'homomorphism,' a word that has no meaning outside the realm of mathematics (at least as far as I am aware)."

"homo-" same, "morphism" - form (more or less, I don't speak Greek).

One of the reasons modern English is complex is that it's an amalgamation of several languages, even in everyday usage (as are most major languages). In regards to your post more generally, while terms and jargon are introduced, rarely (except apparently in abstract algebra) are the terms completely divorced from the words they displace. If you told someone what a "function" was in programming they would not find it an unreasonable use of the word.

### If you told someone what a "f

If you told someone what a "function" was in programming they would not find it an unreasonable use of the word.

I don't make that claim. In common language the term often means the behavior, or use of something as in "the line functions as a boundary." In computer science we may use the term to mean a subprogram, or procedure or we may use the term in the mathematical sense. Someone might not have a hard time understanding where the term came from, once they understood the concept. Until then, they would have difficulty understanding our conversation without knowing what we mean when we say "function."

I know when I first came to LtU, I had a hard time understanding a lot of the conversations. They were full of jargon. We have "currying," "category theory," "laziness," "referentially transparent," and a host of others. Now, the terminology became clear once I learned the definitions. But until then, I was at a loss as to what was being said (except for the occasional word or phrase I understood, such as those borrowed from mathematics). Is this not the same as macros in Lisp or Scheme? Once I understand the meaning of a macro (what it does), its use and name are clear to me. Until then I am at a loss, and confused. This confusion isn't bad. It just means I need to learn the jargon for that particular problem domain. In the end it boils down to a language issue. Programming is language design, because you define new words so that the expression of concepts in your domain is both easier and clearer for those who understand your language.

### I'm reminded of something a p

I'm reminded of something a pre-law friend of mine said, when a couple of us were geeking out about computers:

"You're saying words, and I know what the words mean, but they don't make any sense together!"

Think about how programmers use the words object, method, variable (is this even a noun in standard English?), function, server, client, network, class, type, primitive, compiler, interpreter, window, icon, mouse, and desktop. Now think about their actual, typical definitions. Granted, a lot of these terms have passed into common usage, so a layperson can probably understand your desktop windows. But if you say "The menu class on the client isn't calling the right method on the server", a typical layperson will visualize an educational institution full of restaurant offerings picking up a telephone to speak to the (method is unparsable in this context) waiters.

The best way to avoid abuse from a language design standpoint is to cripple the language so much that "there is only one way to do it" (even if other ways are easier to understand in particular instances) and that way panders to the lowest common denominator.

So, like Java, isn't this what a pure-FP language is doing by eschewing side-effects? Its LCD is simply based on a different subset of the university: math types vs. CIS.

But, even this approach won't guarantee abuse won't occur.

### Yes.

So, like Java, isn't this what a pure-FP language is doing by eschewing side-effects?

There is a difference between advocating the avoidance of side effects (as in Scheme) and forbidding them altogether (Haskell)! I would have to agree that, in a sense, Haskell is crippled due to a lack of side effects. However, Haskell doesn't avoid side effects to enforce a single mindset in programming, but rather avoids them to gain power in other areas (lazy evaluation allows certain algorithms to be expressed easily). This is unlike other languages where there really is only one way to solve a problem, even if a different type of solution more naturally fits that problem. Think of how some problems are so simple once you use lazy evaluation, or closures, or something else.
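The lazy-evaluation point can be approximated in Python with generators: an infinite stream from which we demand only what we need. This is only an analogy, not how Haskell implements laziness:

```python
from itertools import islice

# An infinite, lazily produced stream of Fibonacci numbers. Nothing is
# computed until a consumer asks for the next element.
def fibs():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

print(list(islice(fibs(), 10)))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

The algorithm is stated once, as an unbounded definition, and the "take only ten" decision lives entirely with the caller.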

"I would have to agree that, in a sense, Haskell is crippled due to a lack of side effects."

I've thought about mentioning at some points the following (and may have, I don't remember): Of most of the languages discussed here Haskell is... well incomparable in expressiveness but fudging some... Haskell is very near the lowest end of expressiveness (a la Felleisen though less formally). There is a single, simple, fairly systematic, and surprisingly sleek global transformation that allows most of that expressiveness to be "recovered", plus some perks: namely, monadic style.

This is pretty much obvious, but the thing that really drove it home to me was the "discussions" (particularly the ones I was involved in) about the "expressiveness of macros". The simple thought experiment of considering: assembly with Lisp-style macros (either with assembly as the macro language or another language). It's immediately clear that there are many things that require coordinated use of macros and are not "macro expressible" a la Felleisen. I.e. it's clear that you cannot make Scheme, say, as a large pile of macros without essentially writing a compiler. This perspective also makes some other things come out as being extensions in "expressiveness": e.g. global (and/or interprocedural) register allocation, GC (obviously), and some others you can no doubt think up.

Anyways, I think I've stirred enough hornets' nests today (perhaps).

### Anyways, I think I've stirred

Anyways, I think I've stirred enough hornets' nests today (perhaps).

No, you're making us think about what we're doing, and showing us different possibilities. That's a good thing.

### XML Hell

Unfortunately, most server-side Java does indeed involve several pseudo-declarative-DSLs - in the form of XML configuration files, deployment descriptors and IOC glue-files.

### Style equals formality

I think that some varieties of Lisp are ready for IT.

It is definitely the case that a large number of Lispers code in a certain way that may seem hackerish, but this does not have to be so.

It is largely a matter of style. There are some bad habits in the style used by most Lispers that make it difficult for newbies to absorb, and for those coming from different language paradigms to get used to. There is no hard rule that dictates that the language be presented the way it is by many of its users (choice of formatting, naming conventions, etc.).

This is something of a disservice to Lisp itself that can even be linked to the presentation of the language as given in the relatively few textbooks available.

The potential for formality, if desired, is already in Lisp. It is just not being put forward enough by its practitioners.

### Formality?

It's curious you would say that Java has a formality that Lisp lacks. In the technical sense, I think this is incorrect. For example, Java does not come with formal semantics or a specification for a core calculus. I believe work has been done on this in recent times (e.g., Featherweight Java, etc.), but this has come much after the fact.

In comparison, there are formal semantics for Lisp (at least Scheme, probably Common Lisp), and the fundamental characteristics of the language are based around a well-understood mathematical model: the lambda calculus.

What you call "formal" about Java, I think would be better stated as "idiomatic." It would probably not be wrong to say most Java code is more idiomatic with respect to certain conventions (e.g., Design Patterns) than Lisp is. I think this stems in part not from a "hackerish" quality of Lisp, but simply from the fact that Lisp is quite a bit more expressive than Java, and allows many more styles of abstraction for solving particular problems than are correspondingly available in Java.

Is this a bad thing? I think it depends upon the context. If you need to write code that any programmer off the street can understand, then it's definitely bad. But if you expect that the type of programmers working on the code will be familiar with a broad range of computational styles and Computer Science theory in general, then I believe the extra expressivity offered by Lisp can result in better code (e.g., shorter, less buggy, more flexible, etc.).

### Corollary to Greenspun's Tenth Rule

I will here repeat my corollary to Greenspun's Tenth Rule of Programming:

Any sufficiently complicated Common Lisp or Scheme program will eventually be rewritten in C++, Java, or Python.

(In the spirit of the original Tenth Rule which must be extended to cover languages besides Fortran or C, you may add languages like C# or Ruby to that list if you like.)

The bottom line is, HR departments are more likely to find Morts than Einsteins, and Lisp and Scheme cater almost exclusively to the latter group. So for the good of the survival of the project, once it reaches a certain size the manager will eventually have to have it rewritten in a language with a more robust developer community (which means one with more mediocre programmers).

### Scheme is easy

Am I the only one who thinks that you don't have to be a genius to program in Scheme? Isn't it used as a teaching language? It's much easier to understand than Java, C++, or C! If I was teaching a programming class to children, I would use Scheme.

### Easy like Arithmetic...

but also difficult like Mathematics. I say this because while it is easy to learn the Scheme language, it is difficult to learn all of the things that you can do with it.

### Scrolls like butter...

Common Lisp is, in my experience, somewhat the other way around: It's hard to learn, but becomes relatively straightforward to use later on. However, I have written a lot more Common Lisp code than Scheme code - my guess is that it's similar in Scheme when you tie yourself to a particular Scheme implementation. Common Lisp has a much broader standard specification, so it's much easier to translate your programs from one implementation to another.

### As long as you have no prior experience

That was my impression too, though I'd previously learned Common Lisp so had at least some familiarity with parens and functional programming. I think Scheme is easier to learn if you haven't been exposed to a previous (non-functional) language. There are far fewer arbitrary rules than in, say, C, and fewer core concepts that you need (no objects, inheritance, polymorphism, keyword args, and so on). I remember learning C when I was 13 and being totally overwhelmed by where to put all the braces, semicolons, parens, dots, stars, and arrows. Then I remember learning C++ and being overwhelmed by classes, polymorphism, inheritance, private/protected/public, private/protected/public inheritance, and so on. Scheme doesn't force you to deal with any of that bullshit: you just define functions and call them.

### Choosing monkeys over innovators

"Any sufficiently complicated Common Lisp or Scheme program will eventually be rewritten in C++, Java, or Python."

You need enough maintenance monkeys after the real work and creativity has already come and gone.

As for a "robust" developer community, don't you mean a nice cage of monkeys for the task masters (management) to choose from?

### Let's try not to use insultin

Let's try not to use insulting names such as "monkeys". Thanks.

### porting

"Any sufficiently complicated Common Lisp or Scheme program will eventually be rewritten in C++, Java, or Python."

how about, any sufficiently useful program in any language will eventually be expressed in every language.

### Why so disparaging on maintenance programmers?

I'm curious why you're so disparaging on people who go into a project and clean up the mess left by the initial feature spikes.

Is this a common belief or just laseray's opinion?

### Monkeys?

There are some excellent developers in the world who basically do nothing but maintenance, hell, what do you think Raymond Chen has spent much of his life doing?

Sometimes these guys are a lot better than the geniuses who just exited stage right; after all, anybody can write code, but reading somebody else's can be somewhat more difficult even when it's in a language you already know ...

### Making things easier

As I recall, one of the primary goals in designing the Smalltalk language was to let end users, including children, create and modify their own software as needed.

It seems strange to me that Lisp, Scheme, and Smalltalk are viewed as languages for "smart people", "academics", or "computer science types" while Java and C++ are viewed as languages for "normal programmers" or people of "more average intelligence". I would much rather have an inexperienced programmer working in a language like Scheme or Smalltalk where things are easy and simple than in a language like Java or C++ where things are hard and complicated.

Glomek

### Indeed

In the LfSP subthread I mostly spawned, I was thinking about a post along the lines that, considering most languages regarded as LfSP, LfSP should better stand for "Languages for Stupid People", and it should be worn proudly. Languages like O'Caml, Haskell, Scheme, Smalltalk, etc. make it much easier to program and much harder to "shoot yourself in the foot". They also tend to be much simpler to learn and understand (at the very least, for a tabula rasa) than the "Languages for the Masses". Thus "stupid" people can get more done and are less of a "danger" to themselves and others in LfSP.

### "Hardcore" languages

That reminds me of some people's perception that C and Assembly are hard languages (I guess because they're close to the metal), but "scripting" languages are easy.

To me, Ruby would be a lot harder to learn than straight C.

### I would agree with that perception

Think about teaching C to some new programmers vs. teaching Python or Ruby. You can get to interesting problems a lot faster in Python or Ruby, because you don't have to teach pointers or about memory management or C-style strings, and so on. There's no comparison at all. Look how easy it is to use dictionaries/hashes in Python, for example.
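To make the dictionary point concrete, a word-frequency count in Python is a few lines, with no pointers, allocation, or manual string handling in sight:

```python
from collections import Counter

# Count how often each word occurs in a sentence -- a one-liner with a
# dictionary-like Counter, versus a hash table built by hand in C.
text = "the quick brown fox jumps over the lazy dog the end"
counts = Counter(text.split())

print(counts["the"])          # 3
print(counts.most_common(1))  # [('the', 3)]
```

Even without `Counter`, a plain `dict` with `counts.get(word, 0) + 1` stays at a handful of lines.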

Now getting to where you fully understood a language on all levels, which is fairly easy in regard to C, would certainly be much more work in Ruby.

### "Common Person" programming

It's a great idea, but a quick look at the daily wtf will show you why it could never work.

### Bottom line

Exactly. To put it differently: What's the bottom line? What's the least we can expect from "average" programmers?

### Prepare to be surprised

The bottom line is way lower than we expect ;)

### Unfortunately...

many programmers have egos. It seems to be a necessity for the profession. Some keep their egos in check; others--and this phenomenon is industry-wide, and not just confined to a couple of LtU posters :)--occasionally use terms like "monkey" to refer to programmers whom they consider to be of lesser ability.

In particular, many programmers who consider themselves to be expert programmers, get rather annoyed by industry's attempt to treat programmers as interchangeable commodities (and development tools and methodologies which de-emphasize expertise)--and for some reason, rather than complaining about industry and its management, instead dump on the poor schmoes who accept such jobs--many of whom, indeed, are less skilled and less educated than the experts are (but who are, in most cases, sufficiently skilled to perform the duties they are hired for).

Most such industrial programmers don't bother with forums like LtU, c2, cliki, etc.--programming for them is not important in and of itself; it's a job which pays for the things that they consider really important. (Which is fine--I have no issue with dispassionate professionalism). Thus, they are under-represented here. One of the longstanding stories on c2 wiki is of Sam Gentile, a not-unknown programmer/consultant who is an expert in the MS way of development. He was, for a long time, a regular on c2--and was routinely shouted down and insulted by the Smalltalk/LISP crowd there, who openly held him in contempt because his development tools of choice were Visual C++, COM, and other Microsoft kit. Eventually, and sick of the abuse, Sam quit c2 and summarily deleted all his contributions from the wiki; and to this day refuses to have anything further to do with the place. I don't blame Sam one bit (this occurred before I started posting to the wiki)--I wouldn't stay long at a forum which treated me in such a hostile fashion.

Occasionally--I'm a good programmer, and I have an ego too--I'll feel the urge to refer to other programmers (usually in the abstract) as "monkeys", "code grinders", or other such derogatory language. It's a cheap and easy way to stroke one's ego. But, it's entirely counter-productive--and it makes many of us who are academics and/or programming gurus, look like complete a**holes. 'Tis no wonder why industry frequently regards academic computer science as irrelevant.

### There's another usage of "mon

There's another usage of "monkey" though, which refers to the role rather than the people in it. I'm not sure it's necessarily all that much better, but the sentiment "I'm fed up of this job, you could hire a monkey to do it!" is also an understandable one.

### The LtU style

You make good points. I know exactly why people use terms like "code monkeys". However it is unproductive for the kind of discussion we like to have here on LtU.

So let's just say, "not on LtU please" to this sort of terminology.

It is quite on topic, indeed essential, to discuss programming ability when you discuss programming language features. But we should try to do this without insulting anyone more than we really have too...

### Actually

people with advanced degrees working in centers of higher learning are generally called some variant of "egghead ivory tower intellectuals". Heck, saying that something is "academic" is another way of saying it's irrelevant. The sneering goes both ways.

But the term "code monkey" comes from the phrase "if you pay peanuts, you get monkeys." Also the theory that if you just get enough monkeys banging away on typewriters, they'll eventually type Shakespeare. The problem here is not that coder monkeys exist (who are working for peanuts in a million-monkey approach to software)--so long as I don't have to maintain or use their code, I don't much care one way or another. But the idea that programming (which is way more than just coding) can be mechanized and dumbed down to the point where that approach will work is insulting.

Programming is about way more than coding. Yes, the end product is (usually) working code. But a large part of the job, as any experienced programmer can tell you, is figuring out what code needs to be written. Specs are ill defined and ephemeral (when they exist at all), requirements shift like smoke on the wind. The trick is to see the real problem that needs to be solved, and solve that. Programming is about solving problems. Which is what coder monkeys don't do.

### Not sure I agree

But the idea that programming (which is way more than just coding) can be mechanized and dumbed-down to the point where that approach will work is insulting.

Computers and the programs they run have been known to displace and monitor many sectors of the workforce. One goal of software engineering should be to make programming dirt simple. I don't see why programmers should be any more exempt from the march of progress than any other profession.

That said, software design and programming are mentally intensive processes, requiring one to attend to many details. The ability to simplify this state of affairs has thus far seemingly eluded us. But that does not mean that software people are immune. Indeed, many of the skills I have exercised over the last 25 years have been automated. The job that I do today is similar, but the tools that I use are very much different. I suspect that this process will continue well into the future.

### Programs versus programming

A certain program may replace a certain job, but software design is at the meta-level, so replacing that job is qualitatively different. There are hard problems in life we're trying to solve, and we're trying to get the computer to help us with the repetitive (least difficult) aspects.

Consider the analogy of writing an essay. The word and sentence level, the level of the language, is the easiest part. Having something to say and organizing it is the hard part. I never think "if only English was better (or I was using another language) this essay would be so much easier to write." Similarly, the most I expect from a general purpose programming language is for it to become unobtrusive. I still have to determine precisely what I want to solve, and precisely how to solve it algorithmically, leaving no gaps of creativity. My internal algorithmic language is at its core small, simple and essentially unchanging (though its library, the things I know how to reduce to the core, grows over time), and I simply want a programming language to reflect it (including the ability to name any repetitive pattern and add that to the library).

### One of the most repetitive tasks...

...is the task of programming itself. Can't speak for anyone else, but most projects I've worked on have some aspects that are quite tedious. I'd just as well that these aspects go away such that I was further up the meta-level ladder.

And that's the deal. As we make the simple things easier, the kinds of problems we solve become more and more abstract. The goal is not to say that we have somehow reached equilibrium and should not seek to upset the balance. The goal is to write programs that write programs that write programs.....

As for the analogy of essays, that kind of assumes that what we strive to do is art - with aestheticism instead of utilitarianism as the aim. Although I appreciate the beauty of algorithms as much as anyone, the ultimate goal is, as you say, solving problems. Creativity is a necessary component for solving problems, but it is not an end, in and of itself.

Anyhow, what I was trying to say is that the act of programming really is no different than any other endeavor. If non-programmers can be made more productive (or replaced) by programs, then why should that same force not act upon the programmers? The one thing that programmers have going for them, though, is that it is exactly a meta/abstraction process. We may automate what we do now, but we end up creating something that itself has to be maintained and moved forward. Our jobs are not static and we should not view them as such.

### The future...

"We may automate what we do now, but we end up creating something that itself has to be maintained and moved forward. Our jobs are not static and we should not view them as such."

One of my not-too-serious perspectives on the future: Formal methods finally succeed and we now have specification languages where we can completely generate the program from a specification. Programmers are obsolete and out-moded. Now we just write specifications and the computer programs. Of course writing an accurate specification is difficult so we have Specification Writers. There are also various specification languages with their own quirks and perks. And, of course, some specification languages attempt to make things easier by restricting what you can write to avoid obvious nonsense. Some Specification Writers love this, others find this a hindrance. There are great debates on the value of this and it marks a major divide in Specification Languages. Hail the march of progress!

### A specification is just a higher level language

Wouldn't a specification language just be like Prolog, or some other logic language?

### .

That's part of the "joke". A complete specification is equivalent to a program. As far as I can tell, the only formal way to differentiate between "specification language" and "programming language" is the support for (somewhat informally) partial specification. Of course the practice and pragmatics are totally different.

To answer your question: yes and no with more emphasis on no. Any programming language is a specification language, it doesn't need to be a logic language. Incidentally, the "future" described partially exists today with assembly and compilers except for the crucial aspect that programmers in "high-level" languages don't think they are doing something fundamentally different from what assembly programmers did/do.

### not all programming languages are specification languages

I believe specification is concerned only with what to do and not how to do it. Most imperative languages describe exactly how to compute something, while a pure logic language would only describe what conditions must hold, and allow the compiler freedom to choose exact algorithms satisfying the conditions. This would be the difference between a "specification" compiler and current optimizing compilers, which only do this on a small scale, and are not easily extensible to include new algorithms.

### Specs and programs are different

A specification is a relation between the input and the output.

A function is a procedure which takes the input and computes an output.

A specification doesn't nail down any particular implementation -- if your spec says that for all positive integers n, f(n) returns n prime numbers, then there are many different possible implementations of f, all of which realize this specification.
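
One way to make the spec/implementation distinction concrete is to write the spec as an executable relation and check different implementations against it. A sketch in Python (this assumes one possible reading of "f(n) returns n prime numbers"; all names here are mine, not from the thread):

```python
# Spec as a relation between input and output: many implementations can
# satisfy the same spec. "n distinct primes" is one reading of the spec.

def is_prime(k):
    return k >= 2 and all(k % d for d in range(2, int(k ** 0.5) + 1))

def meets_spec(f, n):
    # the relation: the output has n elements, all distinct, all prime
    out = f(n)
    return len(out) == n and len(set(out)) == n and all(is_prime(p) for p in out)

def f_iterative(n):
    # one implementation: scan upward, collecting primes
    primes, k = [], 2
    while len(primes) < n:
        if is_prime(k):
            primes.append(k)
        k += 1
    return primes

def f_recursive(n, k=2):
    # a structurally different implementation, same spec
    if n == 0:
        return []
    if is_prime(k):
        return [k] + f_recursive(n - 1, k + 1)
    return f_recursive(n, k + 1)

assert meets_spec(f_iterative, 5) and meets_spec(f_recursive, 5)
```

Both functions realize the relation; the spec itself never says which one to use.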

Loosely speaking, you can think of a spec as a type, and a procedure which meets the spec as a function with the spec as its type.

If you pretend this is exact, rather than loose, you could try to apply the Curry Howard isomorphism for logic programming, which interprets a goal as a type, and evaluation of the logic program as searching for a proof -- a term that has that type. However, you quickly run into the issue that you need to restrict your type language to get efficient proof search, and to specify programs concisely and clearly you want a very rich logic. So that's one difficulty.

Furthermore the reason this is loose is that there are plenty of requirements in a spec which can't or shouldn't be captured in a type. For example, a requirement that everyone use meaningful variable names or that the program is well-structured and modular are both sensible things to put in your spec, but trying to come up with a type theory for that is a questionable enterprise.

### That's why I said a logic language

A logic language such as first-order logic is a collection of relations. A theorem prover is capable of "executing" first-order logic. Theorem provers are also used in type systems. First-order logic can be used to specify (nearly) anything. Prolog is a limited theorem prover that executes a subset of first-order logic, although Prolog also has some cheats for efficiency. You can't specify that everyone use meaningful variable names, because that is ambiguous; meaningful to whom? So that part of your specification would not be enforceable or provable. Logic is already used to specify mathematics, and it is already being used for programs. See ACL2.

### Programming is different

Yes, there are a lot of aspects of programming which can, and therefore should, be automated. Two examples spring readily to mind: determining the types of variables, expressions, functions, etc. (type inference), and type (correctness) checking.

But there is a hard nut of problems at the core of programming, which is what makes it hard: deciding what problems to solve, and how to solve them. This is the stuff that makes programming fun (in the way that working on an assembly line isn't), and it requires creativity and intuition. We may "automate" that one day (via AIs), at which point all humans can go home, because we'll also be able to automate art, and music, and storytelling, and mathematics, and...

### Indeed

As someone who spends all his time in industry, I'm well aware of how industrial programmers regard their academic brethren. As many have observed, including myself in these pages, there is an extremely unhealthy level of distrust between those engaging in pure research and those engaging in application development. It could be worse, of course--industry and academia do speak to each other and collaborate--but far too many on both sides of the discipline hold the other in contempt.

Of course, the "code monkey" issue is a problem even within industry. I'm well aware of the source(s) of the phrase "code monkey"; it's still an insulting term. Master plumbers don't go about calling apprentices and journeymen "pipe monkeys", nor do CPAs refer to entry-level bookkeepers as "book monkeys". Those professions have well-defined roles and responsibilities for practitioners of different skill levels, and well-defined paths for advancement--as a result, the master practitioners do not consider the entry-level practitioners a threat. Those who make use of our profession (i.e. those who hire programmers), OTOH, often regard n entry-level programmers as equivalent to a guru. Furthermore, there isn't any easy way, at hiring time, of telling the guru from the entry-level programmer, as we have no certification for guru-hood. These factors, I think, create an unhealthy tension between practitioners of different skill levels.

One other unhealthy tendency, and I'm sure I'll get flamed for this, is that many skilled practitioners view our profession as art and not science (using contemporary definitions of the terms). While some artistry may be involved, programming is usually a productive endeavor, not a purely creative one; if we are to advance as a discipline, I believe the amount of artistry will need to be lessened. If you find this concept "insulting", expect to be insulted further as technology marches on.

The programming-as-art notion also seems to encourage bad attitudes among skilled programmers--how many primadonna structural engineers do you know?

One important issue you do touch on--translation of (vague) customer requirements to the machine--involves many skills beyond programming (and programming languages). I've met many skilled programmers who are horrible designers (at the product definition level), and vice versa.

### Programming as art

No, I don't flame, but I disagree. I don't think science and art must be mutually exclusive when it comes to programming. You're probably right if you think of an artist as someone who's smashing tomatoes on a canvas and sitting around whining because nobody's interested in his genius.

I personally regard art as gaining some level of perfection in a trade through constant practice: looking back at one's earlier efforts and learning from one's mistakes to improve one's skills in matters where no clear "cookbook recipes" or checklists can lead you to "the one and only correct way of doing things", only experience and thought. My personal view is not backed up by any "official" definitions, but I think art can be there wherever personal style is involved, and in programming there's certainly such a thing as style.

I also suggest looking up the definition of art in an encyclopedia, or read Knuth's writing about it. I'm sorry I don't have any of this handy right here for a quote of a better definition.

Programming has IMHO much in common with creative writing. Much of what makes a writer a good one is what makes a programmer a good programmer.

Essential to the art of writing is that the reader does not notice how hard it was to write that book. A good book must read like there was only one way to do it; it must feel like the words must be in that order and no other. The same applies, for me, to source code. I love reading source code which looks so clean and simple, where everything falls so neatly into place, that you just look at it and say, "wow, this is so simple ... I could do this too", but if you give it a deeper look, you notice that an enormous amount of thought and effort went into actually making it look that simple. OTOH, I've seen a lot of source code from which you can tell that it was the result of a long trial-and-error session, with no effort on behalf of the programmer to extract the essence of the solution when it finally worked and make it obvious from the code.

Or to express oneself clearly and simply. Only inexperienced writers use flowery language and overly wordy prose. The goal is to strive for clarity. I was once staring at four lines of code, each over 100 characters. It took me a while to recognize that all the subexpressions were basically the same. It took another while, and then it dawned on me that somebody had found a particularly obfuscated way of deciding whether a certain integer expression was odd or even with the help of some floating-point arithmetic.

Or revising. To go back and cut the dead wood. Remove superfluous words. Fold several sentences into one that carries the same information. Again, as a programmer I love to go back and make old code simpler, to see if I can't combine two concepts into one. When I add a new feature, I first see if I can't do it by simplifying other things.

One example for this. Somebody here was adding support for a new feature in the code. When he was done, one of my "model"-level classes had three new methods, six new instance variables, it had UI-level elements that certainly didn't belong at this level, such as a modal UI loop and creation of undo objects. Worse, the code was "oracling" which undo object to construct by assuming that if the function was called with a certain combination of parameters then it would have been called from UI function xyz. He claimed "that's the only way to do it". To top it off, it wasn't even working correctly in all cases. So I revised it. When I was done, all the extra methods and instance variables were gone, so were the modal loop and the undo assumptions. Working from the original code, I just split up one function in the class and added two lines in the calling code. It was much simpler. And it was working.

There's more I could mention, but I guess nobody's interested anyway (sorry for the long post). If I had to sum it up, I would probably say that to me the art of programming is to express oneself clearly, so that others understand what you were doing, why you were doing it, and how you got there, and to give it a form that is enjoyable to follow. I sometimes think that denying that programming is art (I prefer to say can be art) is the other extreme of the "elitists and monkeys" opinion: "everybody is equal, and the one who spends lots of time improving his skills and learning more is just as good as the one who reads 'C++ in 21 days' and thinks he's done" (although you probably didn't mean it that way). There are lots of shades of grey between the black and white, and it's probably good that way.

Ok, and now you can go on and flame me for taking the "programming is art" debate to a new high by basically suggesting programmers should take creative writing courses ;) I know I'm nuts.

### Software MFA

You might be interested in the Master of Fine Arts in Software program, which actually had a trial run, though I dunno how successful it was...

### Our discussion of the MFA is

Our discussion of the MFA is here.

### Norvig's book

The blog author recommends Peter Norvig's 1992 book Paradigms of AI Programming: Case Studies in Common Lisp. Norvig has since co-written (with Stuart Russell) another book, Artificial Intelligence: A Modern Approach, which many view as a successor.

Comments on either book? I've read neither... though I'm always looking for a chance to expand my mind, and my library. :)

### PAIP is great

I think PAIP is one of the best programming books I've ever read.

### AIMA

Artificial Intelligence: A Modern Approach is probably the most widely used textbook for AI these days. It's pretty comprehensive, although perhaps a bit overwhelming (there is a lot of material) at first, and from what I remember there could have been more on how to integrate the various capabilities discussed into agent architectures. Then again, I only have the first edition, and I haven't read it for a while. I'd also recommend Nilsson's Artificial Intelligence: A New Synthesis, which is less comprehensive and less up to date, but ties different topics together nicely. If you were just going to buy one book on AI, though, I'd make it the Russell/Norvig. The AIMA website is well worth a visit, too.

I've not read Norvig's PAIP, but I've heard good things before about the code in it. It is 15 years old now though, so perhaps not the best choice as an AI textbook.

### PAIP

PAIP may not be the best choice for AI anymore (that's also what Peter Norvig says on his website), but it is still an excellent book for learning Common Lisp and programming techniques in general.

### Me too post.

This book is really, really good. I can't put my finger on exactly why, though.

### Lisp doesn't represent lists

I am someone who has used Lisp for many years, and I have noticed one thing that (in my opinion) unnecessarily makes Lisp hard to learn. Everyone thinks that Lisp represents lists and uses it this way, but Lisp actually represents dotted pairs; lists are a special case of dotted pairs. This is why Lisp has both a cons and a list function. Lists can instead be built using arrays as the basic structural element, which makes list processing easier to visualize and learn. I have used this in a language of my own called Lewis. Is the dotted-pair notation in Lisp really worth the trouble? Isn't it really just a historical artifact?

### Lisp lists are binary trees

Lisp lists are more accurately described as binary trees. Each dotted pair is an internal node, and atoms are the leaf nodes. Binary trees are generally more flexible than arrays. I don't understand the problem with dotted pair notation. How else would you express (cons 'a 'b) without dotted pair notation?
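
One way to see the pair/list distinction outside Lisp is to model cons cells directly. A minimal sketch in Python (the helper names are mine, not from any library):

```python
# A toy model of Lisp cons cells: each cell is just a pair. A "proper list"
# is nested pairs ending in nil; anything else needs dotted-pair notation.

def cons(a, d):
    return (a, d)

def car(p):
    return p[0]

def cdr(p):
    return p[1]

NIL = None  # stands in for Lisp's nil / empty list

# (list 'a 'b 'c) is sugar for (cons 'a (cons 'b (cons 'c nil)))
lst = cons('a', cons('b', cons('c', NIL)))

# (cons 'a 'b) is a bare pair -- printed as (a . b) in Lisp
pair = cons('a', 'b')

assert car(lst) == 'a'
assert car(cdr(lst)) == 'b'   # this is exactly what (cadr lst) computes
assert cdr(pair) == 'b'       # not a list: just the second half of the pair
```

The dotted pair (a . b) has no list reading at all, which is why an array-based representation can't express it.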

### People learning Lisp think th

People learning Lisp think they are doing list processing, but in fact they are doing list processing in terms of binary trees. This is why "cons" shows up so often in list-processing functions. If lists are represented as arrays you don't need the "cons"; however, you lose the dotted-pair representation. My point is: how many people really understand this, and if they did understand it, would they do it this way?

### How many Lisp coders don't understand this?

How could somebody learn Lisp and somehow not understand cons, one of the handful of fundamental operators? If a programmer mistakes a list for an array and does heavy random access on it, then chances are, that's not the worst of their problems. Besides, choosing an efficient data structure for an algorithm is merely optimization.

### You understand it and it is O

You understand it and it is OK with you. Others who understand it or suspect it look for another way.

### Apparently some of those othe

Apparently some of those others don't understand that CL and Scheme provide a vector (array) datatype as well.

Horses for courses.

### It threw me for a curve..

When I picked up Scheme, I (intellectually) understood that lists were groups of cons cells, with binary tree structure, yadda, yadda..

When I sat down to write my first non-trivial program, it bit me in the behind. Specifically, I kept trying to (cdr x) to get the second item.. It took a good while to re-map my thinking (and a few helpful accessor functions).

I still think (cadr x) is backwards to this day.. Since I read "forward" it seems to me that this should do a car followed by a cdr, not the other way around.

### If you're a math person, it actually does make sense.

I still think (cadr x) is backwards to this day.. Since I read "forward" it seems to me that this should do a car followed by a cdr, not the other way around.

It's just function composition. Since car and cdr are both functions, in the mathematical sense it makes sense to call the function that computes (car (cdr x)) cadr, because it's just function composition:

f(g(x)) = (f ∘ g)(x)

So g actually gets applied first, even though f comes first in the ordering. Interestingly enough, the order of function application in function composition is something many students struggle with in algebra too, so you're definitely not alone.
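
The application order can be sketched directly. A small illustration in Python (compose, car, and cdr here are my own toy definitions, not a real Lisp):

```python
# Composition applies the rightmost function first: the a/d letters in cadr
# read left to right, but evaluation happens inside-out.

def compose(f, g):
    return lambda x: f(g(x))

car = lambda p: p[0]
cdr = lambda p: p[1]

cadr = compose(car, cdr)   # "the car of the cdr"

# the list (1 2 3) as nested pairs
x = (1, (2, (3, None)))
assert cdr(x) == (2, (3, None))  # cdr runs first...
assert cadr(x) == 2              # ...then car picks out the second element
```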

### first, second, third, fourth, last

Common Lisp provides first, second, third, and fourth as aliases for car, cadr, caddr, and cadddr, and rest as an alias for cdr (last, despite the name, returns the last cons cell of a list rather than being an alias for cdr). In Scheme, you can define them easily, e.g. (define first car)

### And the other side of the coin

I once asked one of my lecturers who was teaching a programming languages course why he was promoting Haskell as the best thing since sliced bread, seeing as how I'd read so much about how awesome Lisp was (disclaimer: I know a little bit of both Lisp and Haskell but mostly work in more mainstream languages).

He went very quiet and actually asked if I was joking. I said no, I had really read these things, and how come we weren't learning Lisp?

His answer was quite long so I won't reproduce it all here, but the essence was that the language design world has learned a lot since the days of Lisp. He felt that the sort of ultra-strong type system provided by Haskell was a huge improvement, and he also felt that the syntax of Lisp (or rather the lack of it) was a relic of the days when writing parsers was very difficult so languages were designed with ease of parsing in mind.

I found this answer quite credible - specifically the one about syntax. I see a lot of people treat the minimalist syntax of Lisp as an asset, even though when I've worked with Lisp I always found it a pain. I'm told that eventually you get used to it - yes, the brain is very adaptable so I'm sure you do. But it seems to me that syntax is the brain/compiler interface and it should be designed to be easy on the eye, fast to read, rapidly disambiguated and so on. From the perspective of human usability the Lisp syntax really isn't good, IMHO.

The comments on type systems I'll leave, as I'm far from an expert on this. But I understand the arguments for a strong type system that the compiler can reason about.

You may be interested in Philip Wadler's thoughts on the subject in "A critique of Abelson and Sussman - or - Why calculating is better than scheming."

http://portal.acm.org/citation.cfm?id=24706

### free copy available?

(I now see a perk of my previous University job :)

### Syntax

The Lisp syntax has technical advantages. It makes the code=data idea work pretty well, in the sense that the syntax is equally good for representing code as for representing data. In that sense, it is indeed a compromise, but a compromise that gives you a number of technical benefits that go beyond mere readability. At some stage, one realizes (at least some people do - maybe that's subjective) that syntax doesn't really matter, but the concepts behind the syntax matter. Lisp's lack of syntax gives you better access to the concepts, including concepts that the language designers of other languages haven't thought of.

As for strong typing: You can get as strong as you want with Lisp. Check out Qi or ACL2.

### Not so fast

This characterization of Lisp syntax is more of a prejudice than a real argument.

Regarding the claim that ease of parsing is no longer as important, that would be more credible if it were common for languages today to have ways to reliably extend their syntax. However, the Lisp family are still pre-eminent in this regard. Perhaps the closest contender is the Camlp4 preprocessor for OCaml, which while very powerful, doesn't seem to have the same kind of power as e.g. Scheme macro systems such as syntax-case - for example, this message indicates that Camlp4 has issues dealing with hygiene (or did have; if this has since been dealt with, I trust someone will tell me).

The point is that there are still real, practical benefits to having a syntax that's easy to parse. You can say that syntactic abstraction isn't important to you, and choose a language which doesn't support it as well, or at all; but like many decisions involving a choice between languages, it's a decision with pros and cons, which may depend greatly on the application domain.

For example, in teaching, the book EOPL makes good use of Scheme's syntactic flexibility to focus on the semantics issues of the languages it defines, without getting bogged down in parser issues that would be irrelevant in that context. In industry, Lisp/Scheme can provide a very effective way of implementing DSLs.

The idea that Lisp syntax is only tolerable because "the brain is very adaptable" presupposes that the brain is doing less adapting to other syntaxes, but there's really no evidence to support that. If anything, it's quicker to teach Lisp syntax to someone not previously exposed to computer languages. People who're very familiar with the C-like languages tend to find the syntax of the functional languages difficult to adapt to. I'm personally very familiar with both the common imperative languages, and with Scheme, and I find the Haskell syntax overly terse in large programs (as opposed to snippets), a bit like reading Pitman Shorthand - very compact, but not suited for easy reading. That's subjective, though, and one can't make any claims about what the human brain does or doesn't find easier without some pretty strong supporting evidence.

As for type systems, there's no doubt that it's important for computer science students to learn about modern type systems, and Haskell is a good language to do that with. However, that may not be the first thing you want to introduce students to. The PLT Scheme group makes this case effectively, and their TeachScheme! project is used at many universities.

### Easy?

I'm more surprised at the idea that historical Lisp syntax is easy to parse. The Common Lisp reader is pretty serious stuff, and I don't think that ancestral Lisps like MACLISP were much easier. Lispers are famous gluttons for implementation complexity.

### Easier for some than others, perhaps?

Let's just say that Scheme has inherited the spirit of Lisp's original intent to be easy to parse, then. ;oP

More seriously, part of the point is that the reader is built into the language: for a user of Lisp or Scheme, using the READ procedure on source code gives you a very usable list representation of an abstract syntax tree. Whether you exploit that via macros, or direct processing of source code, it's a feature that's not easy to duplicate in most other languages. The syntax definitely has a lot to do with this — for example, Python lets you get at a representation of its AST, but it's nowhere near as usable, partly because there's a much greater distance between Python source and the corresponding AST.
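
To see how little distance there is between Lisp source and its AST, here is a toy reader sketched in Python (purely illustrative; it handles only parentheses and whitespace-separated atoms, nothing like a real Lisp reader with strings, quotes, or numbers):

```python
# A toy "read": S-expression text maps almost directly onto nested lists,
# which is the usable AST representation the comment above describes.

def tokenize(src):
    return src.replace('(', ' ( ').replace(')', ' ) ').split()

def read(tokens):
    tok = tokens.pop(0)
    if tok == '(':
        form = []
        while tokens[0] != ')':
            form.append(read(tokens))
        tokens.pop(0)  # drop the closing ')'
        return form
    return tok  # atoms stay as strings in this sketch

ast = read(tokenize("(define (square x) (* x x))"))
assert ast == ['define', ['square', 'x'], ['*', 'x', 'x']]
# The result is ordinary list data, ready for macros or code walking.
```

The whole reader is a dozen lines because the surface syntax and the tree are nearly the same thing; that is the property most other languages' ASTs lack.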

BTW, it did occur to me that Mike's lecturer might have been thinking specifically of Common Lisp, in which case there are some reasons that it might not be considered an ideal teaching language. You can probably guess my answer to that.

### Scheme is.

I'm working on a Haskell tutorial that involves writing a fully-functional Scheme interpreter for a good-sized subset of R5RS. The complete parser is 33 lines of Haskell.

(Incidentally, the full implementation is only about 350 lines. It's missing some important stuff, like vectors, continuations, dynamic-wind, set-car/cdr!, and most of the numeric tower, but it does have all the integer/string/symbol/list/bool primitives, the whole Scheme evaluation model (including varargs), load, set!/define, IO primitives including read, and an interactive toplevel. I might add continuations, but the rest can all be left as an exercise for the reader. ;-))

### Smalltalk syntax is genius!

Smalltalk also lets you extend the syntax but is much easier to read than Lisp. I think Smalltalk's keyword argument syntax is woefully undersung.

### Smalltalk syntax extensible?

In what ways is the Smalltalk syntax extensible? I am not aware of a macro system for Smalltalk (only of Smalltalkers who would like to have a macro system in Smalltalk... ;)

### Most Smalltalk programmers do

Most Smalltalk programmers don't have the need for macros that Lisp programmers do. I think that's because Smalltalk's syntax for anonymous functions makes user-defined calls look identical to the built-in primitives. This isn't the case with Lisp, where macros are often used to hide lambda forms.
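
The blocks-instead-of-macros point can be illustrated outside Smalltalk as well. Here is a sketch in Python of a user-defined control structure written as a plain function that takes an anonymous function (with_retries and flaky are invented names, not any real API):

```python
# When passing an anonymous function is cheap and readable, many "macro"
# use cases become ordinary functions: the caller just hands over a body.

def with_retries(times, body):
    last = None
    for _ in range(times):
        try:
            return body()           # run the caller-supplied body
        except Exception as exc:
            last = exc              # remember the failure and retry
    raise last

calls = []

def flaky():
    # fails twice, then succeeds, to exercise the retry loop
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient")
    return "ok"

assert with_retries(5, flaky) == "ok"
assert len(calls) == 3
```

In Smalltalk the call site would read even more like built-in syntax, since a block literal is lighter than `lambda`; in Lisp this is exactly the kind of thing a macro would wrap to hide the lambda.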

### A Smalltalk variant does have/use macros

Slate, the prototypes/multi-dispatch variant of Smalltalk that I work on, does have a macro facility detailed in the manual. We use it for syntactic extensions and the code itself is re-usable for code-walking and eventually for expressing and running refactorings. However, it is true that most of the cases where CL macros would be used, Slate gets by with lots and lots of closures - this just necessitates a better optimizing compiler.

### Misunderstood genius

Responding to your earlier comment, I can't agree that Smalltalk syntax is "much easier to read than Lisp". I worked with Smalltalk on and off for a number of years, and I really wanted to like it. I definitely liked its semantics, at the time — I even based a language add-on product on those semantics. But I never got used to its syntax. Multipart selectors with colons can sometimes be useful, but most of the time seem unnecessarily verbose to me, and get in the way. This is obviously highly subjective, but I'm beating the horse labeled "just because you're used to a syntax, doesn't mean it's universally easier to read than all the syntaxes you're not used to."

As for macros, hiding lambda forms is only one of the uses for macros. What about defining new binding forms, for example, or data languages, or full-blown DSLs? It's certainly possible to live without macros, but having them gives you options that you don't have otherwise, and when you need them, they can be very useful.

### Extensible syntax != macros

You don't need a macro system for extensible syntax. The only requirement is that the language extension mechanisms - be they functions, objects, blocks, or text strings - be sufficiently similar to the built-in constructs that you can't tell the difference.

Smalltalk, Ruby, Perl, Tcl, REBOL, and Forth all fit these criteria. In fact, one of the big selling points of Ruby on Rails is that they've defined new "syntax" just by creating library functions.
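
As a rough illustration of "syntax" that is really just a library, here is a sketch in Python of a tiny route-declaration DSL built from ordinary functions (get, post, and routes are invented names for this example, not any real framework's API):

```python
# A declarative-looking routing "DSL" that is nothing but function calls
# against a plain dictionary -- no parser or macro system involved.

routes = {}

def get(path, handler):
    routes[('GET', path)] = handler

def post(path, handler):
    routes[('POST', path)] = handler

# These read like declarations, but they're ordinary call syntax:
get('/users', lambda req: 'user list')
post('/users', lambda req: 'user created')

assert routes[('GET', '/users')]({}) == 'user list'
assert routes[('POST', '/users')]({}) == 'user created'
```

The trick is the same one Rails uses: make library calls indistinguishable from built-in constructs, and the "extension" needs no changes to the language itself.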

### Nemerle Macros

Regarding the claim that ease of parsing is no longer as important, that would be more credible if it were common for languages today to have ways to reliably extend their syntax. However, the Lisp family are still pre-eminent in this regard. Perhaps the closest contender is the Camlp4 preprocessor for OCaml, which while very powerful, doesn't seem to have the same kind of power as e.g. Scheme macro systems such as syntax-case - for example, this message indicates that Camlp4 has issues dealing with hygiene (or did have; if this has since been dealt with, I trust someone will tell me).

Don't forget macros in Nemerle. Nemerle's macro system allows for both hygienic and non-hygienic arbitrary syntactical extensions and type-safe compile time metaprogramming. It works very much like a cross between Lisp or Scheme macros and Template Haskell. I believe it is probably just as powerful as anything in the Lisp family.

### No Type System == Dumb Compiler

Type systems formalize the knowledge a compiler needs to infer what you mean. The future of computer languages revolves around "do what I mean", not "do what I say". Type systems give us "do what I mean" with _precision_ and determinism.

If you are repeating yourself, the language shouldn't require you to. If you can look at your code and see that the compiler should have been able to figure something out (like the type of a variable), the language and compiler should do so.
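A toy sketch of the point, over a made-up expression language of integer and string literals plus a `+` node: given only the code itself, a checker can recover every type with no annotations at all.

```python
# Toy type inference: expressions are literals or ("+", left, right)
# tuples. Types are recovered from the code, never declared.
def infer(expr):
    if isinstance(expr, int):
        return "int"
    if isinstance(expr, str):
        return "str"
    if isinstance(expr, tuple) and len(expr) == 3 and expr[0] == "+":
        left, right = infer(expr[1]), infer(expr[2])
        if left != right:              # '+' must be homogeneous here
            raise TypeError(f"cannot add {left} and {right}")
        return left
    raise TypeError("unknown expression")

print(infer(("+", 1, ("+", 2, 3))))  # -> int
print(infer(("+", "a", "b")))        # -> str
```

Real inference engines (Hindley-Milner and descendants) handle variables, functions, and polymorphism, but the principle is the same: the compiler figures out what you already made obvious.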

### I disagree

I tend to think that (well-written) Lisp code reminds me, in a sense, of proofs from THE BOOK: it seems to be the shortest, clearest way to write the piece of code, and it tends to stick with you. Scheme can be very poetic (think ((call/cc call/cc) (call/cc call/cc))), but that's seldom the best part of it.

Scheme and Haskell tend to be prettier because they disregard irrelevant details that get in the way of seeing the point of the program. To write minimally correct and general C++ code, you need much more boilerplate than in Scheme or Haskell, and so Scheme code is "sharper". Closures are not poetic, they are simpler.

### Lisp code as proof

Interesting you say that. Paul Graham says pretty much the same thing in his On Lisp book - a well written Lisp macro is like a finished (proof of a) theorem. Whether that is a good thing depends on how you view "finished". In my corner of the universe, most things don't stay finished. In fact, if a piece of code is "finished", it probably belongs in a library/framework, which is usually a small part of an application. And yes, a framework/macro is where Lisp shines the most.

I wrote about the "finished theorem" argument a couple of months ago at http://blogs.adventnet.com/perma/6783/finished-functional-code-is-like-a-finished-theorem-....html

Though I got a PhD in information theory, with a lot of theorems, I personally disliked the polished proof presentation style - in my papers, I wanted to also present the more human intuitive process by which these things are arrived at, but prevailing mathematical fashion dictates against it. I decided academia wasn't for me.

In real world commercial software, it is better to be obvious than elegant - often brevity sacrifices obviousness to purchase elegance. That sounds like a good definition of poetry to me!

### Interesting you should say that.

> In real world commercial software, it is better to be obvious than elegant - often brevity sacrifices obviousness to purchase elegance. That sounds like a good definition of poetry to me!

To me a piece of code is elegant when it is so simple that it is:

1. Obvious in its intent.
2. Obviously the best way to solve the particular problem.

But not necessarily what I would have considered obvious before I saw the code. That is, a piece of code that makes the problem look so extremely simple that it makes me go "Duh, that's so simple! Why didn't I think of that? That's obviously the way to do it." is an elegant piece of code. It's clear in its intent, but at the same time it's short, simple, and to the point.

### Brownie Points

This piece from a Microsoft employee selling Microsoft toolchains is social engineering to reinforce a shop-worn myth about Lisp et al. Successful Lisp derivations like Mathematica and REBOL show its silliness. Java and C++ make my head hurt, not Lisp. And I am "Einstein" in all three languages.

Microsoft feels the heat from LtU and wants to keep the chickens in their cages laying eggs for Microsoft. The firm's success depends largely on monopoly social issues.

Microsoft innovation should empower Mort to write Einstein code by putting Einstein in the compiler. This article says instead that MS tools are dumb so you can be, too. IT executives should consider the implications of that approach.

### It all makes sense now

> Microsoft feels the heat from LtU

So that's why they hired Simon Peyton-Jones!

### Exactly!

So now we all know the hidden agenda of Ehud too. What a devious plot! ;-)