## Functional Programming in the ACM CS Curriculum

The ACM Curriculum board has re-opened the 2001 design for review. Although ACM is a US-based organization, the curriculum is not only influential at the middle tier of US colleges and universities, but it is also taken seriously by many evolving and developing educational institutions overseas. In recent years, the study of non-OO PLs, and of other key PL topics such as type systems, has grown increasingly marginal in the undergraduate CS curriculum. In particular, the study of functional programming is not included in the ACM CS2001 core. We may now have an opening to make a small change in this situation.

The ACM Curriculum board has agreed to consider a proposal on including FP as an equal to OOP (10 "hours" each) in the standard curriculum. This was the most concrete outcome of the PLC workshop at Harvard two weeks ago. The proposal was drafted by Stuart Reges, Shriram Krishnamurthi, and Matthias Felleisen and was endorsed unanimously by the workshop attendees and by the SIGPLAN Executive Committee. It proceeds on the premise that inclusion of FP in the core curriculum is the most important single thing that the PL community can do for CS education. In particular, this will help prepare students for a properly designed though possibly optional PL course or courses.

Please consider contributing comments to the web site. A simple "Yes, I think this is a great idea" will be helpful. A short explanation is even better.

There is now a long list of comments supporting this proposal. However, we have very few comments from people in industry, so comments from the many non-academics on LtU would be particularly helpful. Examples of how FP has helped you would, I think, be particularly persuasive.

The web site is http://wiki.acm.org/cs2001/index.php?title=SIGPLAN_Proposal. Unfortunately, you must be an ACM member to view or submit comments.

[Edit: This is important enough that I am promoting this item to the front page even though the link is only accessible with an ACM account. The issue itself can be debated in the comments, and hopefully at some point an open mirror of the ACM discussion will be created. -- Ehud]

## Comment viewing options

### I think this would be a good thing.

At my institution (Ghent University), there are no longer any FP courses taught, neither in the bachelor curriculum nor in the master's CS curricula. Until a few years ago, we had a single (optional) course that taught the basics of functional and logic programming. I always thought that course enriched the knowledge of the students, yet now they seem to be drenched in Java. Even when they have to write some C, e.g., a kernel device driver for the OS course, they often have problems (for example, `char *info; if (info.length > 10) { /* ... */ }` -- reaching for a Java idiom that doesn't exist on a `char *`, instead of `strlen(info)`). And that's only scratching the surface :-). I doubt the inclusion of FP in this proposal will change anything at our institution, but it might at least get a few people thinking.

### Ghent University at our antipode

Who is teaching programming in Ghent? I would like to meet them. They are not so far geographically from my own university in Louvain-la-Neuve, but conceptually they are at our antipode!

### FP still taught at Oxford

When I was an undergrad in 1992 doing Mathematics and Computation (there was no Computer Science course in those days) our very first lecture was in Functional Programming. About half of the students were so shocked they immediately transferred to straight Mathematics!
My reaction was that that really sorted the wheat from the chaff! But to be a bit kinder... how does one convert the "I want to learn C++/Java" people into "This functional stuff is really interesting and powerful" people?
Is it possible? As knowing FP or Haskell "doesn't get you a job", you are only really going to interest those people who like computing theory for its own sake (and are prepared to find the applications later). You can bring a horse to water...

Let those worried about finding a job, and not about learning computer science, attend trade schools instead. There are plenty of (shorter) programs to help people with work-related skills.

### Wonderful

Then what do we do with all of those out of work computer science majors? Seriously, Microsoft is at least leading the mainstream towards a more functional future with C# 3.0. The Java folks....who knows.

### The Java folks.... ...will

The Java folks....

...will hopefully be replaced by Scala folks. ;-)

### Ain't gonna happen

...unless Sun or IBM or another major player picks it up. Microsoft and Sun don't pick up languages like Nemerle, Boo, Scala....because they displace their babies....quite sad.

IronRuby, IronPython, JRuby, Jython....all little playtoys for those kiddies that want to be "dynamic".

Of course we have Microsoft backing F#, and Sun struggles to survive. IBM to the rescue?

### Don't forget that some

Don't forget that some people are using dynamic languages to make big money. And I feel they have a point every time I use liftM2 in Haskell.
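For anyone who hasn't met it, `liftM2` lifts an ordinary two-argument function into a monad; a minimal sketch of the plumbing being alluded to:

```haskell
import Control.Monad (liftM2)

-- liftM2 lifts a plain two-argument function into a monad,
-- so (+) can combine two Maybe values directly.
main :: IO ()
main = do
  print (liftM2 (+) (Just 1) (Just 2))  -- Just 3
  print (liftM2 (+) (Just 1) Nothing)   -- Nothing
```

In a dynamic language you'd just write `a + b` and let it fail at run time; the monadic version makes the plumbing explicit, which is the trade-off being complained about here.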

Well, currently most horses are being left in the desert. Bringing them to water is a good start, even if only a few choose to drink...

### Not as Pessimistic

I came into the computing field in 1992 with no formal background and only a BA in Mathematics. Through the years, I've used only imperative languages. Coincidentally, learning of Scala and reading Simon Peyton Jones' chapter in Beautiful Code made me curious about functional programming. Now the following books are on my Amazon Wish List, which I hope will be fulfilled by my next birthday.

The Haskell School of Expression... by Paul Hudak
Haskell: The Craft of Functional Programming by Simon Thompson
Purely Functional Data Structures by Chris Okasaki

My point is that tradesman developers can become excited by FP.

Having FP as part of the ACM CS curriculum is a good thing. Young developers should have their brains "stretched" as much as possible before the pressures of work and family make learning something new more difficult.

### Knowing FP can help you get

Knowing FP can help you get a job; the world is changing.
At my workplace we'd like to hire people with FP knowledge, and putting at least a little FP in the ACM curriculum is a step in that direction.

### Multicore and distributed

When I studied CS, one of the most important, though maligned, courses was the one that taught proving programs correct using OCaml. That introduced some formalism into the computer "science" we were taught.

But that is not the most important reason. I think one must look ahead and not just at what is used in industry now. Processors are going more multicore and computing more distributed, and Java is not going to make the best platform for the new computing landscape. It is time to look into higher-order, functional, and logic programming.

### stop saying java sucks and do something about it

How does industry benefit from "functional"? If language geeks and academic dorks just high-and-mighty it on a chest-thumping drum, then nothing ever gets done.

At least Microsoft is productizing F# (superset of OCaml).

### Java sucks, even if I don't do anything about it

Teaching programming without side effects is going to be critical for the machines that will be commonplace in 5-10 years.

### Not that obvious

FP backers hope that FP will be the solution to using multi-core computers efficiently, but this is just one possibility.

STM or message based communication between processes is just as likely to be the future 'winner'.

I have absolutely no idea of which style will win, but I think that it'll take quite a long time before one winner is selected, and in the meantime multicore CPUs won't be used very efficiently.

### Most multicore CPUs right now...

spend most of their time in the HLT state, waiting for the user to do something.

And for many applications, large-scale parallelization is probably more practical than small-scale--a style that current OS/PL architectures can handle quite well.

### Purity and concurrency

Most implementations of STM in impure languages have problems with invisible side effects, as the language doesn't enforce that only reversible effects happen during a transaction. Message passing, likewise, is only truly effective under a shared-nothing paradigm: if the language allows global side effects, it becomes very hard (bordering on impossible) to write correct highly concurrent programs using message passing, because the language won't prevent two processes from accessing the same memory. In both cases purity matters, and currently only FP languages provide purity. I know of no system today that offers the kind of robust concurrency (i.e., up to hundreds of thousands of tiny processes) found in Erlang or Haskell outside the FP languages, and there are strong theoretical reasons to expect it won't come to imperative languages in the next decade.

OTOH, Java is a clear winner among the mainstream imperative languages when we talk about tens to hundreds of threads: there are enough mature libraries in Java to make a difference, and the memory model is very well specified and understood. The Azul Systems guys have a good VM that runs on machines with up to 864 cores (16 processors, each with 54 cores), and Sun has machines with dozens of cores, all running Java code, so there's a decent amount of engineering effort behind making it correct, robust, and fast.

### Imperative concurrency

I know of no system today that offers the kind of robust concurrency (i.e., up to hundreds of thousands of tiny processes) found in Erlang or Haskell outside the FP languages, and there are strong theoretical reasons to expect it won't come to imperative languages in the next decade.

To be fair, occam was doing that kind of robust concurrency 20 years ago, and it was an imperative rather than functional language: the fundamental building block of sequential code was assignment to mutable variables, interprocess communication was via message passing, and static checks enforced "parallel usage rules" to prevent improper sharing of mutable variables. Occam was, in its heyday, used in a wide range of industrial ventures (i.e., it wasn't just a research language). Occam's modern descendant, occam-π, offers the same capabilities as occam along with a host of interesting extensions. However, to the best of my knowledge it remains primarily a research language.

Please note that I'm not (I think) disagreeing with your basic point. Occam was designed from the ground up to be concurrent, and the so-called "parallel usage rules" were built into the language in order to avoid problems with shared access to variables. That is a very different proposition than trying to graft concurrency capabilities onto an existing imperative language.

### Duh!

Now I feel dumb; how could I fail to remember Occam :(. I think you're disagreeing a little with my point, and you're right: if the language is designed from the ground up to offer lots of concurrency, then it can have additional rules that let it be both imperative and robust. OTOH, I believe that if you enforce all the necessary rules, you end up with a system that is closer to a pure language with effects than to an imperative language (in the same way that Haskell is the best imperative language).

### A couple other options

Both are currently more in the theoretical camp, but Eiffel with SCOOP and Chapel (from Cray) are both imperative languages that look like they could actually do concurrency quite well. SCOOP is not fully working yet, but a preprocessor-plus-library system seems to be doing okay, and it certainly looks decent on paper. Chapel I know less about (I read through the spec quickly once upon a time), and it's still an "on paper" language, but it certainly seems promising. Of course this tends to meet your requirements: Chapel is designed from the ground up to handle such things, and while SCOOP is being grafted on, Eiffel already had command-query separation and preconditions to act as guards.

### Occam and Erlang

Occam and Erlang are closer than one might think. The main difference is that message passing is synchronous in Occam (the sender waits until the receiver has taken the message) and asynchronous in Erlang (the sender does not wait). Synchronous sending is more natural in a shared-memory situation (multicore processor), since it avoids the need for a buffer. Asynchronous sending is more natural in a distributed memory situation (Internet), since you can't avoid a buffer.
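A loose Haskell analogy for the buffering difference (a sketch, not Occam or Erlang themselves): a `Chan` is an unbounded buffer, so senders never wait, while an `MVar` holds at most one value, forcing the buffer-free handoff that synchronous channels enforce.

```haskell
import Control.Concurrent.Chan
import Control.Concurrent.MVar
import Control.Monad (replicateM)

main :: IO ()
main = do
  -- Asynchronous (Erlang-flavoured): all sends complete
  -- immediately; messages queue up in the buffer.
  ch <- newChan
  mapM_ (writeChan ch) ["a", "b", "c"]
  msgs <- replicateM 3 (readChan ch)
  print msgs  -- ["a","b","c"], in send order

  -- Synchronous-flavoured (Occam-like): an MVar holds at most
  -- one value, so a second put would block until a take --
  -- a rendezvous instead of a buffer.
  mv <- newMVar "handoff"
  takeMVar mv >>= putStrLn
```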

I wouldn't call Occam an imperative concurrent language, though, just as I wouldn't call Erlang a functional language. Occam has local state only, no global state. Erlang is functional only within a process. The languages are actually quite similar.

### I have absolutely no idea of

I have absolutely no idea of which style will win

Judging by how things have gone in the past, and being realistic rather than hopeful, what will win will be whatever requires programmers to learn the least and make the least change to their current practice. As far as I can see, that means STM. Personally I prefer message-based systems, but unless someone manages to package them extremely nicely, they require programmers to learn more and to think about problems differently. Side effect free is, I hate to say, completely off the table. As nice as it is, it requires people to learn to think in a very different way. That's not going to catch on, no matter how good it is, as long as there are lazy-learning options like STM.

### Side effect free

what will win will be whatever requires programmers to learn the least...that means STM. ... Side effect free is, I hate to say, completely off the table.

STM requires that atomic blocks be free of global side effects except, of course, for the manipulation of STM variables. To use STM effectively for anything of even middling complexity, programmers will need to understand core functional concepts.

Programmers just aren't going to get a free ride into the high concurrency world no matter what technique gets used.
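In Haskell's STM this restriction is enforced by the type system rather than by discipline: an atomic block has type `STM a`, and arbitrary `IO` inside it simply won't typecheck. A minimal sketch (using GHC's stm library):

```haskell
import Control.Concurrent.STM

-- Move money between two accounts atomically. Only STM
-- actions (TVar reads/writes) are allowed in the transaction;
-- an IO action such as putStrLn would be a type error here.
transfer :: TVar Int -> TVar Int -> Int -> STM ()
transfer from to n = do
  modifyTVar' from (subtract n)
  modifyTVar' to   (+ n)

main :: IO ()
main = do
  a <- newTVarIO 100
  b <- newTVarIO 0
  atomically (transfer a b 30)
  balances <- atomically ((,) <$> readTVar a <*> readTVar b)
  print balances  -- (70,30)
```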

### ... F# (superset of

... F# (superset of Ocaml)

Out of curiosity, how is F# a superset? From what I can see, F# loses structural typing, modules, and labelled and default arguments, while it adds computation expressions (monads, basically), overloading, and a universal type. Seems more of a wash to me.

### Re: stop saying java sucks

I do not mean to say that Java sucks. In fact I like Java quite a bit.

However, it is obvious that Java's concurrent programming model (which is similar to that of other languages) is not suited to a hardware architecture in which concurrency is a necessity, not just a convenience. I am not qualified to say what should change in Java to support that architecture.

### language geeks and academic dorks

"Language geeks and academic dorks" are the ones that invented the genuine improvements that Java has over C++:

1. garbage collection
2. abstraction of pointers
3. garbage collection
4. virtual computers / virtual machines
5. A demand for vendor independence and libraries based on standards

It is the "COBOL of the 1990s"-like features that the language geeks aren't fond of.

### Well I'm No ACM Member, But...

Adding functional programming to the curricula is a bloody great idea. At my current uni we've only got one course at the undergrad level that's tasked with teaching both functional and logic programming. And now it's getting cut because they want to reduce the amount of damage that the worst teacher in the department can do to students.

Admittedly, I'd also like to see Java out of the curriculum. Give 'em C and Scheme to see imperative and functional programming, then add object orientation later on.

### FP is *not* in the ACM Curricula?

That's ... awful. CompSci is an evolving field, and there are too many developers who are ignorant of different programming paradigms. We should be looking to fix that, rather than force future generations of learners to ape it.

### Is ACM even relevant anymore?

Any decent computer science program teaches FP. Programs that depend on ACM for guidance are usually at the low end and you probably want to avoid them.

### Ha ha, only serious

The goal is a worthy goal, and the constraints on what can be done are all too familiar, so I am a bit reluctant to raise a more general question that might hurt the good fight... But I can't help it.

How many people agree with the premise that inclusion of FP in the core curriculum is the most important single thing that the PL community can do for CS education?

Suppose realistic constraints were not what they are: specifically, that you could change how OO is taught, how algorithms are taught, and so on. Would you still think that including FP in the core is the best PLT has to offer?

Put differently, it seems to me that we have a lot more to contribute, and that in some sense this proposal (which I support, mind you) may reinforce the notion that PLT researchers are a bunch of FP fanatics with little to offer the mainstream. That would be a Pyrrhic victory if ever there was one...

P.S.
This is not a troll. Please try to take this seriously...

### I don't agree

Maybe I don't understand "realistic", but in my view the single most important thing that could be done for a high school curriculum is to work on a canon of stuff from the literature and hand out a bibliography -- and don't skimp by aiming low; the same bibliography should be suitable for upper-level undergrads and for grad students, it should just start with basic stuff is all. Ideally, the bibliography would be organized with dependencies indicated (read A before B) and would form a broad tree. The second best thing would be to assemble lots of free (libre) software relevant to the reading list and make it easy for students to get at and play with. I mean that more broadly than just PLT -- that should just be some branches of the tree of the canon.

Teachers and testing? Not sure what to do about them.

I realize that that's not the point of the exercise. Perhaps that's my point.

-t

### As the saying goes, "God is

As the saying goes, "God is not dead, he is just working on a less ambitious project"... The context here is the higher education system as it exists today. It might make sense to throw it all out and start over (I am with you as regards testing), but my point is that even inside these constraints the route chosen may be a little unambitious.

### I think programming with

I think emphasizing the benefits of immutability is fundamentally more important than FP; that lesson crosses language boundaries.

FP seems a natural route to evolve immutable programming, since it provides mechanisms for abstraction and composition, but I can imagine starting with an SSA-like language, perhaps similar to LLVM's IR, if it's discovered that novices come into programming with a more iterative mental process (which I did at least).

I think the difficulty of reasoning with mutable state is what causes novices to falter in their first steps, and causes most Java/C# OO systems to be needlessly complicated. I think such a novice language would be an interesting experiment at least. :-)

### I think the important lesson

I think the important lesson from FP is good run-time modularity, which immutability provides. However, immutability is only one way to get good run-time modularity; OO encapsulation is another, and communication integrity as enforced in ArchJava is yet another.

I'm mixed: learning FP can teach some good habits (avoid mutable variables) but can also lead to some bad idealism (always avoid mutable variables). Mutable state is natural and shouldn't be shunned, but it has to be used and protected properly.

I don't see any needless complexity in Java/C#, at least in this regard. These are actually very simple languages with very simple constructs; they are easier to learn than, say, Scala, but only because they are more limited. Speaking of which, I think Scala is the way to go as a first language because it supports FP, OO, and FP + OO; in this case, you can have your cake and eat it too.

### I don't see any needless

I don't see any needless complexity in Java/C#, at least in this regard.

The needless complexity is in systems built with Java/C#, because people try to scale mutable designs without thinking about how the side-effects propagate. In other words, they try to apply local reasoning where the program design requires global reasoning.

Encapsulation doesn't really help here IMO, but immutability would help significantly, since local reasoning would work. Yes, mutability can be very helpful, but starting with immutability seems to be the right default.
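A small Haskell sketch of the difference (using `IORef` to stand in for the mutable case): with a shared mutable cell, knowing what a read returns requires knowing every writer in the program; with an immutable binding, the value can be understood locally.

```haskell
import Data.IORef

main :: IO ()
main = do
  -- Mutable, shared state: to know what 'reader' yields you
  -- must reason globally about every write to r.
  r <- newIORef (10 :: Int)
  let reader = readIORef r
  writeIORef r 99      -- a "distant" write changes reader's result
  reader >>= print     -- prints 99, not 10

  -- Immutable binding: x denotes 10 everywhere it's in scope.
  let x = 10 :: Int
  print (x + 1)        -- prints 11, no global reasoning needed
```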

### But it's a straitjacket

But it's a straitjacket that is misapplied. The point is to avoid mutability unless it's necessary, and even then to use it in a controlled way. Languages that require immutability require "tricks" to do even the simplest of things, turning off the segment of programmers who don't like to solve puzzles just for the thrill of it. And it doesn't reflect reality, where things change and you have to deal with that change. This is why FRP is so cool: you can have mutability and a way to reason about that mutability.

Encapsulation of course helps. Making something private and then controlling how it is written is a big deal. Message-passing systems are all based on encapsulation (the mutable state is hidden inside and controlled by an actor, for example). I think a discussion about immutability should occur in the same venue as a discussion about encapsulation (i.e., in the same course). This is how I learned about both concepts, in my CSE 341 programming languages class.

### I mostly agree with what you

I mostly agree with what you said, except for the immutability requiring so many tricks -- there's lots you can do before you need mutability!

The point I was emphasizing is that C#/Java have the wrong default, which is why I say they're not suitable for novices, as it leads them down the wrong path. An immutable language for beginners is the right start IMO. Immutable by default, mutable when absolutely necessary, where I'm defining "absolutely necessary" as 3rd or 4th year projects. ;-)

### But you forget how people

But you forget how people become interested in computer science: through pretty graphics, animation, scripts to make their lives easier, and so on. Sure, to write FIB you can be purely functional; you can even write some web apps and be fully functional. Now a game? Not without lots of FRP. It's important for intro programming classes not only to teach but also to market programming, and FIB is not going to do that.

Not everyone will be sold on CS when they take their first programming class.

### I totally agree. But doing

I totally agree. But doing neat/pretty/cool things seems to depend more on the availability of suitable libraries than on the programming paradigm. I still think there's a wide selection of neat graphical programs that can be written in an immutable language, and I think the more careful programming habits this requires would ultimately be very beneficial for the novice.

I sort of come from the background you're recommending. My engineering program taught Java and C, and many of the problems I've encountered in my professional programming work were due to the bad habits these languages encourage. Thankfully, I eventually started looking for something better, but many developers don't bother. If they're going to learn only one language, or one set of habits, I would strongly encourage immutability!

### universes of discourse

I think emphasizing the benefits of immutability is fundamentally more important than FP

Wow! Where I come from, those mean the same thing.

-t

### Wow! Where I come from,

Wow! Where I come from, those mean the same thing.

Hmm, most people would disagree.

### Discourse....

Look at the first paragraph of the wikipedia article you link with "people"... That doesn't seem like disagreement with my understanding... not quite.

Once you do away with mutable (in arbitrary ways) state, of course you're talking about functions. There are lots of different ways to talk about functions but that's the thing they have in common.

The first paragraph of the wikipedia article is arguably redundant with itself. It could just say "without mutable state" and that would be the essence. Everything else is just arguing over the best way to do that.

I'm not sure what conclusion you want me to draw from the "disagree" link.

It's true that a lot of static type theorists have a universe of discourse where FP refers to statically typed functions. That's a pretty recent shift in usage and I'm not sure it's a good one.

-t

### Purity != FP

Many FPLs don't have or enforce purity. All of the ML family, Scheme, Erlang, and the Lisps are considered to be part of the FPL world and allow arbitrary side effects to be treated as values. Also, you can have a pure calculus of objects that doesn't have functions as first-class citizens; so purity is commonly a property of FPLs, but it doesn't need to be so.

### purity == FP

All of the ML family, Scheme, Erlang, Lisp(s), are considered to be part of the FPL world and allow arbitrary side effects to be treated as values.

Side effects reified as values, such as the applications of a monad, are no longer arbitrary in the sense I've used the term.

In fact, such side effects are scarcely side effects at all, other than that a system that can reason about them can often only reason about them at the expense of being unable to reason about their absence (a practical limitation -- generalizations of those systems would allow that but would be impractical to implement).

Those languages don't allow arbitrary side effects in any ordinary meaning of the term. It is trivially true that there are natural duals between (ordinary sense) side-effecting programs and programs in those languages but that isn't the point.
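To make the distinction concrete in Haskell (a sketch of the usual illustration): an `IO` action is an ordinary value that can be stored in a list and passed around, and nothing happens until the program chooses to run it.

```haskell
-- An IO action is a first-class value; constructing one
-- performs no effect.
greet :: String -> IO ()
greet name = putStrLn ("hello, " ++ name)

main :: IO ()
main = do
  let actions = map greet ["world", "LtU"]  -- a pure list of effects
  -- Nothing has printed yet; we only hold descriptions of effects.
  sequence_ actions  -- now they run, in the order we chose
```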

-t

### ?!?!

Are you arguing that the ML family, Scheme, Erlang, and Lisp aren't FPLs? Or are you saying that they disallow arbitrary side effects? Both of these statements are false. Perhaps my particular wording was confusing; I meant that in these languages you can treat a side-effecting function like any other value (technically, calling a function with side effects is an expression in the term language).

### !

I'm saying that I don't regard Scheme and Lisp, at least, as functional languages -- only as languages in which functional programming is strongly encouraged. That jibes with the usage of "functional language" that I grew up on and that I think fits most of the literature until recent years. In the ML family I understand (perhaps imperfectly) side effects to be customarily modeled via monads, a distinction I covered. I'm not familiar enough with Erlang to place it in this spectrum.

-t

### So culture matters

Common Lisp, Scheme, and ML all contain mutable references. All are eager by default. All are lexically scoped, and contain lambdas and first-class HOFs.

Unlike, say, C/C++ or Java, side effects are "less" encouraged in these languages. ML makes bindings immutable by default; one needs to use ref types to model mutable state. In the Lisp family of languages, including Scheme, things are mutable by default--but mutation of things, especially conses, is culturally discouraged--more so in Scheme than in CL.

Ignoring orthogonal things like the type system, call/cc, CLOS, etc.--the three languages are similar in what they offer the functional programmer. The main difference seems to be cultural; the ML community has a stronger cultural aversion to mutable state than does the Scheme community; Scheme programmers are more state-averse than programmers of Common Lisp. (Erlang is more anti-side-effect than ML though less so than Haskell; the language features single-assignment variables which can be initially unbound; not unlike the dataflow variables of Oz. It doesn't have arbitrarily-mutable variables; however it does have large-scale mutable collections).

I'm not sure how much a language should, or should not, be defined by its programming community. C++ has seen, in recent years, more programmers who write C++ code in a functional style. The language doesn't currently have type inference, algebraic types, or painless lambdas (you can fake the latter well enough); it doesn't permit arbitrary lexical scoping (though use of classes provides a suitable workaround); and certain paradigms are difficult without a GC--yet many write FP-ish code in C++, and a C++ entry won the ICFP contest a few years back. (Its template mechanism, OTOH, is a nasty lazy B&D functional language with the ugliest syntax imaginable... which probably contributes to its functional cachet.)

To further that thought--I'm not sure a binary distinction between "functional" and "non-functional" (or "imperative") languages is terribly useful. FP code can be written in almost any reasonable language, though some make it easier than others, and a few make it hard to do anything but FP. An important question is how much a language supports the functional style. There have been flamewars (not here, fortunately) wherein Haskell users claim ML isn't functional, for various reasons--impurity being the main one. Such arguments are reminiscent of Smalltalkers complaining that C++ or Java aren't really OO languages, as if banning the latter languages from the OO fellowship would somehow garner Smalltalk design wins it has heretofore been denied.

All the languages discussed in your comment provide excellent support for the functional style. Debating which is more "pure" strikes me as a not-very-useful exercise.

### There have been flamewars

There have been flamewars (not here, fortunately) wherein Haskell users claim ML isn't functional, for various reasons--impurity being the main one. Such arguments are reminiscent of Smalltalkers complaining that C++ or Java aren't really OO languages, as if banning the latter languages from the OO fellowship would somehow garner Smalltalk design wins it has heretofore been denied.

There's a significant difference here though - FP has included impure FP in practice since the term was coined, whereas Smalltalk has a reasonable claim to having defined OO and Java and especially C++ clash strongly with Smalltalk's idea of what OO is.

Don't look at me, I mostly use Haskell as a metalanguage - a role in which it's more or less the anti-lisp.

### Smalltalk and C++

Both Smalltalk and C++ are intellectual descendants of Simula. Neither can claim priority over the other.

### Simula didn't define OO, it just gave birth to it

On the other hand, there's the issue of who coined the term in the first place and what it was coined to describe - as coupled with C++'s own designer insisting that it's not an OO language!

I've no intention of re-running the holy war here, but it's simply not the same as attempting to retroactively assert that FP = pure FP. The Smalltalkers aren't attempting a redefinition, they're arguing against one - albeit a widespread and mostly well-justified one.

### Functional Programming FAQ

The Functional Programming FAQ goes back quite a ways and gives the following definition:

Functional programming is a style of programming that emphasizes the evaluation of expressions, rather than execution of commands. The expressions in these languages are formed by using functions to combine basic values. A functional language is a language that supports and encourages programming in a functional style.

The FP FAQ lists the following languages (though it's not been updated in years): Caml, Clean, Erlang, FP, Gofer, Haskell, Hope, Hugs, Id, J, Miranda, Mercury, ML, NESL, OPAL, Oz, Pizza, Scheme, Sisal.
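The "expressions rather than commands" distinction is easy to demonstrate: here the sum simply *is* the value of an expression, with the command-by-command equivalent sketched in a comment for contrast.

```haskell
-- Functional style: the result is the value of one expression.
sumTo :: Int -> Int
sumTo n = foldr (+) 0 [1 .. n]

-- Imperative style, for contrast (pseudocode):
--   total = 0; for i in 1..n: total = total + i
-- There the result emerges from a sequence of state changes.

main :: IO ()
main = print (sumTo 10)  -- prints 55
```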

To try to bring this back on topic: the question is whether the ACM recommendation should require that certain programming languages (Haskell, ML, Scheme) be used in teaching functional programming? Or can any programming language (C++, Java, Perl) be used as a tool to teach FP?

### Or can any programming

Or can any programming language (C++, Java, Perl) be used as a tool to teach FP?

Certainly not. Teaching FP in C++ would be like teaching structured programming in x86 assembler.

### That's not entirely fair

A good macro assembler makes it reasonably easy to do structured programming.

### And a good class library

...in this case, Boost, makes teaching FP in C++ tolerable.

OTOH, there's still a lot of syntactic ugliness in this approach. If the point is to teach FP the paradigm, and there is no other reason to prefer one language over another, lots of other things are probably better choices than C++ (owing to the deficiencies of C++ enumerated, I think, above).

If, on the other hand, the goal is to teach FP to students who are already familiar with C++ and its idioms (or for whom C++ is part of the core curriculum), then there's no reason not to instruct 'em in how to do FP in C++.

### The template debugging problem

The template debugging problem for C++ is not solved: teaching FP in C++ may be tolerable for aficionados, but learning FP in C++ is probably not. This is just one problem with selecting C++ as it currently stands. Students will spend a significant amount of time dealing with implications of this language choice that they would not have encountered in common 'academic' and 'production' languages. There are legitimate reasons for teaching FP in the current C++ environment, but ease of learning is a stretch.

I'm more surprised that programming over large systems is not on the board. MPI, map/reduce, memcached, etc. - we're not at a point where we have a clear across-the-board answer on how to build large systems, but at least an understanding of the coming democratization of multiprocessors is important for both CS majors and hard science majors. Heck, maybe even some sort of free version of parallel MATLAB is enough.

### To make this even more on

To make this even more on topic: Should the curriculum refer to teaching a pure functional programming language or is Scheme/ML "enough"?

### in my experience

In my experience as whatever I am (don't claim authority here):

It pays to have exposure to purity and then some links from that to the math (e.g., way basic -- like Church).

CMU used to and maybe still does have a course of study called "Math Studies" that spends N+1 semesters basically doing naive set theory, functions, relations, etc. with considerable precision and then ultimately winds up pretty far out, depending on tastes. When I was drifting through it (lousy student here, reporting for duty) I found it to be a superb complement to learning about FP, even though at the time I don't think that was the instructors' intention.

I did Church on my own, in that context. And, in that context, it was easy and intuitive and sometimes even useful.

"FWIW"
-t

### Some of both; though I agree with PVR...

that eager evaluation should be the default strategy.

Scheme and Oz both have the nice property that laziness is just a keyword away--no need to manufacture a closure and then call it with a possibly dummy argument.
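The "manufacture a closure" workaround looks roughly like this in Python, a language without a laziness keyword. A small sketch with invented names (`delay`/`force` follow the Scheme convention, but this is my own toy version):

```python
# Without a lazy keyword, a computation is suspended by wrapping it
# in a zero-argument closure (a "thunk") and forced by calling it.
def delay(f):
    memo = []                 # empty until the thunk is first forced
    def force():
        if not memo:
            memo.append(f())  # compute once, cache the result
        return memo[0]
    return force

expensive = delay(lambda: sum(range(10**6)))  # nothing computed yet
first = expensive()    # computed on first force
second = expensive()   # cached; the lambda does not run again
```

With a built-in lazy keyword (as in Oz) or `delay`/`force` special forms (as in Scheme), the wrapping and unwrapping disappear from the source text.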

### Debating which is more

Debating which is more "pure" strikes me as a not-very-useful exercise.

I agree, but that isn't the issue under discussion. The problem was the identity "functional == pure", despite the fact that the majority of "functional languages" in widespread use are impure.

### #!

Both the ML family and Erlang allow arbitrary side effects; the only (somewhat popular) language that uses monads to encode effects is Haskell. As others have said, ML has references, so any function call can update references behind the scenes, and it has impure functions (e.g. print : string -> unit, a function that returns unit and has the side effect of printing a string to the console). Erlang doesn't have references in the language, but message passing causes side effects and is unconstrained; it's easy to define a process that behaves as a reference and create an ADT to manipulate it. It also offers impure functions.
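The "update references behind the scenes" point can be mimicked in Python: a function with an innocuous-looking call interface can still mutate hidden state, just as an ML function of type int -> int might update a ref cell. (A sketch with invented names, standing in for ML, which Python obviously is not:)

```python
# A hidden "reference cell", updated behind the scenes on every call,
# analogous to an ML ref that an outwardly functional-looking
# function silently increments.
call_count = {"n": 0}

def add(x, y):
    call_count["n"] += 1   # side effect invisible at the call site
    return x + y

print(add(2, 3))           # 5 -- looks like a pure function ...
print(add(2, 3))           # 5 -- same inputs, same output ...
print(call_count["n"])     # 2 -- ... but observable state has changed
```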

Given all that, it's pretty difficult to sustain the argument that purity = functional. The comp.lang.functional FAQ says:

Functional programming is a style of programming that emphasizes the evaluation of expressions, rather than execution of commands. The expressions in these languages are formed by using functions to combine basic values. A functional language is a language that supports and encourages programming in a functional style.

I believe most in the community would agree with this definition. If purity is required for a language to be considered functional then even Haskell would suffer: Haskell 98 is pure, but if we add FFI (and the infamous unsafePerformIO) then suddenly the language loses referential transparency.

Also, we could argue that functional languages require purity, but there can be pure languages that don't have functions as first-class citizens, so it's false that purity = functional.

### I'll be more specific then.

I'll be more specific then. The Wikipedia page similarly lists recursion and higher-order functions as characterizing FP. My point is simply that pervasive immutability is a characteristic of only some functional languages, though immutability is certainly encouraged.

### specificity

If I understand you correctly, you are highlighting point four of that list, which says: "4) Immutable datatypes per default".

I happen to disagree with points 1..3 of that list but concentrating on (4):

No, that's wrong. It's sloppy usage. Scheme provides an excellent example. Scheme allows fairly arbitrary side effects. Scheme is sometimes said to be a "functional programming language" but I think that close examination of typical usages will reveal that what is usually meant is that Scheme is a language which encourages the use of a purely functional style -- which is not really quite the same thing as saying that Scheme is an FP language. Scheme is a language good for FP -- but definitely not an FP language.

-t

### A number of those what is

what is usually meant is that Scheme is a language which encourages the use of a purely functional style -- which is not really quite the same thing as saying that Scheme is an FP language. Scheme is a language good for FP -- but definitely not an FP language.

Personally, I think you're splitting hairs. But let's try a different tack: would you consider LLVM's SSA-based IR to be a functional language?

### please clarify the question

To spare me from reading the entire spec and trying to guess the point of your question, can you please elaborate your question slightly? To what details of LLVM are you trying to call my attention?

-t

### SSA-form is pure, but the IR

SSA-form is pure, but the IR is not what anyone would recognize as a functional language. The point being that the identity you are drawing between 'purity' and 'functional' is invalid: a language can be pure and not functional, and a language can be functional and impure.

### bah

I think you're just reducing "functional" to a marketing term, with that line of reasoning. Sorry and, yes, I generally respect what you say here on LtU.

-t

### Marketing is popular

Andrew Appel wrote an expository piece, "SSA is Functional Programming", which I believe is behind the ACM paywall, though I think I have a PDF of it lying around somewhere. It's time this SSA ⇔ CPS correspondence had its own LtU story. Later today, perhaps.

### Purity <-?-> Functional

I'm too easily confused. :-} I don't understand why purity has to be a bijection with functional (or I'm misunderstanding the discussion entirely, another possibility!). E.g. Java can be made pure without becoming an FP language?

### Yet another purity post

I think it's been established that purity is a meaningful concept outside of FP; for the sake of yet more validation, there is much talk of pure logic programming in the Pure, Declarative, and Constructive Arithmetic Relations thread; and Elf is an example of an entirely pure LP language.

I think a useful family of tests of the right of a PL to be called pure, or declarative, or whatever, comes from asking what it is like to program in that PL using just the restricted subset. It's pretty clear that Scheme, ML, and Prolog support interesting pure subdialects, even if there are important things one wants to do in each of these languages that cannot be done using only pure means. SSA is a good call: Richard Kelsey observed that it is closely related to CPS translation.

Common Lisp is an interesting case: it clearly supports an interesting pure CL dialect, and yet coding in the pure dialect is not the CL way: one might say that semantically it supports FP, but culturally it does not.

### No fib()

How many people agree with the premise that inclusion of FP in the core curriculum is the most important single thing that the PL community can do for CS education?

I am currently playing with Google App Engine, a Python-based web app development framework and hosting solution.

While the programming model, at the lowest level, is very stateful (Python itself, as well as the offered object database), if you squint a bit, it also has "FP" aspects: An application is modelled as a group of HTTP request handlers, "functions" that take a request and return a response; these handlers are executed transparently-parallel on a cluster, which works because they are stateless; the transactional data access feels slightly "monadic" (handlers taking in and outputting a new "world").

I think that a development environment such as Google App Engine would be great for first-year students: they can create relevant, world-accessible applications, get to know advanced concepts, such as e.g. shared-nothing for scalability, factoring code for reusability, ..., and have to deal with important real world issues such as Unicode, HTTP specs, message digests, network latencies, etc etc.

While this is not a true "FP" approach to programming, I think that it would stimulate people (newbies) much more to think for themselves about the same deep programming issues that we want them to care about, than a typical first year "FP" course doing fib(), building "business objects" out of conses, and computing the length of a list with pattern matching. (Yes, I am exaggerating -- but only slightly :)

So, I don't really know what you mean by FP in your quote, but I think that at least beginners would be better served with "App Engine" than with the usual fare of "Beginning FP with CONS and FIB". Because it would probably stimulate and engage them more, and maybe, hopefully inspire them to learn for themselves. Not because of technical merits, necessarily.

What about the pure PL perspective? Well, actually I'm a bit afraid of what I'll find if/when I ever look under Python's hood (maybe nothing, just a bunch of hashtables :) ?) But then again, I am typing this message on a multi-million-lines-of-C/C++ OS/browser combination, to be transmitted to LtU's PHP content management system, and delivered to your (probably) Python-based Google Reader, so these languages seem to work in the real world, which should be enough justification for teaching (one or the other of) them. Of course, students should also learn more "advanced" languages, but I don't think that their first encounter with (functional) programming should be fib().

### Motivation is an important

Motivation is an important factor, I agree.

I was thinking along different (not mutually exclusive) lines. For example, how to make it so formal reasoning skills and rigorous semantics are integrated into the entire CS curriculum (or at least the part that discusses programming and programming languages)...

### Forums for discussion require educated people

Well, I finally joined this forum because I realized that I have no-one at work to discuss programming with at this sort of level. One reason for this is that CS undergraduates are generally taught C++ (badly) and Java and little else. If they had a little more formal reasoning training we could get some interesting discussions going and move on from (for example) tedious debates about sub-dialects of C++.

These could possibly be taught in CS courses in the context of ways to avoid normal (imperative) programming pitfalls: aliasing problems, timing issues, space leaks and so on. Perhaps then the student will feel the pull of languages that enable them to express their programs more formally?

### Reasoning is the key

From my experience, in the US mathematical logic is not taught in public/middle tier schools. I didn't know what a proof was until my sophomore year of university when I took a 'bridge' course in the Math dept taught by a gifted young professor. In my opinion, learning the fundamentals of proofs was the most important thing I've ever learned. I think this is the key to introducing more rigor into undergraduate CS. Another important thing is introducing the 'big ideas' in CS early in the curriculum. The book 'Algorithmics' by David Harel should be the CS 101 book in my opinion.

Sidenote: Python seems like a really great language to teach compiler implementation with. It has enough functionality in its primitives to allow students to concentrate on algorithms, not pointer gymnastics as one would in C/C++. Plus, seeing lists used in an effective, time-saving manner would be a great intro to at least one aspect of FP languages: lists. That's my .02....
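As a sketch of that sidenote, a toy tokenizer for arithmetic expressions fits in a few lines of Python, with lists and slices doing the work that would require pointer gymnastics in C (illustrative only, not from any particular course):

```python
def tokenize(src):
    """Split an arithmetic expression into number and operator tokens."""
    tokens, i = [], 0
    while i < len(src):
        ch = src[i]
        if ch.isspace():
            i += 1
        elif ch.isdigit():
            j = i
            while j < len(src) and src[j].isdigit():
                j += 1                       # consume the whole number
            tokens.append(("NUM", int(src[i:j])))
            i = j
        elif ch in "+-*/()":
            tokens.append(("OP", ch))
            i += 1
        else:
            raise SyntaxError(f"unexpected character {ch!r}")
    return tokens

print(tokenize("12 + 3*4"))
# [('NUM', 12), ('OP', '+'), ('NUM', 3), ('OP', '*'), ('NUM', 4)]
```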

### The most important thing?

I guess I'm not on the side that says that teaching FP should be a goal in and of itself. Rather, it is a means to an end. The end product of any curriculum should be to convey methods of reasoning and logic.

### FP as a DSL for PLT?

It's been said by someone (can't remember whom) that FP languages are domain-specific languages for the programming community. While the "domain-specific" charge is certainly false--FP languages excel at lots of other things--they are by far the most natural way to write programs which analyze and process other programs. Strong logical semantics are helpful when doing this sort of thing; and it's hard to get that with mutable state everywhere.

(Of course, the "impedance match" may have to do with other features found in languages like Haskell and ML, but not in most mainstream statically-typed imperative languages, in particular things like algebraic types and pattern matching--the presence of these is of course orthogonal to the issue of mutable state).

That said--there's a phenomenon, readily observed (I don't know if it's been formally studied), that programmers who focus on a particular problem domain, often tend to value the PLs (and architectural features) which are of use in that domain. Enterprise app programmers in the business of translating *ML to SQL and back, find dynamically-typed languages to be of benefit, as they have to cross several linguistic boundaries where static type systems might get in the way. I know plenty of mathematicians and scientists who wonder why we don't all program in Mathematica. And, in embedded systems (where I make my living), C++ is still highly regarded--in particular on resource-starved platforms. Many DBAs wanna write everything in SQL, coupled with the stored procedure mechanism of their favorite DBMS.

Nothing wrong, of course, with identifying an architecture appropriate to a specific problem, and using (and advocating) it for that problem. The problem comes when people assert that what works for them, ought to work for everybody. We all roll our eyes when we hear colleagues suggesting that Ruby, or Java, or C++, or SQL, or Fortran, or Mathematica, or even Ada is the Only Language We Will Ever Need; we likewise roll our eyes when such claims are made concerning programming paradigms or styles. But the same observation can be made about PLT experts and functional programming. I like functional programming; I annoy my co-workers quite a bit with my anal-retentive quest for const-correctness in my C++ code at work. (Helps me write better code, though). But--many problems are inherently stateful, and are most naturally expressed in a stateful language.

So where should FP fit in the Ideal Curriculum? I think the functional style should be encouraged when it works--regardless of the language. HOFs should of course be covered (and compared/contrasted with other means of doing the same thing, such as objects). PVR's book, or one similar (SICP), should form the basis of any introductory programming course.

But is FP (or a FP language) the Most Important Thing in a CS curriculum? Of course not.

(I also think, speaking more broadly, that more massive restructuring of CS/SWE in general is needed, including creation of a "vocational" curriculum that might not result in a 4-year degree. Trying to produce a single curriculum that both gives industry competent programmers, and research programs competent computer scientists, does nobody any favors).

### Right!

So, like, can't we just push CTM on everybody and call it a day?

### Similar to "Physics" vs. "Engineering"

How much theory and "not useful" stuff (like FP, type theory, theorem provers, etc.) to include in a CS major is a lot like the debate about whether engineers ought to study relativity and quantum mechanics, or just stick to the more useful stuff like circuit diagrams, strength of materials, gas equations, and such. This got resolved a long time ago with the creation of Mechanical Engineering, Chemical Engineering, Electrical Engineering, etc. (I'm not saying the best engineers don't also know physics, or math, or have an interest in advanced research; I'm just speaking of the norms of each area. And even something like physics rapidly splits into Applied and Theoretical, echoing a lot of what I'm seeing in the programming and CS curriculum debate.)

As someone who once studied the arcane physics stuff, then worked on chips for a big company, and later got interested in FP (through Backus's 1978 "Can Programming Be Liberated from the von Neumann Style?" Turing Award paper), I see the debate about FP, theorem proving, type theory, etc. all echoing this split between physicists and engineers.

Maybe it's time to push for much more than just putting some particular hot topic into a curriculum....maybe it's time for Computer Science to actually split into branches, reflecting the way the Chemistry Department is different from the Chemical Engineering Department, for example.

Most of my programmer friends here in the Valley (Silicon) are hip-deep in things like "the LAMP Stack" (the package of interrelated tools like Linux, Apache, MySQL, and Python/Perl/PHP, hence the name). The Web 2.0 companies want programmers with skills in Java, Javascript, AJAX, Ruby, Rails, Python, GWT (Google Web Toolkit), and the whole alphabet soup of current stuff. Similar alphabet soups, with different names, existed 10 years ago, 20 years ago, etc.

I tell some of these friends about Haskell and how cool it all is, and try to explain monads and zippers and whatnot to them, or even wax on about lazy evaluation and referential transparency, but it's pretty clear that this is not stuff needed in industry at this time. (Maybe industry _should_ need it, but they haven't decided they do.) Likewise, object-oriented approaches were "in the air" by 1980, yet teaching this stuff was mostly for only the most gifted or interested students. (OO didn't even make it into the normal CS curricula until the push for C++ and then Java, did it? I used it on my Lisp projects in the mid-80s, but except for a few schools (MIT, Indiana, Stanford, etc.), there wasn't a lot of OO training in other schools back then, or so I recollect.)

(Which is why I'm not too worried about FP not being taught widely. As its advantages for ATM, massively multicore, and message-passing systems become clearer, it'll get picked up. Friends of mine are already sniffing around Erlang, with Armstrong's new book garnering some attention. If someone besides Ericsson deploys a financially successful app, interest in this kind of approach will surge. It always does.)

But rather than take out the cutting edge stuff from the CS curriculum, the stuff on type theory, on FP, on Curry-Howard and all, leave that for those interested in theory and a career in theoretical areas, and create real "engineering" programs for those expecting or wanting to go work for Oracle or MySpace or whatever Web 3.0 jobs are out there in coming years.

(And even trade schools, as some people have said. When I was at a chip company, we had the more theoretical types, the architects and designers and scientists, and we had the engineers implementing the designs and building the chips, and we had lots of "technicians," people often with AA degrees or "DeVry Technical Institute" types of training, or even military training. And we also had "operators," with less formal training.)

Computer science and programming is such a large fraction of the job market in so many cities (despite what is said about declining CS enrollment) that it's natural to expect CS to fork the way the once-monolithic "Natural Philosophy" forked into Physics, Chemistry, etc. and then into these areas plus all the Engineering versions.

--Tim May

### Oops

See my reply below... You took the words right out of my mouth!

### The real reason: FP is the foundation for all other paradigms

The proposal for adding FP emphasizes (rightly) the qualities and unique properties of FP, but it does not mention the most important reason, IMHO, for including FP early in the ACM CS curriculum: FP is the foundation for all other programming paradigms. General data abstractions (including OOP) can be expressed by FP + mutable state. Dataflow concurrency can be expressed by FP + concurrency. Understanding FP gets you a long way toward understanding *all* other paradigms. For example, most OO abstraction techniques are simple extrapolations of FP techniques, and this is clear when you understand how OOP can be expressed by FP + mutable state.
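The claim that data abstraction (including OOP) can be expressed as FP + mutable state has a classic small demonstration: an "object" is just a record of procedures closing over a shared mutable cell. A Python sketch of the encoding (my own illustration, not taken from CTM):

```python
def make_counter(start=0):
    # The object's private state: one mutable cell shared by the
    # closures defined below.
    state = {"count": start}

    def increment():
        state["count"] += 1
        return state["count"]

    def get():
        return state["count"]

    # The "object" is a record of methods closing over the same state;
    # there is no class, only FP (closures) plus mutation.
    return {"increment": increment, "get": get}

c = make_counter()
c["increment"]()
c["increment"]()
print(c["get"]())  # 2
```

Message sending becomes record selection plus application, and encapsulation falls out of lexical scoping: nothing outside the closures can touch `state`.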

In my view, this is the real reason why FP should be included in the basic curriculum, and why it should be taught before other paradigms. I've been teaching this way since 2003, using a draft of CTM for the first courses. I have adapted the course and the current one is highly polished. Unfortunately, all the course materials are in French (see ctm.info.ucl.ac.be/fr; I have added a hyperlink to Google Translate). The course teaches three paradigms: FP, then FP+state (data abstraction and OOP), and finally FP+concurrency (dataflow). It covers lots of practical techniques but also gives a formal semantics. For many students (even CS students) this is the only time in their careers that they see a programming language semantics.

### Are dataflow variables necessary?

Oz (the language used in CTM) has dataflow variables, but most other (functional) languages don't. They play an important role in other paradigms like declarative concurrency and relational programming. Can you create a language which supports all the paradigms in CTM but with normal variables (like Scheme and OCaml) instead of dataflow variables (like Oz)? Or are dataflow variables necessary? Is FP the foundation of all other paradigms, or is it FP + dataflow variables?

I ask this not only because dataflow variables aren't a part of other functional languages, but also because I do not like certain aspects of dataflow variables. First, they don't feel like a basic concept, but like a data structure. Second, they don't play well with user-defined data types. For example, you can use unification on lists very nicely, but if you create your own data type (for example a set data structure) then the built-in unification doesn't work with this data.

On the other hand, dataflow variables seem to be the central thing that ties all paradigms together.
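For readers unfamiliar with the concept, the core behavior of a single-assignment dataflow variable can be sketched in Python with a condition variable: reads block until the one-and-only bind happens. (A deliberate simplification of the Oz semantics -- no unification, no by-need triggers -- with names of my own invention:)

```python
import threading

class DataflowVar:
    """Single-assignment cell: reads block until the variable is bound."""
    _UNBOUND = object()

    def __init__(self):
        self._value = DataflowVar._UNBOUND
        self._cond = threading.Condition()

    def bind(self, value):
        with self._cond:
            if self._value is not DataflowVar._UNBOUND:
                raise ValueError("already bound")  # single assignment
            self._value = value
            self._cond.notify_all()   # wake every blocked reader

    def read(self):
        with self._cond:
            while self._value is DataflowVar._UNBOUND:
                self._cond.wait()     # dataflow synchronization
            return self._value

x = DataflowVar()
threading.Thread(target=lambda: x.bind(42)).start()
print(x.read())  # blocks until the other thread binds x, then prints 42
```

Note how concurrency is doing the work here: in a single-threaded program, reading an unbound variable would simply deadlock, which is exactly the point raised a few comments below.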

### I'd say the lambda calculi

are the foundation, and these (the simple ones, at least) don't have dataflow variables; they just have bindings, functions, and function application. Likewise, quite a few programming languages don't have anything resembling dataflow variables (even those with mutable state), so I'm not sure I'd consider them foundational.

Has Oz at any time considered "splitting" the current dataflow variable into the seemingly orthogonal concepts of

a) a simple binding; where an entity is bound to a name at the point of definition, and

b) a datatype (the dataflow variable) which has the Oz semantics, but need not necessarily be bound to an atom (its name; here I don't refer to the Oz use of "name" to be an unforgeable unique ID) in some scope?

A declare statement in Oz essentially gives you an instance of b bound to some name, which must be unique in the containing scope; but it might be nice to have anonymous dataflow variables, which can be a tuple element, but which aren't named. (The case for a without b is less compelling; as a dataflow variable can be bound at its point of definition; a case which I'm sure the tools could optimize).

### Lambda calculus with futures

If I understand you correctly, then yes, this has been considered, and is one of the ideas that led from Oz to Alice ML and the underlying lambda calculus with futures.

### One other issue with dataflow variables

at least as implemented in Oz, is that they need concurrency to work correctly. Without concurrency, trying to read from one that has not been bound causes deadlock; hardly a useful state of affairs.

Of course, other semantics for such variables can be thought of in the unbound case. Returning an error (which Oz supports as an alternate; though as PVR notes in the book this loses referential transparency), returning a closure of some sort which when evaluated, will reduce to the value if and when it ever DOES get bound (otherwise it stays in the unbound form), etc.
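The "returning a closure" alternative can be sketched as a non-blocking read that yields a suspension while the cell is unbound -- a toy semantics of my own, not what Oz actually does:

```python
class Cell:
    """Single-assignment cell with a non-blocking read.

    Reading an unbound cell returns a suspension (a callable) instead
    of deadlocking. Calling the suspension after the cell is bound
    reduces to the value; before binding it just yields another
    suspension, i.e. it "stays in the unbound form".
    """
    _UNBOUND = object()

    def __init__(self):
        self._value = Cell._UNBOUND

    def bind(self, value):
        assert self._value is Cell._UNBOUND, "already bound"
        self._value = value

    def read(self):
        if self._value is Cell._UNBOUND:
            return self.read     # a suspension: retry later
        return self._value

c = Cell()
suspended = c.read()   # unbound: we get a suspension, not a deadlock
c.bind("hello")
print(suspended())     # now reduces to the value: hello
```

As noted above, this kind of semantics trades the deadlock for a loss of transparency: the reader can now observe whether the variable was bound yet.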

### Importance of dataflow variables

Is FP the foundation of all other paradigms, or is it FP + dataflow variables?

This is a very good question. My own view is that FP+dataflow variables (FP+DV) is a better foundational paradigm than FP, because of concurrency. Here's why:

• FP+DV, even without concurrency, lets you program a tail-recursive append in the naive way, which is not possible with FP alone (try it!). This is because you can build a partial list with a DV "hole" in it, which is filled in later by the recursive call. Programming with "holes" that are filled in later is an essential technique. We have known for thirty years that append written in Prolog has this property (since D.H.D. Warren's thesis in 1977); but its wider importance is less well-known.
• Combined with concurrency, tail-recursive list programming is essential for building concurrent agents that don't have memory leaks. The simple way, a thread containing a recursive function that reads a stream (a list with unbound DV tail) and outputs a stream, is the correct way.
• FP+DV can be extended easily to be lazy, by adding a single concept, wait until needed, which waits until the value of a DV is needed by another operation. This is amazingly powerful: most programs (concurrent or not) can be made lazy simply by sprinkling them with WaitNeeded calls without changing anything else.
• The kicker is that all these languages, FP+DV, FP+DV augmented with concurrency, and FP+DV augmented with concurrency and wait until needed, are declarative. For concurrent languages we need a generalized definition of "declarative"; see chapter 4 of CTM. The notion of "partial termination" (called "resting point" by Vijay Saraswat) is essential: whenever the input to a declarative program stops growing, then the output will eventually stop growing and all possible executions give equivalent outputs.
• A final point is that both DV and wait until needed are easy to implement and efficient. In the most naive implementation, you need to do occasional extra pointer indirections because of the transparent single assignment (see below). These extra indirections are a small price to pay for the huge increase in expressive power, and they only slightly increase the execution time of most programs. A little bit of work in the compiler can make this overhead go almost to zero.
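The "hole" technique from the first point can be approximated in Python, with a mutable tail field standing in for the single-assignment dataflow variable (in Oz the hole would be an unbound DV filled exactly once, not a mutation). The payoff is that append builds its result top-down in constant stack space:

```python
class Cons:
    def __init__(self, head, tail=None):
        self.head, self.tail = head, tail   # tail=None plays the "hole"

def append(xs, ys):
    """Build the result list front-to-back, filling each hole in turn."""
    if xs is None:
        return ys
    result = last = Cons(xs.head)           # result has a hole at its tail
    xs = xs.tail
    while xs is not None:                   # constant stack, unlike naive recursion
        last.tail = Cons(xs.head)           # fill the hole with a new cell
        last = last.tail                    # ... which carries the next hole
        xs = xs.tail
    last.tail = ys                          # fill the final hole with ys
    return result

# Helpers for demonstration.
def from_list(items):
    node = None
    for item in reversed(items):
        node = Cons(item, node)
    return node

def to_list(node):
    out = []
    while node is not None:
        out.append(node.head)
        node = node.tail
    return out

print(to_list(append(from_list([1, 2]), from_list([3, 4]))))  # [1, 2, 3, 4]
```

In a declarative language with DVs, the same shape is achieved without mutation: each "hole" is an unbound variable that the next step binds exactly once.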

By the way, these properties of dataflow variables do not depend on unification. Transparent single assignment and synchronization on single assignment are enough. By "transparent" I mean that when a DV is assigned it *becomes* its value. The value can be of any data type. Oz has unification for historical reasons, since its roots are in logic programming, and unification generalizes to become constraint programming. But if you just want lazy concurrent FP, you don't need unification.

Dataflow variables are not a new idea. The basic idea appeared in the 1980s, with the independent invention of futures in Halstead's MultiLisp, Arvind's I-structures (later taken up by pH), and synchronizing logic variables of concurrent logic programming. (The E language also uses DV, calling them "promises".)

### Almost tail-recursive (append)

You don't really need dataflow values to have "almost tail-recursive" function optimisation using a "hole". You can do it in the compiler.

Laziness does not require dataflow values either I believe.

### Laziness and dataflow

You can do it in the compiler.

Of course. The obvious way is for the compiler to use holes internally and do code optimizations with them. This approach is not so nice, for two reasons: (1) it hides a concept that the programmer should see, and (2) it leads naturally to a "smart-aleck" compiler, whose performance model is complicated (cf. Hudak's definition).

Laziness does not require dataflow values either I believe.

I never claimed so. I just presented a declarative language that is both lazy and concurrent, and explained how easy and elegant this language is when you have dataflow variables. The declarative subset of Haskell is lazy but not concurrent (a pure lazy language executes by coroutining).

In my message I tried to explain things simply and clearly without much technical baggage. This does not mean that what I am saying is trivial. Instead of shooting from the hip, why not try to understand what I am saying?

### Re: Laziness and dataflow

In my message I tried to explain things simply and clearly without much technical baggage. This does not mean that what I am saying is trivial. Instead of shooting from the hip, why not try to understand what I am saying?

Will do, thanks for the advice. Does your post answer Jules' observation about dataflow values looking like data structures?

### Transparent single assignment

Does your post answer Jules' observation about dataflow values looking like data structures?

Yes, it does.

### Concurrency, eagerness, laziness

are all just different reduction strategies. (Of course, everyone here knows this already; I'm just repeating it to make a point). While there are excellent reasons, as discussed in CTM, to make eager evaluation the default (in which case laziness often does get implemented as a keyword or annotation), the difference between the strategies ultimately comes down to which available term gets reduced next.

At any point in a computation, after all, there will be some finite n terms which can be reduced. If n is zero, then you're done (or stuck). If n is one, then what happens next is obvious. If n > 1, then you have a choice.

Eager and lazy evaluation only focus on a subset (the "frontier") of the available reducible terms; and both make a rather simplistic choice. When beta-reducing a function application, one reduces the function body first, the other reduces the argument. But neither will attempt to reduce, at any given step, an available computation which is higher "up the stack" than the limited view that the machine looks at--even if the limited subset that it looks at consists of the null set. As a result, if you try to obtain the value of an unbound dataflow variable, you get stuck if you are running a sequential program.

ByNeed variables in Oz provide one way for the computation to get unstuck--by providing a locally-visible alternate set of redexes which have the property that they will ultimately permit forward progress. (They have the nice property that they syntactically guarantee forward progress, assuming the ByNeed code terminates).

But what of concurrency? In thread-oriented concurrency, you expand the scope of available redexes beyond the immediate view to a larger subset of possible redexes, and generally select one nondeterministically. Practical implementations of concurrency divide a computation into subsets of redexes, called "processes" or "threads", and then only consider reductions on the frontier of each. Then there's maximal concurrency, in which any available redex in the computation may be selected for reduction; as is noted in CTM, this isn't practical to implement. In both cases, the thread-based and maximal-concurrency models use nondeterminism to select a redex to reduce, and if hardware permits (i.e. you are running on a multicomputer), may in fact reduce multiple terms simultaneously. Nondeterministic selection has the advantage that it is often more "fair" than naive deterministic approaches (which, like eager evaluation, may cause part of the computation to make no progress). Of course, nondeterminism has its own issues: if you add things that make the programming model non-declarative, then the nondeterministic operational semantics becomes visible. Which may, or may not, cause incorrect results.
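As an illustration of how concurrency gets a stuck computation unstuck (a Haskell sketch rather than Oz), a single-assignment dataflow variable can be mimicked with an `MVar`: the reader simply blocks until some other thread binds the value.

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, readMVar)

main :: IO ()
main = do
  x <- newEmptyMVar                          -- an "unbound" dataflow variable
  _ <- forkIO (putMVar x (21 * 2 :: Int))    -- another thread binds it
  v <- readMVar x                            -- blocks until bound, then reads
  print v                                    -- prints 42
```

In a single-threaded program the `readMVar` would deadlock; with the second thread available, the scheduler can make progress elsewhere and the read eventually succeeds.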

Has Oz ever considered any other strategies (between simple lazy/eager evaluation and nondeterministic concurrency) for dealing with stuck computations? Backtracking is one obvious choice that comes to mind. If you evaluate

{ foo A B }

and A depends on a dataflow variable that is bound by B, the computation will get stuck. If, on the other hand, the Oz runtime were to backtrack and try evaluating B when evaluating A gets stuck, then the computation as a whole can make progress. Compared to ByNeed synchronization, this permits the source and sink of the dataflow variable to be decoupled (though of course you lose the static guarantee that the variable will in fact be bound).

Of course, even that is no panacea; if you try to evaluate

{foo A B C}

where A refers to an unbound variable which is bound by C, and B fails to terminate, you're still screwed. :) (OTOH, nontermination of B would still cause nontermination of the program under an eager evaluation scheme, even if there were no data dependencies needing resolution).

[ed: fixed broken HTML that turned everything afterwards italics]

### Hear, hear.

(I also think, speaking more broadly, that a more massive restructuring of CS/SWE in general is needed, including the creation of "vocational" curricula that might not result in a 4-year degree. Trying to produce a single curriculum that gives industry competent programmers and gives research programs competent computer scientists does nobody any favors.)

YES. This is IMHO the most important point to come out of this and the Stroustrup thread. By the time you finish high school, if you're "interested in cars", you already know whether you want to be (a) an auto mechanic, (b) an automotive engineer, or (c) a physicist studying thermodynamics and the fundamental limits of combustion engines. In CS this is, for the most part, a complete mess. Some college CS students basically want to be auto mechanics, while some basically want to be mathematicians. And in the industry, no one is really sure where to go to hire the right kind of graduates. (Notice that I'm not talking about a hardware/software split, but something more subtle.) So it's only natural that there's no consensus on curriculum. I really hope to see this situation change within the next couple of decades.

### Specialization

Don't know how it is elsewhere, but at UTD there are three different degrees: Software Engineering, which is aimed at industry; Computer Science, which is more aimed at research; and Computer Engineering, which is aimed at hardware. Within these majors, there are course plans that specialize further. An introductory course is also taken by many who have no interest in pursuing a computer-related degree.

Probably worth noting that SICP was aimed as much at the EE's and ME's as it was the CS majors.

### algorithms & how OO is taught

Far be it from me to sound like a Pollyanna, but I think the colleges do a pretty good job teaching OO given the constraints they face.

As for algorithms, do we have any good ideas about how to do that much better in an undergrad setting? The only one that comes to mind is teaching them in languages that strip away the boilerplate. My algorithms course was in C and Modula-2. I understood the concepts no problem; I spent the course trying to get pointer manipulations to work right in all the edge cases. I would have learned 10x as much if they had used Scheme. Today, in Haskell, I knock off much harder algorithms in 2% of the time I spent then, and I'm probably more out of practice now (though a lot more experienced). Doing them in Perl is about 5x as hard as in Haskell, and Perl is much, much easier than C/Modula-2.
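To make the boilerplate point concrete, here is the textbook list quicksort in Haskell (a sketch, not from the original course): the algorithmic idea is essentially the whole program, with no index arithmetic or pointer edge cases to debug.

```haskell
-- Quicksort on lists: partition around the head element, recurse, append.
qsort :: Ord a => [a] -> [a]
qsort []     = []
qsort (p:xs) = qsort [x | x <- xs, x < p] ++ [p] ++ qsort [x | x <- xs, x >= p]

main :: IO ()
main = print (qsort [3, 1, 4, 1, 5, 9, 2, 6 :: Int])  -- [1,1,2,3,4,5,6,9]
```

This is of course not the in-place algorithm (which is itself worth discussing in class); the point is that the idea is transparent.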

So yes, I really think FP is a good fit for classic algorithms courses. Mathematica strikes me as a great freshman language for almost all the sciences: a perfect 101 that fulfills the traditional comp-sci curriculum and is useful for the rest of their education. And no one is going to think you are a fanatic for suggesting Mathematica.

### FP not claimed to be a panacea

The proposal is not to make everybody use FP for everything. It's merely to put FP on an equal basis with OO in the CS core.

Right now OO is in the CS core curriculum (for 10 hours of lecture time), and FP is not. The proposal is simply to add 10 hours of FP education to the core curriculum, putting it on an equal footing with OO. And we figured out how to do this without adding hours to the core.

Also, this does not require that intro courses use FP; it only requires that the student get 10 hours of FP education somewhere in the curriculum. The OO requirement is similar. (And of course "require" is the wrong word, since the ACM curriculum is mandatory on nobody. But it's influential.)

Students who get the CS core should come out about equally competent in FP and OO.

### Well, I think there's no

Well, I think there's no reason to praise one paradigm and dismiss the others as nonsense. If you have time, please look at what the C++ standards committee is doing. C++ was never an object-oriented language from the beginning (that's what Stroustrup said); it's meant to be a multi-paradigm language. The problem is that few people try to learn new paradigms. In C++ we have had the STL (originally an Ada and Scheme library) for years, and now we have generic programming and template metaprogramming. If you look at the Boost C++ Libraries (www.boost.org if you don't know them), you will find over 80 libraries, including lambda expressions and much other functional-style machinery. It's not about which paradigm is best but about how we can use various paradigms to make better programs.

### All the things you mention

All the things you mention were discussed here more than a few times.

Mark Guzdial, who has been exploring Media Computation as a pedagogical approach at Georgia Tech, makes some critical remarks about this proposal in his blog.

Alan Kay chimes in extensively in the comments.

### More FP sounds good to me

As a recent university graduate who works in industry--and as someone who would like to create a compiler--I would very much have liked to take a functional programming course. For one thing, some FP knowledge is a prerequisite for understanding a wide variety of academic papers, which in turn are a useful source of inspiration and knowledge for practical industrial applications. And I'm pretty sure FP knowledge is useful in some respects for structuring one's own thinking; isn't FP useful for designing programs, even in non-FP languages?

Also, teaching FP would probably allow it to gain more use in industry, which might lead to more reliable software in the long run. Of course, I'm no expert, given that I didn't have a healthy level of exposure to FP in my own education. Grr!

IIRC, the beginner FP course at my university was not eligible for credit towards my engineering degree. Ironically, the prof in my mandatory Compiler Construction course insisted that we use Haskell for our compilers, which has an incredibly steep learning curve for those with no prior introduction.

P.S. Having read Mark Guzdial's post, I suppose giving a full 50% of class time to FP might not be appropriate. But I'm convinced it deserves a lot more than a passing mention.

Personally, one of the things I'd really like to learn about FP is how the compilers work. In C#, for example, the traditional functional way of computing Fibonacci has horrifyingly bad performance:

Func<long, long> fib = null;
fib = n => n > 1 ? fib(n - 1) + fib(n - 2) : n;

Write the same kind of thing in Haskell, in its idiomatic lazy formulation, and it runs efficiently. How does that work? I'd also like to see FP combined somehow with existing mainstream languages in a similarly performant manner (lambda functions are finally coming to mainstream languages, but without any FP compiler optimizations).
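The usual efficient Haskell version (a sketch, not something from this comment) exploits laziness rather than a magic compiler pass: define the whole Fibonacci sequence as a self-referential lazy list, so each value is computed once and then shared. A literal transcription of the naive recursion would still be exponential in Haskell, too.

```haskell
-- Fibonacci as a lazily built, self-referential list.
-- Each element is computed at most once and then shared,
-- so fib n costs O(n) additions instead of O(2^n) calls.
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

fib :: Int -> Integer
fib n = fibs !! n

main :: IO ()
main = print (fib 30)  -- 832040
```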

P.P.S. Wait a minute... only 20 "hours" total? How much could you teach about either OOP or FP in so little time?