Why Did M.I.T. Switch from Scheme to Python?

Daniel Weinreb has a short investigative piece about why MIT's well-known 6.001 course based around SICP and Scheme has been replaced with Python:

I’ve been seeing mail and blog postings, particularly from people in the Lisp community, asking why MIT has switched from using Scheme to Python in the freshman core curriculum for the department of Electrical Engineering and Computer Science.

At the International Lisp Conference, Prof. Gerry Sussman gave a short impromptu talk explaining the new freshman curriculum. Just to get a second opinion, I later called Prof. Jacob White, one of the designers of the curriculum and lecturers for the core courses. He confirmed Gerry’s description.

Asking why they changed languages is, in some sense, the wrong question.

Also discussed previously on LtU here.

"Weinreb"

His name is misspelled in the OP. Sorry for the nitpick...

No Problem

Fixed... I know I noticed the spelling, but forgot to double check this post. Thanks.

Unbelievable

SICP reflected an aspiration to place computer science closer to the pure sciences, i.e. math ... “However, nowadays … engineer must learn to perform scientific experiments to find out how the software and hardware actually work”

Is this story a joke?

No, not a joke.

Here is a more detailed text of what Sussman said.

Perhaps Sussman thinks Python was a bad choice:

"Why did they choose python? Who knows, it’s probably because python has a good standard library for interacting with the robot."

Wrong

No, that wasn't it. I saw Sussman speak at the ILC, and it was clear that he endorsed the decision. I'm pretty sure he even said he was one of the ones who pushed for it.

Re: Unbelievable

This appears to be a serious post. The observation that experimentation is at least as important as simple APIs seems to be true of my own experience: all too often I need to interface with some kind of application, not necessarily "legacy", which I would much rather avoid completely, but for which I don't have the time or energy to come up with a proper replacement, not to mention the marketing effort that would be required to convince a reasonable segment of decision-makers that my hypothetical solution is indeed better.

I'm not saying that this is the way that things should be, and in all fairness SICP has not vanished from MIT's coursework. This debate reminds me of API Design Matters for some reason...

My understanding

My understanding of the subtext: Python gives a big software library, with a 300-page manual full of errors; it also gives an interpreter, whose exact semantics is extremely hard to characterize (what happens when the types are wrong?). :)
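
To make the jab concrete, here is a minimal illustration (stock CPython behaviour, nothing exotic) of what "wrong types" actually do: nothing is checked before the program runs, and whether an operation fails or quietly succeeds depends on the particular operator and operand types.

# Nothing below is rejected until it executes; the outcome depends on
# which operator meets which types.
try:
    "1" + 1                    # str + int is rejected, but only at run time
except TypeError as err:
    print("str + int ->", err)

print("1" * 3)                 # str * int is quietly accepted: '111'
print(True + 1)                # bool is a subclass of int, so this is 2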

it's not about the language choice

I think the comment -

Asking why they changed languages is, in some sense, the wrong question.

.. is dead on. However, I do think Abelson & Sussman will have to withdraw their maxim "programs must be written for people to read, and only incidentally for machines to execute", 'cos Python seems to have been chosen because that's what the machines can execute ;)

Anyway, we're increasingly having to deal with a multilingual world even when it comes to just computer programs. So a language agnostic foundation, which scientific experimentation indeed is, is certainly desirable. That doesn't invalidate PLT as a discipline, but it certainly clarifies that programming machines and systems is not *only* about languages.

So a language agnostic

So a language agnostic foundation, which scientific experimentation indeed is, is certainly desirable.

As a child I was raised with German as a first language. It's surely not the best language one could imagine, but I'm still glad that I wasn't educated in a language agnostic way.

German

As a child I was raised with German as a first language.

Well, until WWII Germany was perceived as the country of philosophers, and German as their language. You could have done worse.

Language choice Does matter

The language choice of course matters! I would guess that all of us have heard what Dijkstra said about Basic and Cobol, and how such languages destroy the potential of programmers beyond repair.

Personally, I think Python is one of the worst possible choices. Let's consider this: a language that seeks to completely control/babysit the programmer, and a community ignorant enough to let an implementer brag about how Tail Call Optimization is evil, and another member compare Python with LISP in a way so insulting and stupid that it almost makes me puke. Let me ask you, how can we produce a good programmer out of a slave of the language, whose community only enslaves that programmer further?

Technical considerations

Dijkstra's amusing troll aside... Mathematicians, scientists and engineers consider measurement and testable models to be the only objective basis from which to derive conclusions. Is there anything in your statements that can be objectively measured? Does this really harbor the dénouement of the Eternal September? The rhetoric of opinion, politics and religion, rather than sound science for pedagogical techniques. Although many on LtU would agree that language choice does matter, I would think that the lack of statistically significant evidence would give rise to a greater sense of humility when comparing PLs and teaching techniques.

Ego is typically measured in nanoDijkstras...

...which is saying a lot in a profession driven largely by ego. We try to keep ego under control here at LtU, and avoid the kind of rhetoric, language, and drama exhibited in your post.

I'm one of those brain damaged programmers Dijkstra talked about: I grew up with Basic, and switching to Pascal was my first lesson that a language could be substantially better than another. Believe me, BASIC was better than nothing, and the early lesson in the value of language was also quite nice.

I'm definitely pro-TCO, though I can (somewhat) understand where Guido is coming from, even if I think his argument is flawed. But you most definitely can do much worse than Python, especially for a beginner.
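
For anyone who hasn't followed that debate, the practical consequence is easy to show (a small sketch against stock CPython, whose default recursion limit is 1000): a textbook tail call of the kind Scheme implementations are required to run in constant space simply blows the stack in Python.

def countdown(n):
    # A tail call in the Scheme sense: nothing remains to do after the
    # recursive call, yet CPython still keeps a frame for every level.
    if n == 0:
        return "done"
    return countdown(n - 1)

print(countdown(900))          # fine: stays under the default limit of 1000

try:
    countdown(100_000)
except RecursionError as err:
    print("no TCO:", err)

# sys.setrecursionlimit() only raises the ceiling; each frame still costs
# stack space, so the idiomatic Python fix is to rewrite this as a loop.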

python's new tag line!

"Could be worse. Could be PHP."

Is there CS in there somewhere?

The change to Python didn't concern me so much as the change in emphasis. From what I've read it seemed like SICP was concerned with the creation of "computers in software", creating new computing architectures. This is a valuable CS concept. Alan Turing said something like, "If you don't like the computer you have, you can create a better one," and this is one way to do it.

My read of Sussman's description of the new program was that instead students would study existing code libraries, come to understand them, and figure out how to piece them together into an efficient, effective whole. What struck me about this is that it is not unlike what I did for several years in my IT work, and I would definitely not say that it gave me a good sense of what CS really is. I'd say the opposite. There's a lot of badly written code out there. Will there be an emphasis on good design, or is the idea to just "make it work" with the libraries you have, however you can?

Secondly, my read of Sussman's comments was that he was describing the rationale for the change, but he did not endorse it. He took himself out of the picture, though, saying he was just old and out of step with the times. I think that's unfortunate, because I think CS could use the wisdom of its sages. "The times" we are in (speaking of CS and the state of software in industry) are not that enlightened.

Reading Weinreb's post I get a sense that the intent of the MIT program was always to educate from the engineering perspective, and the move to the new program just updates that focus. Maybe that's true. Hopefully SICP is still in there somewhere.

I don't agree with Ben Hyde's characterization (in the comments of Weinreb's post) of CS's aspirations, because he confuses "pure science" with math. He's right though that CS started out as a combination of disciplines (math, science, engineering), and that it lost that focus in many institutions of learning after the 1960s. I've heard many times (I was taught this when I took CS) that "CS is just an offshoot of mathematics". I agree this is a very limited POV and it's unfortunate that kind of thinking was allowed to hold sway for so long.

I agree with Scott Burson's comments (again from Weinreb's post) that the curriculum change does not seem conducive to understanding some CS fundamentals such as design of operating environments (operating systems, virtual machines), compilers, etc. These still have a place in CS.

SICP is still used

Well, from what I gather SICP is still being used at MIT, just not in the introductory courses. But I don't know too many details.

Endorsement

I may have been unclear. Sussman definitely said that he was entirely in favor of the change in curriculum, so he does endorse it in that sense. He feels it's the right thing to teach.

MIT is certainly still teaching operating systems, virtual machines, and so on. We're only talking about the freshman core curriculum, the one that not only the CS students but also the EE students take. The MIT web site shows the whole curriculum, and most of it is very similar to the way it was when I was an undergrad in the late 1970s.

Endorsement unclear

Jose Ortega-Ruiz was at the International Lisp Conference. He listened to Sussman's remarks on this. Quoting from Programming Musings: Sussmania:

It was nice in a kind of sad way: at the end, while answering a question, Gerry mentioned that this new computing world was not his, and it wasn’t one that he liked. ‘But’, he said, ‘that’s because we’re old farts’.

So I don't get the sense that Sussman endorsed it, but he considered his own POV to be irrelevant. Maybe he saw it as necessary, because of the current reality, but he doesn't like the reality.

I disagree

I agree with Scott Burson's comments (again from Weinreb's post) that the curriculum change does not seem conducive to understanding some CS fundamentals such as design of operating environments (operating systems, virtual machines), compilers, etc. These still have a place in CS.

I disagree with Scott Burson's comment.

Fundamentally, experimentation is certainly a route that you can take to build these concepts "from scratch". It is not necessary to be taught operating system concepts, for example, from first principles in order for students to "get" them. After all, how did the OS pioneers arrive at these concepts in the first place? In this view, direct instruction (when possible) can be seen as a "mere optimization" of the learning process.

Paths to understanding systems

I agree that experimentation can be a path to what we're talking about. I don't see analysis of existing libraries as a path though. It's no better than the "mere optimization" you speak of. Instead of learning about fundamental ideas of computing you learn the libraries, which present a specific design description. If the only goal is to understand that description and use that knowledge to make a machine work then I think the experience will be limiting. If the libraries were used as an exercise in analysis and critique against the fundamentals I think that would be constructive, because students could learn to see what they're really working with, and that there are larger ideas to explore beyond the libraries themselves. They could place the libraries in a larger conceptual framework.

It is not necessary to be taught operating system concepts, for example, from first principles in order for students to "get" them.

Most undergrad CS students don't "get" these things now. Most of the CS students I knew from my college days (16 years ago) didn't either. I "got it" a little because I took a course in compilers, but even that was not great.

Frankly I think it would be nice if the idea of writing a language were not made into a seemingly sophisticated exercise as it has been in CS curricula. Some older computer scientists can remember a time when it was common to write small languages to bootstrap more sophisticated languages, and finally systems. I'd like to see a course (or a focus throughout several courses) that had students start out with simple languages using simple techniques, and as needs get more sophisticated, introduce more sophisticated techniques for parsing and grammar interpretation.
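
To make that concrete, here is the sort of starting point I have in mind, sketched in Python only because that is now the course language (the grammar and names are just illustrative): an arithmetic language small enough to tokenize with a regular expression and parse by hand, with parser generators and richer semantics deferred until the language outgrows such simple techniques.

# A deliberately tiny expression language: integers, + - * /, parentheses.
# Tokenize with a regex, evaluate by recursive descent -- simple techniques
# first, fancier parsing machinery only when the language needs it.
import re

TOKEN = re.compile(r"\s*(\d+|[()+\-*/])")

def tokenize(src):
    pos, tokens = 0, []
    while pos < len(src):
        m = TOKEN.match(src, pos)
        if not m:
            raise SyntaxError(f"bad input at {src[pos:]!r}")
        tokens.append(m.group(1))
        pos = m.end()
    return tokens

def parse_expr(toks):            # expr := term (('+'|'-') term)*
    value = parse_term(toks)
    while toks and toks[0] in "+-":
        op = toks.pop(0)
        rhs = parse_term(toks)
        value = value + rhs if op == "+" else value - rhs
    return value

def parse_term(toks):            # term := factor (('*'|'/') factor)*
    value = parse_factor(toks)
    while toks and toks[0] in "*/":
        op = toks.pop(0)
        rhs = parse_factor(toks)
        value = value * rhs if op == "*" else value / rhs
    return value

def parse_factor(toks):          # factor := NUMBER | '(' expr ')'
    tok = toks.pop(0)
    if tok == "(":
        value = parse_expr(toks)
        assert toks.pop(0) == ")", "missing ')'"
        return value
    return int(tok)

def evaluate(src):
    return parse_expr(tokenize(src.strip()))

print(evaluate("2 * (3 + 4) - 5"))   # -> 9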

I'd like to see a broadened approach towards systems engineering as well, bringing in other disciplines that know something about systems, so as to expose students to not only what traditional CS has found out about the subject, but what scientists in other fields have found out about it, too. And I would like to see students being encouraged to model some system ideas in a computer system of their own (maybe even using a language they've previously developed).

That I think should be part of a real CS education.

Edit: Changed the last sentence from "That I think would be a real CS eduction".

the core ..

Instead of learning about fundamental ideas of computing you learn the libraries, which present a specific design description.

I understood the core change in the curriculum to be a focus on experimentation. If it takes learning a few libraries in order to engage in experimentation, I don't see any harm in that.

Most undergrad CS students don't "get" these things now.

To generalize, nobody really "gets" it :) New ways of thinking about computation keep coming up. For example, I felt like I "got it" when I learnt about monads and many concepts fell into place - sort of like what learning exterior calculus did to me. Did Church grok monads and "get it" in the way I did? I wouldn't know.

These are smart people, so I

These are smart people, so I don't want to rush to judgment, but this talk of experimentation as the reason to switch from Scheme is silly. First, Scheme is interpreted (yep, that's the read-eval-print-loop) and supports experimentation. Second, there are Scheme implementations that allow access to Java libraries, if it is libraries you want. So libraries and experimentation don't cut it as an explanation.

More fundamental, of course, is the question what you experiment with or about. SICP's great strength is exactly that it allows students to experiment with fundamental models of computation (that's one of the best things about the interpreter's approach more generally, of course).

experimentation ...

More fundamental, of course, is the question what you experiment with or about.

The choice of Python doesn't mean Scheme is unsuitable for experimentation with models of computation. That would be a very silly statement to make indeed. Mixing up two "different" fields, however, can get you something very fresh and unanticipated. When experimenting with robots, it is conceivable that parallel aspects of the world are more in your face than when working inside a sandbox like SICP where you're focused on "computation". It might have the potential to lead to new ways of thinking about and dealing with concurrency and modularity, for example. The learning potential there probably transcends the medium used for exploration, and so maybe after all it doesn't matter whether you use Scheme or Python. Whatever lets you talk to the robots easily goes.

The visual domain of Logo is probably a useful analog - it got kids cooking up fractals that their teachers had no idea were possible in the kitchen ... and it started off as a physical pen-drawing turtle robot.

Personally, if I got to take this course, I'd be very excited about it ... even though I prefer Scheme to Python.

Did Church grok monads and

Did Church grok monads and "get it" in the way I did? I wouldn't know.

I don't know about Church, but Curry certainly did; in some 1948 lectures he proposed the lax modality (i.e., a monadic type constructor) and suggested it had interesting proof-theoretic properties worth further study!

(I haven't seen the paper myself, though -- I'm repeating what Fairtlough and Mendler reported in their paper "Propositional Lax Logic".)
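
For readers who haven't met the term: a monadic type constructor wraps values and sequences computations through a "bind" operation. Keeping to the thread's other language, here is a loose Maybe-style sketch in Python (my own illustrative names, and only the programming-language face of the idea, not Curry's proof-theoretic presentation):

class Nothing:
    def bind(self, f):
        return self                # absence propagates; f is never called
    def __repr__(self):
        return "Nothing"

class Just:
    def __init__(self, value):
        self.value = value
    def bind(self, f):
        return f(self.value)       # feed the wrapped value to the next step
    def __repr__(self):
        return f"Just({self.value!r})"

def safe_div(a, b):
    return Nothing() if b == 0 else Just(a / b)

# (10 / 2) / 0: the failure short-circuits the rest of the chain
print(Just(10).bind(lambda x: safe_div(x, 2)).bind(lambda y: safe_div(y, 0)))
# -> Nothing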

"New" ideas in Computer Science

That goes to show that few ideas in CS are truly "new", but rather rediscoveries of what somebody already knew long before.

That goes to show that few

That goes to show that few ideas in CS are truly "new", but rather rediscoveries of what somebody already knew long before.

I would replace "knew" with "suspected", or perhaps "believed". Wouldn't be worth further study if they already had the knowledge!

Curry's lax monad

See Chapter 5 of Curry's A Theory of Formal Deducibility.

First principles

Having spent some quality time with SICP, the thing that strikes me about the book is that it attempts to be an introduction from first principles. Given its foundational beginnings, it's amazing how much ground it covers. I'm not one who believes that such an Aristotelian approach to teaching programming is the right approach in all circumstances. How To Design Programs is more accessible to beginners. And the Robotics/Python approach probably has its own merits - though I wonder if robotics is not a misplaced abstraction. I suppose controlling bots, be they mechanical or turtles, has a long history in introductory material. But I can't help but think that the mental model of robots also carries with it certain baggage that may not be useful in the more abstract applications of programming.

From an LtU perspective, I suppose the most redeeming qualities of SICP are chapters 4 and 5 (interpreters and compilers). Moving away from SICP amounts to relegating language tinkering to more advanced courses (though I can imagine less ambitious DSLs may make an appearance - à la the Picture/Robotics languages).

Such a bad thing?

As much as I'm a fan of SICP, I'm not sure that it is such a bad thing to make it a 3rd year text instead of a first year text.

When I first read it I had already been programming for many years, and part of what I loved about it was how it presented ideas that I had come to believe from my own experience to be "tricky" in a simple and elegant way. I doubt that aspect of the book can be fully appreciated by someone coming to the material as a newbie.

Somewhat more facetiously, I have to suggest that maybe they are just being merciful to their students, since it seems that many people, seduced by the excitement of SICP, go on to suffer miserably in their career as API and framework plumbers, wishing that being a programmer was actually the elegant and rational process that Abelson and Sussman had made it out to be.

seduction

Funny, Matthias Felleisen was recently heard on the PLT Scheme mailing list saying:

I have had at least two students tell me (in the words of one) that "you seduced me, you made me think computing and programming was sooo elegant and sooo beautiful, and now I am stuck with C++." :-)

Dissatisfaction with what is

I don't think dissatisfaction with what exists is a bad thing. It helps us realize that what passes for normal out in the work world is by and large messed up. It shows how much progress needs to be made to make it better.

I kind of had the same thing happen to me when I got my undergrad CS degree. My CS professors emphasized what a rational software engineering process should be like. I didn't see it when I got out into the work world. Everything was ad hoc. Nothing was well planned. In one place I worked I managed to get the ear of the VP of Engineering and introduced him to my course materials on software engineering. He was interested in it and decided to try to improve our processes based on it. Things did improve over time in terms of project planning. We also worked to make our code production process more disciplined using best practices (I think that was the VP's idea).

I got a bit of exposure to elegance in computing when I was taking CS, but it wasn't emphasized. Whatever sense I had about it came from somewhere else. We were not given the sense of creating a better computer through software. I wish we had, though I understand perfectly well that in industry it's far easier to improve SE practice than to improve the programming model.

Getting out of "programmer = plumber"

I know what this is like. I saw my job devolve into this. It didn't start out that way. One of my very first jobs out of college in the mid-1990s was working on a script interpreter written in C. I dealt with one database library, but that was it. The rest was working with the standard library, many linked lists, and tons of pointers. Aside from the development environment I found the project rewarding. My last IT job that I quit a few years ago largely involved working with framework plumbing. It got to the point that the only way I could justify it to myself was trying to make the plumbing more elegant by creating my own OO abstractions for it, but that took too long for my boss's liking.

I've talked to several other programmers these last few years who also feel like something wrong has happened, that they were meant for something more than just connecting "this" to "that", but it's all they see. They plead "What should I do about this?" The best answer I can give them is to look for work where the company is doing something innovative with computing itself. What I've found is that in order to head in this direction one has to get outside one's comfort zone, take some risks, and perhaps upgrade one's education. This doesn't necessarily mean a higher degree in CS, though it could. It could be another degree in a different subject, or doing some serious reading.

If they take the easy path into a career they're likely to end up as plumbers, because those jobs are the most numerous. To get to interesting work I think you have to take the road less travelled.

Plumbing geeks

There is a PL angle to this that I think is illuminating.

Many people who take up Python or Ruby talk about how these languages recapture the excitement of programming for them. But to my eye, a lot of a what people are actually doing in these languages is par excellence the kind of API and library plumbing we are talking about.

So for some people, I don't think that it's the plumbing that gets them down, they just want to operate in a culture where they get to be geekily enthusiastic about the plumbing.

Those of us who are PLT geeks, who are mesmerized by the Schemes and Haskells of the world, are probably a different breed of geek. I think we are more pragmatic about the plumbing (even if we are good at it), and instead are seeking some more abstract payoff. Those in this camp who really can't stand any plumbing should probably consider academic CS as a career.

The world needs both plumbers and researchers, and some researching plumbers too.

"The excitement of

"The excitement of programming", for a lot of people, is essentially the same as the excitement of making stuff. Even those of us who are in that camp, though, can be mesmerized by Scheme and Haskell and Erlang, but from the viewpoint of "how will this help me build better stuff, and faster?" rather than for its own sake, as if it were merely applied mathematics.

Illuminating

You and Marc provide some perspective, that the difference is larger than the languages used. It's a mindset. For years, growing up in the commercial computing culture myself, I believed that the task of a programmer was to build something useful, maybe fun. I'd experiment some, but I didn't feel like I was accomplishing anything unless I was finding something useful that hadn't been built yet, and building it.

I had an "awakening" a few years ago that started to get me out of that mindset. I found the idea that I could "write myself a better computer" pretty interesting, and in that light CS is pretty interesting. I originally saw CS (about 20 years ago when I took it) as the price I had to pay to begin my career, and maybe I'd learn something valuable along the way (which I did). With a few exceptions the CS I was presented with in college didn't really grab me. I enjoyed the programming a lot, but the ideas (beyond the practical analytical and best-practices stuff) were forgettable. My objective was to continue making useful stuff once I got out.

I started changing my mind with the realization that good design was more than just something that I fancied privately, but was important to writing code that made sense (in addition to working properly). I was also seeing that frameworks were not a ticket to automatically creating a good design. They, along with IDEs, can have a tendency to "steer" you in a particular design direction if you let them. Seeing the mess that can be made with frameworks really brought this home to me.

What really helped me decide to change my perspective was remembering an old, wishful vision I had when I was a teenager that computing would help create a more educated and thoughtful society. I was expecting to see this play out in my career, but what I saw in the real world didn't fit this at all. Instead people were keeping their old mindsets and using computers as we've used industrial machinery for decades. And I began to see that most of the technology I grew up with and thought was new had ideas that had already been developed by computer researchers decades earlier. So it wasn't that innovative. It was just smaller and cheaper than the original prototypes.

So I've been thinking about where I want to go with this, and academia is one of the options I've considered.

Plumbing geeks and Python

I think your eye is clouded with regard to Python. I too see a lot of blog posts from those new to Python stating that it brings back the excitement of programming. I also see in those posts that part of that excitement is how lightweight Python APIs are compared to their previous language, and how close Python is to "executable pseudocode", allowing them more time to explore alternatives and generally do fun things whilst solving their problems. From the Python shell through IPython, SciPy, NumPy, matplotlib, Django, TurboGears, AppEngine, ... they all allow the programmer to explore the solution space much more than whatever they used before.

If you insist on calling this plumbing, then plumbers' tools have evolved.

Plumbing is a skilled (and lucrative) trade

If you insist on calling this plumbing, then plumbers' tools have evolved.

Be clear that I'm not using "plumbing" disparagingly. I do a lot of plumbing too, and an improvement in the quality of such tools is a good thing.

But I also recognize that SICP stimulates a different perspective that is more focused on abstraction (for both major meanings of this word), and that this perspective is different from the "build things" outlook, if sometimes only subtly.

In the end, this is the old (Software) Engineer vs. (Computer) Scientist dichotomy. As someone who has a foot in both camps, I'm more interested in how each of these approaches informs the other.

Nonetheless, when you are designing an intro course, you probably have to pick your approach, because newbies probably don't have enough experience yet to be able to harmonize the two perspectives.

The culture of (programming) plumbing

I wouldn't say plumbing in general is a bad idea either, at least for software engineering. Unfortunately, the problem I've seen with it is that the plumbing culture sets up the expectation that all problems for which you have plumbing infrastructure at hand are easy to solve. In the work I did with frameworks this was not the case most of the time. What I tended to find was that the frameworks and IDEs were designed with the idea of solving a small class of common design scenarios. And they did that well. I'm not knocking them for that. The problem was that the stuff I was trying to create didn't fit those scenarios too well, but everyone expected that I should be able to solve the problem easily because they assumed the framework and IDE would do the heavy lifting for me, and therefore I shouldn't spend time trying to create an alternate framework that addressed the issues more sanely. There was a certain amount of truth to that. I could get the job done using the existing infrastructure. The end result was something that worked as expected, but inside it was such a mess I was embarrassed by it.

The other problem was I felt like I was spending at least half my time just wrangling the plumbing rather than working on the real problem I was trying to solve, and this was discouraging. There was a lot of repetitive coding when working with the framework, and I could see the rationale for using code generation tools.

I think one thing that's gotten people excited about Ruby and Python is that they've simplified the architecture of all this, so that you don't have to do a lot of repetitive coding to work in the design scenarios (plumbing) that are appropriate to what you're trying to accomplish. It's more straightforward.

In today's commercial software development culture you absolutely need plumbing. Nobody's going to give you the time to recreate all that stuff yourself, even if you wanted to. The opportunity I think is being missed in commercial situations is that you don't always have to use the "top level" of your infrastructure. Sometimes that's the worst thing you could do. Sometimes it's more appropriate to use some lower level plumbing, and then build your own framework on top of that. It's just a matter of the people in charge understanding this.

A whole other discussion is whether the runtime architecture itself is supporting what you're trying to accomplish in a powerful way. That's more of what I've been exploring. This is a thorny issue, because a lot depends on existing runtimes: the language you use, the supportive development tools and APIs. There's a whole ecosystem that depends on the runtime or VM. Depending on what you're doing though the one you're using may be hindering you more than it's helping. A solution that some have taken to, and we see this in the Java community in particular, is building another runtime on top of the existing one, so you don't lose your existing investments, but you gain more programming power. In order to build such a thing though, some people need to understand "how to write your own computer in software". That's what I was talking about earlier.

This is just the freshman core courses

Please keep in mind that the change is only in the freshman core courses. The rest of the curriculum is largely unaffected. It is absolutely not MIT's intention to turn out coding monkeys. Their intentions are exactly as they have always been. They just feel that this is a better way to achieve their goals within the context of today's engineering, at the freshman level, for both EE and CS students. Please keep in mind that Gerry Sussman and Hal Abelson are completely in favor of these changes, and they have not discarded their principles or concepts of how to teach.

Abstraction

I love this quote from "The Idea Factory: Learning to think at M.I.T":

Freshman double E's take six double oh one (a.k.a six double no fun) to learn to program in LISP... This is where they begin to leave the rest of the world behind them in their ability to solve problems and to make things work.

I don't have the book handy, but I recall the word "abstraction" is mentioned a lot in the chapter on 6.001...

My point was, of course,

My point was, of course, that while different language choices can make sense, one should choose a language that supports the right level of abstraction and abstraction building.

First and second year programming courses

It's not necessarily a bad thing to put Scheme beyond the first year. Freshmen (first-year students) are very immature: they know nothing of programming (or else they have been twisted by their own hacking), they cannot think conceptually, and SICP is dense with concepts and programming techniques. At UCL's engineering school we use Java in the first year (basic programming skills + algorithms + Java itself, which has to be learned some time anyway) and Oz in the second year (my own course, based on the French version of CTM, which is also dense with concepts and techniques).

We have been doing this for five years now and it works well (an 80% passing rate for all engineering students, the students give me and the course very good marks, and they accept the use of Oz). My understanding is that the second year is a good time to catch them: they have matured since the first year and they are not yet too conservative to accept non-mainstream languages. They understand that Oz lets me go much farther than Java. In addition to OOP, I teach higher-order functional programming, formal semantics, and dataflow concurrency. Our students are pretty good: UCL is one of the best universities in Belgium and the engineering school is highly selective with an entrance examination.

Great fan of CTM...

It's a great book. It's the best place I have to point people who get obsessed with state vs. statelessness (extremities in either direction can easily ruin a project), novel models of concurrency, etc. The best thing about it is that it treats the models progressively, talks about when they are used and when avoided, without religiously declaiming things that may be necessary.

I think that it really would be good to follow it up with a course in a strongly typed language though - e.g. Haskell/OCaml. TAPL maybe. Thinking about types does make you a better programmer, even if you are going to use dynamically typed languages in "real life"... I think there is definitely space for a CTM-style book that covered type systems with the same attitude and fluency.

My course at Imperial ( ic.ac.uk ) 10 years ago started with a half-insane OO-ified Pascal variant - Turing - then went off the deep end with compilers in Haskell. That was a great course...

Thinking about types

I'm one that thinks that CTM is not incompatible with static types. But I do agree with your basic point that learning the discipline of types is valuable. Indeed, I'd go as far as saying that users of languages without static typing should be more keenly aware of types, since the language is not forcing discipline (it must be self-imposed).

Nine and sixty ways to make tribal lays ...

My course at Imperial [...] half insane [...] off the deep end [...] a great course

That sounds exactly like what a programming course should be: teaching programming but also awakening curiosity and teaching how to think. That's what I try to do too. Whether it's given in the first or second year is not so important, if it succeeds in getting students to think. It may even be that the second year is better than the first, if the first year course does not kill the students' curiosity, because students are more mature in the second year and so will get more from the course.

There is a very nice review of CTM in the March 2009 JFP by Peter Gammie. I think he captures well the book's spirit: The overarching achievement of this book is to be so provocative that one wants to engage the authors in debate about almost everything they say. Partly this is due to the chirpy writing style [...] but mostly it is their delicious iconoclasm.

Scientists vs. engineers

Why is there this rivalry between computer science and software/electrical engineering? I do not see the same hostility between civil engineers and materials scientists, or between aeronautical engineers and physicists working in fluid dynamics.

I'd guess it is all an accidental consequence of the history of formal methods, but it seems to be so taken for granted. What else is there to it?

A young field?

Why is there this rivalry between computer science and software/electrical engineering?
I do not see the same hostility between civil engineers and material scientists...

If you've known any engineering and science students in other disciplines, you know that there is some rivalry between the science and its corresponding engineering discipline. Each thinks that it is their side of the pair that does the real work. ;-)

Having said that, the two streams usually have a well-established split: the jobs available are different, and academic departments and funding are separate. They don't step on each other's toes, and are aware of a complementarity between them.

It seems to me that, at this stage of development anyway, there isn't such a strong separation of roles and opportunities in the computing disciplines. It's not even clear to me at this point if I think there really should be such a separation, or if it can follow this pattern, since the distance between the concept and its reification is so much smaller with software.

Regardless of how things develop, there are always going to be some people who prefer the particular and the concrete, and some who prefer the general and the abstract, and as with all people who see things differently, there is likely to be some friction between them.

Mindset

A lot of EEs (used to) think of software as an uninteresting trivial extra needed after all the work is done. Often, they have no feeling for the complexity of it. They likely believe most bugs are fixable in a day.

SEs (tend to) think hardware is the trivial part and all the work is done in building the right OSes/libraries. They usually loathe formal methods, since time after time they have been proven not to work™. They don't know how to design or debug hardware, don't know about crosstalk/interference/signal integrity, and are likely not in the right position to understand formal methods or fundamental results.

CSs (are likely to) believe that software is easy and math is hard. They have no feeling for man-months and are mostly quite good at writing small programs, but are at a loss when it comes to building infrastructures, and are totally clueless about EE.

Of course, they are all wrong. But I think the Internet actually helps people understand each other's positions.

Software engineers think....right....

"SEs (tend to) think hardware is the trivial part...."

I highly doubt that. I've never worked on hardware beyond wiring together a few 7400-series circuits, but I've always been impressed both by the complexity and reliability of computer hardware, even after having seen the seamy side of it (via writing device drivers, reading the code for dealing with, er, "interesting" MMUs, and so on).

Our libraries are trivial by comparison, methinks.

Ah

I was generalizing of course; it was a joke, but not very far from the truth.

The hard part is to understand that everyone actually has a hard job, and everyone needs about four years of education (at least) and another four years (at least) working as a professional.

If software were trivial, there would be no bugs. If EE were trivial, all hardware would work instantly. If CS were trivial, all problems would be solved.

Talking about one field as more concrete or abstract is just plain nonsense. All fields need people who can think abstractly, and in all fields even the best make big mistakes.

In general, you'll find both pretty intelligent and pretty dumb people everywhere. The only thing is that they have a hard time communicating and have different mindsets for solving problems.

In the end: There just is no difference in intellectual achievements in any of the fields.

[ The bad thing about SE is that it has a low entry point: "I'm a programmer since I can write HTML!" But that is similar to buying an electronics kit from a toy store, or following an entry-level CS course. ]

Differences between SE's and CS's

I think marco's comments show that "your mileage may vary". I don't intend my description of CS's and SE's to describe particular people who self-identify with these labels, or hold them as a job description. It's just coming from my own experience with different groups of people who are technically skilled with particular proclivities. The distinctions between who's a CS and who's an SE often get blurred.

My own experience is that SE's think a lot more about management issues: how software components in a project should be organized (more often nowadays this is taken on by an architect), what software components should be purchased and integrated into the product, and timelines. Often they're the ones doing the coding. They may care about the OS and the runtime/language that's being used. They don't think about the construction of either of these (unless they're in charge of such a product). While they may think about the architecture of the thing they're building they don't think at all about the architecture of the tools they're using to build the product. They're focused on resources and capabilities, and getting the computer system to carry out the project's objectives.

CS's are focused more on the individual elements that make up software. In my experience they value good design. SE's might be puzzled about this emphasis on design (or may encourage it). My sense is SE's think that CS's sweat the small stuff and are focused too much on impractical issues. My sense is that SE's do value the problem solving abilities of CS's, though.

There are divisions even among CS's. There are some CS's in academia who really get into the theoretical stuff, but unlike theoretical physicists, they produce tangible artifacts. In effect they are rather like mathematicians (though I don't mean to imply that all they use is mathematics) in the sense that what they produce seems obscure to most people. SE's and many CS's don't understand this fascination that some have with this stuff, and tend to think the artifacts are "weird" and impractical. The truth is many of them don't understand the stuff, and if they've used it they totally missed the point of them. The end result is they often toss them aside as irrelevant gibberish. They'll say "We have real work to do." There is a gulf here, and what tends to happen is the CS's are standing on one side trying in various ways to bring everyone else on the other side closer to their side in terms of understanding, but this rarely works, and so the CS's feel frustrated. They'll complain that everyone else is ignorantly plodding along with low expectations, and struggling to deal with problems that they themselves created. To them it all looks pitiful, but they'll say amongst themselves "Hey, I tried. They wouldn't listen." Their entreaties are ignored as the rantings of ivory tower theoreticians.

The main symptom of this divide I notice is whenever a new language comes out that some SE's are attracted to, they don't think to themselves, "How did someone come up with this?" They seem to act like it was created by magic and just say, "This is really cool!" never thinking that someone or some group of people like themselves with a different perspective and different skills came up with the thing. If they do recognize that real people were behind it they see them as gods whose "skillz" are beyond anything they could possibly comprehend, and are just thankful they came up with it. The truth is that CS's, in most cases understanding SE issues, were the ones who came up with it.

We haven't seen a new OS that's caught fire in a while, but I'm pretty sure the dominant ones were all crafted by CS's. I'm thinking of the kernel, not all the services and cruft on top of them, which were probably crafted by SE's.

To come back to your question

Why is there this rivalry between computer science and software/electrical engineering?

I once knew a professor who tried to sell his product because it was obviously more general and easier to write compilers with, and had a wide application domain. It was also an academically nice tiny language.

The manager he tried to sell it to was flabbergasted by all the intellectual statements, of course.

The nitwit was thrown out of the building, very politely, since he had no idea that a manager thinks about questions like: "Where can I make a profit out of this product?", "Where do I find the programmers?", "How much does it cost to get this infrastructure in place or maintain it?", and "How much does it cost to solve similar problems in C++/Java, without upsetting the toolchain I, and my team, are used to?"

[ Btw, this didn't happen at my university and I guess the story was sufficiently spiced up. But it shows the different mindsets in the late eighties. ]

Delegation

It's an old, old story, and I think it shows the failings of many managers to manage properly more than it shows anything about academics. A good manager would have thought: I can't make sense of this! But this chap seems like he might not be talking nonsense. Who can I ask who can tell me in plain business terms what this product might mean?

The manager could as easily be flummoxed by an EE prof excitedly telling him about the technical advantages of his new process for silicon doping that supports new capacitors with interesting properties. But I guess the SE prof would be able to talk to the manager in plain business terms.

Theoretical computer science

Theoretical computer science really IS theoretical. It is as fundamental as mathematics or physics. This is not the mind-set of an engineer. Engineers want to speed up and not have to think so much, so they can get more stuff done, and scientists want to slow down and think more, so they can understand better.

FWIW, as an engineer who likes theory, after wading through tons of languages, including a little Lisp and Scheme, I find I use C++ for stuff that must run fast, and Python for everything else.

That is because nothing runs as fast as C++, and nothing is as easy to write as Python. YMMV about Python.

So long as we all realize

So long as we all realize that these stereotypes are (sigh) only stereotypes...

These stereotypes are rather old

Been reading A History of Mathematics by Carl Boyer. The section on Archimedes sounded vaguely familiar:

A clear distinction was made in Greek antiquity not only between theory and application, but also between routine mechanical computation and the theoretical study of the properties of number. The former, for which Greek scholars are said to have shown scorn, was given the name logistic, while the arithmetic, an honorable philosophical pursuit, was understood to be concerned solely with the latter. It has even been maintained that the classical attitude toward routine calculation mirrored the social structure of antiquity in which computations were relegated to slaves.

Perhaps if Archimedes had spent more time on the practical engineering side, and less on the theoretical side of mathematics... maybe Carthage would have overcome the Roman foes... and history would have been much different. :-)

That's the point: They are

That's the point: They are not descriptions of reality - they are how we actively fashion reality.

Declining enrollment

For no readily apparent reason, I would like to note that enrollment in "Electrical Engineering and Computer Science, VI" declined from a peak of 1004 in 2000-2001 (for the 10 years they have listed) to 614 in 2008-2009, according to the office of the registrar. This drop was also seen in most CS departments and was thoroughly discussed in the CACM, for example. According to my fallible memory, the solution to the declining enrollment problem was determined to be "More robots!".

Before you draw too many conclusions about the nature of computer science education, consider that they may just be trying to draw more students into the program.

"More robots!"

According to my fallible memory, the solution to the declining enrollment problem was determined to be "More robots!"

I don't think your memory is that fallible. For example, here's a recent report from CMU that evaluates the use of robots in first-year CS education. Quoting from the abstract of that report:

Enrollments in Computer Science have dropped significantly, and part of the blame rests on an introductory course that has not kept pace with the progress of computational technology. We aim to use robotics to motivate students, provide a sense of relevance to the course work, and improve learning.

Note that they use Java rather than Python as their robot programming language.

We compared the overall

We compared the overall retention rate, and found no significant difference between our pilot and prior years

To draw stronger conclusions than that statement allows, a more robust evaluation approach would have been nice.

These are your father's parentheses....

Dijkstra's Critique

Dijkstra, in a typical fit of prescience, made a rather scathing criticism of this move in EWD1036.

math, not hacking

What production-usable language do people think best fits with what Dijkstra was saying? Presumably something with support for some level of doing proofs?

There is no such language

There is no such language yet. I hope in a time not too far away it will be Babel-17.

Languages with Powerful Type Systems

I would say that languages with fairly powerful type systems, such as SML and Haskell, go some way towards this, since the type systems essentially allow you to prove (simple) theorems about your code.
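
A much weaker cousin of this is available even in the thread's other language, Python, via its optional type annotations (the example is mine; the check is done by an external tool such as mypy, not by the interpreter, and nothing stops the unchecked program from running, so this is an analogy rather than a substitute for SML or Haskell): the annotations encode a small "theorem", namely that metres and feet are never mixed, and the checker rejects the call that would violate it.

from typing import NewType

Metres = NewType("Metres", float)
Feet = NewType("Feet", float)

def add_metres(a: Metres, b: Metres) -> Metres:
    # The signature is the "theorem": the sum of two lengths in metres
    # is again a length in metres.
    return Metres(a + b)

runway = Metres(3000.0)
approach = Feet(900.0)

add_metres(runway, Metres(250.0))   # accepted by the checker
add_metres(runway, approach)        # mypy rejects this: Feet is not Metres
                                    # (at run time, however, it still executes)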

CS is Math is CS

I'd say CS should be mathematically inclined. The greatness of math is its ability to model the real world. It has done so so beautifully that we hear the question "was God a mathematician?" I've been away from all things computer now for a few years. I got extremely burnt out on the production world, especially the grimy production world of the Web.

I also taught at a second-tier U and was appalled by the uselessness of the students. For example, having worked at MS I knew exactly what they were looking for in an OS programmer newbie. We taught Stevens' UNIX and I handled the labs, likewise in an OS course. The short is they rioted against me! I was spoon-feeding them good under-the-hood C techniques (reproduce tar's folder crawl, do a version of the buddy memory system), but they nearly lynched me for all the work I made them do.

Again in short, I'm fed up with "modern computing" and am coming back only to be theoretical and mathematical. Let's face it, computers are in a rut. I call it the iPod Rut, because, like an artificial dam, we're spreading out horizontally doing "cool apps" instead of running free and deep and fast towards better things. As I said in a book I wrote, "the machines will get only incrementally smarter (iPods etc.) and the human definitely dumber and dumber -- and they'll meet in the middle at sentient heat death."

I favor the Victorian model of education: give them brain-busting subjects like Latin, Greek, classic philosophy, math, etc. and then turn 'em loose. The Victorian Age was full of amazing scientific progress as a result. Without real brain work, we spiral down into brain incest, which is where the computing world led by Wall Street always tends.

Updated blog links

Weinreb's blog is down and Cemerick's (snowtide) has moved. New links:

Dan Weinreb's post

Cemerick's post

nobody cares about their first language

MIT made the right choice, which I hope Indiana also takes. The fact is that everybody considers their first programming language lame. So making Python the intro language will make students like it less, which is a good thing because, well, that's Python. Do you know how many students whose first programming language is Scheme hate Scheme? Everything they learn later, whatever it is, is considered better and more advanced!

My first programming language was Pascal. When I learned that people were using C and C++, I was so unhappy with my school. "Why do you teach me this language which doesn't even have pointers or a ++ operator!" "It's so lame that in Pascal you CAN write nested procedures. WTF is the use of it? C doesn't allow that and that's why it's faster!"

I never considered using Pascal or saw its worth after my first programming class. I was wrong, but that is human nature.

I'm so glad that MIT is saving Scheme from the fate of intro programming languages.