De-Scheming MIT?

Will freshman scheming be the same if their schemes are more about robots and less about Scheme?

MIT is going to change the curriculum structure that made it famous for teaching Scheme in introductory courses. One force behind the reform is none other than Harold Abelson, famous for his marvelous Scheme opus SICP. But why change?

The new curriculum is designed with three goals in mind: greater flexibility in requirements, better integration of electrical engineering and computer science, and more depth to better prepare students for graduate school or real-world design challenges, he said.

And programming language wise:

Content-wise, the class is a mix as well. The first four weeks of C1 will be a lot like the first four weeks of 6.001, Abelson said. The difference is that programming will be done in Python and not Scheme.

A sign of the times?


I am speechless...

I am speechless...

I presume the idea with

I presume the idea with taking a focus on robotics is to expose students right from the beginning to ideas like concurrency, distribution and real-time response. That sounds like a good idea to me. Using Python, not so much, though.

Just great

While programmers are still trying to get basic concepts like looping and subprograms down, we're trying to shove seriously tricky concepts like concurrency, distribution, and real-time response at them. Because, dammit, learning to program is simply not difficult enough!

This isn't to say those aren't important concepts that should be taught; just that they shouldn't be taught first.

Target audience

While programmers are still trying to get basic concepts like looping and subprograms down

Presumably, if you've made it into MIT, you shouldn't have too much of a problem figuring out looping and subprograms. Many freshmen would have long since passed that stage at high school or on their own.

Here's another quote from the article which hints at what they're aiming for:

Faculty from both courses stress that neither will just be a survey course. They will “be in the Course VI tradition of ‘barely-doable,’”

It's not as though there's a shortage of schools that cater to less capable students.

Re: Target audience

Many freshmen would have long since passed that stage at high school or on their own.

Yes, but many wouldn't. MIT prides itself on being able to cater to smart but totally inexperienced freshmen.

One of the nice things about the use of Scheme was that it was a great leveller. Sure, there were 1 or 2 students each year who had seen it before, but for the most part using Scheme put everybody into the same boat of learning a new language.

A new language can be useful for the "more prepared" students, as well. I had been programming in various languages for over ten years by the time I took 6.001, and if it had been a review of, say, Pascal, I probably wouldn't have paid as close attention. Scheme sure frustrated me at times, but it made me think about programming in brand new ways, which was one of the points of the course.

Also, don't believe the article when it talks about how this whole thing is just "an experiment." A not-so-great MIT tradition is for the administration to decide on a change, and win buy-in by announcing that it's only "a multi-year experiment." Well, in 3 years there's been huge turnover among the students and no one really remembers that the whole thing was only a trial run.

Re: Target audience

My point in the grandparent comment was not to defend the choice of Python, but rather to defend the tackling of a problem such as mobile robots at that level. Even for MIT students that haven't previously been exposed to programming, loops and "subprograms" should not be a problem.

I don't know enough about what's going on at MIT to critique the choice of Python in the context of a course like C1. I once did a course project which involved writing a program in HP Basic to control a microscope to scan a rock sample. The choice of language was hardly relevant in that case (and besides, in many ways, Python is the new Basic). The C1 situation sounds somewhat similar, and supposedly it isn't intended as a direct replacement for 6.001.


I love Python, but I'm not sure I see it as a language that excels in the field of electrical engineering. C, C++, Forth, Java, and Ada are probably the most commonly used. But is Python really in widespread use in the domain? Any more than Lisp? Lisp, after all, has a history with AutoCAD (though I don't know whether it still uses Lisp as the scripting language). I could see why Python would be a good choice for teaching CS students, but the goal here is to integrate the curriculum for both CS and EE.

yeah, python

I guess it makes some sense to me. Anyone using computers for anything is well-served by knowing at least one scripting language (pace, terminology warriors), so it's not bad to put one in the curriculum at some point. I see Perl and Python in this niche, with Ruby also entering as the various shell languages leave. O'Caml and Bigloo Scheme are close, but I've tried to use them for this sort of thing before and always given up. Given these 2-3 choices, Python is probably the cleanest. We're using it in a PL course here (along with ML and Prolog), and the students seem to pick it up quickly.

yeah, python

Python learned some things from Lisp, like first-class functions and other niceties I wouldn't want to miss out on. It also supports multiple paradigms to some extent and has a rather huge user base.

In honor of our new python overlords

I've added SICP chapter #1 in Python to the list of other languages. Since you seem to be a Python advocate, perhaps you can explain exercise 1.1 (if cannot be used as an expression) and section 1.3.2 (the lack of let bindings). :-)

Python the penultimate

Python 2.5 (the current version) does have an if expression form:

def abs(x): 
    return -x if (x<0) else x

A fairly Pythonic (as we say) way to translate the let construct is to use default argument values:

x = 2
def plambda(x=3, y=x+2):  # y's default is evaluated at definition time, using the outer x (2)
    return x*y
print plambda()           # prints 12 -- matching Scheme's let, where y's binding also sees the outer x

or, in this case, simply

print (lambda x=3,y=x+2: x*y)()

But it's certainly true that the statement/expression distinction and the absence of anonymous functions do make some functional programming constructs more unwieldy.


Downloaded the latest version and made changes to the Python code to reflect the two suggestions you made.

Some ideas about style

Good idea to add this translation; I have only some minor objections: wouldn't it be better to adjust the code to the coding style that most Pythoneers use, PEP 8? I do see the point of keeping the code short to be more comparable with the original Scheme code, but it should be considered.

Python Section 1.1.7 needs less float

Apart from the other comment on indentation style, in 1.1.7 the use of float(x) in the definitions of average and improve looks too limiting. I have not read the book, but if writing Python code I would use plain x instead of float(x) in those functions.
Scanning a little further ahead, it seems there may be other places where float is used too often (although its use in the definition of sqrt may be what is wanted). - Paddy.


The value is divided (/) by a float (2.0), so the coercion is automatic. But I consider the problem of the overloaded division operator to be an open PL problem. Should it follow a set of rules for automatic casting (int/int => int, int/float => float) like Python, Ruby, and JavaScript? Should int/float be a type error, as in ML and Oz? Or should the result of division always be a floating-point number, as in Haskell?

Anyhow, I usually get a little carried away with the forced coercion whenever I deal with languages that follow the first path; it effectively intersperses some type control. It's not always necessary to be explicit, but sometimes I dislike having to play compiler in my head.

Or should the result of

> Or should the result of divide always be a floating point number like Haskell?

Note that in a future version of Python this will also be the case, on the rationale that 1/3 = 0 is highly surprising to non-programmers.

Floor division

Using the statement

from __future__ import division

at the top of a script, one can import the behaviour you describe right now. The current default behaviour is reproduced by floor division, with the double-slash operator // assigned to it. So one has:

1/3  -> 0.3333333333
1//3 -> 0


To clarify, Python does have anonymous functions - the lambda you just used - but thanks to its indentation-only code layout, the lambda form is restricted to a single expression as its body.

The whitespace policy does

The whitespace policy does not suffice to explain why lambdas contain just one expression.

Simple statements like print, raise, assert, assignment, exec, and import, as well as semicolon-separated sequences of them, contain no code block and are therefore unaffected by any whitespace policy. Secondly, the whitespace policy doesn't affect logical line continuations either, so you can create arbitrarily large and unwieldy expressions without any care for indentation. Whitespace affects only nested code blocks consisting of multiple statements.

lambda is really considered a design wart in Python, and its use is not encouraged. It is a somewhat accidental language construct that is reminiscent of FP but has little to do with the motivational basis of describing scope and variable bindings using anonymous functions. That lambda is going to stay in Py3K is owed to group pressure from people who like "tiny little functions" as callbacks everywhere.
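For readers unfamiliar with the restriction under discussion, a minimal sketch (my own example, not from the thread): a lambda body must be a single expression, while def permits a full statement block.

```python
# lambda: the body is one expression, nothing more.
square = lambda x: x * x
print(square(4))  # 16

# def: the body may contain statements (assignment, loops, etc.).
def describe(x):
    y = x * x  # assignment is a statement, so it cannot appear in a lambda body
    return "%d squared is %d" % (x, y)

print(describe(4))  # 4 squared is 16
```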

As a point of comparison...

Translated chapter #1 of SICP to Ruby as well. It seems to trade one form of weirdness for even more weirdness. No doubt Ruby has a more capable lambda (though it has its own set of problems with scoping). But at least Python treats named and anonymous functions the same.

Of course, neither language is really centered on the FP domain, so it's probably not a deciding factor one way or the other.

Not so fast ...

Chris, You had better nix the 'lambda's -- Guido is a lambda hater. ;-)


It's there, it's still used... and it's terrible. Don't get me wrong, it's far better than having something like C for the same set of tasks, but it's like a little joke Lisp in most senses; it was a reduced version of an already-minimalist language (XLISP) for early-'80s home PCs, and the core language has not changed.

Emacs Lisp is a better demonstration of both mainstream Lisp use and dynamic scoping.

I can't say a lot about the central issue. Persecution Complex on my left shoulder says that, probably, sponsors of MIT demanded OO indoctrination and the higher-ups at MIT took minimal damage by choosing Python. There's nothing on my right shoulder, so I'm not really sure how to balance that out.

OO indoctrinations

Except that 6.001 teaches OOP by actually implementing an object-oriented system in terms of closures with side effects, teaching the student how C++, Java, Smalltalk, etc. implement OOP, and rather trivially at that. The student then walks away with a deep understanding of OO because the [rather simple] details are not abstracted away until the student chooses to abstract them away.

6.170, the accompanying lab course, tasks the student with implementing complex OO systems in Java, applying the techniques of OOP *after* they have properly learned what OOP is. So the OO indoctrination is still there, but introduced properly.

Scheme is the perfect language for this because its core forms give enough power to easily implement and abstract away all the breeds of OO (single/multiple inheritance, privatisation of functions and data, interfaces, etc.) without tying the language to a subset of these breeds. The idea is that, presumably, after having completed 6.001 (computer program structure, i.e. basic theory) and 6.170 (computer engineering lab, i.e. applied theory), the student will be fully versed in the state of the art and able to understand all mainstream languages. By using Python, a language where some of the basic concepts taught in 6.001 are abstracted away, the student may not be fully versed in the basic foundational concepts, having skipped a couple of steps along the way.
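The closures-plus-message-passing style described above can be sketched even in Python (a hypothetical transliteration of the SICP approach, not actual course code): an "object" is just a closure over mutable state that dispatches on message names.

```python
# A bank account "object" built from a closure, SICP-style message passing.
def make_account(balance):
    state = {'balance': balance}  # mutable state captured by the closure

    def withdraw(amount):
        if amount > state['balance']:
            return "Insufficient funds"
        state['balance'] -= amount
        return state['balance']

    def deposit(amount):
        state['balance'] += amount
        return state['balance']

    def dispatch(message):  # "method lookup" is just a conditional
        if message == 'withdraw':
            return withdraw
        if message == 'deposit':
            return deposit
        raise ValueError("Unknown message: %s" % message)

    return dispatch

acct = make_account(100)
print(acct('withdraw')(30))  # 70
print(acct('deposit')(10))   # 80
```

The point of the exercise is that nothing about objects is magic: state is a closed-over environment, and methods are just functions selected by name.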

My personal opinion is that if they want to teach Python and applied robotics as an alternative, it should be an alternative to 6.004 (Computation Structures), the last course in their CS core. Otherwise, they threaten to leave the student ill-prepared for future advances in CS theory.

Surveys are good

Disclaimer: I know nothing about MIT's curriculum (except I've read SICP).

Scheme is the perfect language for this because its core forms give enough power to easily implement and abstract away all the breeds of OO[...]

I would like to agree with this, but I can't.

Starting with FP and adding state and then objects is a great approach, and Scheme is great for that kind of course--a Euclid-style building-up. But when you're done, somehow you've built something that's generally unpleasant to use. It would be nice to end up with something at least as pleasant as, say, Python. Otherwise, the student may have learned some metaprogramming, but she certainly won't have learned OOP.

I like the Euclid approach. But given Scheme as it is, I think learning multiple languages is better than building them all out of any one base language. I wish it were otherwise.

Scheme is still an excellent place to start.

Easy to use

SICP's OO system isn't "easy to use", but it is pretty comprehensive, and given a background in SICP you should be able to feel at home with the OO concepts of any language by understanding what's going on under the covers... hence why I feel it's a great introduction. Conceivably, if you were to use SICP's OO system in a real-life Scheme program, you would abstract away the ugliness with macros, but introducing that in SICP would probably do more to cloud the concepts and focus the introductory course too much on Lisp-isms. IIRC, macros aren't covered at all in SICP or 6.001.

And no, the student would not have learned object-oriented design and higher OOP concepts from SICP. I think that's outside the scope of the class, and the class covers so much as it is that there isn't room to add more. But it is a great background to get the student ready to learn any flavor of OOP.

As for learning multiple languages rather than building everything out of one language, I agree. But not in an introductory course on basic CS. I favor the Euclidean approach to educate the student on all the basic concepts of every language; then the student can move on to more specialized languages to learn mastery of specific paradigms. I can't see any way to squeeze as much information into a single class as SICP is able to do if the student also must learn new syntactic ideas on top of the underlying concepts. It is also no small point that by building everything in a Euclidean approach, the relationships between different concepts are readily apparent, which wouldn't be the case in a multi-language course.

Hence, the course, as it is, is as close to perfection in my mind as you are going to get, as an introductory course. There is no end of debate for the proper course of action for follow up courses. The proposed courses sound to me like they would be an excellent alternative core to be taught with SICP as a prerequisite.

Will too. Will not. Will too. Will not.

I both used SICP and implemented OOP in Scheme (and then later Prolog in SML) in college. (Not at MIT so I dunno what they do with it.) My own experience is that doing so actually did de-mystify OO along the lines of "oh, virtual function pointer tables make sense" when you look under the covers of C++.

However, IIRC it did not get into the issues of how to architect and design properly with OO as your tool e.g. prefer has-a to is-a to a fault, how functional approaches can be better, etc.

[Implementing Prolog, on the other hand, turned out to be one step beyond the mind-bending-ness most of us in the class could handle in those couple of weeks. I recall the instructor having to throw up his hands and basically give us all the answer to the final homework assignment.]

Sign of the times, indeed

Computer "science" isn't a branch of math anymore to most people, nor is it about studying programming languages. That's not to say that these aren't valuable or interesting aspects, but only that they are a tiny minority of what goes on in our bloated field.

Maybe this is an attempt to raise the level of abstraction in discourse. We might have taught about plugboards or short code in 1950; until recently we taught about scoping, functions, and recursion; maybe now it's time to move on to larger architectural concerns and assume that students are smart enough to figure out how to write functions and use variables along the way.

Wishful thinking

Given that I've seen textbooks by both professionals (McConnell, Code Complete) and professors (Reek, Pointers on C) that still get tail recursion wrong, I'd say this is wishful thinking at best.
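For readers wondering what there is to get wrong, here is a minimal illustration (my own, in Python for consistency with the rest of the thread): a call is a tail call only when nothing remains to be done after it returns. Note that CPython performs no tail-call elimination, so the tail-recursive form gains nothing there in practice.

```python
# NOT tail recursive: after the recursive call returns, we still multiply by n.
def fact(n):
    if n == 0:
        return 1
    return n * fact(n - 1)

# Tail recursive in form: the recursive call is the very last thing done.
# (CPython still won't eliminate the call, so deep recursion can overflow anyway.)
def fact_iter(n, acc=1):
    if n == 0:
        return acc
    return fact_iter(n - 1, acc * n)

print(fact(5), fact_iter(5))  # 120 120
```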

Which makes my point

Given that tail recursion is such a small part of what needs to be known to be an effective software engineer (i.e. to design robust software to solve real problems, like controlling our robot overlords), this doesn't seem like much of an objection. Students can do fine without it for a while, and pick it up at their leisure.

Further details

For anyone interested, it's worth reading the entire linked article to get some of the nuances. Some relevant highlights:

The C1 course which will use Python is "designed to give students a hands-on feel for electrical engineering and computer science, using a mobile robot as a case study." Abelson says that "In the long run, [C1] by itself will not replace 6.001". The article mentions that "students who enroll in C1 as well as a 3-unit Scheme class will get credit for 6.001".

C1 is part of a larger redesign of "Course VI" which is still underway:

But C1 is a trial, and the proposed larger Course VI redesign is still in its early stages. “Nothing is going to happen irreversibly in the next few years,” Lozano-Perez said.

The proposal is currently being shown to the faculty, will then be opened up for student comment, and then must jump through all the hoops of various MIT committees to be approved, meaning the changes are still fairly tentative.

“If it doesn’t work, we’ll back away from it. It’s an experiment,” said W. Eric L. Grimson PhD ’80, the Course VI department head.


One force behind the reform is no one else than Harold Abelson, famous for his marvelous Scheme opus SICP.

It doesn't shock me that Abelson is backing a proposal that involves mobile robots. Wasn't he involved in the turtle research back when? I know he and diSessa published Turtle Geometry.

sorry to say, but

Perhaps he's trying to put some pressure onto the R6RS team to get it completed faster?...

or perhaps Lisp is done. Lisp is no longer as "cool" in the eyes of youngsters as it was during the first years of Linux and Emacs editing. Autodesk AutoLISP and elisp were the main driving forces behind Lisp during the 1990s, but that age is long gone. Eclipse and Java replaced Emacs and elisp for good for serious software development, while Unixers still use vi/vim as a light text editor for administrative jobs. I fear most users of AutoCAD these days are much more acquainted with VB as well.

Lisp/Scheme never really managed to get the integrated libraries that the scripting languages from the open-source movement got. Reddit sent a very powerful message to those listening. Where are Paul Graham and Arc?

It remained popular as a niche language for teaching advanced concepts in computer science, but there's really little Science in IT today and most courses are heading for Haskell/OCaml as examples of modern functional languages rather than Lisp/Scheme.

I like Scheme, but am just stating the facts. Hopefully, I'm just doing a "Chicken Little"...


Perhaps he's trying to put some pressure onto the R6RS team to get it completed faster?...

Doubtful. The subset of Scheme taught in SICP has not really changed over the last 20 years.

I like Scheme, but am just stating the facts. Hopefully, I'm just doing a "Chicken Little"...

I don't know about that. If anything, the more "obscure" languages like Lisp and Scheme have enjoyed a revival in the last few years due to the widespread appearance of web apps. As pointed out repeatedly by PG in his talks and essays, when you're working on server-based applications like web apps your effective choice of languages is much greater than when you have to deliver apps to the desktop.

R6RS is mostly an attempt to

R6RS is mostly an attempt to formalize how pure Scheme can portably use libraries and such (and clean up a couple of trivial points along the way). Since 6.001 teaches foundational concepts without libraries, R6RS is inconsequential to the course.

Scheme does not have to be "cool" for 6.001 to do its job. It could be a completely defunct language and still be ideal for the course. The course deals with recursion vs. iteration, OOP, mutation vs. functional, eager vs. lazy evaluation, and implementation on register machines... many concepts which would be obscured by replacing Scheme with a language that abstracts away such things. The point of the course is not to teach "functional programming", but rather to teach "What is functional programming vs. imperative?" Teaching Haskell/OCaml in place of Scheme would IMO be just as bad a move, as those languages also abstract away too much.
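As a concrete illustration of one of those distinctions (my own minimal sketch, not course material): Python does surface eager vs. lazy evaluation, via lists vs. generators, but as a built-in feature rather than machinery the student builds, which is exactly the parent's point.

```python
# Eager: the whole list is computed immediately.
eager = [x * x for x in range(5)]
print(eager)  # [0, 1, 4, 9, 16]

# Lazy: a generator computes each value only on demand.
lazy = (x * x for x in range(5))
print(next(lazy))  # 0 -- nothing beyond the first value has been computed yet
print(list(lazy))  # [1, 4, 9, 16] -- the remainder, forced on demand
```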

6 weeks of programming without assignment

The greatest thing about 6.001 using Scheme was that the class went on for more than 6 weeks before teaching the assignment operator "set!" and its variants. What's amazing is that, before it was introduced, we really didn't miss it. We could do everything we needed with functional programming, returning values rather than modifying state. Even today, when I look over awful spaghetti code and try to clean it up, the first thing I do is try to convert as much state-modification code to functional code as possible, and it becomes clearer and more error-free.
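The kind of cleanup described above can be sketched with a toy example (mine, not from the course): the stateful version updates a variable step by step, while the functional version simply maps values to values.

```python
# State-mutation style: a variable is updated in place as the computation proceeds.
def total_imperative(prices, rate):
    total = 0
    for p in prices:
        total += p
    total *= (1 + rate)
    return total

# Functional style: no mutation of shared state, just values in and values out.
def total_functional(prices, rate):
    return sum(prices) * (1 + rate)

print(total_imperative([10, 20, 30], 0.5))  # 90.0
print(total_functional([10, 20, 30], 0.5))  # 90.0
```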

An MIT education is not about learning useful programming languages, it's about learning useful programming concepts that will outlast decades of changes in programming languages. With all the languages available today, there's not a single new concept - templates, object-oriented programming, iterators, garbage collection, run-time type identification, functors, etc. - that wasn't covered in MIT's 6.001 or 6.170 classes 20 years ago.

So as they change programming languages from Scheme to Python, I certainly hope they don't change their overall approach to teaching programming language concepts.

the actual course

Perhaps people may be interested in looking at the actual course website (this is the second term it is being taught, but for now it is just an alternative and 6.00x are still around).

(For what it's worth, this has the absolute support of Sussman and Abelson, who I've heard say they're surprised that something that was revolutionary twenty years ago hasn't been replaced by something similarly revolutionary since then.) I hear great things about the first class, though I worry that some of the good things may not scale to a large class size. (And for perspective: I've been a member of the 6.001 course staff three times and think it's an amazing class (though the recent choice to switch from MIT Scheme to DrScheme was too long in coming), but I see no reason not to continue to try new experiments in education.)

Visual Basic at MIT

``The difference is that programming will be done in Python and not Scheme...''
Pythonists, you may now rejoice, but be warned: this is just a preparation for introducing Visual Basic. :)
Seriously, if they wanted to change the language, why did they choose Python?!
`... the course is about learning “how to build up abstractions, and how abstractions help solve engineering problems”'
what makes Python the right choice?

my guess

The bit about solving engineering problems, maybe?

(Sorry, but if you serve up a straight line like that, you have to expect someone to take a swing at it...)

It's Bitwize's Corollary to Greenspun's Tenth Rule.

"Any sufficiently complicated Common Lisp or Scheme program will eventually be rewritten in C++, Java, or Python."

I guess SICP just became "sufficiently complicated"...

Other languages

Let's put Python-bashing aside for a moment, and consider the possible reasons for moving away from Scheme as valid. Suppose the goal of the change is genuinely to improve the education of MIT students. If I understand correctly, the issues that call for improvement are concurrency and distributed processing, and software engineering facilities (i.e., objects, modules, etc.). If these are indeed the goals, which language (or languages) would you have chosen?

The first answer that comes to mind is to use Erlang and Scala: say, Erlang for the first half of the semester and Scala for the second half. I can see the benefits of such an approach, though I still think SICP remains a better choice.

Interesting that you brought this up.

Interesting that you brought this up. I have been teaching SICP in Scala at EPFL for 5 years now. I think the course works quite well overall, and that Scala is well-suited for the subject material. Compared to the original SICP, the course puts somewhat more emphasis on program structuring and modularity. For instance, the famous digital event simulator is factorized into a general simulation class, a digital simulation subclass, and user code which inherits from both. Here are the slides for the course (alas, in French).

Some of the course material is also found in the Scala by Example report.

It's great to hear about

It's great to hear about this. I think that's a really superb idea. I'll take a look at the slides (I hope that my mediocre French and Scala will work well together).

While we are on the subject, it'd be interesting to hear more about your experiences with this course. My feeling about CS1 courses is that teaching software engineering constructs (specifically OOP) in them often misses the point. Students who don't have experience programming rarely understand the point of mechanisms aimed at helping with the software lifecycle in general, and fail to appreciate program readability in particular. They do as they are told, but don't really internalize the ideas they are being taught. I have my ideas about how this situation can be improved, of course, but I am interested to hear what you think.

I teach the SICP in Scala

I teach the SICP in Scala course at EPFL in 2nd year, so I can't comment first-hand on how beginners would cope with it. Our students had an introduction to programming in Java before, followed by a course on systems programming in C. I don't mind the Java intro, but the prior exposure to C makes it hard for some of the students to go to a higher level of abstraction afterwards. I try to explain recursive data types and they think in pointers and updates ...

How can you not mind the

How can you not mind the Java?! ;-)

My experience was the opposite, by the way: C is a problem but Java is a curse... Java is a strange mix of high level (e.g., GC) and low level (e.g., references, pervasive state), making it problematic for teaching introductory programming concepts, I think.

On Python

I hope you're not counting me amongst the Python bashers. For building real-world applications, and even more for crufting them up in a hurry, Python is one of the best languages out there and bears some advantages over pert-near any Lisp dialect.

But for elucidating the cool deep concepts of computer science that need to be elucidated, you really can't beat Scheme with a bat.

Oh, wait, I'm forgetting about Ruby...


The real issue, of course, isn't whether it's Scheme or Python but rather whether the presentation will continue to rely heavily on interpreters (as a means of abstraction, no less!)

what's unfortunate ...

... about these kinds of changes is that the key insight of LISP and the lambda calculus is being lost. teaching introductory computation oughtn't IMO be like shop, metal working, and tuning up cars. it ought to be about giving students the key insight that all these languages, whatever their constituency and form, are really all the same, and lambda calculus provides that common framework. that's the key thing behind "Lambda the Ultimate Imperative" or "Lambda the Ultimate Declarative". or, for that matter, Lambda the ultimate picture drawing language in some of Henderson's work ("Functional Geometry") or the ultimate macro language (in Henderson's work with Gimson). it isn't only about abstraction as a powerful idea, it's that a key set of core abstractions can model everything else.

i see Python as the latest entry in the tradition of feature-rich programming languages, a lineage tied back through Ada and PL/I and Java, a lineage all about having dozens of syntactic nuances for expressing different things. the alternative is the minimalist school, choosing to simplify languages by removing limitations.

whether van Rossum and company want to write pseudo-existential and universal quantifiers or not, they are readily modelled as compositions of lambda expressions at heart. people might think that makes what's going on clearer but in a real sense, it just gets in the way.

no doubt, there will always be Complexifiers in this business, and they will maintain a zealous following. cargo cult programming usually has a priesthood.

I'm strongly tempted to

I'm strongly tempted to flame this...

The goal of a first computer science course is to teach algorithmic thinking. As such, whatever language is used should get out of the way as much as possible, producing the clearest and fewest "stupid compiler errors." Whether regular syntax is better than familiar syntax for this purpose or vice versa is an open question.

Why do we start teaching physics with simple, frictionless Newtonian mechanics when it's all just the wave equation? Why do we start teaching math with univariate calculus and simple algebra instead of group theory or something? We do so to teach students the basic habits of physical and mathematical thought. In almost any field, we teach starting from the familiar rather than the "fundamental." Why should computer science be any different?

You obviously get some pleasure from "LISP and the lambda calculus" (though modern Lisp bears little more resemblance to the lambda calculus than a host of other languages), but your predilection is far from universal.

Why do we start teaching

Why do we start teaching math with univariate calculus and simple algebra instead of group theory or something?

The reasons we do that seem similar to me to the reasons to start teaching computation using something similar to the lambda calculus. The symbolic substitution mechanism used in simple algebra is essentially the same one used in lambda calculus -- anyone who's learned simple algebra can quite easily learn the basics of lambda calculus reduction (and IMO, they should).

This seems to support at least part of the grandparent's point. I'd agree that Lisp in the sense of e.g. CL isn't a good concrete language to use to teach this, but either Scheme or one of the more sugary LC-based languages would work. The TeachScheme! project, which is used at high schools as well as colleges, provides some evidence that this approach can succeed.
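To make "the basics of lambda calculus reduction" concrete, here is the standard textbook encoding of booleans (rendered in Python rather than Scheme, as my own sketch): a boolean becomes a two-argument selector function, and evaluation is nothing but substitution.

```python
# Church booleans: TRUE selects its first argument, FALSE its second.
TRUE = lambda a: lambda b: a
FALSE = lambda a: lambda b: b

# Conditional is just application: IF p x y reduces to p(x)(y).
IF = lambda p: lambda x: lambda y: p(x)(y)

# NOT flips which argument gets selected.
NOT = lambda p: lambda a: lambda b: p(b)(a)

print(IF(TRUE)("yes")("no"))       # yes
print(IF(NOT(TRUE))("yes")("no"))  # no
```

Whether this kind of exercise is illuminating or mere puzzle-solving is, of course, exactly what the thread goes on to debate.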

I guess I'm not sure what

I guess I'm not sure what people mean by "the lambda calculus" or "similar to the lambda calculus." A google search for "lambda calculus" on returns a single hit in some weird appendix, so the "similar" is important. From what I've seen in a couple of courses at different universities and some random web perusal, "teaching the lambda calculus" means "doing 'clever' things with abstraction and substitution to define booleans and natural numbers." This activity is "interesting" in the same way as doing sudoku and crossword puzzles.

If "similar" means "side-effect free so that it can be reasoned about by substitution," then I think we more or less agree ("more or less" because I don't think the side-effecting BASIC model is much harder to reason about, and it seems to have been an avenue for many students to learn to program in the old days). This substitution model can be supported by a large number of programming languages, possibly by teaching a restricted subset at first (Felleisen did this with Scheme back in the day, with mixed results). But I don't see what's gained in an intro programming course by calling it "based on the lambda calculus" rather than "just like the substitution you used to solve algebra problems."

If I had to choose a teaching language in this vein, I might use a restricted Haskell with less-inscrutable type errors. Syntactically, Python is very similar, and the semantic differences don't come up at first, so it seems to me it would work at least as well. But pretty much anything with a REPL (that isn't Forth ;) would probably work.

PS -- TeachScheme looks interesting, but I think being better than current US High School CS isn't that hard. For example, this student's account of a previous BASIC course seems representative: a teacher who doesn't know much about computing treats it like grammar or History, and makes students memorize facts and rules.

LC similarities

If "similar" means "side-effect free so that it can be reasoned about by substitution," then I think we more or less agree

The similarities go beyond that. In HtDP, see e.g. Local Definitions and Lexical Scope (which includes Lexical Scope and Block Structure), and Functions are Values. A great deal of other relevant material is embedded throughout the book.

The language used up until section VI of HtDP is a pure CBV lambda calculus enriched in a few standard ways. Although the term "lambda calculus" isn't used, an important part of what students are being taught are the features of a language which has a general approach to lexical scope (i.e. nesting isn't arbitrarily limited) and a general approach to abstraction and values (i.e. even procedures are values). IOW, the features provided by the language are very general, without arbitrary restrictions, and the underlying theory is far simpler than most alternative languages one might choose.

When learning, starting with a simple yet general framework is good, because it allows students to absorb and integrate new concepts without having to throw out what they've already learned. For the features which lambda calculus directly addresses, corresponding core features in other programming languages can mostly be understood as either equivalent to, or more often restrictions on, the general features of the lambda calculus.
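The two "general features" in question — unrestricted lexical nesting and procedures as ordinary values — are easy to show even outside Scheme. A minimal Python sketch (my example, assuming nothing beyond the language itself):

```python
def make_adder(n):
    # Definitions can nest arbitrarily; the inner function
    # closes over n lexically, not dynamically.
    def add(x):
        return x + n
    return add            # a procedure is just another value

add3 = make_adder(3)
add7 = make_adder(7)      # each closure keeps its own n
print(add3(4))            # 7
print(add7(4))            # 11
```

Languages that restrict either feature (no nested functions, or functions that can't be returned) can then be understood as carving special cases out of this general picture.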

But I don't see what's gained in an intro programming course by calling it "based on the lambda calculus" rather than "just like the substitution you used to solve algebra problems."

I don't care what it's called, I'm talking about the features being taught, which go a bit beyond simple algebraic substitution. I was just saying that the similarities to algebraic substitution argue for using a similarly simple yet general approach when teaching programming.

If I had to choose a

If I had to choose a teaching language in this vein, I might use a restricted Haskell with less-inscrutable type errors.

You mean Helium?

Yes! Thanks for the

Yes! Thanks for the reminder.

Sussman's comment at ILC’09

Andy Wingo writes:

The “debate” had an interlude, in which Costanza asked Sussman why MIT had switched away from Scheme for their introductory programming course, 6.001. This was a gem. He said that the reason that happened was because engineering in 1980 was not what it was in the mid-90s or in 2000. In 1980, good programmers spent a lot of time thinking, and then produced spare code that they thought should work. Code ran close to the metal, even Scheme — it was understandable all the way down. Like a resistor, where you could read the bands and know the power rating and the tolerance and the resistance and V=IR and that's all there was to know. 6.001 had been conceived to teach engineers how to take small parts that they understood entirely and use simple techniques to compose them into larger things that do what you want.

But programming now isn't so much like that, said Sussman. Nowadays you muck around with incomprehensible or nonexistent man pages for software you don't know who wrote. You have to do basic science on your libraries to see how they work, trying out different inputs and seeing how the code reacts. This is a fundamentally different job, and it needed a different course.

So the good thing about the new 6.001 was that it was robot-centered — you had to program a little robot to move around. And robots are not like resistors, behaving according to ideal functions. Wheels slip, the environment changes, etc — you have to build in robustness to the system, in a different way than the one SICP discusses.

Dan Weinreb offered his take on Sussman's comments at ILC’09.

Also on LtU

Dan Weinreb's take is also covered in an LtU story. =)