Ada, the Ultimate Lambda?

Chris Okasaki has a blog post about teaching functional programming using Ada. He says:

Why, with excellent functional programming languages like Haskell or ML easily available, would I choose to do a functional programming project in Ada? (I CAN STILL HEAR YOU LAUGHING!) Partly because we use Ada in some courses at our institution, and I wanted a library that could help in teaching recursion to novices. But, honestly, mostly I just wanted to see if it could be done

The idea of crossing paradigms has been around for a while. SICP rather famously introduces object orientation by building it on top of Scheme.

What do you think about teaching or learning paradigm A in a language strongly oriented towards paradigm B? What's gained? What's lost?


a lot is gained

A lot is gained that way, but the best approach is to go from a very low level on up, so that students can toss abstractions around and reason via their math -- but also quickly work out essentially what machine code is implied and how it is implemented. You either know what is really happening or you don't. (Well, OK, really there's a spectrum in between, but those are the extremes.)

-t

Well, I don't think building

Well, I don't think building an OO system in Scheme is really "crossing paradigms". As far as I know, Steele and Sussman deliberately designed Scheme to study the actor model, with closures being objects that understand only one message (apply).
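The "closures as one-message objects" encoding runs in both directions: SICP builds objects out of closures by dispatching on a message. A minimal sketch of that idea, in Python for illustration (the names `make_counter` and `dispatch` are mine, not from SICP):

```python
# SICP-style message-passing object: the "object" is just a closure
# that captures its state and dispatches on the message it receives.
def make_counter(start):
    state = {"count": start}  # mutable state captured by the closure

    def dispatch(message):
        if message == "increment":
            state["count"] += 1
            return state["count"]
        elif message == "value":
            return state["count"]
        else:
            raise ValueError(f"unknown message: {message}")

    return dispatch

counter = make_counter(10)
counter("increment")   # -> 11
counter("value")       # -> 11
```

No class machinery is involved; the closure's captured environment plays the role of instance state, and the `if`/`elif` chain plays the role of a method table.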

Memories

I did several similar things back in the day (I am not working with
Ada at the moment), and even went as far as implementing a
continuations data type (a la EOPL1).

You might be interested in the following:

- A discussion of the restrictions on access-to-function from the Ada
design mailing list. It's quite an interesting story.

- Arthur Duncan wrote a couple of papers on functional techniques in
Ada (infinite lists, for example:
http://doi.acm.org/10.1145/289524.289537).
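The infinite-list technique Duncan describes for Ada transfers to any language with first-class (or pointer-to-) functions: represent a stream as a head plus a thunk that produces the tail on demand. A rough sketch of the idea in Python (not Duncan's code, just an illustration of the same trick):

```python
# A lazy infinite list as a (head, thunk) pair: the tail is not
# computed until the thunk is forced.
def integers_from(n):
    return (n, lambda: integers_from(n + 1))

def take(stream, k):
    """Force the first k elements of a lazy stream into a list."""
    out = []
    while k > 0:
        head, tail_thunk = stream
        out.append(head)
        stream = tail_thunk()  # force the next cell
        k -= 1
    return out

take(integers_from(1), 5)   # -> [1, 2, 3, 4, 5]
```

In Ada the thunk would be an access-to-function value wrapped in a record, which is exactly where the access-to-function restrictions discussed above start to bite.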

even went as far as

even went as far as implementing a continuations data type (a la EOPL1).

Was this in pure Ada, or did you escape to C or assembly?

Pure Ada. It's rather a

Pure Ada. It's rather a simple thing (a datatype representation of continuations used by an interpreter). But it was fun to implement them using tagged types (Ada's object-oriented feature), and thus to represent continuations as types that essentially encapsulate function pointers (i.e., a vtable). Going full circle, kinda.
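The shape of that encoding — a defunctionalized continuation datatype in the EOPL style, with one class per continuation constructor and a single `apply` method standing in for Ada's tagged-type dispatch — might look like this hypothetical sketch (class and function names are my own, for a toy interpreter summing a list):

```python
# EOPL-style continuation datatype: each subclass is one "constructor"
# of the continuation type, and dynamic dispatch on apply() plays the
# role that tagged types (and their vtables) play in Ada.
class Cont:
    def apply(self, value):
        raise NotImplementedError

class EndCont(Cont):
    """The empty continuation: just return the final value."""
    def apply(self, value):
        return value

class AddCont(Cont):
    """Waiting for the sum of the tail, to add the head to it."""
    def __init__(self, head, rest):
        self.head = head   # element to add once the tail's sum arrives
        self.rest = rest   # the enclosing continuation

    def apply(self, value):
        return self.rest.apply(self.head + value)

def sum_list(xs, cont=None):
    cont = cont or EndCont()
    if not xs:
        return cont.apply(0)
    return sum_list(xs[1:], AddCont(xs[0], cont))

sum_list([1, 2, 3, 4])   # -> 10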

A popular study approach today...

Today it seems to be pretty popular for programmers to elucidate ideas about computation (the 1000 faces of iteration?) by studying multiple programming languages rather than studying computation itself.

Honestly, I'm not sure how to do the latter. Any ideas? My current supposition is that HTDP will take me down that path, and I am pursuing it.

It is only after studying Scheme and Haskell that I was led down the path towards more academic, and pedagogical, resources, which seem closer to the latter than the former.

Fundamental formalisms

Yes. Look at mathematical structures that try to represent computation directly.

Examples:

  • Robin Milner's work has been pretty distinguished. His work on CCS, the pi-calculus, and bigraphs all represents attempts to capture what it means for one computational system to interact with another;
  • Cellular automata in particular deserve a look, and automata theory in general has been a successful approach to treating computation algebraically;
  • Attempts to wrestle with or design novel hardware often throw up interesting formalisms. The von Neumann representation lies in this line. More recently, I'd recommend Alan Bawden's connection grammars [1], which arose out of an attempt to abstract the essentials of interprocessor communication in MIT's Connection Machine;
  • Look generally at good frameworks for representing models of computation. I've mentioned automata theory; the pushout theory of graph transformation [2] also provides an elegant categorical foundation for just about any concrete representation of the basic steps of computation you might come up with.

Programming languages generally are not good attempts to directly represent computation, since their purpose is to help programmers manipulate and reason about complex systems that will be realised as computations, and so they are as much about human cognition as they are about computational models. E.g., the origins of lexical scoping had nothing to do with machine representations.

[1] Implementing Distributed Systems Using Linear Naming (pdf), where the formalism is known as linear graph grammars. Talked about many times here, mostly by me, but it has not yet got its own story...
[2] A tutorial on graph transformation.

Excellent points, as usual.

Excellent points, as usual. One puzzlement though:

Programming languages generally are not good attempts to directly represent computation, since their purpose is to help programmers manipulate and reason about complex systems that will be realised as computations, and so they are as much about human cognition as they are about computational models.

While the point is well taken, there is an underlying assumption here that "computation" is a Platonic concept, existing independently of how we cognize about computations. I am pretty sure that's not what Charles meant to imply, and the distinction he points to clearly holds, so it should be stressed that this strong assumption is problematic.

What is computation?

I'm actually agnostic about what computation is, because I don't know of a truly successful theory of computation. I lean towards saying that such a theory identifies computations as some sort of physical process, and that a successful theory should give us criteria for saying which physical processes express computations, identify what computation is performed, and not exclude anything that ought to count as computation. Such a theory would be somewhat like Shannon's theory of information, and David Deutsch, among others, has done work along these lines.

But maybe we can't achieve a completely adequate, purely physically-grounded theory of computation. Then maybe the theory of computation will need to take account of a creative subject, or whatever.

In any case, this foundational issue isn't really what the point Ehud singled out was trying to make. The point is that you shouldn't expect that formalisms good for getting machinery to do what you want it to do will also be good formalisms to understand better what computation is.

We are agreed on the point

We are agreed on the point that I tried to emphasize, and on the distinction between programming languages and what, for want of a better word, might be described as abstract machines. I am, I suppose, more agnostic than you, since I don't think physical theories are going to cut it. There's a vast literature on the subject, of course, which I try to hint at from time to time on LtU, though the subject is only tangential to the main focus here -- once again pointing to the original point you were making.

Rewriting logic

I'm really just a beginner at this stuff, but rewriting logic and an approach to programming à la Goguen's Algebraic Semantics also seem a worthwhile investment.

In particular, you get reflection and concurrency "for free" with this formalism.

See the Maude programming language and the OBJ family (CafeOBJ, Maude, OBJ3, etc.).
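To give a flavour of what "programming by rewriting" means — this is nothing like full Maude, just a toy illustration I've written for this thread — a term can be represented as a nested tuple and computation as repeated application of rewrite rules until a normal form is reached. Here are the Peano-addition rules add(0, y) → y and add(s(x), y) → s(add(x, y)):

```python
# Toy rewrite engine: terms are nested tuples; step() applies the
# Peano-addition rules once (at the root or inside subterms), and
# normalize() rewrites until a fixed point (normal form) is reached.
def step(term):
    if isinstance(term, tuple):
        if term[0] == "add":
            _, x, y = term
            if x == "0":                               # add(0, y) -> y
                return y
            if isinstance(x, tuple) and x[0] == "s":   # add(s(x), y) -> s(add(x, y))
                return ("s", ("add", x[1], y))
        # no rule applies at the root: rewrite subterms instead
        return tuple(step(t) if isinstance(t, tuple) else t for t in term)
    return term

def normalize(term):
    while True:
        new = step(term)
        if new == term:
            return term
        term = new

two = ("s", ("s", "0"))
normalize(("add", two, two))   # -> s(s(s(s(0)))), i.e. four
```

In a real rewriting-logic system the rules are declarative, matching is modulo equational axioms, and concurrent rule application is part of the semantics — which is where the "concurrency for free" claim comes from.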

Nerd Humor

Somewhat related to the topic, here's an extreme example:

A bit of humor based on using Java to express the fixed-point operator in an applicative-order lambda calculus.

The question is, if you strip the humor and replace it with some explanatory material, is there anything actually to be learned from this? (Other than something about universality, I suppose).
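For what it's worth, stripped of the humor the underlying trick is the applicative-order fixed-point combinator (the Z combinator). A sketch in Python rather than Java:

```python
# Z combinator: Z f = (λx. f (λv. x x v)) (λx. f (λv. x x v)).
# The inner eta-expansion (λv. x x v) delays evaluation, which is what
# makes the fixed point work under applicative (eager) order.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Factorial defined with no explicit self-reference:
fact = Z(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
fact(5)   # -> 120
```

What's arguably to be learned is exactly the pain point the joke exploits: in a language without structural function types, every lambda-term must be wrapped in a nominal interface, so the encoding is possible (universality) but the notation fights you the whole way.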