Eff - Language of the Future

This is just a series of blog posts so far, as far as I can tell. But Andrej Bauer's work has been mentioned here many times, I am finding these posts extremely interesting, and I'm sure I will not be alone. So without further ado...

Programming With Effects. Andrej Bauer and Matija Pretnar.

I just returned from Paris where I was visiting the INRIA πr² team. It was a great visit, everyone was very hospitable, the food was great, and the weather was nice. I spoke at their seminar where I presented a new programming language eff which is based on the idea that computational effects are algebras. The language has been designed and implemented jointly by Matija Pretnar and myself. Eff is far from being finished, but I think it is ready to be shown to the world. What follows is an extended transcript of the talk I gave in Paris. It is divided into two posts. The present one reviews the basic theory of algebras for a signature and how they are related to computational effects. The impatient readers can skip ahead to the second part, which is about the programming language.


Off-topic comment

This is just a short off-topic comment from me...

It was always kind of interesting to note how few of the discussions on LtU are about new languages, or even about specific languages. This makes total sense, of course, but may be kind of surprising to some. More to the point, even while this makes sense, we are language geeks, aren't we?! So I am glad to note that the FOUR most recent home page items are about new and exciting languages!

Off-topic: When are you going to build a FORTH implementation?

You promised yourself years ago!

using continuations for backtracking

I did not think of using continuations for backtracking Prolog-style. Something to look at...

On the theory: do we get state by replacing each reference with a function which is reduced using the 'algebra'?

That's funny, I learned

That's funny, I learned continuations by reading Leroy's online course material, which motivates continuations that way.

One-shot

That is because I mostly restricted myself to one-shot continuations, although I have already added resume, which I planned to simulate using a function.

I also have a system of rules planned, which would benefit from backtracking. That does not need to be exposed to the programmer.

I am a little wary of multi-shot continuations...

What?

Denis, what do you mean when you say "do we get state by replacing each reference with a function which is reduced using the 'algebra'?" I can probably answer your question once I understand it.

from part one I understand...

So, from part one of the post I understand that each mutable memory cell gets associated with a function. You define a map k from memory cells to these functions and:

lookup(k) = λs. k s s,   update(t, k) = λs. k t

Then there are rules to reduce, e.g., update(t, update(u, k)) into update(u, k).
Since eff is for effects, I take it that this mechanism is what allows for mutable state?
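The continuation-passing equations above can be sketched directly in Python (a hedged illustration, not eff itself): a computation over a single cell is a function from the initial cell contents to the result, and lookup/update manipulate continuations exactly as written.

```python
# Sketch of the state-as-continuations reading, assuming a single
# memory cell. A computation returning a value of type T is modelled
# as a function from the initial cell contents (type S) to that value.

def lookup(k):
    # Pass the current state to the continuation twice: once as the
    # looked-up value, once as the state to continue in.
    return lambda s: k(s)(s)

def update(t, k):
    # Discard the old state and run the computation k in state t.
    return lambda s: k(t)

# A small program: write 10, then read the cell and return value + 1.
program = update(10, lookup(lambda v: lambda s: v + 1))
assert program(0) == 11  # the initial state 0 is overwritten

# The reduction rule mentioned above: update(t, update(u, k)) behaves
# the same as update(u, k), because the first write is overwritten.
k = lambda s: s
assert update(1, update(2, k))(0) == update(2, k)(0) == 2
```

The assertions check both the basic behaviour and the update-update reduction from the comment above.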

It's not quite right to say

It's not quite right to say that k is a map from memory cells to functions. In what you wrote k is the continuation. And yes, of course, eff has mutable store, which is implemented exactly with lookup and update that you wrote down (it is in prelude.eff).

Why does the particular form

Why does the particular form of lookup() model storage access?

The content of the cell can be read with operation lookup and written to with operation update.

When the content of the cell is read with lookup() I'd expect a value of type S to be returned and not a function of type T^S.

Note that I read this as an introductory text for the language, and not as a supplement to a bigger theory where even the most basic operation is modelled as a higher-order function. I guess this is somehow meaningful to the author, but it doesn't shine through in the introductory exposition.

Incomplete Picture

I think that the lookup() example is just that: an example of the concepts of a carrier set, algebra, etc. Bauer doesn't explain exactly why this choice captures state, but he does give an informal explanation:

In the single-cell example, our programs cannot be represented simply by return values, as they depend on the current state of the memory.

Thus, programs are functions: for each initial content of the cell, the program gives the corresponding return value. In other words, they are functions from S to T.

Note that this is no different than implementing state in a purely functional language.

In the general theory, covered by Plotkin, Power, Hyland, Pretnar and more, higher-order functions are not a central ingredient. Rather, the theory is presented using equational logic terminology. Programs are interpreted as terms denoting the effects that arise in them, modulo the equations for each particular theory.

So the program that simply looks up the contents of the memory will be denoted by the equivalence class of lookup(\s . return s). You can show that if the program deals with values of type T, then the set of all equivalence classes of terms involving lookup() and update() is isomorphic to (SxT)^S.

As a side note, one consequence of the general theory is that if we want to define state in terms of lookup() and update(), satisfying Plotkin's and Power's seven equations of state (which are intuitive for any non-relaxed memory model), then we necessarily get the monad A |-> (SxA)^S. This follows from the argument in the previous paragraph, and the fact that the theory of state is Hilbert-Post complete.

But you have raised a good point. Programming in CPS is not something programmers always like to do. When we write "lookup()" we would like to get back a single S-value, and not a complicated computation. Note that this "lookup()" is nothing else but "lookup(\s. return s)". This idea of replacing an operation "op(t)" by "op(\s. return s)" holds in general, and gives rise to the concept of a generic operation.
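To make the generic-operation idea concrete, here is a sketch (in Python, not eff, and not Bauer's actual implementation) of the monad A |-> (SxA)^S from the previous comment, with get and put obtained by plugging the trivial continuation into lookup and update:

```python
# Sketch of the state monad A |-> (S x A)^S and of generic operations.
# A computation producing an A is a function from the current state
# to a (result, new_state) pair.

def ret(x):
    # return: a pure value, leaving the state unchanged
    return lambda s: (x, s)

def bind(m, f):
    # Sequencing: run m, feed its result to f in the updated state.
    def comp(s):
        x, s1 = m(s)
        return f(x)(s1)
    return comp

def lookup(k):
    # Algebraic form: pass the cell contents to the continuation k.
    return lambda s: k(s)(s)

def update(t, k):
    # Algebraic form: run the computation k in the new state t.
    return lambda s: k(t)

# Generic versions: "lookup()" is lookup(\s. return s), which hands
# the programmer a plain S-value instead of requiring CPS.
get = lookup(ret)
def put(t):
    return update(t, ret(None))

# Example: read the cell, increment it, and return the old contents.
incr = bind(get, lambda v: bind(put(v + 1), lambda _: ret(v)))
assert incr(41) == (41, 42)  # old value 41, final state 42
```

The names ret, bind, get, and put are my own for this sketch; in eff the corresponding definitions live in prelude.eff, as mentioned earlier in the thread.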

I don't think that generic operations are covered in the blog (yet?), so if you want to learn more about them, try Matija's thesis.