Ken Shan: Shift to Control

Ken Shan's Shift to Control (slides) presented at the recent Scheme workshop.

Ken shows that shift/reset, prompt/control, prompt/cupto and a number of other delimited continuation operators are all inter-expressible, and all can be modeled by ordinary CPS.

The paper shows that shift and reset can macro-express control and prompt, as well as the other operators, without capturing undelimited continuations or keeping mutable state. This translation was previously unknown in the literature.
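For readers who haven't played with these operators, here is a minimal sketch of shift/reset, using Racket's `racket/control` library (an assumption on my part — the paper itself is language-neutral):

```scheme
;; Racket's delimited-control library provides shift/reset,
;; control/prompt, cupto, and friends.
(require racket/control)

;; reset marks the delimiter; shift captures the continuation up to
;; the nearest enclosing reset and reifies it as the function k.
(reset (+ 1 (shift k (k (k 10)))))
;; Here k is (lambda (v) (+ 1 v)), so the result is (+ 1 (+ 1 10)) = 12.
```

The point of the paper is that once you have shift and reset, the other operators are definable as macros over them, with no mutable cells and no undelimited call/cc required.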

Good stuff! But keep in mind that, as the cartoon in the slide says, control operators can make your head hurt...


Scheme standardisation

While we are on the topic of the Scheme workshop, let me guide this thread off topic: did anything come up about developments in the apparently stalled Scheme standardization process? I saw from the timetable that there was something about an R6RS progress report?


I was there. There were mixed opinions about the process, but I was pleasantly surprised to see that the editors really had been working on it. The editors' discussion list is private in order to keep forward motion, which is good, but a side effect is that it's sometimes hard to tell that things are happening.

Anyway, there's been some discussion about releasing the workshop proceedings, which has apparently been the tradition at past Scheme workshops. The proceedings included the R6RS status report. Apparently this will be available as a tech report from IU and possibly Georgia Tech as well.

(A couple of take-home points from the workshop discussion: the highest-priority items seem to be modules, records, and exceptions. Most of all modules.)

found 'em!

I've posted a new topic.

shifting back to the subject at hand...

I also got to see Ken's talk, which I enjoyed very much (as I always do!). Right now I'm working through Amr Sabry's paper from ICFP, which seems relevant, and Ken's paper is next on my list.

In his talk, Ken pointed out that some of the reasons control operators make people's heads hurt are that there are so many of them, they are all so similar to one another, and the relationships between them are subtle. The Ariola/Sabry/Herbelin paper seeks to build a common framework in which to talk about all control operators.

On a more practical note, I'm curious what people's favorite control operators are, and what their reasons are for preferring them over others. There's a clear application for delimited continuations in programs that run programs (e.g., PDEs and REPLs), but control/prompt vs. shift/reset is a big question mark in my mind.

favorite control operators

I found the slides from ICFP helpful.

Control operators interest me because something better than exceptions is needed in languages like C++ and Java. For example, a recent comment on the Haskell-cafe mailing list said, "One of the most important features of a modern language is, to me, exceptions." It's unfortunate that a language feature so prone to pitfalls is generally regarded as essential. IMO it's the semantics of 'catch' being dynamic that leads to problems.

So I looked at shift and reset, hoping that this would be the kind of thing that could be added to conventional languages as an alternative to exceptions. But the dynamic aspect disqualifies this as a candidate for my favorite. The above-mentioned slides mention control and prompt, but describe them as equivalent to shift and reset. Is there a statically bound alternative?
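To make the "dynamic aspect" concrete: an aborting shift finds its reset dynamically, much as a throw finds its catch. A hedged sketch in Racket (`safe-div` is a made-up example, not from the slides):

```scheme
(require racket/control)

;; Discarding the captured continuation k aborts to the nearest
;; dynamically enclosing reset -- exception-like behavior.
(define (safe-div a b)
  (if (zero? b)
      (shift k 'division-error)  ; drop k: abort with a value
      (/ a b)))

(reset (+ 1 (safe-div 10 2)))   ; => 6
(reset (+ 1 (safe-div 10 0)))   ; => division-error
```

The reset that receives 'division-error is located at run time, exactly like a catch — which is the very property being objected to above.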

Static Exceptions?

IMO it's the semantics of 'catch' being dynamic that leads to problems.

I'm not sure what you are suggesting here. By their very nature, exceptions are dynamic: they represent unanticipated problems at runtime.

If they were static, we would just call them compiler errors.

Or do you mean something else here?

Dynamic scoping of catches

I guess the idea was that it is not statically known which throws will meet which catches.

At least, I had that same dynamic feeling about exceptions the last time I was studying control operators (a nice opportunity for a shameless plug).

Swing your partner round and round...

I guess the idea was that it is not statically known which throws will meet which catches.

I'm not clear why this is inherently so.

Obviously you don't necessarily know where the exception will be caught when you write the throw (any more than you know where a function will be called when you write it).

But when you write a catch, assuming you have access to the source for the called code and/or your language requires declared exceptions, you should be able to tell at compile time what throws will meet what catches if they occur.

Or am I missing something?

Unchecked exceptions

I think declared exceptions have the same standing as the "normal" return path, so yes, you are right, they are statically determined. I actually meant "unchecked" exceptions, and probably should not have posted... Just got bored with the hype papers I've had to read recently :(

On edit: We can see Java methods as taking two continuations: one for normal return, and one for all (checked and unchecked) exceptions. The compiler adds some magic to propagate uncaught exceptions. Hmm... Does it really mean even unchecked exceptions are determined statically? Umm... Aha, we can determine statically that _every_ method can throw any unchecked exception, though it will not necessarily do so... I have to go to bed, too much WFMC reading can hurt my brain :)

Where's the catch?

you should be able to tell at compile time what throws will meet what catches

Access to the source is not enough. You could be using a module that has both throws and catches of the same exception type. If you write a catch for that exception, you may not know whether it will be receiving exceptions from the module. In any case, needing the source indicates a violation of modularity.

As you say, when you write the 'throw' you don't necessarily know where it will be caught. You are likely not even to know whether it will be caught. So writing a 'throw' usually raises alarms from my programming instincts. But in C++ you don't have a decent alternative, and exceptions are what everyone expects, so you live with it.

In Scheme you can find the site where the problem is best handled, grab a continuation with call/cc, and retain the continuation in a variety of ways. Then rather than throw an exception, you invoke the problem-handling continuation. Invoking the continuation is like a goto that passes a value, and the call/cc setup like a label that receives a value.
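The goto/label reading above can be sketched in a few lines of Scheme (`parse-lines` and its 'empty-line signal are hypothetical, purely for illustration):

```scheme
;; The call/cc site is the "label"; invoking `complain' is the "goto"
;; that passes a value back to it, skipping the rest of the map.
(define (parse-lines lines)
  (call/cc
   (lambda (complain)                    ; label: receives a value
     (map (lambda (line)
            (if (string=? line "")
                (complain 'empty-line)   ; goto: jump out with a value
                (string-length line)))
          lines))))

(parse-lines '("a" "bb"))  ; => (1 2)
(parse-lines '("a" ""))    ; => empty-line
```

Unlike a throw, the jump target here is a value the problem site was handed explicitly, not a handler found by dynamic search.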

Call/cc is not my favorite control operator, because whatever the purpose, its use tends to look convoluted. Could it be that my favorite control operators are goto/label? (But be sure the label supplies a first-class continuation.)

Speaking from ignorance

(I am just starting to read up on the control stuff, so I can only speak from the exception side of things.)

The claim I have heard re: programming with exceptions is that you are allowing the system to separate the concerns of failure handling from the core recipe for getting something done. Thus it sounds like, ideally, you don't want to know who is going to catch what you throw.

Is it possible that if you instead use call/cc then you are not allowing for as much separation?

Of course, I think the reality with exceptions is that there is no consensus on how to properly employ them. Combine that with people's natural aversion to worrying about error handling (because it quickly outweighs the core 'fun' stuff, because it is hard to get right, and because the culture doesn't even agree on how to do it properly), and with the fact that most folks don't use anything remotely like AOP, and you end up with try/catch embedded everywhere anyway!


Could you give a small concrete example where you think that call/cc is better than throw/catch?


call/cc allows you to back up in time and try another path. I haven't figured out how to make it work properly just yet, but techniques that can use backtracking are better suited to call/cc.
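The usual way to make "back up in time" concrete is the amb operator: call/cc saves a retry continuation for each untried choice, and fail backs up to the most recent one. A textbook sketch (not code from this thread):

```scheme
;; Stack of saved retry continuations, most recent first.
(define fail-stack '())

(define (fail)
  (if (null? fail-stack)
      (error "no more choices")
      (let ((back (car fail-stack)))
        (set! fail-stack (cdr fail-stack))
        (back #f))))               ; resume the last choice point

(define (amb . choices)
  (call/cc
   (lambda (k)                     ; k returns a choice to the caller
     (for-each
      (lambda (choice)
        (call/cc
         (lambda (retry)           ; retry resumes here on failure
           (set! fail-stack (cons retry fail-stack))
           (k choice))))
      choices)
     (fail))))                     ; all choices exhausted

;; Find an x in {1 2 3} with x even: 1 fails, backtrack, 2 succeeds.
(let ((x (amb 1 2 3)))
  (if (even? x) x (fail)))
;; => 2
```

Each failed path rewinds to the most recent choice point, which is exactly the "try another path" behavior described above — awkward to fake with throw/catch, since a throw discards the context you want to resume.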

As opposed to throw?

techniques that can use backtracking are better suited to use call/cc

Do you mean call/cc is better suited than throw/catch, or that call/cc is the best for this purpose?

I guess it's time to recall msplit from LogicT.

No call/cc in Haskell

I can't help but think that msplit is a method of spinning off continuations sans call/cc. I wonder if the authors use call/cc in similar circumstances when doing backtracking in Scheme?

Me, I'm still trying to figure out how to map Oz's choice points into Alice ML (CTM chapter 9). I tried a backtracking approach in SML/NJ (Alice doesn't have call/cc support), but I can't get the right effect. Perhaps the solution I need to look at is similar to LogicT?

No future?

I'm still trying to figure out how to map Oz's Choice points into Alice ML

I have to check my copy of CTM, but shooting from the hip I would suggest Alice ML futures to simulate the ask/tell operations of constraint programming. The last time I was playing with control effects in Alice (crude coroutines), I had to use mutable state, though. I have to re-read the papers on mutual expressivity of control operators to learn whether that was my fault or a grim necessity.


Well, there are lots of contexts where you want to do backtracking, but without knowing which ones you had in mind, I'm not yet convinced. Let's pick a specific instance and look at it in detail.

A couple off the top of my head

(a) non-deterministic searches (backtracking); (b) coroutines; (c) Icon's success/failure construct.
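For (b), a one-sided coroutine (a generator) is a nice small call/cc workout: two saved continuations, one for each side of the transfer. A hedged sketch, not code from the thread:

```scheme
;; make-generator returns a thunk that yields the elements of lst
;; one at a time, then 'done. Two continuations are juggled:
;; `return' jumps back to the caller, `resume' jumps back into
;; the suspended for-each loop.
(define (make-generator lst)
  (define return #f)
  (define (resume)
    (for-each
     (lambda (x)
       (call/cc
        (lambda (k)
          (set! resume (lambda () (k #f)))  ; remember where to resume
          (return x))))                     ; yield x to the caller
     lst)
    (return 'done))
  (lambda ()
    (call/cc
     (lambda (k)
       (set! return k)      ; remember where to deliver the next value
       (resume)))))

(define next (make-generator '(a b c)))
(next)  ; => a
(next)  ; => b
(next)  ; => c
(next)  ; => done
```

This is the same capture-and-resume trick the amb discussion uses, but symmetric: each side suspends itself and transfers control to the other.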


I agree that it would be retarded to use try/catch to perform a non-deterministic search. But then, I think that is comparing apples to oranges. I also happen to think that coroutines are probably not an appropriate use of exceptions. I'm not familiar with (c), so perhaps you can provide a reference or explain it briefly?

Exceptional non-deterministic search

I agree that it would be retarded to use try/catch to perform a non-deterministic search.

Well, that's never stopped me before. It's possible to do, but not as nice as a continuation-based implementation. Nor as performant. (Actually, looking at it now, that code is quite ugly in places...)