Continuations for Parallel Logic Programming
by Eneia Todoran and Nikolaos S. Papaspyrou

This paper gives denotational models for three logic programming
languages of progressive complexity, adopting the
"logic programming without logic" approach. The first language
is the control flow kernel of sequential Prolog, featuring
sequential composition and backtracking. A committed-choice
concurrent logic language with parallel composition
(parallel AND) and don't care nondeterminism is studied
next. The third language is the core of Warren's basic Andorra
model, combining parallel composition and don't care
nondeterminism with two forms of don't know nondeterminism
(interpreted as sequential and parallel OR) and favoring
deterministic over nondeterministic computation. We show
that continuations are a valuable tool in the analysis and
design of semantic models for both sequential and parallel
logic programming. Instead of using mathematical notation,
we use the functional programming language Haskell
as a metalanguage for our denotational semantics, and employ
monads in order to facilitate the transition from one
language under study to another.
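
To give a flavor of the continuation-based approach, here is a minimal sketch in the paper's Haskell metalanguage (my own names, not the authors' actual model) of how backtracking and sequential composition fall out of a success continuation `sk` and a failure continuation `fk`:

```haskell
{-# LANGUAGE RankNTypes #-}

-- Hypothetical sketch: a goal is run with a success continuation 'sk'
-- and a failure continuation 'fk'; invoking 'fk' is backtracking.
newtype Logic a = Logic { runLogic :: forall r. (a -> r -> r) -> r -> r }

instance Functor Logic where
  fmap f m = Logic $ \sk fk -> runLogic m (sk . f) fk

instance Applicative Logic where
  pure x    = Logic $ \sk fk -> sk x fk
  mf <*> mx = Logic $ \sk fk ->
                runLogic mf (\f fk' -> runLogic mx (sk . f) fk') fk

-- sequential composition (Prolog's ',') is monadic bind
instance Monad Logic where
  m >>= k = Logic $ \sk fk ->
              runLogic m (\a fk' -> runLogic (k a) sk fk') fk

-- immediate failure: just invoke the failure continuation
failure :: Logic a
failure = Logic $ \_ fk -> fk

-- disjunction: on failure of the first goal, backtrack into the second
orElse :: Logic a -> Logic a -> Logic a
orElse m n = Logic $ \sk fk -> runLogic m sk (runLogic n sk fk)

-- collect all answers in Prolog's left-to-right order
answers :: Logic a -> [a]
answers m = runLogic m (:) []
```

For instance, `answers (pure 1 `orElse` (failure `orElse` pure 2))` yields `[1,2]`: the failure backtracks into the next alternative.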

This paper happens to combine several topics that have interested me lately: AKL (a precursor of Oz), denotational semantics, continuations, and implementing programming languages in Haskell.

If you share at least some of these interests - take a look!


AKL as a superset of join calculus

It recently occurred to me that the features of join calculus that appealed to me are in fact all present in AKL (plus some more). I am really starting to believe in logic programming as an interesting paradigm to play with. Well, CTM almost convinced me of this when I read it, but for some reason I didn't see the full beauty of the approach until I started playing with STM in Haskell, and later trying to implement an E-like language (in Haskell again) - suddenly I found myself thinking in terms of constraints and logic variables. I cannot describe this leap of intuition rationally, but for some reason I wanted to share this Zen-like experience in this post :-)
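
The "thinking in terms of logic variables" intuition above can be approximated in Haskell with a toy single-assignment variable (hypothetical names; an `MVar` supplies the monotonic, bind-once behavior):

```haskell
import Control.Concurrent.MVar

-- Hypothetical sketch of a single-assignment "logic variable":
-- readers block until it is bound, and it can be bound only once.
-- Monotonic: once bound, it stays bound.
newtype LVar a = LVar (MVar a)

newLVar :: IO (LVar a)
newLVar = LVar <$> newEmptyMVar

-- bind the variable; a second bind is an error
bindLVar :: LVar a -> a -> IO ()
bindLVar (LVar v) x = do
  ok <- tryPutMVar v x
  if ok then return () else error "logic variable already bound"

-- read without consuming; blocks until the variable is bound
readLVar :: LVar a -> IO a
readLVar (LVar v) = readMVar v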

What I am missing at the moment is a good type system for AKL/Oz.


> What I am missing at the moment is a good type system for AKL/Oz.

Isn't that what Alice ML is supposed to provide?

Alice is a conservative extension of ML...

...and IIRC, the designers chose not to implement full unification, so as not to be too radical in their extension. They do, however, have grand designs wrt Constraint Programming using the Gecode implementation. Anyhow, Alice is very much about applying many of the ideas of Oz to a static programming language, with some new ideas thrown in as well. But, as far as I know, the relational model of AKL and Oz was not one of the primary design goals.

[Edit Note: But yes, if you are interested in the closest approximation of Oz in a static language, Alice is the place to look].

Alice is too big

At the moment I am more interested in a simple, calculus-like language: the Scheme of logic/constraint programming, so to speak, not the Common Lisp. That may be because I do not really need it as a means, but as an end (a PL as an object of interest, as opposed to a tool).

BTW, regarding the OP paper: I found the "logic programming without logic" approach of separating control flow from unification etc. very interesting. A pity that de Bakker's original paper is not freely available online.

Which features?

> the features of join calculus that appealed to me are in fact all
> present in AKL

Which features? As I mentioned previously, I find the Join calculus distressingly non-monotonic in its scheduling behavior, leading to vulnerabilities to deadlock. Concurrent horn-clause logic languages (presumably including AKL) generally avoid this class of problems, since logic variables monotonically resolve. (E's promises are largely inspired by Concurrent Prolog logic variables.)

Simple and powerful

Well, I approached join calculus from the side of pi calculus, and it immediately struck me how easy it is to synchronize in join compared to pi. The transaction-like quality of join patterns was refreshing: join was both simpler than pi and more powerful.
I agree that join calculus behaves non-monotonically (but what name-based concurrent calculus does not? Messages and agents react, and then cannot react anymore, thus breaking monotonicity), but these effects are nicely localized by both the syntax and the semantics.
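
The transaction-like quality of a join pattern can actually be sketched with the STM mentioned earlier in the thread (hypothetical names; `TQueue` comes from the `stm` package): a reaction fires only when a message is available on *both* channels, and consumes them atomically.

```haskell
import Control.Concurrent.STM

-- Hypothetical sketch of a two-way join pattern in STM: if either
-- queue is empty, the transaction retries, leaving both queues
-- untouched -- an all-or-nothing, transaction-like synchronization.
joinTwo :: TQueue a -> TQueue b -> STM (a, b)
joinTwo qa qb = do
  a <- readTQueue qa
  b <- readTQueue qb
  return (a, b)
```

A caller would run `atomically (joinTwo qa qb)` and block until both messages have arrived.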

Now I am gradually leaning towards learning more about constraint-based concurrency, as it seems to me more expressive than name-based concurrency (expressive at least from the subjective point of view, the feel of what can be done and with what effort).

As a side note, I believe any realistic PL implementation must be non-monotonic; it's just that no one is able to catch it in the act (think garbage collection).

Using logic to model control

> Now I am gradually leaning towards learning more about constraint-based concurrency

The concurrent constraint model (the semantic model underlying Oz and AKL) uses logical conditions (in particular, entailment and disentailment) to model control. E.g., in Oz the if and case statements do an 'ask': they wait until a particular logical condition is either entailed by the store or is disentailed (will never be entailed, i.e., its negation is entailed) by the store. Both entailment and disentailment are monotonic: once true, they stay true forever. For example, the condition x=3 is entailed by a store that contains (x=y and y=3) and disentailed by a store that contains (x=y and y=4). As a consequence of this, when a thread in Oz becomes runnable, it stays runnable until it executes at least one step. Becoming runnable is monotonic in Oz. This is a very nice property that is not satisfied, e.g., by shared-state languages such as Java. In Java, a thread that is runnable may become no longer runnable because of the actions of another thread. See chapter 13 of CTM for the complete semantics of Oz.
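
The x=3 example above can be sketched as a toy 'ask' in Haskell (entirely hypothetical names and representation; a real store would use proper unification):

```haskell
-- Hypothetical toy constraint store: equations binding variables
-- to other variables or to integers, assumed acyclic.
data Term = Var String | Lit Int deriving (Eq, Show)
type Store = [(String, Term)]   -- e.g. [("x", Var "y"), ("y", Lit 3)]

-- dereference a variable through chains of bindings
walk :: Store -> Term -> Term
walk s (Var v) = case lookup v s of
                   Just t  -> walk s t
                   Nothing -> Var v
walk _ t       = t

data Ask = Entailed | Disentailed | Unknown deriving (Eq, Show)

-- ask whether the condition x = n is entailed or disentailed;
-- Unknown means the asking thread would suspend until more
-- information is added to the store (monotonically)
ask :: Store -> String -> Int -> Ask
ask s x n = case walk s (Var x) of
  Lit m | m == n    -> Entailed
        | otherwise -> Disentailed
  _                 -> Unknown
```

With the store `[("x", Var "y"), ("y", Lit 3)]`, `ask` on x=3 gives `Entailed`; with y bound to 4 instead, it gives `Disentailed`; with x unbound, `Unknown`, and the thread suspends.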


I've read CTM (and I suggested it for our corporate library), and it can indeed claim a big share of the credit for making me consider logic programming as more than "just old Prolog". Thanks for the book, by the way!

Via CTM (though not very directly) I learned about AKL and CCP, and was amazed by the work of Vijay Saraswat. This world is so huge that I still cannot grasp the connections between CCP, concurrent logic programming, and the more "imperative" process calculi of the pi heritage. One of the papers I've read recently suggests there might be a nice way to view them as instances of a more general framework (A Paradigm for Asynchronous Communication and its Application to Concurrent Constraint Programming).

I wonder whether using Concurrent Constraint Programming for Business Process Management might be a great business idea... :-)


I've mentioned LolliMon (and CLF; after neelk brought up LolliMon) before. Have you ever played with it and/or read the papers around it and CLF? This isn't really a response to your question, but I think you'd find that LolliMon has a "feel" different from most languages you've encountered (actually "all", or else I wouldn't bother mentioning it). It should definitely suggest logic programming as interesting. It doesn't do constraints, though.


Thanks for the suggestion, I ran across LolliMon several times, including once today while surfing CiteSeer, but never had enough courage to try it seriously. Now, I really want to do that.

The thing I like about logic

The thing I like about logic programming is the way I can directly represent the semantics of a target domain. Often you can lift a semantics straight from a book that was never intended to be a logic program. Design procedures are a good example; transformer design is one I am familiar with. I have a little book here called the "Atlas of Economic Indicators" that is quite easily turned into one or more logic programs. The semantics is there; all you need to do is copy it. With other programming languages there is always a disconnect between the program and its semantics. It seems to me that this is quite valuable.