A Framework for Comparing Models of Computation

A Framework for Comparing Models of Computation by Edward A. Lee and Alberto Sangiovanni-Vincentelli, 1998.
We give a denotational framework (a “meta model”) within which certain properties of models of computation can be compared. It describes concurrent processes in general terms as sets of possible behaviors. A process is determinate if, given the constraints imposed by the inputs, there are exactly one or exactly zero behaviors. Compositions of processes are processes with behaviors in the intersection of the behaviors of the component processes. The interaction between processes is through signals, which are collections of events. Each event is a value-tag pair, where the tags can come from a partially ordered or totally ordered set. Timed models are where the set of tags is totally ordered. Synchronous events share the same tag, and synchronous signals contain events with the same set of tags. Synchronous processes have only synchronous signals as behaviors. Strict causality (in timed tag systems) and continuity (in untimed tag systems) ensure determinacy under certain technical conditions. The framework is used to compare certain essential features of various models of computation, including Kahn process networks, dataflow, sequential processes, concurrent sequential processes with rendezvous, Petri nets, and discrete-event systems.
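To make the definitions above a bit more concrete, here is a minimal sketch in Haskell of how I read the tagged signal model. This is my own toy encoding, not the authors' formulation: the paper works denotationally with arbitrary (possibly infinite) sets and with processes over a fixed global tuple of signals, while this sketch uses finite sets, and all the names (Tag, Signal, compose, source) are mine.

```haskell
import qualified Data.Set as Set

-- Tags may come from a partially or totally ordered set; Integer stands in
-- here for a totally ordered ("timed") tag set.
type Tag   = Integer
type Value = Int

-- An event is a value-tag pair; a signal is a set of events.
type Event  = (Tag, Value)
type Signal = Set.Set Event

-- A behavior assigns a signal to each named signal variable of the system
-- (kept as a sorted association list so behaviors can be compared).
type Behavior = [(String, Signal)]

-- A process is a set of possible behaviors; composing processes means
-- intersecting their behavior sets (both described over the same variables).
type Process = Set.Set Behavior

compose :: Process -> Process -> Process
compose = Set.intersection

-- Example: a process that constrains signal "x" to one fixed event set.
source :: Process
source = Set.singleton [("x", Set.fromList [(0, 1), (1, 2)])]
```

In this reading, determinacy just says that after intersecting a process with the constraint representing its inputs, at most one behavior remains.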
The generality of the approach looks very impressive. Can anyone share first-hand experience with this framework? Would be great to see E compared to Oz!

Embedded systems

Unfortunately, I can't share any first-hand experience. But I can tell you that most of the applications of the tagged signal model so far have been on embedded systems and other kinds of systems that contain heterogeneous models of computation, rather than on semantic comparisons of languages like E and Oz. For example, the team building Berkeley's Ptolemy II tool for simulating heterogeneous systems has used the tagged signal model to understand how a model composed of a mix of continuous-time and discrete-event models should work. Similarly, there has been some work on analyzing automotive controllers by using the TSM to understand how finite state machines, discrete-event and sequential processes, powertrain models, sensors, and actuators all play together.

Ideas for extension?

...rather than on semantic comparisons of languages like E and Oz.

My comment was a bit tongue-in-cheek, given that the framework directly supports only languages with fixed communication topologies (as in CSP or Petri nets, but not the pi or join calculi).
I was wondering whether it's possible to conservatively extend the framework to cover the creation of signals (i.e., names).

I really liked how inputs and outputs came out as just special cases of signals; very interesting! It sounds a bit like constraint programming...
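If I remember the paper right, an input is itself modeled as a constraint, i.e. just another set of behaviors that gets intersected with the process, which is probably why it has that constraint-programming feel. A rough continuation of a toy Haskell encoding (all names are mine, not from the paper):

```haskell
import qualified Data.Set as Set

type Tag      = Integer
type Value    = Int
type Event    = (Tag, Value)
type Signal   = Set.Set Event
type Behavior = [(String, Signal)]
type Process  = Set.Set Behavior

-- An input is not a distinguished notion: it is just another constraint on
-- behaviors, applied with the same intersection that composes processes.
applyInput :: Process -> Process -> Process
applyInput inputConstraint process = Set.intersection inputConstraint process
```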

(... without having read

(... without having read this paper), this sounds close to what you'd use to describe FRP or a clock calculus, with the tags playing the role of the clock. For the FRP case, the most explicit version I've seen was in the recent paper by Conal Elliott ("Simply Efficient.."). It's a fairly intuitive fit, since they're both pure but expressive stream languages, and FRP in particular has a history of being described denotationally.
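The clock-calculus reading does seem natural: the clock of a signal would simply be its set of tags, and the paper's synchronous signals (events at the same set of tags) would be signals on the same clock. A small sketch under those assumptions, again in a toy Haskell encoding of my own:

```haskell
import qualified Data.Set as Set

type Tag    = Integer
type Value  = Int
type Event  = (Tag, Value)
type Signal = Set.Set Event

-- The "clock" of a signal: the set of tags at which it has events.
clock :: Signal -> Set.Set Tag
clock = Set.map fst

-- Two signals are synchronous when they contain events at the same tags.
synchronous :: Signal -> Signal -> Bool
synchronous s1 s2 = clock s1 == clock s2
```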

Embedded systems


But I can tell you that most of the applications of the tagged signal model so far have been on embedded systems and other kinds of systems that contain heterogeneous models of computation, rather than on semantic comparisons of languages like E and Oz.

That is indeed so. I've studied some of the articles by Edward A. Lee and his team. Most of them are devoted to embedded-system semantics, or more precisely to the semantics of interactions in real-time systems. Anyone interested could look into the Ptolemy II project, in which this framework has been implemented.