Minimal implementation of state machines

Dear all, I fear this may be a really strange question, but LtU seems the best place to ask it anyway. I am working on a programming language, Casanova, built for interactive applications and games in particular. The language is designed so that applications update their state implicitly through continuously running "rules", which can also be suspended easily. For example, a light switch could be implemented quite simply as:

world LightSwitch = {
  Sprite  : Sprite
  Pressed : bool
  rule Pressed = 
    when(IsKeyDown(Keys.Space))
    yield true
    yield false
    when(IsKeyUp(Keys.Space))
  rule Sprite.Color = 
    yield Color.White
    when Pressed
    yield Color.Black
    when Pressed
}

I would like to try to formalize why this approach is better than the patterns commonly found in the literature for encoding state machines. I do not wish to do a survey of existing techniques and count lines of code or numbers of statements; that would clearly be a viable approach, but for once I would like to go formal rather than empirical.

My goal is to show a way to encode the program above in a simple imperative language. I would then like to prove that this encoding is minimal with respect to the number of statements used. Finally, I would like to show that even this minimal encoding contains far more statements than the Casanova implementation.
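
For concreteness, here is a rough sketch of the kind of imperative encoding I have in mind, written in C (space_is_down is a made-up stand-in for the input API, and this is not the output of any actual compiler): each rule keeps an explicit program counter recording its suspension point, and a per-frame update function resumes both rules one step at a time.

#include <stdbool.h>

typedef enum { WHITE, BLACK } Color;

static bool  pressed      = false;
static Color sprite_color = WHITE;
static int   pressed_pc   = 0;   /* suspension point of the Pressed rule */
static int   color_pc     = 0;   /* suspension point of the Color rule   */

extern bool space_is_down(void); /* hypothetical input query */

void update_frame(void)
{
    /* rule Pressed: one step per frame, resumed where it left off */
    switch (pressed_pc) {
    case 0: if (space_is_down())  pressed_pc = 1; break; /* when(IsKeyDown(Keys.Space)) */
    case 1: pressed = true;       pressed_pc = 2; break; /* yield true  */
    case 2: pressed = false;      pressed_pc = 3; break; /* yield false */
    case 3: if (!space_is_down()) pressed_pc = 0; break; /* when(IsKeyUp(Keys.Space))   */
    }

    /* rule Sprite.Color */
    switch (color_pc) {
    case 0: sprite_color = WHITE; color_pc = 1; break;   /* yield Color.White */
    case 1: if (pressed)          color_pc = 2; break;   /* when Pressed      */
    case 2: sprite_color = BLACK; color_pc = 3; break;   /* yield Color.Black */
    case 3: if (pressed)          color_pc = 0; break;   /* when Pressed      */
    }
}

Already this doubles the bookkeeping: every suspension point in the rules becomes an explicit case plus an explicit counter update.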

The issues I have are: I have no idea how to approach a proof of minimality for such an encoding, and I wonder whether the overall approach sounds convincing to you guys.

Reactive programming

What you're describing sounds a lot like 'reactive programming', so you might want to look up a few such systems for comparison.

As for your example, it reminds me of code like the following, which always annoys me when I come across it:

if (foo) {
  return true;
}
else {
  return false;
}

Of course, this is completely redundant since we can just say:

return foo;

Likewise, could your example be simplified like this?

world LightSwitch = {
  Sprite  : Sprite
  Pressed : bool
  rule Pressed = yield IsKeyDown(Keys.Space)
  rule Sprite.Color = yield (Pressed ? Color.White : Color.Black)
}

In fact, since we're just defining a relationship between values, is it necessary to "yield" at all?

world LightSwitch = {
  Sprite  : Sprite
  Pressed : bool
  rule Pressed = IsKeyDown(Keys.Space)
  rule Sprite.Color = Pressed ? Color.White : Color.Black
}

I don't think that's what it means

The four lines of Pressed are, I think, to be executed sequentially: wait for the space key to go down, yield true, then when resumed yield false, then when resumed wait for the space key to go up, and repeat.

cycles? etc.

What happens re cycles? Are "world" fields the only mutable state? How does I/O work? What are the order-of-update rules?

Rules cycle back when they

Rules cycle back to the beginning when they complete. world fields are indeed the only mutable state. I/O works in a rather intuitive manner: keyboard/screen buffers are refilled at every suspension of the various rules, that is, at wait/yield statements. Rules are updated in order of declaration; an alternative implementation has all rules work on double buffers, so that essentially they all run at the same time.
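
To make the double-buffered alternative concrete, here is a rough sketch in C (hypothetical names, with run_pressed_rule/run_color_rule standing in for the compiled rule steps, not our actual runtime): every rule reads from the current world snapshot and writes into the next one, and the two buffers are swapped at the end of the frame, so within a frame all rules observe the same state.

#include <stdbool.h>

typedef struct {
    bool pressed;
    int  color;   /* e.g. 0 = white, 1 = black */
} World;

static World worlds[2];
static int   cur = 0;

/* Hypothetical per-rule step functions standing in for the compiled rules. */
extern void run_pressed_rule(const World *in, World *out);
extern void run_color_rule(const World *in, World *out);

void update_frame(void)
{
    const World *in  = &worlds[cur];
    World       *out = &worlds[1 - cur];

    *out = *in;                 /* start from the current snapshot        */
    run_pressed_rule(in, out);  /* each rule advances one step, reading   */
    run_color_rule(in, out);    /* only from the snapshot, writing to out */
    cur = 1 - cur;              /* swap: "next" becomes "current"         */
}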

The FRP solution would

The FRP solution would assume no time passed (declarative with respect to an implicit now), and since the rules are acyclic, no time is really needed.

Yes, exactly.

Yes, exactly.

Your rewriting does not look

Your rewriting does not look quite right to me, sorry.

Also, yes, I know the link between this work and FRP. Thanks.

Proving minimality of number

Proving minimality of the number of statements needed for an encoding seems like an odd and difficult thing to attempt.

Kolmogorov complexity

That's basically Kolmogorov complexity, and while it provides many useful insights and some intriguing theorems, it is essentially impossible to calculate for any actual problem.
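
For reference, the usual definition with respect to a fixed universal machine U is the length of the shortest program that produces the string in question:

K_U(x) = \min \{\, |p| \;:\; U(p) = x \,\}

and a classical theorem shows that K_U is not computable, which is the formal reason it resists being calculated for concrete programs.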

I suspected as much. Still,

I suspected as much. Still, I will look it up...

Check out YinYang

I implemented a similar language a while ago; for reactive rule processing, the key is to check out subsumption architectures in behavior-oriented programming. Subsumption is similar to a state machine but lacks the "state" component (your example doesn't use state either).
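
In case it helps, here is a very reduced sketch of the subsumption flavour in C (made-up names, and a stateless variant of the light switch: black while the key is held): each behavior is a guarded reaction, the behaviors are checked every frame in priority order, and the first guard that holds subsumes the ones below it; there is no stored control state, only guards over the current world.

#include <stdbool.h>
#include <stddef.h>

typedef struct {
    bool (*guard)(void);   /* does this behavior want to fire? */
    void (*react)(void);   /* its reaction if it does          */
} Behavior;

/* Hypothetical guards/reactions for the light-switch example. */
extern bool space_down(void);
extern void set_black(void);
extern bool always(void);
extern void set_white(void);

static const Behavior behaviors[] = {
    { space_down, set_black },   /* higher priority */
    { always,     set_white },   /* default         */
};

void update_frame(void)
{
    for (size_t i = 0; i < sizeof behaviors / sizeof behaviors[0]; ++i)
        if (behaviors[i].guard()) { behaviors[i].react(); return; }
}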

Oh yes, I remember it well.

Oh yes, I remember it well. Inspiring work for my current effort, even though I am aiming at something that supports games of up to medium size.

We are working a lot on combining networking with our primitives, since networking benefits *a lot* from the ability to wait and suspend operations.

Synchronous languages

Hello Giuseppe,

I suggest you take a look at synchronous languages, in particular Esterel, which seems very similar to what you sketch here. The language's web page is getting old but still holds valuable resources.

Your question about translating your programs to ordinary imperative code in a minimal way has also been studied extensively in the context of Esterel, with the potential difference that the latter is finite-state. In this setting, one can in theory translate a program to a finite automaton and minimize the result using textbook algorithms. Doing that in practice is obviously a challenge, and there is a whole book dedicated to this subject (among others): Compiling Esterel (prohibitively expensive, unfortunately :-( ...).
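
For what it is worth, the "compile to a flat automaton" view applied to your light switch might look roughly like this (a hand-written sketch in C with made-up state names, not what Esterel or any Casanova compiler actually generates): the suspension points of the two rules are multiplied out into one explicit state, the colour is implied by the state, and each frame performs a single transition step.

#include <stdbool.h>

typedef enum { S_WHITE_UP, S_WHITE_DOWN, S_BLACK_UP, S_BLACK_DOWN } State;

static State state = S_WHITE_UP;   /* colour is encoded in the state name */

extern bool space_is_down(void);   /* hypothetical input query */

void update_frame(void)
{
    bool down = space_is_down();
    switch (state) {
    case S_WHITE_UP:   if (down)  state = S_BLACK_DOWN; break; /* press toggles to black */
    case S_BLACK_DOWN: if (!down) state = S_BLACK_UP;   break; /* wait for release       */
    case S_BLACK_UP:   if (down)  state = S_WHITE_DOWN; break; /* press toggles to white */
    case S_WHITE_DOWN: if (!down) state = S_WHITE_UP;   break; /* wait for release       */
    }
}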

Statechart compilation

For the translation of hierarchical statecharts to resource-constrained environments, see On Efficient Program Synthesis from Statecharts (2003) by Andrzej Wasowski.

Harel's Statecharts

I'm working on a language with similar goals. My approach is to have a "Controller" construct representing a statechart hierarchy, with an indexed vtbl. Changes of the current state are reflected by changing the Controller's "current" vtbl index.
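
A reduced sketch of what I mean, in C (hypothetical names, and much simpler than a real statechart hierarchy): one handler table per state, events dispatched through the table selected by the current index, and a transition is just an assignment to that index.

#include <stdio.h>

typedef struct Controller Controller;

typedef struct {
    void (*on_press)(Controller *);
    void (*on_release)(Controller *);
} StateVtbl;

struct Controller {
    const StateVtbl *vtbls;   /* one handler table per state */
    int              current; /* index of the current state  */
};

enum { ST_OFF, ST_ON };

static void off_press(Controller *c) { c->current = ST_ON;  puts("light on");  }
static void on_press(Controller *c)  { c->current = ST_OFF; puts("light off"); }
static void ignore(Controller *c)    { (void)c; }

static const StateVtbl lightswitch_vtbls[] = {
    [ST_OFF] = { off_press, ignore },
    [ST_ON]  = { on_press,  ignore },
};

void controller_press(Controller *c)   { c->vtbls[c->current].on_press(c);   }
void controller_release(Controller *c) { c->vtbls[c->current].on_release(c); }

A press while in ST_OFF dispatches to off_press, which flips current to ST_ON, so the next press goes through the other table.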

I'm not very far in, so there may be fairly obvious flaws to this approach I haven't seen yet.