The Simplicity of Concurrency

My first post. I've been reading LtU regularly for a while now and it is truly great. Many thanks to all who are involved.

Karl Fant of Theseus Research recently gave an interesting presentation, titled "The Simplicity of Concurrency", on what he believes to be a conceptual model that reverses the traditional view of sequentiality as simple and concurrency as complex. He focuses somewhat on a hardware implementation, but explains how it can be applied to compilers and programming at any level. The presentation is available as streaming audio or audio+video from

http://www.parc.com/cms/get_article.php?id=465

My summary would be:
* Introduce the idea of "data not available" (NULL) as an explicit concept (different from a NULL pointer or Lisp's nil).
* Introduce the conventions that
a) functions receiving all NULL inputs produce NULL output.
b) functions receiving all non-NULL inputs produce non-NULL output.
c) functions receiving mixed NULL and non-NULL inputs do not proceed.
* Given that functions do not proceed until they receive all of their inputs, they become self-synchronizing for one use (a toy sketch of these rules follows the list).
* Blocks of functions can be reset for further use by feeding their output, switched through a NULL-notter, back into the input to form a latch. This concept is easier to explain with a circuit diagram.
* Given that whole systems can be built upon this model to be fractally self-synchronizing, such systems can be partitioned arbitrarily into sequential components such as threads.
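
To make sure I have the conventions straight, here is a toy Python sketch of rules a)-c) above. This is purely my own interpretation for illustration, not anything from Theseus Research's actual hardware or tools:

```python
class _Marker:
    """A named sentinel so NULL and WAIT print readably."""
    def __init__(self, name):
        self.name = name
    def __repr__(self):
        return self.name

NULL = _Marker("NULL")   # explicit "data not available", distinct from None/nil
WAIT = _Marker("WAIT")   # stage not ready: mixed NULL and non-NULL inputs

def gate(fn, *inputs):
    """Apply fn under the three conventions above."""
    if all(x is NULL for x in inputs):
        return NULL                      # a) all-NULL inputs: NULL output
    if all(x is not NULL for x in inputs):
        return fn(*inputs)               # b) all data present: compute
    return WAIT                          # c) mixed inputs: do not proceed

add = lambda a, b: a + b

print(gate(add, 2, 3))        # 5     -- all data present, the function fires
print(gate(add, NULL, NULL))  # NULL  -- the reset wavefront propagates
print(gate(add, 2, NULL))     # WAIT  -- incomplete inputs, nothing happens yet
print(gate(add, gate(add, 1, 2), 4))  # 7 -- stages chain and self-synchronize
```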

Anyone care to correct me or Karl? I am by no means a language guru, so I may have a few concepts crossed.
Do you think this could be a foundation for building an automatic-threading compiler?


Dataflow concurrency

Yes, it sounds very similar to dataflow concurrency with streams. This is an old idea -- it dates back to at least the 1970s. It's explained clearly in chapter 4 of CTM.
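
For a concrete flavour, here is a rough sketch of dataflow-style concurrency with a stream. It's plain Python with a blocking queue standing in for Oz's dataflow variables from CTM, so take it as an approximation: the consumer just blocks until the next element of the stream exists, and synchronization falls out of data availability.

```python
import threading
import queue

stream = queue.Queue()          # stands in for a dataflow stream

def producer():
    for x in range(5):
        stream.put(x * x)       # extend the stream as results become available
    stream.put(None)            # end-of-stream marker (a convention of this sketch)

def consumer():
    while True:
        x = stream.get()        # blocks until the element exists
        if x is None:
            break
        print("got", x)

threading.Thread(target=producer).start()
consumer()                      # prints 0, 1, 4, 9, 16 in order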

A similar mechanism exists in

A similar mechanism exists in backward chaining systems. A rule (i.e., a function) will only fire if all the if-clauses are found in the data or can be computed by other rules. Something that can't be computed and is not in the data (i.e., is not a fact) can be treated as NULL or "not found" (or simply nonexistent), as opposed to boolean T or F. When a rule fires (or works), its result becomes a fact in the database. Repeatedly running such a system over time, or under different situations, may eventually get it to work.

Also closely related to forward chaining but with a different spin.
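
Something along these lines (my own toy sketch, not from the talk): a rule fires only when every clause in its body is either a fact or derivable by another rule, and anything else is simply "not found", which plays the NULL role.

```python
facts = {"a", "b"}
rules = {                 # head: list of if-clauses (hypothetical knowledge base)
    "c": ["a", "b"],
    "d": ["c", "e"],      # "e" is neither a fact nor derivable
}

def prove(goal, seen=frozenset()):
    """Backward chaining: True if goal is a fact or all its if-clauses hold."""
    if goal in facts:
        return True
    if goal in seen or goal not in rules:
        return False      # not in the data, no rule: "not found" (the NULL case)
    return all(prove(clause, seen | {goal}) for clause in rules[goal])

print(prove("c"))   # True:  both if-clauses are facts, so the rule fires
print(prove("d"))   # False: "e" is missing, analogous to a missing input
```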

futures?

Is there a difference between the 'Non-Data' variables of the presentation and the idea of futures (where, if a value is not yet available, the computing thread blocks until it becomes available -- my understanding, anyway)?
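
For comparison, this is the future behaviour I have in mind, sketched with plain Python's concurrent.futures; how closely it matches the presentation's 'Non-Data' values is exactly what I'm asking.

```python
from concurrent.futures import Future
import threading
import time

f = Future()                 # placeholder for a value that isn't available yet

def fill_later():
    time.sleep(0.1)
    f.set_result(42)         # the value becomes available

threading.Thread(target=fill_later).start()
print(f.result())            # blocks until the value exists, then prints 42
```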