subjective but hopefully less flamebait-lame

in an attempt to make up for the bad "subjective aside" that happened in a parallel universe and which will now go unmentioned, i'm interested in learning more about deterministic concurrency after reading Peter Van Roy's educational chapter. speaking for myself, i enjoy having constraints that can help me avoid bad code; the linked paper uses model checking as well as deterministic concurrency.

We believe the way forward is to provide more constrained parallel programming models that avoid some of these challenges by construction. While restrictions mean that certain kinds of algorithms are ruled out (or at least very awkward to code), this seems a reasonable price to pay for correct programs.

the subjective thing is that i feel like most industrial developers are not willing to take on some constraints in order to be safer. i find that frustrating, and wonder if/when/what could contribute to changing the mindset from "i'll do anything i darned well please, thank you very much" to "hey, i'm going to be a responsible driver"?

also, since the paper targets the Cell processor, i wonder if the fact that people complain about how hard the PS3 is to develop for could lead some industry folks to become more interested in systems like that of the paper; systems which (to my mind) are chocolate + peanut butter: a development system that makes using big parallelism both safer and easier.

lastly, the results aren't the be-all, end-all. even though the system is helping us be safer and in some tests do better, there is still plenty of work to figure out how to really take advantage of all the possible power. which gets into the good/bad points of relying on a sufficiently smart compiler, another subjective area. i hope that things like GC, as imperfect as it is in many situations, help us work our way towards 'sufficiently smart' systems that really do work well overall.

I think the problem with

I think the problem with languages that place constraints on programs is that it's not much of an excuse for missing deadlines or not delivering working products.

That's because we aren't

That's because we aren't very professional about it...

Every other profession is rife with constraints. And the alleged shortcuts from working without constraints don't seem to be helping us make deadlines.

Most used database in the world: Excel

People love Excel because it lets them do what they want, when they want, how they want.

In addition, they do not care or want to know what end of the gun the bullet comes out of.

There are programming language researchers who want to build the next Excel, besting Pito Salas and coming up with the next thing beyond pivot tables.

As for the quote raould provided and its surrounding context he links to, I'm not sure what to think of it. I think they are effectively saying we shouldn't be using high-level assembly languages to do concurrency. This makes sense when you consider the direction memory systems are going in. (Raould's mention of the Cell processor is a nice example).

However, as for the "sufficiently smart compiler" article, I'll just flat out say it is one of those things where "the solution is obvious, once you understand what the problem is." Yet the realization of that solution is not cheap.

Engineering is mostly about continuously building better bootstrapping techniques. You could not build the house I live in today in the 17th century, certainly not in the blink-of-an-eye time it took to build, either. There are more bootstrapping components available today; the primitives for building houses have changed/expanded into new metaphors.

Broken link

Raoul, the link you've provided to Edwards' paper appears to be broken (it includes both the LtU URL and the URL you presumably intended). The correct link is http://www1.cs.columbia.edu/~sedwards/papers/vasudevan2009celling.pdf

thanks!

i missed the "http://" part in the link, fixed it.

Multiple models of computation?

SHIM looks very much like occam with a C-like syntax, with the exception that its designers deliberately exclude occam's ALTing construct in order to guarantee deterministic execution. The C-like syntax may well help with adoption. Whether developers are willing to be constrained-and-therefore-safe remains to be seen (but I'll note in passing that recent developments in occam have involved removing the constraints that previously mandated static allocation and static topologies).
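
To make the restriction concrete, here is a minimal sketch in Go rather than SHIM (I don't want to misquote SHIM's actual syntax): a single producer and consumer communicate over one point-to-point rendezvous channel, with no nondeterministic choice between senders (no occam ALT, no Go select), so the output cannot depend on how the scheduler interleaves the tasks.

    package main

    import "fmt"

    // A rough illustration of the restricted style: one producer, one consumer,
    // a single point-to-point rendezvous channel, and no choice between
    // alternative senders, so the observable result is fixed by the program.
    func main() {
        ch := make(chan int) // unbuffered: send and receive synchronize

        go func() {
            for i := 0; i < 3; i++ {
                ch <- i * i // producer blocks until the consumer is ready
            }
            close(ch)
        }()

        for v := range ch { // single, statically known peer
            fmt.Println(v) // always prints 0, 1, 4 in that order
        }
    }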

Part of the issue may be a fear on the part of developers of committing to a language that is somehow constrained, only to find further down the track (perhaps because of changing customer requirements) that you really need to do something outside of those constraints. Granted, the same fear applies to (for example) microcontroller selection. But I think software developers are much more used to having flexibility available when they need it than hardware designers are. Languages like Oz, which offer a well-integrated mix of several different models of computation, seem like they could be a win here (assuming that you can ensure that extending a particular program to encompass a less constrained model of computation doesn't break the guarantees provided by those parts of the program in the original model).

syntax + adoption

that theory sounds very plausible to me, and something like Oz makes sense as you say; maybe one could come up with a painful-but-popular C-style syntax for Oz. i assume cilk++ and tbb are getting some 'Greenspun' type traction. i sure wish it were s-exprs that had taken popular root rather than C-style.

if one is a programming language researcher/developer, what makes one choose to go with something more vs. less esoteric looking? does it offend if something solid gets draped in curly-bracket style? since syntax is only skin deep, it seems like it should not be a big deal. maybe we just need a standard front-end for everything, a front-end that supports {}, (), and ML styles?

then after syntax one would see if people are also turned off by the actual ideas in the system, such as promoting determinism.

at the end of the day i pessimistically think that what industry folks really care about isn't reducing bug count or anything, it is to some degree honestly just "can i hire people off the street who can use this language?" / "how hard will it be when i want to outsource this project?"

(which i should perhaps optimistically, a la Graham, see as a competitive advantage to me, should i ever get my angel funding.)

where is SHIM?

hm, trying to find SHIM online doesn't give me much other than PDFs. i haven't found actual code yet?! more's the pity.

I'm not sure industrial programmers object...

to the specific constraints--it's that they object (and the folks that pay their salaries object) to using something they are uncomfortable with, even if it works better.

Many of us in industry have also been burned by numerous alleged silver bullets that fail to deliver on promises of increased code quality or productivity, so we are very wary of claims made about things we are not familiar with. Of course, claims made by commercial tool-peddlers and claims made by research scientists are generally of two different sorts; but it's the commercial tool-peddlers who generally have the ear of application programmers.

Deterministic concurrency and typing

Are you aware of Lucid Synchrone and work on synchronous data-flow programming by Pouzet, Caspi et al.? Lucid Synchrone is a higher-order functional programming language with streams as primitive values, currently built atop OCaml. It uses a special type system, dubbed « clock calculus », to statically reject programs that cannot be executed synchronously. Some interesting properties of synchronous data-flow programming languages:

  • Programs written in these languages can be efficiently compiled to more traditional languages, including very optimized C code for first-order languages.
  • Various extensions to Lucid Synchrone have been proposed, including control/mode automata or more recently a lightweight form of object-orientation, bridging the gap between synchronous imperative languages à la Esterel and synchronous data-flow languages, of a more "functional" flavor.
  • Clock calculi implementing relaxed notions of synchrony use well-known techniques imported from type-systems, e.g. sub-typing.
  • They have made good inroads into the embedded world; see, for example, SCADE.

You should read Marc's papers for more information. Synchronous Kahn Networks (ten years later) gives a good historical account.
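
To give a rough feel for the model, here is a loose sketch in Go (my own illustration, not Lucid Synchrone's syntax or semantics): each node consumes one value from every input and produces one output per logical tick, so all streams advance in lock step. The point of the clock calculus is to reject mismatched clocks at compile time, whereas a sketch like this only notices them at run time.

    package main

    import "fmt"

    // produce turns a finite list of values into a stream, one value per tick.
    func produce(vals []int) <-chan int {
        out := make(chan int)
        go func() {
            for _, v := range vals {
                out <- v
            }
            close(out)
        }()
        return out
    }

    // add is a data-flow node: each tick it reads one value from each input
    // stream and emits their sum, so both inputs must be on the same "clock".
    func add(xs, ys <-chan int, out chan<- int) {
        for x := range xs {
            y, ok := <-ys
            if !ok {
                break // clock mismatch: here it only shows up at run time
            }
            out <- x + y
        }
        close(out)
    }

    func main() {
        sums := make(chan int)
        go add(produce([]int{1, 2, 3}), produce([]int{10, 20, 30}), sums)
        for s := range sums {
            fmt.Println(s) // 11, 22, 33 -- deterministic, tick by tick
        }
    }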

driving ...

if/when/what could contribute to changing the mindset from "i'll do anything i darned well please, thank you very much" to "hey, i'm going to be a responsible driver"?

There are always going to be these two factions. It has been shown that automatically driven cars can make roads safer and more efficient, but nobody at the moment is willing to give up the steering wheel. Even in the fully automatic case, you can expect to see demand for a manual override. Driving isn't fun otherwise :)

It has been shown?

The DARPA Urban Challenge was only a semi-realistic driving situation, and the cars didn't go quite as fast as human drivers could have.

I have little doubt that eventually, yes, computer driven cars could make roads safer and more efficient, but a big hurdle is liability. Even if computer drivers are safer, and lives on the whole are saved, the first time a computer-driven car is involved in a fatal crash... oh my.

The Washington DC Metro system

is getting lots of flak over the recent Red Line crash, involving automatically-driven trains. It appears that the cause of the crash was a failed sensor, not a software fault or an operator error--but society certainly places higher expectations on automated systems than it does on ones with a human at the controls.

Expectations

society certainly places higher expectations on automated systems than it does on ones with a human at the controls

Hmm...I wonder if that plays into raould's original comment? That is, our expectations are lowered in a human-driven system relative to a more-automated system; we're willing to put up with a certain level of bugginess in "hand-written" code, but heavily "assisted" programming systems are sometimes denigrated for much lower-impact faults.

Perhaps we're a bit frightened that if we embrace all the constraints and assists, we'll discover that programming is even harder than we thought -- we'll trade all the low-level stumbling we currently do (and know how to make excuses for) for higher-level mistakes that will be harder to explicate.

I don't think so

Have you ever heard of Dr. W. Edwards Deming? He was one of the two founding fathers of statistical process control theory. Many statisticians know his name, but other fields often don't.

He argues that quite often management places unrealistic expectations on subordinates, often expecting better performance from workers than is statistically possible. At some point, assessing 'good performance' needs to be put in the context of what some quality control measure allows. Past a certain point, errors will simply occur in any chaotically ordered system. The Red Bead Experiment is his classic example.
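
A tiny simulation (my own sketch, with made-up numbers rather than Deming's actual bead counts) illustrates the point: give several 'workers' the identical task of drawing 50 beads from the same lot, 20% of which are red 'defects', and their scores still spread out. The variation comes from the system, not from effort or skill.

    package main

    import (
        "fmt"
        "math/rand"
    )

    func main() {
        const draws = 50  // beads drawn per worker
        const pRed = 0.20 // fraction of red ("defective") beads in the lot

        // Six workers do exactly the same job with exactly the same lot,
        // yet their "performance" differs purely by chance.
        for worker := 1; worker <= 6; worker++ {
            red := 0
            for i := 0; i < draws; i++ {
                if rand.Float64() < pRed {
                    red++
                }
            }
            fmt.Printf("worker %d: %d red beads\n", worker, red)
        }
    }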

I would state my own opinions on the matter, but Deming has corrupted my thinking as a statistician and software designer/product manager. Thus, I just point to him.

sounds plausible to me

i sure dislike the low-level hand-wringing stuff that is the bread and butter in industry; i'd be a lot happier if the bug reports we all work through were at those more rarefied levels.