I've stumbled across something in my compiler work and wondered if anyone has run into anything similar, or knows of any existing literature about it.
The problem I'm working on is dealing with the template bloat problem in an AOT language, without resorting to type erasure. I know that there are some compilers which, during the code generation phase, will merge functions and data structures which have identical machine representations, but I'm attempting to do this in a more formal and abstract way, and at an earlier stage of compilation.
In mathematics, a differential is a description of how the output of a function varies with respect to a parameter. In my scheme, a "type differential" is a description of how the meaning, behavior, and machine representation of a function changes with respect to changes in a template parameter. An analysis of the expression tree for a function yields a set of these differentials. Once you know all of the differentials for a function, you can then easily compute whether any of the template arguments can be weakened, creating a more generalized version of the function that has the same meaning.
A trivial example is the "CallingConvention" differential. Let's say we have a generic function F which has a generic type parameter T, and which takes a function argument also of type T. Different values of T will have different machine representations, which changes the function signature; however, many different types have identical machine representations, so the set of possible machine signatures is much smaller. On the other hand, if we change the type of the parameter from T to ArrayList[T], things are a little different: since ArrayList is a reference type, it's always passed as a pointer, so the value of T no longer has any effect on the function signature at the machine level. So CallConv[ArrayList[T]] == CallConv[Object].
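A minimal sketch of the idea, in Python pseudocode (the type names and the `machine_repr`/`call_conv` helpers are hypothetical, not from the post): map each template argument to the machine-level passing convention it induces, and observe that reference types all collapse to a single convention.

```python
def machine_repr(ty):
    """Return a token describing how a value of type `ty` is passed.
    Reference types are always passed as a pointer, so they share one repr."""
    REFERENCE_TYPES = {"ArrayList", "Object", "String"}
    by_value = {"int32": "i32", "float32": "f32", "int64": "i64"}
    base = ty.split("[")[0]           # "ArrayList[int32]" -> "ArrayList"
    if base in REFERENCE_TYPES:
        return "ptr"                   # CallConv[ArrayList[T]] == CallConv[Object]
    return by_value.get(ty, ty)

def call_conv(param_types):
    """The calling-convention 'differential' of a function signature."""
    return tuple(machine_repr(t) for t in param_types)

# F[T](x: T) specialized at int32 vs float32 gives different signatures...
assert call_conv(["int32"]) != call_conv(["float32"])
# ...but G[T](xs: ArrayList[T]) is invariant in T, so T can be weakened:
assert call_conv(["ArrayList[int32]"]) == call_conv(["ArrayList[Object]"])
```

Specializations whose `call_conv` tuples coincide could then share one compiled body.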
Other examples of type differential objects include LocalVariable, InstanceOfTest, InstanceMethodCall, FieldOffset, and so on. In some cases, the presence of a differential can be used to decide whether the compiled code should contain a static reference to a type identifier, or whether the type identifier should be passed at runtime as a hidden field or parameter.
Java's type erasure can be thought of as a system in which the set of type differentials for all functions is the empty set.
Now, what's interesting about this is that I realized that there's a kind of algebra of differentials: There are operators that can be used to sum and compose them. For example, if I have a composition of two templates, it's possible to compute the differential of the result directly by composing the differentials of the individual pieces.
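One way to picture the composition operator (a hedged sketch; the representation of a differential as a plain function is my assumption, not the post's): model a differential as a map from a template argument to an observable (here, a parameter's machine repr). Composing templates then composes differentials, and a constant differential absorbs whatever it is composed with.

```python
CONSTANT = lambda _ty: "ptr"        # differential of a reference wrapper: invariant in T
IDENTITY = lambda ty: ty            # differential of a by-value parameter of type T

def compose(outer, inner):
    """Differential of the composite template outer[inner[T]],
    computed directly from the differentials of the pieces."""
    return lambda ty: outer(inner(ty))

# e.g. wrapping a by-value parameter in ArrayList[...]:
d = compose(CONSTANT, IDENTITY)
# The composite is constant, so the T argument can be weakened away:
assert d("int32") == d("float64") == "ptr"
```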
Anyway, I'm sure that there's probably some pre-existing term for this - which I'd be able to search for if I knew what to name it.
The Programming Language Wars: Questions and Responsibilities for the Programming Language Community, by Andreas Stefik & Stefan Hanenberg
A presentation of the results of a number of randomized controlled trials studying the effects of programming language designs on users.
Stefik spends a great deal of time on the need for the language-design community to run more of these kinds of trials, and on why he thinks they are so necessary.
Among the conclusions: languages shouldn't be created or changed without this kind of research to back up the ideas.
Only half the interesting detail is in the slides.
Questions start at 34:00.
The Perl vs. Java research was previously discussed as Perl vs. Random Syntax here on LtU.
Types are fundamental to both logic and computation:
The issue of types arose in another LtU discussion where it seemed tangential, but since the topic is fundamental it is worth discussing in its own right: there seems to be a great deal of confusion and controversy among both theoreticians and practitioners.
Our programming tools should be more democratic, egalitarian, said (Dr.? Mr.? I was assuming Dr.) Edwards a while back. (I, a peon, have great respect for and mostly agree with him, of course, utterly seriously.)
What I see happen in the real world is something like: (1) hack something up to get it working asap; (2) oh no we got actual customers [remember the famous quote about not being able to change Makefile syntax?] we'd better keep on truckin' with this since they want new features and are already bought in to how it works today; (3) who the hell wrote this crap, I can't possibly maintain it! [remember the famous death of Netscape?]
So if we hand people Visual Basic, they might make something great that would never have existed before. BUT what kind of rat's nest of hell code will they produce, and what zillions of crazy bugs will they have, and so on.
If that is a real problem somewhere in the world sometimes, how best to address it? Rewrite it? Outsource/offshore rewriting it? Sell it to CA, Inc.?
Or something else? What could we have in our programming languages and ecosystems that would let the masses have their cake and eat it, too?
Ignorant question: might anybody be able to say succinctly where we are vis-à-vis effects, continuations, and monads? Do we have languages where we can have fun managing effects? Did we give up on the monadic approaches to this? I am too completely out of my depth / behind the times in terms of this research. As a Joe Programmer in the Street, I just want it now and in a usable fashion :-)
E.g., many years back, Filinski's thesis seemed to describe a nice way to get all this via monads in Standard ML.
Did that turn out to not have good usability or something? I have been trying to skim things related / citing that stuff, but it is all over my head at this point. And I don't yet see if/how/when any such things are reified in a language I can use.
Based on Engelbart's famous "mother of all demos" in 1968 that showed off multiple input and output technologies, what could a MOD for PL look like today? Did we already see one with Bret Victor's work, or do we need something more?
There is an ongoing controversy with Kowalski about the definition of "Logic Program":
* A logic program is one in which each computation step can be logically inferred.
* A logic program is written in logical clausal form and procedurally interpreted using forward and backward chaining (which was first done in the Planner programming language).
See the following references:
Visual programming has been an unfulfilled prophecy for years. As with so many other areas, like virtual reality, artificial intelligence, or speech recognition, when the hype was high the underlying technology wasn't there yet.
But that was not its only problem…
Feedback would be appreciated.
Please see Video
In carrying out the above work, it is helpful to distinguish between the following:
* The Actor Model (which can be used to directly model all physically possible computation) is being increasingly used in industrial products, e.g., eBay, Microsoft, Twitter, etc.
* Coherent many-core (soon to be every-word-tagged) architectures, for which Actor concurrency is ideally suited.
Specific examples are discussed of how to use ActorScript and the Actor Model for engineering large-scale information systems in datacenters and for the Internet of Things. In particular, Actors can help avoid many pitfalls and problems that are commonly encountered.
Technical and engineering aspects of Actors for CyberThings even bear on possible resolutions of the current FBI/NSA proposal for mandatory backdoors in the Internet of Things.
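For readers new to the model, the core discipline is small enough to sketch in a few lines of Python (a toy of my own, not ActorScript): each actor owns a mailbox and processes one message at a time on its own thread, so its state is never shared and never needs locks.

```python
import queue
import threading

class Actor:
    def __init__(self):
        self.mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, msg):
        self.mailbox.put(msg)          # asynchronous, one-way message send

    def _run(self):
        while True:
            msg = self.mailbox.get()   # messages are handled serially, in order
            self.receive(msg)

class Counter(Actor):
    """State is confined to the actor's own thread; no locks needed."""
    def __init__(self):
        self.count = 0
        self.done = threading.Event()
        super().__init__()

    def receive(self, msg):
        if msg == "inc":
            self.count += 1
        elif msg == "report":
            self.done.set()

c = Counter()
for _ in range(1000):
    c.send("inc")
c.send("report")                       # queued after all the increments
c.done.wait()
assert c.count == 1000                 # no race: the mailbox serializes access
```

The same one-mailbox-per-actor shape is what industrial actor systems scale out across machines.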
Please see the following background reading: