archives

HipHop: Facebook runs compiled PHP on its servers

While PHP deservedly gets a terrible rep among programming-language folks, this is still an interesting announcement: HipHop compiles PHP down to C++ and gets about a 2x speedup. HipHop will be released as open source and is currently in production use, serving 90% of Facebook's traffic. It makes me wish Facebook used Python: a large-scale deployment like this would be a great boon to the PyPy project.

Qi4J released: OO done right?

Apparently Qi4J has been released (Qi4J was previously noted very briefly on LtU here).

Qi4j is a significant step forward for the Java platform, with its focus on Domain Driven Design and support for the Data-Context-Interaction paradigm that is currently being worked on by Trygve Reenskaug as a way to revive object-orientation.

See the tutorials to get a feel for it, I guess.

So, with the understanding that algorithms should define roles for collaborating objects, and that objects should then implement these roles in order to participate in those algorithms, it should be trivial to realize that what has passed for OOP so far, in terms of "class-oriented programming", where this role-focus of objects is difficult if not impossible to achieve, is just plain wrong. I mean seriously, catastrophically, terminally wrong.
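For readers unfamiliar with DCI, here is a minimal sketch of the idea in Python, using the classic money-transfer example: the context (the use case) owns the algorithm and casts plain data objects into roles. All names are mine, purely for illustration; Qi4j's actual API is Java-based and quite different.

```python
# A minimal DCI-flavored sketch: the algorithm (the Context) defines
# the roles; plain data objects are cast into them only for the
# duration of the interaction. All names here are illustrative.

class Account:
    """Dumb data object: carries state, owns no transfer logic."""
    def __init__(self, balance):
        self.balance = balance

class TransferContext:
    """The use case owns the algorithm and assigns the roles."""
    def __init__(self, source, sink):
        self.source = source  # plays the MoneySource role
        self.sink = sink      # plays the MoneySink role

    def transfer(self, amount):
        # Role behavior lives in the context, not in the data class.
        if self.source.balance < amount:
            raise ValueError("insufficient funds")
        self.source.balance -= amount
        self.sink.balance += amount

a, b = Account(100), Account(50)
TransferContext(a, b).transfer(30)
print(a.balance, b.balance)  # 70 80
```

The point of the pattern is that `Account` could play either role in this interaction, and other roles in other contexts, without its class having to anticipate any of them.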

Formal treatments (or examples of) of "function concatenation"?

We're familiar with the notion of function composition which is essentially sticking functions end-to-end:

(a -> b) -> (b -> c) -> (a -> c)

However, is there any formal treatment of function "concatenation" for functions that operate on n-tuples that would be essentially sticking functions "side-by-side"? The type would be something like this (where a, b, c, and d are n-tuples and ++ is the concatenation operator):

(a -> b) -> (c -> d) -> (a ++ c) -> (b ++ d)
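In a language with first-class tuples the operation is easy to write down directly. Here is a sketch in Python, treating n-tuples as ordinary tuples; since plain Python functions don't carry their arity, it is passed explicitly to decide how to split the input:

```python
def concat(f, f_arity, g):
    """Stick f and g 'side by side': the first f_arity components of
    the input tuple go to f, the rest go to g, and the two output
    tuples are concatenated."""
    def h(xs):
        return f(xs[:f_arity]) + g(xs[f_arity:])
    return h

# F : 1-tuple -> 2-tuple and G : 2-tuple -> 1-tuple, as in the
# diagrams below.
F = lambda xs: (xs[0], xs[0] + 1)
G = lambda xs: (xs[0] * xs[1],)

FG = concat(F, 1, G)
print(FG((3, 4, 5)))  # (3, 4, 20)
```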

I think a visual example allows a good intuition for what I'm talking about. If F is a function from a 1-tuple to a 2-tuple, and G is a function from a 2-tuple to a 1-tuple, then their composition would be this:

     IN
      |
   .--o--------.
   |     F     |
   |--o-----o--|
      |     |
   .--o-----o--.
   |     G     |
   |--o--------|
      |
     OUT

and their concatenation would be this:

     IN          IN    IN
      |           |     |
   .--o--------.--o-----o--.
   |     F     |     G     |
   |--o-----o--|--o--------|
      |     |     |
     OUT   OUT   OUT

Composition indicates a dependence (e.g. G is dependent on the output of F) that requires sequential evaluation whereas concatenation indicates an independence (e.g. F and G do not share inputs) that allows for parallel evaluation. Both operations are associative.

There exist operations with a similar spirit to the concatenation expressed above: the functional form of construction in Backus's FP, the (***) operator for arrows (and thus plain functions) in Haskell, and the family of spread combinators in Factor, to name a few. However, none of these deal with concatenated inputs/outputs, and none of them are associative, so they're not quite what I'm after.
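To illustrate the associativity point: Haskell's (***) builds nested pairs, so the two groupings even have different types (((a, c), e) versus (a, (c, e))), whereas with flat tuples the grouping genuinely doesn't matter. A quick self-contained check in Python, again splitting the flat input tuple by an explicitly supplied arity:

```python
# With flat tuples, concatenation is associative: (f ++ g) ++ h and
# f ++ (g ++ h) denote the same function on the same flat tuples.

def concat(f, f_arity, g):
    # Split the flat input tuple by f's (explicitly supplied) arity.
    def h(xs):
        return f(xs[:f_arity]) + g(xs[f_arity:])
    return h

f = lambda xs: (xs[0] + 1,)   # 1-tuple -> 1-tuple
g = lambda xs: (xs[0] * 2,)   # 1-tuple -> 1-tuple
h = lambda xs: (xs[0] - 3,)   # 1-tuple -> 1-tuple

left  = concat(concat(f, 1, g), 2, h)   # (f ++ g) ++ h
right = concat(f, 1, concat(g, 1, h))   # f ++ (g ++ h)

xs = (10, 20, 30)
print(left(xs), right(xs))  # (11, 40, 27) (11, 40, 27)
```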

Any pointers towards related mathematical notions or languages with a similar feature would be much appreciated.

Extreme non-choosiness

As a mathematician who's quite new to type theory, I have only vaguely internalised the fact that forall a.a and Pi a.a (or whatever the appropriate ASCII-isations are) mean the same thing. Thus, although it seems obvious to me that forall a.a is empty (or does one say uninhabited?), I was nonetheless quite startled to read the following in Barendregt's "Introduction to generalised type systems", p. 132:

… write \bot \equiv (\Pi \alpha : *. \alpha), which is the second-order definition of falsum.

I tried to re-cast this in language more familiar to me, and wound up with the statement that the product of all sets is empty.

Now, I know that type theories tend (understandably) to be biased more towards constructivist than traditional ZFC-based axiomatisations; but it seems to me that, beyond just saying that we don't assume the Axiom of Choice, this statement is saying that we take that axiom as the definition of ‘false’! Is the rejection of choosiness really so definitive, or am I just skipping over some point (like, say, that some sets are empty, so that including them in the product will naturally make it, too, empty)?
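For what it's worth, the parenthetical guess at the end seems to be the heart of it, independently of choice: the product ranges over all types, including empty ones, so it must itself be empty. Spelled out (with 0 for the empty type):

```latex
% An inhabitant of \bot could be instantiated at any type, in
% particular at the empty type, yielding a closed term of type 0:
t : \Pi \alpha{:}*.\,\alpha \;\Longrightarrow\; t\,\mathbf{0} : \mathbf{0}
% Since 0 is uninhabited, no such t exists. Conversely, \bot entails
% everything (ex falso quodlibet):
\lambda t{:}\bot.\; t\,B \;:\; \bot \to B \qquad \text{for any } B : *
```

So the definition encodes the usual logical reading of falsum (it implies everything, and has no proof) rather than any stance on the Axiom of Choice.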

Monads in Action

Monads in Action, Andrzej Filinski, POPL 2010.

In functional programming, monadic characterizations of computational effects are normally understood denotationally: they describe how an effectful program can be systematically expanded or translated into a larger, pure program, which can then be evaluated according to an effect-free semantics. Any effect-specific operations expressible in the monad are also given purely functional definitions, but these definitions are only directly executable in the context of an already translated program. This approach thus takes an inherently Church-style view of effects: the nominal meaning of every effectful term in the program depends crucially on its type.

We present here a complementary, operational view of monadic effects, in which an effect definition directly induces an imperative behavior of the new operations expressible in the monad. This behavior is formalized as additional operational rules for only the new constructs; it does not require any structural changes to the evaluation judgment. Specifically, we give a small-step operational semantics of a prototypical functional language supporting programmer-definable, layered effects, and show how this semantics naturally supports reasoning by familiar syntactic techniques, such as showing soundness of a Curry-style effect-type system by the progress+preservation method.

The idea of monadic reflection was one I never felt I really understood properly until I read this paper, so now I'll have to go back and re-read some of his older papers on the subject.
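The "denotational" reading described in the abstract, where an effectful program is expanded into a larger pure program, can be made concrete with a state monad. Here is a sketch in Python, purely as illustration (the paper's actual language and semantics are of course much richer):

```python
# Denotational reading of an effect: the 'effectful' program is
# translated into pure functions that thread the state explicitly.
# A state-monad computation is a pure function: state -> (value, state).

def unit(x):
    return lambda s: (x, s)

def bind(m, f):
    # Sequencing is pure function plumbing; nothing is mutated.
    def m2(s):
        x, s1 = m(s)
        return f(x)(s1)
    return m2

get = lambda s: (s, s)                        # read the state
put = lambda s_new: (lambda s: (None, s_new)) # replace the state

# The effectful pseudo-program  x <- get; put (x + 1); return x
# expands into this pure term:
tick = bind(get, lambda x: bind(put(x + 1), lambda _: unit(x)))

print(tick(41))  # (41, 42)
```

Filinski's operational view instead gives `get` and `put` their own small-step rules directly, so the imperative behavior needs no such whole-program expansion.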

Course focusing on JIT compilers?

Is anybody aware of university courses (or textbooks) with a focus on the design of just-in-time (JIT) compilers? (Basically, as an alternative to a standard compiler-construction course.)

In the last few years there has been some interesting research on JITs (much of it mentioned here on LtU), e.g., the JavaScript JITs (TraceMonkey, Google's V8), the PyPy JIT, and LuaJIT; the recent story about verified JITs and Factor's compilation strategy might also be of interest.

So far, I have not found any course focusing on JITs. At most, some compiler-construction courses mention JIT technology (usually Java's) in passing. The textbook situation seems equally sparse.