LtU Forum

techniques for JIT (parallel?) compilation of straight line numerical code

Reading A Methodology for Generating Verified Combinatorial Circuits, which involves some code generation/staging and AST optimization, I found myself wondering whether there are libraries for just-in-time compilation (and then dynamic loading/linking) of simple numerical code. C seems like a decent generation target: it's fairly assembly-like, and its later incarnations (C99?) have fairly portable support for low-level types and operations, e.g. 80-bit floating-point values.

Further, although I'm tempted to write everything in one language (currently C++), since that has made interoperability trivial, there's a reasonable argument that the overhead of launching an optimizing C compiler gives you the leeway to, say, invoke MetaOCaml routines that do the code-generation magic (never having embedded another language in a C++ program, or called wrapped C++ from another language, I'm not sure how expensive or hard this is).

Does anyone have good references (to theory or practical implementations) or advice? Even the basic technique needed to, say, compile and dynamically load/link C code into another language (while probably somewhat platform-specific) is foreign to me :)
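
From what I can gather, the most basic form of the technique on POSIX-ish systems is simply: write the generated C to a file, shell out to the C compiler to produce a shared object, then dlopen/dlsym it back into the running process. A minimal sketch (the kernel function, file names and expression here are invented, and error handling is mostly omitted):

// Build and run with something like: g++ jit_sketch.cpp -ldl && ./a.out
#include <cstdio>
#include <cstdlib>
#include <dlfcn.h>

int main() {
    // 1. Emit straight-line C for the computation (normally produced by the driver).
    FILE* src = std::fopen("kernel.c", "w");
    if (!src) return 1;
    std::fprintf(src,
        "double kernel(const double* p) {\n"
        "    double t0 = p[0] * p[1];\n"
        "    return t0 + p[2] * t0;\n"
        "}\n");
    std::fclose(src);

    // 2. Shell out to an optimizing C compiler to build a shared object.
    if (std::system("gcc -O2 -shared -fPIC kernel.c -o kernel.so") != 0) return 1;

    // 3. Load the shared object into this process and look up the entry point.
    void* lib = dlopen("./kernel.so", RTLD_NOW);
    if (!lib) return 1;
    typedef double (*kernel_fn)(const double*);
    kernel_fn kernel = (kernel_fn)dlsym(lib, "kernel");
    if (!kernel) return 1;

    // 4. Call the freshly compiled code directly; no marshalling involved.
    double params[3] = {2.0, 3.0, 4.0};
    std::printf("kernel(params) = %g\n", kernel(params));   // 2*3 + 4*(2*3) = 30

    dlclose(lib);
    return 0;
}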

I'm not at all committed to C as the code generation target; I just want something that compiles and optimizes loopless, straight-line numerical computation well, with bonus points if it can split up the computation so that it runs in a distributed, parallel fashion (N CPUs per node, local memory on each node, relatively expensive network communication). It would also be great if the code could be reordered to take advantage of cache locality and the processor's parallel numeric (SIMD) capabilities (though of course this should be handled by the compiler rather than addressed directly by the person generating the numerical code). I have no idea which compilers can do such analyses (Intel's C compiler might be good) or how to encourage them to do so; by default I use gcc, which I understand is somewhat basic in this respect.

My intended application is repeated Expectation-Maximization (EM) parameter optimization; the computation that accumulates the counts for each parameter as a function of the previous iteration's counts can be compiled into branchless, optimized form (sharing common subexpressions, etc.) and then run a few hundred times. Thus I want to compile and load the compiled code on the fly (the generated code of course depends on the input automata I'm estimating parameters for). For historical reasons, my work so far has been entirely in C++, but obviously, switching to code generation/compilation would let me use any language as the driver, as long as it can access the results of the compiled code without expensive marshalling or interprocess communication.
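
To make the common-subexpression point concrete, here is a toy sketch (all names invented) of the sort of emitter I have in mind: the driver builds an expression DAG and straight-line C is printed from it, with identical (op, lhs, rhs) triples hash-consed so each shared subexpression is computed only once:

#include <cstdio>
#include <map>
#include <string>
#include <tuple>
#include <vector>

struct Emitter {
    std::vector<std::string> lines;                      // one C statement per temp
    std::map<std::tuple<char, int, int>, int> cache;     // (op, lhs, rhs) -> temp id

    // Reference input parameter p[i]; returns the id of the temp holding it.
    int input(int i) {
        int id = (int)lines.size();
        lines.push_back("double t" + std::to_string(id) + " = p[" + std::to_string(i) + "];");
        return id;
    }

    // Binary operation on two existing temps, hash-consed for CSE.
    int op(char o, int a, int b) {
        auto key = std::make_tuple(o, a, b);
        auto it = cache.find(key);
        if (it != cache.end()) return it->second;        // reuse the shared temp
        int id = (int)lines.size();
        lines.push_back("double t" + std::to_string(id) + " = t" + std::to_string(a) +
                        " " + o + " t" + std::to_string(b) + ";");
        cache[key] = id;
        return id;
    }

    // Print the generated branchless, straight-line kernel.
    void emit(int result) const {
        std::printf("double kernel(const double* p) {\n");
        for (const std::string& l : lines) std::printf("    %s\n", l.c_str());
        std::printf("    return t%d;\n}\n", result);
    }
};

int main() {
    Emitter e;
    int x = e.input(0), y = e.input(1);
    int a = e.op('+', e.op('*', x, y), x);   // builds and emits x*y
    int b = e.op('+', e.op('*', x, y), y);   // x*y is found in the cache, not re-emitted
    e.emit(e.op('*', a, b));                 // kernel computes (x*y + x) * (x*y + y)
}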

(apologies if this is more a compilers/tools than a language question)

IronPython: First public version released

IronPython is a new implementation of Python for the CLR. The recently released version 0.6 is its first public version.

Mind the Gap

I am very happy that I found GAP, as it gave me the opportunity to make a bad pun; I just had to try it when I read their description of how their web site had been organized.

I think the Grouping capabilities are pretty cool.

Vyper is missing

In case any language collector around here still has a copy of Vyper (the Python interpreter written in OCaml by John Skaller that used to live at http://vyper.sourceforge.net/) lying around: don't delete it. It may be the last existing copy, since neither the author nor SourceForge still has one.

(So much for the permanence of electronic records in the internet age...)

A Functional Semantics of Attribute Grammars

A Functional Semantics of Attribute Grammars

A definition of the semantics of attribute grammars is given, using the lambda calculus. We show how this semantics allows us to prove results about attribute grammars in a calculational style.

I hadn't encountered attribute grammars since school, so it was refreshing to see them in this light.
I have yet to check whether the "abstract interpretation" used in this paper is related to the one in Oleg's paper.
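
Not having read the paper yet, my rough mental picture of a "functional semantics" is that each attribute simply becomes a function of a tree node and of that node's inherited attributes, with one defining equation per production. A tiny sketch of that reading (the grammar and all names are mine, not the paper's, and it's in C++ rather than the lambda calculus):

#include <cstdio>
#include <map>
#include <memory>
#include <string>

// A tiny let/add expression grammar, one node kind per production.
struct Expr;
using P = std::shared_ptr<Expr>;

struct Expr {
    enum Tag { NUM, VAR, ADD, LET } tag;
    int num = 0;            // NUM
    std::string name;       // VAR, LET
    P lhs, rhs;             // ADD
    P bound, body;          // LET: let name = bound in body
};

using Env = std::map<std::string, int>;   // the inherited attribute

// The synthesized attribute "val" as a function of a node and its inherited
// attribute "env"; each case is the semantic equation of one production.
int val(const Expr& e, const Env& env) {
    switch (e.tag) {
        case Expr::NUM: return e.num;
        case Expr::VAR: return env.at(e.name);
        case Expr::ADD: return val(*e.lhs, env) + val(*e.rhs, env);
        case Expr::LET: {
            Env inner = env;                        // inherited attribute of the body
            inner[e.name] = val(*e.bound, env);
            return val(*e.body, inner);
        }
    }
    return 0;
}

// Small helpers to build trees.
P num(int n)                { auto e = std::make_shared<Expr>(); e->tag = Expr::NUM; e->num = n; return e; }
P var(const std::string& x) { auto e = std::make_shared<Expr>(); e->tag = Expr::VAR; e->name = x; return e; }
P add(P a, P b)             { auto e = std::make_shared<Expr>(); e->tag = Expr::ADD; e->lhs = a; e->rhs = b; return e; }
P let(const std::string& x, P a, P b) { auto e = std::make_shared<Expr>(); e->tag = Expr::LET; e->name = x; e->bound = a; e->body = b; return e; }

int main() {
    auto prog = let("x", num(3), add(var("x"), var("x")));   // let x = 3 in x + x
    std::printf("%d\n", val(*prog, Env{}));                  // prints 6
}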

Mozart Oz 1.3.1 Released

Mozart Oz 1.3.1 final has been released for all platforms.

The Sphere Online Judge

This site will automatically judge solutions to 121 problems in 21 languages.

Some interesting forum posts:

Type systems and software evolution

Since the "why are type systems interesting" thread is getting so long, I thought I'd start a new thread on this somewhat more specific issue.

I agree with Frank that software development is concerned with building artefacts (i.e., programs, program modules, classes, libraries and whatnot) about which we can reason statically, which seems to me tantamount to saying that we must be able to understand the product we are building, not just be able to run it.

But let's back up a bit and consider the problematic issue of gathering software requirements. This is one of the issues software engineering is struggling with. We aren't very good at establishing what the real customer requirements are, and even when we do, customers change their minds or want more features in the next release. The iterative life cycle model isn't only (or even primarily) about fixing bugs; it's about evolving software per customer requirements.

Now let's assume a best-case scenario, where we have consistent, even formal, requirements, and we design a software system that fulfills them. We can reason statically, with the aid of a static type system, and convince ourselves that the software does what's expected of it.

Now comes the next cycle, because users want changes and additional features ("show longer names on the tracker page", "enable keywords for comments", etc.). Some would argue that unless we are talking about trivial changes, we should start from scratch and redesign the system. For various reasons, most of them obvious, this isn't what usually happens. We modify the existing system, and techniques like regression testing help us make sure we haven't caused too much damage.

Now let's consider whether the type system helps us with this sort of activity. Obviously, types give the same static guarantees for the changed program that they give any type-correct program. But can the type system do more? It's likely that some of the changes require changing the types we use in our system. Which type systems help us make such changes locally? Do types introduce unnecessary coupling into the system, making evolution harder? How about support for (sound) refinement, which would allow checking that changes are consistent with the prior specification, while not propagating the changes through the system in such a way that the impact of the change becomes unmanageable?
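
To make the coupling question concrete, here is a small invented illustration (in C++, but the point is language-neutral): when a type evolves, the type checker flags every client that depended on the old representation, which is both the help (the impact is found) and the worry (the impact may not be local).

#include <string>

// Evolution step: the tracker item's identifier used to be a plain int
// (say "using ItemId = int;") and has become a structured type.
struct ItemId { std::string value; };

struct TrackerItem {
    ItemId id;
    std::string name;
    ItemId get_id() const { return id; }      // narrow, stable interface
};

// Coupled client: it used the representation directly, so its old body
// (something like "return item.id + 1;") stopped type-checking when ItemId
// changed and had to be rewritten. The type checker finds every such site,
// but every such site is also part of the cost of the change.
std::string coupled(const TrackerItem& item) { return item.id.value; }

// Decoupled client: written only against get_id(); it type-checked before
// the change and still does, so for it the change stayed local.
ItemId decoupled(const TrackerItem& item) { return item.get_id(); }

int main() {
    TrackerItem item{ItemId{"TRK-42"}, "show longer names on the tracker page"};
    return coupled(item) == decoupled(item).value ? 0 : 1;
}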

I think these sorts of questions are worth considering. They may help us design better type systems, which would address a real need.

Eclipse C/C++ Dev Tools 2.0 Released

It's a milestone season for Eclipse. The Eclipse C/C++ Development Tools project has released CDT 2.0 final for Eclipse 3.0. They posted some nice CDT screenshots. There are screenshots for Eclipse, too.

Having trouble with 'purity'

I am having trouble wrapping my mind around the semantics of a pure functional programming language. In writing my current programming language I chose to treat the variable assignment expression:

x = 5

as the creation of a function named x.

x = 5 is semantically equivalent to:

x = fn () -> 5

Is this sort of 'variable assignment' considered pure in the FP sense?

NOTE: if x (in the above example) were to be redefined, a new closure would be created with the associated value.
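
In C++ terms, just for illustration (my language's syntax differs, and the names here are mine), the encoding I mean behaves like this:

#include <cstdio>
#include <functional>

int main() {
    std::function<int()> x = [] { return 5; };   // x = fn () -> 5
    std::printf("%d\n", x());                    // a use of x is a call: prints 5

    auto old_x = x;                              // code that already grabbed x...
    x = [] { return 7; };                        // "redefining" x builds a new closure
    std::printf("%d %d\n", old_x(), x());        // ...still sees 5; prints "5 7"
}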

Sorry to ask such a trivial question, but I am having trouble finding semantic discussions concerning simple functional programming idioms.

Best regards,

M.J. Stahl
