Cross-language runtimes

SIGPLAN's first Programming Languages Software Award goes to LLVM

ACM Press Release:

The ACM Special Interest Group on Programming Languages (SIGPLAN) today presents its first-ever Programming Languages Software Award to Chris Lattner of Apple Inc. for his design and development of the Low Level Virtual Machine (LLVM), a compiler infrastructure that has been quickly adopted by a wide array of industry and academic organizations. Since LLVM’s release as an open source compiler infrastructure in October 2003, companies including Apple, Adobe, and Cray have incorporated it into their commercial products, reflecting its simplicity, flexibility, and versatility.

VMKit: a Substrate for Managed Runtime Environments, VEE '10

VMKit: a Substrate for Managed Runtime Environments, VEE '10

Managed Runtime Environments (MREs), such as the JVM and the CLI, form an attractive environment for program execution, by providing portability and safety, via the use of a bytecode language and automatic memory management, as well as good performance, via just-in-time (JIT) compilation. Nevertheless, developing a fully featured MRE, including e.g. a garbage collector and JIT compiler, is a herculean task. As a result, new languages cannot easily take advantage of the benefits of MREs, and it is difficult to experiment with extensions of existing MRE based languages.

This paper describes and evaluates VMKit, a first attempt to build a common substrate that eases the development of high-level MREs. We have successfully used VMKit to build two MREs: a Java Virtual Machine and a Common Language Runtime. We provide an extensive study of the lessons learned in developing this infrastructure, and assess the ease of implementing new MREs or MRE extensions and the resulting performance. In particular, it took one of the authors only one month to develop a Common Language Runtime using VMKit. VMKit furthermore has performance comparable to the well established open source MREs Cacao, Apache Harmony and Mono, and is 1.2 to 3 times slower than JikesRVM on most of the DaCapo benchmarks.

So... one person built a CLR using VMKit in one month. One consequence of such development speed is that language designers need not feel so constrained when targeting a Managed Runtime Environment for their language. If the MRE they want to target has restrictions, they can fork it. If the MRE specification has a gray area, they can quickly prototype a solution to clarify what the behavior in that gray area should be. If you are a researcher or student and want to experiment with a new language design and implementation, you can do so incrementally: first augment the MRE, then target your language to the augmented MRE, and benchmark the improvements using the original MRE as a baseline.

Delimited Control in OCaml, Abstractly and Concretely, System Description

Delimited Control in OCaml, Abstractly and Concretely, System Description

We describe the first implementation of multi-prompt delimited control operators in OCaml that is direct in that it captures only the needed part of the control stack. The implementation is a library that requires no changes to the OCaml compiler or run-time, so it is perfectly compatible with existing OCaml source code and byte-code. The library has been in fruitful practical use for four years.

We present the library as an implementation of an abstract machine derived by elaborating the definitional machine. The abstract view lets us distill a minimalistic API, scAPI, sufficient for implementing multi-prompt delimited control. We argue that a language system that supports exception and stack-overflow handling supports scAPI. Our library illustrates how to use scAPI to implement multi-prompt delimited control in a typed language. The approach is general and can be used to add multi-prompt delimited control to other existing language systems.

Oleg was kind enough to send me an e-mail letting me know of this paper's existence (it appears not yet to be linked from the "Computation" page under which it is stored) and to include me in the acknowledgements. Since the paper in its current form has been accepted for publication, he indicated that it can be made more widely available, so here it is. In typical Oleg fashion, it offers insights at both the theoretical and implementation levels.

Bytecodes meet Combinators: invokedynamic on the JVM

Bytecodes meet Combinators: invokedynamic on the JVM. John Rose. VMIL'09.

The Java Virtual Machine (JVM) has been widely adopted in part because of its classfile format, which is portable, compact, modular, verifiable, and reasonably easy to work with. However, it was designed for just one language—Java—and so when it is used to express programs in other source languages, there are often “pain points” which retard both development and execution. The most salient pain points show up at a familiar place, the method call site.
To generalize method calls on the JVM, the JSR 292 Expert Group has designed a new invokedynamic instruction that provides user-defined call site semantics. In the chosen design, invokedynamic serves as a hinge-point between two coexisting kinds of intermediate language: bytecode containing dynamic call sites, and combinator graphs specifying call targets. A dynamic compiler can traverse both representations simultaneously, producing optimized machine code which is the seamless union of both kinds of input. As a final twist, the user-defined linkage of a call site may change, allowing the code to adapt as the application evolves over time. The result is a system balancing the conciseness of bytecode with the dynamic flexibility of function pointers.

The abstract is pretty vague, but this paper is actually quite interesting, particularly if you're interested in meta-object protocols and if, like me, you don't have the interest or patience to read JSRs. Of course, invokedynamic has been discussed many times over the years. The wheels of Java turn slowly...
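For readers who want something concrete, here is a minimal sketch of the linkage model the paper describes, written against the java.lang.invoke API as JSR 292 eventually shipped it in Java 7 (the paper predates the final API, so package and class names there differ slightly). Java source cannot spell an invokedynamic instruction directly, so the example invokes the bootstrap method by hand to show how a dynamic call site gets linked to a method handle.

    import java.lang.invoke.CallSite;
    import java.lang.invoke.ConstantCallSite;
    import java.lang.invoke.MethodHandle;
    import java.lang.invoke.MethodHandles;
    import java.lang.invoke.MethodType;

    public class IndyDemo {
        // Bootstrap method: the JVM calls this the first time an invokedynamic
        // call site executes; the returned CallSite stays linked to that site.
        public static CallSite bootstrap(MethodHandles.Lookup lookup,
                                         String name,
                                         MethodType type) throws Throwable {
            MethodHandle target = lookup.findStatic(IndyDemo.class, name, type);
            return new ConstantCallSite(target);
        }

        public static int add(int a, int b) { return a + b; }

        public static void main(String[] args) throws Throwable {
            // Simulate what the JVM does at link time for a call site with
            // descriptor (int, int) -> int, then invoke through the call site.
            MethodType type = MethodType.methodType(int.class, int.class, int.class);
            CallSite site = bootstrap(MethodHandles.lookup(), "add", type);
            MethodHandle invoker = site.dynamicInvoker();
            System.out.println((int) invoker.invokeExact(2, 3)); // prints 5
        }
    }

Returning a MutableCallSite instead of a ConstantCallSite is what allows the linkage of a call site to change later, the "final twist" the abstract mentions.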

Marrying VMs

VMKit is an LLVM project; per the announcement in the Proceedings of the 2008 LLVM Developers' Meeting:

VMKit is an implementation of the Java and .NET Virtual Machines that uses LLVM to optimize and JIT compile the code. This talk [slides, video] describes how VMKit integrates components from various systems, how bytecode translation works, describes the current performance status of the system, and discusses areas for future extension.

JVM Language Summit report

Tim Bray reports on about half of the JVM Language Summit. Among the things he discusses are Clojure, PHP, and JVM/CLR cross-pollination.

Google V8 JavaScript Engine

You can read the docs and download the C++ source here.

V8 is supposedly the main added value of Chrome, the newly announced Google browser.

Our discussion of the Chrome announcement enumerates some of the features of V8.

Proceedings of the 2008 LLVM Developers' Meeting

The proceedings of the 2008 LLVM Developers' Meeting have been posted. The presentations included overviews of various LLVM subsystems and internals, as well as a few projects targeting LLVM. Proceedings of previous meetings are also available.

Technometria: Google Web Toolkit

Phil Windley's Technometria podcast is dedicated to the Google Web Toolkit. The guest on the show is Bruce Johnson, a tech lead of GWT.

The show is very good, and more technical than usual. Many themes that are near and dear to LtU are discussed. Here are some pointers:

Bruce talks at length about the advantages of compiling from Java to JS, many of which arise from Java's static typing. He mainly talks about optimizations, but also about how static typing helps with tools in general (IDEs etc.). This was a subject of long and stormy debates here in the past.
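To make the optimization point concrete, here is a hedged illustration (the class names are invented for this post, not taken from GWT or its samples): because the Java source is statically typed and the compiler sees the whole program at once, a Java-to-JS compiler can resolve call targets and drop unreachable code before it ever emits JavaScript.

    // Hypothetical example; the comments describe optimizations a
    // whole-program Java-to-JS compiler is able to perform, not what any
    // particular version of GWT is guaranteed to do.
    interface Shape {
        double area();
    }

    final class Square implements Shape {
        private final double side;
        Square(double side) { this.side = side; }
        public double area() { return side * side; }
    }

    public class Demo {
        public static void main(String[] args) {
            Shape s = new Square(3.0);
            // With only one live implementation of Shape in the whole program,
            // the virtual call can be devirtualized and inlined, and the Shape
            // interface itself need not appear in the generated JavaScript.
            System.out.println(s.area());
        }
    }

This kind of analysis is much harder to do soundly when the input is untyped JavaScript, which is essentially Bruce's point.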

The advantages, from a software engineering standpoint, of building in Java vs. JS are discussed. This is directly related to the ongoing discussion here on the new programming-in-the-large features added to JS2. I wonder if someone will write a compiler from Java/GWT to JS2 at some point, which would enable projects to move to JS2 and jump ship on Java altogether.

Bruce mentions that since JS isn't class-based, and thus doesn't directly support the OO style many people are used to, there are many ways of translating common OO idioms into JS. This is, of course, the same type of dilemma the Scheme community faces with many high-level features. Cast as a question about OOP support, the question becomes: is it better to provide language constructs that let various libraries add OO support in different ways, or to provide language support for one specific style? The same can be asked about a variety of features and programming styles, of course.

Finally, Bruce mentions that as far as he knows no one thought about something like GWT before they did. Well, I for one, and I don't think I was the only one, talked many times (probably on LtU) about JavaScript as a VM/assembly language of the browser, clearly thinking about JS as a target language. I admit I wasn't thinking about compiling Java... But then, I am not into writing Java, so why would I think about Java as the source language...

JVM Languages group

Charles Nutter:

If you are interested in the future of non-Java languages on the JVM, you should be on this list. Yes, we talk about a lot of JVM language implementation challenges, we discuss compilers and stack frames and call-site optimizations, but we also talk about features peripheral to language implementation like package indexing and retrofitting Java 5+ code. We need your help.
