Meta-Programming

Types for Flexible Objects

Types for Flexible Objects, by Pottayil Harisanker Menon, Zachary Palmer, Alexander Rozenshteyn, Scott Smith:

Scripting languages are popular in part due to their extremely flexible objects. These languages support numerous object features, including dynamic extension, mixins, traits, and first-class messages. While some work has succeeded in typing these features individually, the solutions have limitations in some cases and no project has combined the results.

In this paper we define TinyBang, a small typed language containing only functions, labeled data, a data combinator, and pattern matching. We show how it can directly express all of the aforementioned flexible object features and still have sound typing. We use a subtype constraint type inference system with several novel extensions to ensure full type inference; our algorithm refines parametric polymorphism for both flexibility and efficiency. We also use TinyBang to solve an open problem in OO literature: objects can be extended after being messaged without loss of width or depth subtyping and without dedicated metatheory. A core subset of TinyBang is proven sound and a preliminary implementation has been constructed.

An interesting paper I stumbled across quite by accident, it promises an ambitious set of features: generalizing previous work on first-class cases while supporting subtyping, mutation, and polymorphism, all with full type inference, in an effort to match the flexibility of dynamically typed languages.

It does so by introducing a host of new concepts that are almost-but-not-quite generalizations of existing ones, like "onions", which are something like type-indexed extensible records, and "scapes", which are roughly a generalization of pattern-matching cases.

Instead of approaching objects via a record calculus, they approach them via the dual encoding: variant matching. Matching functions then have degenerate dependent types, which I first saw in the paper Type Inference for First-Class Messages with Match-Functions. As an interesting aside, Scott Smith was a coauthor on that paper too, but it isn't referenced in the "flexible objects" paper, despite the fact that "scapes" are "match-functions".
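To give a rough flavour of that duality, here is a minimal OCaml sketch (not TinyBang syntax; the message names are made up for illustration): an object is a function that pattern-matches on a variant of messages, and sending a message is just application.

    (* hypothetical sketch, not TinyBang: an "object" is a function that
       pattern-matches on a variant of messages *)
    type msg = GetX | GetY

    let point x y = function
      | GetX -> x
      | GetY -> y

    let p = point 1 2
    let _ = p GetX    (* "message send" is ordinary application; evaluates to 1 *)

As I understand the paper, the individual cases are themselves first-class values ("scapes") that can be combined with onions, so an object's interface can be extended after the fact; the fixed msg type above is exactly the rigidity that TinyBang removes.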

Overall, quite a dense and ambitious paper, but the resulting TinyBang language looks very promising and quite expressive. Future work includes making the system more modular, as it currently requires whole-program compilation, and adding first-class labels, which has led to interesting results in past work. Work exploiting row polymorphism is particularly interesting because it supports efficient compilation to index-passing code for both records and variants; it's not clear whether onions and scapes are amenable to the same sort of translation.

Edit: an earlier paper, A Practical, Typed Variant Object Model -- Or, How to Stand On Your Head and Enjoy the View, was published in 2012. BigBang is their language that provides syntactic sugar on top of TinyBang.

Edit 2: commas fixed, thanks!

Dependently-Typed Metaprogramming (in Agda)

Conor McBride gave an 8-lecture summer course on Dependently typed metaprogramming (in Agda) at the Cambridge University Computer Laboratory:

Dependently typed functional programming languages such as Agda are capable of expressing very precise types for data. When those data themselves encode types, we gain a powerful mechanism for abstracting generic operations over carefully circumscribed universes. This course will begin with a rapid dependently-typed programming primer in Agda, then explore techniques for and consequences of universe constructions. Of central importance are the “pattern functors” which determine the node structure of inductive and coinductive datatypes. We shall consider syntactic presentations of these functors (allowing operations as useful as symbolic differentiation), and relate them to the more uniform abstract notion of “container”. We shall expose the double-life containers lead as “interaction structures” describing systems of effects. Later, we step up to functors over universes, acquiring the power of inductive-recursive definitions, and we use that power to build universes of dependent types.

The lecture notes, code, and video captures are available online.

As with his previous course, the notes contain many(!) mind-expanding exploratory exercises, some of which are quite challenging.
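For a non-dependent taste of the "universe" idea from the course description, here is a hypothetical OCaml sketch (rather than Agda): a small datatype of codes for types, interpreted by a GADT index, with one generic operation defined once over the whole universe.

    (* codes naming a tiny closed universe of types; the GADT index is the
       interpretation of each code as an actual OCaml type *)
    type _ code =
      | Int  : int code
      | Bool : bool code
      | Pair : 'a code * 'b code -> ('a * 'b) code

    (* a generic operation written once, by recursion over the codes *)
    let rec equal : type a. a code -> a -> a -> bool = fun c x y ->
      match c with
      | Int  -> x = y
      | Bool -> x = y
      | Pair (a, b) -> equal a (fst x) (fst y) && equal b (snd x) (snd y)

    let _ = equal (Pair (Int, Bool)) (1, true) (1, true)   (* true *)

The dependently typed setting of the course goes well beyond this, but the basic move is the same: quantify over codes, then compute with their interpretations.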

Tool Demo: Scala-Virtualized

Tool Demo: Scala-Virtualized

This paper describes Scala-Virtualized, which extends the Scala language and compiler with a small number of features that enable combining the benefits of shallow and deep embeddings of DSLs. We demonstrate our approach by showing how to embed three different domain-specific languages in Scala. Moreover, we summarize how others have been using our extended compiler in their own research and teaching. Supporting artifacts of our tool include web-based tutorials, nightly builds, and an Eclipse update site hosting an up-to-date version of the Scala IDE for Eclipse based on the Virtualized Scala compiler and standard library.

Scala has always had a quite good EDSL story thanks to implicits, dot- and paren-inference, and methods-as-operators. Lately there have been proposals to provide it with both macros-in-the-camlp4-sense and support for multi-stage programming. This paper goes into some depth on the foundations of the latter subject.
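For readers unfamiliar with the terminology, here is a rough illustration of the shallow/deep distinction (in OCaml rather than Scala, and not Scala-Virtualized itself): a shallow embedding interprets DSL operations immediately in the host language, while a deep embedding reifies them as an AST that can be analysed, optimised, or compiled for another target.

    (* shallow embedding: DSL operations are ordinary host-language functions,
       so a DSL program is evaluated as soon as it is written *)
    let lit n = n
    let add a b = a + b

    (* deep embedding: the same operations build an AST that can later be
       inspected, optimised, or compiled for a different target *)
    type exp = Lit of int | Add of exp * exp

    let lit' n = Lit n
    let add' a b = Add (a, b)

    let rec eval = function
      | Lit n -> n
      | Add (a, b) -> eval a + eval b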

Macros that Work Together

Macros that Work Together - Compile-Time Bindings, Partial Expansion, and Definition Contexts, Matthew Flatt, Ryan Culpepper, David Darais, and Robert Bruce Findler. Under consideration for publication in J. Functional Programming.

Racket (formerly PLT Scheme) is a large language that is built mostly within itself. Unlike the usual approach taken by non-Lisp languages, the self-hosting of Racket is not a matter of bootstrapping one implementation through a previous implementation, but instead a matter of building a tower of languages and libraries via macros. The upper layers of the tower include a class system, a component system, pedagogic variants of Scheme, a statically typed dialect of Scheme, and more. The demands of this language-construction effort require a macro system that is substantially more expressive than previous macro systems. In particular, while conventional Scheme macro systems handle stand-alone syntactic forms adequately, they provide weak support for macros that share information or macros that use existing syntactic forms in new contexts.

This paper describes and models novel features of the Racket macro system, including support for general compile-time bindings, sub-form expansion and analysis, and environment management. The presentation assumes a basic familiarity with Lisp-style macros, and it takes for granted the need for macros that respect lexical scope. The model, however, strips away the pattern and template system that is normally associated with Scheme macros, isolating a core that is simpler, that can support pattern and template forms themselves as macros, and that generalizes naturally to Racket’s other extensions.

A good description of Racket's rocket science tools for growing languages.

Language Virtualization for Heterogeneous Parallel Computing

Hassan Chafi, Zach DeVito, Adriaan Moors, Tiark Rompf, Arvind Sujeeth, Pat Hanrahan, Martin Odersky, and Kunle Olukotun describe an approach to parallel DSLs that is a hybrid between external DSLs and internal DSLs in Language Virtualization for Heterogeneous Parallel Computing.

As heterogeneous parallel systems become dominant, application developers are being forced to turn to an incompatible mix of low level programming models (e.g. OpenMP, MPI, CUDA, OpenCL). However, these models do little to shield developers from the difficult problems of parallelization, data decomposition and machine-specific details. Ordinary programmers are having a difficult time using these programming models effectively. To provide a programming model that addresses the productivity and performance requirements for the average programmer, we explore a domain-specific approach to heterogeneous parallel programming.

We propose language virtualization as a new principle that enables the construction of highly efficient parallel domain specific languages that are embedded in a common host language. We define criteria for language virtualization and present techniques to achieve them. We present two concrete case studies of domain-specific languages that are implemented using our virtualization approach.

While the motivation of the paper is parallelization, the proposed design looks like LINQ expression trees dialed to 11.

Semi-implicit batched remote code execution as staging

Oleg Kiselyov has just posted another amazing work: Semi-implicit batched remote code execution as staging.

Batching several remote-procedure or remote-object operations into one request decreases the number of network client/server round-trips, reduces the communication overhead and indeed significantly improves performance of distributed applications. The benefits are offset by the cost of restructuring the code to incite large batches; and by the increase in the difficulty of reasoning about the code, predicting its performance let alone establishing correctness. The overall research goal is to reduce the downside.

We describe a semi-implicit batching mechanism that makes the points of remote server communication explicit yet conceals the proxies, saving the trouble of writing them. The changes to the client code are minimal: mainly, adding calls to force. The type-checker will not let the programmer forget to call force. The remote batch server is simple and generic, with no need to optimize for specific clients.

Our mechanism batches both independent and data-dependent remote calls. Our mechanism is compositional, letting the programmer build nested applications and conditional (and, potentially, iterative) statements using composition, application and naming. Writing a remote program is exactly like writing a typed local program, which is type-checked locally, and can even be executed locally (for debugging).

The key insights are treating remote execution as a form of staging (meta-programming), generalizing mere remote function calls to remote applicative and conditional expressions, and introducing an embedded domain-specific language, Chourai, for such expressions. A batch of dependent remote function calls can then be regarded as a complex applicative expression in the A-normal form. Another key insight is that emulating call-by-value via call-by-need surprisingly makes sense.

Here's an example piece of Chourai code, for deleting albums whose rating is below 5 among the first n albums of an album database (called "large") hosted by the server. get_album, next_album, and similar functions constitute the "RPC" interface to the server.

    (* app / app2 build remote applications rather than executing anything
       locally; guard builds a remote conditional; force triggers the single
       batched round-trip to the server. *)
    let delete_low_rating n =
      let rec loop album i =
        let t = guard (app2 lt (app get_rating album) (int 5))
                      (fun () -> app delete_album album) in
        if i >= n then force t else
        loop (app next_album album) (succ i)
      in loop (app get_album (string "large")) 0;;

Amazingly, delete_low_rating 4 requires just one round-trip to the server!

Directly Reflective Meta-Programming

I was recently pointed to the following fascinating paper:

Directly Reflective Meta-Programming, Aaron Stump (HOSC 22(2), June 2009).

Existing meta-programming languages operate on encodings of programs as data. This paper presents a new meta-programming language, based on an untyped lambda calculus, in which structurally reflective programming is supported directly, without any encoding. The language features call-by-value and call-by-name lambda abstractions, as well as novel reflective features enabling the intensional manipulation of arbitrary program terms. The language is scope safe, in the sense that variables can neither be captured nor escape their scopes. The expressiveness of the language is demonstrated by showing how to implement quotation and evaluation operations, as proposed by Wand. The language's utility for meta-programming is further demonstrated through additional representative examples. A prototype implementation is described and evaluated.

Meta-programming without any encoding at all. The only minor drawback that I can see is that this language is untyped - and designing a type system for it would be extremely challenging.

Omega - Language of the Future

When I discovered Tim Sheard's Languages of the Future, I realized that PLs do indeed have a future (beyond asymptotically approaching CLOS and/or adding whimsical new rules to your type checker). Compared to languages like Lisp, pre-generics Java, and Python, the "futuristic" languages like Haskell and O'Caml seemed to mainly offer additional static verification, and some other neat patterns, but the "bang for the buck" seemed somewhat low, especially since these languages have their own costs (they are much more complex, they rule out many "perfectly fine" programs).

Ωmega seems like a true revolution to me - it shows what can be done with a really fancy type system, and this seems like the first astounding advancement over existing languages, from Python to Haskell. Its mantra is that it's possible to reap many (all?) benefits of dependent programming, without having to suffer its problems, by adding two much more humble notions to the widely understood, ordinary functional programming setting: GADTs + Extensible Kinds.

Sheard and coworkers show that these two simple extensions allow the succinct expression of many dependently-typed and related examples from the literature. Fine examples are implementations of AVL and red-black trees that are always balanced by construction - it's simply impossible to create an unbalanced tree; this is checked by the type system. It seems somewhat obvious that sooner rather than later all code will be written in this (or a similar) way.
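To make "balanced by construction" concrete without Ωmega, here is a hypothetical OCaml GADT sketch, far simpler than the AVL and red-black examples in the paper: index a tree by its depth, so that every well-typed tree is perfectly balanced and an unbalanced one is a type error rather than a runtime check.

    (* type-level naturals, used only as indices *)
    type z = Z
    type 'n s = S of 'n

    (* a perfect tree of depth 'n: both subtrees of a node must have the
       same depth, so any well-typed tree is balanced by construction *)
    type ('a, _) perfect =
      | Leaf : ('a, z) perfect
      | Node : ('a, 'n) perfect * 'a * ('a, 'n) perfect -> ('a, 'n s) perfect

    let _ok = Node (Node (Leaf, 1, Leaf), 2, Node (Leaf, 3, Leaf))
    (* let _bad = Node (Leaf, 1, Node (Leaf, 2, Leaf))   rejected by the type checker *)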

How to understand this stuff: my route was through the generics of Java and C# (especially Featherweight Generic Java, FGJω, and A. Kennedy's generics papers). Once you understand basic notions like type constructors, variance, and kinds, you know everything you need to understand why GADTs + Extensible Kinds = Dependent Programming (and also esoteric stuff like polytypic values have polykinded types, for that matter).

It is my belief that you must understand Ωmega now! Even if you're never going to use it, or something like it, you'll still learn a whole lot about programming. Compared to Ωmega, other languages are puny. ;P

Fortifying Macros

Fortifying Macros. Ryan Culpepper, Matthias Felleisen, ICFP 2010.

Existing macro systems force programmers to make a choice between clarity of specification and robustness. If they choose clarity, they must forgo validating significant parts of the specification and thus produce low-quality language extensions. If they choose robustness, they must write in a style that mingles the implementation with the specification and therefore obscures the latter. This paper introduces a new language for writing macros. With the new macro system, programmers naturally write robust language extensions using easy-to-understand specifications. The system translates these specifications into validators that detect misuses—including violations of context-sensitive constraints—and automatically synthesize appropriate feedback, eliminating the need for ad hoc validation code.

Presents syntax-parse, which seems like a truly great advancement over existing macro writing facilities.

The Future of C#

One of the future additions to C# announced by Anders Hejlsberg in this entertaining video from 2008 is Compiler as a Service. By that he means the ability to eval code strings (and I'm guessing that this will also be integrated with C#'s built-in AST objects).

He shows this off at around minute 59, to great effect and great excitement from the audience. It feels like an inflection point. There probably won't be another REPL-less language from now on.

I predict that after that, they'll add hygienic macros and quasisyntax.
