
Reflective Program Generation with Patterns

Reflective Program Generation with Patterns. Manuel Fähndrich, Michael Carbin, James R. Larus. October 2006.

Runtime reflection facilities, as present in Java and .NET, are powerful mechanisms for inspecting existing code and metadata, as well as generating new code and metadata on the fly. Such power comes at a high price, though. The runtime reflection support in Java and .NET imposes a cost on all programs, whether they use reflection or not, simply by the necessity of keeping all metadata around and the inability to optimize code because of possible future code changes. A second, often overlooked, cost is the difficulty of writing correct reflection code to inspect or emit new metadata and code, and the risk that the emitted code is not well-formed. In this paper we examine a subclass of problems that can be addressed using a simpler mechanism than runtime reflection, which we call compile-time reflection. We argue for a high-level construct called a transform that allows programmers to write inspection and generation code in a pattern-matching and template style, avoiding the complexities of reflection APIs while providing the benefits of staged compilation: the generated code and metadata are known to be well-formed and type safe ahead of time.
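To make the "difficulty of writing correct reflection code" concrete, here is a minimal Python sketch (not the paper's C#/.NET transform syntax; the class and function names are invented for illustration) of the kind of reflective generation the paper targets: inspect a class and emit a wrapper for each public method. Everything is checked only at runtime, which is exactly the well-formedness risk a compile-time, pattern-based transform is meant to remove.

import inspect

def make_logging_proxy(cls):
    """Reflectively build a subclass whose public methods log their calls.
    A typo in a member name or a wrapper with the wrong arity is only
    discovered when the generated class is actually used."""
    def wrap(name, fn):
        def wrapper(self, *args, **kwargs):
            print(f"calling {cls.__name__}.{name}{args}")
            return fn(self, *args, **kwargs)
        return wrapper

    members = {
        name: wrap(name, fn)
        for name, fn in inspect.getmembers(cls, inspect.isfunction)
        if not name.startswith("_")
    }
    # type() is the runtime analogue of emitting new metadata and code.
    return type(cls.__name__ + "Proxy", (cls,), members)

class Account:
    def __init__(self, balance): self.balance = balance
    def deposit(self, amount): self.balance += amount

LoggedAccount = make_logging_proxy(Account)
acct = LoggedAccount(100)
acct.deposit(25)   # prints: calling Account.deposit(25,)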

Macros, multi-stage programming, etc. are the appropriate buzzwords.

LtU readers will probably be interested in the STM example (see sec. 7.1).

Gradual Typing for Functional Languages

Gradual Typing for Functional Languages. Jeremy Siek, Walid Taha. Scheme and Functional Programming Workshop, September 2006.

Static and dynamic type systems have well-known strengths and weaknesses, and each is better suited for different programming tasks. There have been many efforts to integrate static and dynamic typing and thereby combine the benefits of both typing disciplines in the same language. The flexibility of static typing can be improved by adding a type Dynamic and a typecase form. The safety and performance of dynamic typing can be improved by adding optional type annotations or by performing type inference (as in soft typing). However, there has been little formal work on type systems that allow a programmer-controlled migration between dynamic and static typing. Thatte proposed Quasi-Static Typing, but it does not statically catch all type errors in completely annotated programs. Anderson and Drossopoulou defined a nominal type system for an object-oriented language with optional type annotations. However, developing a sound, gradual type system for functional languages with structural types is an open problem.

In this paper we present a solution based on the intuition that the structure of a type may be partially known/unknown at compile-time and the job of the type system is to catch incompatibilities between the known parts of types. We define the static and dynamic semantics of a λ-calculus with optional type annotations and we prove that its type system is sound with respect to the simply-typed λ-calculus for fully-annotated terms. We prove that this calculus is type safe and that the cost of dynamism is “pay-as-you-go”.
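The abstract's "catch incompatibilities between the known parts of types" is usually formalized as a consistency relation on types: the unknown type is consistent with everything, and otherwise the known parts must agree structurally. Here is a minimal Python sketch of such a relation; the names Dyn, Base, and Fun are illustrative and not the paper's notation.

from dataclasses import dataclass

class Type: pass

@dataclass(frozen=True)
class Dyn(Type): pass          # the unknown/dynamic type

@dataclass(frozen=True)
class Base(Type):
    name: str                  # e.g. "int", "bool"

@dataclass(frozen=True)
class Fun(Type):
    dom: Type
    cod: Type

def consistent(s: Type, t: Type) -> bool:
    """Unlike subtyping, consistency is symmetric and not transitive:
    int ~ ? and ? ~ bool, yet int is not consistent with bool."""
    if isinstance(s, Dyn) or isinstance(t, Dyn):
        return True
    if isinstance(s, Base) and isinstance(t, Base):
        return s.name == t.name
    if isinstance(s, Fun) and isinstance(t, Fun):
        return consistent(s.dom, t.dom) and consistent(s.cod, t.cod)
    return False

# int -> ? is consistent with int -> bool: the unknown codomain is deferred.
assert consistent(Fun(Base("int"), Dyn()), Fun(Base("int"), Base("bool")))
# int -> bool is not consistent with bool -> bool: known parts disagree.
assert not consistent(Fun(Base("int"), Base("bool")), Fun(Base("bool"), Base("bool")))

On fully annotated terms no Dyn ever appears, so consistency collapses to type equality, which is how the "pay-as-you-go" slogan plays out: the more annotations you write, the more errors are caught statically.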

In other news, the Holy Grail has been found. Film at 11.

This piece of genius is the combined work of Jeremy Siek, of Boost fame, and Walid Taha, of MetaOCaml fame. The formalization of their work in Isabelle/Isar can be found here.

I found this while tracking down the relocated Concoqtion paper. In that process, I also found Jeremy Siek's other new papers, including his "Semantic Analysis of C++ Templates" and "Concepts: Linguistic Support for Generic Programming in C++." Just visit Siek's home page and read all of his new papers, each of which is worth a story here.