Lambda Calculus

Decidability of Higher Order Matching

Decidability of Higher Order Matching, Colin Stirling.

Higher-order unification is the problem: given an equation t = u containing free variables, is there a solution substitution Θ such that tΘ and uΘ have the same normal form? The terms are drawn from the simply typed lambda calculus, and "the same normal form" is up to βη-equality. Higher-order matching is the particular instance: when the term u is closed, can t be pattern-matched to u? Although higher-order unification is undecidable (even if free variables are only second-order), higher-order matching was conjectured to be decidable by Huet. [...] In this paper, we confirm Huet's conjecture that higher-order matching is decidable.
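
To make the problem concrete (a standard illustrative instance, not one from the paper): with constants f : o → o and a : o and a free variable x : (o → o) → o, the matching problem x f = f (f a) asks whether some substitution Θ makes the two sides βη-equal. Here Θ = [x := λg. g (g a)] works, as does Θ = [x := λg. f (f a)]; a matching algorithm must decide whether at least one such Θ exists.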

This paper is very technical, but I think it addresses a problem of significant interest, since decidable fragments of higher-order unification are very important to theorem-proving systems. As an aside, Huet conjectured that higher-order matching was decidable in 1976, so it has taken thirty years of effort to prove this.

Light Logics and Optimal Reduction

Not sure this isn't a tad too technical for the front page, but posting has been light, so...

Baillot, Coppola & Dal Lago have an arXived preprint, "Light Logics and Optimal Reduction: Completeness and Complexity":

Typing of lambda-terms in Elementary and Light Affine Logic (EAL, LAL, resp.) has been studied for two different reasons: on the one hand the evaluation of typed terms using LAL (EAL, resp.) proof-nets admits a guaranteed polynomial (elementary, resp.) bound; on the other hand these terms can also be evaluated by optimal reduction using the abstract version of Lamping's algorithm. The first reduction is global while the second one is local and asynchronous. We prove that for LAL (EAL, resp.) typed terms, Lamping's abstract algorithm also admits a polynomial (elementary, resp.) bound. We also show its soundness and completeness (for EAL and LAL with type fixpoints), by using a simple geometry of interaction model (context semantics).

We've 'known' that Lamping's algorithm implements Lévy-optimality since Lamping 'proved' it in 1989, but unfortunately his argument has holes, and they have proven hard to fix, despite there being a fairly accessible geometric intuition supporting soundness & completeness. Harry Mairson has done worthy missionary work advertising the importance of this gap and working towards filling it. I haven't yet worked my way through this proof, but it looks like it belongs to a very attractive class of argument, namely those that attempt to forge a Curry-Howard-like correspondence between Lamping graphs and proof nets of weakened linear logics. Recommended reading for LtUers who enjoy a challenge!

A Logic for Parametric Polymorphism

A Logic for Parametric Polymorphism, Gordon Plotkin and Martín Abadi.

In this paper we introduce a logic for parametric polymorphism. Just as LCF is a logic for the simply-typed lambda-calculus with recursion and arithmetic, our logic is a logic for System F. The logic permits the formal presentation and use of relational parametricity. Parametricity yields --- for example --- encodings of initial algebras, final co-algebras and abstract datatypes, with corresponding proof principles of induction, co-induction and simulation.
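
To give a taste of the encodings the abstract mentions (my own minimal OCaml sketch, not code from the paper): the System F Church encoding of the naturals, nat = ∀a. (a → a) → a → a, can be written with a polymorphic record field, and relational parametricity is exactly what guarantees the type contains nothing beyond the numerals, i.e. that it is an initial algebra.

(* Church-encoded naturals: nat = forall a. (a -> a) -> a -> a.
   The record wraps the rank-2 polymorphic type; parametricity is what
   rules out "junk" inhabitants beyond the numerals. *)
type nat = { fold : 'a. ('a -> 'a) -> 'a -> 'a }

let zero = { fold = fun _ z -> z }
let succ n = { fold = fun s z -> s (n.fold s z) }

(* The iteration principle comes for free: *)
let to_int n = n.fold (fun k -> k + 1) 0   (* to_int (succ (succ zero)) = 2 *)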

RZ: a tool for bringing constructive and computable mathematics closer to programming practice

RZ: a tool for bringing constructive and computable mathematics closer to programming practice, Chris Stone and Andrej Bauer.

Realizability theory is not only a fundamental tool in logic and computability, but also has direct application to the design and implementation of programs: it can produce interfaces for the data structure corresponding to a mathematical theory. Our tool, called RZ, serves as a bridge between the worlds of constructive mathematics and programming. By using the realizability interpretation of constructive mathematics, RZ translates specifications in constructive logic into annotated interface code in Objective Caml. The system supports a rich input language allowing descriptions of complex mathematical structures. RZ does not extract code from proofs, but allows any implementation method, from handwritten code to code extracted from proofs by other tools.

Realizability is a way of formalizing the constructive interpretation of logical formulas (e.g., the BHK interpretation). Constructively, a proof of an implication A implies B means that if you are given evidence for A, you must be able to compute evidence for B. This constructive interpretation does not specify any particular programming language (theorists often use the lambda calculus, Turing machines, and the like), so Andrej Bauer and Chris Stone observed that they could take the realizers of mathematical proofs to be Objective Caml programs! This meant that they could take mathematical theories and generate ML module signatures describing the interface to an implementation, plus comments about what invariants must hold.
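
To give a flavour of the idea, here is a hand-written sketch in the spirit of RZ's output; the theory, names, and comments are invented for illustration, not actual RZ output:

(* What an RZ-style annotated interface might look like for a small
   constructive theory of semigroups. The carrier becomes an abstract
   type of realizers; axioms with no computational content become
   assertions in comments, i.e. obligations on the implementor. *)
module type SEMIGROUP = sig
  type t                      (* realizers for the carrier set *)
  val mul : t -> t -> t       (* evidence for the binary operation *)
  (* Assertion (not checked by the compiler):
     associativity: mul x (mul y z) ≈ mul (mul x y) z *)
end

(* Any implementation method is acceptable, e.g. handwritten code: *)
module IntAdd : SEMIGROUP with type t = int = struct
  type t = int
  let mul = ( + )
end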

Verifying Semantic Type Soundness of a Simple Compiler

Having been entirely too dismissive of one of Nick Benton's papers before, it seems only fitting that I should attempt expiation for my sins by discussing Formalizing and Verifying Semantic Type Soundness of a Simple Compiler:

We describe a semantic type soundness result, formalized in the Coq proof assistant, for a compiler from a simple imperative language with heap-allocated data into an idealized assembly language. Types in the high-level language are interpreted as binary relations, built using both second-order quantification and a form of separation structure, over stores and code pointers in the low-level machine.

I found this work striking for its strong defense of the utility of semantic, vs. syntactic, type soundness, and for its use of logical relations vs. unary predicates. I could swear that I recently read another paper making quite similar observations about semantic vs. syntactic type soundness. I'm not sure which paper it was, but regardless, there seems to me to be a great deal of overlap of insight and, to a lesser extent, approach, with A Certified Type-Preserving Compiler from Lambda Calculus to Assembly Language, and of course by extension to Leroy's work on formalizing big-step operational semantics, allocation, etc. Finally we have the very exciting news about progress on a mechanized metatheory for Standard ML, in Twelf. It's a very exciting time for mechanized metatheory! I'd love to get at least Nick Benton, Adam Chlipala, Xavier Leroy, and Tim Sweeney in a room for a few months so they could create the Next Mainstream Programming Language in Coq. Of course, we have Dr. Benton, Mr. Chlipala, and Mr. Sweeney right here on LtU... :-)

Update: Mea culpa; the paper I had in mind was A Very Modal Model of a Modern, Major, General Type System. It would seem that Dr. Benton, Mr. Chlipala, and Dr. Appel and colleagues have arrived at quite similar perspectives on good approaches to mechanizing metatheory for very interesting programming language design efforts.

A Certified Type-Preserving Compiler from Lambda Calculus to Assembly Language

A Certified Type-Preserving Compiler from Lambda Calculus to Assembly Language, Adam Chlipala.

I present a certified compiler from simply-typed lambda calculus to assembly language. The compiler is certified in the sense that it comes with a machine-checked proof of semantics preservation, performed with the Coq proof assistant. The compiler and the terms of its several intermediate languages are given dependent types that guarantee that only well-typed programs are representable. Thus, type preservation for each compiler pass follows without any significant "proofs" of the usual kind. Semantics preservation is proved based on denotational semantics assigned to the intermediate languages. I demonstrate how working with a type-preserving compiler enables type-directed proof search to discharge automatically large parts of the proof obligations.

Software/proof source code and documentation

Slides are available from a talk I gave at the Projet Gallium seminar at INRIA Rocquencourt, in OpenOffice and PDF formats.

I found this while attempting to follow up on this post. The approach taken here—dependently-typed ASTs and denotational semantics—and the observation that, carried to its logical conclusion, the result is typed intermediate and assembly languages, proof-carrying code, etc. suggests to me a connection to this work. It also seems to offer a denotational counterpoint to Leroy's coinductive big-step operational semantics, also done in Coq. Fascinating stuff.
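
To illustrate the dependently-typed-AST idea in a more familiar setting (my own OCaml GADT sketch, not the paper's Coq development): the syntax below makes ill-typed object terms unrepresentable, and the evaluator doubles as a denotational semantics, so type preservation holds by construction.

(* Object-language terms indexed by their object-language type;
   binders are represented by host-language functions (HOAS). *)
type _ term =
  | Val : 'a -> 'a term                          (* embedded host value *)
  | Add : int term * int term -> int term
  | Lam : ('a term -> 'b term) -> ('a -> 'b) term
  | App : ('a -> 'b) term * 'a term -> 'b term

(* The denotation function: a well-typed evaluator, with no failure
   cases, because ill-typed terms cannot be built in the first place. *)
let rec eval : type a. a term -> a = function
  | Val v -> v
  | Add (m, n) -> eval m + eval n
  | Lam f -> fun x -> eval (f (Val x))
  | App (f, x) -> (eval f) (eval x)

(* e.g. eval (App (Lam (fun x -> Add (x, Val 1)), Val 41)) = 42 *)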

Lightweight Fusion by Fixed Point Promotion

Lightweight Fusion by Fixed Point Promotion, Atsushi Ohori and Isao Sasano.

This paper proposes a lightweight fusion method for general recursive function definitions. Compared with existing proposals, our method has several significant practical features: it works for general recursive functions on general algebraic data types; it does not produce extra runtime overhead (except for possible code size increase due to the success of fusion); and it is readily incorporated in standard inlining optimization. This is achieved by extending the ordinary inlining process with a new fusion law that transforms a term of the form f ∘ (fix g.λx.E) to a new fixed point term fix h.λx.E' by promoting the function f through the fixed point operator.

This is a sound syntactic transformation rule that is not sensitive to the types of f and g. This property makes our method applicable to a wide range of functions, including those with multiple parameters in both curried and uncurried forms. Although this method does not guarantee any form of completeness, it fuses typical examples discussed in the literature and others that involve accumulating parameters, either in foldl-like specific forms or in general recursive forms, without any additional machinery.

In order to substantiate our claim, we have implemented our method in a compiler. Although it is preliminary, it demonstrates practical feasibility of this method.

Deforestation is one of those optimizations every functional programmer who has ever had to rewrite a beautiful composition of maps and filters into an evil, ugly explicit fold has always longed for. Unfortunately, the standard lightweight fusion algorithms have trouble with examples as simple as foldl, and this paper has a very nice account of a simple algorithm that can handle it.
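
To see the f ∘ (fix g.λx.E) shape concretely, here is my own OCaml illustration of what the transformation achieves, not the paper's implementation:

(* Before fusion: sum ∘ map_square builds and then consumes an
   intermediate list; sum plays f, map_square the fixed point fix g. *)
let rec map_square = function
  | [] -> []
  | x :: xs -> (x * x) :: map_square xs

let rec sum = function
  | [] -> 0
  | x :: xs -> x + sum xs

let sum_squares_slow xs = sum (map_square xs)

(* After promoting sum through the fixed point: a single recursive
   pass, fix h.λx.E', with no intermediate list. *)
let rec sum_squares = function
  | [] -> 0
  | x :: xs -> (x * x) + sum_squares xs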

The Theory of Parametricity in Lambda Cube

A draft by Takeuti Izumi

This paper defines theories of parametricity for the system lambda-P-omega of the lambda cube, and shows some of their applications. These theories are defined by axiom sets in the formal theories. These theories prove various important semantic properties in the formal systems.

Parametricity is how Wadler gets his theorems for free, and Izumi gives an example of one of these free theorems for dependent sums in the Calculus of Constructions.
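
For readers who haven't seen a free theorem before, the classic first-order instance (Wadler's list example, not Izumi's dependent-sum one) is easy to state in OCaml:

(* Parametricity says any g : 'a list -> 'a list can only rearrange,
   drop, or duplicate elements, so it commutes with map: for all f, xs,
   List.map f (g xs) = g (List.map f xs). *)
let g : 'a list -> 'a list = List.rev          (* any such g would do *)

let free_theorem_holds f xs =
  List.map f (g xs) = g (List.map f xs)

(* free_theorem_holds string_of_int [1; 2; 3] evaluates to true *)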

Holodeck games and CCCs

From the n-Category Cafe, some notes on Holodeck Games and cartesian closed categories.

This also appeared here (and was mentioned by Phil Wadler). The two posts seem to have different addenda, comments and links, so it may be worth looking at both.

It's fun to see lambda calculus introduced to an audience already familiar with categories, as that seems to be the opposite of the usual state of affairs around here, and I can think of certain LtU regulars who will hopefully find this whole subject enjoyable.

Arrows, like Monads, are Monoids

By Chris Heunen and Bart Jacobs

At first, these Arrows may look somewhat arbitrary. Here we show that they are categorically fairly civilised, by showing that they correspond to monoids in suitable subcategories of bifunctors C^op × C → C. This shows that, at a suitable level of abstraction, arrows are like monads …
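
A rough way to see the monoid claim from the programming side (a minimal OCaml sketch; Heunen and Jacobs work with Hughes's Haskell Arrow class): arrow composition is the monoid multiplication and arr applied to the identity is its unit, with the bifunctor structure hidden in the type ('a, 'b) t.

(* A minimal Arrow signature; >>> plays the monoid multiplication,
   arr (fun x -> x) the unit, and the associativity and unit laws of
   arrows are exactly the monoid laws. *)
module type ARROW = sig
  type ('a, 'b) t
  val arr : ('a -> 'b) -> ('a, 'b) t
  val ( >>> ) : ('a, 'b) t -> ('b, 'c) t -> ('a, 'c) t
  val first : ('a, 'b) t -> ('a * 'c, 'b * 'c) t
end

(* Ordinary functions form the simplest instance. *)
module FunArrow : ARROW with type ('a, 'b) t = 'a -> 'b = struct
  type ('a, 'b) t = 'a -> 'b
  let arr f = f
  let ( >>> ) f g x = g (f x)
  let first f (a, c) = (f a, c)
end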
