Lambda Calculus

Theorem proving support in programming language semantics

We describe several views of the semantics of a simple programming language as formal documents in the calculus of inductive constructions that can be verified by the Coq proof system. Covered aspects are natural semantics, denotational semantics, axiomatic semantics, and abstract interpretation. Descriptions as recursive functions are also provided whenever suitable, thus yielding a verification condition generator and a static analyser that can be run inside the theorem prover for use in reflective proofs. Extraction of an interpreter from the denotational semantics is also described. All different aspects are formally proved sound with respect to the natural semantics specification.
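
To make the flavor concrete, here is a minimal Haskell sketch (my own illustration, not the paper's Coq development) of the kind of interpreter one might extract from a denotational semantics of a small imperative language; the syntax and the rendering of While as plain recursion are assumptions for the sake of the example:

```haskell
-- A toy denotational semantics for a tiny imperative language.
-- A state maps variable names to integers.
type State = String -> Int

data Aexp = N Int | V String | Plus Aexp Aexp
data Bexp = Leq Aexp Aexp | Not Bexp
data Com  = Skip
          | Assign String Aexp
          | Seq Com Com
          | If Bexp Com Com
          | While Bexp Com

aval :: Aexp -> State -> Int
aval (N n)      _ = n
aval (V x)      s = s x
aval (Plus a b) s = aval a s + aval b s

bval :: Bexp -> State -> Bool
bval (Leq a b) s = aval a s <= aval b s
bval (Not b)   s = not (bval b s)

-- The meaning of a command is a (partial) state transformer; the
-- least-fixed-point semantics of While appears here as recursion,
-- so a non-terminating loop is a non-terminating interpreter.
exec :: Com -> State -> State
exec Skip         s = s
exec (Assign x a) s = \y -> if y == x then aval a s else s y
exec (Seq c1 c2)  s = exec c2 (exec c1 s)
exec (If b c1 c2) s = if bval b s then exec c1 s else exec c2 s
exec (While b c)  s
  | bval b s  = exec (While b c) (exec c s)
  | otherwise = s
```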

More work on mechanized metatheory with an eye towards naturalness of representation and automation. This seems to me to relate, in crucial ways, to Adam Chlipala's work on A Certified Type-Preserving Compiler from Lambda Calculus to Assembly Language, which relies on denotational semantics and proof by reflection. More generally, it seems to reinforce an important trend in using type-theory-based theorem provers to tackle programming language design from the semantic point of view (see also A Very Modal Model of a Modern, Major, General Type System and Verifying Semantic Type Soundness of a Simple Compiler). I find the trend exciting, but of course I also wonder how far we can practically go with it today, given that the overwhelming majority of the literature, including our beloved Types and Programming Languages, is based on A Syntactic Approach to Type Soundness. Even the upcoming How to write your next POPL paper in Coq at POPL '08 centers on the syntactic approach.

Is the syntactic approach barking up the wrong tree, in the long term? The semantic approach? Both? Neither?

Natural Deduction for Intuitionistic Non-Commutative Linear Logic

Natural Deduction for Intuitionistic Non-Commutative Linear Logic, Jeff Polakow and Frank Pfenning. TLCA 1999.

Intuitionistic logic captures functional programming in a logical way, as can be seen from the Curry-Howard isomorphism between constructive proofs and functional programs. However, there are many structural properties of programs that are not captured within the intuitionistic framework, such as resource usage, computational complexity, and sequentiality. Intuitionistic linear logic can be thought of as a refinement of intuitionistic logic in which the resource consumption properties of functions can be expressed internally. Here, we further refine it to allow the expression of sequencing of computations. We achieve this by controlling the use of the structural rule of exchange to arrive at intuitionistic non-commutative linear logic.

My earlier post on linguistics reminded me of the Lambek calculus, an ordered logic invented in 1958(!) to model the parsing of sentences. So I wanted to find a paper on ordered logic (i.e., a logic in which you can't freely swap the order of hypotheses in a context) and link to that.
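
For the curious, the hallmark of these ordered systems is that once the exchange rule is dropped, implication splits in two, depending on whether the argument is consumed from the right or the left end of the context. In Lambek-style notation (my rendering, not the paper's):

```latex
% Right and left implication in an exchange-free (ordered) context:
\frac{\Gamma, A \vdash B}{\Gamma \vdash B / A}
\qquad\qquad
\frac{A, \Gamma \vdash B}{\Gamma \vdash A \backslash B}
% With exchange admissible, the two connectives collapse into the
% single linear implication A \multimap B.
```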

Gödel, Nagel, minds and machines

Solomon Feferman. Gödel, Nagel, minds and machines. Ernest Nagel Lecture, Columbia University, Sept. 27, 2007.

Just fifty years ago, Ernest Nagel and Kurt Goedel became involved in a serious imbroglio about the possible inclusion of Goedel’s original work on incompleteness in the book, Goedel’s Proof, then being written by Nagel with James R. Newman. What led to the conflict were some unprecedented demands that Goedel made over the use of his material and his involvement in the contents of the book - demands that resulted in an explosive reaction on Nagel’s part. In the end the proposal came to naught. But the story is of interest because of what was basically at issue, namely their provocative related but contrasting views on the possible significance of Goedel’s theorems for minds vs. machines in the development of mathematics.

This is not directly PLT related, and more philosophical than what we usually discuss on LtU, but I think it will be of interest to some members of the community.

While the historical details are interesting, I am not sure I agree with the analysis. It would be interesting to hear what others make of this.

To make this item slightly more relevant to LtU, let me point out that both the LC and category theory are mentioned (although they are really discussed only in the references).

On One-Pass CPS Transformations

Olivier Danvy, Kevin Millikin and Lasse R. Nielsen. On One-Pass CPS Transformations. March 2007.

We bridge two distinct approaches to one-pass CPS transformations, i.e., CPS transformations that reduce administrative redexes at transformation time instead of in a post-processing phase. One approach is compositional and higher-order, and is independently due to Appel, Danvy and Filinski, and Wand, building on Plotkin's seminal work. The other is non-compositional and based on a reduction semantics for the lambda-calculus, and is due to Sabry and Felleisen. To relate the two approaches, we use three tools: Reynolds's defunctionalization and its left inverse, refunctionalization; a special case of fold-unfold fusion due to Ohori and Sasano, fixed-point promotion; and an implementation technique for reduction semantics due to Danvy and Nielsen, refocusing.
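
To give a feel for the first approach, here is a minimal Haskell sketch of a compositional, higher-order one-pass CPS transformation in the Danvy-Filinski style (fresh-name generation is elided, so it assumes source variables don't clash with the introduced "k", "a", and "x"):

```haskell
-- Pure lambda terms.
data Term = Var String | Lam String Term | App Term Term
  deriving Show

-- 'cpsE e k' transforms e against a *static* continuation k (a Haskell
-- function), so administrative redexes are contracted at transformation
-- time; 'cpsT e c' transforms e against a *dynamic* continuation c
-- (a term), used under binders and in tail position.
cpsE :: Term -> (Term -> Term) -> Term
cpsE (Var x)     k = k (Var x)
cpsE (Lam x e)   k = k (Lam x (Lam "k" (cpsT e (Var "k"))))
cpsE (App e1 e2) k =
  cpsE e1 (\v1 ->
    cpsE e2 (\v2 ->
      App (App v1 v2) (Lam "a" (k (Var "a")))))

cpsT :: Term -> Term -> Term
cpsT (Var x)     c = App c (Var x)
cpsT (Lam x e)   c = App c (Lam x (Lam "k" (cpsT e (Var "k"))))
cpsT (App e1 e2) c =
  cpsE e1 (\v1 ->
    cpsE e2 (\v2 ->
      App (App v1 v2) c))

-- Whole-program transformation against the identity continuation.
cpsProgram :: Term -> Term
cpsProgram e = cpsT e (Lam "x" (Var "x"))
```

Running cpsProgram on (λy.y) z yields ((λy.λk. k y) z) (λx.x) directly, with no administrative redexes left for a post-processing phase to clean up.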

This work is directly applicable to transforming programs into monadic normal form.

Also in JFP 17:793-812 (2007).

Binary Lambda Calculus and Combinatory Logic

While Anton was waxing about Church & Turing, I figured that Occam's Razor would be the kind of argument one would invoke when giving the nod to the Lambda Calculus over Universal Turing Machines. This leads inexorably to the question: what is the smallest (as measured in binary bits) Turing Machine that can possibly be constructed? John Tromp provides an answer in his always fun Lambda Calculus and Combinatory Logic Playground:

Pictured you can see the 210 bit binary lambda calculus self-interpreter, and the 272 bit binary combinatory logic self-interpreter. Both are explained in detail in my latest paper available in PostScript and PDF. This design of a minimalistic universal computer was motivated by my desire to come up with a concrete definition of Kolmogorov Complexity, which studies randomness of individual objects. All ideas in the paper have been implemented in the wonderfully elegant Haskell language, which is basically pure typed lambda calculus with lots of syntactic sugar on top.
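
The bit encoding itself is delightfully small. As I read the paper, terms in de Bruijn notation are serialized with 00 for abstraction, 01 for application, and a run of ones followed by a zero for a variable; a quick Haskell rendering (my own, with indices starting at 1):

```haskell
-- Lambda terms in de Bruijn notation (indices count binders outward,
-- starting at 1 in this rendering).
data Term = Var Int | Lam Term | App Term Term

-- Binary encoding: 00 = abstraction, 01 = application,
-- 1^i 0 = the variable with de Bruijn index i.
encode :: Term -> String
encode (Lam b)   = "00" ++ encode b
encode (App f a) = "01" ++ encode f ++ encode a
encode (Var i)   = replicate i '1' ++ "0"
```

So the identity λx.x comes out as 0010, four bits, and the 210 and 272 bits quoted above are the lengths of such bit strings for the two self-interpreters.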

Interestingly, the version based on the Lambda Calculus is smaller than the one based on Combinators. A statement in the paper about PLs that I found of interest:

Although information content may seem to be highly dependent on choice of programming language, the notion is actually invariant up to an additive constant.
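
That is the invariance theorem of Kolmogorov complexity: for any two universal languages U and V there is a constant, essentially the length of an interpreter for one written in the other, bounding how much their complexity measures can disagree:

```latex
% Invariance theorem: for universal languages U and V there exists a
% constant c_{UV}, independent of x, such that
\forall x.\; \left| K_U(x) - K_V(x) \right| \le c_{UV}
```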

Not sure if that statement means that PL research is ultimately doomed. :-)

Compiling with Continuations, Continued

Compiling with Continuations, Continued, Andrew Kennedy. ICFP 2007.

We present a series of CPS-based intermediate languages suitable for functional language compilation, arguing that they have practical benefits over direct-style languages based on A-normal form (ANF) or monads. Inlining of functions demonstrates the benefits most clearly: in ANF-based languages, inlining involves a renormalization step that rearranges let expressions and possibly introduces a new ‘join point’ function, and in monadic languages, commuting conversions must be applied; in contrast, inlining in our CPS language is a simple substitution of variables for variables.

We present a contification transformation implemented by simple rewrites on the intermediate language. Exceptions are modelled using so-called ‘double-barrelled’ CPS. Subtyping on exception constructors then gives a very straightforward effect analysis for exceptions. We also show how a graph-based representation of CPS terms can be implemented extremely efficiently, with linear-time term simplification.
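
To illustrate the renormalization the abstract mentions, here is a toy Haskell sketch (my own, not the paper's notation) of the let-flattening that inlining forces in an ANF-based language; in a CPS language the corresponding step is just substitution of variables for variables:

```haskell
-- A cut-down ANF-like language.
data Expr
  = Ret String              -- return an atom
  | Let String Expr Expr    -- let x = e1 in e2
  | Call String String      -- apply one atom to another

-- After inlining a function body, a 'Let' can end up with another
-- 'Let' as its bound expression, violating A-normal form.
-- Renormalization applies the commuting conversion
--   let x = (let y = e1 in e2) in e3
--     ==>  let y = e1 in (let x = e2 in e3)
-- (assuming y is not free in e3, e.g. after freshening).
renorm :: Expr -> Expr
renorm (Let x (Let y e1 e2) e3) = renorm (Let y e1 (Let x e2 e3))
renorm (Let x e1 e2)            = Let x (renorm e1) (renorm e2)
renorm e                        = e
```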

Analyzing the Environment Structure of Higher-Order Languages using Frame Strings

Analyzing the Environment Structure of Higher-Order Languages using Frame Strings, Matthew Might and Olin Shivers. 2007.

Reasoning about program behaviour in programming languages based on the lambda-calculus requires reasoning in a unified way about control, data and environment structure. Previous analysis work has done an inadequate job on the environment component of this task. We develop a new analytic framework, Delta-CFA, which is based on a new abstraction: frame strings, an enriched variant of procedure strings that can be used to model both control flow and environment allocation. This abstraction enables new environment-sensitive analyses and transformations that have not been previously attainable. We state the critical theorems needed to establish correctness of the entire technology suite, with their proofs.
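
As a rough intuition (my own toy rendering, far simpler than the paper's formalism), a frame string records stack-frame pushes and pops, and cancelling matched call/return pairs yields the net change in environment structure between two points in a run:

```haskell
-- Frame actions: pushing or popping a stack frame for a procedure.
data Action = Push String | Pop String deriving (Eq, Show)

-- Cancel adjacent Push f / Pop f pairs (a call immediately followed
-- by its return), leaving the net frame string.
-- e.g. net [Push "f", Push "g", Pop "g", Pop "f"] == []
net :: [Action] -> [Action]
net = foldr step []
  where
    step (Push f) (Pop g : rest) | f == g = rest
    step a        rest                    = a : rest
```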

Even if you're not interested in flow analysis of functional languages, the preface to this paper is very enjoyable reading. (If you are interested in flow analysis, the whole thing is enjoyable reading. I want a couple of weeks to go through this paper in detail.)

Lambda: The Semantics Tool

Lambda is an interactive, graphical, pedagogical computer program that helps students of formal semantics practice the typed lambda calculus.

We have discussed the use of the LC in linguistics in the past (check the archives). The tool may be useful even to those not interested in that angle, even though it is the intended use of the software.

Lambda Animator

Lambda Animator from Mike Thyer is a tool for displaying and experimenting with alternate reduction strategies in the LC. The tool can use a number of reduction strategies, including completely lazy evaluation.

The tool can be invoked in several different modes (via JWS or as an applet) and contains many examples; the documentation provides clear definitions of the sometimes confusing terminology in the field.

Notice that the "step" and "run" buttons only work when rendering new graphs, which only works if you are running in trusted mode and have Graphviz installed. Otherwise you'll have to use the up/down cursor keys or the scroll bar to review already rendered graphs (which exist for all the examples).

The site lists relevant papers and dissertations.

LC for kids (alligators, oh my!)

(via Wadler)

A visual LC game.

You can show it to the kids, or try to guess what each element in the game represents before reading the explanation at the end...
