Semantics

Levy: a Toy Call-by-Push-Value Language

Andrej Bauer's blog hosts the PL Zoo project, which includes Levy, a toy implementation of Paul Levy's call-by-push-value (CBPV) language, written in OCaml.

If you're curious about CBPV, this implementation might be a nice accompaniment to Levy's book, or simply a hands-on way to check it out.

It looks like an implementation of CBPV without sum and product types, with complex values, and without effects. I guess a more hands-on way to get to grips with CBPV would be to implement any of these missing features.
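For a flavor of what the core looks like, here is a minimal sketch of effect-free CBPV as a deep embedding, in Haskell rather than the OCaml of the actual implementation; the encoding and names here are mine, not the Levy source's:

    -- Values are inert data; computations are the things that run.
    data Val  = VInt Int
              | VThunk Comp                -- a suspended computation is a value

    data Comp = Return Val                 -- producer: yield a value
              | Bind Comp (Val -> Comp)    -- M to x. N: run M, pass its value on
              | Force Val                  -- resume a thunked computation
              | Lam (Val -> Comp)          -- functions are computations...
              | App Comp Val               -- ...and are applied to values

    -- Running a computation ends in a terminal: a produced value or a function.
    data Terminal = Produced Val | Function (Val -> Comp)

    run :: Comp -> Terminal
    run (Return v)         = Produced v
    run (Bind m k)         = case run m of
                               Produced v -> run (k v)
                               _          -> error "ill-typed: bind of a function"
    run (Force (VThunk m)) = run m
    run (Force _)          = error "ill-typed: force of a non-thunk"
    run (Lam f)            = Function f
    run (App m v)          = case run m of
                               Function f -> run (f v)
                               _          -> error "ill-typed: applied a producer"

Running run (App (Lam (\(VInt n) -> Return (VInt (n + 1)))) (VInt 41)) produces VInt 42; adding sums, products, or effects to a core like this is exactly the kind of exercise suggested above.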

The posts are three years old, but I've only just noticed them. The PL Zoo project was briefly mentioned here.

Imperative Programs as Proofs via Game Semantics

Imperative Programs as Proofs via Game Semantics, Martin Churchill, James Laird and Guy McCusker. To appear at LICS 2011.

Game semantics extends the Curry-Howard isomorphism to a three-way correspondence: proofs, programs, strategies. But the universe of strategies goes beyond intuitionistic logics and lambda calculus, to capture stateful programs. In this paper we describe a logical counterpart to this extension, in which proofs denote such strategies. We can embed intuitionistic first-order linear logic into this system, as well as an imperative total programming language. The logic makes explicit use of the fact that in the game semantics the exponential can be expressed as a final coalgebra. We establish a full completeness theorem for our logic, showing that every bounded strategy is the denotation of a proof.

For me, this paper raises the importance of gaining a more-than-casual understanding of game semantics, since it combines two of my favorite things: polarized type theories and imperative higher-order programs.

Macros that Work Together

Macros that Work Together - Compile-Time Bindings, Partial Expansion, and Definition Contexts, Matthew Flatt, Ryan Culpepper, David Darais, and Robert Bruce Findler. Under consideration for publication in J. Functional Programming.

Racket (formerly PLT Scheme) is a large language that is built mostly within itself. Unlike the usual approach taken by non-Lisp languages, the self-hosting of Racket is not a matter of bootstrapping one implementation through a previous implementation, but instead a matter of building a tower of languages and libraries via macros. The upper layers of the tower include a class system, a component system, pedagogic variants of Scheme, a statically typed dialect of Scheme, and more. The demands of this language-construction effort require a macro system that is substantially more expressive than previous macro systems. In particular, while conventional Scheme macro systems handle stand-alone syntactic forms adequately, they provide weak support for macros that share information or macros that use existing syntactic forms in new contexts.

This paper describes and models novel features of the Racket macro system, including support for general compile-time bindings, sub-form expansion and analysis, and environment management. The presentation assumes a basic familiarity with Lisp-style macros, and it takes for granted the need for macros that respect lexical scope. The model, however, strips away the pattern and template system that is normally associated with Scheme macros, isolating a core that is simpler, that can support pattern and template forms themselves as macros, and that generalizes naturally to Racket’s other extensions.

A good description of Racket's rocket-science tools for growing languages.

Conservative Logic

Edward Fredkin and Tommaso Toffoli of the MIT Laboratory for Computer Science present Conservative Logic...

a comprehensive model of computation which explicitly reflects a number of fundamental principles of physics, such as the reversibility of the dynamical laws and the conservation of certain additive quantities (among which energy plays a distinguished role). Because it more closely mirrors physics than traditional models of computation, conservative logic is in a better position to provide indications concerning the realization of high-performance computing systems, i.e., of systems that make very efficient use of the "computing resources" actually offered by nature. In particular, conservative logic shows that it is ideally possible to build sequential circuits with zero internal power dissipation. After establishing a general framework, we discuss two specific models of computation. The first uses binary variables and is the conservative-logic counterpart of switching theory; this model proves that universal computing capabilities are compatible with the reversibility and conservation constraints. The second model, which is a refinement of the first, constitutes a substantial breakthrough in establishing a correspondence between computation and physics. In fact, this model is based on elastic collisions of identical "balls," and thus is formally identical with the atomic model that underlies the (classical) kinetic theory of perfect gases. Quite literally, the functional behavior of a general-purpose digital computer can be reproduced by a perfect gas placed in a suitably shaped container and given appropriate initial conditions.
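The paper's universal primitive, the Fredkin gate (controlled swap), makes both constraints concrete; a minimal Haskell sketch:

    -- Fredkin gate: swap a and b exactly when the control line c is set.
    fredkin :: (Bool, Bool, Bool) -> (Bool, Bool, Bool)
    fredkin (c, a, b) = if c then (c, b, a) else (c, a, b)

    -- Reversibility: the gate is its own inverse (fredkin . fredkin == id).
    -- Conservation: a swap never changes the number of True bits, the
    -- conserved additive quantity of the circuit.

    -- Universality: wiring constants in recovers ordinary logic; with b
    -- wired to False, the third output is c AND a.
    andGate :: Bool -> Bool -> Bool
    andGate c a = let (_, _, r) = fredkin (c, a, False) in r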

There is a small discussion of this paper in a forum thread, mostly saying that it should be on the front page.

Abstract interpreters for free

Matthew Might, "Abstract interpreters for free", Static Analysis Symposium 2010 (SAS 2010).

...we present a two-step method to convert a small-step concrete semantics into a family of sound, computable abstract interpretations. The first step re-factors the concrete state-space to eliminate recursive structure; this refactoring of the state-space simultaneously determines a store-passing-style transformation on the underlying concrete semantics. The second step uses inference rules to generate an abstract state-space and a Galois connection simultaneously. The Galois connection allows the calculation of the “optimal” abstract interpretation. The two-step process is unambiguous, but nondeterministic: at each step, analysis designers face choices. Some of these choices ultimately influence properties such as flow-, field- and context-sensitivity. Thus, under the method, we can give the emergence of these properties a graph-theoretic characterization.

The work in this paper provides some context for known static analysis techniques like k-CFA, and also opens up some interesting new directions for static analysis development. Also, as Matt points out, there are some pedagogical benefits to having a systematic process for getting from semantics to abstract interpretation.
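As a toy instance of calculating the "optimal" abstract interpretation from a Galois connection, here is a sign-domain sketch in Haskell; the finite cut-off universe is my own simplification to keep the concretization computable, not something from the paper:

    import qualified Data.Set as Set

    data Sign = Bot | Neg | Zero | Pos | Top deriving (Eq, Show)

    -- Abstraction: the least sign covering every element of a concrete set.
    alpha :: Set.Set Int -> Sign
    alpha = Set.foldr (joinS . signOf) Bot
      where signOf n = case compare n 0 of LT -> Neg; EQ -> Zero; GT -> Pos

    joinS :: Sign -> Sign -> Sign
    joinS Bot x = x
    joinS x Bot = x
    joinS x y   = if x == y then x else Top

    -- Concretization, cut off at a small finite universe for this sketch.
    gamma :: Sign -> Set.Set Int
    gamma s = Set.filter ok (Set.fromList [-8 .. 8])
      where ok n = case s of
              Bot -> False; Neg -> n < 0; Zero -> n == 0; Pos -> n > 0; Top -> True

    -- The optimal abstract transfer function of a concrete step f.
    best :: (Int -> Int) -> Sign -> Sign
    best f = alpha . Set.map f . gamma

Here best (* 2) Pos evaluates to Pos, while best (+ 1) Neg evaluates to Top, since incrementing a negative can reach zero; the Galois connection hands you such answers by calculation rather than by the analysis designer's ingenuity.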

What Sequential Games, the Tychonoff Theorem and the Double-Negation Shift have in Common

What Sequential Games, the Tychonoff Theorem, and the Double-Negation Shift have in Common, Martin Escardo and Paulo Oliva, to appear in MSFP 2010.

This is a tutorial for mathematically inclined functional programmers, based on previously published, peer-reviewed theoretical work. We discuss a higher-type functional, written here in the functional programming language Haskell, which

  1. optimally plays sequential games,
  2. implements a computational version of the Tychonoff Theorem from topology, and
  3. realizes the Double-Negation Shift from logic and proof theory.

The functional makes sense for finite and infinite (lazy) lists, and in the binary case it amounts to an operation that is available in any (strong) monad.

In fact, once we define this monad in Haskell, it turns out that this amazingly versatile functional is already available in Haskell, in the standard prelude, called sequence, which iterates this binary operation. Therefore Haskell proves that this functional is even more versatile than anticipated, as the function sequence was introduced for other purposes by the language designers, in particular the iteration of a list of monadic effects (but effects are not what we discuss here).

One of the most durable and productive analogies in semantics is the analogy between computability and continuity. Depending on how you read the history, this idea might even predate computers: Brouwer proved that all intuitionistic functions on the reals are continuous.

Over the last few years, Escardo and his collaborators have done a lot of cool stuff showing how this network of ideas can be turned into miraculous-looking little programs, so it's very nice to see a relatively accessible introduction to this work.
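Following the paper's Haskell development, the selection monad and the sequence trick fit in a few lines; the three-move game and payoff function below are my own toy examples, not the paper's:

    import Control.Monad (ap)
    import Data.List (maximumBy, minimumBy)
    import Data.Ord (comparing)

    -- A selection function: given a way to score candidates, pick one.
    newtype J r a = J { select :: (a -> r) -> a }

    instance Functor (J r) where
      fmap f (J e) = J (\p -> f (e (p . f)))

    instance Applicative (J r) where
      pure x = J (\_ -> x)
      (<*>)  = ap

    instance Monad (J r) where
      J e >>= f = J (\p ->
        let g a = select (f a) p     -- optimally extend the play after move a
        in  g (e (p . g)))

    -- Players picking from finite move lists: a maximizer and a minimizer.
    argsup, arginf :: Ord r => [a] -> J r a
    argsup xs = J (\p -> maximumBy (comparing p) xs)
    arginf xs = J (\p -> minimumBy (comparing p) xs)

    -- Max moves, then Min, then Max, each choosing 0 or 1; the standard
    -- Prelude sequence computes an optimal play against the payoff.
    optimalPlay :: [Int]
    optimalPlay = select (sequence [argsup [0,1], arginf [0,1], argsup [0,1]])
                         payoff
      where payoff [a, b, c] = a * b + c   -- an arbitrary toy payoff
            payoff _         = 0

Here payoff optimalPlay is 1, the value of this little game: Min can always hold the a * b term to 0, and Max can always claim the final c.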

Handlers of Algebraic Effects

Gordon Plotkin and Matija Pretnar (2009), Handlers of Algebraic Effects:

We present an algebraic treatment of exception handlers and, more generally, introduce handlers for other computational effects representable by an algebraic theory. These include nondeterminism, interactive input/output, concurrency, state, time, and their combinations; in all cases the computation monad is the free-model monad of the theory. Each such handler corresponds to a model of the theory for the effects at hand. The handling construct, which applies a handler to a computation, is based on the one introduced by Benton and Kennedy, and is interpreted using the homomorphism induced by the universal property of the free model. This general construct can be used to describe previously unrelated concepts from both theory and practice.

Handling a computational effect, such as raising an exception, amounts to homomorphically mapping the handled computation onto another computation. So, for example, raise is interpreted as the exception-handling code given to the handler.

While encompassing both returning and non-returning handlers, this idea becomes more interesting when you start to handle the other effects, such as lookup and update. Then you can get things like state rollback when an exception occurs, along with other constructs (CCS renaming and hiding, stream redirection, timeouts). Thus the semantics of handlers gives rise to a new programming construct.
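A minimal free-monad rendering of the state-rollback example in Haskell (the encoding, integer state, and nullary raise are my simplifications, not the paper's syntax):

    -- Computations built from two effects: state (Get/Put) and an exception.
    data Comp a
      = Return a
      | Raise                  -- raise: no continuation, the computation stops
      | Get (Int -> Comp a)    -- look up the state and continue with it
      | Put Int (Comp a)       -- update the state, then continue

    -- The ordinary handler: thread the state, turn Raise into Nothing.
    run :: Comp a -> Int -> (Maybe a, Int)
    run (Return a) s = (Just a, s)
    run Raise      s = (Nothing, s)
    run (Get k)    s = run (k s) s
    run (Put s' k) _ = run k s'

    -- A rollback handler: on Raise, restore the state the computation
    -- started from, discarding its updates.
    rollback :: Comp a -> Int -> (Maybe a, Int)
    rollback c s = case run c s of
      (Nothing, _) -> (Nothing, s)
      ok           -> ok

For example, run (Put 7 Raise) 0 yields (Nothing, 7), but rollback (Put 7 Raise) 0 yields (Nothing, 0): the handler decides what raising means, which is exactly the sense in which handlers are a new programming construct.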

If you want a gentler introduction to the subject (along with Plotkin's algebraic theory of effects and Levy's Call-by-Push-Value), try Pretnar's 2010 thesis.

The Galois connection between syntax and semantics

Peter Smith, The Galois connection between syntax and semantics.

explains Lawvere’s remark about ‘the familiar Galois connection between sets of axioms and classes of models, for a fixed [signature]’.

Seems like a rather nice introduction to the notion of a Galois connection (I seem to remember someone asking about this a while back).

One place Galois connections pop up is in the realm of Abstract Interpretation.
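Lawvere's connection is small enough to sketch concretely; here it is in Haskell over a two-atom propositional signature, with the infinite set of formulas cut down to a finite stock (both simplifications are mine):

    data Form = P | Q | Neg Form | And Form Form | Or Form Form
      deriving (Eq, Show)

    type Model = (Bool, Bool)   -- a valuation of the atoms (p, q)

    eval :: Model -> Form -> Bool
    eval (p, _) P         = p
    eval (_, q) Q         = q
    eval m      (Neg f)   = not (eval m f)
    eval m      (And f g) = eval m f && eval m g
    eval m      (Or  f g) = eval m f || eval m g

    allModels :: [Model]
    allModels = [(p, q) | p <- [False, True], q <- [False, True]]

    stock :: [Form]             -- a finite stock of candidate axioms
    stock = [P, Q, Neg P, Neg Q, And P Q, Or P Q]

    -- Mod: from a set of axioms to the class of models satisfying them all.
    modOf :: [Form] -> [Model]
    modOf axs = [m | m <- allModels, all (eval m) axs]

    -- Th: from a class of models to the (stocked) formulas true in them all.
    thOf :: [Model] -> [Form]
    thOf ms = [f | f <- stock, all (\m -> eval m f) ms]

    -- The antitone Galois property, for axioms drawn from the stock:
    -- ms is contained in modOf axs  iff  axs is contained in thOf ms.
    galois :: [Form] -> [Model] -> Bool
    galois axs ms =
      all (`elem` modOf axs) ms == all (`elem` thOf ms) axs

galois returns True for every choice of axioms from the stock and models from allModels: adding axioms shrinks the model class, adding models shrinks the theory, and Mod and Th are each other's adjoints.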

Milawa: A Self-Verifying Theorem Prover for an ACL2-Like Logic

Jared Davis, Milawa: A Self-Verifying Theorem Prover for an ACL2-Like Logic.

Milawa is a "self-verifying" theorem prover for an ACL2-like logic.

We begin with a simple proof checker, call it A, which is short enough to verify by the "social process" of mathematics.

We then develop a series of increasingly powerful proof checkers, call them B, C, D, and so on. We show that each of these is sound: they accept only the same formulas as A. We use A to verify B, and B to verify C, and so on. Then, since we trust A, and A says B is sound, we can trust B, and so on for C, D, and the rest.

Our final proof checker is really a theorem prover; it can carry out a goal-directed proof search using assumptions, calculation, rewrite rules, and so on. We use this theorem prover to discover the proofs of soundness for B, C, and so on, and to emit these proofs in a format that A can check. Hence, "self verifying."

This might help inform discussions of the relationship between the de Bruijn criterion (the "social process" of mathematics) and formal verification. I think it also serves as an interesting signpost on the road forward: it's one thing to say that starting with a de Bruijn core and evolving a more powerful prover is possible in principle; it's another thing for it to actually have been done. The author's thesis defense slides provide a nice, quick overview.

A Lambda Calculus for Real Analysis

Paul Taylor, A Lambda Calculus for Real Analysis.

Abstract Stone Duality is a revolutionary paradigm for general topology that describes computable continuous functions directly, without using set theory, infinitary lattice theory or a prior theory of discrete computation. Every expression in the calculus denotes both a continuous function and a program, and the reasoning looks remarkably like a sanitised form of that in classical topology. This is an introduction to ASD for the general mathematician, with application to elementary real analysis.

This language is applied to the Intermediate Value Theorem: the solution of equations for continuous functions on the real line. As is well known from both numerical and constructive considerations, the equation cannot be solved if the function "hovers" near 0, whilst tangential solutions will never be found.

In ASD, both of these failures and the general method of finding solutions of the equation when they exist are explained by the new concept of overtness. The zeroes are captured, not as a set, but by higher-type modal operators. Unlike the Brouwer degree, these are defined and (Scott) continuous across singularities of a parametric equation.

Expressing topology in terms of continuous functions rather than sets of points leads to treatments of open and closed concepts that are very closely lattice- (or de Morgan-) dual, without the double negations that are found in intuitionistic approaches. In this, the dual of compactness is overtness. Whereas meets and joins in locale theory are asymmetrically finite and infinite, they have overt and compact indices in ASD.

Overtness replaces metrical properties such as total boundedness, and cardinality conditions such as having a countable dense subset. It is also related to locatedness in constructive analysis and recursive enumerability in recursion theory.
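For contrast with how the classical story computes, here is a quick floating-point bisection sketch of the intermediate value theorem in Haskell (ordinary numerics, not ASD):

    -- Assuming f a <= 0 <= f b, narrow [a, b] until it is shorter than eps.
    bisect :: (Double -> Double) -> Double -> Double -> Double -> Double
    bisect f a b eps
      | b - a < eps = m
      | f m <= 0    = bisect f m b eps   -- a zero lies in [m, b]
      | otherwise   = bisect f a m eps   -- a zero lies in [a, m]
      where m = (a + b) / 2

    -- e.g. bisect (\x -> x * x - 2) 0 2 1e-9 approximates sqrt 2

The sign test f m <= 0 is exactly what becomes undecidable in exact real arithmetic when f hovers at 0, which is the failure the paper's notion of overtness explains.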

Paul Taylor is deadly serious about the intersection of logic, mathematics, and computation. I came across this after beating my head against Probability Theory: The Logic of Science and Axiomatic Theory of Economics over the weekend, realizing that my math just wasn't up to the tasks, and doing a Google search for "constructive real analysis." "Real analysis" because it was obvious that that was what both of the aforementioned texts were relying on; "constructive" because I'd really like to develop proofs in Coq/extract working code from them. This paper was on the second page of results. Paul's name was familiar (and not just because I share it with him); he translated Jean-Yves Girard's regrettably out-of-print Proofs and Types to English and maintains a very popular set of tools for typesetting commutative diagrams using LaTeX.
