Theory

Conservation laws for free!

At this year's POPL, Bob Atkey made a splash by showing how to get from parametricity to conservation laws via Noether's theorem:

Invariance is of paramount importance in programming languages and in physics. In programming languages, John Reynolds’ theory of relational parametricity demonstrates that parametric polymorphic programs are invariant under change of data representation, a property that yields “free” theorems about programs just from their types. In physics, Emmy Noether showed that if the action of a physical system is invariant under change of coordinates, then the physical system has a conserved quantity: a quantity that remains constant for all time. Knowledge of conserved quantities can reveal deep properties of physical systems. For example, the conservation of energy, which by Noether’s theorem is a consequence of a system’s invariance under time-shifting.

In this paper, we link Reynolds’ relational parametricity with Noether’s theorem for deriving conserved quantities. We propose an extension of System Fω with new kinds, types and term constants for writing programs that describe classical mechanical systems in terms of their Lagrangians. We show, by constructing a relationally parametric model of our extension of Fω, that relational parametricity is enough to satisfy the hypotheses of Noether’s theorem, and so to derive conserved quantities for free, directly from the polymorphic types of Lagrangians expressed in our system.
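
As a reminder of the "theorems for free" flavour the paper builds on, here is the classic list example in Haskell (this is just the textbook warm-up, not Atkey's system): any function of type [a] -> [a] must commute with map, purely because of its polymorphic type.

    -- any definition with this polymorphic type would do
    r :: [a] -> [a]
    r = reverse

    -- Parametricity guarantees r (map g xs) == map g (r xs) for all g and xs;
    -- a spot check of one instance:
    freeTheoremInstance :: Bool
    freeTheoremInstance =
      r (map (+ 1) [1, 2, 3 :: Int]) == map (+ 1) (r [1, 2, 3])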

Seemingly impossible programs

In case this one went under the radar, at POPL'12, Martín Escardó gave a tutorial on seemingly impossible functional programs:

Programming language semantics is typically applied to prove compiler correctness and allow (manual or automatic) program verification. Certain kinds of semantics can also be applied to discover programs that one wouldn't have otherwise thought of. This is the case, in particular, for semantics that incorporate topological ingredients (limits, continuity, openness, compactness). For example, it turns out that some function types (X -> Y) with X infinite (but compact) do have decidable equality, contradicting perhaps popular belief, but certainly not (higher-type) computability theory. More generally, one can often check infinitely many cases in finite time.

I will show you such programs, run them fast in surprising instances, and introduce the theory behind their derivation and working. In particular, I will study a single (very high type) program that (i) optimally plays sequential games of unbounded length, (ii) implements the Tychonoff Theorem from topology (and builds finite-time search functions for infinite sets), (iii) realizes the double-negation shift from proof theory (and allows us to extract programs from classical proofs that use the axiom of countable choice). There will be several examples in the languages Haskell and Agda.

A shorter version (coded in Haskell) appears in Andrej Bauer's blog.
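
The heart of that Haskell version fits in a dozen lines; the sketch below follows the style of Escardó's and Bauer's posts, though the exact names and packaging here are mine:

    type Bit    = Int                 -- intended to be 0 or 1
    type Cantor = Integer -> Bit      -- infinite binary sequences

    -- prepend a bit to an infinite sequence
    (#) :: Bit -> Cantor -> Cantor
    (x # a) i = if i == 0 then x else a (i - 1)

    -- find a sequence satisfying p, if one exists; total for computable p,
    -- thanks to the compactness of the Cantor space
    find :: (Cantor -> Bool) -> Cantor
    find p =
      if forsome (\a -> p (0 # a))
        then 0 # find (\a -> p (0 # a))
        else 1 # find (\a -> p (1 # a))

    -- quantify over all infinitely many sequences, in finite time
    forsome, forevery :: (Cantor -> Bool) -> Bool
    forsome p  = p (find p)
    forevery p = not (forsome (not . p))

    -- decidable equality for total functions out of the Cantor space,
    -- e.g. equal (\a -> a 3 * a 5) (\a -> a 5 * a 3) evaluates to True
    equal :: Eq y => (Cantor -> y) -> (Cantor -> y) -> Bool
    equal f g = forevery (\a -> f a == g a)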

sml-family.org

On his blog, Bob Harper, in a joint effort with Dave MacQueen and Lars Bergstrom, announces the launch of sml-family.org:

The Standard ML Family project provides a home for online versions of various formal definitions of Standard ML, including the "Definition of Standard ML, Revised" (Standard ML 97). The site also supports coordination between different implementations of the Standard ML (SML) programming language by maintaining common resources such as the documentation for the Standard ML Basis Library and standard test suites. The goal is to increase compatibility and resource sharing between Standard ML implementations.

The site includes a history section devoted to the history of ML, and of Standard ML in particular. This section will contain a collection of original source documents relating to the design of the language.

Inferring algebraic effects

Logical Methods in Computer Science has just published Matija Pretnar's latest take on algebraic effects and handlers:

We present a complete polymorphic effect inference algorithm for an ML-style language with handlers of not only exceptions, but of any other algebraic effect such as input & output, mutable references and many others. Our main aim is to offer the programmer a useful insight into the effectful behaviour of programs. Handlers help here by cutting down possible effects and the resulting lengthy output that often plagues precise effect systems. Additionally, we present a set of methods that further simplify the displayed types, some even by deliberately hiding inferred information from the programmer.

Pretnar and Bauer's Eff has made previous appearances here on LtU. Apart from the newfangled polymorphic effect system, this paper also contains an Eff tutorial.
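
For readers new to the topic, the basic shape of an algebraic effect and a handler can be sketched with a free monad in Haskell; this is only meant to convey the idea of handlers discharging effects, and is neither Eff's syntax nor the paper's inference algorithm.

    {-# LANGUAGE DeriveFunctor #-}

    -- computations over a signature of operations f, returning a value of type a
    data Free f a = Pure a | Op (f (Free f a)) deriving Functor

    instance Functor f => Applicative (Free f) where
      pure = Pure
      Pure g <*> x = fmap g x
      Op m   <*> x = Op (fmap (<*> x) m)

    instance Functor f => Monad (Free f) where
      Pure a >>= k = k a
      Op m   >>= k = Op (fmap (>>= k) m)

    -- one algebraic effect: binary nondeterministic choice
    data Choice k = Choose k k deriving Functor

    choose :: Free Choice Bool
    choose = Op (Choose (Pure True) (Pure False))

    -- a computation that uses the effect
    twoFlips :: Free Choice (Bool, Bool)
    twoFlips = do
      x <- choose
      y <- choose
      return (x, y)

    -- a handler: interpret Choose by collecting every outcome, thereby
    -- discharging the effect (the handled result is an ordinary value)
    allResults :: Free Choice a -> [a]
    allResults (Pure a)          = [a]
    allResults (Op (Choose l r)) = allResults l ++ allResults r

Running allResults twoFlips yields all four pairs; an effect system in the spirit of the paper would record that twoFlips may perform Choose, while allResults twoFlips is effect-free.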

Breaking the Complexity Barrier of Pure Functional Programs with Impure Data Structures

Breaking the Complexity Barrier of Pure Functional Programs with Impure Data Structures by Pieter Wuille and Tom Schrijvers:

Pure functional programming languages offer many advantages over impure languages. Unfortunately, the absence of destructive update imposes a complexity barrier. In imperative languages, there are algorithms and data structures with better complexity. We present our project for combining existing program transformation techniques to transform inefficient pure data structures into impure ones with better complexity. As a consequence, the programmer is not exposed to the impurity and retains the advantages of purity.

This paper is along the same lines as a question I asked a couple of years ago. The idea here is to allow programming against immutable interfaces, and then automatically transform the program into a more efficient mutable equivalent.
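
As a toy illustration of the complexity gap the paper targets (a hand-written example, not the authors' transformation): bumping every element of an array through the pure interface copies the array on each update, while the ST-based version performs the same updates in place behind a pure function.

    import Control.Monad (forM_)
    import Data.Array (Array, (!), (//), indices)
    import Data.Array.ST (runSTArray, thaw, readArray, writeArray)

    -- pure interface: every (//) copies the array, so n single-element
    -- updates cost O(n^2) overall
    bumpAllPure :: Array Int Int -> Array Int Int
    bumpAllPure a = foldl (\acc i -> acc // [(i, acc ! i + 1)]) a (indices a)

    -- same observable behaviour, but the updates are destructive on a
    -- thawed copy and cost O(n) overall; purity is preserved at the interface
    bumpAllST :: Array Int Int -> Array Int Int
    bumpAllST a = runSTArray $ do
      m <- thaw a
      forM_ (indices a) $ \i -> do
        v <- readArray m i
        writeArray m i (v + 1)
      return m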

Luca Cardelli Festschrift

Earlier this week Microsoft Research Cambridge organised a Festschrift for Luca Cardelli. The preface from the book:

Luca Cardelli has made exceptional contributions to the world of programming languages and beyond. Throughout his career, he has re-invented himself every decade or so, while continuing to make true innovations. His achievements span many areas: software; language design, including experimental languages; programming language foundations; and the interaction of programming languages and biology. These achievements form the basis of his lasting scientific leadership and his wide impact.

...

Luca is always asking "what is new", and is always looking to the future. Therefore, we have asked authors to produce short pieces that would indicate where they are today and where they are going. Some of the resulting pieces are short scientific papers, or abridged versions of longer papers; others are less technical, with thoughts on the past and ideas for the future. We hope that they will all interest Luca.

Hopefully the videos will be posted soon.

Propositions as Types

Propositions as Types, Philip Wadler. Draft, March 2014.

The principle of Propositions as Types links logic to computation. At first sight it appears to be a simple coincidence---almost a pun---but it turns out to be remarkably robust, inspiring the design of theorem provers and programming languages, and continuing to influence the forefronts of computing. Propositions as Types has many names and many origins, and is a notion with depth, breadth, and mystery.

Philip Wadler has written a very enjoyable paper ("Like buses: you wait two thousand years for a definition of 'effectively calculable', and then three come along at once") about propositions as types that is accessible to PLTlettantes.
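
The principle itself fits in a few lines of Haskell, reading the pair type as conjunction, the function arrow as implication, and an empty type as falsity (the names below are purely illustrative):

    data Void                    -- the empty type: the proposition "false"

    -- A and B implies A
    proj1 :: (a, b) -> a
    proj1 (x, _) = x

    -- (A implies B) and (B implies C) implies (A implies C)
    chain :: (a -> b, b -> c) -> (a -> c)
    chain (f, g) = g . f

    -- A implies not (not A), reading "not A" as A -> Void
    doubleNeg :: a -> (a -> Void) -> Void
    doubleNeg x k = k x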

The marriage of bisimulations and Kripke logical relations

Chung-Kil Hur, Derek Dreyer, Georg Neis, and Viktor Vafeiadis (POPL 2012). The Marriage of Bisimulations and Kripke Logical Relations.

There has been great progress in recent years on developing effective techniques for reasoning about program equivalence in ML-like languages---that is, languages that combine features like higher-order functions, recursive types, abstract types, and general mutable references. Two of the most prominent types of techniques to have emerged are bisimulations and Kripke logical relations (KLRs). While both approaches are powerful, their complementary advantages have led us and other researchers to wonder whether there is an essential tradeoff between them. Furthermore, both approaches seem to suffer from fundamental limitations if one is interested in scaling them to inter-language reasoning.

In this paper, we propose relation transition systems (RTSs), which marry together some of the most appealing aspects of KLRs and bisimulations. In particular, RTSs show how bisimulations' support for reasoning about recursive features via coinduction can be synthesized with KLRs' support for reasoning about local state via state transition systems. Moreover, we have designed RTSs to avoid the limitations of KLRs and bisimulations that preclude their generalization to inter-language reasoning. Notably, unlike KLRs, RTSs are transitively composable.

I understand the paper as offering an extension to bisimulation that handles the notion of hidden transitions properly and so allows a generalisation of KLRs to any systems that can be treated using bisimulations. Applications to verified compilation are mentioned, and everything has been validated in Coq.
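
A concrete feel for the kind of equivalence at stake is given by the Pitts-Stark "awkward example", a standard benchmark in this line of work (transcribed here into Haskell's IO for illustration; the paper itself works with an ML-like language):

    import Data.IORef (newIORef, readIORef, writeIORef)

    -- allocates a private reference; the returned function sets it to 1,
    -- runs its callback, and reads the reference back
    awkward1 :: IO (IO () -> IO Int)
    awkward1 = do
      r <- newIORef (0 :: Int)
      return $ \callback -> do
        writeIORef r 1
        callback
        readIORef r

    -- no state at all: run the callback and return 1
    awkward2 :: IO (IO () -> IO Int)
    awkward2 = return $ \callback -> callback >> return 1

The equivalence such techniques establish is that no client can tell the two functions apart: the hidden reference, once set to 1, is never reset, even if the callback re-enters the function.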

Copatterns: the final approach to codata?

Andreas Abel and Brigitte Pientka's Wellfounded Recursion with Copatterns: A Unified Approach to Termination and Productivity is one of my highlights of the just-finished ICFP 2013, but it makes sense to focus on the first paper on this work, published at POPL back in January.

Copatterns: Programming Infinite Structures by Observations
Andreas Abel, Brigitte Pientka, David Thibodeau, Anton Setzer
2013

Inductive datatypes provide mechanisms to define finite data such as finite lists and trees via constructors and allow programmers to analyze and manipulate finite data via pattern matching. In this paper, we develop a dual approach for working with infinite data structures such as streams. Infinite data inhabits coinductive datatypes which denote greatest fixpoints. Unlike finite data which is defined by constructors we define infinite data by observations. Dual to pattern matching, a tool for analyzing finite data, we develop the concept of copattern matching, which allows us to synthesize infinite data. This leads to a symmetric language design where pattern matching on finite and infinite data can be mixed. We present a core language for programming with infinite structures by observations together with its operational semantics based on (co)pattern matching and describe coverage of copatterns. Our language naturally supports both call-by-name and call-by-value interpretations and can be seamlessly integrated into existing languages like Haskell and ML. We prove type soundness for our language and sketch how copatterns open new directions for solving problems in the interaction of coinductive and dependent types.

Codata has often been discussed here and elsewhere. See notably the discussion on Turner's Total Functional Programming (historical note: the 2004 paper is a beautification of the original 1995 paper, which had much of the same ideas), and on the category-theory-inspired Charity language. Given those precedents, it would be easy for the quick reader to "meh" at the novelty of putting "observation" first (elimination rather than introduction rules) when talking about codata; yet the above paper is the first concrete, usable presentation of this observation-first approach in a practical setting that feels right, and it solves a long-standing problem that current dependently-typed languages (Coq and Agda) have.
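
Haskell has no copatterns, but the flavour of "defining infinite data by its observations" can be approximated with a record of observations (an approximation for intuition only, not the paper's language):

    -- a stream is whatever answers the two observations shead and stail
    data Stream a = Stream { shead :: a, stail :: Stream a }

    -- In copattern style one would write the two clauses
    --   shead (nats n) = n
    --   stail (nats n) = nats (n + 1)
    -- i.e. define nats by how it responds to each observation; in Haskell
    -- we build the record of observations instead.
    nats :: Integer -> Stream Integer
    nats n = Stream { shead = n, stail = nats (n + 1) }

    -- observing a finite prefix
    takeS :: Int -> Stream a -> [a]
    takeS k s
      | k <= 0    = []
      | otherwise = shead s : takeS (k - 1) (stail s)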

Coinduction has an even more prominent role, due to its widespread use in defining program equivalence in concurrent process calculi; the relevant LtU discussion here is the one on Davide Sangiorgi's On the origins of Bisimulation, Coinduction, and Fixed Points. The POPL'13 paper doesn't really tell us how coinduction should be seen with copatterns. It does not address the question of termination, which is the topic of the more recent ICFP'13 paper, but I would say that the answer on that point feels less definitive.

The Size-Change Termination Principle for Constructor Based Languages

The Size-Change Termination Principle for Constructor Based Languages, by Pierre Hyvernat:

This paper describes an automatic termination checker for a generic first-order call-by-value language in ML style. We use the fact that values are built from variants and tuples to keep some information about how arguments of recursive calls evolve during evaluation. The result is a criterion for termination extending the size-change termination principle of Lee, Jones and Ben-Amram that can detect size changes inside subvalues of arguments. Moreover, the corresponding algorithm is easy to implement, making it a good candidate for experimentation.

Looks like a relatively straightforward and complete description of a termination checker based on a notion of 'sized types' limited to first-order programs. LtU has covered this topic before, although this new paper doesn't seem to reference that particular Abel work.
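
To see what a size-change criterion certifies, consider two standard examples (illustrations of the principle, not taken from the paper): along every infinite sequence of recursive calls, some well-founded measure on the arguments, or, with Hyvernat's extension, on subvalues of the arguments, must keep decreasing, so no infinite call sequence is possible.

    -- Ackermann: each recursive call either shrinks the first argument, or
    -- keeps it and shrinks the second; size-change analysis detects this
    -- pattern without a user-supplied lexicographic measure.
    ack :: Integer -> Integer -> Integer
    ack 0 n = n + 1
    ack m 0 = ack (m - 1) 1
    ack m n = ack (m - 1) (ack m (n - 1))

    -- Here the decreasing value is a subvalue of the argument: the list
    -- stored inside the pair shrinks at each call even though the pair
    -- itself is rebuilt, the kind of situation the paper's extension is
    -- designed to recognise.
    sumAcc :: ([Int], Int) -> Int
    sumAcc ([], acc)     = acc
    sumAcc (x : xs, acc) = sumAcc (xs, acc + x)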
