Functional

ICFP Programming Contest 2018

Yep, it's on!

Sequent Calculus as a Compiler Intermediate Language

Sequent Calculus as a Compiler Intermediate Language (2016), by Paul Downen, Luke Maurer, Zena M. Ariola, Simon Peyton Jones.

The typed λ-calculus arises canonically as the term language for a logic called natural deduction, using the Curry-Howard isomorphism: the pervasive connection between logic and programming languages asserting that propositions are types and proofs are programs. Indeed, for many people, the λ-calculus is the living embodiment of Curry-Howard.

But natural deduction is not the only logic! Conspicuously, natural deduction has a twin, born in the very same paper, called the sequent calculus. Thanks to the Curry-Howard isomorphism, terms of the sequent calculus can also be seen as a programming language with an emphasis on control flow.
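To make the "programs as sequent proofs" reading concrete, here is a rough OCaml sketch of a λμμ̃-style sequent term language (an illustration only, not the paper's Sequent Core, and the constructor names are made up): a command pairs a producer of a value with a consumer of it, so the "rest of the computation" becomes explicit syntax rather than an implicit evaluation context.

    (* A rough sketch of a sequent-calculus term language in the
       lambda-bar-mu-mu-tilde style; names are illustrative. *)

    type producer =                        (* terms: ways to produce a value   *)
      | Var of string                      (* x                                *)
      | Lam of string * producer           (* fun x -> v                       *)
      | Mu  of string * command            (* mu alpha. c : name the current
                                              consumer alpha, then run c       *)

    and consumer =                         (* coterms: ways to consume a value *)
      | Covar   of string                  (* alpha : a named continuation     *)
      | Push    of producer * consumer     (* v . e : a call-stack frame       *)
      | MuTilde of string * command        (* mu~ x. c : bind the incoming
                                              value to x, then run c           *)

    and command = Cut of producer * consumer  (* <v || e> : a value meets its
                                                 evaluation context            *)

    (* The application "f x", evaluated in a context named alpha, becomes the
       command <f || x . alpha>: hand f a stack holding the argument x and the
       rest of the computation alpha.  Control flow is now plain data. *)
    let example : command = Cut (Var "f", Push (Var "x", Covar "alpha"))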

Resource Polymorphism

Resource Polymorphism, by Guillaume Munch-Maccagnoni:

We present a resource-management model for ML-style programming languages, designed to be compatible with the OCaml philosophy and runtime model. This is a proposal to extend the OCaml language with destructors, move semantics, and resource polymorphism, to improve its safety, efficiency, interoperability, and expressiveness. It builds on the ownership-and-borrowing models of systems programming languages (Cyclone, C++11, Rust) and on linear types in functional programming (Linear Lisp, Clean, Alms). It continues a synthesis of resources from systems programming and resources in linear logic initiated by Baker.

It is a combination of many known and some new ideas. On the novel side, it highlights the good mathematical structure of Stroustrup's "Resource acquisition is initialisation" (RAII) idiom for resource management based on destructors, a notion sometimes confused with finalizers, and builds on it a notion of resource polymorphism, inspired by polarisation in proof theory, that mixes C++'s RAII and a tracing garbage collector (GC).
The proposal targets a new spot in the design space, with an automatic and predictable resource-management model, at the same time based on lightweight and expressive language abstractions. It is backwards-compatible: current code is expected to run with the same performance, the new abstractions fully combine with the current ones, and it supports a resource-polymorphic extension of libraries. It does so with only a few additions to the runtime, and it integrates with the current GC implementation. It is also compatible with the upcoming multicore extension, and suggests that the Rust model for eliminating data-races applies.

Interesting questions arise for a safe and practical type system, many of which have already been thoroughly investigated in the languages and prototypes Cyclone, Rust, and Alms.

An ambitious goal, and it would be an incredibly useful addition to OCaml. It might even displace Rust in some niches: in principle you could avoid triggering the GC entirely, while still having an excellent GC available when you need it, and the lack of that fallback is a real pain point for Rust.
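For contrast with the proposal, here is the kind of explicit cleanup OCaml code needs today. This is just a sketch of the status quo using the standard library's Fun.protect (available since OCaml 4.08), not the proposed extension; the proposal's destructors would attach the cleanup to the value's own lifetime instead of to a wrapper the programmer must remember at every use site.

    (* Status quo sketch: resource cleanup is attached manually at the use
       site.  Under the proposal, a destructor tied to [ic] itself would make
       the close automatic and exception-safe without the explicit wrapper. *)
    let read_first_line path =
      let ic = open_in path in
      Fun.protect
        ~finally:(fun () -> close_in_noerr ic)  (* runs on return or exception *)
        (fun () -> input_line ic)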

How efficient is partial sharing?

Partial sharing graphs offer a reduction model for the lambda calculus that is optimal in the sense put forward by Jean-Jacques Lévy. This model has seen interest wax and wane: initially it was thought to offer the most efficient possible technology for implementing the lambda calculus, but then an important result showed that the bookkeeping overhead of any such model cannot, in general, be bounded by an elementary function (Asperti & Mairson 1998). This result had a chilling effect on the initial wave of excitement over the technology.

Now Stefano Guerrini, one of the early investigators of partial sharing graphs, has an article with Marco Solieri (Guerrini & Solieri 2017) arguing that the gains from optimality can be very high and that partial sharing graphs can come relatively close to the best possible efficiency: within a quadratic factor on a conservative analysis, which counts as close indeed when the yardstick is elementary recursion. Will the argument and result lead to renewed interest in partial sharing graphs from implementors of functional programming languages? We'll see...

(Asperti & Mairson 1998) Parallel beta reduction is not elementary recursive.

(Guerrini & Solieri 2017) Is the Optimal Implementation inefficient? Elementarily not.

Is Haskell the right language for teaching functional programming principles?

No! (As Simon Thompson explains.)
You cannot not love the "exploration of the length function" at the bottom. Made me smile in the middle of running errands.

A unified approach to solving seven programming problems

A fun pearl by William E. Byrd, Michael Ballantyne, Gregory Rosenblatt, and Matthew Might from ICFP: seven programming challenges solved (easily!) using a relational interpreter. One challenge, for example, is to find quines. Another is to find programs that produce different results with lexical vs. dynamic scope.

The interpreter is implemented in miniKanren (of course), inside Racket (of course).
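To see why the lexical-versus-dynamic-scope challenge is a nice test case, here is a small, direct (non-relational) OCaml sketch: two evaluators for the same tiny language that differ only in whether a closure remembers its defining environment, plus one term that yields 1 under lexical scope and 2 under dynamic scope. The paper, of course, goes the other way around, running a relational interpreter backwards in miniKanren to search for such terms.

    (* A tiny language with integers, variables, one-argument functions and
       application.  The two evaluators below differ only in how a closure's
       free variables are resolved. *)
    type expr =
      | Int of int
      | Var of string
      | Lam of string * expr
      | App of expr * expr

    type value =
      | VInt of int
      | VClos of string * expr * (string * value) list  (* param, body, captured env *)

    (* Lexical scope: a lambda captures the environment where it was defined. *)
    let rec eval_lex env = function
      | Int n -> VInt n
      | Var x -> List.assoc x env
      | Lam (x, body) -> VClos (x, body, env)
      | App (f, a) ->
          (match eval_lex env f with
           | VClos (x, body, captured) ->
               eval_lex ((x, eval_lex env a) :: captured) body
           | VInt _ -> failwith "not a function")

    (* Dynamic scope: free variables are looked up in the caller's environment. *)
    let rec eval_dyn env = function
      | Int n -> VInt n
      | Var x -> List.assoc x env
      | Lam (x, body) -> VClos (x, body, [])            (* nothing captured *)
      | App (f, a) ->
          (match eval_dyn env f with
           | VClos (x, body, _) ->
               eval_dyn ((x, eval_dyn env a) :: env) body  (* extend caller's env *)
           | VInt _ -> failwith "not a function")

    (* Roughly:  let x = 1 in
                 let f = fun _ -> x in
                 let x = 2 in f 0
       Lexical scope answers 1 (f sees the outer x);
       dynamic scope answers 2 (f sees whatever x is bound at the call). *)
    let program =
      App (Lam ("x",
            App (Lam ("f",
                  App (Lam ("x", App (Var "f", Int 0)), Int 2)),
                 Lam ("_d", Var "x"))),
           Int 1)

    let show = function VInt n -> string_of_int n | VClos _ -> "<closure>"

    let () =
      Printf.printf "lexical: %s\ndynamic: %s\n"
        (show (eval_lex [] program))
        (show (eval_dyn [] program))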

ICFP 2017 live streaming

If you are one of the few LtU-ers not presenting (just kidding...), you can get your functional programming fix by following the live stream from ICFP, starting tomorrow at 11:00 (UK). Want to know when to tune in? Check out the incredible program!

Proceedings of the ACM on Programming Languages

Proceedings of the ACM on Programming Languages (PACMPL) is a Gold Open Access journal publishing research on all aspects of programming languages, from design to implementation and from mathematical formalisms to empirical studies. Each issue of the journal is devoted to a particular subject area within programming languages and will be announced through publicized Calls for Papers.

See the ToC of the September 2017 (ICFP) issue here. Some very cool stuff.

Congrats!

Review of Graham Hutton's Programming in Haskell, 2e

A concise review by Simon Thompson of the second edition of Graham Hutton's Programming in Haskell. The first edition was published in 2007, though its chapters were written earlier; the review focuses on how the new edition reflects the ways the language has changed since then, embracing the "categorical / algebraic approach more fully".

RustBelt: Securing the Foundations of the Rust Programming Language

RustBelt: Securing the Foundations of the Rust Programming Language by Ralf Jung, Jacques-Henri Jourdan, Robbert Krebbers, Derek Dreyer:

Rust is a new systems programming language that promises to overcome the seemingly fundamental tradeoff between high-level safety guarantees and low-level control over resource management. Unfortunately, none of Rust’s safety claims have been formally proven, and there is good reason to question whether they actually hold. Specifically, Rust employs a strong, ownership-based type system, but then extends the expressive power of this core type system through libraries that internally use unsafe features. In this paper, we give the first formal (and machine-checked) safety proof for a language representing a realistic subset of Rust. Our proof is extensible in the sense that, for each new Rust library that uses unsafe features, we can say what verification condition it must satisfy in order for it to be deemed a safe extension to the language. We have carried out this verification for some of the most important libraries that are used throughout the Rust ecosystem.

Rust is definitely pushing the envelope in a new direction, but there's always a little wariness around using libraries that make use of unsafe features, since "safety with performance" is a main reason people want to use Rust. So this is a great step in the right direction!
