A-Z of Programming Languages: Erlang

The latest entry in the ongoing series of interviews with PL designers, The A-Z of Programming Languages, has Joe Armstrong discussing Erlang. Two related things caught my eye. The first is the obvious truism about language features:

Removing stuff turns out to be painfully difficult. It's really easy to add features to a language, but almost impossibly difficult to remove things. In the early days we would happily add things to the language and remove them if they were a bad idea. Now removing things is almost impossible.

The other thing that I found intriguing was his mention of integrating version control into the language:

We have mechanisms that allow the application software to evolve, but not the language and libraries itself. We need mechanisms for revision control as part of the language itself. But I don't know how to do this. I've been thinking about this for a long time. Instead of having external revision control systems like Git or Subversion I'd like to see revision control and re-factoring built into the language itself with fine-grain mechanism for introspection and version control.

I'm not sure what he has in mind.

How Does Our Language Shape The Way We Think?

Seems like it's been a while since we last aggravated our linguistic experts. In How Does Our Language Shape The Way We Think?, Lera Boroditsky reopens the age-old discussion:

Such a priori arguments about whether or not language shapes thought have gone in circles for centuries, with some arguing that it's impossible for language to shape thought and others arguing that it's impossible for language not to shape thought. Recently my group and others have figured out ways to empirically test some of the key questions in this ancient debate, with fascinating results.

Since this is the Programming Languages weblog, issues surrounding natural languages in general are somewhat tangential. Unlike the situation among linguists, it is generally accepted that programming language syntax and semantics do have a significant effect on the design and construction of programs. But as in linguistics, one would be hard pressed to isolate the language from its community (culture). My take is that a large measure of the benefit of looking at new PLs derives from being exposed to differing communities - not just from learning the details of a language.

Lectures on Jacques Herbrand as a Logician

Wirth, Siekmann, Benzmueller & Autexier. Lectures on Jacques Herbrand as a Logician. SEKI Report SR-2009-01.

Herbrand's work, more than that of any other logician, provides the intellectual foundations of logic programming. This very readable article discusses Herbrand's contributions to proof theory and to the formulation of the idea of a recursive function, and, most importantly for PL, his fundamental theorem, which yields a semi-decision algorithm for first-order logic, and his unification algorithm.
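
Since unification is so central to logic programming, a toy version may help. Below is a minimal sketch of first-order unification in Haskell (my own illustrative names and term representation, nothing from the lectures): it solves equations between terms by accumulating variable bindings, with an occurs check to rule out cyclic solutions.

```haskell
import qualified Data.Map as M
import Control.Monad (foldM)

-- Terms are variables or function symbols applied to arguments;
-- a constant is a function symbol with no arguments.
data Term = Var String | Fun String [Term]
  deriving (Eq, Show)

type Subst = M.Map String Term

-- Resolve a variable through the substitution (top level only).
walk :: Subst -> Term -> Term
walk s t@(Var v) = maybe t (walk s) (M.lookup v s)
walk _ t         = t

-- Unify two terms under an existing substitution.
unify :: Term -> Term -> Subst -> Maybe Subst
unify t1 t2 s = case (walk s t1, walk s t2) of
  (Var v, Var w) | v == w -> Just s
  (Var v, t)              -> bind v t s
  (t, Var v)              -> bind v t s
  (Fun f as, Fun g bs)
    | f == g && length as == length bs
                          -> foldM (\s' (a, b) -> unify a b s') s (zip as bs)
  _                       -> Nothing

-- Bind a variable, refusing cyclic solutions (the occurs check).
bind :: String -> Term -> Subst -> Maybe Subst
bind v t s
  | occurs v t s = Nothing
  | otherwise    = Just (M.insert v t s)

occurs :: String -> Term -> Subst -> Bool
occurs v t s = case walk s t of
  Var w    -> v == w
  Fun _ ts -> any (\t' -> occurs v t' s) ts

-- unify f(X, b) with f(a, Y) yields {X -> a, Y -> b}.
main :: IO ()
main = print (unify (Fun "f" [Var "X", Fun "b" []])
                    (Fun "f" [Fun "a" [], Var "Y"])
                    M.empty)
```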

Peter Landin

I was just forwarded a message that Peter Landin passed away yesterday.

From: Edmund Robinson
Date: 4 June 2009 09:10:11 GMT+00:00
Subject: Peter Landin

I am very sorry to inform you that Peter Landin died yesterday of natural causes.

For those members who are several generations away from Peter's early contributions, he was one of the major figures in the UK at the time that Computer Science was beginning to establish itself as a discipline. Some of his papers from 40 years ago are essential reading for any serious student of programming languages as still the simplest and clearest exposition of ideas that remain fundamental. The ideas in his papers were truly original and beautiful, but Peter never had a simplistic approach to scientific progress, and would scoff at the idea of individual personal contribution. Some of his greatest contributions to the field were made as part of a golden nexus of work on programming languages in the UK in the late 60's and early 70's, containing Dana Scott and Christopher Strachey and others as well as Peter. The ideas they developed through their discussions truly lifted the study of programming languages to another level, and are now part of the bedrock of the subject.

Landin was one of the founders of our field, and did a lot of work of lasting value. We've discussed his papers here many times before, even though some of them were written decades ago. He did work that cast a long shadow - or, phrased better, work that illuminated wide vistas.

Translation of Tree-processing Programs into Stream-processing Programs based on Ordered Linear Types

Translation of Tree-processing Programs into Stream-processing Programs based on Ordered Linear Types, Koichi Kodama, Kohei Suenaga, Naoki Kobayashi, JFP 2008.

There are two ways to write a program for manipulating tree-structured data such as XML documents: One is to write a tree-processing program focusing on the logical structure of the data and the other is to write a stream-processing program focusing on the physical structure. While tree-processing programs are easier to write than stream-processing programs, tree-processing programs are less efficient in memory usage since they use trees as intermediate data. Our aim is to establish a method for automatically translating a tree-processing program to a stream-processing one in order to take the best of both worlds.

We first define a programming language for processing binary trees and a type system based on ordered linear types, and show that every well-typed program can be translated to an equivalent stream-processing program. We then extend the language and the type system to deal with XML documents. We have implemented an XML stream processor generator based on our algorithm, and obtained promising experimental results.

Linear logic is what you get when you give up the structural rules of weakening and contraction -- it's logic when you cannot reuse or forget any hypothesis you make in the course of a proof. This means that every formula is used exactly once, which makes it useful (via Curry-Howard) for programming, as well: linear types give us a way of tracking resources and state in a type system. Intuitively, you can think of it as a kind of static analysis that ensures that an object's runtime reference count will always be one.
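
To make the "reference count of one" intuition concrete, here is a tiny sketch using GHC's LinearTypes extension (available since GHC 9.0), rather than the paper's calculus; the definitions the typechecker would reject are left as comments.

```haskell
{-# LANGUAGE LinearTypes #-}

-- Each component of the pair is consumed exactly once: accepted.
swapL :: (a, b) %1 -> (b, a)
swapL (x, y) = (y, x)

-- Rejected: the linear argument would be used twice (no contraction).
-- dup :: a %1 -> (a, a)
-- dup x = (x, x)

-- Also rejected: the linear argument would never be used (no weakening).
-- discard :: a %1 -> ()
-- discard _ = ()

main :: IO ()
main = print (swapL ((1 :: Int), "x"))
```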

However, linear logic retains the structural rule of exchange, which lets us use hypotheses in a different order than we introduced them. Ordered logic is what you get when you drop exchange, as well. (Amusingly, it was invented long before linear logic -- in the 1950s, Lambek introduced it with an eye towards applications in linguistics: he wanted to express the difference between one word occurring either before, or after, another.)

This means that you can use ordered types to express, in the types themselves, the order in which you use data, which the authors here exploit to design a language whose typechecker statically rules out memory-inefficient programs.
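
As a rough illustration of the two styles the paper connects (a plain Haskell sketch, not the authors' language), compare summing a binary tree against summing its pre-order token stream. The stream version consumes each token exactly once, left to right - precisely the discipline that ordered linear types enforce statically.

```haskell
-- The logical structure: a binary tree with integers at the leaves.
data Tree = Leaf Int | Node Tree Tree

-- Tree-processing: simple, but the whole tree lives in memory.
sumTree :: Tree -> Int
sumTree (Leaf n)   = n
sumTree (Node l r) = sumTree l + sumTree r

-- The physical structure: the same tree as a pre-order token stream.
data Token = TLeaf Int | TNode

-- Stream-processing: reads each token exactly once, in order, never
-- materialising the tree. Returns the sum of one (sub)tree together
-- with the tokens that follow it.
sumStream :: [Token] -> (Int, [Token])
sumStream (TLeaf n : ts) = (n, ts)
sumStream (TNode   : ts) =
  let (sl, ts')  = sumStream ts
      (sr, ts'') = sumStream ts'
  in  (sl + sr, ts'')
sumStream []             = error "ill-formed stream"

-- Node (Leaf 1) (Leaf 2) serialises to [TNode, TLeaf 1, TLeaf 2];
-- both functions compute 3.
main :: IO ()
main = print ( sumTree (Node (Leaf 1) (Leaf 2))
             , fst (sumStream [TNode, TLeaf 1, TLeaf 2]) )
```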

Computing Needs Time

Edward A. Lee, Computing Needs Time, Communications of the ACM, Volume 52, Issue 5 (May 2009).

The foundations of computing, rooted in Turing, Church, and von Neumann, are about the transformation of data, not about physical dynamics. This paper argues that we need to rethink the core abstractions if we really want to integrate computing with physical processes. In particular, I focus on a key aspect of physical processes that is almost entirely absent in computing, the passage of time. This is not just about “real-time systems,” which accept the foundations and retrofit them with temporal properties. Although that technology has much to contribute, I will argue that it cannot solve the problem alone because it is built on flawed foundations.

The section of most direct relevance to LtU is probably section 5.2 on programming languages, which opens by saying:

Programming languages provide an abstraction layer above the ISA. If the ISA is to expose selected temporal properties, and programmers wish to exploit this, then one approach would be to reflect these in the languages.

Also potentially of interest to the LtU readership is section 5.4 on formal methods, which closes by asserting that

...type systems are formal methods that have had enormous impact. What is needed is time systems with the power of type systems.
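
What a "time system" might look like is anyone's guess. Purely as speculation (my sketch, not anything proposed in the paper), one could imagine tracking execution-time budgets at the type level, so that composing computations adds their budgets and blowing a budget becomes a type error:

```haskell
{-# LANGUAGE DataKinds, KindSignatures, TypeOperators #-}

import GHC.TypeLits (Nat, type (+))

-- A computation carrying a worst-case execution budget, in
-- microseconds, as a type-level number. Entirely hypothetical.
newtype Timed (us :: Nat) a = Timed (IO a)

-- Sequencing adds budgets at the type level: a 100us step followed
-- by a 200us step has type Timed 300 b.
andThen :: Timed m a -> (a -> Timed n b) -> Timed (m + n) b
andThen (Timed io) k = Timed (io >>= \a -> case k a of Timed io' -> io')
```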

Note: The "Tagged Signal" meta-model of computation mentioned in section 3 of the paper was previously discussed on LtU here.

Going functional on exotic trades

Simon Frankau, Diomidis Spinellis, Nick Nassuphis, Christoph Burgard. Going functional on exotic trades. JFP 19(01):27-45.

The Functional Payout Framework (fpf) is a Haskell application that uses an embedded domain-specific functional language to represent and process exotic financial derivatives. Whereas scripting languages for pricing exotic derivatives are common in banking, fpf uses multiple interpretations to not only price such trades, but also to analyse the scripts to provide lifecycle support and more. This paper discusses fpf in relation to the wider trading workflow and our experiences in using a functional language in such a system as both an implementation language and a domain-specific language.

Section 3 is a nice discussion of why Haskell was chosen. Section 4 illustrates one of the major benefits of using DSLs, namely that different backends can be applied to programs written in the DSL, allowing various types of runtime behavior and analysis without the need to re-code each program for each purpose (thus n+m, rather than n*m programs).
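
To illustrate the n+m point with a toy (hypothetical names; fpf's actual combinators are richer), here is a miniature payout language with two independent backends: one prices a script against market data, the other statically analyses which underlyings it observes.

```haskell
-- The DSL: one datatype of payout scripts.
data Payout
  = Const Double
  | Underlying String        -- observe a market price
  | Add Payout Payout
  | Max Payout Payout

-- Backend 1: price a script against observed market data.
price :: (String -> Double) -> Payout -> Double
price _   (Const x)      = x
price mkt (Underlying s) = mkt s
price mkt (Add a b)      = price mkt a + price mkt b
price mkt (Max a b)      = max (price mkt a) (price mkt b)

-- Backend 2: a static analysis listing the underlyings observed.
underlyings :: Payout -> [String]
underlyings (Const _)      = []
underlyings (Underlying s) = [s]
underlyings (Add a b)      = underlyings a ++ underlyings b
underlyings (Max a b)      = underlyings a ++ underlyings b

-- A call-option-like payout: max(S - 100, 0).
call :: Payout
call = Max (Add (Underlying "S") (Const (-100))) (Const 0)

main :: IO ()
main = do
  print (price (\s -> if s == "S" then 105 else 0) call)  -- 5.0
  print (underlyings call)                                -- ["S"]
```

Adding a new backend requires no change to existing scripts, and a new script works with every backend - which is where the n+m economy comes from.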

Another topic that may be worth discussing is the authors' dislike of point free style.

What we really need, of course, are DSLs for financial regulation (and possibly for specifying various definitions of honesty), but that's a separate issue...

Questions Five Ways

I think one of the better ideas I've had on this blog is my Questions Five Ways series. For each post, I'll ask a guiding question of five leading hackers, some from the Ruby community and some from outside it.

So far, the questions have been about concurrency, code reading, and static code analysis & testing. I understand Pat is interested in hearing suggestions for future topics.

I find the discussion about concurrency interesting. Naturally, we have been urging people to look at Erlang for quite a while, and Haskell parallelism is also a frequent topic here. It is nice to see these things becoming more mainstream. It also means it is about time we moved on to new things...

Code reading is, of course, near and dear to me.

Forth Dimensions

Forth Dimensions volumes 1-21, 1978-1999.

The Forth Interest Group's bi-monthly, hard-copy magazine with an international circulation was founded in 1978. The owner of one of Silicon Valley's most successful computer bookstores called it "the best special-interest technical magazine ever". Forth Dimensions is no longer published.

The site contains scanned copies of every issue for download as PDF. Perhaps some Forth hackers can point us at the most interesting ones?

Types are Calling Conventions

If optimization and intermediate languages for lazy functional languages are your thing, take a look at Types are Calling Conventions by Max Bolingbroke and Simon Peyton Jones.

It is common for compilers to derive the calling convention of a function from its type. Doing so is simple and modular but misses many optimisation opportunities, particularly in lazy, higher-order functional languages with extensive use of currying. We restore the lost opportunities by defining Strict Core, a new intermediate language whose type system makes the missing distinctions: laziness is explicit, and functions take multiple arguments and return multiple results.
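
A small Haskell example (mine; the paper's Strict Core has its own syntax) shows the ambiguity that motivates the work: from a curried type alone, the compiler cannot tell whether a call site is saturated or must build a partial application.

```haskell
-- From the type alone, a compiler cannot tell how 'f' will be called:
f :: Int -> Int -> Int
f x y = x + y

-- Under-saturated call: must allocate a partial application...
partial :: Int -> Int
partial = f 1

-- ...versus a saturated call, where both arguments can be passed at once.
saturated :: Int
saturated = f 1 2

main :: IO ()
main = print (partial 2, saturated)
```

In Strict Core, "take two Ints at once" and "take one Int, then return a function" would have visibly different types, so the calling convention is determined by the type, as the title says.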