sml-family.org

On his blog, Bob Harper, jointly with Dave MacQueen and Lars Bergstrom, announces the launch of sml-family.org:

The Standard ML Family project provides a home for online versions of various formal definitions of Standard ML, including the "Definition of Standard ML, Revised" (Standard ML 97). The site also supports coordination between different implementations of the Standard ML (SML) programming language by maintaining common resources such as the documentation for the Standard ML Basis Library and standard test suites. The goal is to increase compatibility and resource sharing between Standard ML implementations.

The site includes a history section devoted to the history of ML, and of Standard ML in particular. This section will contain a collection of original source documents relating to the design of the language.

ScratchJr

ScratchJr is an iPad version of the Scratch environment, designed with young kids in mind. It is the best kid-oriented programming tool I have tried so far, and my five-year-old has great fun making "movies" with it. As I noted on Twitter an hour after installing it, the ability to record your own voice and use it for your sprites is a killer feature. Check it out!

Functional Geometry and the Traité de Lutherie

Functional Geometry and the Traité de Lutherie by Harry Mairson, Brandeis University.

We describe a functional programming approach to the design of outlines of eighteenth-century string instruments. The approach is based on the research described in François Denis's book, Traité de lutherie. The programming vernacular for Denis's instructions, which we call functional geometry, is meant to reiterate the historically justified language and techniques of this musical instrument design. The programming metaphor is entirely Euclidean, involving straightedge and compass constructions, with few (if any) numbers, and no Cartesian equations or grid. As such, it is also an interesting approach to teaching programming and mathematics without numerical calculation or equational reasoning.

The advantage of this language-based, functional approach to lutherie is founded in the abstract characterization of common patterns in instrument design. These patterns include not only the abstraction of common straightedge and compass constructions, but of higher-order conceptualization of the instrument design process. We also discuss the role of arithmetic, geometric, harmonic, and subharmonic proportions, and the use of their rational approximants.
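
To give a rough sense of what such a vocabulary might look like, here is a small OCaml sketch of my own (the types and names are invented, not taken from the paper): constructions are values built from straightedge and compass primitives, and the coordinate arithmetic stays hidden behind the interface.

    (* A guessed illustration of a functional straightedge-and-compass
       vocabulary; the types and names are invented, not taken from
       Mairson's paper. Coordinates exist only behind the interface. *)
    type point  = { x : float; y : float }
    type line   = point * point                (* the line through two points *)
    type circle = { center : point; through : point }

    (* straightedge: draw the line through two points *)
    let line_through p q : line = (p, q)

    (* compass: draw the circle centered at c and passing through p *)
    let circle_at c p : circle = { center = c; through = p }

    (* the meeting point of two non-parallel lines; users of the
       vocabulary never see the coordinate arithmetic below *)
    let meet ((p1, p2) : line) ((q1, q2) : line) : point =
      let d1x = p2.x -. p1.x and d1y = p2.y -. p1.y in
      let d2x = q2.x -. q1.x and d2y = q2.y -. q1.y in
      let det = d1x *. d2y -. d1y *. d2x in
      let t = ((q1.x -. p1.x) *. d2y -. (q1.y -. p1.y) *. d2x) /. det in
      { x = p1.x +. t *. d1x; y = p1.y +. t *. d1y }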

Study finds that when no financial interests are involved programmers choose DECENT languages

For immediate release. In a recent (and still unpublished) study of the troves of open source software on GitHub, a team of empirical computer scientists found that programmers prefer to use DECENT languages. DECENT languages were defined by the team to be languages that conform to a set of minimal ethical standards, including clean syntax, expressive type systems, good package managers and installers, free implementations, no connection to the military-industrial complex, and not harming animals. The researchers argue that their data support the view that the prevalent use of indecent languages, Java in particular, is the result of money influencing programmer decisions. The principal author of the study notes that DECENT languages do not necessarily make the most MORAL choices (e.g., many of them are not statically typed), attributing this to a phenomenon called the "weakness of the will."

Details to follow.

Propositions as Types

Propositions as Types, Philip Wadler. Draft, March 2014.

The principle of Propositions as Types links logic to computation. At first sight it appears to be a simple coincidence---almost a pun---but it turns out to be remarkably robust, inspiring the design of theorem provers and programming languages, and continuing to influence the forefronts of computing. Propositions as Types has many names and many origins, and is a notion with depth, breadth, and mystery.

Philip Wadler has written a very enjoyable paper about propositions as types ("Like buses: you wait two thousand years for a definition of 'effectively calculable', and then three come along at once") that is accessible to PLTlettantes.
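
The slogan fits in a couple of lines of code. As a standard illustration (mine, not an excerpt from the paper), in OCaml:

    (* Two standard illustrations of the correspondence (mine, not the
       paper's): propositions are types, proofs are programs. *)

    (* A ∧ B ⊃ A : conjunction elimination is the first projection *)
    let proj1 : 'a * 'b -> 'a = fun (a, _) -> a

    (* (A ⊃ B) ∧ A ⊃ B : modus ponens is function application *)
    let modus_ponens : ('a -> 'b) * 'a -> 'b = fun (f, a) -> f a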

The origin of zero-based array indexing

An amusing historical analysis of the origin of zero-based array indexing (hint: C wasn't the first). There's a twist, which I won't reveal so as not to spoil the story for you. All in all it's a nice anecdote, but it seems to me that many of the objections raised in the comments are valid.

Lisp in Summer Projects

This summer, spend some quality time with your favorite technology in our 2013 summer programming contest!

The Lisp community is awarding prizes for demonstrating interesting and useful programs, technologies and art using any LISP-based technology.

Lisp, prizes, what's not to like?

DYNAMO

I was surprised to see that DYNAMO hasn't been mentioned here in the past. DYNAMO (DYNAmic MOdels) was the simulation language used to code the simulations that led to the famous 1972 book The Limits to Growth from the Club of Rome. The language was designed in the late 1950s. It is clear that the language was used in several other places and evolved through several iterations, though I am not sure how widely. When Stafford Beer was creating Cybersyn for Salvador Allende, he used DYNAMO to save time, suggesting it was something of a standard tool (this is described in Andrew Pickering's important book The Cybernetic Brain).

The language itself is essentially what you'd expect. It is declarative, with programs consisting of a set of equations. The equations are zero- and first-order difference equations of two kinds: level equations (accumulations) and rate equations (flows). Computation is integration over time. Levels can depend on rates and vice versa, with the language automatically handling dependencies and circularities. Code looks like code looked in those days: fixed columns, all caps, eight-character identifiers.
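
To make the level/rate scheme concrete, here is a small OCaml sketch of my own (the numerical idea only, not DYNAMO syntax): a level accumulates a rate over discrete time steps, which is just Euler integration.

    (* A sketch of the level/rate scheme, not actual DYNAMO: a level L
       accumulates a rate R over time steps of length DT, i.e. Euler
       integration L.K = L.J + DT * R.JK (J and K are successive times). *)
    let simulate ~dt ~steps ~init_level ~rate =
      let rec go t level acc =
        if t >= steps then List.rev acc
        else
          let r = rate level in    (* the rate may depend on the level *)
          go (t + 1) (level +. dt *. r) ((float_of_int t *. dt, level) :: acc)
      in
      go 0 init_level []

    (* e.g. a population growing at 3% per time unit *)
    let population =
      simulate ~dt:1.0 ~steps:50 ~init_level:1000.0
        ~rate:(fun pop -> 0.03 *. pop)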

Here are a few links:

  • Section 3.7 of this history of discrete event simulation languages gives a succinct account of the language's history and main features.
  • A more leisurely description of the language and the Limits to Growth model can be found in this article. Ironically, the author of the article reimplemented the model in JavaScript (run it!). What was originally written in a DSL is now implemented in a general purpose language, with all the niceties handled manually.
  • Finally, a nice piece on Jay Forrester, who prompted the creation of SIMPLE and its offspring, DYNAMO.

What is the most bizarre thing you have seen done with TeX?

Simple Generators v. Lazy Evaluation

Oleg Kiselyov, Simon Peyton Jones and Amr Sabry: Simple Generators:

Incremental stream processing, pervasive in practice, makes the best case for lazy evaluation. Lazy evaluation promotes modularity, letting us glue together separately developed stream producers, consumers and transformers. Lazy list processing has become a cardinal feature of Haskell. It also brings the worst in lazy evaluation: its incompatibility with effects and unpredictable and often extraordinary use of memory. Much of the Haskell programming lore are the ways to get around lazy evaluation.

We propose a programming style for incremental stream processing based on typed simple generators. It promotes modularity and decoupling of producers and consumers just like lazy evaluation. Simple generators, however, expose the implicit suspension and resumption inherent in lazy evaluation as computational effects, and hence are robust in the presence of other effects. Simple generators let us accurately reason about memory consumption and latency. The remarkable implementation simplicity and efficiency of simple generators strongly motivates investigating and pushing the limits of their expressiveness.

To substantiate our claims we give a new solution to the notorious pretty-printing problem. Like earlier solutions, it is linear, backtracking-free and with bounded latency. It is also modular, structured as a cascade of separately developed stream transducers, which makes it simpler to write, test and to precisely analyze latency, time and space consumption. It is compatible with effects including IO, letting us read the source document from a file, and format it as we read.

This is fascinating work that shows how to gain the benefits of lazy evaluation - decoupling of producers, transformers, and consumers of data, and producing only as much data as needed - in a strict, effectful setting that works well with resources that need to be disposed of once computation is done, e.g. file handles.

The basic idea is that of Common Lisp signal handling: use a hierarchical, dynamically-scoped chain of handler procedures, which get called - on the stack, without unwinding it - to parameterize code. In this case, the producer code (which e.g. reads a file character by character) is the parameterized code: every time data (a character) is produced, it calls the dynamically innermost handler procedure with the data (it yields the data to the handler). This handler is the data consumer (it could e.g. print the received character to the console). Through dynamic scoping, each handler may also have a super-handler, to which it may yield data. In this way, data flows containing multiple transformers can be composed.

I especially like the OCaml version of the code, which is just a page of code implementing a dynamically-scoped chain of handlers. With that in place we can already write map and fold in this framework (fold using a loop and a state cell, notably). There's more sample code.
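
To make the mechanism concrete, here is a small OCaml sketch of my own (the paper's actual code differs; the names are invented, the file name is hypothetical, and the element type is fixed to char for brevity):

    (* A sketch of the handler-chain idea, not the paper's code.
       The innermost handler lives in a dynamically-scoped cell. *)
    exception No_handler

    let handler : (char -> unit) ref = ref (fun _ -> raise No_handler)

    (* a producer calls [yield] for each item; the handler runs on the
       stack, without unwinding it *)
    let yield c = !handler c

    (* install [h] as the innermost handler while [body] runs, then
       restore the outer one: dynamic scoping *)
    let with_handler h body =
      let outer = !handler in
      handler := h;
      Fun.protect ~finally:(fun () -> handler := outer) body

    (* producer: read a file character by character, yielding each;
       the file handle is closed once the computation is done *)
    let file_producer path () =
      let ic = open_in path in
      Fun.protect ~finally:(fun () -> close_in ic) (fun () ->
        try while true do yield (input_char ic) done
        with End_of_file -> ())

    (* transformer: re-yield each item to the outer (super-)handler *)
    let map f producer () =
      let outer = !handler in
      with_handler (fun c -> outer (f c)) producer

    (* consumer: fold, with the accumulator in a state cell *)
    let fold f init producer =
      let acc = ref init in
      with_handler (fun c -> acc := f !acc c) producer;
      !acc

    (* count the characters of a file, with an uppercasing transformer
       in the chain just to show composition *)
    let n =
      fold (fun n _ -> n + 1) 0
        (map Char.uppercase_ascii (file_producer "input.txt"))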

This also ties in with mainstream yield.
