Functional

OCaml vs. C++ for Dynamic Programming

Some might want to check this /. thread:

On one particular problem instance (a garden of size 7x3), my C++ implementation finished in 1 second, while the OCaml implementation was still running after 16 minutes. Bear in mind that my OCaml implementation was dramatically faster than my equivalent Haskell code... If you write OCaml code that is isomorphic to C code, it will be fast---what about if you use OCaml the way it was meant to be used?

My critical view of this sort of thing is well known, and I don't want another holy war (the Knuth thread is enough for my taste), but I guess we should check comparisons of this sort from time to time, if only to keep everyone honest (including ourselves).

Educational Pearl: Automata as Macros

Shriram Krishnamurthi's classic macro automaton example, well known from his LL1 presentation, is now written up as an educational pearl for JFP.

As you may recall, the example shows how to use Scheme macros to create an automata DSL, and concentrates on the importance of proper tail calls for achieving the desired semantics.
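For readers who haven't seen the pearl, the flavour is easy to convey even without macros: each state becomes a function and each transition a tail call, so running the automaton takes constant space. Here is a rough hand-written analogue in Haskell (the paper, of course, generates this shape of code from a declarative description via Scheme macros):

    -- One function per state, every transition a tail call.
    -- Recognizes the car/cdr family: a 'c', then any mix of 'a'/'d', then 'r'.
    accepts :: String -> Bool
    accepts = start
      where
        start ('c':rest) = more rest
        start _          = False

        more ('a':rest) = more rest
        more ('d':rest) = more rest
        more ('r':rest) = end rest
        more _          = False

        end [] = True
        end _  = False

    main :: IO ()
    main = mapM_ (print . accepts) ["cadr", "cddar", "cr", "car!"]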

De-typechecker: converting from a type to a term

This message presents polymorphic functions that derive a term for a given type -- for a class of fully polymorphic functions: proper and improper combinators.

Oleg's converter lets us 'visualize' what a function with a particular type may be doing. This can be helpful for understanding functions written in point-free style.

And to top it off, we are shown how to do SLD resolution using Haskell typeclasses...
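The de-typechecker itself is too long to reproduce here, but the underlying trick -- typeclass instance selection behaving like Prolog's SLD resolution -- can be seen in miniature. A small sketch (mine, not Oleg's), using multi-parameter classes and functional dependencies to do Peano addition at the type level:

    {-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
                 FlexibleInstances, UndecidableInstances #-}

    -- Type-level Peano numerals.
    data Z   = Z
    data S n = S n

    -- The class plays the role of a Prolog predicate add(X, Y, Z);
    -- instance selection performs the resolution.
    class Add a b c | a b -> c where
      add :: a -> b -> c

    instance Add Z b b where
      add Z b = b

    instance Add a b c => Add (S a) b (S c) where
      add (S a) b = S (add a b)

    -- The compiler infers that 2 + 1 has type S (S (S Z)):
    -- it has "proved" the sum by resolving over the instances.
    three :: S (S (S Z))
    three = add (S (S Z)) (S Z)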

Embedded Interpreters

This is a tutorial on using type-indexed embedding projection pairs when writing interpreters in statically-typed functional languages. The method allows (higher-order) values in the interpreting language to be embedded in the interpreted language and values from the interpreted language may be projected back into the interpreting one. This is particularly useful when adding command-line interfaces or scripting languages to applications written in functional languages. We first describe the basic idea and show how it may be extended to languages with recursive types and applied to elementary meta-programming. We then show how the method combines with Filinski's continuation-based monadic reflection operations to define an extensional version of the call-by-value monadic translation and hence to allow values to be mapped bidirectionally between the levels of an interpreter for a functional language parameterized by an arbitrary monad. Finally, we show how SML functions may be embedded into, and projected from, an interpreter for an asynchronous pi-calculus via an extensional variant of a standard translation from lambda into pi.

Another paper from Nick Benton.

Like the previous one, this paper is dense and detailed, but this time there are some useful practical techniques that may come in handy next time you build a DSL in a functional language.
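To give the flavour: the core of the technique is a pair of functions for each type, embedding values of that type into a universal datatype and projecting them back out, with function types handled by composing the pairs contravariantly. A minimal sketch (the paper works in SML; the names here are mine):

    -- Universal value type of the interpreted language.
    data U = UInt Int
           | UFun (U -> U)

    -- A type-indexed embedding/projection pair.
    data EP a = EP { embed   :: a -> U
                   , project :: U -> a }

    int :: EP Int
    int = EP UInt projInt
      where
        projInt (UInt n) = n
        projInt _        = error "expected an integer"

    -- Function types: contravariant in the argument, covariant in the result.
    (-->) :: EP a -> EP b -> EP (a -> b)
    ea --> eb = EP emb proj
      where
        emb f = UFun (embed eb . f . project ea)
        proj (UFun f) = project eb . f . embed ea
        proj _        = error "expected a function"

    -- Embed a host-level function into the interpreter, then project it back out.
    twice :: U
    twice = embed (int --> int) (* 2)

    useIt :: Int
    useIt = project (int --> int) twice 21   -- 42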

Functional Geometry

Frank Buss commented on c.l.l. that he never really understood how to use higher-order functions as combinators until he read Peter Henderson's article on Functional Geometry. He has written his own article describing how he used these ideas, and Common Lisp, to re-create the famous recursive fish drawing by M.C. Escher.

Nothing new for most readers, but it's fun anyway.
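The idea, roughly: a picture is just a value, and combinators like beside, above and rot build bigger pictures out of smaller ones by transforming coordinates. A small sketch in Haskell (not Frank Buss's Lisp code, and much simplified from Henderson's paper):

    -- A picture is a set of line segments within the unit square;
    -- combinators work by rescaling and shifting coordinates.
    type Point   = (Double, Double)
    type Picture = [(Point, Point)]

    transform :: (Point -> Point) -> Picture -> Picture
    transform f = map (\(p, q) -> (f p, f q))

    -- Place p on the left half and q on the right half of the square.
    beside :: Picture -> Picture -> Picture
    beside p q = transform (\(x, y) -> (x / 2, y)) p
              ++ transform (\(x, y) -> (x / 2 + 0.5, y)) q

    -- Place p above q.
    above :: Picture -> Picture -> Picture
    above p q = transform (\(x, y) -> (x, y / 2 + 0.5)) p
             ++ transform (\(x, y) -> (x, y / 2)) q

    -- Rotate a quarter turn within the unit square.
    rot :: Picture -> Picture
    rot = transform (\(x, y) -> (y, 1 - x))

    -- Escher's fish emerges from recursive combinations of a single tile, e.g.:
    quartet :: Picture -> Picture -> Picture -> Picture -> Picture
    quartet a b c d = (a `beside` b) `above` (c `beside` d)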

Region Streams: Functional Macroprogramming for Sensor Networks

This paper presents the design of a functional macroprogramming language for sensor networks, called Regiment. The essential data model in Regiment is based on region streams, which represent spatially distributed, time-varying collections of node state. A region stream might represent the set of sensor values across all nodes in an area or the aggregation of sensor values within that area. Regiment is a purely functional language, which gives the compiler considerable leeway in terms of realizing region stream operations across sensor nodes and exploiting redundancy within the network.

The operations on region streams include fold and map. The language uses monads to represent time-varying values.
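The posting shows little surface syntax, so purely as a hypothetical illustration of what such signatures might look like (none of these names are Regiment's), here is a toy model with lists standing in for both streams and regions:

    -- Hypothetical sketch, not Regiment's actual API.
    newtype Stream a = Stream [a]   -- a time-varying value, one sample per tick
    newtype Region a = Region [a]   -- a spatially distributed collection of node state

    type RegionStream a = Stream (Region a)

    rmap :: (a -> b) -> RegionStream a -> RegionStream b
    rmap f (Stream rs) = Stream [Region (map f xs) | Region xs <- rs]

    rfold :: (b -> a -> b) -> b -> RegionStream a -> Stream b
    rfold f z (Stream rs) = Stream [foldl f z xs | Region xs <- rs]

    -- E.g. the average reading over a region, at every instant
    -- (assuming the region is never empty).
    average :: RegionStream Double -> Stream Double
    average r = Stream (zipWith (/) sums counts)
      where
        Stream sums   = rfold (+) 0 r
        Stream counts = rfold (\n _ -> n + 1) 0 r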

I know nothing about sensor networks (this paper is from DMSN 2004: the First Workshop on Data Management for Sensor Networks), so I can't judge how useful this project really is.

Notice that Regiment isn't implemented yet. Only a highly restricted (dynamically typed) subset is.

Implicit parallel functional programming

This discussion over on the Haskell Mailing List might interest those readers following the debates on concurrency issues in the discussion group...

Non-determinism in functional languages

Non-determinism in functional languages. Søndergaard and Sestoft. The Computer Journal, Volume 35, Issue 5, pp. 514-523, 1992.

The introduction of a non-deterministic operator in even a very simple functional programming language gives rise to a plethora of semantic questions. These questions are not only concerned with the choice operator itself. A surprisingly large number of different parameter passing mechanisms are made possible by the introduction of bounded non-determinism. The diversity of semantic possibilities is examined systematically...
A very useful paper if you are interested in this sort of thing. Thinking about non-determinism helps clarify muddled thinking about properties such as referential transparency.

The version I found online, alas, is a set of tiff image files, one for each page...
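To give a taste of the semantic questions involved (my sketch, not an example from the paper): modelling a binary choice operator with the list monad already exposes the difference between resolving a choice once when an argument is passed and resolving it anew at each use of the parameter -- what this literature calls call-time versus run-time choice.

    import Control.Monad (liftM2)

    -- x ? y non-deterministically yields x or y.
    (?) :: a -> a -> [a]
    x ? y = [x, y]

    double :: Int -> Int
    double n = n + n

    -- "Call-time choice": resolve the choice once, then pass the value.
    callTime :: [Int]
    callTime = do n <- 1 ? 2
                  return (double n)            -- [2, 4]

    -- "Run-time choice": substitute the choice expression into the body,
    -- so each occurrence of the parameter may choose differently.
    runTime :: [Int]
    runTime = liftM2 (+) (1 ? 2) (1 ? 2)       -- [2, 3, 3, 4]

    main :: IO ()
    main = print (callTime, runTime)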

Implementation of FPL

Simon Peyton Jones's 1987 book, The Implementation of Functional Programming Languages, is available online.

This book is about implementing functional programming languages using lazy graph reduction, and it is divided into three parts.

The first part describes how to translate a high-level functional language into an intermediate language, called lambda calculus, including detailed coverage of pattern-matching and type checking. The second part begins with a simple implementation of the lambda calculus, based on graph reduction, and then develops a number of refinements and alternatives, such as super-combinators, full laziness and SK combinators. Finally, the third part describes the G-machine, a sophisticated implementation of graph reduction, which provides a dramatic increase in performance...

via the Haskell mailing lists. I've seen lots of references to this work, as it was one of the seminal works in the development of the Haskell programming language. But I can't find where it was directly discussed on LtU before.
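As a reminder of the pipeline the book describes (this is a toy of mine, not code from the book): the source language is translated into a small lambda-calculus core, which is then evaluated by reduction. A tree-walking, normal-order evaluator shows the shape of the problem; the book's contribution is doing the same thing on a shared graph and compiling it, via supercombinators, down to the G-machine.

    -- A tiny lambda-calculus core and a normal-order evaluator to
    -- weak head normal form.  Substitution here is naive (no capture
    -- avoidance), which is fine for closed toy terms.
    data Expr = Var String
              | Num Int
              | Lam String Expr
              | App Expr Expr
      deriving Show

    subst :: String -> Expr -> Expr -> Expr
    subst x e (Var y)
      | x == y    = e
      | otherwise = Var y
    subst _ _ (Num n)   = Num n
    subst x e (Lam y b)
      | x == y    = Lam y b                    -- y shadows x
      | otherwise = Lam y (subst x e b)
    subst x e (App f a) = App (subst x e f) (subst x e a)

    -- Leftmost-outermost reduction to weak head normal form.
    whnf :: Expr -> Expr
    whnf (App f a) = case whnf f of
                       Lam x b -> whnf (subst x a b)
                       f'      -> App f' a
    whnf e         = e

    -- (\x. x) 42  ==>  42
    example :: Expr
    example = whnf (App (Lam "x" (Var "x")) (Num 42))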

Composable memory transactions

Composable memory transactions. Tim Harris, Simon Marlow, Simon Peyton Jones, and Maurice Herlihy. Submitted to PPoPP 2005.

Writing concurrent programs is notoriously difficult, and is of increasing practical importance. A particular source of concern is that even correctly-implemented concurrency abstractions cannot be composed together to form larger abstractions. In this paper we present a new concurrency model, based on transactional memory, that offers far richer composition. All the usual benefits of transactional memory are present (e.g. freedom from deadlock), but in addition we describe new modular forms of blocking and choice that have been inaccessible in earlier work.

The work is in the context of Concurrent Haskell (of course). The authors work from the assumption that a purely functional (they use the term declarative) language is a perfect setting for transactional memory. The reason is that mutable state is rare and explicit.
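For those who haven't seen the interface: transactions are ordinary monadic values, composed with bind and run with atomically, and retry gives modular blocking. A small sketch using the API as it now appears in GHC's Control.Concurrent.STM (names as in GHC; the paper's may differ slightly):

    import Control.Concurrent.STM

    type Account = TVar Int

    withdraw :: Account -> Int -> STM ()
    withdraw acc n = do
      balance <- readTVar acc
      if balance < n
        then retry                        -- block until the balance changes
        else writeTVar acc (balance - n)

    deposit :: Account -> Int -> STM ()
    deposit acc n = do
      balance <- readTVar acc
      writeTVar acc (balance + n)

    -- Two abstractions compose into one larger atomic action:
    -- no locks to acquire, no lock ordering to get right.
    transfer :: Account -> Account -> Int -> IO ()
    transfer from to n = atomically (withdraw from n >> deposit to n)

    main :: IO ()
    main = do
      a <- newTVarIO 100
      b <- newTVarIO 0
      transfer a b 30
      readTVarIO a >>= print              -- 70
      readTVarIO b >>= print              -- 30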
