π: a pattern language

π - not to be confused with the π-calculus - is a pattern-based language being developed by the Software Technology group at Technische Universität Darmstadt. Quoting from the project website:

There is only one language construct in π: the pattern. Patterns are, simply speaking, EBNF-expressions with an associated meaning; a pattern can be easiest understood as a function with a syntactically complex (context-free) "signature". The non-terminal symbols in the signature are then the parameters of the pattern. A π-program is a sequence of instruction symbols (technically, sentences), each being a sequence of (Unicode) characters. The sentences are then evaluated (executed) in the respective order.

The basic idea here seems similar to the OMeta language, previously mentioned on LtU here, but based on EBNF instead of Parsing Expression Grammars (PEGs).

Pattern definitions in π have the form

declare_pattern name ≔ syntax ⇒ type ➞ meaning;

Here's a trivial example of defining a pattern:

declare_pattern
   integer_potentiation ≔ integer:i %W- "^" %W- integer:j ⇒ integer ➞
   {
      int result = i;
      for (int k = 1; k <= j-1; k++)
         result *= i;
      return result;
   };

The resulting pattern can then be used directly in expressions, such as print(42^6);.
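To get a feel for the idea without installing π, it helps to think of a pattern as a parser whose result is the pattern's meaning: the EBNF "signature" recognizes the syntax, and the attached body computes the value. Below is a minimal Haskell sketch along those lines (my analogy, not π's implementation; the names Pattern, integer and symbol are made up, and π's whitespace markers are ignored):

-- Hypothetical model of "pattern = EBNF signature + meaning":
-- a Pattern a recognizes some syntax and yields a value of type a.
import Data.Char (isDigit)

newtype Pattern a = Pattern { runPattern :: String -> Maybe (a, String) }

integer :: Pattern Integer
integer = Pattern $ \s -> case span isDigit s of
  ("", _)    -> Nothing
  (ds, rest) -> Just (read ds, rest)

symbol :: Char -> Pattern ()
symbol c = Pattern $ \s -> case s of
  (x:rest) | x == c -> Just ((), rest)
  _                 -> Nothing

-- integer "^" integer ⇒ integer, with exponentiation as the meaning
potentiation :: Pattern Integer
potentiation = Pattern $ \s -> do
  (i, s1) <- runPattern integer s
  (_, s2) <- runPattern (symbol '^') s1
  (j, s3) <- runPattern integer s2
  return (i ^ j, s3)

main :: IO ()
main = print (runPattern potentiation "42^6")   -- Just (5489031744, "")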

More information about the language, as well as the implementation, can be found at http://www.pi-programming.org. There's an OOPSLA09 paper on π as well, but I haven't been able to find an open access version of it yet.

[Update: the π team has made their OOPSLA article available here]

Causal Commutative Arrows and Their Optimization

Causal Commutative Arrows and Their Optimization, Hai Liu, Eric Cheng, and Paul Hudak. ICFP 2009.

Arrows are a popular form of abstract computation. Being more general than monads, they are more broadly applicable, and in particular are a good abstraction for signal processing and dataflow computations. Most notably, arrows form the basis for a domain specific language called Yampa, which has been used in a variety of concrete applications, including animation, robotics, sound synthesis, control systems, and graphical user interfaces.

Our primary interest is in better understanding the class of abstract computations captured by Yampa. Unfortunately, arrows are not concrete enough to do this with precision. To remedy this situation we introduce the concept of commutative arrows that capture a kind of non-interference property of concurrent computations. We also add an init operator, and identify a crucial law that captures the causal nature of arrow effects. We call the resulting computational model causal commutative arrows.

To study this class of computations in more detail, we define an extension to the simply typed lambda calculus called causal commutative arrows (CCA), and study its properties. Our key contribution is the identification of a normal form for CCA called causal commutative normal form (CCNF). By defining a normalization procedure we have developed an optimization strategy that yields dramatic improvements in performance over conventional implementations of arrows. We have implemented this technique in Haskell, and conducted benchmarks that validate the effectiveness of our approach. When combined with stream fusion, the overall methodology can result in speed-ups of greater than two orders of magnitude.

One way of understanding what is going on in this paper is that in terms of dataflow programming, FRP programs correspond to programs with single-entry, single-exit dataflow graphs. This means that none of the internal dataflow nodes in an FRP program are actually necessary -- you can coalesce all those nodes into a single node while preserving the observable behavior. (They briefly touch on this point when they mention that synchronous languages try to compile to "single loop code".) What's very slick is that they have a nice normalization procedure that (a) is defined entirely in terms of their high-level language, and (b) always yields code corresponding to the coalesced dataflow graph. It's an elegant demonstration of the power of equational reasoning.
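To make the "coalesce everything into one node" intuition concrete, here is a small Haskell sketch of the shape I understand the normal form to take (a hedged reconstruction, not code from the paper): an ArrowInit class adds the init delay operator, and every causal commutative arrow can be brought into the form loopD i (arr f), i.e. one pure step function plus one piece of delayed state.

import Control.Arrow

-- The paper's init is a one-element delay, initialised with its argument
-- (renamed init' here to avoid clashing with Prelude.init).
class ArrowLoop a => ArrowInit a where
  init' :: b -> a b b

-- The normal form combinator: a single pure transition function from
-- (input, state) to (output, state), closed off by one loop and one delay.
loopD :: ArrowInit a => c -> ((b, c) -> (d, c)) -> a b d
loopD i f = loop (arr f >>> second (init' i))

Read this way, loopD is exactly the "single loop code" that synchronous language compilers aim for: the whole dataflow graph collapses into one transition function f and one piece of state i.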

Retrospective: An Axiomatic Basis for Computer Programming

Retrospective: An Axiomatic Basis for Computer Programming, by C.A.R. Hoare:

This month marks the 40th anniversary of the publication of the first article I wrote as an academic. I have been invited to give my personal view of the advances that have been made in the subject since then, and the further advances that remain to be made. Which of them did I expect, and which of them surprised me?

An interesting review of the history of computing. He has some nice perspectives on the complementarity of testing and formal methods, and how the growing cracking industry became an unexpected driving force behind industrial interest in verification.
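For anyone who hasn't gone back to the original 1969 paper, its central device is the triple P {Q} R (these days usually written {P} Q {R}), read "if P holds before the program Q executes, then R holds afterwards", together with a handful of axioms and inference rules, most famously the axiom of assignment:

\[
\{P[E/x]\}\ \ x := E\ \ \{P\}
\]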

Design Patterns 15 Years Later: An Interview with Erich Gamma, Richard Helm, and Ralph Johnson

Larry O'Brien recently interviewed three of the Gang of Four about their seminal work on patterns. Larry teased the interview's readers for a while, but he eventually asked the pressing question that most language designers ask and debate about patterns ;) Here it is:

Larry: GoF came out relatively early in the ascent of OOP as the mainstream paradigm and, for better or worse, "patterns" seem to be associated with OO approaches. You even hear functional and dynamic advocates boasting that their languages "don't need" patterns. How do you respond to that?

Erich: Just as an aside, it is also easy to forget that we wrote design patterns before there was Java or C#.

Ralph: Some of those languages don't need some of the patterns in Design Patterns because their languages provide alternative ways of solving the problems. Our patterns are for languages between C++ and Smalltalk, which includes just about everything called "object-oriented," but they certainly are not for every programming language. I don't think anybody actually says that programmers in other languages don't need patterns; they just have a different set of patterns.

Erich: Design patterns eventually emerge for any language. Design déjà-vu is language neutral. While these experiences are not always captured as patterns, they do exist. The design principles for Erlang come to mind.

Larry: Where would a person go to learn about patterns for dynamic and functional languages? Who's making good contributions?

Ralph: If by "dynamic" you mean dynamic object-oriented languages like Smalltalk, Ruby or Python, then our patterns are applicable. Functional languages require different patterns, but I don't know who is working on them.

Note: At the end of the interview, Erich says that they tried refactoring the patterns into new categories in 2005. The draft breakdown he provides (accidentally???) takes out Memento, Chain of Responsibility, Bridge, Adapter, and Observer.

As I said above, these are just notes in a draft state. Doing a refactoring without test cases is always dangerous...

UPDATE: The Gang of Four have an accompanying article for the interview that they wrote as a group. See A Look Back: Why We Wrote Design Patterns: Elements of Reusable Object-Oriented Software.

Have your AHOS and eat HOAS too!

At the beginning of the year, Noam (uid 3913) announced on his weblog a technique that bridges the gap between meta-theoretic notions of function space and theory-internal notions, in a way that is compatible with structural induction over Higher-Order Abstract Syntax. The technique works by applying Reynolds' Definitional Interpreters for Higher-Order Programming Languages [pdf] (cf. the LtU classic), and he has realised it as an implementation in Twelf.

That's a lot of ideas in one sentence, but since HOAS is in the air this month, I guess there are a good number of LtUers who will want to get to grips with this stuff.


SequenceL - declarative computation on nonscalars

I recently came across the language SequenceL, which it seems NASA is using in some of its human spaceflight programs. SequenceL is described as a high-level language for declarative computation on nonscalars. One of the key features of the language appears to be the avoidance of the need to explicitly specify recursive or iterative operations. For example, given the function definition

Search(scalar Target, tuple [Subscript, Word]) = 
    Subscript when Word = Target 

which applies to tuples, the programmer can apply the function directly to lists of tuples without any need to specify how that application will be performed, e.g.

search(fox,[[1,dog],[2,cat],[3,fox],[4,parrot]]) → 3 
search(rabbit,[[1,dog],[2,cat],[3,fox],[4,parrot]]) → [] 

The language designers (Daniel Cooke and J. Nelson Rushton) claim that this kind of thing leads to more concise and readable code, and a more direct representation of a specification.
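For comparison, here is roughly what the same search looks like in Haskell (a hypothetical analogue of mine, not SequenceL code). The scalar-level rule "Subscript when Word = Target" has to be lifted over the list by hand with a comprehension, and a hit comes back as a singleton list rather than a bare scalar; SequenceL's normalize-transpose machinery is what lets the programmer skip that lifting step entirely:

-- Hypothetical Haskell analogue of the SequenceL Search example above.
search :: Eq a => a -> [(Int, a)] -> [Int]
search target pairs = [ subscript | (subscript, word) <- pairs, word == target ]

main :: IO ()
main = do
  print (search "fox"    [(1,"dog"),(2,"cat"),(3,"fox"),(4,"parrot")])   -- [3]
  print (search "rabbit" [(1,"dog"),(2,"cat"),(3,"fox"),(4,"parrot")])   -- []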

Unfortunately, the SequenceL website appears to be inaccessible at the moment. However, Daniel Cooke's site includes links to a number of papers and talks that describe SequenceL. In particular, the paper Normalize, Transpose, and Distribute: An Automatic Approach for Handling Nonscalars provides a detailed description of the "Consume-Simplify-Produce/Normalize-Transpose" approach that is embodied by SequenceL. There's also an overview of SequenceL available through CiteSeer.

Safe Garbage Collection = Regions + Intensional Type Analysis

Safe Garbage Collection = Regions + Intensional Type Analysis, by Daniel C. Wang and Andrew W. Appel:

We present a technique to implement type-safe garbage collectors by combining existing type systems used for compiling type-safe languages. We adapt the type systems used in region inference [16] and intensional type analysis [8] to construct a safe stop-and-copy garbage collector for higher-order polymorphic languages. Rather than using region inference as the primary method of storage management, we show how it can be used to implement a garbage collector which is provably safe. We also introduce a new region calculus with non-nested object lifetimes which is significantly simpler than previous calculi. Our approach also formalizes more of the interface between garbage collectors and code generators. The efficiency of our safe collectors is algorithmically competitive with unsafe collectors.

I'm surprised this region calculus hasn't received more attention or follow-up discussion in subsequent papers. It seems eminently practical, as it replaces the more complicated alias analyses required in other region calculi with garbage collection of region handles; that seems like a huge improvement over general GC, assuming region inference is sufficiently precise.

ICFP 2009 videos

I'm not sure whether this has already been posted (a quick search says "no"), but videos from 40 presentations at ICFP 2009 (in Edinburgh) are available through this Vimeo account. I'm sure this will be of some interest to LtU readers.

A Type-theoretic Foundation for Programming with Higher-order Abstract Syntax and First-class Substitutions

A Type-theoretic Foundation for Programming with Higher-order Abstract Syntax and First-class Substitutions by Brigitte Pientka, appeared in POPL 08.

Higher-order abstract syntax (HOAS) is a simple, powerful technique for implementing object languages, since it directly supports common and tricky routines dealing with variables, such as capture-avoiding substitution and renaming. This is achieved by representing binders in the object-language via binders in the meta-language. However, enriching functional programming languages with direct support for HOAS has been a major challenge, because recursion over HOAS encodings requires one to traverse λ-abstractions and necessitates programming with open objects.

We present a novel type-theoretic foundation based on contextual modal types which allows us to recursively analyze open terms via higher-order pattern matching. By design, variables occurring in open terms can never escape their scope. Using several examples, we demonstrate that our framework provides a name-safe foundation to operations typically found in nominal systems. In contrast to nominal systems however, we also support capture-avoiding substitution operations and even provide first-class substitutions to the programmer. The main contribution of this paper is a syntax directed bi-directional type system where we distinguish between the data language and the computation language together with the progress and preservation proof for our language.

It's been a while since I posted, but this paper may be of interest to some members of this community. It looks interesting to me, but I just wish I understood all the terminology. I don't know what "open objects" are, and why they are a problem. I don't understand what HOAS is. I don't even know what binders are. The list goes on. I surely can't be the only person who is interested, but feels that this is just out of their grasp. I bet that I could probably understand these things with a minimum of explanation, given I have experience implementing languages. If anyone is interested in rephrasing the abstract in more basic terms, I would be very appreciative. [Edit: corrected spelling of Brigitte Pientka. My apologies.]
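For readers in the same position, a small sketch may at least help with the HOAS part. The idea is to represent an object-language binder by a function of the meta-language (here Haskell), so that capture-avoiding substitution is inherited from the host language for free; the code below is an illustrative toy of mine, not the system from the paper:

-- A minimal HOAS encoding of the untyped lambda calculus.
data Term
  = App Term Term        -- application
  | Lam (Term -> Term)   -- a binder is just a Haskell function

-- Reduction to weak head normal form: substitution never has to be written,
-- because applying the Haskell function performs it, and variable capture
-- cannot occur.
whnf :: Term -> Term
whnf (App f a) = case whnf f of
  Lam body -> whnf (body a)   -- beta-reduction via host-language application
  f'       -> App f' a
whnf t = t

-- Example: (\x. x) applied to (\y. y y) reduces to (\y. y y).
example :: Term
example = whnf (App (Lam (\x -> x)) (Lam (\y -> App y y)))

The flip side, and the problem the paper tackles, is that once binders are opaque host-language functions it becomes hard to pattern match on or recurse under them; that is what the abstract means by traversing λ-abstractions and programming with open objects.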

JVM language summit 2009

Notes and presentations from the JVM language summit 2009 (http://www.jvmlangsummit.com) are available. (Even though the page says 2008.)

The most interesting talk seems to be Rich Hickey's Clojure presentation, in which he wants languages to be explicit about time.

Presentation [PDF]
Conclusion of his talk: [YouTube video]. Follow-up comment from John Rose @ Sun.

Erik Meijer also talks about the .NET Reactive Framework, and calls the GoF book crap [YouTube video].

Archive of Live Tweets
Choice quotes:
"If you want to get rid of the clojure parenthesis' you have not experienced them enough" - Rich Hickey
"Mock objects and dependency injection are just for people who don't know math" - Erik Meijer