
Elephant 2000: A Programming Language for the year 2015 Based on Speech Acts

McCarthy's Elephant language proposal has been mentioned here several times in the past. This talk from ETech provides a nice introduction to the fundamental idea behind Elephant and its background.

The talk includes interesting, though not entirely motivated, comments related to the paper Ascribing Mental Qualities to Machines. This is one of McCarthy's most significant papers in my opinion, and it deserves more attention and debate. It is also rather amusing. I hope I will someday find the time to put this paper in context (McCarthy's comments in the ETech talk notwithstanding), but for the time being I recommend it to anyone interested in this sort of thing.

One thing is for sure: We can safely add to the 2009 predictions the prediction that Elephant will not be ready in 2009...

Using Promises to Orchestrate Web Interactions

Phil Windley posted a few useful links about this topic following WWW2008.

Bonus question: How is this item connected to the one I posted earlier today?

Extensible Term Language 0.2.1

The Extensible Term Language is a high-level meta-syntax language that allows defining both small and large languages that use blocks, expressions, operators, and statements as primary meta-syntax elements. The language definition is then compiled to an LL(1) grammar.

ETL tries to strike a new balance between syntax generality and extensibility. It is designed to allow the creation of DSLs and programming languages that are almost as syntactically extensible as Lisp (though macros are meant to be implemented as tree rewriting rules; see the sketch below), while retaining a pleasant surface syntax (this example tries to stay as close to Java as possible, and this one somewhat close to dynamic functional languages). The parser also supports automatic error recovery.
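
As a rough illustration of what "macros as tree rewriting rules" means, here is a generic Haskell sketch of the concept (this is only an illustration, not ETL's actual Java API):

-- A macro is a rule that pattern-matches on an AST node and rewrites it
-- into core-language nodes before further processing.
data Expr
  = Var String
  | App String [Expr]  -- a named operator or macro applied to arguments
  deriving Show

-- Example rule: rewrite the surface form `unless(c, e)` into `if(c, unit, e)`.
rewrite :: Expr -> Expr
rewrite (App "unless" [c, e]) = App "if" [rewrite c, App "unit" [], rewrite e]
rewrite (App f args)          = App f (map rewrite args)
rewrite e                     = e

main :: IO ()
main = print (rewrite (App "unless" [Var "done", App "retry" []]))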

A Java implementation of the parser is available for download. The documentation is also available online on the project's web site.

Since the previous announcement the changes have mainly been usability improvements to the grammar definition language, which is now much more compact. There have also been many bug fixes. Finally, there is now a tutorial that demonstrates how to implement your own DSL using the AST parser.

ETL might be a nice tool for quickly implementing your own DSL with a pleasant surface syntax, and for creating new experimental programming languages.

Efficient Interpretation by Transforming Data Types and Patterns to Functions

This paper [pdf] describes an efficient interpreter for lazy functional languages like Haskell and Clean. The interpreter is based on the elimination of algebraic data types and pattern-based function definitions by mapping them to functions using a new efficient variant of the Church encoding. The transformation is simple and yields concise code.
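
For those who haven't seen the technique, here is a minimal Haskell sketch of the underlying idea, using the Scott-style variant of the Church encoding (the paper develops its own, more efficient variant): a data type becomes a function taking one continuation per constructor, and pattern matching becomes plain function application.

{-# LANGUAGE RankNTypes #-}

-- Scott-encoded Maybe: a value is a function that takes one continuation
-- per constructor and invokes the one matching the constructor it "is".
type CMaybe v = forall a. (v -> a) -> a -> a

cJust :: v -> CMaybe v
cJust x justK _nothingK = justK x

cNothing :: CMaybe v
cNothing _justK nothingK = nothingK

-- fromMaybe written without any `case`: the encoded value selects the branch.
cFromMaybe :: v -> CMaybe v -> v
cFromMaybe def m = m id def

main :: IO ()
main = do
  print (cFromMaybe 0 (cJust 42))  -- prints 42
  print (cFromMaybe 0 cNothing)    -- prints 0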

Nested functions - how many nesting levels are really needed?

I'm implementing a language that supports nested functions with closure semantics, e.g.:

def func(x:int) returns (int):int
{
   return def nested(y:int) returns int
          { return x*y; }
}

Now, to simplify the closure implementation, I only allow one nesting level. Is this overly restrictive? What do other languages do?
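
For concreteness, here is a toy sketch of two nesting levels (written in Haskell rather than my syntax); the innermost function captures variables from both enclosing scopes, which is exactly what the one-level restriction rules out:

-- `inner` closes over x from two scopes up and y from one scope up, so
-- its closure must capture variables from more than one enclosing frame.
makeAdder :: Int -> Int -> Int -> Int
makeAdder x =
  let outer y =
        let inner z = x + y + z
        in  inner
  in  outer

main :: IO ()
main = print (makeAdder 1 2 3)  -- prints 6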

I haven't come across a real-life use-case for multiple nesting levels.

Any opinions / counterexamples would be greatly appreciated.

Thanks.
-s

What causes really wide pages?

e.g.

http://lambda-the-ultimate.org/node/3129

renders about twice as wide as normal in my Firefox (on a Mac).

Tony Hoare / Historically Bad Ideas: "Null References: The Billion Dollar Mistake"

(Seen via http://catless.ncl.ac.uk/Risks/25.51.html#subj9.1; I didn't find it in a search on LtU yet.)

At QCon, London:

Abstract: I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object-oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. In recent years, a number of program analysers like PREfix and PREfast in Microsoft have been used to check references, and give warnings if there is a risk they may be null. More recent programming languages like Spec# have introduced declarations for non-null references. This is the solution which I rejected in 1965.
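
For contrast, here is a minimal Haskell sketch of the alternative Hoare alludes to: when absence must be opted into with an option type such as Maybe, the compiler forces every caller to handle the missing case instead of crashing at run time.

-- Absence is explicit in the type: a lookup that can fail returns
-- Maybe Int, not a nullable Int.
lookupAge :: [(String, Int)] -> String -> Maybe Int
lookupAge ages name = lookup name ages

main :: IO ()
main = do
  let ages = [("hoare", 75)]
  case lookupAge ages "mccarthy" of
    Just a  -> print a
    Nothing -> putStrLn "no such person"  -- the type system forces this branch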