Functional programming with GNU make

One of the gems squirreled away on Oleg's site is "Makefile as a functional language program":

"The language of GNU make is indeed functional, complete with combinators (map and filter), applications and anonymous abstractions. That is correct, GNU make supports lambda-abstractions."

Although I've classified this under Fun, Oleg exploits
the functional nature of Make for a real, practical application:

"...avoiding the explosion of makefile rules in a project that executes many test cases on many platforms. [...] Because GNU make turns out to be a functional programming system, we can reduce the number of rules from <number-of-targets> * <number-of-platforms> to just <number-of-targets> + <number-of-platforms>."

See the article for a code comparison
between make and Scheme, and check out the Makefile in question.
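To make the ideas concrete, here is a minimal sketch (not Oleg's actual Makefile) of both points: `$(filter ...)` and `$(patsubst ...)` play the roles of filter and map, a variable expanded via `$(call ...)` behaves as a lambda-abstraction with `$(1)`, `$(2)`, ... as its parameters, and `$(foreach ...)` plus `$(eval ...)` instantiate one rule template per platform, giving N + M rules instead of N * M. The file names and platform list are illustrative.

```make
# filter and map combinators, built into GNU make:
SRCS   := foo.c bar.c README
CFILES := $(filter %.c,$(SRCS))           # filter: keep only .c files
OBJS   := $(patsubst %.c,%.o,$(CFILES))   # map: foo.c bar.c -> foo.o bar.o

# A rule template is a lambda-abstraction over the platform name.
# Instantiating it once per platform yields N + M rules, not N * M.
PLATFORMS := linux win32

define platform-rule
$(1)/%.o: %.c
	$$(CC) -D$(1) -c -o $$@ $$<
endef

$(foreach p,$(PLATFORMS),$(eval $(call platform-rule,$(p))))
```

Note the `$$` escapes inside the template: `$(eval ...)` expands the body once, so automatic variables like `$@` must survive that first expansion.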

What's up guys?

So I'm busy for a couple of days, and not one editor manages to post something new? I'm disappointed...

Maybe it's time we recruited some more contributing editors. If you are a regular and want to join, let me know.

Generics in Visual Basic 2005

You knew it couldn't be far behind, right?

Defining and Using Generics in Visual Basic 2005 on MSDN has the details.

New Chip Heralds a Parallel Future

Missing no chance to stand on my soapbox about the need for easy PL retargeting, I bring you insights from Paul Murphy about our parallel-processing, Linux future.

[T]he product has seen a billion dollars in development work. Two fabs...have been custom-built to make the new processor in large volumes....To the extent that performance information has become available, it is characterized by numbers so high that most people simply dismissed the reports....

The machine is widely referred to as a cell processor, but the cells involved are software, not hardware. Thus a cell is a kind of TCP packet on steroids, containing both data and instructions and linked back to the task of which it forms part via unique identifiers that facilitate results assembly just as the TCP sequence number does.

The basic processor itself appears to be a PowerPC derivative with high-speed built-in local communications, high-speed access to local memory, and up to eight attached processing units broadly akin to the Altivec short array processor used by Apple. The actual product consists of one to eight of these on a chip -- a true grid-on-a-chip approach in which a four-way assembly can, when fully populated, consist of four core CPUs, 32 attached processing units and 512 MB of local memory.

Paul follows up with a shocker.

I'd like to make two outrageous predictions on this: first that it will happen early next year, and secondly that the Linux developer community will, virtually en masse, abandon the x86 in favor of the new machine.

Abandonment is relative. The new processor will emulate the x86 without trouble, as Paul notes. On the PowerPC side, we already have Linux for PowerPC, complete with a Mac OS X sandbox. From a PL standpoint, however, this development may cattle-prod language folks off their x86 back ends and into some serious compiler refactoring work. I hope so!

Eric Gunnerson's JavaOne report

This may be of interest to LtU readers.

Most of the specific language features were discussed here previously, but the C# perspective may make this worth a look.

Database Abstraction Layers and Programming Languages

From time to time I like to return to the issue of database integration, only to remark once again that the difficulty of creating good database APIs (as opposed to simply embedding SQL) is the result of the poor programming facilities provided by most programming languages (e.g., no macros for syntax extension, no continuations or first-class functions to handle control flow, etc.).

Why return to this topic today? Jeremy Zawodny argues on his blog that Database Abstraction Layers Must Die!

Along the way he says,

Adding another layer increases complexity, degrades performance, and generally doesn't really improve things.

So why do folks do it? Because PHP is also a programming language and they feel the need to "dumb it down" or insulate themselves (or others) from the "complexity" of PHP.


Why do we need an abstraction layer anyway?

The author uses an argument I hear all the time: If you use a good abstraction layer, it'll be easy to move from $this_database to $other_database down the road.

That's bullshit. It's never easy.

Double ouch, but true enough. Databases are like women (can't live with them, can't live without them), and getting rid of one can be as painful as a divorce...

So what's the solution? Surprise, surprise: use a library. But isn't that an abstraction layer? Of course it is.

What Jeremy advocates is plain old software engineering and design. Everyone should do it. I can't believe anyone does anything else.

But wait. I just told you it's hard to build such a library, since programming languages make the design of such libraries hard (e.g., should you use iterators, cursors, or record buffers? Should your library's database access routine be as flexible as a select statement?). So we design libraries that aren't very good, but hopefully are good enough.

And that's the question I put before you. We all know about coupling and cohesion. We all know about building software abstractions. Are our tools for building abstractions powerful enough for this basic and standard abstraction: the database access abstraction layer?
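As a small illustration of the design question above (iterators vs. cursors vs. record buffers), here is a hedged sketch of how first-class functions let one generic routine support more than one access style. It uses Python's standard `sqlite3` module; the names `query` and `each_row` are invented for this example and are not any real library's API.

```python
import sqlite3

def query(conn, sql, params=()):
    """The 'iterator' style: yield rows one at a time."""
    for row in conn.execute(sql, params):
        yield row

def each_row(conn, sql, callback, params=()):
    """Inverted control flow: push each row into a first-class function."""
    for row in query(conn, sql, params):
        callback(row)

# Demo against an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT, n INTEGER)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [("a", 1), ("b", 2)])

names = [row[0] for row in query(conn, "SELECT name FROM t ORDER BY n")]
print(names)  # ['a', 'b']

collected = []
each_row(conn, "SELECT n FROM t ORDER BY n", lambda row: collected.append(row[0]))
print(collected)  # [1, 2]
```

In a language with macros or continuations the same abstraction could be pushed further (e.g., resumable cursors without explicit callbacks), which is exactly the expressiveness gap the post is complaining about.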

Type-Based Optimization for Regular Patterns

Type-Based Optimization for Regular Patterns, by Michael Y. Levin and Benjamin C. Pierce. WWW Workshop on High-Performance XML Processing, May 2004.

We describe work in progress on a compilation method based on matching automata, a form of tree automata specialized for pattern compilation, in which we use the schema of the value flowing into a pattern matching expression to generate more efficient target code.

A set of slides is also available.

Logical Methods in Computer Science

Logical Methods in Computer Science is a fully refereed, open access, free, electronic journal. It welcomes papers on theoretical and practical areas in computer science involving logical methods, taken in a broad sense... Papers are refereed in the traditional way, with two or more referees per paper. Copyright is retained by the author.

Many of the topics to be covered by this new journal are related to PL research or are of interest to PL researchers, so I hope many of the papers published in it will be interesting enough to discuss here.

The editorial team is impressive, with Dana Scott as editor-in-chief, and Plotkin and Vardi as managing editors.

The editorial board includes many prominent figures, among them Abadi, Abramsky, Gries, Pierce, Wadler, and Wand.

Early history of Fortran

A very rich site devoted to tracking down the source code for the original Fortran compiler:

My name is Paul McJones. I hope to use this weblog to discuss software history among other topics. For several months I’ve been studying the early history of Fortran, and trying to track down the source code for the original Fortran compiler. Although I just set up this weblog recently (June-July 2004), I’ve created back-dated entries to document my quest in chronological order.

It seems most items recently are about programming language history... This site describes an interesting quest, which makes me wonder whether the evolution of more recent languages will be easier to document, given the Internet and so forth. It would be rather amusing if LtU were one day used as a historical resource ;-)

The idea of preserving classic software is a good one. I think programming languages (and programming technology in general) are very good indicators of the state of the art and the major issues of the day (e.g., Java and the Net), so building a timeline by considering PLs sounds like a good idea.

We should also keep in mind that John Backus of FP fame was famous even before that for his work on compilers, and was involved with the Fortran team at IBM.

Functional Objects

Functional Objects. Matthias Felleisen. ECOOP 2004. slides (pdf).

In my talk, I will compare and contrast the two ideas of programming and programming language design. I will present and defend the thesis that good object-oriented programming heavily "borrows" from functional programming and that the future of object-oriented programming is to study functional programming and language design even more.

Not all that much that is new for LtU readers, but a nice overview nonetheless. Includes some details about the PLT Scheme approach to modules and objects.
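The talk's thesis can be sketched in a few lines: object-oriented code that "borrows" from functional programming uses immutable state, methods that return fresh objects instead of mutating, and higher-order methods. The `Point` class below is my own illustration, not taken from the slides.

```python
class Point:
    """An immutable point: no method ever mutates self."""

    def __init__(self, x, y):
        self._x, self._y = x, y

    def move(self, dx, dy):
        # Functional update: return a new Point, leave self untouched.
        return Point(self._x + dx, self._y + dy)

    def map(self, f):
        # Higher-order method: apply f to each coordinate.
        return Point(f(self._x), f(self._y))

    def coords(self):
        return (self._x, self._y)

p = Point(1, 2)
q = p.move(3, 4).map(lambda n: n * 10)
print(p.coords())  # (1, 2) -- the original is unchanged
print(q.coords())  # (40, 60)
```

The payoff is the same as in functional programming: because values never change under you, methods compose freely and aliasing bugs disappear.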