
Generics are a mistake?

Generics Considered Harmful

Ken Arnold, "programmer and author who helped create Jini, JavaSpaces, Curses, and Rogue", writes that the usefulness of generics is outweighed by their complexity. Ken is talking about Java 5, but similar critiques are well known for C++, and C# is not immune either. Ken describes the Java case as follows:

So, I don't know how to ease into this gently. So I'll just spit it out.

Generics are a mistake.

This is not a problem based on technical disagreements. It's a fundamental language design problem.

Any feature added to any system has to pass a basic test: If it adds complexity, is the benefit worth the cost? The more obscure or minor the benefit, the less complexity it's worth. This is sometimes referred to as a "complexity budget": a design should have one to keep its overall complexity under control.

[...]

Which brings up the problem that I always cite for C++: I call it the "Nth order exception to the exception rule." It sounds like this: "You can do x, except in case y, unless y does z, in which case you can if ..."

Humans can't track this stuff. They are always losing which exception to what other exception applies (or doesn't) in any given case.

[...]

Without [an explicit complexity] budget, it feels like the JSR process ran far ahead, without a step back to ask "Is this feature really necessary?" It seemed to just be understood that it was necessary.

It was understood wrong.

The article contains a few simple supporting examples, including the interesting definition of Java 5's Enum type as:

Enum<T extends Enum<T>>

...which "we're assured by the type theorists ... we should simply not think about too much, for which we are grateful."
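To see what that bound buys, here is a minimal sketch of a user-written analogue (MyEnum and Suit are invented names for illustration; they are not java.lang classes): the recursive bound T extends MyEnum<T> forces each subclass to name itself as the type argument, which is what lets compareTo accept only values of the same concrete subtype.

    // Hypothetical analogue of java.lang.Enum's self-referential declaration.
    abstract class MyEnum<T extends MyEnum<T>> implements Comparable<T> {
        private final int ordinal;

        protected MyEnum(int ordinal) { this.ordinal = ordinal; }

        public int ordinal() { return ordinal; }

        // Accepts only the same concrete subtype T, so cross-type
        // comparisons are rejected at compile time.
        public int compareTo(T other) {
            return Integer.compare(ordinal, other.ordinal());
        }
    }

    final class Suit extends MyEnum<Suit> {
        static final Suit HEARTS = new Suit(0);
        static final Suit SPADES = new Suit(1);

        private Suit(int ordinal) { super(ordinal); }
    }

With this declaration, Suit.HEARTS.compareTo(Suit.SPADES) type-checks, while comparing a Suit against a value of some other MyEnum subclass fails to compile. The safety is real, but so is the cost: the declaration itself is exactly the sort of thing the article says we should not think about too much.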

If we accept the article's premise, here's a question with an LtU spin: do the more elegant, tractable polymorphic type systems with inference, as found in functional languages, improve on this situation enough to offer a viable alternative to these complexity problems? In other words, are these problems a selling point for better type systems, or another barrier to adoption?

[Thanks to Perry Metzger for the pointer.]

GHC Survey Results

The results are in for the 2005 Glasgow Haskell Compiler user survey, with a summary and all the raw data. The comments were the highlight for me; see, for instance, "Applications I use GHC for".

(Previous LtU mention)

Dyna: a weighted dynamic logic programming language

Dyna is a language I stumbled upon by accident today. The qualities I find most interesting are its claimed clean compilation to C++ and its non-standard semantics.

Dyna is a small, very-high-level programming language that makes it easy to specify dynamic programs and train their weights. You write a short declarative specification in Dyna, and the Dyna optimizing compiler produces efficient C++ classes that form the core of your C++ application.

Natural applications in NLP include various kinds of parsing, machine translation, speech decoding, and finite-state modeling. Dyna also has many other applications, especially in other applied AI areas, since dynamic programming (and its degenerate case, tree search) is a common strategy for solving combinatorial optimization problems.

Properly speaking, Dyna is a "weighted" logic programming language: terms have values, and Horn clauses are replaced by aggregation equations. It is Turing-complete. You can also think of Dyna as extending compiled C++ with powerful deductive database features.
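To make "terms have values, and Horn clauses are replaced by aggregation equations" concrete, here is a hand-rolled Java sketch of the standard shortest-path example (my own illustration of the idea, not Dyna syntax and not Dyna-generated code). Where Prolog would derive path(S,T) from edge(S,U) and path(U,T) as a pure fact, here each item carries a numeric value and the clause becomes the equation path(S,T) min= edge(S,U) + path(U,T), solved to a fixed point:

    import java.util.*;

    public class WeightedDeduction {
        // edge.get(s).get(t) is the cost of the edge s -> t.
        static Map<String, Map<String, Double>> edge = new HashMap<>();
        // path.get(s).get(t) is the current value of the item path(s,t).
        static Map<String, Map<String, Double>> path = new HashMap<>();

        public static void main(String[] args) {
            put(edge, "a", "b", 1.0);
            put(edge, "b", "c", 2.0);
            put(edge, "a", "c", 5.0);

            // Base equation: path(S,T) min= edge(S,T).
            for (String s : edge.keySet())
                for (Map.Entry<String, Double> st : edge.get(s).entrySet())
                    aggregateMin(s, st.getKey(), st.getValue());

            // Recursive equation: path(S,T) min= edge(S,U) + path(U,T),
            // iterated until no item's value improves (the fixed point).
            boolean changed = true;
            while (changed) {
                changed = false;
                for (String s : edge.keySet())
                    for (Map.Entry<String, Double> su : edge.get(s).entrySet()) {
                        Map<String, Double> fromU = path.get(su.getKey());
                        if (fromU == null) continue;
                        for (Map.Entry<String, Double> ut
                                 : new ArrayList<>(fromU.entrySet()))
                            changed |= aggregateMin(s, ut.getKey(),
                                                    su.getValue() + ut.getValue());
                    }
            }
            System.out.println(path); // path(a,c) settles at 3.0, not the direct 5.0
        }

        static void put(Map<String, Map<String, Double>> m,
                        String s, String t, double v) {
            m.computeIfAbsent(s, k -> new HashMap<>()).put(t, v);
        }

        // min= aggregation: keep the smaller of the old and new values,
        // and report whether anything actually changed.
        static boolean aggregateMin(String s, String t, double v) {
            Map<String, Double> row = path.computeIfAbsent(s, k -> new HashMap<>());
            Double old = row.get(t);
            if (old == null || v < old) { row.put(t, v); return true; }
            return false;
        }
    }

The pitch quoted above is that in Dyna you would write only the equations themselves; the compiler, not the programmer, would be responsible for deriving the solving loop, the indexing, and the C++ data structures.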