A Java Fork/Join Framework

Doug Lea: A Java Fork/Join Framework, Proceedings of the ACM 2000 conference on Java Grande.

This paper describes the design, implementation, and performance of a Java framework for supporting a style of parallel programming in which problems are solved by (recursively) splitting them into subtasks that are solved in parallel, waiting for them to complete, and then composing results. The general design is a variant of the work-stealing framework devised for Cilk.

This work is about to be incorporated into Java 7 as jsr166y:

Parallel*Array (often referred to as PA) and its planned follow-ons for sets and maps provide an easier and better way of routinely programming to take advantage of dozens to hundreds of processors/cores: if you can think about a programming problem in terms of aggregate operations on collections of elements, then we can automate parallel execution. This generally pays off if either you have lots of elements (in which case it works well even if the operations are small/cheap) or each of the operations is time-consuming (in which case it works well even if there are not a lot of elements). To take advantage of this, though, the aggregate processing must have a regular structure, which means that you must be able to express things in terms of apply, reduce, filter, map, cumulate, sort, uniquify, paired mappings, and so on.
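For a flavor of the divide-and-conquer style the paper describes, here is a minimal sketch using the fork/join API as it eventually shipped in java.util.concurrent; the SumTask class and the threshold value are illustrative choices of mine, not from the paper:

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Recursively split a range in half, fork one half, compute the other,
// then join and combine the results.
class SumTask extends RecursiveTask<Long> {
    private static final int THRESHOLD = 1_000; // below this, solve sequentially
    private final long[] data;
    private final int lo, hi;

    SumTask(long[] data, int lo, int hi) {
        this.data = data; this.lo = lo; this.hi = hi;
    }

    @Override
    protected Long compute() {
        if (hi - lo <= THRESHOLD) {           // small enough: solve directly
            long sum = 0;
            for (int i = lo; i < hi; i++) sum += data[i];
            return sum;
        }
        int mid = (lo + hi) >>> 1;
        SumTask left = new SumTask(data, lo, mid);
        SumTask right = new SumTask(data, mid, hi);
        left.fork();                          // schedule left half; idle workers may steal it
        long rightSum = right.compute();      // compute right half in this thread
        return left.join() + rightSum;        // wait for left, then compose results
    }
}

public class ForkJoinDemo {
    public static void main(String[] args) {
        long[] data = new long[100_000];
        for (int i = 0; i < data.length; i++) data[i] = i + 1;
        long sum = new ForkJoinPool().invoke(new SumTask(data, 0, data.length));
        System.out.println(sum);  // 100000 * 100001 / 2 = 5000050000
    }
}
```

The fork-then-compute-then-join ordering matters: computing one half in the current thread keeps it busy while idle workers steal the forked half, which is exactly the work-stealing behavior the paper analyzes.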

Iterators Must Go

Andrei Alexandrescu: Iterators Must Go, BoostCon 2009 keynote.

Presents a simple yet far-reaching replacement for iterators, called ranges, and the interesting D libraries built on it: std.algorithm and std.range.

Ranges pervade D: algorithms, lazy evaluation, random numbers, higher-order functions, foreach statement...
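For a flavor of what a range is, here is a minimal sketch of D's input-range protocol (empty/front/popFront) transliterated into Java; the interface and class names are mine, not from std.range:

```java
// A sketch of D's input-range protocol in Java. A range carries its own
// position and end, so algorithms take one object instead of iterator pairs.
interface InputRange<T> {
    boolean empty();    // any elements left?
    T front();          // the current element
    void popFront();    // advance to the next element
}

// A half-open integer interval [lo, hi) as a range, in the spirit of D's iota.
class Iota implements InputRange<Integer> {
    private int lo;
    private final int hi;
    Iota(int lo, int hi) { this.lo = lo; this.hi = hi; }
    public boolean empty() { return lo >= hi; }
    public Integer front() { return lo; }
    public void popFront() { lo++; }
}

public class RangeDemo {
    // An algorithm written against the range protocol, not a concrete container.
    static int sum(InputRange<Integer> r) {
        int s = 0;
        for (; !r.empty(); r.popFront()) s += r.front();
        return s;
    }

    public static void main(String[] args) {
        System.out.println(sum(new Iota(1, 5)));  // 1+2+3+4 = 10
    }
}
```

The point of the keynote is that this three-method protocol composes: lazy adaptors (map, filter, take) can wrap any range and yield another range, which iterator pairs make awkward.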

(Related: SERIES, enumerators, SRFI 1, and The Case For D by the same author)

Concepts Get Voted Off The C++0x Island

On Monday, July 13th the C++ standards committee voted "Concepts" out of consideration for C++0x.

First, skepticism regarding the feasibility and usefulness of concepts had intensified the antipathy towards the proposal; some people expressed concerns about compile-time and runtime overhead. Second, the creators of the Concepts proposal were still desperately trying to improve and patch it. The last nail in the coffin was Bjarne Stroustrup's paper "Simplifying the Use of Concepts" from June. It is a masterpiece in terms of presenting Concepts, but it also scared folks. The general sense was that concepts were broken, the committee was not sure what the correct direction was to fix them, and it would probably take several more years to come up with a reasonable fix that would achieve consensus. Considering that Concepts were originally designed to simplify C++, a critical mass of committee members agreed in July 2009 that it was time to bid Concepts goodbye.

Edit:

For more on the meeting see "The View (or trip report) from the July 2009 C++ Standard Meeting" part 1 and part 2

Edit 2:

Bjarne Stroustrup on The C++0x "Remove Concepts" Decision.

Unladen Swallow: LLVM based Python compiler

The second release of Unladen Swallow was made available yesterday. The project plan describes the goals as follows:

We want to make Python faster, but we also want to make it easy for large, well-established applications to switch to Unladen Swallow.

  1. Produce a version of Python at least 5x faster than CPython.
  2. Python application performance should be stable.
  3. Maintain source-level compatibility with CPython applications.
  4. Maintain source-level compatibility with CPython extension modules.
  5. We do not want to maintain a Python implementation forever; we view our work as a branch, not a fork.

Of course, Google is known for writing a lot of its software in Python, and now they are trying to speed it up, just as the V8 project has done for JavaScript.

In the Land of Invented Languages

Just finished reading In the Land of Invented Languages, by Arika Okrent. It is an accessible read covering many topics in language.

Natural languages may be less universal than music and less precise than programming languages, but they are far more versatile, and useful in our everyday lives, than either. Ambiguity, or fuzziness of meaning, is not a flaw of natural language but a feature that gives it flexibility and that, for whatever reason, suits our minds and the way we think. Likewise, the fact that languages depend on arbitrary convention or cultural habit is not a flaw but a feature that allows us to rein in the fuzziness by establishing agreed-upon meanings at different levels of precision. Language needs its "flaws" in order to do the enormous range of things we use it for.

Aside from this passage, the book barely mentions PLs at all. But since programming languages are, by definition, invented languages (and, no, Perl does not qualify as natural), I think there are many parallels to be drawn. Most language inventors don't do it for the money - creating PLs is not a path to untold wealth. And there are a thousand invented (and programming) languages, so the chance of success is rather slim (and mostly accidental). The book itself is an informal narrative that moves between personal experience, the people behind the languages, and a more critical analysis of the languages spotlighted. Although there are over 500 languages listed in the appendix, only a dozen or so get in-depth coverage. The major periods covered:

  • Enlightenment: John Wilkins and his Philosophical Language are the main subject of this period. The 17th century saw the widespread adoption of mathematical conventions, and there was a belief that a language could be designed that removed ambiguity - words would convey meaning exactly as intended. That belief is still a central tenet in much PL design.
  • Idealism: Here we have Zamenhoff and Esperanto trying to bring about peace, love and understanding by sharing a common language. A couple of WWs would tell us that such utopian visions were not quite achieved. But Esperanto has been the most successful invented language in terms of usage. Most of the languages of this period were designed to be easier to learn, and were a mixture of languages - rather than striking out in bold semantic/syntactic fashion. Of course, we have PLs that want to borrow features from many different sources and strive to be easy to learn. Then again, efforts to reduce the number of languages usually have the effect of just creating more languages.
  • Symbols: Charles Bliss and Blissymbolics, with emphasis on non-oral language, in this section covering symbol languages and sign language. Visual PLs are what I thought of here.
  • Logic: Brown's Loglan was started as a roundabout thought experiment on Sapir-Whorf. But the real question it raises is: what if, instead of trying to get AI from programming languages, we used something like a programming language for speaking, writing and communicating in the large?
  • Esoteric: Klingon and other conlangs are discussed in this section, with the emphasis on language as art or puzzle. Esoteric PLs are similar in spirit.

Lots of fun tangential topics (Chinese writing, Hebrew, Tolkien, etc.) are touched on, and some very colorful characters are covered. Not sure if PL designers are quite so eccentric, though I suspect that's only because we are still early in the game for PL evolution.

Announcing the new Haskell Prime process, and Haskell 2010

Simon Marlow:

...with ICFP and the Haskell Symposium approaching we felt it was time to get the new process moving and hopefully produce a language revision...

In the coming weeks we'll be refining proposals in preparation for Haskell 2010. By all means suggest more possibilities; however note that as per the new process, a proposal must be "complete" (i.e. in the form of an addendum) in order to be a candidate for acceptance.

More here.

Phosphorous, The Popular Lisp

Joseph F. Miklojcik III, Phosphorous, The Popular Lisp.

We present Phosphorous, a programming language that draws on the power and elegance of traditional Lisps such as Common Lisp and Scheme, yet which brings those languages into the 21st century by ruthless application of our “popular is better” philosophy into all possible areas of programming language design.

Introduces the concept of the Gosling Tarpit, and presents a novel method for having both a broken lexical scope (needed for popularity) and maintaining one's reputation as a language designer.

(via Chris Neukirchen)

RepRap: the self-replicating machine

The RepRap is a self-replicating machine (3D printer) created by Adrian Bowyer.

Look at your computer setup and imagine that you hooked up a 3D printer. Instead of printing on bits of paper this 3D printer makes real, robust, mechanical parts. To give you an idea of how robust, think Lego bricks and you're in the right area. You could make lots of useful stuff, but interestingly you could also make most of the parts to make another 3D printer. That would be a machine that could copy itself.

There are RepRap machines around the world these days. LtU readers are invited to report their experiences!

Open Source for Hardware?

A recent opencores.com article by Jeremy Bennett.

Open source is well established as a business model in the software world. Red Hat is now approaching the market capitalization of Sun Microsystems, while IBM, the world's largest patent holder, makes more money from open source than from other software (source: BBC Radio 4 “In Business”). Major tools such as the Firefox web browser, the Apache web server and the Eclipse IDE are all open source.

...

Now here's a novel idea. What about open source for hardware? At first sight this seems a non-starter. Open source relies on the nil marginal cost of software distribution, but hardware has to be manufactured.

But a modern silicon chip is typically built from silicon “intellectual property” (IP), written in a hardware description language such as Verilog or VHDL. Fabless design houses may never produce a chip themselves—one of the largest and best known is ARM in Cambridge, whose processor IP is built by other companies into one billion chips every month. That IP costs the same amount to produce, whether it goes into one chip or one billion.

Hardware is software, and open-source hardware looks like a red-hot area these days. Do we have any open-source hardware developers lurking on LtU? If so please say hello. :-)

Oh no! Animated Alligators!

Lambda calculus as animated alligators and eggs. Virtually guaranteed to turn any 4-year-old into a PLT geek.
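For the curious, the alligator metaphor maps directly onto the lambda calculus: a hungry alligator is an abstraction, an egg is a variable, and an alligator eating a family is beta reduction. A minimal sketch in Java (capture-naive substitution, and all class names are my own illustrative choices):

```java
// A tiny untyped lambda-calculus term language matching the alligator game:
// hungry alligator = lambda abstraction, egg = variable, eating = application.
abstract class Term {}

class Var extends Term {               // an egg
    final String name;
    Var(String n) { name = n; }
    public String toString() { return name; }
}

class Lam extends Term {               // a hungry alligator guarding its body
    final String param; final Term body;
    Lam(String p, Term b) { param = p; body = b; }
    public String toString() { return "(\\" + param + "." + body + ")"; }
}

class App extends Term {               // the alligator eats the term below it
    final Term fun, arg;
    App(Term f, Term a) { fun = f; arg = a; }
    public String toString() { return "(" + fun + " " + arg + ")"; }
}

public class Alligators {
    // Capture-naive substitution: adequate for the closed example below,
    // but a real evaluator would rename bound variables to avoid capture.
    static Term subst(Term t, String x, Term v) {
        if (t instanceof Var) return ((Var) t).name.equals(x) ? v : t;
        if (t instanceof Lam) {
            Lam l = (Lam) t;
            return l.param.equals(x) ? l : new Lam(l.param, subst(l.body, x, v));
        }
        App a = (App) t;
        return new App(subst(a.fun, x, v), subst(a.arg, x, v));
    }

    // One beta step: when an alligator eats, eggs of its colour are
    // replaced by what it ate.
    static Term step(Term t) {
        if (t instanceof App && ((App) t).fun instanceof Lam) {
            App a = (App) t; Lam l = (Lam) a.fun;
            return subst(l.body, l.param, a.arg);
        }
        return t;
    }

    public static void main(String[] args) {
        Term id = new Lam("x", new Var("x"));     // \x.x — the identity alligator
        Term y = new Var("y");
        System.out.println(step(new App(id, y))); // (\x.x) y  reduces to  y
    }
}
```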

The non-animated game was mentioned previously on LtU here.