"Future of Programming Languages" panel

From Strange Loop 2010, here is the video for the "Future of Programming Languages" panel which featured Guy Steele, Douglas Crockford, Joshua Bloch, Bruce Tate, and Alex Payne. Ted Neward led the discussion mostly through questions from the audience.

"Future of Programming Languages"

A text summary was

A text summary was contributed at reddit.

From a text summary above:

From a text summary above: "Today's patterns are tomorrow's linguistic constructs. Today's bug patterns are tomorrow's type system features."

This is so wrong!

For example, by the time "patterns" emerged, there were already languages which did not need any of those patterns at all. I cannot find the link, but it was discussed on LtU.

Same for type systems. While you can enrich your current type system with locking analysis, for example, you are better off with dependent types.

Practical languages today

Practical languages today often have gaps in expressiveness, in order to support modularity, consistency or safety analysis, and testing, to simplify optimization or garbage collection, or to achieve predictable performance. These languages use 'design patterns' to cover those gaps in expressiveness. Language design is not a zero-sum game (well proven by Intercal and esoteric Turing tarpits), but the language design space is very large and exploring it is very expensive. Today's design patterns, bug patterns, self-discipline, frameworks, and boilerplate offer efficient and readily accessible case studies for tweaking the language designs of tomorrow. And so, I agree in spirit with what you quoted above.

This isn't to say that design patterns should directly become linguistic constructs. For example, representing complex numbers in SQL involves passing two columns around, everywhere, but I wouldn't wish to solve that by adding 'complex numbers' as a new SQL language feature; rather, some generic sort of 'composite' type, plus perhaps the ability to overload '*' and '+' and related operators, would be more appropriate. When people say design patterns are missing language features, what they really mean is that design patterns are a language smell, and a better language design or choice of features might have avoided the need for the pattern.
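As a minimal sketch of the generic mechanism being advocated here - a user-defined composite type plus operator overloading, rather than a baked-in 'complex number' feature - here it is in Python (the class name is mine, for illustration only):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Complex2:
    """A user-defined composite of two fields, not a built-in language feature."""
    re: float
    im: float

    def __add__(self, other):
        # Overloaded '+': component-wise addition.
        return Complex2(self.re + other.re, self.im + other.im)

    def __mul__(self, other):
        # Overloaded '*': (a+bi)(c+di) = (ac - bd) + (ad + bc)i
        return Complex2(self.re * other.re - self.im * other.im,
                        self.re * other.im + self.im * other.re)

a = Complex2(1.0, 2.0)
b = Complex2(3.0, -1.0)
print(a + b)  # Complex2(re=4.0, im=1.0)
print(a * b)  # Complex2(re=5.0, im=5.0)
```

The point is that nothing here is specific to complex numbers: rationals or quaternions would use the same two generic features, the composite type and the overloaded operators.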

Anyhow, you argue that "this is so wrong!" and refer to expressive languages and languages with powerful consistency systems.

And it is true that we can find highly expressive languages with features such as macros, reflection, compile-time staged evaluation, a mutable top-level, fexprs, open types and functions, and AOP-like cross-cutting code. A high level of expressiveness could allow us to add persistence or reactivity or multi-methods to a language that otherwise lacks them, and effectively avoid many 'design patterns'.

Similarly, it is true that there are languages with rich modeling of safety and consistency properties - term rewrite systems, the calculus of constructions, dependent typing, Hoare logic.

The difficulty has always been in reconciling these features into the same language, in addition to other practical desiderata (such as performance, scalability, modularity). Since they haven't been reconciled, most practical languages fall somewhere in between for expressiveness and analysis and other useful properties, and thus will have plenty of design patterns, bug patterns, and other issues - just, hopefully, fewer issues than their ancestor languages.

Same for type systems. While

Same for type systems. While you can enrich your current type system with locking analysis, for example, you are better off with dependent types.

Isn't that counterexample still arguing for replacing locking patterns with types? While much of the hubbub about types is about new levels of expressiveness (e.g., hey look, there's an SMT solver in there!), an important area is showing how to cast problems into them. E.g., Jeff Foster and Nikhil Swamy have both done some particularly compelling work here. It's often not clear how to do it, so we start with a new analysis -- I know somebody looking at types for locking patterns -- but, after that initial understanding, we have a better shot at reusing existing machinery.

Numeric extensibility in SQL and other languages

I normally use PostgreSQL for my database needs, and I can indeed add Rationals, Complex Numbers, Quaternions or whatever else I need, since the type and operator systems are table-driven - which seems natural for a relational system! Contrast that with Scheme's vaunted Numeric Tower, which gives you what the designers think you might want without providing any means for extending it.

Templates allow me to add compile-time numeric dimension checking to C++ (and unit-converting, too), but very few other languages provide such extensibility. Scheme provides an exact/inexact numeric distinction, but I know of no major language which will help me manage the precision of my inexact calculations - something we were all trained to do back when we used slide rules, but which I never see people doing now.

I would like languages of the future to provide a small clean functional core which includes metaprogramming and other extensibility mechanisms. A good standard library is essential, but so are community extension repositories; otherwise powerful extensions simply don't get used. I do not evaluate a language for its expressiveness in solving my problems; I evaluate it solely for its ability to support arbitrary extensions cleanly and efficiently.

confidence intervals

One of the reasons it is not so easy is that you need to know a lot about the functions you are using to know what to do with imprecision. For example, if I know my actual value x lies between a and b, and my function f is monotonic between a and b (i.e. either non-decreasing or non-increasing), then f(x) lies between f(a) and f(b). If I don't have that assurance then things can get hairy. Think of trying to figure out the precision of a calculation on something like f(x) = 1/x where I'm sure my initial value is between -2 and 2.
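A small Python sketch of both cases (the helper name is mine): for monotone f the image of an interval is bounded by the endpoint values, and the 1/x example shows exactly where that endpoint rule breaks down:

```python
import math

def interval_image_monotone(f, a, b):
    """For f monotone on [a, b], the image of [a, b] is bounded by f(a) and f(b)."""
    fa, fb = f(a), f(b)
    return (min(fa, fb), max(fa, fb))

# Monotone case: exp is increasing everywhere, so the endpoint rule is exact.
print(interval_image_monotone(math.exp, 0.0, 1.0))
# (1.0, 2.718281828459045)

# Pathological case: 1/x on [-2, 2] contains the singularity at x = 0,
# so the endpoint rule reports (-0.5, 0.5) -- but the true image is unbounded!
print(interval_image_monotone(lambda x: 1.0 / x, -2.0, 2.0))
# (-0.5, 0.5)
```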

Assuming your f isn't too pathological, most mathematical languages handle this sort of problem well.

As for not being able to extend Scheme's numeric tower, why not? I don't follow what stops you from just declaring your own numeric types.

SQL and C++

Interesting. I too was trained to account for precision, in first-year college courses and on the job, all within the past 15 years. So the engineering discipline is partially intact, but the software is lacking... particularly in the popular languages and CAD packages (probably because they're 'easier' and 'cheap' compared to the ones that do it right).

I'm surprised you're able to do so much with SQL and C++. They're nasty languages but often the best option for production work, so we might as well make the most of them!
A quick search yields a sample chapter from "C++ Template Metaprogramming" on dimensional analysis. Is that what you're doing, or can you point out some better examples?
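That chapter does the bookkeeping at compile time with C++ templates; as a hypothetical runtime analogue (all names here are mine, for illustration), the same idea can be sketched in Python by attaching a vector of dimension exponents to each quantity:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:
    """A value tagged with dimension exponents for (mass, length, time)."""
    value: float
    dims: tuple  # e.g. (0, 1, -1) means length / time, i.e. a velocity

    def __add__(self, other):
        # Addition is only meaningful between like-dimensioned quantities.
        if self.dims != other.dims:
            raise TypeError(f"dimension mismatch: {self.dims} vs {other.dims}")
        return Quantity(self.value + other.value, self.dims)

    def __mul__(self, other):
        # Multiplication adds dimension exponents.
        return Quantity(self.value * other.value,
                        tuple(a + b for a, b in zip(self.dims, other.dims)))

    def __truediv__(self, other):
        # Division subtracts dimension exponents.
        return Quantity(self.value / other.value,
                        tuple(a - b for a, b in zip(self.dims, other.dims)))

metres = Quantity(10.0, (0, 1, 0))
seconds = Quantity(2.0, (0, 0, 1))
velocity = metres / seconds   # Quantity(value=5.0, dims=(0, 1, -1))
# metres + seconds            # would raise TypeError: dimension mismatch
```

The difference, of course, is that the C++ template version rejects `metres + seconds` before the program ever runs, which is the whole point of the technique.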


There were two things I thought were interesting about this panel. I'll hit the other one in another post. But I was shocked by the universality of the feeling that we are in a window where people are experimenting with new languages; that only a small number of languages have libraries big enough to meet commercial needs, and the cost of supporting a large library is gigantic. The market was going to have to consolidate, and do so quickly. That language people should look at this as a choosing process.

I'm trying to think back over my lifetime in computers, mid-1980s to 2010, for a period when languages were stagnant across the board, and I don't see any. When was this period when there were only one or two choices? I think back to when Visual Basic was king: C++ was directly competing, C was still popular, Perl was exploding in systems administration and the new CGI / web space, Lingo was being used heavily for CD-ROM applications...

In other words, I think it is always the case that there are a small number of dominant languages, a large number of 2nd-tier languages, and an even larger number of 3rd-tier languages. Over time, 2nd-tier languages can become dominant and dominant languages can fall to 2nd tier. Languages that have always been 2nd tier can fall to 3rd or off the map entirely. The 3rd-tier language space is always active and bubbling with innovation.

So what does everyone think about this window notion?


In this panel there was tremendous focus on efficiency. The general feeling was that high-level languages were good, but much, much too slow. So, for example, they were very critical of the Parrot VM, seeing it as a waste. I'm sort of curious what people on LtU think about this point.

Is there room for a powerful but kinda slowish VM? Does such a VM make sense, or does it make sense to just not worry about efficiency? I certainly took advantage of the speed of Perl 5 many times, where I processed large numbers of records as fast as the disk could go. If Perl were 100x slower it wouldn't have been the right choice. Getting Perl 5 as fast as it was in the 4.0/4.36/5.0/5.002 days took tremendous effort.

I myself always wanted Perl 6 to just move ahead directly on Pugs and abandon Parrot. Parrot has already proven itself a much more complex project than Perl 6, and one of nebulous value. It's looking like Perl is going to lose another half-dozen years to Parrot. If Parrot had worked out then I'd have no problem with Perl jumping over, but I think Perl lost out to Ruby and Python by losing a decade of improvements, having bitten off two complex projects at once.

Is there room for a powerful but kinda slowish VM?

None of Bash, Tcl, Visual Basic, Python, or Perl is very fast; they alleviate that by implementing non-trivial operations in lower-level languages.

A VM doesn't need to be fast, as long as the interpreter/compiler and the applications written in it are.
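A quick micro-benchmark sketch in Python illustrates the point: the builtin `sum` runs its loop in C, while the equivalent pure-Python loop pays interpreter overhead on every iteration (actual ratios will vary by machine and interpreter version):

```python
import time

data = list(range(1_000_000))

def py_sum(xs):
    """Same computation as the builtin sum, but every iteration
    executes interpreter bytecode rather than a compiled C loop."""
    total = 0
    for x in xs:
        total += x
    return total

t0 = time.perf_counter(); a = py_sum(data); t1 = time.perf_counter()
t2 = time.perf_counter(); b = sum(data);    t3 = time.perf_counter()

assert a == b  # identical result; only the implementation layer differs
print(f"pure-Python loop: {t1 - t0:.3f}s, builtin sum (C loop): {t3 - t2:.3f}s")
```

The interpreter itself stays "slowish", but the hot operation has been pushed down a layer, which is exactly how these languages stay usable in practice.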

Slowish, but energy conserving VM

A "powerful" VM would be one for concurrent problem solving use cases that eat up a lot of cores, and trade performance for energy.


Rather than 'efficiency', I'd like to see more language designers focusing on other performance metrics: responsiveness (time between input becoming available and being processed), timing (variability between intended time to emit an output and actual time), utilization (ability to make full use of available resources, such as multiple cores; roughly, throughput / efficiency).

Beyond that - I really haven't a clue what you mean by "powerful VM", but I believe there is room for intermediate languages with a wider variety of nice properties than just 'portability', especially if the immediate performance costs can be recouped by an optimizer (possibly one directed by annotations), and if the more expensive operations can be shifted to lower-level languages while we're waiting on improvements in the optimizer.

Tradeoffs and complexity...

Yes, better analysis and control of tradeoffs sounds nice. Latency, timing guarantees, multicore -and- unicore utilization efficiency, RAM, cache, network, etc. Just beware of utopianism.

I'd also like to have better dependency analysis tools... if everyone was aware of the Rube Goldbergian complexity of most software (the panel mentioned "web apps held together by duct tape and wire") then just maybe it would bother everyone else as much as it bothers me! :)

How about slowish -compilers- (including incremental and JIT compilers)? A compiler can be really simple and direct, in comparison to VMs and interpreters, and can be a big win for cache performance, for example. (That's sort of the Ken Thompson school of thought.)

I like the idea of a standardized abstract intermediate language (as opposed to lower-level concrete bytecodes), but my pragmatic side says this adds an unnecessary layer. At least in smaller systems, overall complexity is reduced by using a simple compiler with a few common optimizations, falling back to assembly/machine code for occasional optimizations. Big corporate/govt systems are a different matter, of course.

intermediate language

I like the idea of a standardized abstract intermediate language (as opposed to lower-level concrete bytecodes), but my pragmatic side says this adds an unnecessary layer.

You see this a lot in print languages. For example:

Word -> Postscript -> PCL -> Engine Microcode

or more recently PDF, with some printers being able to handle PDF directly and PDF -> Postscript being rather trivial. In practice, though, Word / Microsoft drivers put out pathologically bad Postscript. Adobe's apps and their drivers put out usable code that you can hand-edit or, even better, clean up with scripts after it was emitted.

Is that the goal? And if not what is the goal?


In terms of what was meant by "powerful", I was quoting. But I can guess; Parrot has some pretty cool features:

-- a Parser Grammar Engine, which allows the virtual assembly to have parsing rules, which makes compiler design easier

-- registers and stacks, allowing the VM to use more low-level algorithms in its implementations and easier translation from the runtime to machine code

-- lots of debugging oriented features

that sort of thing.


As for your ideas, that sounds a lot like real-time operating systems and languages. I suspect you need OS support to make that work.