## Predictions for 2008

So, what are your predictions for 2008? Naturally, we are only interested in predictions related to programming languages...

Three types of predictions are thus in order: (1) predictions about PLT research (directions, fads, major results); (2) predictions about programming languages (whether about specific languages or about families, etc.); and (3) predictions about industrial use of languages and language-inspired techniques (adoption, popularity).

### Predictions

(1) The theme of the year will be concurrency. Hopefully, some research language will move beyond Erlang in that area.

(2) People will realize that new languages need a killer application/framework, like Rails. Authors of aspiring languages will work hard on that rather than on new language features.

(3) F# will be apprehensively adopted for production use in some .NET-heavy companies.

### (1), (2) I agree. (3) - I'd

(1), (2) I agree.
(3) - I'd give it 0.3 probability, if this refers to shipped products rather than internal applications. This would be a very exciting thing, if it happens.

### Upping the ante

(1) As has been the case almost every year in the past, concurrency will kind of be a theme but won't go anywhere, because the current paradigms (including lambdas) and architectures (multi-cores) can't deal with it very well. Maybe by 2010 we will finally bite the bullet and start from scratch to get concurrency that really works; something radical like OCCAM and transputers!

(2) Given a language that is kind of similar to existing languages (at least in paradigms), a killer framework is important. However, language designers can still find lots to do in genuinely new language paradigms (e.g., Subtext).

(3) Scala will finally get a mature .NET backend and become the new language of choice for both the JVM and .NET :)

### (3) Scala will finally get a

(3) Scala will finally get a mature .NET backend and become the new language of choice for both the JVM and .NET :)

Oh god, I hope so...

### My prediction is a little

My prediction is a little bit more conservative, but I predict Scala will gain momentum, and at least one high visibility project will use Scala (aside from lift, that is).

### lambdas?

Perhaps you could expand a bit on why concurrency won't go anywhere because of lambdas...for those of us who have been reading for a couple of years that Lambdas are the ultimate.

### Killer apps are overrated

The idea that "new languages need a killer app" is strange. Who originally presented this viewpoint?

Everyone talks about Rails being Ruby's killer app. When you ask them to explain why, they can't give you technical reasons. All they can do is give you buzzwords like "Active Record", "Active Support", &c. "Sounds Like Rails" should not be the SLOGAN for language designers.

How can language designers be so far removed from the consequences of their actions that they legitimately present language features as subservient to "Sounds Like Rails" phenomena?

Although it would surely be nice to catapult a language into fame, I don't see the point in fame alone. Computer scientists do not seem to see the value in documenting scope analysis details. Such documentation simply doesn't get papers published, but it is what the "real world" needs. So few Rails developers seem to understand why Rails is actually successful. When I explain the reasons to them, they seem startled by how unaware they were of the "magic" behind Rails. Do computer scientists expect to do better problem domain analysis than the programmers who work in that problem domain regularly? Should computer scientists expect undergraduates and graduate students to do better problem domain analysis as well?

If someone here can predict where the manpower for improved problem domain analysis will come from, then THAT is the prediction I want to hear. It seems to me that any progress in developing true "killer apps" starts there, with ideas about how to develop scalable, secure architectures.

Maybe THEN all language designers can understand "Sounds Like Rails" is about rapid application development, and not about actually finishing projects faster... two VERY distinct concepts! Maybe 2008 will be the year language designers pay closer attention to architecture and maintenance. Software engineering researchers like Stephen Schach would certainly benefit from language designers paying closer attention to these two concepts.

### Not really.

[[The idea that "new languages need a killer app" is strange. Who originally presented this view point?]]

Well, Ruby did increase a lot in usage when RoR came along, so apparently a new language needs either a big marketing investment (Java) or a killer app (Ruby) to become successful.

Sure, RoR was possible only because Ruby is a nice language, but Ruby was still not widely used until RoR came along.

### But, really, yes...

Sure, RoR was possible only because Ruby is a nice language, but Ruby was still not widely used until RoR came along.

This quote typifies a common fallacy perpetuated by language enthusiasts who believe "new languages need a killer app". Note that whenever this argument comes up, language enthusiasts never point to the architectural accomplishments of Rails. The crutch is always that Ruby is a "nice language". I suggest doing away with this crutch and exchanging it for problem domain analysis.

Surely others have noticed that we have mounds of academic papers from "the software crisis" about maintenance, but there is a woeful lack of understanding about common problem domains. There are loose bodies of knowledge like Ion Stoica's Stateless Core, Roy Fielding's Architectural Styles and the Design of Network-based Software Architectures, and Trygve Reenskaug's metaphors for planning systems.

Forgive me for possibly being naive, but if these problem domains are so common, then where is the Bob Vila how-to everyone should be familiar with? It's missing, possibly because of the wrong attitude that "new languages need a killer app". This attitude misses the fact that all ideas are transferable. Rails is not about a "nice language"; it is about a set of architectural constraints particularly well-suited to rapid application development, but not necessarily to maintenance and changing scope.

An honest use case that backs up "Ruby is a nice language" is RSpec, which other languages might require an additional parser in order to implement. However, even still, RSpec comes from authors who have thorough knowledge of the problem domain. Extracting that knowledge will likely yield much richer information than mere "design patterns".

As an aside, "nice language", "killer app", and "marketing dollars" do not explain the fact that about 40% of the top 100 web sites on the Internet run PHP. Word-of-mouth marketing only carries so far, too. It seems to me that "killer app" is an illusory, complex way to provide exemplary code and documentation.

### PHP has several "killer apps"...

...driving its popularity, even though it is frequently criticized on its merits as a PL.

Tons of web frameworks and apps are written in the thing, and it works well enough. It excels at what it was designed to do--glue webservers to RDBMSs. It integrates well with SGML-derived markup languages (HTML, XML, and all XML apps).

### don't conflate "killer apps" w/ community success factors...

PHP was not designed to glue webservers to RDBMSs (see http://groups.google.ch/group/comp.infosystems.www.authoring.cgi/msg/cc7d43454d64d133?oe=UTF-8&output=gplain); PHP grew from there. When talking about "growing a language", nothing is more exemplary in its success (and failure) than PHP. Consider the willingness of the Zend Engine contributors to drop the features that contribute the least to project productivity.

Still, I understand your argument. I regularly attend LIPHP and NYPHP, subscribe and post to both mailing lists, and have used PHP on and off for about 5 years. I am well aware of the PHP ecosystem and what those who use PHP actually like about the language. Today, the popular use case for PHP is basically what you described: a simple way to loop over database rows, with a metric ton of standard library functions, and all of the core accompanied by superb documentation. However, the notion that it integrates well with a presentation language is strange. No programming language integrates well with any template language. That's why we need discipline as programmers to keep state out of the template.
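Concretely, that discipline amounts to: do the queries, loops, and formatting in code, and hand the template nothing but finished values. A minimal Python sketch (the function name and template text are made up for illustration):

```python
from string import Template

# Keep state out of the template: all logic lives in code, and the
# template only interpolates finished values. `render_row_count` and
# the template text are hypothetical, for illustration only.
PAGE = Template("<p>Found $count rows for $user.</p>")

def render_row_count(user, rows):
    # Compute everything here; the template never sees raw state.
    return PAGE.substitute(count=len(rows), user=user)

html = render_row_count("alice", [("a", 1), ("b", 2)])
# html == "<p>Found 2 rows for alice.</p>"
```

The template stays dumb enough that a designer could edit it without touching (or breaking) any program state.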

LtU is driven by a so-called "killer app" written in PHP: Drupal. Drupal is an interesting phenomenon. My friend who attends the NYC Drupal Users Group says it is like nothing he has ever seen before. He says there are about 30+ people packed into a room, standing room only. If you look at the success of Drupal, it is far more than a "killer app". It is a huge community. Just like PHP core, Drupal's policy is to drop the features that contribute the least to productivity. Both constantly identify the most profitable set of architectural constraints that ensure their longevity. Both have great documentation.

Drupal has changed a lot. Joomla has changed a lot. Heck, Joomla has been forked once or twice (Mambo) to get where it is now. And for the most part, Joomla is a nightmare to work with. However, there is a lot to learn from the Mambo/Joomla experiment.

Again, I'm interested in predictions on where the manpower will come from for the problem domain analysis necessary to write the same "killer app" over and over again, in any language the implementor wishes.

### Sigh

[[all ideas are transferable.]]
Even in bold, that's just another way to rephrase the Turing tarpit, so it has about the same value, that is to say, none.

[[40% of the top 100 web sites on the Internet are running PHP.]]
So what? It's called the momentum of the installed base (of applications and programmers). COBOL has much more existing code than Scala, OCaml, and Ruby all together, so what?
Shall we keep using COBOL for all eternity?

I remember a chart indicating Ruby interest (I think it was the number of books sold on Ruby); there was a noticeable increase when RoR appeared, but since Ruby started low, this doesn't mean that the incumbent will be replaced anytime soon.

### Businesses will pay you to rewrite their code so long as you can

Businesses will pay you to rewrite their code so long as you can preserve their data. Path dependence is easy to break if you can preserve the data. Many people who use COBOL today do so because they have no choice, in part because the original sources were lost and in part because the documentation is missing.

Shall we keep using COBOL for all eternity?

A false dilemma.

a noticeable increase when RoR appeared

And the Pragmatic Programmers' Ruby book also contributed to Ruby's success. A book is a form of documentation. There were a lot of timely events.

I think you are sighing over a really subtle nuance that most people won't allow me to comment on, because they don't think it matters:

[[all ideas are transferable.]]
Even in bold, that's just another way to rephrase the Turing tarpit, so it has about the same value, that is to say, none.

But it's not rephrasing the Turing tar pit. It's an observation about striving for practical goals. Would you build a language without considering prior successes and failures? You shouldn't. Would you analyze a problem domain without considering prior successes and failures? You shouldn't. An appropriate translation of all ideas are transferable would be that you should listen to people who can teach you something, even if all they can teach you is how they use something. Where are the man-months in 2008 going to come from to seed such a venture?

All ideas are transferable has practical relevance you may not have considered, which absolutely distances it from the Turing tar pit phenomenon. If I have a client who wants MySQL, and I'm only supporting PostgreSQL, but she is willing to pay for MySQL, then she is going to get MySQL!

### Every language needs a killer clique

It's good that you mentioned that the "Pragmatic..." book helped make Ruby more popular.

People should remember that the languages we are talking about such as Ruby and PHP are largely sustained and spread by the so-called "open source community". Without wishing to make claims that are over-reaching, I think it is safe to say that these languages go through very rapid growth when little "success stories" about their commercial application spread by word of mouth and through ancillary marketing -- and then ever-larger circles of open source community members start looking for nails to fit the new hammer.

"Ancillary marketing" is my own concept. When the hackers in the community start talking about one of these languages, companies such as O'Reilly start paying attention. They might blog about it (and they are thought-leader bloggers, often quoted and linked to). They might ask around at conferences. They might make an editorial decision to publish a book about it. They'll then use sales feedback from the book to hint at trends (e.g., "What's hot? Judging by book sales, PHP is waning and Ruby looks like an early-stage upstart."). They are making, from their perspective, observations that are true, interesting, and useful, but the net effect is to hint to the open source community that there is a new job market opening up for experts in the new technology -- so many, many, many programmers begin to train themselves and build up portfolios of work around the technology (afaict).

Now, maybe I'm too far out on a limb. The dynamic I've sketched there is testable. Still, I've watched this community grow since the start, and I've watched the community it replaced since the very early days of the GNU system.

My main point is that to study the economic growth of those languages, you have to study the economic conditions of their spread. It's nowhere near as simple as killer apps or perfect documentation or language features or any other such thing. It's all about what motivates the mob of people that identify themselves with the open source community: social status, access to jobs, trendiness -- those factors don't determine but they "co-dominate" the action.

I happen to believe that this dynamic is very bad for the larger society and that it does amount to the open source community collectively shooting each other in the feet. That's a much bigger question than PLT applies to, though. PLT provides "expert witness" when we want to look for irrational outcomes of the open source social dynamic. But if you want to try to predict how languages win and lose, I think you have to look beyond the technology and into that social and economic dynamic.

-t

### Exactly

My main point is that to study the economic growth of those languages, you have to study the economic conditions of their spread.

I agree completely. However, I want to underline that by reiterating that all ideas are transferable, and that noticing this fact will lead to better engineered systems... which is all I really care about. Problem domain analysis fascinates me precisely because it's so hard to find resources about it. There is a lot of how-to, but no books that contain elaborate results. Instead, you read specifications from ISO or Motif, like ICCCM, and say to yourself, "How did they come up with this specification?" You constantly hear the word "over-engineered" in the programming profession, but little thought is given to helping guide the engineer away from such a mistake. Programming language theory in isolation studies the tools used for implementing those better engineered systems. However, without a strong body of knowledge, we'll behave like cavemen rolling a wheel without an axle. :-)

I happen to believe that this dynamic is very bad for the larger society and that it does amount to the open source community collectively shooting each other in the feet.

How do you feel about Barry Schwartz and the Paradox of Choice? :-)

Now, maybe I'm too far out on a limb.

Eh, there is no such thing. Places like LtU are only useful if groupthink is strongly resisted.

### Haskell Will Rock The Blogosphere...

...but no one will actually use it except for a few 1337 h4x0rs. Instead, you'll see "monads in Python", "applicative functors in C#" and stuff like that.
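Those "monads in Python" posts are easy to imagine; here's a minimal sketch of the kind of example they'd feature, using None as the failure case (the names `bind` and `safe_div` are illustrative, not from any library):

```python
# A bare-bones Maybe monad: None represents a failed computation,
# and `bind` short-circuits the rest of a chain once failure occurs.

def bind(value, f):
    """Apply f to value unless the computation has already failed."""
    return None if value is None else f(value)

def safe_div(x, y):
    """Division that fails (returns None) instead of raising on zero."""
    return None if y == 0 else x / y

# Chain computations that may fail; a failure skips the rest.
result = bind(safe_div(10, 2), lambda v: safe_div(v, 5))   # 1.0
failed = bind(safe_div(10, 0), lambda v: safe_div(v, 5))   # None
```

The joke, of course, is that this is Haskell's Maybe with all the type safety stripped out.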

You've already read this post and were just pretending to predict it right? RIGHT? :-)

### Prediction is very difficult

"Prediction is very difficult, especially about the future"

attributed to Niels Bohr

### Just Kidding

Yes, I was just pretending.

### (1) Hybrid type systems and

(1) Hybrid type systems and approachable compiler hacking. The return of parsers after the XML winter. Parsing is no longer considered harder than using regular expressions. A decrease in the number of new monad tutorials.

(2) Rails is in the downtrend of its hype cycle. People start to look at Scala but its effects are mostly limited to blogosphere articles: Scala will be what Erlang was in 2007. While there is still concurrency-panic it is mostly ignored by practitioners who just throw more processes on their problems. C++ with its new ISO standard for 2009 will become the language-of-the-year in 2008.

(3) Still no Next Big Language despite Steve Yegge's long blog posts. However, Adobe AIR and ActionScript 3 will be considered for serious X-platform applications. AS3 is the newcomer of the year.

### Prediction revisited

Seems like I was wrong about parsing. The DSL business is just too nerdy to make a real difference. For matters of C++ and ActionScript my prediction seems to apply pretty well. Same with Ruby. It was also correct not to mention concurrent programming.

### XQuery, Scheme

(3) Industrially, use of XQuery will continue its (afaict) exponential growth. Facts behind that: mailing list statistics (e.g., from MarkMail) support this. Support for major extensions to XQuery (for updates and "grouping") is expected to start to be widely available and fairly solid. The database giants are investing heavily in quite sophisticated XQuery implementations (afaict). Perhaps my technology, Flower, XQVM, and all that, may play a role here. With or without my stuff, XQuery is exploding.

(2) Predictions about programming languages: About Scheme in particular: the real de facto R6 will emerge from the ashes of controversy. By the end of the year, people outside of the Scheme community will be startled by the sudden bump up in practical utility of many familiar implementations and two of the newer implementations will make surprisingly great progress. Support behind it: loose observations of how a whole bunch of implementations and efforts are currently evolving. The R6 effort's rough patches have had the side effect of better focusing work all around -- it was an excellent debate in content, though perhaps not entirely in form.

(1) About PLT: pass.

-t

### I will chicken out and say

I will chicken out and say that a year is too short a time period to make any interesting predictions in this field. As the chicken I am, I will also not offer any predictions for the longer term.

### My Predictions

1) F# and Scala will enjoy large uptake, as the top 10%-25% of programmers realize that they probably need to learn a functional language, and these will be the easiest to learn, for two reasons: 1) they reuse a well-known programming environment with large, well-known libraries, and let you "fall back" to known languages and paradigms when the pressure is on to get something done (in this sense, Scala and F# will act like C++ did when people were adopting OO, allowing them to drop back to C when they needed to get work done); and 2) being "hybrid" languages, they don't require you to relearn all new data structures and ways of thinking all at once.

Functional languages won't make the leap across the chasm to the non-elite programmers this year.

2) Haskell will remain the headwaters for new ideas in programming languages, and ideas from Haskell and ML will continue seeping into popular programming languages like Ruby and Java, driven by a desire to "stem the tide" of the best programmers to functional languages. Also, there'll be continuing unrest in the mideast, and the sun will rise in the east and set in the west.

Now, starting to go out on a limb:

3) There will start to be "multi-language" projects, projects coded in some combination of F#/C# or Scala/Java. A popular combination will have the "heavy lifting" portions (long on algorithms and data structures and computation) done in the functional language, and the GUI/UI code done in the OO language. Many of these projects won't be publicized as such, as they'll just be a small group of developers deciding to rewrite their hunk of code in a functional language, and forgetting to tell management they did this (not unlike how many enterprise Linux installations went on in the late '90s).

4) Ruby will lose its shine. Actually, more specifically, Rails will lose its shine. More critics, like Zed, will show up. Most of the defectors, however, will simply go back to PHP and Java.

5) Languages will start showing up looking to be the successor to Haskell. These languages will have Hindley-Milner type systems, be purely functional, and use monads, but will not be lazily evaluated. The goal of these languages will be to solve the problems Haskell has with space leaks and with judging the space/time costs of complicated algorithms in "production" code. Note that none of these languages will actually manage to dislodge Haskell from being king of the theoretical hill (at least in 2008, and maybe not ever); I'm just saying they'll start showing up.

### 1) I bet the top 10% of

1) I bet the top 10% of programmers already know FP, but they just aren't allowed to program in it very often (for works-for-hire).

2) Somewhat agreed, but as usual the ideas won't be widely understood until they are translated to more pragmatic languages (e.g., FrTime/Scheme vs. Yampa/Haskell for me personally!).

3) Disagree. Scala can completely supplant Java: it's got FP features, sure, but it's also got advanced OO features (mixins!). You don't need Java at all if you are willing to use Scala.

4) Dynamic languages were around before Ruby and will be around after. Better bet that defectors will go back to Perl or Python, but I wonder if Ruby will lose its shine. Its meta-programming facilities are really cool.

5) I don't think there is such a strong demand for a successor to Haskell. I think we'll instead see the recent trend of pragmatism to continue for a while longer.

### Predictions for Scala in 2008

Sean, I was hoping you would "predict" that Scala will have a more fully featured IDE in 2008 ;-)

Anyway, here are my "safe" Scala predictions:

1) Book comes out in dead tree form
2) More fully featured IDE or two
3) In part because of the previous two, there's even more blogosphere buzz - positive comments from many more Java community luminaries and from other directions. Some more complaints from the Java masses that it's too weird either syntactically or semantically. (very safe prediction since it's already happening). A few (very few) relatively well thought out debates on the merits of the proposed Java 7 standard vs Scala or Scala vs Python/Ruby/Smalltalk or Scala vs Haskell or Scala vs Ocaml/F# or Scala vs Erlang.
4) A few moderately high profile open source Java projects take up Scala seriously - mixing it with their existing Java sources for pragmatic reasons. A few not so high profile proprietary projects do the same thing. This comes with mixed success, though more success than not - the code is cleaner, but the build systems are more complicated and some participants struggle with the advanced concepts.
5) A lot of lower profile projects (both open source and proprietary) are kicked off entirely in Scala.

Here's one that is more wishful thinking than prediction. Or, maybe it's a prediction but the 2008 time frame is wishful.

6) People inside Google let the world know that many of its developers have been playing with Scala during their free 20%, including experimentally porting some internal Java apps to Scala with good results. Google management has become convinced to evaluate Scala as an addition to its approved languages for internal apps. The evaluation will even examine the feasibility of phasing Java out for new development. The Scala community plasters this news all over every Java blog, Java user group, or cafe that sells cups of java. The resulting flood of interest causes Martin Odersky to half-wish he had followed the Haskell unofficial motto of "avoid success at all costs."

I suppose some other big player could leak Scala plans, in particular WebLogic or IBM, but my bet would be on Google first, since they're more likely to have the developers who are smart enough to recognize Scala's power plus the corporate culture to listen.

### That wouldn't be a

That wouldn't be a prediction. Every weekend I get closer, but my new job is also getting busier :) I hope I can hack on the IDE more during the CNY holiday. Here is number 7 for Scala:

7) The .NET backend gets fixed and Scala becomes the new language of choice for .NET programming. Ok, this is really more of a personal wish than prediction.

The problem with Google, Microsoft, and IBM et al is that they are already backing other languages (e.g., Google has Guido). There are no big name internal advocates for Scala in any of those companies.

I don't think there is such a strong demand for a successor to Haskell. I think we'll instead see the recent trend of pragmatism to continue for a while longer.

Agreed. Haskell's direction is and will continue to be driven by the research community. Pure and eager aren't that interesting a research combination, even if there is some pragmatic value in it. And the industrial community that values the pragmatism of eagerness doesn't yet know how to evaluate the pragmatism of purely functional code.

Some future Haskell-like purely functional eager language is certainly possible, but not in 2008 - not with any significant uptake in either the industrial or research settings. I can safely predict that, 10 or 20 years from now when language X is hot, some Paul Graham-like pundit will be saying "I don't understand why X is getting so much attention now - Haskell did all that decades ago and did it better." :-)

### And the industrial community

And the industrial community that values the pragmatism of eagerness...

Just a side note. I've never heard anyone talk about the industry as a community before. Maybe there is still hope in 2008?

### Industrial community

You're right. It's a terrible turn of phrase. More or less I meant "all the software people who aren't in research." Calling such a disparate mob a "community" is a misapplication of the word.

### Pure Functional and Eager

Coq with extraction to OCaml already fills the role of a pure, eager Haskell-like language. I don't think there is a need to fill a role that is already filled.

Coq suffers from not having a large programming-oriented library, and from a very small community that is almost all programming language researchers or mathematicians. A bit of work, however, and I think we could easily see it stealing programmers from Haskell.

### I don't think most Haskell

I don't think most Haskell users would consider Coq to be Haskell-like. No type classes, no sugar for monads, no general recursion (at least not without providing an explicit proof of termination), no layout rule...

### coq-svn

The development version of Coq has (experimental) support for type classes. You can use the Notation syntax to get the syntactic sugar for monads almost trivially.

### Oh No...

There are languages more esoteric than Haskell...and I was just getting the hang of type classes.

### Good ones

These are my favorite predictions so far. (I'm too lazy to come up with any of my own, so I'll just kibitz instead.)

I sort of agree with Sean on (3): I think such multi-language projects will happen, but I think the advantages of Scala (at least, I don't know F#) in reducing run-of-the-mill Java boilerplate (getters/setters, etc.), will quickly make people want to use it for the "dumb" parts of the project as well. The smooth curve away from existing ("enterprise OO") approaches and techniques will make it possible to do multi-paradigm projects in a single language.

I disagree with Sean on your (5). I think there's lots of reasons to hope for a successor to Haskell. I don't know whether we'll see an obvious one this year, though. Just "eager Haskell" isn't enough. There are a lot of possible directions for such a language to go, and without being as far out as Epigram, for example, it's not entirely clear which direction would be most fruitful. In fact, a good successor to SML could just as easily take the crown here. Toss in Chakravarty et al.'s "modular type classes" and an otherwise more powerful type system, and you could have a really killer ML. On the other hand, a lot of what makes Haskell great is the personalities involved, and I don't know any way to replicate that. Anyway now I'm really just rambling...

### Scoring myself

1) Called this wrong. There has been a fair amount of uptake of F#, at least relative to the size of the functional programming community, but compared to the size of the C# programming community, there has been very little uptake. Where I went wrong was being way too optimistic about how willing Windows programmers were to use a tool that was not "officially released". That, and being too optimistic about how quickly F# would be officially released.

2) Got this right, but it was a freebie. But I'll take the points I can get.

3) Again, I got this wrong- but mainly because I was too optimistic about the uptake of functional languages.

4) I'm going to claim I got this one right, although there is an army of Ruby fanatics willing to argue that I didn't. What I didn't see was the uptake of Groovy among Ruby developers. So I got the "they'll move back to Java and PHP" part wrong, and I'm giving myself half credit here.

5) I'm going to have to say I called this wrong. For the life of me I can't name a single language that's in the "replace Haskell" category. I don't count Clojure in this category because that's really a Lisp dialect, and doesn't have a HM type system, and really isn't positioned as a Haskell-replacement (Scala, maybe, not Haskell).

So, 1.5 right out of 5, or a 30% hit rate. And that's counting the freebie. I suck.

### Convergence

The intense variety of programming languages has to moderate sometime. There aren't so many different kinds of mathematical notation for the same thing. It is natural for humans to gravitate to different languages when separated. Look at all the natural languages and their differences. But now that we are all connected, even the universe of natural languages is converging, at least in the connected space of the internet, towards English.

Ideas will win out through some sort of Darwinian unnatural selection. Cats and dogs proliferate by adapting to humans. Some languages proliferate by attaching themselves to large corporations or movements. While "Quality" plays a role, evolution consists of a series of accidents, too. The divergence of life into so many different species is an example of the countervailing force of divergence. But society needs to have fewer, better programming languages.

Six things winnow out the contenders:

1. We're heading toward many projects in the zone of many tens of millions of lines of code. We need higher level languages to reduce the code size of these hyper-complex projects.

2. The lust for speed still pushes toward generating machine code.

3. The need to understand code pushes toward both terseness and simplicity.

4. The need to get it right pushes toward languages that support machine proofs and verification, especially as projects get bigger.

5. The split into multiple cores etc. pushes toward new paradigms for parallelism.

6. The need of people to collaborate and communicate pushes toward the convergence of many languages into fewer, or one. The difficulty of producing languages and compilers that meet all these needs also pushes toward convergence.

Frederick

### (1) Concurrency,

(1) Concurrency, provability, semi-/dynamic/static type systems... mostly what is already there.
(2) Haskell, the Holy! No more to say. But there will be more attention to the structural/syntactic composability of code (Lisp macros), which has its advantages over plain ADT expressions; and parenthesis-less Lisps with semi-/dynamic/static type systems, like Dylan, Goo, or REBOL, will appear. New module systems will be developed to put constraints on side effects at the module level.
(3) Business-oriented shops will start to understand that good programming languages/methodology/culture matter. So life will be more beautiful and fertile for developers.

### GC crashes under inefficiency

Haskell, F#, Scala, ML, Ruby, Java 'commercial hoping' continues and plenty of people lose jobs.

### No Javascript predictions?!

No Javascript predictions?! I am stunned.

### Not really a PL prediction...

but Ajax continues to dominate web apps, and will increase its market share. The new Javascript will be a big part of that--though changes in the language itself won't be the big reason.

Reasons JS/Ajax will continue to succeed:

* Remains the best cross-platform solution for rich web content. Nobody trusts MS/Silverlight to be cross-platform, and many view it as yet another attempt by Redmond to achieve more vendor lock-in, regardless of the technical merits. Adobe's seeming inability to deliver a 64-bit version of Flash will really start to hurt in the coming year, as more and more PCs ship with 64-bit CPUs and OSs.

* It is the preferred platform of Google. Sometime this year, YouTube will provide a video delivery mechanism which doesn't use Flash.

* The main weakness of Ajax compared to Flash--authoring tools for the web developer without a strong programming background--will be addressed. The new JS3 standard will help here. However, expect IE8 to introduce various incompatibilities with Javascript 2.0... and expect more and more PC users to migrate to Mozilla as a result.

Javascript 2.0 the language will be regarded as an improvement. It will start to see increased use on the server side, though it won't dislodge any of the "P" languages (or Ruby) anytime soon in the area of dynamically-typed glue languages. Future versions of the language will focus on the server side.

Oh, and at least one additional (i.e. one who hasn't done so already) notable Lisp hacker will proclaim JS the true successor to Lisp, and announce such in his blog. Other Lispers will take notice and debate the merits of JS. The programming world at large will go on as before. :)

### Who's Nobody?

* Remains the best cross-platform solution for rich web content. Nobody trusts MS/Silverlight to be cross-platform, and many view it as yet another attempt by Redmond to achieve more vendor lock-in, regardless of the technical merits. Adobe's seeming inability to deliver a 64-bit version of Flash will really start to hurt in the coming year, as more and more PCs ship with 64-bit CPUs and OSs.

Silverlight is already cross-platform, with Microsoft developing the Mac version themselves. Now whether Linux zealots trust Mono developers to do the job or just hate Silverlight because it's a Microsoft product is another story.

I don't believe that Java applets will make a comeback. It's just one of those timing issues. So that basically leaves Silverlight to be a competitor in the really rich RIA space. Silverlight 2.x with the mini-CLR seems to give Microsoft a leg up on Flash on the technical advantages front.

### Cross-platform means more than that

Cross-platform means more than having implementations exist for multiple OS's, including non-MS offerings. MS has supported Apple for years. Apple isn't going to be a credible threat to MS in the operating systems market for quite a few years.

The biggest threat to MS remains the open-source ecosystem. Linux is a big part of that, and the most visible part of that as it's the component that competes directly with Microsoft's flagship product, Windows. Will Silverlight be fully-supported within the open source ecosystem, free of any "people who use XXX might get sued" FUD? If Silverlight becomes successful, will MS refrain from erecting barriers (technical, legal, or whatever else) to its continued use on open-source platforms? MS, unfortunately, has a long history behind it, so you'll have to forgive the community of "linux zealots" if they don't believe MS to be entirely trustworthy. (That said--bad reputations can be repaired; just ask IBM. Maybe in 2015, a "reformed" MS will lead the way to freedom against an obnoxious, entrenched Google monopoly. History can and does repeat itself).

Agree with you about Java. Nowadays, Java doesn't do anything that isn't done equally well by other languages in the GC + OO + Generics + Static Typing space. The JVM platform, rather than the Java language itself, may end up being the most important contribution of the Java product.

### The JVM platform, rather

The JVM platform, rather than the Java language itself, may end up being the most important contribution of the Java product.

Other than ubiquitousness, is there anything to the JVM that isn't done better in other platforms?

### Cross-platform means more

Cross-platform means more than having implementations exist for multiple OS's, including non-MS offerings. MS has supported Apple for years. Apple isn't going to be a credible threat to MS in the operating systems market for quite a few years.

Oh really, then what does it mean? It supports Linux? How about the VIC 20?

The biggest threat to MS remains the open-source ecosystem. Linux is a big part of that, and the most visible part of that as it's the component that competes directly with Microsoft's flagship product, Windows. Will Silverlight be fully-supported within the open source ecosystem, free of any "people who use XXX might get sued" FUD? If Silverlight becomes successful, will MS refrain from erecting barriers (technical, legal, or whatever else) to its continued use on open-source platforms? MS, unfortunately, has a long history behind it, so you'll have to forgive the community of "linux zealots" if they don't believe MS to be entirely trustworthy. (That said--bad reputations can be repaired; just ask IBM. Maybe in 2015, a "reformed" MS will lead the way to freedom against an obnoxious, entrenched Google monopoly. History can and does repeat itself).

Once you get out of "the community" bubble, you'll see that your "nobody" is a very small percentage of everybody.

### Well...

First off... this thread is starting to divert from the topic (language predictions, JS in particular). The point of the MS criticism above wasn't simply to bash MS, but to discuss an issue which may affect the future prospects of Javascript.

It's certainly safe to say that "cross-platform" doesn't include the Vic-20.

That said... cross platform does mean more than MS and Mac. It also means more than Linux--desktop Linux is *not* the platform which will ultimately drive this, at least not in the short term. Embedded devices are probably a bigger factor--whether running Linux, some flavor of embedded Windows, PalmOS, or some other operating system. These devices tend to have a strong dependency on the free software ecosystem to some level or another.

Cell phones, in particular, are more and more converging towards open source platforms, especially at the application layer. The cell phone is what might save Java--an interesting development, given that Java was initially conceived as an embedded language before being re-targeted towards web applications.

Like it or not, there are quite a few folks out there who perceive Silverlight as little more than an attempt by MS to "recapture" the web--to make use of a standard which it controls necessary for a good browsing experience. It tried this stratagem once before with ActiveX; and may have succeeded had ActiveX controls not been such a mammoth security hole. The difference is that this time, MS's competition isn't the tandem of a Unix server vendor dabbling in programming languages with a cash-strapped Web startup which struck gold with web browsers. The air supply of Mozilla and the rest of the community will not be cut off so easily, especially with IBM and Google (among others) as allies.

It isn't just RMS and his fellow-travelers, either, who distrust Redmond. More and more people in positions of significant influence are wary of dealing with Redmond or depending on its technology. Whether or not these criticisms of Microsoft are correct is beside the point for this topic--they are out there, and I'm stating that many potential Silverlight customers (both on the content production side, and content consumption side) are reluctant to use or support it for no reason other than who produces it. In fairness, I should note that there exist many other organizations who will deploy it precisely because it does come from Microsoft, and thus enjoys the status of a de-facto standard.

Again, I must emphasize that my remarks are not intended as a criticism of the technical merits of Silverlight, or .Net, or any of the languages on the .Net platform. But as we all know, many technical decisions are made based on non-technical reasons.

### That said... cross platform

That said... cross platform does mean more than MS and Mac. It also means more than Linux--desktop Linux is *not* the platform which will ultimately drive this, at least not in the short term. Embedded devices are probably a bigger factor--whether running Linux, some flavor of embedded Windows, PalmOS, or some other operating system. These devices tend to have a strong dependency on the free software ecosystem to some level or another.

And we can expect Silverlight to make an appearance on Windows Mobile sometime in the future. So even by your suspect definition, Silverlight will be cross-platform.

Like it or not, there are quite a few folks out there who perceive Silverlight as little more than an attempt by MS to "recapture" the web--to make use of a standard which it controls necessary for a good browsing experience. It tried this stratagem once before with ActiveX; and may have succeeded had ActiveX controls not been such a mammoth security hole. The difference is that this time, MS's competition isn't the tandem of a Unix server vendor dabbling in programming languages with a cash-strapped Web startup which struck gold with web browsers. The air supply of Mozilla and the rest of the community will not be cut off so easily, especially with IBM and Google (among others) as allies.

It isn't just RMS and his fellow-travelers, either, who distrust Redmond. More and more people in positions of significant influence are wary of dealing with Redmond or depending on its technology. Whether or not these criticisms of Microsoft are correct is beside the point for this topic--they are out there, and I'm stating that many potential Silverlight customers (both on the content production side, and content consumption side) are reluctant to use or support it for no reason other than who produces it. In fairness, I should note that there exist many other organizations who will deploy it precisely because it does come from Microsoft, and thus enjoys the status of a de-facto standard.

You're just wrong in your assumption that the number of people that care about Microsoft's business practices is significant enough to make or break Silverlight. Don't worry, you're not the first to make that mistake.

Again, I must emphasize that my remarks are not intended as a criticism of the technical merits of Silverlight, or .Net, or any of the languages on the .Net platform. But as we all know, many technical decisions are made based on non-technical reasons.

And objective analysis tends to get obscured by personal opinions that are in no way indicative of overall trends.

By the way, the Mono/Novell folks are developing an open source version of Silverlight (they're calling it Moonlight), with at least tepid support from Microsoft.

Now back to your original comment...

Nobody trusts MS/Silverlight to be cross-platform, and many view it as yet another attempt by Redmond to achieve more vendor lock-in, regardless of the technical merits.

Your assumption that enough people care about your definition of cross-platform to make or break Silverlight is just plain wrong. Yes, obviously Javascript/Ajax will continue to be an important part of development, but considering the limitations of HTML/Javascript/CSS, Flash being basically the only game in town right now, and the technical and other advantages of developing for Silverlight, I can't see any way for it not to become fairly ubiquitous within several years.

### Those were the famous last

Those were the famous last words of Java Applets and ActiveX... And judging by the overhead, you can forget Silverlight ever running more efficiently.

The devices battle will not be won by bloated and verbose UI tech, and there is nothing in Silverlight that will end the competition over what actually matters: useful services and data. How you visualize them will change many times over, and across platforms.

Silverlight hangs about as often as IE crashes by itself with plain DHTML... a long way from being ubiquitous, or at times even usable, despite what some (MS and .NET vendors) would like you to believe.

It's been over 3 years since they announced it and 1 year in production. The result: CPU and RAM overutilization (and no, that never went down with any runtime framework known to man; Adobe is much smarter about it, by the way).

### Why JS?

notable Lisp hacker will proclaim JS the true successor to Lisp

No, no, no. Everybody knows that Arc is the true successor to Lisp. :-)

Kidding aside, Arc may see the light of day this year. I'm not quite willing to make that a prediction.

But I have to wonder, why is anybody proclaiming JS the true successor to Lisp? It doesn't seem any more Lispy than say Ruby or Python.

### Successor to Lisp

The only POSSIBLE (that is, maybe a) successor to Lisp is REBOL, which already has a good implementation too.
Take a look at it.
:)

### Why?

But I have to wonder, why is anybody proclaiming JS the true successor to Lisp?

Because everyone who dislikes Lisp syntax (which is the reason for much of its power) has to have an alternative that they claim has more power. Javascript is just this person's flavor of the day.

You know, it would be a lot easier if everyone would just understand that you won't have a more powerful language without Lisp syntax (all of the crappy syntactic markers in most languages just get in the way of transformations) and just give in to the way of the parentheses. Saves a lot of time on writing and maintaining parsers, too.
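The claim about syntactic markers getting in the way of transformations can be made concrete even outside Lisp. Here is a minimal sketch in JavaScript (illustrative only; all names invented): code is held as nested arrays, i.e. s-expressions, and a "transformation" such as constant folding is just a recursive walk, with no parser or extra syntax to fight:

```javascript
// Code as data: ["+", 1, ["*", 2, 3]] stands for (+ 1 (* 2 3)).
// With no extra syntax in the way, a transformation is just a
// recursive walk over nested arrays.
function constantFold(expr) {
  if (!Array.isArray(expr)) return expr;      // atoms pass through
  const [op, ...rawArgs] = expr;
  const args = rawArgs.map(constantFold);     // fold sub-expressions first
  if ((op === "+" || op === "*") && args.every(a => typeof a === "number")) {
    return op === "+" ? args.reduce((x, y) => x + y, 0)
                      : args.reduce((x, y) => x * y, 1);
  }
  return [op, ...args];                       // leave unknown forms intact
}

console.log(constantFold(["+", 1, ["*", 2, 3]]));   // 7
console.log(constantFold(["f", ["+", 1, 1], "x"])); // [ 'f', 2, 'x' ]
```

Whether you find this a compelling argument for parens or merely an argument for keeping an AST handy is, of course, exactly the debate above.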

### I'm not entirely sure

why Javascript gets praised as the "successor to Lisp" (note that I don't make that claim myself; I'm merely repeating what many others have said), whereas other dynamically-typed languages with first-class HOFs and lambdas, eval(), and a somewhat clean syntax (i.e. not Perl :) aren't.

The claim that S-expressions, rather than a more complicated syntax, make a language more powerful...is a curious one. Certainly, the syntactic complexity of C++ or Perl should be avoided, but the syntax of JS is relatively clean. Javascript programmers seem to have no problem composing new functions and passing 'em off to eval(), despite the slightly more complicated syntax.

In designing the syntax of a language, one has to consider both the ease of humans to read and write the language, and the ease of programs to generate and analyze the language. The first is highly context-dependent, as programmers are far better at reading and writing syntaxes they already understand.

Sexprs do very well on the second measure; as sexprs are about as simple as you can get. Many programmers report that sexprs are difficult for them to read and write. Although experienced Lisp/Scheme programmers have no trouble, there seems to be a bit of a cognitive barrier.

A language like Fortress is probably the opposite. I know little about Fortress, so my remarks here may be way off base--but from what I understand, it's designed to be highly readable and writeable, and to permit people to write code in the notation of the problem domain as much as possible; in particular, the domain of mathematics. It wouldn't surprise me to learn that *parsing* Fortress code correctly is a non-trivial task, and that metaprogramming in Fortress--at least at the level of the surface syntax--is an even bigger pain. (Again, I don't know).

But what of other syntaxes--like Javascript? It's a bit more complex--having {} braces in addition to (); having explicit term-separating tokens (besides simple whitespace), and a few keyword forms are syntactically neither an atom nor a list of some sort. But it's not horribly complicated, and there seems to be little technical obstacle to writing Javascript parsers or generators. And the additional tokens and forms seem to give lots of visual cues to the reader, that aren't present in sexprs.

I might further point out that writing and maintaining parsers is a long-solved problem in our discipline. I could write a JS parser from scratch in less than an hour. I could knock out a sexpr parser from scratch in even less time, but so what? I don't have to write either--both already exist in numerous forms. Sexprs do have the advantage of not needing a stack to parse (you'll still need a semantic stack if you wanna do anything useful), but so what?
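For scale, the sexpr half of that boast really is only a few lines. A from-scratch sketch in JavaScript (illustrative, not production code): a tokenizer that isolates parens, plus a recursive reader:

```javascript
// Tokenize: parens become their own tokens; everything else splits on whitespace.
function tokenize(src) {
  return src.replace(/\(/g, " ( ").replace(/\)/g, " ) ")
            .trim().split(/\s+/);
}

// Read one expression from the token stream (mutates `tokens`).
function readExpr(tokens) {
  const tok = tokens.shift();
  if (tok === "(") {
    const list = [];
    while (tokens[0] !== ")") {
      if (tokens.length === 0) throw new Error("unbalanced parens");
      list.push(readExpr(tokens));
    }
    tokens.shift();                    // drop the ")"
    return list;
  }
  const n = Number(tok);
  return Number.isNaN(n) ? tok : n;   // numbers become numbers, the rest are symbols
}

function parseSexpr(src) { return readExpr(tokenize(src)); }

console.log(parseSexpr("(+ 1 (* 2 3))"));   // [ '+', 1, [ '*', 2, 3 ] ]
```

A comparable hand-written JS parser is a weekend rather than a coffee break, which is the asymmetry being argued about here--even if, as noted, neither needs writing in practice.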

Going forward, a good solution to the "syntax problem" is probably different representations. Syntax trees (or more generally, syntax graphs) can easily be represented in a machine-friendly format like sexprs or XML. Authorship can occur in more readable formats, for which a well-defined translation to the core format occurs; this readable format may or may not be able to encapsulate code written in the core format, for those who choose to do so.

But I'll have to disagree with your assertion that sexprs are necessary for efficient metaprogramming. There are far too many counterexamples of productive metaprogramming occurring in languages with more complicated syntaxes, for that claim to stand. Even in things like C++ and Perl, which IMHO have the ugliest syntaxes of any modern general purpose PLs (modern excludes early Fortran dialects, and things like Intercal :), people can effectively write macros and other functions which transform program text.

Finally, I'll note that in many other programming language communities, means other than syntactic transformation (macros) are often preferred for metaprogramming. Perhaps it's due to poor implementations (the C preprocessor has given macro systems a bad name among many programmers), but macros (hygienic or otherwise) are de-emphasized in many language communities--and some languages don't support them at all.

### Hating

It is not (all) about hating syntax. It is possible to remove parens from Lisp and it is still Lisp, not another language. REBOL is Lisp in nature and has its own macro system. Other efforts have taken place, like Dylan (a paren-less Lisp first developed by Apple) and Goo (a dialect of Scheme).
When I protest about parens (personally I have no problem with them), the strange thing is why the "most intelligent people on earth" cannot think more practically. IT IS POSSIBLE to remove parens from Lisp without giving anything away. So insisting on keeping those parens around seems to be a kind of... you name it! ;)

### Well, let's put it this way...

Lisps have had periods where people have tried to integrate "more normal" syntaxes. These attempts went over like a lead balloon. You do need some way to delimit expression bounds, and Lisp's method is the next-to-most lightweight. Forth's postfix syntax (just in case anyone wants to accuse me of being completely provincial WRT Lisp) is actually the lightest-weight way of doing this. Syntax beyond proper delimitation of sub-expressions makes defining transforms more difficult and often limits the contexts within which the transformations can be applied.

And, in case you didn't know, Dylan originally had Lisp syntax. Check out the first Dylan reference manual, published by Apple, where it was initially developed. Being developed at their Cambridge (as in MA) Research Lab, it's not unusual that it was initially developed as a Lisp dialect. The transition to a heavy-syntax form was made to make the language more palatable to the programming world at large (a goal whose outcome is still unclear). The downside of this change was that it made the definition of macros more difficult.

### Mutltiple solutions?

The programming language world (in the sense of the wider programming community, not research) seems in love with the idea of finding the perfect solutions to all aspects of programming, and with the idea that all these solutions will live happily inside one language.

An alternative is to view the demands on programming languages as separating the solution space into various niches. It seems to me that this metaphor is helpful in many cases. Specifically, I would argue that Lisp-like syntax is the clear winner in a specific niche (the niche that includes several important aspects of meta-programming, including macros). So instead of looking for a Lisp-like language without a Lispy syntax, I would argue that Lispy syntax is one of the few cases in which we have a clear winner in a specific niche (or island in the solution space, if you will).

Seems to me that if this is the case it is better to look for good solutions in other niches, than to try to mess around with a solution that stood the test of time, and outlived several supposedly better adapted descendants...

### Perfect solutions to all aspects of programming

I definitely think it is necessary to have such a thing. As systems get more and more complicated, mind-set shifting at work becomes more and more expensive and risky. Imagine a project in which you are programming in (at least) 3 languages at a time (is this rare? No! I write ASP.NET (C#), SQL, and JavaScript every day!). How many times have you attacked a problem in SQL with an imperative mind-set? Using JavaScript like C# is not impossible either; but then what is JavaScript good for if I ignore the power of its prototype-based inheritance, its higher-order functions, and so on?
Lisp could be that one language if it tried to be. But Lisp is too busy with itself to look around and provide solutions for other developers!
REBOL hides parens by giving functions a fixed arity. That is a kind of restriction, maybe a disgusting one, and yet a lot of improvement too! (Here I do not mean "use REBOL"; I am just talking.) Its dialect system is as rich as Lisp macros (after all, it derives from Lisp).
So I think parens are more like a costume than a real appreciation for s-expressions.
Thanks!

### Technical issue... or cultural?

As you point out, many Lisp dialects/implementations have attempted to integrate or use syntaxes other than sexprs; from the original "m-expressions" proposal of McCarthy and co, to things like Dylan.

None have proven successful among the Lisp community.

Is this a technical issue, or a cultural issue? Just as much as the reluctance to use sexprs found in non-Lisp programmers may be due to inertia rather than the relative merits of sexprs versus other approaches, I suspect that many in the Lisp community have a cultural bias against anything but. Not that there's anything wrong with that; if you have trained your brain to parse sexprs when you see 'em on the screen, going to something else might generate resistance.

Remember Wadler's law. People think syntax isn't important--but it is. I'd rather use a PL with horrible syntax and excellent semantics than the reverse, but many programmers won't discover the beautiful semantics of even the most elegant language, if they don't like its look and feel.

Keep in mind: I'm not asserting that any particular syntactic style is the *correct* one, or promoting/trashing sexprs per se; I'm merely noting that many programmers are uncomfortable with them (and won't consider using Lisp because of it).

At any rate, it's unfortunate that Dylan didn't enjoy much success as a PL (even though some folks are trying to keep it alive). I doubt that syntax played a role in its demise; Apple wasn't positioning Dylan as a replacement for Lisp, or attempting to otherwise woo Lisp programmers who might prefer sexprs. Instead, they were aiming for a wider programming community--a group with no desire for sexprs, and possibly a dislike of them.

### More Difficult Things

I totally agree there will be some downsides. After all, no pain, no gain! ;)
But these difficulties live at the heart of the compiler (or interpreter) itself. For example, consider type inference in Haskell; it would be almost impossible for me to implement such a brilliant type-inference system all by myself. Yet I can consume and use it in my work (in this case I am a developer/user).
The same could be true for Lisp. With so much processing power flying around, why not try to build a more powerful core (with fewer parens, of course!)?
Cheers! :)

### iLISP

It would be nice if LISPs -- all of them -- had a way to translate to/from an indentation based syntax.
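As a sketch of what one direction of such a translation might look like, here is a tiny indentation-to-parens converter in JavaScript. The mini-format is invented for illustration (it is not any real "iLISP"; real designs like sweet-expressions are more elaborate): each line becomes a list, with the more-indented lines below it as its children, and a leaf line is emitted verbatim:

```javascript
// Translate an indentation-based form into parenthesized s-expression text.
// Each line becomes a list head; lines indented further below it become its
// children. A line with no children is emitted verbatim, so leaves that are
// themselves compound should already carry their own parens.
// (Invented mini-format for illustration only.)
function indentToSexpr(src) {
  const lines = src.split("\n").filter(l => l.trim() !== "");
  let pos = 0;

  function parseBlock(indent) {
    const head = lines[pos++].trim();
    const children = [];
    while (pos < lines.length && lines[pos].search(/\S/) > indent) {
      children.push(parseBlock(lines[pos].search(/\S/)));
    }
    return children.length === 0
      ? head
      : "(" + [head, ...children].join(" ") + ")";
  }

  return parseBlock(lines[0].search(/\S/));   // translates the first top-level form
}

const src = [
  "define (fact n)",
  "  if (= n 0)",
  "    1",
  "    (* n (fact (- n 1)))"
].join("\n");

console.log(indentToSexpr(src));
// (define (fact n) (if (= n 0) 1 (* n (fact (- n 1)))))
```

The reverse direction (pretty-printing sexprs back into indented form) is similarly mechanical, which is what makes the wish plausible.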

### It's not even February yet,

It's not even February yet, and we have a (reluctant) prediction come true.

### Woohoo! I win

Where's my giant check?

### Only if you find a banker

Only if you find a banker who can read Church numerals...

...currency?

Good one!
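For any bankers in the audience: Church numerals really can be "read", with a little work. A quick JavaScript sketch (names invented; a numeral n is the function that applies f to x n times):

```javascript
// Church numerals: the numeral n applies its first argument n times.
const zero = f => x => x;
const succ = n => f => x => f(n(f)(x));
const add  = m => n => f => x => m(f)(n(f)(x));

// A banker "reads" a numeral by applying it to increment-by-one, starting at 0.
const toInt = n => n(x => x + 1)(0);

const one = succ(zero);
const two = succ(one);
console.log(toInt(add(two)(two)));   // 4
```

(λx.xx)(λx.xx), on the other hand, is not a numeral at all; attempting to cash that check never terminates.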

### Hello,

Can I cash this check for (λx.xx)(λx.xx) dollars?

You know, I learned Erlang because I heard Armstrong said it was a currency oriented language.

### Then it was a con-currency...

Then it was a con-currency...

### Some C/C++ predictions

* The forthcoming C++ standard will be ratified without any major changes. Many additional features not part of the official standard will be "de facto" features of the language, as they will be supported in similar fashion by all major compiler vendors. Many of these features will be considered for future revisions of the standard.

* More and more C++ apps will be written with a garbage collector in mind; many third-party libraries will be shipped that will require garbage collection (lotsa "new", no "delete").

* C will diverge more from C++. As C++ implementations improve, many application areas where C has been a major contributor will see more and more projects move to C++. Numerical programming will be a big one; as C99 features like "restrict" are unofficially supported by C++ implementations, and C++ generics and template metaprogramming give C++ a huge productivity advantage without a corresponding performance disadvantage. As a result...going out on a limb really far here... actually, outright BSing here...:

** C will in the future add support for generics and overloading; including the ability to overload operators. Some further incompatibilities with C++ will be introduced by this effort.
** C will avoid adding anything, however, that complicates the runtime model.
** C will not, however, embrace OO in any meaningful way:
*** It will maintain a clean separation between data and functions. Structs will not get methods or static elements--i.e. they won't turn at all into classes. Functions will continue to only exist at the outermost scope. There will be no such thing as a "this" pointer; all arguments to functions will be explicit.
*** Neither inheritance, nor any other form of subtyping, will be added to C. C may add ways to compose data structures to permit code reuse.
*** C will not include any built-in support for any sort of dynamic dispatch, introspection, or reflection. No virtual anything.
*** C will finally add something similar to constructors/destructors; however, keeping with the clean separation of functions and data, these will be implemented as regular functions with special names, not as struct members. C will also get something similar to placement new/delete in C++, to allow structs to be initialized or cleaned up as part of a dynamic allocation scheme. It may even get keywords which function similarly to new/delete, but which will work differently.
*** C may inherit anonymous unions from C++; it won't, however, get true algebraic sums (see note above about complicating runtime).
*** C will add ways to demarcate functions without side-effects or external dependencies. It won't get closures or lambdas. It may (subject to the constraints above) find a way for datatypes to overload "operator ()", thus allowing the creation of things which resemble functions--if it does, this will be a major source of pain for C/C++ interoperability.
*** C will not add support for exceptions, coroutines, continuations, or any other non-local, non-LIFO flow control. C might get mandatory TCO.
*** In addition to language-level generics, the capabilities of the C preprocessor will be expanded. New types of macros which "understand" curly braces and angle brackets will be introduced. Hygienic macro arguments will be introduced. Macros will be able to include conditionals, and won't be constrained to one line. Macros will still work at the textual level, rather than the language-syntax level.
** Eventually, people will stop referring to C/C++ as "C/C++". A crippled mutually-compatible subset of the two languages will continue to exist, however. The C++ community will bemoan this departure; however, legions of diehard C programmers will continue to insist that C is not a subset of C++, and that C is and should be perfectly able to do things differently than C++. Many will suspect (as was the case with C99) that incompatibilities with C++ were intentionally introduced in order to ensure that C++ would never again be regarded as a superset of C.
** These changes will bring some new interest in C, and C will continue to be deployed in areas where a minimal runtime footprint is essential. Much open source infrastructure (the Linux kernel, gcc/glibc/binutils, GTK) will continue to be written in C. Some of the new features in C will be re-imported into C++; the latter will still be preferred for general-purpose projects where a higher-level language isn't more appropriate.

### Great stuff. That's the kind

Great stuff.

That's the kind of thing I had in mind...

### More and more C++ apps will

More and more C++ apps will be written with a garbage collector in mind; many third-party libraries will be shipped that will require garbage collection (lotsa "new", no "delete").

I strongly disagree. My prediction is that no C++ third party library will use garbage collection, i.e. things will continue to be as they are today.

Unless, of course, you mean shared ptrs.

My prediction is that nothing will change in the C/C++ beyond what is already planned.

### +1

Agreed: GC is not part of C++ 200X, so it'll take a long time to be integrated into C++ (if ever), and definitely not in 2008..

### Many people use Boehm today

I agree that few *libraries* will require GC, as that constrains any app that uses them.

I can see frameworks appearing that will require GC. And as the comment indicates, many C++ apps use GC already.

(In my comment, GC means tracing GC, as opposed to implementations using only refcounting).

From what I've heard out of the committee, the reasons for not considering GC for the next revision of the standard have to do mainly with the amount of work the committee is willing to take on; not because of technical obstacles or committee resistance to GC. It is quite common in the C++ world for the committee to standardize on things which have already become informal "standards" among C++ vendors. STL existed long before being standardized (and several parts of it remain outside the official standard library). Boost is in much the same place today. Boehm is used quite a bit today; many C++ toolchains ship with it preconfigured.

### Proper tail calls in C

Scott wrote, "C might get mandatory TCO."

I like that prediction. It's surprising, yet it has the ring of truth.

"Hygenic macro arguments will be introduced."

That, on the other hand, strikes me as ludicrous. :)

I predict that 2008 will see the first major U.S. sports franchise named after a PLT concept. Probably the Cambridge Arrows.

### a) POPLmark progress, preservation, puns b) quiet GADTs c) no hype

The Coq tutorial at POPL08 (as well as the UPenn project) will lead to more formalization of metatheory overall and more use of Coq, locally nameless representations, and cofinite quantification (over other mechanized metatheory tools and techniques).

GADTs in C# 3.0 will not catch on.

There will be no great language hype even comparable to that of Ruby (and Rails).

### Language prognostics (plus Google trends!)

I have put up an article about the future of languages.

The graphs of the popularity of search terms for various languages are interesting (generated using Google Trends). Click on the graphs to see up-to-date data (I wrote the article about 3 months ago but didn't release it).

My summary for the next five years is:

• Java will become more entrenched.
• Browser- and phone-based solutions will primarily use HTML/CSS/Javascript or close descendants. Other single-vendor solutions will fail due to conflict of interest (a vendor's language ecology doesn't compete with an ecology based on independent standards).
• No functional language (e.g. F#, Scala, Haskell) will become significantly popular. For various reasons they don't produce a sufficiently large ecology of programmers, tools, libraries and applications.

### javascript 2.0 predictions

Javascript 2.0, with its added support for nominal/structural types, reflection/meta-programming, genericity, parametricity, improved scoping, namespaces, modules/packages/program units, generators, comprehensions, and invoke/set/get hooks, will continue to dominate the front-end web development sector (over Redmond's proprietary technology) and gain even more momentum. In addition, it'll acquire some useful concurrency-oriented features by the end of the year or the beginning of next year, then a RoR-like framework shortly thereafter, and will thus offer the possibility of a highly portable, much more comprehensive end-to-end solution using mainly one PL.

Prediction or fantasy? Soon to be fantasy if Redmond continues to do what it does best and the IE team bullies JS2 out of its lunch ;)

### Executable UML will go Mainstream

Model driven architecture will start to gain mass market appeal as open-source model compilers for UML become freely available. Action languages for UML will take a cue from Scala and blend functional programming techniques with an object-oriented programming language.