## Want to learn something new

Hello LtU's,

I'm new here, although I've been visiting this site for more than a year. I'm a CS student, interested in programming languages and compiler construction. I'm good at C, Java, C#, it doesn't matter. I love Lisp and have read more about it than about any other language, but I never really used it much. I also love Ruby and try to use it everywhere I can.

The thing is, ever since I learned Ruby, I never had the urge to learn any other programming language, and I'm starting to feel like I'm missing something.

So, is it Scala? Haskell? Nemerle, OCaml? Anything else? Which one should I learn now to become a better programmer?

Thanks,

Luís Pureza

P.S.: By the way... any book recommendations? Thanks again.
P.S.2: Wondering if this is the right place to ask this... Please, feel free to delete this post if it isn't.

## Comment viewing options

### This isn't exactly on topic

This isn't exactly on topic for LtU, but I think posting some suggestions would be ok, so long as the thread doesn't become a holy war between advocates for different languages...

If you aren't familiar with Haskell, I suggest trying to get a feel for the language. You may decide not to use it in place of Ruby, but the educational value will be great. To really answer your question, we'd need to know your goals in learning a new language.

### 10 Programming Languages You Should Learn

As good an excuse as any to post a link to an article I ran into a while back on 10 Programming Languages You Should Learn. That list has the choices of PHP, C#, AJAX, JavaScript, Perl, C, Ruby, Java, Python, and VB.Net. But this turns out to be a rather boring list for me, since I've used all of these languages (other than Ruby).

My subjective list of 10 would be something like (in no particular order): Eiffel, Smalltalk, Lua, Oz, Alice ML, OCaml, Scheme, Haskell, Icon, Mercury.

### Teach Yourself Programming in Ten Years

Great page by Peter Norvig (director of research at Google, author of PAIP).

### Plug the type system gap

Given that the languages you mention all either have weak old-style static type systems (C, Java, C#), or no type systems at all (Lisp, Ruby), I'd say you should learn one of the "big three" polymorphically-typed functional languages: Haskell, SML, or OCaml.

One of the biggest benefits of those languages is that their type systems help (read: force) you to reason about code in important ways that you can all too easily avoid otherwise. There's no question that learning to program well in one of those languages will improve your ability as a programmer, and improve your understanding of programming. Familiarity with one or more of these languages will also make it possible to understand many academic CS papers more easily.

As for which one of them to choose, you could let your own preferences guide you. Haskell has some of the most advanced type system gadgetry anywhere, but that's not necessarily what you want to get into right away. OCaml is probably the one used most in non-academic projects. SML has a formal definition and a number of excellent implementations.

Note that I'm not making a "static vs. dynamic typing" argument here — I'm saying that to round out your experience, the polymorphically-typed functional languages are essential.

As for book recommendations, see the Getting Started thread. One book linked from that thread which could make sense for you is TAPL, since it teaches PL type theory using OCaml.

### Stepping stones

I'd actually recommend learning Erlang before Haskell or OCaml, because it helps plug the gap between dynamically typed multiple assignment (Ruby) and algebraic data types with pattern matching. I was hopelessly lost the first time I looked at Haskell, because it includes all sorts of syntactic sugar that aren't really necessary to understand pattern-matching or ADTs (n+k patterns, guards, @ patterns, : for cons, definitional syntax, etc). Stick Erlang in there, and you can think of it like this:

• Erlang pattern matching is equivalent to multiple assignment in Ruby or Python, except that patterns may contain literals. If a pattern fails to match (you tried to assign a value to a literal, and value != literal), execution continues with the next pattern.
• OCaml algebraic data types are equivalent to Erlang tuples & pattern matching, where the first atom in each tuple is the constructor. A "type" represents a set of alternatives; it specifies the legal values that a variable can have.
• Haskell patterns are just like OCaml's, but with some additional syntactic sugar: :: (OCaml) = : (Haskell) = cons (Lisp); [elem1, elem2, elem3] is shorthand for (elem1 : elem2 : elem3 : []); a pattern like n + 1 matches any value ≥ 1 and binds n to value - 1; and so on.

Also, Erlang is purely functional, so it forces you to start thinking in terms of functions and tail recursion without introducing all the syntactic sugar that Haskell and OCaml have that lets you avoid doing so. And the core Erlang language is quite small; it was much easier for me to digest than either Haskell or OCaml. (SML may work in that regard too; I don't know it.)

### SML

And the core Erlang language is quite small; it was much easier for me to digest than either Haskell or OCaml. (SML may work in that regard too; I don't know it.)

Yes, I'd say that SML is significantly simpler than both Haskell and OCaml, and correspondingly easier to learn and master.

### SML Also Has...

...the support of the great ML for the Working Programmer text, which seems to me ideal for those approaching SML with a pretty good foundation in the currently popular languages.

### Just jump! :)

I was hopelessly lost the first time I looked at Haskell, because it includes all sorts of syntactic sugar that aren't really necessary to understand pattern-matching or ADTs (n+k patterns, guards, @ patterns, : for cons, definitional syntax, etc).

I agree, Haskell can be confusing, but some people do seem to manage to get into it on their first try, so there's a YMMV factor here. It partly seems to depend on whether you're good at ignoring things you don't yet understand.

I also agree that Erlang is a reasonable addition to the list, but I say that more because of its approach to concurrency than because of the typing aspect. I can see the need for stepping stones to get to Haskell, but that shouldn't be as necessary for SML or even OCaml. I agree with Vesa that SML has an advantage in that there's no danger of being distracted by things like objects or an overdose of syntactic sugar.

### One More Thing...

...we should add, and that's that the choice of either Standard ML or O'Caml is somewhat pragmatic: they are functional languages, but not purely so. Haskell or Concurrent Clean are the languages of choice if your question is "What could happen if you followed 'purely functional' to its (il)logical(?) conclusion?" whereas SML and O'Caml fall into the "A FORTRAN programmer can write FORTRAN in any language" category: you can write straightforward imperative code in them, and then rightly wonder what all the fuss is about. :-) Hence the need for good learning materials in addition to the language, as mentioned elsewhere in this thread.

### Erlang purely functional?

Is Erlang really purely functional? I previously thought it was somewhere in the ballpark of, say, ML.

As for the syntactic sugar that doesn't look functional, are you referring to list comprehensions and the like? If that's what you mean, I'd have to agree that I find those kinds of abstractions distracting.

### degrees of purity

People have criticized OO because there is no clear set of features that make a language OO, so the definition of OO can't be pinned down. It seems that FP has the same problem. What makes a language functional (or purely functional)?

Functions as first class values? (If so, many OO languages qualify.) No mutating of local variables? No mutating of fields in data objects? No side effecting calls? Automatic currying? Tail call optimization? Static typing with some form of polymorphism, subtyping and/or type inference?

FP seems as much a feature grab bag as OO. In both cases it seems that it is the preponderance of the features that decides.

### Doesn't Erlang have

Doesn't Erlang have uncontrolled side effects for I/O and such?

That's what I meant as a rubric for determining purity, if I am correct in that assumption. I agree that sometimes purity is a bit of a rhetorical point, but judging from some of the comments I've read here, a "purely functional" language is defined as a language that lacks any side effects that break its algebraic properties.

By the way, I agree that the important parts of FP are the abstractions. I do most of my programming in Scheme, which most people seem to say is "less functional" than Erlang in general. Although I could add all of the neat features of Haskell to Scheme, it's so much work that sometimes I wish someone else would just write it for me, either in the form of a library or a new language. (Maybe I should just use Haskell or Joy, but I am lazy.) It's really hard to write purely functional code for I/O in Scheme, for example. In Erlang I think the situation is similar, but to a lesser extent, since I think function definitions are immutable.

Another note: when I was talking about the list comprehensions, I wasn't talking about Erlang, I was responding to his comment about distractions in Haskell.

### All the side effects in

All the side effects in Erlang have to do with concurrency, and that seems to me to justify them, as the whole point of the concurrency is to make the system somewhat nondeterministic (allowing I/O in undetermined order). And that, I think, makes Erlang more pure than Lisp or SML or OCaml or...

### Ok, that makes sense.

Ok, that makes sense. Thanks for clarifying.

### Huh?

Lisp has no type system at all? Please explain.

### Contentious Point

Regrettably for the purposes of communication among programming language researchers, there is a discipline, called "Type Theory," that predates electronic computers—it evolved from Mathematical Logic in the era of Russell and Whitehead's "Principia Mathematica" in order to resolve various paradoxes, such as the misnamed "Russell's Paradox," that arose at the time. In the early days of programming language design there wasn't an explicit connection between a language's type system and Type Theory, but around 1980 the Curry-Howard Isomorphism was identified, and since then, to one degree or another, some language designers have been explicit about the relationship between their type systems and Type Theory.

The crucial point is that, since the purpose of Type Theory is to disallow meaningless statements in Mathematical Logic, the Curry-Howard Isomorphism implies, to the majority of language designers who are familiar with it, at least one phase distinction, with the phases popularly called "compile time" and "runtime," and the type system's job is to disallow "meaningless statements" in the language from being accepted by the compiler—that is, it's generally understood by those familiar with the Curry-Howard Isomorphism that a "type system" is static. Since Lisp lacks a static type system, it is understood by most language designers involved in "type systems" in the Type Theory sense not to have a type system at all.

As has been discussed extensively here on LtU, this perspective, while perfectly valid, hinders communication (of which I have been among the most guilty parties) and, perhaps even more importantly, neglects the entire arena in which constructive blurring of the phase distinction is taking place, e.g. in the realm of partial evaluation or, more generally, multi-stage programming. In the case of Common Lisp, it also ignores the fact that, in most implementations, you can express type constraints at compile time and, if they are violated, get an error at compile time, just as in any other statically-typed language. The challenge for type theorists, IMHO, is to extend our understanding of Type Theory to accommodate richer contexts than just the classic two-distinct-phases one; the challenge for the dynamic language community, IMHO, is to recognize that there's value in mandating that some classes of errors be caught at compile time, even if the programmer's "intuition" tells them that the "error" is, in fact, not an error semantically.

### Phase distinction

The challenge for type theorists, IMHO, is to extend our understanding of Type Theory to accommodate richer contexts than just the classic two-distinct-phases one;

Note that most systems with dependent types never had such a phase distinction. Still, they enjoy fundamental soundness properties, which cannot be said of any "dynamically typed" language I'm aware of.

### Explanation

I meant "no type system" in the sense described in Cardelli's paper Type Systems. I should have qualified that statement, but the qualification was implied by the rest of that comment, i.e. the type systems of languages such as Haskell and ML are the kind of type system which Lisp lacks, which is why those languages are worth learning if your prior exposure has been to languages like Lisp.

Common Lisp implementations which support explicit type annotations don't change this situation significantly, since neither they nor the CL language specification provide, in general, a means to assign a (nontrivial) type to program terms that don't have type annotations. In this context, "nontrivial" means, at least, some type other than "any value", a.k.a. a universal type.

(Arguably, simple types such as integers, strings and symbols could also be considered trivial, with the real test being how a language deals with more complex types, particularly those of higher-order functions.)

### Thanks

Thanks for the reply and clarification. (You, too, Paul.)

I've tried to read a few of Cardelli's papers in the past. I usually get lost pretty quickly, but I'll give "Type Systems" the ol' college try.

### For you and for everyone...

I'm sure lots of people here would be glad to help if you have questions ...

### Indeed!

And when the subject is type systems, the best intro, IMHO, is still TAPL.

### Prolog

So, is it Scala? Haskell? Nemerle, OCaml? Anything else? Which one should I learn now to become a better programmer?
For something different, take a look at Prolog and logic programming. For a book, I liked The Art of Prolog.

### Perhaps Scala or D

Depends on what you're looking for: if you're looking for a 'practical' language, then D may be interesting to you.

I found that Scala is a nice blend of OO/functional language, but note that this is a research language and I think it'll stay that way.

I've tried to learn OCaml but didn't like the syntax (and the book was "lying": the introduction said that OCaml is a mix of imperative/functional language, but then it introduced only the functional part), and Haskell scares me.

### Scala

I found that Scala is a nice blend of OO/functional language, but note that this is a research language and I think it'll stay that way.

or possibly be released commercially as Java 12.

### Which Book?

All the O'Caml books I've seen do indeed deal with references, mutable arrays and records, etc. It would be difficult not to: O'Caml doesn't provide built-in monad support, and the type system lacks Concurrent Clean's "uniqueness types," so "imperative features," i.e. side-effects, are essentially the only way to do a broad range of things in O'Caml.

### What is a research language?

I found that Scala is a nice blend of OO/functional language, but note that this is a research language and I think it'll stay that way.

In your opinion, what constitutes a research language? Differently put, what would you expect to see for a language which is not research?

### Note that it's not me who

Note that it's not me who says that this is a 'research language' but the author of the language.
As for staying that way: if memory serves, Scala just had quite a big syntax change, and a 'not research' language should be stable.
To be fair, Scala's compiler had some switch to help port to the new syntax.

In general, what I would expect from a 'not research' language is an emphasis on the libraries around it: a language by itself is not very useful if it doesn't have libraries around it for GUIs, databases, etc.

Sure, Scala can use Java's libraries, but from a 'real world usage' point of view, I fear that building a big application with a mix of Java libraries and Scala could hinder maintainability.

### Re: Note that it's not me who

Note that it's not me who says that this is a 'research language' but the author of the language.

Somewhat humorous, as your post is in response to the designer of Scala, who's asking why you consider it to be a research language. :-)

In my view, Scala is a blend language - one that tries to bridge what you might label as "research" and "real" worlds - though I'd prefer to see it as a bridge between OOP and FP.

### I was sure that I've read

I was sure that I'd read that Scala was seen as a research language; I have probably confused it with one of the many languages I've looked at. My apologies for the mistake.

I agree with you that Scala is a blend language, and a nice one too. I have a mostly imperative mindset and don't much like functional languages, but Scala has caught my attention.

Good to know that it can be considered for serious usage (I still remember the stupid Pascal language, which had too many differences between implementations and too little standardisation to be truly useful).

### what constitutes a research language?

What would we mean by the labels "research" and "real"?

Is saying something is a research language just a way to dismiss it from consideration?

Is saying something is not a research language shorthand for saying it's been around for 15 years and we guess that an unknown but significant number of commercial projects have been completed using it?

Is this about some property of the language implementation at all? Is it a bundle of assumptions about network effects? Is it simply happenstance?

### Research languages

Is saying something is a research language just a way to dismiss it from consideration?

Certainly not. It will probably get it dismissed from consideration for large and risky projects, however, unless the project has very special needs.

Is this about some property of the language implementation at all?

By saying "the" language implementation, you're already showing a bit of a bias. I can't think of any language that has evolved beyond the "research" stage that doesn't have multiple, independent implementations.

Common differentiators of 'research' and 'production' languages include:

• size and quality of library support
• breadth and quality of tool support
• breadth and quality of documentation (no, a language spec and an introduction don't cut it)
• size of community (including the number of third-party players trying to sell to the community)
• size of recruiting pool (if I can't hire a maintenance programmer for the language within four weeks, it's a research language)

Things that you would imagine might distinguish research from production languages, but in practice don't, include:

• quality of the language spec
• quality of any given implementation
• presence or absence of any given feature
• existence of successful projects written in the language

Is it a bundle of assumptions about network effects?

I would call it a bundle of network effects. No need to call them "assumptions" when they are so easy to see in practice.

Is it simply happenstance?

No, it takes an enormous amount of money, time, and effort to create a viable production language. Whether a given language attracts the necessary money, time, and effort is certainly partially chance (anything with so low a success ratio is going to be strongly influenced by chance), but also does depend on the language, the needs it fills, and the state of the computing environment in which it is introduced.

### By saying "the" language

By saying "the" language implementation, you're already showing a bit of a bias. I can't think of any language that has evolved beyond the "research" stage that doesn't have multiple, independent implementations.

Perl?

...O'Caml?

### Not exactly

Perl 6, which is sadly still in the research stage, has at least two implementations under active development (Pugs, MiniPerl6), with a couple of others waiting for better times. But one could also say that Perl 6 and Perl 5 are different languages.

### Differentiators

Common differentiators of 'research' and 'production' languages include:

• Stability of language syntax and semantics

### terms of convenience

Mostly I was interested in whether someone had crisp definitions for research language and production language, because they seem like terms of convenience which we can bend to suit our purpose - basic post hoc justifications.

The common differentiators you list are certainly dimensions on which we might seek to compare languages - but where on those dimensions does a language flip from research language to production language?

### production

As soon as someone beats an implementation into good shape for the programs that I want to write, and proves it by writing one. :-)

### One self-described research language...

...would be one intended to serve as an experimental platform for the C# language. In the Java world, there are a number of languages that seek to extend Java with different capabilities. For example, GJ implemented extensions for generic programming, much of which became incorporated into JDK5.

Since the question of research languages in the current thread originally arose from Scala, I think the answer to the current question of whether Scala is a research language should be framed in those terms. Is Scala designed to be an independent language from Java? Or is Scala intended as a proof-of-concept experiment aimed at influencing future extensions to the Java language? Just guessing, but I'd speculate that the answer from the designers of Scala is likely to be yes for both questions.

### self-described

Yes, there are self-described research languages where the intent is clear and limited.

The question is less clear when the language is not self-described as a research language, when that judgement is being made by other people and not by the language designers ;-)

### Not a judgement.

when that judgement is being made by other people and not by the language designers ;-)

Note that this 'judgement' was not a judgement at all, but my mistake: I thought that Scala's designer had said it was made for research purposes (a confusion), and as Scala just had a big syntax change, this reinforced my mistake.

I already apologised for this mistake.

### Re: Not a judgement

Your original mail started a useful discussion, so I am glad you sent it.

### Is Scala a research language?

Chris Rathman wrote:

> Since the question of research languages in the current thread
> originally arose from Scala, I think the answer to the current
> question of whether Scala is a research language should be framed
> in those terms. Is Scala designed to be an independent language from
> Java? Or is Scala intended as a proof-of-concept experiment aimed at
> influencing future extensions to the Java language? Just guessing,
> but I'd speculate that the answer from the designers of Scala
> is likely to be yes for both questions.
>

I was on the road, and am therefore coming back late to the discussion. Actually, I would answer yes and no. Yes: Scala is designed to be an independent language from Java; it just piggybacks on the JVM for its libraries and JIT compiler. Also, there might again be a version for .NET in the future. No: Scala is not intended as a proof-of-concept experiment aimed at influencing future extensions to the Java language. It's too far removed from Java to be very useful for that. If the Java language designers or anybody else want to pick up some concept from Scala to incorporate in their future language designs, that's great, but it's not one of our primary goals.

I think Scala started out as a research language 3 years ago. But now we have gained confidence, and would like to make it as widely usable as possible. The aim is to grow a user community that can sustain itself. Dave Griffith's list of differentiators is very useful in that respect. We are working on them. Most important in my mind is a real book about programming in Scala (I mean, one you can buy in the bookstores).

### Begging the question

"post hoc justifications", "terms of convenience" and "bend to suit our purposes"? The point of differentiating research and production languages is to give an indication of relative riskiness. Like it or not, evaluating tool choices based on riskiness is actually an extremely important activity for many projects. Moreover, it's one that actually is done with good faith by competent engineers, every single day. Inevitably, simplifying assumptions get made in this risk-management task, but that doesn't imply that the task is done shoddily, and certainly doesn't imply bad faith.

I'm truly not understanding what's causing you to gear up with the heavy verbiage about this issue.

### Sorry Dave, but your

Sorry Dave, but your discourse makes no sense to me. The only difference that can be drawn between "research languages" and those for "productive usage" is simply that the latter were designed for solving special problems in an existing production environment, while the others are designed to explore programming language features. Most of the non-research languages are actually domain specific. They are scripting languages for computer algebra systems, test tools, game engines, 3D modelers, or Web browsers. There are uncountably many languages that come and go with the single tool/environment they were designed for; not even Lambda the Ultimate knows them all, and you won't find them in job offers, because it's clear that no one knows them outside of a few companies. They live and die silently.

Otherwise, a research language and its environment can become pop culture, and its combatants may attempt world domination. It can be advertised in its own mythology, it can be industrial strength and justified with all kinds of pseudo-scientific arguments, but it stays a research language nevertheless. Haskell would be a good candidate these days. But why not also mention C++ in its springtime?

### Huhh?

The only difference that can be made between "research languages" and those for "productive usage" is simply that the latter were designed for solving special problems in an existing production environment while the others are designed to explore programming language features.

I almost mentioned these in my initial response, but it was already overly long. I realize there are innumerable "little languages" custom-cut for use in production systems. While obviously not "research languages", I'm not willing to call them "production languages" either. In addition to their smallness of scope, these little languages are usually poorly designed/implemented/documented in comparison to either production or research languages.

### evaluation? what evaluation?

Like it or not, evaluating tool choices based on riskiness...

I like evaluations; I don't particularly like flat assertions ;-)

I'm truly not understanding...

I've seen other engineers use "it's not a production language" as a dismissal without evaluation.

Where on the dimensions you put forward does a language flip from research language to production language?

The endpoints are easy enough - self-described research languages and long-established languages - but what about the gray zone between those endpoints?

### IMO trying to erect these

IMO trying to erect these distinctions is unhelpful and a waste of time. Languages and language implementations differ in many aspects, and each project has specific requirements. Use what best matches your particular needs.

### these distinctions

I think trying to erect these distinctions and not succeeding is lesson enough.

### Not a technical distinction

These distinctions are an unquestioned part of reality for vast communities. However, they are more social than technical. If you really want to take the trouble to try to pin them down, it's possible to meaningfully do so, but the result is going to involve descriptions of what managers in charge of hiring think, dollar amounts devoted to marketing, etc. Probably not really a topic that can meaningfully be grappled with here, but OTOH, dismissing such distinctions as unhelpful may be misleading, if the goal is to gain an understanding of what's going on out there.

These are the sort of distinctions which could perhaps best be expressed using the "You know you're a redneck when..." format, as in "You know you're a production language when the biggest software company in the world makes a compiler for you", or "you know you're a research language when one of your biggest proponents works for the biggest software company in the world, but they still don't sell the language in a shrinkwrap box."

### these distinctions

are absolutely necessary, and part of lived reality for the vast majority of working developers. They are also very fluid, based on idiosyncratic knowledge bases and risk-aversion profiles. Insulting those who try to explore these distinctions, or impugning their motives, won't change any of that even slightly.

### To Become A Better Programmer

Insulting those who try to explore these distinctions, or impugning their motives, won't change any of that even slightly.

Since I started my days on LtU as a voice for "pragmatic programmers", I may be just the guy to respond to this.

The level and kind of support for a particular language can be an important factor in choosing it for a particular production project: no one here would dispute that.

If the question by the original poster had been "What language should I learn to get a job?" or "I work in industry X: what are the most common languages in use there, and are there any technical arguments to support the use of one or the other?", the question of research vs. production languages might be a relevant factor (though the job question is probably OT for LtU).

But the original question was "What language should I learn to become a better programmer?". In the context of answering this particular question, choosing a language with new ideas that you are not familiar with is probably the most important factor, whereas the level of production support for that language is probably irrelevant. That may even give the edge to "research" languages that are experimenting with the latest ideas, or with old ideas in a new context, and so haven't "taken over the world" yet.

In such a discussion, saying "don't bother with that: it's just a research language", or even trying to make the distinction at all, could be perceived as divisive, or unhelpful.

After all, ALL PLs started life as "research" languages. ;-)

### Shameless plug .. but maybe meritorious.

The Cat programming language is also an example of a blend of programming styles ... it is strongly typed and functional, but it is also stack-oriented.

I am mentioning it here not just for purely self-promotional reasons, but because I honestly believe it might help in making the leap to more advanced strongly typed functional languages (like Scala, F#, Haskell, and OCaml).

Cat is hopefully not too hard to learn; you can read the tutorial online, or just start poking around using the "help" and "man" commands. I'd be curious what your experience of learning the language is, given that you have some background in Lisp.

I would then recommend learning Scala next. Scala allows you to take baby steps into the world of languages with advanced type systems, by introducing a very familiar syntax. In the short time that I studied Scala, I quickly became very competent.

On the subject of "research languages", it is my humble opinion that Scala (along with perhaps F#) offers the most promise as a strongly typed language which scales well to large-scale software development (hence the name!), where developers come and go and the code base remains easily understandable by developers of various levels of expertise. Part of the Scala appeal is that it provides access to a massive pre-existing and well-known Java code base: a key component of mainstream language success.

You'll be surprised.

http://squeak.org

### Or...

I recommend Oz as a 'new and different' language to learn if you are just trying to broaden your horizons, though of course it's good to know too if you actually want to get stuff done. It's just so saturated with neat toys that it makes a nice stepping stone to many other languages later on. Its online documentation is also world class, and the Mozart implementation is very coherent, comprehensive, easy to understand, and just plain fun.

### Indeed

Don't forget to mention an odd construct present in Oz (I haven't seen it in other languages): dataflow concurrency, where accessing an unbound variable waits until it is bound.
Also, the book "Concepts, Techniques, and Models of Computer Programming" is a great introduction to the language and a broad collection of concepts.
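For readers who haven't seen it, the dataflow behavior can be approximated in a conventional language. Here is a minimal single-assignment variable sketched in Python with a `threading.Event` (the `DataflowVar` class is an illustration, not Oz's actual semantics, which extend to unification and logic variables):

```python
import threading

class DataflowVar:
    """Single-assignment variable: reads block until the variable is bound."""
    def __init__(self):
        self._bound = threading.Event()
        self._value = None

    def bind(self, value):
        if self._bound.is_set():
            raise ValueError("dataflow variable already bound")
        self._value = value
        self._bound.set()

    def read(self):
        self._bound.wait()   # the reading thread suspends until bind() runs
        return self._value

x = DataflowVar()
results = []

# The consumer may run first; it simply blocks inside read() until x is bound.
consumer = threading.Thread(target=lambda: results.append(x.read() + 1))
consumer.start()
x.bind(41)       # binding x wakes the blocked consumer
consumer.join()
print(results[0])  # → 42
```

In Oz this synchronization is implicit in every variable access, which is what makes declarative concurrency so lightweight there; the sketch above only mimics it for one explicit cell.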

For someone who's stuck with Java at work, I find it more interesting to learn languages that I can incorporate into my daily tasks. I used Jython, for example, to write a DSL to monitor J2EE servers through JMX; I could also have used JRuby. Even more interesting is Scala, because it suits the mindset (at least somewhat) of those who are against dynamically typed languages and feel more secure with a compiler within hand's reach.
IMHO, Scala is actually the best "Java wrapper", and I really like its expressiveness (have a look at views, the pilib package, ...). In terms of performance, Scala is also much better than JRuby and Jython.

Another recommendation is Perl 6 (Pugs), which is also a multi-paradigm language (look up hyperoperators, junctions, rules, ...).

### Another recommendation is

Another recommendation is Perl 6 (Pugs), which is also a multi-paradigm language (look up hyperoperators, junctions, rules, ...).

Perl makes me neurotic. It is both fun and it scares me. It is so far away from the petit-bourgeois puritanism and pornographic utilitarianism that rule everything these days, but it also lacks the beauty of true decadence. ASCII is definitely not made for Perl, but how else to express the baroque ideal in programming than by being terse in ASCII?

### perhaps Coq?

It might be open to question whether Coq should be thought of primarily as a programming language, but it's at least partly in the same problem space. And especially when people are answering "learn *foo* to learn about type systems," I think Coq, or some other prover based on the Curry-Howard isomorphism, would be a reasonable way to do it.

(I have studied Coq a bit myself, but I am no expert. I also don't know enough about other proof systems to make this a recommendation of Coq vs. alternatives like HOL Light. I checked _Coq'Art_ out of the library and got interested enough to set out to prove the fundamental theorem of arithmetic in it. I ended up stopping short of that, and only proving that every nonprime has a prime factor, but still I learned a lot, especially about type systems and logic. I stopped my proof when I realized (1) how many advanced topics get pulled into the project and (2) that after I had too many times dug myself out of holes by reading other people's solutions, I was sufficiently familiar with other people's solutions that it was no longer fun to try to come up with my own solution. I think _Coq'Art_ is a pretty good book but beware that to me, at least, it seldom conveyed the limitations of the techniques taught in each chapter. For example, around chapter 6 I thought I knew enough to prove my theorem, but in fact key things end up being handled more naturally with hairier techniques: in particular, people seem to handle Euclidean division using techniques related to chapter 15.)

(If not a theorem proving system, then perhaps Erlang, Haskell, or STL (and that subset of C++ which makes it possible).)

### I recommend Alloy.

It's not a programming language proper, but a declarative language for describing and automatically analyzing relational models. It's an academic tool that is actually extremely useful to debug your thinking, as opposed to debugging your implementation, which is much easier...

### VHDL

I recommend VHDL. It clearly debugs your thinking. It is no academic BS that you will never hear about again, but an established language, and you can do really amazing things with it, such as defining your own processor architecture. It requires highly concurrent programming abilities and much diligence.

### breaking the von Neumann bottleneck...

Yes, if you really want to get your hands dirty working in a highly parallel universe, start playing with VHDL or Verilog. That seems like the easiest way to break away from the von Neumann bottleneck, especially since FPGAs are starting to add things like hardware multipliers now.

### Barrier to entry is lower than people think

In the past I had perceived the barrier to entry into FPGA programming as very high, requiring thousands of dollars' worth of equipment. I was surprised when I found a development board for under $50 and a complete development kit, including VHDL and Verilog support, for nothing. Today the barrier to entry is actually around $30, assuming you have a PC. Actually, it's even lower: you can program a CPLD (costing around $5) in Verilog or VHDL with a few wires connected to a parallel port, but I don't recommend starting that way. I found learning basic VHDL and Verilog pretty mind-expanding, although I haven't figured out if I can transfer any of the knowledge I acquired to programming.