excitement in language research?

I'm starting grad school in computer science next year and I'm hoping to focus on languages. Needless to say, LtU is a daily favorite. The contributors here have introduced me to many interesting papers and discussions.

I'm curious to hear from people who are working in the field, going to conferences, etc... What are some active areas in language research today? What do you think are the most exciting recent developments?

What about visual programming? Multi-paradigm programming (or programming 'paradigms' period)? Compiler design and optimization? Language support for concurrency? Formalisms for computation, translation, type-systems...

I hope this isn't too intense. Thanks in advance.

Not in the field, but...

Here's my personal take as a language junkie outside of academia:

* Focus on getting real implementations up and running from the get-go.
* Compiler design and optimization is only interesting in the context of difficult-to-compile languages. Getting Python fast is more difficult and exciting than getting C fast.
* Concurrency is a big deal that will only get bigger.
* Research into type systems is a black hole that pulls good minds away from more useful work, but maybe that's just me :)

Focus on getting real

* Focus on getting real implementations up and running from the get-go.

Unfortunately, that is primarily engineering, not research. As somebody who has wasted far too much time on implementation, I can say with confidence that it gets you nowhere research-wise. If you want to build a real career, I would strongly advise against taking this path.

* Concurrency is a big deal that will only get bigger.

Seconded.

* To me, research into type systems is a black hole that pulls good minds away from more useful work, but maybe that's just me :)

I have to disagree. Type systems are a must for any serious attempt at understanding and evolving the inner structure of programming languages (even those "dynamically typed" ones). Too many misfeatures in widespread PLs are obvious symptoms of a serious lack of understanding in this area.

Type System Black Hole

I'm in the same predicament as the original poster, so I'm curious on a personal level:

Why do you say that "type-systems" are not a good/productive area of research to get into? What could be done to make them more practical, in your opinion?

Also, to anyone who disagrees with the parent poster's statement: could you explain why type systems are a worthwhile area to go into research in?

(edit: removed redundant "why?")

Two Cents

With respect to research, it seems to me that untyped systems are already quite well understood: we have the untyped lambda calculus and a few untyped process calculi and the body of work around them. Doing original work in the field would seem to me to be more difficult than in type theory, e.g. Barendregt's text on the untyped lambda calculus is embarrassingly comprehensive.
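
For a concrete sense of how small the core object of study is, here is a minimal, deliberately naive Haskell sketch of the untyped lambda calculus; it uses named variables and skips capture-avoiding substitution, and all the names in it are illustrative rather than taken from any particular text.

    -- A minimal, naive sketch of the untyped lambda calculus: named variables,
    -- no capture-avoiding substitution, so it is only safe for terms whose
    -- binder names are all distinct.
    data Term
      = Var String
      | Lam String Term
      | App Term Term
      deriving Show

    -- Substitute s for free occurrences of x in a term.
    subst :: String -> Term -> Term -> Term
    subst x s (Var y)
      | x == y          = s
      | otherwise       = Var y
    subst x s (Lam y body)
      | x == y          = Lam y body              -- x is shadowed here
      | otherwise       = Lam y (subst x s body)
    subst x s (App f a) = App (subst x s f) (subst x s a)

    -- One step of leftmost-outermost beta reduction, if a redex exists.
    step :: Term -> Maybe Term
    step (App (Lam x body) arg) = Just (subst x arg body)
    step (App f a) =
      case step f of
        Just f' -> Just (App f' a)
        Nothing -> fmap (App f) (step a)
    step (Lam x body) = fmap (Lam x) (step body)
    step (Var _)      = Nothing

    -- Example: (\x. x) y  steps to  y
    example :: Maybe Term
    example = step (App (Lam "x" (Var "x")) (Var "y"))

Iterating step until it returns Nothing performs normal-order reduction (and, of course, may fail to terminate); Barendregt's text is the place to read about what can and cannot be said about terms like these.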

Type theory is a good research area, and useful, IMHO, because although there is essentially one very popular type system (the Hindley-Milner system), research into type systems with desirable properties beyond those offered by HM is ongoing, and it tends to inform many of the discussions here on LtU.

When you read about Software Transactional Memory, you're probably reading about it in the context of Haskell and Haskell's (specifically, GHC's) type system. Not long ago there was quite a bit of discussion about lightweight static capabilities, i.e. using the type system to ensure, at compile time, that certain operations don't leak authority, where "authority" can be broadly defined to include anything from index-out-of-bounds errors to write operations on read-only files, and more.

Finally, Why Dependent Types Matter is a good jumping-off point for the most sophisticated end of the discussion on types, where types and terms collide, and "static" and "dynamic" aren't quite as clear-cut as we tend to think of them. This is where the action is.
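
To make the lightweight static capabilities idea concrete, here is a small hypothetical Haskell sketch; it is not the encoding from the paper or any real library, and the module and function names are invented for illustration. A phantom type parameter on a file handle records whether writing is permitted, so misuse is rejected by the type checker instead of at run time.

    {-# LANGUAGE EmptyDataDecls #-}
    -- Hypothetical sketch only: names are invented for illustration.
    module Capability
      ( Handle          -- exported abstractly, so clients cannot forge one
      , ReadOnly
      , ReadWrite
      , openReadOnly
      , openReadWrite
      , readLine
      , writeLine
      ) where

    import qualified System.IO as IO

    -- Phantom "capability" tags; they have no values and exist only in types.
    data ReadOnly
    data ReadWrite

    -- The mode parameter never appears on the right-hand side: it is a
    -- compile-time record of what the handle may be used for.
    newtype Handle mode = Handle IO.Handle

    openReadOnly :: FilePath -> IO (Handle ReadOnly)
    openReadOnly path = fmap Handle (IO.openFile path IO.ReadMode)

    openReadWrite :: FilePath -> IO (Handle ReadWrite)
    openReadWrite path = fmap Handle (IO.openFile path IO.ReadWriteMode)

    -- Reading works on a handle of any mode ...
    readLine :: Handle mode -> IO String
    readLine (Handle h) = IO.hGetLine h

    -- ... but writing demands the ReadWrite capability, checked statically.
    writeLine :: Handle ReadWrite -> String -> IO ()
    writeLine (Handle h) line = IO.hPutStrLn h line

Because the Handle constructor is hidden behind the export list, client code can only obtain a Handle ReadWrite from openReadWrite, so a write on a read-only handle simply does not type-check; the authority never leaks because it is never even representable.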

re: Two Cents

I'll be sure to follow up on these references. This is very useful to me. I haven't spent any time learning about types or type systems per se, and this will be a good start.

I'll third the

I'll third the recommendation for concurrency.

One, Two

As near as I can tell, there are only two answers to your question that matter:

1) What topic are YOU excited about? After all, you are going to spend WAY too much time reading, writing and thinking about this topic for the next while, so you had better be really fizzed about it.

2) What topic are the people who will be writing your recommendations, grading your work and serving on your hiring and granting committees excited about? If these people aren't interested in what you are doing, you won't have a career.

I don't know how many people here are in category 2, so probably what we think doesn't matter. ;-)

[Edit: fixed negation on point 2 so it makes sense ;-)]

I agree that (1) is most

I agree that (1) is most important. It is also worth keeping in mind that PLT is a large field, and people are working on a variety of issues. You should decide which general area is of most interest to you, and explore the work in that subfield.

If you tell us more about the specific topics that keep you awake at night, we might be able to suggest interesting related research.

I agree bigtime

Of course you are right: my own interests and inclinations should motivate any research directions I take (though it sounds a little selfish when I say so myself). Hopefully I will find a natural balance between what I am interested in and what others deem worthy.

Part of what appeals to me about PLT is its formal rigour and purity (contrasted with certain other branches of CS). The lambda calculus was exciting to learn (even more exciting now that I use FP and realize some of its power), and I am fascinated by automata. I'm sure that this is just the tip of the iceberg.

I also love languages as they are conceived and used. It is great to see how various languages and 'paradigms' (for lack of a better word) give rise to different programming idioms. I love the odd-ball languages too. Visual languages seem very interesting to me (not just programming languages, but also languages for mathematics or reasoning, e.g. commutativity diagrams and geometric/diagrammatic proofs). Of course, I get a big charge out of so-called 'esoteric' languages: languages which serve as a kind of game or riddle in and of themselves.

As I'm sure will be clear from the above, I have a lot of reading to do, just to get an idea of the parameters of this PLT thing. I suppose my intention here was to generate some study material for myself.

This is a *little* more applied...

But playing with grgen has me excited.