Career paths and concerns

This may be slightly off-topic, but I was wondering if some of you in the community could point me in the right direction with some questions I have about a career in computer science research. To give you some background, I recently graduated from college with a CS degree, and I'm currently working for a small-to-medium-sized consulting firm doing software development in .NET.

Lately I've been thinking that software development in general might not be for me, and that I might be better off getting into research, or at least something closer to it. I feel like most of the problems I run into in my work are not that difficult, and that I just won't be putting all of my technical skills to good use in this line of work. I want to work on technical problems with some real "meat" on them, instead of just putting buttons on a form. One of my interest areas is programming languages, which is why I've come here for advice.

I guess I was initially turned off from graduate school and research after I read this article on Joel on Software, specifically the part about the dynamic logic class. After seeing that, I thought that the only way to do work that would actually make it to the "real world" would be to get into software development. I realize now that that's not entirely true, but I'm still concerned about making sure any work I do is useful, and doesn't solve a problem that no one cares about.

With all of that said, here are my questions:

  • Can anyone explain to me, or link to an article that explains, the general timeline of research getting implemented in industry? I at least have a vague idea of how researchers find problems, come up with solutions, and publish papers, but from that point there's a gap in my knowledge - how does the information from that paper trickle down to a point where developers in the trenches like me know about it? I know not every piece of research is used everywhere, but something like object-oriented programming might be a good example: It originally started, more or less, with Simula and Smalltalk. How did it get to the point where practically every developer uses it, and bookstores are filled with shelves on design patterns and object-oriented languages like Java and C#?
  • As a follow-up to my first question, are there careers that bridge that gap between research and industry (in this case, specifically in programming languages)? Is there a job where one can read research that others have done, evaluate it, and decide if it would be useful to put that work into some form usable by the masses? And if so, what kind of path would one take to get there? Grad school? Climb the ladder at a certain kind of company?
  • Am I at all justified in being concerned about the usefulness of my work if I were to go into research? Is that something that researchers worry about, or does it tend to not be an issue?

As you can probably tell, I'm primarily just ignorant of the various options out there, and trying to figure out what they are. As a result, any answers, advice, or links to relevant information are appreciated, and thank you for reading this long message.

It depends

It all depends, really, on what you want. There are all kinds of researchers, some working very close to industry and its current practices, some burrowed deeply into theory, and many somewhere in between.

There's always someone ready to complain about those "ivory tower theoreticians" who don't do anything remotely related to the real world, but I wouldn't be so quick to say something like this. After all, it is difficult to know beforehand whether some theory will be useful some day or not. Number theory was a completely useless branch of mathematics a hundred or so years ago, but now it's used everywhere.

Besides, theory can be very beautiful. It's like art. You don't compose music or write a story to fill some real-world need, but to satisfy a need that comes from within. That it may be interesting to other people is a side-effect (do we need a monad or an effect type system for that? :)

Evaluating PhD programs

Andrei wrote: It all depends, really, on what you want.

It sounds like jschuster doesn't really know what he/she wants. Doing a PhD is a good way to do a lot of exploration that might help make the decision, but there are two issues to bear in mind: (i) becoming more knowledgeable does not always help to figure out what one wants, and (ii) the length of time it takes to do a PhD usually constitutes a big career break, and jschuster might regret the time taken over it if he/she does go back to software development.

My advice is to avoid the trap of evaluating potential PhD programs by how appealing or applicable the research they do sounds. Of course being enthused about the research environment is important, but remember that most PhD students chart their own course with nearly complete freedom; you are likely to find a research problem that engages you if your character is at all disposed towards academic research. Instead gather information on what past PhDs from the program have gone on to do and base your decision largely on what you think of what you find.

Takes all kinds...

There are all kinds of researchers, some working very close to industry and its current practices, some burrowed deeply into theory, and many somewhere in between.

This is worth emphasizing. Some people do pursue very theoretical topics, while some focus on very cool applications. Some pursue both, or go from one to the other. Among those doing theoretical work, some are very strongly motivated by at least the notion of applications, while others are happy to study theory for its own sake. Some are defiantly insistent that theory should be its own reward. Some may even feel that they have no choice: they are perhaps obsessively compelled to think about some particular theoretical tangle.

If you are very strongly motivated by real-world use (not just applicability, but actual deployment), you may have a hard time in either academia or industry. It is of course possible to do interesting and valuable work on the interface between research and practice, but it's a pretty thin slice. The best side from which to approach this interface depends on your inclinations, circumstances and opportunities. And of course it's also important to remember that the winds of popularity are extremely capricious...

Your second point is a

Your second point is a classical definition of "engineering", as in electrical engineering, for example. Electrical engineering still plays a big role in computing and even software. There are a number of IEEE journals devoted to software in various forms. My impression is that this work usually does have a "classical engineering" flavor. There are still combined computer science and electrical engineering programs, but I don't know how common they are. Good luck!

Same boat

This is the same situation I find myself in. I'm wondering if my direction should change toward academia. I would propose that the OP's preoccupation with the immediate usefulness of research may be misplaced. I've found that once I understand what's motivating research in a particular area, the applications become more apparent.

Take for example process algebras and the pi calculus. I posted a thread some time ago wondering how useful the theory was, and I got some great responses. But then some time later a link was posted to an interview with Robin Milner in which he talked about his work with process calculi. From his perspective, part of the point of the research was to provide another possible formulation of computation: we already have Turing machines and the lambda calculus, so could process calculi also express universal computation? (I may be totally off on this; if so, I welcome clarification.)

Another thing I found inspiring: the formulation of Grand Challenges in Computer Research. The applications involved with those are awe-inspiring. That's my .02...

For certain values of "alternative"...

From his perspective the point of the research was to possibly provide an alternative formulation of computation: Turing machines, lambda calculus, could process calculi also express universal computation?

Not quite. The development of the process calculi wasn't really motivated by a desire to come up with an alternative formulation for the existing notion of computation. Rather, Milner and others (most notably Hoare, Bergstra, and Klop) developed process calculi to provide a way to formally reason about an alternative notion of computation (one in which interaction is fundamental). Process calculi can certainly be used to express Turing-computation: there's a well-known encoding of the lambda calculus into the pi-calculus*. But they are really intended to express interactive systems in which the intermediate steps of the "computation" are as important as (or more important than) the final result. It's for that reason that process calculi use things like bisimulation to define equivalences between computations, instead of relying on input-output equivalence.

*There's also an encoding of the pi-calculus in the lambda calculus by Phil Wadler.
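
For readers who haven't seen that encoding, here is a rough sketch of one common presentation of Milner's call-by-name translation into the polyadic pi-calculus (notation and details vary from text to text, so treat this as illustrative rather than definitive). A term is translated relative to a "location" channel p: a variable asks the resource registered under its name to run at p, an abstraction waits at p for an argument name and a result location, and an application wires the two together, making the argument available as a replicated resource.

```latex
% Illustrative sketch of Milner's call-by-name encoding of the lambda
% calculus into the polyadic pi-calculus; presentations differ in notation.
\[
\begin{aligned}
  \llbracket x \rrbracket_p             &= \overline{x}\langle p \rangle \\[2pt]
  \llbracket \lambda x.\,M \rrbracket_p &= p(x, q).\;\llbracket M \rrbracket_q \\[2pt]
  \llbracket M\,N \rrbracket_p          &= (\nu q)\bigl(\llbracket M \rrbracket_q \;\big|\;
      (\nu x)(\overline{q}\langle x, p \rangle \;\big|\; {!}\,x(r).\,\llbracket N \rrbracket_r)\bigr)
      \qquad (x \text{ fresh})
\end{aligned}
\]
```

Beta-reduction is simulated by sequences of interactions between these processes, which is exactly why the natural equivalences on the pi-calculus side are bisimulations over those interactions rather than input-output equivalence.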

In my experience, the

In my experience, the relationship between computer science academia and industry is, mostly, dysfunctional. There are noble exceptions, but they're rarer than you might hope.

Before changing career, I'd suggest that you do two things:

  1. Investigate opportunities to do something more interesting in industry. There are plenty of jobs with really challenging problems to solve and which certainly couldn't be described as "just putting buttons on a form". You may have to spend time hunting them out, but they definitely exist.
  2. Read "Software Creativity" by Robert L. Glass. He discusses the relationship between academia and industry, and why that relationship isn't working as well as it might. It may or may not put you off, but it's worth thinking about before you make any life-changing decisions.

The areas in which there's a good relationship between academia and industry tend to be those where there's a well-understood problem to solve. The area that I have direct experience of is Natural Language Processing and Machine Learning. The company that I work for has a relationship with Cambridge University and we're using quite a bit of technology which comes directly from the research carried out there.

It sounds like jschuster

It sounds like jschuster doesn't really know what he/she wants.

You hit the nail right on the head. There are several options out there that have some appealing attributes, but I've yet to learn about a career path and say to myself, "That is exactly what I want to do." So for right now, I'm just evaluating the aspects of my past and current work that I've really enjoyed, and gradually figuring out what I would like to do.

In response to Paul's comment, I'm certainly not switching careers, or even jobs, without a lot of thought and research. Everything I've read so far (online, books, etc.) says to make sure that you really want a PhD before you go for it, so I'm taking that advice.

I should probably also clarify that my current work isn't all just "putting buttons on a form". There are some more interesting problems I work on, but I tend to spend more time than I would like highlighting boxes, showing nice error messages to the user, etc. - basically stuff that translates directly from what the business wants to the actual code, with little room for creativity in between. I guess part of that is just a matter of proving that I can be trusted with more difficult tasks, but I still wonder how much I'll really be able to grow out of that from where I am now.

Thanks for the good advice so far. You've given me a lot to think about, so keep it coming!

Funding and Academia

Academics don't like to talk about it, but the most important success factor in academic research is funding. If you don't get that, nothing else really matters.
Before committing yourself to an academic research path, take the time to look carefully at what has been happening to academic research funding over the last decade. DARPA has exited, NSF allocations have not kept up, and grant proposal success rates have dropped dramatically over the last ten years. EU funding is even less sane.

On the other hand, you'll have multiple careers over your lifetime. Don't look for something perfect. Look for something good enough to be worth doing.

As to research timelines, it depends enormously on what you are doing. Some transfer times are as low as 12-18 months. Others are as high as 25-50 years. There aren't any hard and fast rules. Researchers do worry about the utility of their work, but they tend to take a long view.

One possible middle path might be to come work on BitC. Even if that isn't a good match for what interests you, you might gain from talking with us about it, because you would get a pretty good sense of whether your undergraduate education has prepared you effectively to serve in a tech transfer role for stuff at the front lines of research. Knowing the answer to that might help to clarify some of your decision process.

BitC is here.

BitC is here.

Funding part two

One of the nice things about CS, including PL research, is that funding is only necessary if you want to pursue research as a *career*--i.e. become a professor or equivalent. There are many areas of CS where conducting research does not require expensive labs, teams of RAs who get up at 3AM to make measurements, and other things which cost lots of money (and thus require external funding, assuming you aren't wealthy and willing to foot the bill yourself). And CS is far more open to non-traditional forms of publishing and peer review than other disciplines are; you don't have to publish in Communications of the ACM or some other traditional journal for your work to be taken seriously. Much interesting CS work can be found in blogs and other forums, and places like LtU can provide some level of peer review.

Of course, if you do aspire to become a professor (in the US sense of the term--a tenure-track instructor/researcher at a college or university, as opposed to the holder of an endowed chair), then yes, you do have to beg for your keep. (Even if your research itself is inexpensive to conduct.)

<rant political="1">
Perhaps the political situation in the US will change next year WRT research funding--the next administration can't possibly be as flagrantly anti-intellectual as the current one is, which seems to regard open scientific inquiry as not only wasteful, but dangerous.
</rant>

This is way off topic, but...

In 2006 or so, I observed to David Farber that the funding tap for academic systems research had now been off for four years, noted that if it was instantaneously turned on it would take more than three years for the pipeline to spin back up, that the total delay therefore now exceeded the length of the tenure cycle, and that this was going to be deeply bad for CS. It's not just that the money went away. It's that the program managers went away too.

Seven years is long enough to kill a full generation of basic science systems tenure cases (which did happen, because a lot of this work requires 4-5 person research teams, and that was the funding that disappeared). This has educational and mentoring consequences. If the bubble goes on too much longer it will be self-sustaining because senior systems faculty doing basic science will exist only in sub-critical numbers to train new researchers.

Hoping I was being unduly pessimistic, I asked Dave to explain what I was missing. He responded that as far as he could tell it was worse than I thought, and it seemed to him that my outlook was optimistic.

Today, the funding pipeline has been off long enough that several senior systems faculty I know are looking around and saying "I can't do my thing alone, I don't want to become a research manager of a center-sized group, and there isn't a viable middle position anymore. Perhaps it's time to do something else."

There are areas of academic research where one person, or one person and a student, can make a huge difference. Core systems areas are not usually among them. PL can be, but go look at funding there carefully before you jump into that pool.

And Scott is right. There are other valid ways to do research, and there is other valid research to do. The reason to be concerned is that if we lose our ability to retain those few teachers who have built larger-scale stuff, US leadership in computing practicum will disappear very quickly. There are a lot of other places in the world where really fine work is done in this field, but Americans bring a pretty unique attitude to what they do. Sometimes good, sometimes ugly, but always unique. I believe that losing US computer science leadership would be bad for just about everyone, and I think the risk is fairly near term.

Getting back to the original topic, all of this is a great reason to get into CS deeply right now. If you're one of the last generation that still knows what the darned machine actually does, you'll be very valuable over the next 30 years.

that's not off topic

I think that in this thread, explicitly about career paths, that is not off topic. At least, I'll briefly act as if it isn't off topic because I'd like to give some experience-based comments:

One can do C.S. research outside of academia, with next to no money, and with a little luck even come up with original results that have demonstrable value. And, yes, one can even publish those results in non-traditional ways and have them noticed and taken up.

There are still problems here.

You can have results taken up, but don't expect any kind of remuneration. I had what were probably my most commercially significant results taken up with vigor; a now fairly prominent corporation was in part founded on the stuff. Earlier work also had commercial take-up, but of a lesser variety.

The problem with this go-it-yourself path is that nobody feels any obligation or even reason to deal with you. Success is not rewarded with funding. If you can be cut out of the commercialization deals, you will be.

Meanwhile, while I would have trouble counting the number of times I've been advised "Go apply for NSF grants", I can't stress enough how unrealistic that path is. That system may be under-funded from the academic viewpoint, I can believe that. For someone like me, it doesn't matter because that system is closed to me. If you aren't recognized in the professional society of the program admins and reviewers, it is a waste of paper to bother applying.

It's also not just academia. With few and only quite small exceptions, industry-backed research has also distinctly dried up over the past couple of decades. I suspect a mix of two primary factors: the popular narrative about how Xerox lost big with PARC (a story that doesn't make a lot of sense but goes around anyway), and the emphasis on cost-cutting to keep up with competitors in the securities markets during bubble bursts and during the big push towards capital-flight globalization.

This industry and the academic world around it are a barren wasteland compared to what they were like when I first got started 25 years ago.

-t

Systems?

What does 'systems' mean in this context:

"Today, the funding pipeline has been off long enough that several senior systems faculty I know"

"Core system"? I assume operating systems?

Meaning of "systems"

By "systems", I mean those sub-areas that provide the underpinnings for building real stuff. Operating systems, compilers, architecture, and so forth. One property of these areas is that deep innovations require larger teams. This means that fundamental or disruptive research in these areas is expensive to fund relative to some other subject areas.

systems + deep innovation -> larger teams

i know nothing of this, but am curious. what is your feel for why it is that the teams must be larger? are there tools (e.g. possible programming languages or theory or something) which would make it possible to have smaller teams?

view from the trenches

Panel: The Impact of Database Research on Industrial Products (Summary) by José A. Blakeley, Dan Fishman, David Lomet, and Michael Stonebraker.

4. Lomet's View

Good industrial research focuses on marketplace and existing products. Academic research usually does not, and hence it rarely has impact on industry. However, much of industrial research is no more relevant than academic research. So the problem of having impact is not confined to academia.

Much is made of the NIH ("not invented here") syndrome, hence blaming potential research recipients for the problem. But this misses the main problem and is too pat in absolving researchers from responsibility. The problem is deeper than NIH.

4.1 Conflicting Goals

The root of the difficulty is that researchers and product engineers find themselves in dramatically different environments that enforce dramatically different goals.

4.1.1 Researchers

4.1.1.1 What They Do

What researchers do, judged objectively, is produce papers. Our literature bulges with them. Some contain real advances, but many are irrelevant or worse, actually impeding progress. They lead others astray with techniques that are worse than current practice or that are not complete solutions. Many researchers produce techno-nibbles. They begin with a small idea, which is then diced into several papers. Referees frequently fail to weed them out, giving excessive weight to novelty and having too much tolerance of complexity.

Referees are overly impressed with syntactically correct papers. These follow something close to the format: (i) introduction, (ii) background, (iii) main idea, (iv) analysis — sprinkled with equations, (v) performance results — sprinkled with graphs and tables, and (vi) a discussion explaining why the new technique crushes the prior methods. A bibliography cites the work of all likely referees. Some very good papers are syntactically correct. But syntax is not a substitute for real understanding. Papers should be judged by quality of ideas. That is hard, which is why few referees do it.

4.1.1.2 Why They Do It

Academic researchers produce techno-nibbles to build a tenure record. They continue this to be promoted or to move to a high status university. In the research community, too often stature is measured by number of papers. So, to impress friends requires a long vitae. This is very sad.

Some serious re-thinking is needed. University promotion committees need to look for impact instead of vitae length. Most database research is engineering and should further the engineering art. A good conference paper should count for more than a journal paper, which all too frequently either was not accepted at a good conference or was.

4.1.2 Industry

4.1.2.1 What They Do

In industry, papers are rarely rewarded, so few are written. Engineers are expected to build and improve products. Improvements can be in functionality, robustness, performance, i.e., attributes important to real customers.

Industry wants the best practice, not the latest paper. Novelty is unimportant. A good twenty-year-old idea, like B-trees or two-phase locking, is just fine. It wants simple ideas that fit with the current system. The closer the fit and the simpler, the better. Even simple ideas are complex to implement. Complex ideas may be impossible.

Industry wants high leverage ideas with great cost/benefit ratios. These ideas need not be publishable. For example, a great way to boost TPC-A performance is to support multi-statement procedures, hence reducing the number of application/system boundary crossings. The improvement is dramatic when code paths are short elsewhere, as with DEC Rdb.
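
To make the boundary-crossing point concrete, here is a small, purely illustrative sketch (not from the panel summary): the table layout loosely follows TPC-A, the stored procedure name debit_credit is hypothetical, and parameter markers and callproc() support depend on whichever Python DB-API driver is in use. The only point is that the first version crosses the application/system boundary once per statement, while the second crosses it once per transaction.

```python
# Illustrative sketch only. 'debit_credit' is a hypothetical server-side
# procedure; 'conn' is a connection from any DB-API 2.0 driver.
# Parameter markers ('%s' vs '?') and callproc() support vary by driver.

def transfer_chatty(conn, account_id, teller_id, branch_id, amount):
    """One statement per round trip: several boundary crossings per transaction."""
    cur = conn.cursor()
    cur.execute("UPDATE accounts SET balance = balance + %s WHERE id = %s",
                (amount, account_id))
    cur.execute("UPDATE tellers  SET balance = balance + %s WHERE id = %s",
                (amount, teller_id))
    cur.execute("UPDATE branches SET balance = balance + %s WHERE id = %s",
                (amount, branch_id))
    conn.commit()

def transfer_batched(conn, account_id, teller_id, branch_id, amount):
    """The same statements bundled in one server-side procedure: a single crossing."""
    cur = conn.cursor()
    cur.callproc("debit_credit", (account_id, teller_id, branch_id, amount))
    conn.commit()
```

Whether the batched version wins by a little or a lot depends on how much work each statement does relative to the cost of a crossing, which is the "code paths are short elsewhere" caveat above.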

4.1.2.2 Why They Do It

Engineers are mostly rewarded for product marketplace success. If the product makes money, raises and career advancement follow. This is how industry folks impress their friends.

4.1.2.3 Resulting Impediments

Since researchers are not measured on industrial impact, they frequently don't know the state of the industrial art, which can be ahead of the research literature. A goal of mine, as editor of the Data Engineering Bulletin, is to disseminate information about the state of the industrial art. Our December 1993 issue on commercial query processing is an example.

Few engineers in industry really know the research literature. Also, groups with existing products want incremental improvements with small costs. They resist revolutionary technology, as network and hierarchical database groups resisted relational technology.

But you left out the best part!

From Lomet:

The technical enterprise is a vast genetic algorithm. Like mutations, most ideas are either irrelevant or bad. Even good ideas have uncertain prospects. The standards are higher than for even the most selective conference. To succeed requires focus, effort, inspiration, patience, and luck.

Here Lomet focuses on industry, but the first three sentences apply equally to academia.

more of the same...

Start a REAL Journal?

Anybody want to start a journal that publishes only actually semantically important stuff? (Does PLoS count along that vector?)

"You and Your Research" by Richard Hamming

You and Your Research by Richard Hamming is a must-read. It has been posted twice on LtU: here and here.

Some answers, a few years later

I know I'm resurrecting an ancient thread, but I've managed to find answers to some of my questions (over the course of a few years), and I wanted to post a few notes in case anyone ever stumbles across this page in the future.

In general, from what I've seen and been told: yes, there are careers that bridge the gap between research and industry, and they mostly require a PhD. Mozilla's research group, for instance, does work like this on programming languages. I also got to talk to Benjamin Pierce about this at POPL this year, and he mentioned both James Gosling and Martin Odersky as examples of people who have done this - Gosling did this within Sun, and Odersky eventually founded his own company to work on Scala.

Another note that I've been told and found helpful: a PhD is simply a "certificate" of sorts that tells others you know how to do research. When I wrote my original post, I didn't totally understand that (1) the types of jobs I wanted to do required research skills, and (2) getting a PhD is what trains you in those skills.

Finally, I found a paper that's a great example of the sort of article I was looking for in my first question: Erik Meijer's "Confessions of a Used Programming Language Salesman: Getting the Masses Hooked on Haskell". It gives a detailed history of how various research ideas were tried in a number of contexts and finally became the LINQ framework in .NET.

As for me, I put off grad school for a while after the conversation on this thread and gained a lot more work experience (on some much more enjoyable projects), but still thought about grad school and research a lot. Finally, I decided in 2010 that a research career was right for me, applied to grad school, and I'm now happily working towards my PhD as a first-year in the PL group at Northeastern. Thanks again, everyone, for all the advice - it definitely helped me make my decision.