Joel Spolsky's views on CS education

As CS education is being debated, Joel Spolsky is out with a new article warning about the perils of an all-Java education.

Tasty pieces:

The recruiters-who-use-grep, by the way, are ridiculed here, and for good reason. I have never met anyone who can do Scheme, Haskell, and C pointers who can't pick up Java in two days, and create better Java code than people with five years of experience in Java, but try explaining that to the average HR drone.
CS is proofs (recursion), algorithms (recursion), languages (lambda calculus), operating systems (pointers), compilers (lambda calculus) -- and so the bottom line is that a JavaSchool that won't teach C and won't teach Scheme is not really teaching computer science, either.


The IT industry needs workers, like any other industry.

The IT industry needs its workers. Java makes it possible to produce 'programming language users' who are quite capable of covering 90% of programming needs. There is no real use for them to know C or Scheme or ML or Haskell, because their job can be summarized as re-using what Java offers. And Java offers a lot of stuff that simply does not exist (or is not laid out in as convenient a way) in other programming languages. With Java, you even get a set of already coded patterns to use for most kinds of applications.

Well

So, what part of his "CS != vocational training" do you disagree with?

There's more to his point...

I have never met anyone who can do Scheme, Haskell, and C pointers who can't pick up Java in two days, and create better Java code than people with five years of experience in Java...

Do you disagree with that last part? It's not a question of whether the industry needs Java programmers, it's a question of whether they need good ones. In my experience, the idea of the 100% dispensable one-size-fits-all "Java developer" is pure myth. I see the argument a lot that the industry doesn't need real programmers, and it is simply not true. There are many, many of these so-called coders out there, and believe me, in the long run they cost their employers far more than they save. Staffing projects this way is a short-sighted mistake. It may look good on a quarterly financial statement, but it is an extremely effective way of running a company into the ground.

My experience differs.

Most Java projects in my company are staffed with one or two "good" programmers with a solid knowledge of CS and a bunch of Java coders with minimal experience outside Java. This model has been quite successful. The same goes for .NET.

Daily WTF

There are many, many of these so-called coders out there, and believe me, in the long run they cost their employers far more than they save. Staffing projects this way is a short-sighted mistake.

The excellent Daily WTF would seem to bear out that observation. Rather a depressing website in many ways.

A little bit of management is necessary.

Just checking what your programmers have been doing all day is enough to keep up daily progress. My former manager asked us every morning what we had done the previous day. It was obvious that if we had just played around, we would have nothing to report. And the manager did have access to CVS, so he could check whether everything we said was true.

oh yeah

Daily CVS entries will definitely prevent the kind of code one sees on The Daily WTF.

What is CS?

Does everyone here agree that compilers and operating systems are core computer science concepts? I don't have a problem with their being part of most CS curricula, but to say that they *are* (along with the other topics listed) computer science seems like a stretch to me.

Yes

I think that compilers and programming languages lie at the heart of computer science, for obvious reasons. So do algorithms, complexity and computability theory. Operating systems and databases are, too, because that's where most students are educated about critical ideas like concurrency and reliability.

He oversells the importance of pointers, though.

Gotta pick topics so people can understand

He oversells the importance of pointers, though.

I think he picks that because it's something the "average Java programmer" thinks is hard. It's certainly what many C books tout as difficult. Similarly, fixed points are the first thing that could be called hard in Scheme/Lisp. If he simply went off the deep end about catamorphisms and functional data structures (guess what I find hard...) he would lose his average reader. The man simply thought about his audience.

Pointers are hard!

Don't get me wrong; pointers are extremely difficult to reason about. And I think that a mature computer science would have a lot to tell us about pointers, because pointers are how we express and exploit sharing.

But computer science isn't at that stage yet. We can't confidently tell programmers how to use pointers safely and effectively, because we don't have a good enough theory of them. We're close, I hope, but it's fair to say that the effective use of pointers is something that's part of the craft of computer programming, rather than the science of programming.

A good programmer should be able to use pointers effectively, but his or her skill at that will only have been modestly improved by a thorough knowledge of CS. (Mainly by learning of algorithms and data structures that rely on them for their efficiency.)
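
Since the rest of the thread keeps coming back to Java, here is a minimal, hypothetical Java sketch of the sharing the comment above is talking about: two references aliasing one mutable object, so a change made through one is visible through the other, while an explicit copy is not affected.

    import java.util.ArrayList;
    import java.util.List;

    // Sharing through references: "alias" and "original" name the same object,
    // so a mutation through either is visible through both. "copy" does not share.
    public class Aliasing {
        public static void main(String[] args) {
            List<String> original = new ArrayList<>();
            original.add("draft");

            List<String> alias = original;                 // shares the same list object
            List<String> copy = new ArrayList<>(original); // independent snapshot

            alias.add("published");

            System.out.println(original); // [draft, published] -- the sharing shows
            System.out.println(copy);     // [draft]            -- the copy is unaffected
        }
    }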

systems and compilers

Sorry, I deleted my long post to keep my self-imposed discipline to avoid general comments regarding what I think of other folks; what I had to say veered too far in this direction.

The really short replacement I offer here lacks flavor and significance. But basically, I think computer science is mainly concerned with explaining why computing systems behave the way they do, and what kinds of limits apply to possible behavior, especially in the context of complex situations like concurrency. And so, operating systems, languages, and algorithmic theory are quite central to these questions and therefore very important to computer science. This version sounds flat, doesn't it?

Not too flat... And I general

Not too flat... And I generally agree with your sentiments, FWIW.

computing systems

Again, what do you mean by "computing systems"? Do you consider both an infinite-memory Turing machine and an Intel Pentium PC to be computing systems? Is computer science concerned with physical limits on transistor size as well as with decidability and complexity? IMHO the answer is no: CS is similar to mathematics in that it should deal only with abstract formal systems; matters such as transistor temperature and virtual memory performance should not be considered parts of this science.

I would disagree very strongly

Concretely, consider the question of how you would write a piece of database software. Your database has to satisfy the ACID properties; updates have got to be atomic, consistent, isolated and durable. To determine whether or not you have achieved these properties, you have to have a failure model for your computer -- that is, you have to consider the computer not just as a Turing machine, but as an actual machine that can suffer from power failures, disk crashes, memory glitches, and so on.

Fault analysis can tell you whether you've succeeded or not, and trying to rule that out of CS strikes me as a difficult line to justify. You can think of the most theoretical parts of CS, like PL and complexity theory, as formal mathematics in a resource-bounded world: what can we prove when we *can't* enumerate all the integers as part of our algorithm? What's effectively provable, rather than what's true? And once we've taken that step, it's very natural to ask, okay, what can we do when we might make mistakes? And now we're suddenly talking about something very close to the sorts of computer systems we actually build.

Finally, some of the best CS papers ever written, like Lamport's Byzantine Generals paper, are about systems that can fail on us. It would be a shame to have to assign them to some other discipline. :)

W

I think by "computing systems" he means a computer modeled in discrete math. Quantities such as CPU frequency, memory latency, etc. are OK. Transistor temperature would fall under computer engineering.

We can allow such quantities, but we don't necessarily use all of that information. It's like symbolic math: Computer scientists in the industry and academia usually abstract away such figures, e.g. with the O(n) notation. However, as a computer "engineer" sometimes you need to know if your application fits in the memory, or about the network latency, etc. Thus, I don't think it's wrong to include some of the constants in the curriculum. Physicists don't always study mechanics with some abstract value of gravity.

discrete simulations

[Added late to this post - this paragraph explains the relationship to programming languages. The material below talks about simulations frequently. Programming languages are the fabric for creating new simulations and connecting to old ones. So you can consider this to be a definition of programming languages in terms of simulations, which might be a slightly novel idea to some readers.]

Hi Raphael. I was intentionally vague when I said "computing systems". (Brevity is hard to master; precision can be verbose.) My earlier deleted long post had a part where I cited discrete math classes as among the few I found especially useful in college, along with assembler, operating systems, and databases. I'm a *really* big fan of discrete math; to me it was like playing video games in terms of being interesting. I actually consider myself an applied mathematician more than a computer scientist, despite my degree.

So I think of computer science as more math than anything else, but math in a more inclusive sense than most folks recognize as math, including logic and game theory and whatever else belongs in the mix, such as operations research. If I had to summarize it, I guess I think computer science is the study of discrete simulations, with lots and lots of simulations embedded and nested inside other simulations. The study also includes what happens when the simulations don't work because of failures.

I think CS graduates should know one assembler language or another, enough to recognize how one chip's instructions map onto another's -- how you'd write a simulator for one in terms of another. I see assembler as just a formal model of how to make the chip's discrete simulation behave. Other than in failure analysis, the difference between real chips and mathematical simulations isn't material to the math of computing problems. I usually think in terms of logical systems and not physical ones.

I roughly agree with your comment about dealing with abstract formal systems; I just find overly abstract and overly formal presentations of math hard to digest. I like to visualize things as simulations in space, and timeless and spaceless algebra tends to puzzle me when I can't map it onto my mind's eye view of a simulation.

Transistor temperature seems completely irrelevant to computer science, but the performance of virtual memory does seem relevant, since the goal is designing a simulation that works 'better' than a slower one. Virtual memory is just a discrete simulation following some rules that give you answers you like better than others. But I like designs for simulations to include some analysis of robustness in the face of failures, when the abstractions break down. Some ways of doing things are more sensitive than others under some failure scenarios.

When school graduates seem lame, it's when they seem poor at cost/benefit analysis because of an inadequate grasp of costs, as if they don't have a good model of the hardware costs, where hardware is viewed as a predictable simulation that uses cycles and memory in order to get effects. Great coders are better than crummy ones because they know how to get desired effects more cheaply, with less complex code and less complex simulation layers, with less exposure to perverse failures down low or up high in the tower of simulations. It's nice to end up with code that's faster, uses less memory, is less complex and easier to use, is more likely to do what's expected in more situations, and has a better chance of surviving or adapting through changes in coming usage situations. When creating designs with more advantages and fewer drawbacks, a lot of the thinking involved seems like math to me, but folks never say this that I notice.

Richard Feynman used to complain about smart students who knew lots of facts, but couldn't seem to apply the basic reasoning involved when appropriate to a given situation in the real world. Factual knowledge in the absence of general reasoning ability seems like some kind of traditional bane in education. (I'm just going to stop abruptly here, though the stopping point seems kind of arbitrary.)

Requisite skills

Here is my idea of what constitutes requisite skills that I expect someone to have fresh out of college. (Some ideas from Steve Yegge, others from supervising a young graduate at work)

1) Ability to use grep and regexes ... on Windows
2) SQL ... including outer joins
3) working knowledge of one version of source control
4) unit testing
5) Java or .Net ... basic familiarity with framework libraries
6) Being able to recognize what operations are expensive but not being too attached to it
7) one scripting language
8) (for now) what XML is, and knowledge of how to use XML parsers

Wow...

I just graduated last May and I didn't learn any of those in school with the exception of 2, 6 and 7 (we touched on scripting in an undergraduate programming languages course, which was historical/smorgasbord in its approach, although I wish it had taken the interpreter approach). I've got to be honest, though: although it was difficult getting a job (my resume looked sparse in the skill-set area), I'm glad I went to a school whose computer science education wasn't "Java vocational training". Instead of learning XML, source control systems, grep, and unit testing (granted, different professors mentioned these things---especially in my software engineering class---but we didn't learn the systems or APIs), I learned algorithms, data structures, pointers (an understanding of pointers, although not directly important, is important in the understanding of reference types and data structures), lots of math, and operating systems (how they work, not simply Windows or Unix APIs).

Consider a student who is capable of understanding how quicksort works and how the A* search algorithm works, and how you can make A* a breadth-first or best-first search by changing the function for choosing the next node to expand, and who can implement both of them in C. This student understands trees and hash tables and understands the trade-offs in using each. How long does it take that student to learn DOM or SAX? How long does it take that student to learn Java or C#? How long does it take that student to learn how to use a version control system and regular expressions? And when "The Next Best Thing" comes along, and the project lead or manager wants to use "The Next Best Thing," how long does it take that student to learn "The Next Best Thing"?
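
For what it's worth, here is a rough, hypothetical Java sketch of that last point: one search skeleton where only the rule for choosing the next node to expand changes. Pass in g(n) + h(n) and you get A*, pass in h(n) alone and you get greedy best-first, pass in the node's depth (or swap in a FIFO queue) and you get breadth-first. The g-cost bookkeeping of a full A* is omitted to keep it short; Graph, neighbours, and the priority functions are placeholders.

    import java.util.*;
    import java.util.function.ToDoubleFunction;

    // One search skeleton; the priority function decides which frontier node
    // to expand next, which is all that distinguishes A*, best-first, and BFS here.
    class Search {
        interface Graph<N> { List<N> neighbours(N n); }

        static <N> List<N> search(Graph<N> graph, N start, N goal,
                                  ToDoubleFunction<N> priority) {
            PriorityQueue<N> frontier =
                new PriorityQueue<>(Comparator.comparingDouble(priority));
            Map<N, N> cameFrom = new HashMap<>();
            Set<N> expanded = new HashSet<>();
            frontier.add(start);
            while (!frontier.isEmpty()) {
                N current = frontier.poll();
                if (current.equals(goal)) return pathTo(cameFrom, goal);
                if (!expanded.add(current)) continue;   // already expanded
                for (N next : graph.neighbours(current)) {
                    if (!expanded.contains(next)) {
                        cameFrom.putIfAbsent(next, current);
                        frontier.add(next);
                    }
                }
            }
            return Collections.emptyList();             // no path found
        }

        static <N> List<N> pathTo(Map<N, N> cameFrom, N goal) {
            LinkedList<N> path = new LinkedList<>();
            for (N n = goal; n != null; n = cameFrom.get(n)) path.addFirst(n);
            return path;
        }
    }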

Remember, version control systems will go stale, regular expressions may change (drastically) over time, Java and .NET may not be around in 10 years, scripting languages come and go, XML may go away sometime, but knowledge of data structures, algorithms, recursion, and the ability to analyze any algorithm or data structure is a skill set that will be useful regardless of the technology you happen to be using, because those concepts are core to computing and will never go away.

Re: "regular expressions may change (drastically)"

But the class of regular languages doesn't change over time.

Good point!

:)

Hitting the ground running

Ben,

Thanks for your comments. The skills you learnt will be adequate if you wish to continue with graduate studies, but you will find them inadequate when you start work.

I went through engineering school 15 years ago, and in my first year out in the workforce, I realized there is a big difference between knowing and doing. There is a difference between being able to learn something and already knowing it. I know plenty of people who know English and should be able to work out J2EE after reading through the tomes of specifications. But that is plain hard going.

If you have the skills I referred to above, you'd be able to be effective far quicker. Given that you've grounded yourself well, it should not take you more than a week to fill out the rest. It is a good investment. You will understand what I mean when you see other graduates struggling through the most basic of tasks.

The skills I referred to are those which are the minimum necessary to hit the ground running without being a burden to your immediate mentor.

Basic knowledge of XML (I'm talking about tags and attributes, not namespaces) is requisite in any job you take over the next 3-5 years. I personally would vote "no hire" on anyone who wouldn't take the initiative to pick this up and would instead expect the supervising engineer to hand-feed them when they start.

The question...

...as I see it, has little to do with the importance of XML, Java, or version control systems in industry, but rather with the importance of teaching these skills at the university level. It is scary that employers look through resumes for the keywords "Java" and "XML" and wouldn't even consider hiring anybody who doesn't know either of these technologies. (Granted, I don't really blame them when they have a couple hundred resumes to look through every day. They would need to hire several knowledgeable people just to give each resume the time it needs to find the real gems, but it's disappointing and scary nonetheless.) Wouldn't it be a much better investment to hire graduates from trusted universities that teach the fundamentals well (in other words, we can be reasonably certain these graduates understand their fundamentals because they graduated from such-and-such university) and give them a couple of books and an initial two weeks to a month to learn the basics of the technologies so they can start coding?

Ideally, universities would teach everything computer science has to offer. Unfortunately, you couldn't teach everything computer science has to offer in 10 years, let alone 4. There are trade-offs that need to be made. One of the best choices a university can make when trying to decide which concepts and technologies should be taught in its coursework is to pick those that:

  1. will cause the student to look at programming and computer science differently
  2. are difficult to pick up on one's own
  3. students are unlikely to study independently of a structured course environment

Programming languages like Scheme, C (or C++), assembly language, and Smalltalk (or even Java, though Java may only satisfy the first criterion, and that's okay!) would be good languages to teach because they meet all three of the above criteria. Scheme gives the student a high-level, functional view of programming, and problem solving in Scheme is much different than in other (imperative) languages; plus, Scheme helps the student focus on what they are doing rather than how to do it. C (or C++) gives the student a lower-level view of the world and of how data structures work. Assembly helps the student understand the details of how a computer works and why abstraction is a good thing (assembly language is hard because there are no abstractions!), and Smalltalk (or Java!) shows the student the object-oriented worldview. Java's okay to teach, as long as it's not all you teach and it doesn't become a Java vocational program as opposed to a computer science degree.

I had very little Java in school (it was one of the languages we covered in my programming languages course), in fact I had more Lisp than Java in school (most classes were in C or C++, depending on the course). I had even less C# (as in none). I am now employed as a developer at a company that develops telecommunications software and hardware. The company is a big Microsoft shop, so, as you can imagine, almost everything is done in C#. Less than two weeks after starting I was already working on a project in C#. Granted C# is enough like C++ that it wasn't difficult picking it up, but even if it had been a lot different, it wouldn't have been very difficult. On my first day, my supervisor set four books on my desk and told me to start reading them. (They were books on C#, .NET, and Windows programming). Now, remember, in school I had no C# or Java, and the school was pretty UNIX/Linux centric, yet I didn't struggle endlessly trying to figure out C#, because I had the fundamentals down solid.

I don't work much with XML right now, but if I did, XML is much easier to figure out than C#. In fact, I would say that it shouldn't take more than a day to understand the concept of XML, its structure, and how to use it. It shouldn't take much more than another day to learn the general idea of DOM and SAX. Remember, because you have the fundamentals down, you already understand the concept of trees and graphs, and you already understand all about parent-child relationships in those types of structures. This makes understanding DOM very simple. In fact, you should understand DOM almost immediately (you might need a reference, but you know what sort of things need to be done).
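
To make that concrete, here is a minimal Java sketch of treating a DOM document as nothing more than the tree it is (the file name is a placeholder): parse it, then recurse over the child nodes.

    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Node;
    import org.w3c.dom.NodeList;

    // Parse an XML file and print its element tree by plain recursion over children.
    public class DomWalk {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new File("example.xml")); // placeholder file name
            print(doc.getDocumentElement(), 0);
        }

        static void print(Node node, int depth) {
            if (node.getNodeType() == Node.ELEMENT_NODE) {
                System.out.println("  ".repeat(depth) + node.getNodeName());
            }
            NodeList children = node.getChildNodes();
            for (int i = 0; i < children.getLength(); i++) {
                print(children.item(i), depth + 1);
            }
        }
    }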

My belief is that there is nothing new under the sun. Everything in computer science builds upon a certain set of fundamental concepts (until something really new comes out!). These concepts may be somewhat difficult to understand, but once you understand them, they form a foundation upon which you can build your knowledge of specific technologies. Lists, trees, hash tables, arrays: these are all concepts that can be understood in terms of the more primitive concepts of a linear array of memory and pointers. Now, I may not ever use pointers in my career, but I know how the above data structures work because I have an understanding of pointers. I also have an understanding as to why I should use a list and not an array if I'm going to be inserting items into the middle of my structure, and (this is the big one here, what makes this so important) I now have the understanding to create almost any type of data structure my mind can conceive, because I have a knowledge of the building materials. I may not even be working with pointers (references in C# and Java), but my knowledge of pointers (and hence references) lets me know what is possible. I could make a similar case for recursion, higher-order functions, and other general concepts in programming languages and computer science. It is the basic concepts, the ideas, that make a good programmer, not knowledge of a specific implementation.
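
To put a concrete (and purely illustrative) face on the list-versus-array point: in Java the "pointer" is just a reference field, and once you hold a reference to the right node, a middle insertion touches two references instead of shifting everything that comes after it.

    // A hand-rolled singly linked list node: the "pointer" is a reference field.
    class ListNode<T> {
        T value;
        ListNode<T> next;

        ListNode(T value) { this.value = value; }

        // Insert a new node right after this one: two reference updates, O(1),
        // versus shifting every later element when inserting into an array.
        ListNode<T> insertAfter(T newValue) {
            ListNode<T> node = new ListNode<>(newValue);
            node.next = this.next;
            this.next = node;
            return node;
        }
    }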

Does any of that make sense?

Concepts

Joel's rant was not about the abject uselessness of Java, but that it was an equalizer, making the average look as good as the best on the basis of GPA alone.

The industry is not looking for dumb Java programmers.

On the contrary, the industry is looking for prepared and street-smart programmers: those who are prepared to learn continuously, and streetwise enough to learn ahead of a job.

Ben, it is a free world, and it is every young CS graduate's choice what they want to do between finishing university and starting their first job. I'm just giving a perspective on what may give one a head start over others.

You're right, of course,

But I saw Joel's rant a little differently. (I'm sorry if I implied Java was useless, because I don't think it is. I don't particularly like Java, but I think it is a much better programming language than C. Just look at how many buffer overflows have caused security nightmares in the past 5 years!) I saw it not only as a complaint that Java is a great equalizer, but also that many students will not reach their full potential at a Java School. In other words: the B and C students of a more traditional computer science program may get A's and B's at a Java school (the equalizing part), but they also will learn less. The student coming out of a more traditional program may have B's and C's, but I bet he/she is a better programmer than if he/she had come out of the Java School program with A's and B's.

I'm sorry, but I'm afraid I may have misinterpreted the list you originally gave. Yes, given two equal programmers the one that already understands the technologies you listed is the better candidate. When I first saw the list I interpreted it as: the list of important things that should be taught at a university level. It's not that I think some of those things are unimportant (on the contrary, I knew everything on that list with the exception of 5---I had very basic knowledge of Java and .NET, but it only took me a couple of days to get up to speed after I got my job---because I saw them as important, even though many of those concepts were not taught in class.), it's that I think what should be taught at the university level are those things that form a foundational knowledge, and those things that test your capability as a programmer. The rest is easy, and can be learned throughout your career. (After all you should never stop learning!)

The industry is not looking for dumb Java programmers.

Nobody is claiming that industry is looking for dumb Java programmers. Industry is looking for a large pool of programmers from which to choose, because it puts less pressure on wages. (Obviously they want to get the best, but they want them cheap. It's all about economics: supply and demand.) In order to get a larger pool of programmers, there have to be more graduating from the universities, so universities are actively encouraged to lower the dropout rate for computer science majors. An easy way to do that is to make things a little easier. I know the curriculum has been dumbed down, because one of my professors told me it has been dumbed down. Unfortunately this produces less adept programmers overall, because the students are not reaching their full potential. Students need to be actively encouraged to strive for excellence, because it is human nature to hold back and not work as hard. If things are too easy, students may find themselves only working as hard as they need to just get by. (Would you really study and actually learn the material if you knew you were capable of getting an A anyway? Your friends/girlfriend want to go have fun. You can study tomorrow. Etc...)

I guess my major problem has been seeing some of my coworkers working on projects with other coworkers, and seeing a few of the better programmers doing all of the work, because the others don't have a clue as to what is going on. So, in the end, a select few people end up spending half of their time explaining things to the rest, a third of their time writing their own code, and a sixth of their time fixing the other people's code. For some reason that doesn't seem right to me. On the other hand, maybe there isn't too much that can be done about it.

But those bad programmers MAY be useful anyway

I agree with your points, but I would offer the following: bad programmers can be useful. Often they have better social skills, meaning they can act as buffers between management/customers and the good programmers.

The good programmer explains things in a slightly dumbed down technical format, and the bad programmers have just enough of the concepts that they can translate this into terms non-technical people can understand.

Mind you, it is the rare bad programmer who is good at this, but look for them. When you find them, groom them - if they recognize this as well as you do, they will pull you along with them, and you both will be more successful.

The right basic idea

Wouldn't it be a much better investment to hire graduates from trusted universities that teach the fundamentals well (in other words, we can be reasonably certain these graduates understand their fundamentals because they graduated from such-and-such university) and give them a couple of books and an initial two weeks to a month to learn the basics of the technologies so they can start coding?

Almost. The best investment strategy I've found for new college hires is to do as you say with respect to trusted universities, but only hire the ones who have already shown the drive and curiosity to have taught themselves the basic practical stuff on their own. There are just too many risks to do it the way you describe. You could end up hiring college students who were good at passing classes, but horrible at actually building working structures. You could end up hiring zealots who believe they have found the One True Tool for creating software, and haven't had enough practical experience to have that nonsense pounded out of them. You could end up with someone smart as a whip and totally unable to hit a deadline. You could hire someone who can make the code do whatever he wants, but is actively opposed to doing the UI work which provides the most value to users. I've hired all those guys in my time, and have had to fire them later. I hate firing people.

I've interviewed hundreds of freshly minted college students from good universities. Trust me when I say that a high GPA isn't nearly enough to base a hiring decision on. When an employer makes a professional hire, they are making an enormous investment. The rule of thumb is that a mis-hire has an average cost of eighteen months of the employee's salary, due to lost value and rework. Mis-hired software developers can be even worse, as a bad architectural decision at the wrong time can cost millions, and mistakes can be subtle and hard to see when they are made. I loved my time in school, and learned a lot, but there's no way I'm gonna take that sort of risk based on nothing more than university classwork.

Yes

The best investment strategy I've found for new college hires is to do as you say with respect to trusted universities, but only hire the ones who have already shown the drive and curiosity to have taught themselves the basic practical stuff on their own.

Yes. You are right. I'm sorry I didn't make my point very clear, which was: universities shouldn't teach things like XML, etc., because there are more important things to learn that the student most likely will not learn on their own. In other words, the motivated student can learn XML on their own, and probably will, because it is of great practical importance in industry. (That was the approach taken by professors at my university. They encouraged students to learn the stuff on their own, but when questioned as to why they didn't teach it, their answer was that the students who have made it this far should find no trouble learning XML on their own.) Understanding lists and arrays is certainly important to industry, but not in the same respect as XML. I guess I'm just disappointed that there seems to be this emphasis on technologies such as XML (not that XML is inherently bad or anything, but that is another discussion entirely) and that the issues that make good programmers good have begun to be ignored.

I'm a CS student yet, but I u

I'm still a CS student, but I understand it. Some time ago I took a "File Organization" course at my university. While it is a deep course about dealing with seek times, B-trees, collisions and whatnot, when I took the classes it was just about reading slides and making an XML-based program. So, I didn't know anything about XML, but my previous understanding of data structures made me see it's just a tree.

I did my homework in Python using the standard Python XML API, but the teacher didn't know Python (she had some difficulty naming "languages" other than Java and .NET), so I rewrote the program in C using libxml2 in two days. No segfault, no memory leak.

I don't need to learn XML at the university, just trees. Actually, I bet teaching XML in a university course is *potentially* wasted time, because we need a lot of time to learn stuff like OS internals, parsing, FP, and software engineering. OK, you *can* learn XML in academia, but you *must* (or at least I think so) learn data structures, algorithm analysis, low-level programming, and abstraction development. Without that, you have something that, at least in Brazil, we call a "technical course": a less foundational course for a less prepared (but indeed needed) professional category, something like a computational mason.

Just one more case: in the penultimate edition of the ACPC/ACM Programming Contest, one of the teams from my university joined a parallel event called "Tempesta", which was a Robocode-like championship. Nobody there knew Java, but with the help of Eclipse's autocompletion :) they won the competition. I really think Java (or C#, or Python, or whatnot) is not such an important academic course.

Not what you learned in school

but what you know. That's actually a really good list, and not too far from mine. I don't expect new college hires to have picked up the practical skills in school. In school, they should be picking up theoretical knowledge. I expect them to have picked up the practical skills on the side, through personal coding projects. If they didn't have the curiosity and ambition to do personal coding projects, I don't hire them. Period.

It's fine and necessary to learn the "core concepts" which will "never go away", but the next version of our product ships in 12 weeks, and you'd better be able to pick up the tools and cut some worthwhile code before then.

Heh. Sorry about that.

Ignore my above post. I didn't see this comment when I wrote the response to your other post. Looks like you understood what I was trying to say better than I previously thought. :)

Practicality

I think Chui is right here - there is a huge difference between having the ability to learn, and actually knowing things right out the door.

The fact is that many employers don't wish to wait whilst new graduates bootstrap themselves into the working world. Sure, you may say that learning version control is easy for a smart person and it would only take a week to become fully fluent in tagging, diffing, patching, branching, merging, studying logs, maybe experimenting with distributed VC. Maybe even less than a week. Though version control is odd in that it takes practice to really use it well - to know when to commit and how often, and how to write descriptive log messages that can be read on their own (oh, how many commits have I seen that simply say "upload todays work" and so on).

But of course, there are many simple skills like this that a professional developer needs.

If you are a truly excellent and hard worker it may only take a week to grok the XML specifications, a week to get good at regular expressions, a week to explore the Java class libraries, but when you add all these together you find it takes a month to even start being productive!

And then of course, knowing version control/XML/Java is simply the ABCs of much professional development work. For a truly excellent programmer I'd expect a good knowledge of several widely used languages (for instance C++), a thorough understanding of hardware (I've seen programmers who don't understand why their software takes a minute to start when simply listening to the hard disk can tell you ...), ability to write thread-safe code reliably (I don't think anybody really learns this in university as it requires a fair bit of practice), etc.

This is a rather facile demonstration of the point, which is that knowing is different from being able to know, in an employer's eyes. OK, sure, it is important to know data structures, recursion, etc. as well, but generally I think CS courses focus far too much on these skills at the expense of the ones you use every day in real jobs.

And to be honest, whilst the maths may not change much, pointers have been around pretty much since the dawn of computing. Version control concepts don't change much either ;)

They have to wait anyway

It doesn't matter how good you are, your first few weeks at a new job will be spent learning how the company does things.

If the code is well written you can jump in the first day and add some hot new feature (I did this in one job), but in the back of your mind will be the thought: what if there are side effects to this call that I'm not accounting for?

Then there are all those business processes that are different. Sometimes worse, sometimes better, mostly just different.

I use algorithms and data structures every day. I rarely calculate the big-O of anything anymore, but it is in the back of my mind.

Today XML is the big thing. 10 years ago when I was in school it wasn't invented yet (it was in the process of being invented, but nobody yet saw that it would be a big thing). 10 years ago Java was the big thing; today it is considered important and useful, but it isn't getting the hype (much of which it didn't live up to). When my dad was in school, his textbooks had a footnote: "When liquid crystals are exposed to electricity they glow - this is an interesting, but useless phenomenon." There is a lot of money to be made as a Cobol programmer, but nobody cares about that in school.

The point is things change. Algorithms and Data structures are universal. Java, or any other language, is not.

Right

Exactly - if it wasn't bad enough that joining a new company fresh out of college involves learning about the business, they also have to learn how to do the job as well! That's huge pressure and nobody is going to perform well like that.

They'll catch up, mostly, eventually, but it'll take a long time because it's all ad-hoc learning and not really directed ... kinda the reason we have schools and universities in the first place is that it's better to have teachers and a proper learning environment than to try and learn everything as you go.

Nonetheless, that's how most industrial programmers learn their craft, because it's the only way on offer ...

Oh, and I'm not sure about your last point. 10 years is a very long time in computing, I don't think it matters that 10 years ago XML didn't exist. It does exist now, and people are expected to understand it ... somebody saying "Well I didn't bother to learn about it because it was only invented 10 years ago" won't be very impressive in an interview.

I'm also growing less convinced of algorithms and data structures being some constant fixed thing all the time. Big-O analysis of common algorithms in particular ignores things like locality, which can have a big impact on performance ... so big, in many common cases, that it's faster to use a simple linear search through an array than, say, a hashtable, despite the theory telling you otherwise. Why? Because the theory is based on an ideal computer that doesn't bear much resemblance to what really exists. Even that changes, with time.

On the topic at the end of the post

I came across this somewhat interesting research area a while back that I believe has not been mentioned on LtU. Cache-Oblivious Algorithms. That is a pointer to a thesis which makes a decent introduction. Google will turn up other papers. Some may find it interesting. One thing to note, however, is that the class of cache-oblivious algorithms is separated complexity-wise from the class of cache aware algorithms in general (i.e. you can sometimes get better asymptotic complexity with a cache-aware algorithm than is possible with a cache-oblivious one).

CS vs CE

I'm also growing less convinced of algorithms and data structures being some constant fixed thing all the time. Big-O analysis of common algorithms in particular ignores things like locality, which can have a big impact on performance
[...snip...]
Why? Because the theory is based on an ideal computer that doesn't bear much resemblance to what really exists. Even that changes, with time.

Actually, the 'theory' isn't based on an ideal computer. The theory only establishes a minimum time unit, and using a simple (for purposes of teaching) evaluation-to-timeunit mapping, estimates execution time symbolically. It doesn't prohibit anybody from using complicated mappings where cache sizes, pipeline depths, etc. are represented, which makes it a CS + CE problem. Yes, the real world is complicated, but the theory still holds.

[..snipped portion...]
... so big, in many common cases, that it's faster to use a simple linear search through an array than, say, a hashtable, despite the theory telling you otherwise.

The theory tells you that as input size grows, there will be a point after which the linear search is guaranteed to be slower than the hashtable lookup. This is definitely the case even if hash table lookups consist purely of cache misses. This threshold input size may be 100GB (i.e. irrelevant for practical purposes), but the theory never told you that it was always 1 byte.
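
A crude, hypothetical sketch of both halves of that claim (not a proper benchmark; there is no JIT warm-up, and the crossover point is entirely machine- and cache-dependent):

    import java.util.HashSet;
    import java.util.Set;

    // Compare a worst-case linear scan over an int[] with a HashSet lookup at
    // several sizes. Small arrays often win on locality; growth favors the hash.
    public class Crossover {
        public static void main(String[] args) {
            for (int n : new int[] {16, 256, 4096, 65536}) {
                int[] array = new int[n];
                Set<Integer> set = new HashSet<>();
                for (int i = 0; i < n; i++) { array[i] = i; set.add(i); }

                int target = n - 1; // worst case for the linear scan
                long t0 = System.nanoTime();
                boolean foundLinear = false;
                for (int v : array) if (v == target) { foundLinear = true; break; }
                long t1 = System.nanoTime();
                boolean foundHash = set.contains(target);
                long t2 = System.nanoTime();

                System.out.printf("n=%-6d linear=%6dns hash=%6dns (%b %b)%n",
                                  n, t1 - t0, t2 - t1, foundLinear, foundHash);
            }
        }
    }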

Exactly - if it wasn't bad en

Exactly - if it wasn't bad enough that joining a new company fresh out of college involves learning about the business, they also have to learn how to do the job as well! That's huge pressure and nobody is going to perform well like that.

No, they only have to learn the technology. That's different from learning how to do the job. It's the difference between a civil engineer learning AutoCAD on the job because they used a different program in university, and a layman having to learn about stress and tensile strength. I now program in C# at my job, even though I had hardly touched either C# or Java in university. It didn't take me very long to learn, either. It took at least as long to learn about the project I would be working on as it did to learn C#, if not longer.

work with a real CS practitioner

if you ever work with a real freaky-good programmer/CS-type person, you will know why Chui's list is pretty silly.

In relation to the mediocre graduates, this discussion makes sense. However, for the top people, fundamental skills are gold and specific technologies are a waste of time.

If you think of how to get the most out of really good people, Joel is dead on the money. I did a really hard degree that enforced the skills required (not CS) and a computing degree that encouraged the skills. I think Joel is lamenting the prevalence of the latter. Of those graduating from my first degree, I would give any job to a high-GPA graduate. Of those graduating from my computing degree - you have no idea what you are getting. Even the best, who should be brilliant, are just good.

On top of that, there are some seriously bad graduates coming out of these degrees, I mean completely useless.

By-products (not biproducts)

IMO, none of the things you mention are things which should be taught in a CS curriculum; they are by and large things which you "pick up" as a by-product of doing programming on your own and for classes. If you don't have enough natural curiosity and motivation to investigate these topics yourself, then you will not make much of a programmer anyway, and not only because in ten years your skills will be worth half of what they once were worth. The point of programming is not the individual technologies, standards and languages which you may or may not have to use daily; those are only (a few of) the trees of a forest of concepts.

Curmudgeon

Starting a rant that consists of "In MY DAY, programmers were REAL MEN and didn't have this fancy-schmancy ..." with a confession to that fact makes it no less of a rant nor does it add any substance or merit to it.

He makes good points, very occasionally. FP gets more than just a nod, it gets a mention of real-world application (you can't get much more familiar to ordinary internet users than Google). But he then proceeds to run completely off the rails by presenting segmentation faults of all things as some sort of eye-opening enlightening experience. I'm really surprised he didn't start on the familiar refrain regarding Assembly (something which EE majors can themselves sneer at as being "too high level").

Then of course he falls into the same old trap "that which I do not use/understand is useless in the real world", by relating Scheme to function currying (huh?) and claiming it has no use in the real world of programming. I simply stopped reading at that point, having no faith that the last couple paragraphs could redeem this rant from the affected ignorance it revels in.

I think

You may have misunderstood what he is saying and, hence, you may be missing his point. Really, how much of the real world (industry) uses function currying? My guess is: probably (much) less than 1%. Most people would simply conclude that that means it is absolutely useless (after all a vast majority of people don't need it).

Joel most certainly was not making the claim that function currying was useless (sometimes people like to use irony to make a point!). His whole point was that the current CS curriculum is way too focused on what industry wants. (Industry wants Java/C# programmers, and it wants them cheap. What better way to make them cheap than to convince the universities to pump them out like a factory? And really, how many CS graduates at a "Java School" have even heard the term "currying"?) That something isn't used much in industry (is "useless in the real world") doesn't mean that it is really useless.
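
As an aside, currying isn't even exotic in Java any more; here is a minimal sketch with modern Java's java.util.function (the names are made up):

    import java.util.function.Function;

    // Currying: a two-argument function expressed as a function returning a function,
    // which makes partial application trivial.
    public class Currying {
        public static void main(String[] args) {
            Function<Integer, Function<Integer, Integer>> add = x -> y -> x + y;

            Function<Integer, Integer> addFive = add.apply(5); // partially applied
            System.out.println(addFive.apply(3));              // 8
            System.out.println(add.apply(2).apply(10));        // 12
        }
    }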

In fact, his point seems to be that the "Java Schools" are pumping out useless programmers, because these programmers don't understand the fundamental concepts (data structures and algorithms) that make a programmer a good programmer. Instead they only know how to make use of the data structures included in the Java language (really now, knowing how to use Java is, in a very naive sense, what is really useful knowledge in the real world; after all, it will get you a job, and will get you paid for several years too, until something else comes along and replaces Java). Since they don't have an intimate understanding of these very same data structures, they don't know how to make use of them beyond their obvious uses. That was his whole point in the article. Maybe you should finish it.

Inheritance

Let's make this a little PL related.

I have never met anyone who can do Scheme, Haskell, and C pointers who can't pick up Java in two days

Don't you think that one has to learn how to use inheritance properly to use Java well?

Inheritance can be quite a dangerous tool, if one doesn't know how to use it. I don't think this is the core of CS, but it should be taught, shouldn't it?

Joel's example is not complete...

... for the sake of brevity, I think. Obviously, if we're being taught theory, it should include inheritance.

However, I still agree with the incomplete example: compared to those three, inheritance is a simpler subject, although it's misused a lot. Weird, eh? Maybe that itself is telling something. I haven't seen as many (maybe any?) misuses of pointers or monads by the people who 'get' them.

Yes

I concur. As a recent graduate who was, not that long ago, learning inheritance for the first time, I find inheritance to be a much simpler concept than, say, the Y combinator and the idea that a recursive function is the fixed point the Y combinator produces when applied to some other (non-recursive) function. Very interesting, but also much more complicated than inheritance. Understanding inheritance is not too difficult. Understanding how to use inheritance, on the other hand, requires much more thought.

...although [inheritance] is misused a lot. Weird, eh? Maybe that itself is telling something. I haven't seen as many (maybe any?) misuses of pointers or monads by the people who 'get' them.

Yes. Maybe this has something to do with the difficulty of applying the concept of inheritance vs. the difficulty of understanding it?

how to use inheritance badly

Here's my very informal characterization of how inheritance can be done badly: it's another case of indirection gone awry by interruption and dilution of intended theorems. When you bother to have a base class and derived class, the intent is a form of indirection that allows you to substitute a specialized derived instance where code will invoke the generalized base interface.

A base class is like the start of a thought, and derived specializations are the completion (assuming you need one or more concrete examples for the base class to even make sense). The more these are spread out, and the deeper the chain of inheritance, the more confusing this gets and the harder it is to see the whole thing. Badly done indirection tends to have too many intermediary nodes in the inheritance tree, which fuzz up the picture.

Anyway, the longer the inheritance chain gets, the longer the indirection involved, and the bigger the window of opportunity for a problem to creep in somewhere in the middle. Basically, chain length can be seen as an informal measure of risk. At first indirection is nice because it adds flexibility, but as you add more you get totally spineless jelly that tends to prevent strong predictions, because you can no longer be sure of anything.

In terms of intended benefit, inheritance is supposed to increase your degrees of freedom by allowing you to substitute variations. But in practice, when a coder is sloppy, it's easy to cause so much coupling between classes and implementations that degrees of freedom get lost again due to constraints of coupling. Suddenly the options are gone, and maybe you can't even get a clear line of expected behavior anymore. You can get all the confusion of indirection without getting the benefits.
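
To ground that in a minimal, hypothetical Java sketch (the class names are invented): the intended benefit is that callers depend only on the base type and a specialization is substituted in; every extra intermediary layer widens the distance between where behavior is promised and where it is actually defined.

    // Intended use of inheritance: code depends on the base type, and a derived
    // specialization is substituted without the calling code changing.
    abstract class Report {
        abstract String body();

        // Written against the base class only.
        final String render() { return "=== Report ===\n" + body(); }
    }

    class SalesReport extends Report {
        @Override String body() { return "Q4 sales: placeholder figures"; }
    }

    class Demo {
        public static void main(String[] args) {
            Report r = new SalesReport();   // substitution through the base type
            System.out.println(r.render());
            // Each extra layer (Report -> RegionalReport -> QuarterlySalesReport -> ...)
            // lengthens the indirection chain the comment above is warning about.
        }
    }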

Programming languages tend to focus on the 'how' of code -- here's how you can specialize this base class -- without clarifying the 'why' of code. Maybe better programming language technologies would help reduce the cognitive cost of long indirections while encouraging the addition of more intended 'why' aspects of code behavior which can be verified. Long indirection chains make empirical analysis difficult. If programming languages improved the empirical examination part (what does this long chain of inheritance mean?) then the effective penalty of too much abstraction might be lessened.

Scattered specialization

This discussion is far too abstract for me, and I'm afraid it produces pseudo-measures of code quality. The problem is "scattered specialization", and I do know a little trick for producing compact procedural code without abandoning the benefits of OO-like specialization. I call it the "tiny object hierarchy" pattern, and I have not yet found it in the literature (but this tells as much about me as it tells about the literature ;). Though it does not solve all the problems of scattered specialization (how could it?), it is quite simple and useful in many cases. I plan to publish a short description together with a Python implementation within the next week (the original implementation is in C++, but it is tedious and the software is owned by my customer).

I couldn't agree with him more

Here is what I wrote to Mr. Spolsky after reading his article.

Dear Mr. Spolsky,

I read with a lot of interest your article about Java and JavaSchools and I couldn't agree with you more. My life story is as follows: I got interested in programming as a kid, studied six years of computer science in college and came out thinking I was going to do all sorts of great stuff. The first project I worked on happened to be in Java (I had worked primarily in C/C++ before that, and Java was new then). Unfortunately, one thing leads to another in this industry, and almost a decade later, I have been branded a "Senior Java Developer", whatever that means. I am totally fed up with Java. More than the language itself, I am fed up with the number of incompetent and mediocre people it forces me to work with. I am not saying this in an arrogant way - the truth is that most of the Java programmers out there simply aren't interested in software design or programming concepts or computer science. Many of them switched over to this field during the "boom time" and learnt Java in a few weeks. Now, for that reason, they are senior Java developers too, and their only skill is that they can write code in Java that compiles.

Even the ones who are mildly interested in software end up being totally captivated by the latest shiny toys in the Java world - JThis and XThat. No one wants to actually develop anything anymore; they just want to assemble things together. Download a few third-party tools (mostly from Apache), add Spring, Hibernate, etc., and wire them all together in XML. Hallelujah! Of course, you have to throw around the names of the design patterns you used. After all, the more patterns you use, the better your program is, right?

A decade after graduation, I realize that my so-called safe choice of programming language and tools has basically made me indistinguishable from anyone else in the industry. I am not sure if I will have the guts to correct my line of work, but I hope I do.

Thanks.

regards,
Sriram Gopalan

Being a victim of an all-Java

Being a victim of an all-Java university CS education, I thought I'd share a couple of my thoughts and feelings.

I was a very good programmer even prior to university. By that time the only languages I had been programming in were roughly half a dozen BASIC dialects, assembler, Pascal, Modula-2, C and C++ - the usual stuff a hobby programmer back then would know I think.

I don't think I enjoyed university. I didn't go to university to learn the AWT API, or Java at all. I could have done so at home. In hindsight, I should have picked the university more carefully, especially since I knew that programming languages would be an important topic for me (I always had an interest in PLs; I already designed small special-purpose languages in my schooldays and wrote compilers and interpreters for them). However, without contact with other people in the field, very limited time on my hands (I was in the army when I had to choose a university) and no internet access back then, I made the mistake of picking the university that made the biggest buzz about its CS department (lots of money, modern equipment) at that time. I should have been scared when, in the introductory course, the profs told us to stay away from the books of a certain Donald E. Knuth, which were all seriously outdated and would not teach us anything of what modern computer science was about.

It's been three years since I graduated. With my interest in PLs, lots of reading, and the blessings of the internet, I have taught myself Scheme, Common Lisp, Haskell and Prolog, learned about the lambda calculus, Curry-Howard isomorphism, type systems - the stuff university forgot to teach me.

I consider it extremely shortsighted to teach a particular language instead of a set of fundamental concepts when the world is moving on constantly. Languages are evolving. Mainstream languages get slowly but steadily more and more features from more "academic" languages. It doesn't hurt to know concepts that you don't need right now but might/will rediscover in your language in the future. Where is this evolution of mainstream programming languages coming from? Obviously not from people whose minds were locked into Java.

My personal feeling at this point is that with all-Java, all-OO teaching I got a rather bad education; that I am angry I now need to sacrifice my valuable spare time learning things that should have been taught; and also that, as I learn more and more of these things that "the industry doesn't need", my horizon broadens significantly, big pictures appear, my overall skills grow no matter what language I'm writing in, and I have learned that there is a ton of stuff I don't know that is worth continuing to learn.

I preach this quite often here at my company. The problem is, nobody really cares, and one person here outright defends all-Java teaching as the ideal education. Just one person here has listened to me. He's now started learning Smalltalk and Lisp in his spare time. At least a very small victory for me, I think.

This is only a very personal, very biased opinion, of course. Most CS students probably just want to learn some fashionable mainstream language that gets them a job where they can code the days away (see the reviews of SICP on Amazon ...). And lots of companies seem to insist on hiring academics only when they really just need some coders to put a GUI together, glue ready-made components together or type in a formula from paper. Perhaps those people/companies would turn towards other institutions that teach and concentrate on the more practical aspects of CS. But then again, I understand that if you've got exclusively MSc-this and PhD-that working on your program, you can better impress your customers.

Agree and Disagree

I agree with most of what Joel says. I think there are far too many Java coders out there, and that many schools are now trying to turn out Java coders rather than programmers. In my experience, despite what the managers may think, the programmers are many times better than the coders, do all the work in the end, and are the only reason the company gets anything done successfully. The coders usually just slow the programmers down.

I do take exception to a couple specific points. First, he says

You may be wondering if teaching object oriented programming (OOP) is a good weed-out substitute for pointers and recursion. The quick answer: no.

He is simply wrong. I went to a college where the intro class was in pure Java and was a weed-out course. The school administrators ended up hiring a new professor and basically firing the old one because the class was weeding out too many people! My professor taught pure OOP in Java the way others teach functional programming. He set up the class so that you would fail unless you understood OOP like the back of your hand, in a deep-down-in-your-soul kind of way. He wouldn't accept resorting to imperative solutions. What is true, and what Joel should have said, is that the half-hearted, watered-down way OOP is usually taught is not a substitute for pointers and recursion. Furthermore, it seems the advantage of pointers and recursion is that they can't be taught half-heartedly or watered down.
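
To illustrate the distinction with a hypothetical example of my own (not material from that course): here is the same computation written in the "check the type and branch" style such a professor would reject, and in the polymorphic style he would demand.

    interface Shape {
        double area();
    }

    final class Circle implements Shape {
        final double radius;
        Circle(double radius) { this.radius = radius; }
        public double area() { return Math.PI * radius * radius; }
    }

    final class Square implements Shape {
        final double side;
        Square(double side) { this.side = side; }
        public double area() { return side * side; }
    }

    public class WeedOut {
        // The imperative habit: inspect the concrete type by hand and branch.
        static double areaByBranching(Shape s) {
            if (s instanceof Circle) {
                Circle c = (Circle) s;
                return Math.PI * c.radius * c.radius;
            } else if (s instanceof Square) {
                Square sq = (Square) s;
                return sq.side * sq.side;
            }
            throw new IllegalArgumentException("unknown shape");
        }

        // The OO habit: each object already knows the answer; no branching.
        static double areaByDispatch(Shape s) {
            return s.area();
        }

        public static void main(String[] args) {
            Shape[] shapes = { new Circle(1.0), new Square(2.0) };
            for (Shape s : shapes) {
                System.out.println(areaByBranching(s) + " == " + areaByDispatch(s));
            }
        }
    }

Adding a Triangle means touching every branching method in the first style, but only adding one class in the second - which is roughly the kind of understanding such a course tries to force.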

The other thing I have to disagree with is when he says:

I have never met anyone who can do Scheme, Haskell, and C pointers who can't pick up Java in two days, and create better Java code than people with five years of experience in Java...

Although that experience prepares one to write better Java code than people with five years of experience in Java (except for those who already have both), it does not mean you can become better in a short period of time. I see people spend years actually getting OOP, even when they have a good computer science background. Indeed, sometimes they are worse off because of that background: they either fail to move from imperative or functional styles to a truly OO style, or they never get past the belief that these are all just different ways of saying the same thing, and so never get the advantages OOP can offer for some problems.

Well, it seems people are ar

Well, it seems to me that people are arguing over nothing. The real skill is being able to learn or pick up anything, even the business processes of a company! The problem is that most programmers do not have a grasp of what is important. If your employer requires that you code in Java, then by all means do so! If it is Lisp tomorrow, you should not have a problem with that either. Programming is a concept: if you understand it, it hardly matters whether the style is imperative, functional, or OO. And I have not heard anyone talk about management skills. Why do people feel that you are a better programmer just because you can code Scheme? I picked up Scheme in a day. We should stop advocating programming languages and advocate concepts instead. How many programmers can design a large app from scratch, produce great documentation, prepare and give a presentation to management to justify a project, and manage customer expectations throughout the whole software development life cycle? We are talking about skills for surviving in the real world, in any job!

The problem is the word language

We need to eliminate the phrase "programming language" from our vocabulary. Strike it out completely. When everyone here goes to learn to program in "Henry" (a mythical programming "language" I just made up, named after me of course), do not tell anyone you learned a new language.

As soon as people hear about a programming language, they think about foreign languages. Most people in the US have taken a couple of years of a foreign language (Spanish, French, and German are nearly universally offered in high schools), and they cannot speak it at all. Other countries make their students learn a language (often English); they spend years being taught it to gain any fluency, and even then many people just get by.

They assume, therefore, that because I work in 5 "languages" daily and have worked in more than a dozen others over my lifetime, it is equivalent to learning 15 foreign languages - truly an impressive feat.

Unfortunately, those who hire often know nothing about programming (nor should they - I expect HR to know more about benefits than programming), so they relate to what they know.

Thus we have job advertisements that require 15 years of Java and 10 years of C#. They wouldn't consider a Spanish interpreter who had less than 5 years' experience in the language.

Dear HR

Unfortunately, those who hire often know nothing about programming (nor should they - I expect HR to know more about benefits than programming), so they relate to what they know.
Nobody is happy with HR, but that's a bit unfair. HR doesn't make you hire those people; after all, the hiring decision belongs to the hiring manager, with the help of technical interviews. Now, HR probably doesn't suggest the sort of people you need, but that's another problem.

Regarding the "must-have-known-Jesus-for-3000-years" kind of minimum requirements, they did exist once (during the boom), but having looked for a job earlier this year I can tell that you don't see them anymore.

Are there companies where

this isn't true... where HR takes the lead in selecting and approving candidates? Where I work, hiring managers get the final say; HR is there to support the hiring manager (by dealing with benefits, paperwork, background checks, and such). HR may have veto power over candidates obviously unsuitable for non-technical reasons (say, a past conviction for embezzling)--but doesn't participate in evaluation of technical merit.

Similarly, the skill sets listed in job postings are generated by the hiring manager(s), not HR.

Of course, how things work at my employer may have little to do with how things work at other companies--especially big, old corporations with lots of institutional inertia (which may cause them to disregard what is considered best practice these days).

Okay, so it isn't HR

Yeah, so it isn't HR that is making the decisions. Some of my best managers have had no programming skills, and they knew it. Some of the worst ones did. The job of my boss is to assign projects and priorities (perhaps with help from other leaders) and to handle other management tasks. His job is not to be an expert coder, and expertise in coding will not help in his day-to-day activities (other than in hiring, and in knowing whether I'm BSing him).

So the hiring manager suddenly needs to learn a lot about a complex discipline, and worse, it is one that demands a thinking style that otherwise does not match his own. So when he gets someone who has 15 years of the last buzzword (Java), 5 of the current one (C#), and interviews well, he hires that person.

I haven't seen nearly as much of the "must-have-known-Jesus-for-3000-years" kind of minimum requirements lately, but they still pop up. Last time it was "expert with the Linux kernel source", which describes maybe 1000 people in the world, most of whom are paid to hack the kernel full time, while this job was part-time kernel hacking and part-time other work. I'm sure there are other examples, though they rise and fall as the job market tightens.

Now, there are good hiring managers who are also good coders. However, there are also a lot of good managers of coders who have to hire coders. (Sometimes you can get your coders to help interview, but this is only useful if you have good coders who can interview.)

It's hard. I see the problem. I do not see a solution.

Head first

So when he gets someone who has 15 years of the last buzzword (Java), 5 of the current one (C#), and interviews well, he hires that person.
I don't know why such a person would act completely alone in this. Good managers without adequate technical skills should know better and rely on their senior programmers for technical assessments.

Regarding the Jesus3k job postings, some of them are intentionally that tight on requirements. The objective can be grabbing a direct competitor's senior coder, or (in the US) a green card application for an immigrant, i.e. they don't intend to hire anybody in place of the temp worker, so they don't want to deal with applicants. (Disclaimer: this is not always the case. Yes, I'm such a temp worker. Waay off-topic.)

What if it is the FIRST programmer?

Say said manager is an expert at marketing and sees that, because of some new law, there will be a need for a new program in a few months. Said manager quits to start a new company. The (seemingly) good programmers he knows from his current job are not willing to take on the risk of a start-up (which may mean putting in some of your own money and/or working without pay until the investors catch on to the opportunity).

I know programmers who worked 6 months without pay to get a company off the ground. They are not good programmers (they do fine with small scripts, but don't know how to design a large project), but their situation was such that they were willing to work under those terms. Who will tell the boss that this candidate isn't good?

What if it's not meant to be?

Said manager quits to start a new company.
I've lost track of this conversation somewhat. If a person ventures into a brand new field where they lack knowledge or consultants, then yes, they will make mistakes. That's why such attempts are called "high risk."

Sadly, these mistakes are more often due to ignorance than to a lack of consultants.

We need to eliminate the phas

We need to eliminate the phrase "programming language" from our vocabulary. Strike it out completely.

I kinda like the term "programming language" ;-)

Think what would happen to our tag line without it...

The Weblog

Why, it would simply become "The Weblog", which ascribes us even more importance! The one and only...

It seems people still don't get the point!

The whole point of any undergraduate program, in any discipline, is to teach you how to learn. It does not translate into instant success in the real world. An undergraduate education in computer science should furnish you with the skills necessary to survive in any IT-related environment, and this has nothing to do with Java, Smalltalk, OOP, C/C++, ML, Lisp or Scheme. I have met people with all these so-called programming skills who can't get a job simply because they can't communicate properly. A lot of the problem is that many of the so-called smart programmers have terrible oral and writing skills, so they don't even make it through the door, thanks to the wrong idea that "super coder" skills are enough. I agree with Joel that undergraduate computer science is deficient, but it has nothing to do with the teaching of Java. I think it has a lot more to do with the fact that universities are closed environments.

I think a good technique is w

I think a good technique is when CS fundamentals are the emphasis and industry-related tools and languages are thrown in on the side.

For example, I took a compilers course where one of the assignments was to write a parser for (a subset of) XML. The point of the assignment was to learn how to write a recursive descent parser by hand, but we also ended up learning XML as a side effect.
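
For readers who have never written one, here is a minimal sketch of what such a hand-written recursive descent parser can look like, in Java, for a tiny XML-like subset (nested elements and plain text only; no attributes, comments, or entities). The names and grammar here are my own illustration, not the actual assignment.

    import java.util.ArrayList;
    import java.util.List;

    public class TinyXmlParser {
        private final String input;
        private int pos = 0;

        TinyXmlParser(String input) { this.input = input; }

        // element ::= '<' NAME '>' content '</' NAME '>'
        Element parseElement() {
            expect('<');
            String name = parseName();
            expect('>');
            Element element = new Element(name);
            parseContent(element);
            expect('<'); expect('/');
            String closing = parseName();
            if (!closing.equals(name)) {
                throw new IllegalStateException("mismatched closing tag: " + closing);
            }
            expect('>');
            return element;
        }

        // content ::= (element | TEXT)*
        private void parseContent(Element parent) {
            while (pos < input.length()) {
                if (peek() == '<') {
                    if (peek(1) == '/') return;          // closing tag: caller handles it
                    parent.children.add(parseElement()); // nested element
                } else {
                    parent.text.append(parseText());     // plain character data
                }
            }
        }

        private String parseName() {
            int start = pos;
            while (pos < input.length() && Character.isLetterOrDigit(peek())) pos++;
            return input.substring(start, pos);
        }

        private String parseText() {
            int start = pos;
            while (pos < input.length() && peek() != '<') pos++;
            return input.substring(start, pos);
        }

        private char peek() { return input.charAt(pos); }
        private char peek(int ahead) { return input.charAt(pos + ahead); }

        private void expect(char c) {
            if (pos >= input.length() || input.charAt(pos) != c) {
                throw new IllegalStateException("expected '" + c + "' at position " + pos);
            }
            pos++;
        }

        static class Element {
            final String name;
            final List<Element> children = new ArrayList<>();
            final StringBuilder text = new StringBuilder();
            Element(String name) { this.name = name; }
        }

        public static void main(String[] args) {
            String doc = "<note><to>CS students</to><body>learn the concepts</body></note>";
            Element root = new TinyXmlParser(doc).parseElement();
            System.out.println(root.name + " has " + root.children.size() + " children");
        }
    }

The pedagogical payoff is that each grammar rule becomes one method that calls the methods for the rules it refers to, so the structure of the code mirrors the structure of the grammar.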

Another example is a course I took on Human-Computer Interaction (i.e. interface design). All of our assignments were written in VB, but the prof never talked about VB in class; she only talked about the core concepts. We were expected to learn VB on our own in order to get the assignments done.

I have to say that I feel I have a pretty well-rounded education.