## Dijkstra on analogies and anthropomorphism

In connection with the recent discussion concerning how people think about programming, I thought it might be worthwhile to revisit some of E. Dijkstra's writings on the subject.

> I think anthropomorphism is worst of all. I have now seen programs "trying to do things", "wanting to do things", "believing things to be true", "knowing things" etc. Don't be so naive as to believe that this use of language is harmless. It invites the programmer to identify himself with the execution of the program and almost forces upon him the use of operational semantics.
and:
> And now we have the fad of making all sorts of systems and their components "intelligent" or "smart". It often boils down to designing a woolly man-machine interface that makes the machine as unlike a computer as possible: the computer's greatest strength—the efficient embodiment of a formal system—is disguised at great cost.
and:
> Another analogy that did not work was pushed when the term "software engineering" was invented. The competent programmer's output can be viewed as that of an engineer: a non-trivial reliable mechanism but there the analogy stops...
>
> It is the most common way of trying to cope with novelty: by means of metaphors and analogies we try to link the new to the old, the novel to the familiar... Coping with radical novelty requires an orthogonal method... one has to approach the radical novelty with a blank mind, consciously refusing to try to link it with what is already familiar, because the familiar is hopelessly inadequate... Coming to grips with a radical novelty amounts to creating and learning a new foreign tongue that can not be translated into one's mother tongue. (Any one who has learned quantum mechanics knows what I am talking about.)
and:
> A number of these phenomena have been bundled under the name "Software Engineering". As economics is known as "The Miserable Science", software engineering should be known as "The Doomed Discipline", doomed because it cannot even approach its goal since its goal is self-contradictory. Software engineering, of course, presents itself as another worthy cause, but that is eyewash [sic]: if you carefully read its literature and analyse what its devotees actually do, you will discover that software engineering has accepted as its charter "How to program if you cannot".
and:
> We could, for instance, begin with cleaning up our language by no longer calling a bug a bug but by calling it an error. It is much more honest because it squarely puts the blame where it belongs, viz. with the programmer who made the error. The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking is intellectually dishonest as it disguises that the error is the programmer's own creation... My next linguistical suggestion is more rigorous. It is to fight the "if-this-guy-wants-to-talk-to-that-guy" syndrome: never refer to parts of programs or pieces of equipment in anthropomorphic terminology...

and finally there is the now-classic example (a domino-tiling problem) of why "operational reasoning is a tremendous waste of mental effort."
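
If memory serves, the domino problem in question is the mutilated checkerboard: remove two opposite corners of an 8×8 board and ask whether thirty-one 2×1 dominoes can tile the rest. Operational reasoning would enumerate placements; the non-operational argument just counts colors. A small sketch of that counting argument (the function name is mine):

```python
# EWD's example (paraphrased): can an 8x8 board with two opposite
# corners removed be tiled by 2x1 dominoes?  Operational reasoning
# enumerates tilings; the non-operational argument just counts colors.

def color_counts(removed):
    """Count black and white squares on an 8x8 board, skipping `removed`."""
    black = white = 0
    for r in range(8):
        for c in range(8):
            if (r, c) in removed:
                continue
            if (r + c) % 2 == 0:
                black += 1
            else:
                white += 1
    return black, white

# Remove two opposite corners -- they are the same color.
black, white = color_counts({(0, 0), (7, 7)})
print(black, white)  # 30 black vs 32 white

# Every domino covers one black and one white square, so a tileable
# region must have equal counts.  Unequal counts: no tiling exists.
assert black != white
```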

## Comment viewing options

### This sowng is not a rebel sowng.

However you feel about Dijkstra, his writing is at least thought-provoking. His opinions about anthropomorphism here probably explain why he wrote, "Object-oriented programming is an exceptionally bad idea which could only have originated in California."

In short, he felt that anthropomorphism hinders rather than helps the practice of programming: the advent of computers, which motivated the founding of the field of computing science (not computer science!), presented such a radical novelty that trying to understand programming in terms of old ideas is hopeless.

This is assuredly not to say that he thought the science of programming should be approached in an ad hoc manner; rather, he thought a new foundation had to be developed for it. Dijkstra was convinced that programming is an "unescapably" [sic] mathematical activity, but that the scale of the problem is so much larger than what we are accustomed to in ordinary mathematics that the minor sins one regularly commits there are magnified into capital sins.

Indeed, he exhibits a palpable disdain for attempts to "solve" the problems of programming merely by inventing buzzwords and terminology, which he regards as a symptom of marketing rather than insight.

The interested reader can find many, many, many other notes from him at the E.W. Dijkstra Archive.

### If I could, I would... let it go

I agree 100% that Dijkstra is well worth reading, and he is without a doubt a giant in Computer Science, but he was also a grumpy old fart.
(That's partly why I like him; I identify with him. ;-) )

I think it is good to be skeptical about the endless stream of "silver bullets" that ricochet around the programming world, but on the other hand, most people want to solve problems with computers, and most of these problems are not particularly mathematical.

You don't need a math degree to shunt email around or puke out a web page (though both of these have their own difficulties).

Having said that, I do think knowledge of theory and math, added to other skills, makes a better programmer.

All this to say, let's not let this petition to authority side-track our quest for principled "human factors". ;-)

### Don't Forget the Handwriting

And let's not forget that he had incredibly *cool* handwriting! Someone even turned it into a font, though I don't remember where it can be obtained.

### Font

Google to the rescue: http://www.luca.demon.co.uk/Fonts.htm. Yes, that's by the Luca Cardelli.

### Font

Yippee! This is the last missing piece I need for my evil secret plan of an April Fool hoax faking an EWD.

### Take-chan-man desu!

> All this to say, let's not let this petition to authority side-track our quest for principled "human factors". ;-)

It was not intended as a petition to authority; it was intended to provoke discussion.

If you look carefully at what I wrote, you will see that I was rather careful to write things like "Dijkstra felt..." and "Dijkstra believed..." And though it will come as no surprise to you that I share many of Dijkstra's opinions (I do not deny it), I don't share all of them (in particular, I don't think computers are such a "radical novelty"—at least, I don't think the characterization is so useful), and I think his reasoning is sometimes specious, overly alarmist and too loaded with value judgements. Nevertheless, I think he usually ends up being right even when his reasons are dubious.

> You don't need a math degree to shunt email around or puke out a web page (though both of these have their own difficulties).

That is entirely beside the point, and you know it.

> I agree 100% that Dijkstra is well worth reading, and he is without a doubt a giant in Computer Science, but he was also a grumpy old fart.

Agreed. But I think we desperately need people like that. He was not afraid to say unpopular things, and he was brilliant enough that people took note of what he said. Even if some of his claims are not scientifically supportable, I think they are valuable in the sense that they set a scientific agenda.

This is something I've discerned in the writings of a number of brilliant people. I won't mention names (because they're still alive :) but there are a couple of well-known researchers who tend to write papers loaded with value judgements and derision (but also exceptional technical work), and who are well known for being obnoxious nitpickers at talks and conferences. These people can be annoying because they sometimes seem out of touch, and often it is not easy to see why they object to certain techniques or arguments; I have the feeling that they might not even be able to adequately explain it themselves. But they are also very experienced, and I think their strong opinions are often based on lessons they have learned but cannot quite articulate.

To do good scientific work, you need to be a good technician. But you also need to pick a good direction for your work. Hamming once said (roughly) that the only way to do important work is to work on important problems, yet most people don't ("You and Your Research"). And figuring out which problems are important and which are not is very difficult.

So, I think that divining the attitudes and values of these brilliant, though perhaps difficult, people, and trying to explain to yourself (more rigorously) why they might feel that way, is a good way to see into the future.

### Furi Kuri

> But I think we desperately need people like that. He was not afraid to say unpopular things

No disagreement here. ;-)

> That is entirely beside the point, and you know it.

Not sure about that. If we take your electrical engineer/electrician analogy, it wouldn't come as a surprise that each had his own tools and mental models that were appropriate to his job.

Furthermore, it wouldn't surprise us if the EE weren't able to do the electrician's job at all (or vice versa), unless he had learned those specific tools and skills as well.

So why does it surprise us that computer scientists / industrial programmers might have different PL needs?

I think the thing that really cheesed Dijkstra off is that people weren't making the distinction between the different types of endeavour.

### As an Electrical Engineer I must comment...

Actually, having gotten my BS in EE, I have to admit to botching several design projects because of a lack of technician skills. And in the process of independently developing those skills I discovered something: formal models are judged by how closely they match reality. The difference between theory and practice (EE and electrician) should be constantly decreasing, because theories get dumped when their predictive and explanatory powers are surpassed by newer theories (what this means for CSC is not a simple discussion, but nonetheless).

I find that being a good "technician" gives me insight to understand the theory; in fact, I often find the little corners of theory that many "theorists" overlook. When an equation has no solution it's not a failing of the theory; on the contrary, it usually means you physically blew something up. (As an exercise, charge up a large capacitor and then short it. Circuit theory says it will provide infinite current; personal experience tells me that while it's not infinite, it will sure feel like it, especially if it blows up on you. :)
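
In symbols, the little corner of theory being poked at looks roughly like this (the initial voltage $V_0$ and the stray series resistance $R$ are my own labels, not part of the original comment):

```latex
% Ideal capacitor law: a perfect short forces v to fall to 0 instantly,
% so dv/dt is unbounded and the predicted current is infinite:
i(t) = C \frac{dv}{dt} \quad\Longrightarrow\quad |i| \to \infty
  \text{ as the voltage step becomes instantaneous.}
% In reality, wire and internal (ESR) resistance R limit the discharge
% to a large but finite exponential:
i(t) = \frac{V_0}{R}\, e^{-t/RC}, \qquad i_{\max} = \frac{V_0}{R}.
```

The "infinite current" is thus an artifact of idealizing $R = 0$, which is exactly the kind of boundary case a good technician learns to respect.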

### theory and practice

Why not another authoritative quote. :)

> "The best theory is inspired by practice.
> The best practice is inspired by theory."
> --- Knuth, "Theory and Practice" talk.

I have one foot in EE. From my understanding, it is fair to say that in engineering it is not only the case that theories are developed to account for practice, but also that practices are developed to exploit theory.

Programmers (and ordinary, shallow people) tend to stereotype theory as a chore rather than a resource. That is what went wrong with programmers. In the present state of affairs, they place too much emphasis on the first half of Knuth's quote and do too little of the second half.

### Oretachi Hyokinzoku

For what it's worth, I personally feel that many of Dijkstra's views will be, and must be, regarded as the hopelessly outdated opinions of a mathematically inclined man on a field which was then in its infancy, which is still far from mature, and which is probably completely misunderstood ;)

For one, from his statements on engineering I assume he probably thinks too much of engineering as an application of math and science (the popular academic definition) instead of a purposeful collaborative effort of a highly diverse collection of individuals. This is in strong contrast with my own view (after having supervised various small projects) that it is easier to develop a small deeply embedded real-time system than a small successful multimedia web application, as the first is "more" mathematical and, therefore, better understood.

Science ~= Engineering ~= Programming ~= Applied Math

As far as the "completely misunderstood" goes: I find it remarkable that it took a long time for academia to actually understand why and what OO was about, and why it became such a great hit in industry. I doubt that the software engineering perspective on classes as reinstantiable, extendable module declarations is more responsible for the popularity of OO than the "anthropomorphic" view of a program as a collection of interacting agents.

Maybe the attributed resentment of Dijkstra with regard to OO is nothing more than the discontent of a "computing scientist" who did not recognize in time that anthropomorphism is good enough for most programmers (implementation efforts) and that the role of math therein is overrated?

(Hey, don't shoot me, I actually like ML and Haskell more than Java or C++ ;-)

### Komanechi!

> For what it's worth, I personally feel that many of Dijkstra's views will be, and must be, regarded as the hopelessly outdated opinions of a mathematically inclined man on a field which was then in its infancy, which is still far from mature, and which is probably completely misunderstood ;)

Funny. To me, most of what I hear from programmers and read in trade journals also sounds hopelessly outdated. :)

> This is in strong contrast with my own view (after having supervised various small projects) that it is easier to develop a small deeply embedded real-time system than a small successful multimedia web application, as the first is "more" mathematical and, therefore, better understood.

Which is probably why he would urge us to understand your web application mathematically as well.

> As far as the "completely misunderstood" goes: I find it remarkable that it took a long time for academia to actually understand why and what OO was about, and why it became such a great hit in industry.

Dijkstra would have pointed out that he did not view computing science as a service industry: that the purpose of a college is not to churn out programmers to feed the fickle demands of industry but rather to produce graduates who have enough general knowledge to be able to master any technology they are likely to encounter in the remainder of their lives (and also graduates who can change and lead, rather than blindly follow, the industry). (There is an EWD note on this very subject, but I couldn't find it.) Moreover, to him, "computing science" was not "computer science", and analysis of fads in industrial practice is not its proper subject.

The last also connects with what Matt wrote above: "Formal models are judged by how closely they match reality." For Dijkstra, I think, a formal model in computing science cannot be wrong in any sense, since it is a mathematical statement. The application of it could be wrong, certainly, but that is "computer science" and not "computing science". Computing science, the mathematics of computation, is a broader subject.

> I doubt that the software engineering perspective on classes as reinstantiable, extendable module declarations is more responsible for the popularity of OO than the "anthropomorphic" view on a program as a collection of interacting agents.

Ah, the myth of OO extensibility... I will not treat it here. :)

> Maybe the attributed resentment of Dijkstra with regard to OO is nothing more than the discontent of a "computing scientist" who did not recognize in time that anthropomorphism is good enough for most programmers (implementation efforts) and that the role of math therein is overrated?

Dijkstra does not strike me as a man who considered any of the products of industry as "good enough".

> [Marc wrote:] Furthermore, it wouldn't surprise us if the EE weren't able to do the electrician's job at all (or vice versa), unless he had learned those specific tools and skills as well.

Well, maybe I'm wrong, but my impression of the difference between an EE and an electrician is that, though an EE may not know the details necessary for doing an electrician's job, he can learn them quickly enough as they are just particular instances of what he already understands in the abstract. In the same way I am confident I can master .NET, even though I have never programmed with it.

The point is that if you understand something from a theoretical viewpoint, you are equipped to understand its concrete instances.

### Read or die!

> The point is that if you understand something from a theoretical viewpoint, you are equipped to understand its concrete instances.

A piece of advice for aspiring industry programmers reading the above: don't use this line in an interview, it could cost you the job. ;-)

Of course, I agree that having a solid theoretical grounding in something can ease practical learning, and make you more effective once you've learned it.

But good luck to the EE who tries to rewire his house on his own because he thinks his theoretical grounding is strong enough. ;-)

The same is true with programming. Sure, learning a new PL is easier if you understand the theoretical properties of PLs generally, but there is a whole lot of stuff that is not rational or consistent that you just have to learn the hard way, and you can't be said to be fluent in that PL until you've mastered it.

Some examples are syntactic weirdnesses, idiomatic patterns in the language, the standard libraries, language specific jargon, and popular frameworks.

Then there are the particular domains that you must understand to be effective in an industry job: interfacing with databases (with their vendor specific weirdnesses), dealing with vagaries and inconsistencies of business data, the trials and tribulations of web development, etc.

So sure, if someone is hiring for a junior position, a good theoretician with no practical experience could probably learn faster than a total newbie, but for a senior position, you need to see experience, with theory or not.

I'm sure this is equally true for electricians. ;-)

### Masque of the Read Death

You and Dominic are both being rather uncharitable. I wrote, "if you understand something from a theoretical viewpoint, you are equipped to understand its concrete instances," meaning—since I have to spell it out—that you understand the essence of the matter, and so need only concentrate on the details. Granted, it may take time, but it is only that—a matter of time. And you can expect that time to be shorter than for someone who lacks an abstract understanding of the subject.

It would of course be absurd of me to claim that, just because I know something about programming languages, I am fluent in .NET. Do you really think I would say something like that?

But I am pretty sure that, if I were to sit down and try to learn .NET, I would not discover anything new or challenging; it would be mostly rote learning, and that, to my mind, is much simpler than trying to grasp a new idea. Moreover, I know it is within my capabilities and so just a matter of time before I master it, whereas I cannot say the same about trying to learn, say, quantum gravity or descent theory.

Having said that, I'm sure you're right that .NET is a mess. It is probably the case that academic languages are arranged so that, once you get the theory, the rest can be quickly acquired, whereas conventional languages are not so well-factored. But that, to me, is just a symptom of poor design; and why should we be surprised that a poorly designed artifact takes more time to understand than a well-designed one?

> [Dominic wrote:] what I have of these things is my rod and my staff and my ever-present help in trouble. But the trouble itself is not mathematical in nature: it is empirical and fallen and he that touches it shall be defiled therewith.

Er... huh? :) Am I missing an allusion?

### Shadow over Innsmouth

> Do you really think I would say something like that?

In the extreme version you present back, of course not.
In fact, I'm pretty confident that you personally could learn proficiency in one of the commercial languages pretty quickly, in the way you describe. I just think it would be too painful for you to be worthwhile. ;-)

> Granted, it may take time, but it is only that—a matter of time.

Aye, there's the rub. Unfortunately, time is a primary limiting factor for all human endeavours. Our posited master electrician could say that because of his enormous practical knowledge, with a bit of time he could master theory. And this might be true, but it minimizes the real difficulties that he may encounter along the way.

> It is probably the case that academic languages are arranged so that, once you get the theory, the rest can be quickly acquired, whereas conventional languages are not so well-factored.

Now this is the part that really interests me. What property of these languages provides this benefit? Is there some way we could formalize it in a PL design to encourage or require it? What perspective or mental skill could be fostered in new programmers so that they benefit the most from this property and use it to organize their own work?

This is the basis for the discussion of the worthwhile human factors of PL use: how can a PL be designed and organized so as to alleviate the limitations on human memory and time.

As a first pass, I would suggest three elements:

- a small, simple core language
- a standard library with some mnemonic organizing principle
- simple mechanisms to modularize and organize new code written in the language, probably the same as those that organize the standard library

These points still leave a lot of questions unanswered. What does a simple core language look like? What are its essential elements? What are the mnemonic principles for code and libraries? (This might be one place where the psychologists actually may be able to help us.)

I should acknowledge that some of these ideas are stimulated by the fact that I'm currently reading Van Roy and Haridi and finding it very interesting.

### You sure, must be trolling... I hope

> It is probably the case that academic languages are arranged so that, once you get the theory, the rest can be quickly acquired, whereas conventional languages are not so well-factored.

As far as industry exists, industry always knows what it is doing - they are the experts - they have to be.

Let's build an embedded application for an Atmel 8bit processor: I'll take C, you please pick your language.

Let's build a massive responsive web system: I'll take Java or C# or Perl or Python (ok the latter one arguably is an academic language - but well-factored??). Please now you pick.

Let's build a complex multi-user real-time strategy game. Do I need to go on?

Where are the serious haskell, ml, clean, ... applications? The only academic language which is marginally used in industry I guess is Lisp - and I personally hope that dies off soon.

### I do hope you're trolling ;-)

> As far as industry exists, industry always knows what it is doing - they are the experts - they have to be.

People in the industry don't need to be experts, not even in their primary fields. They just need to be better and/or bigger than the competition. I work in consulting and I usually say (only half-joking) that my customers pay me to say what is wrong with their business processes so they can better continue doing it the wrong way.

> Let's build an embedded application for an Atmel 8bit processor: I'll take C, you please pick your language.

Forth*.

> Let's build a massive responsive web system: I'll take Java or C# or Perl or Python (ok the latter one arguably is an academic language - but well-factored??). Please now you pick.

Erlang. PLT Scheme.

> Let's build a complex multi-user real-time strategy game. Do I need to go on?

Erlang again.

> Where are the serious haskell, ml, clean, ... applications?

What do you mean by serious? There isn't much propaganda about applications using such languages, but AFAIK there are companies using them and getting good results (e.g. Galois Connections, Aetion). Of course there are many more companies using C/Java/VB/whatever than FPLs, but that says nothing about the quality of such languages or how effective they are.

* One could hardly argue that Forth is conventional or poorly factored.

### Thank you for your manuscript; I'll waste no time reading it.

Your argument style leaves something to be desired, but because I enjoy a good romp, I will rise to your bait.

> As far as industry exists, industry always knows what it is doing - they are the experts - they have to be.

Well, it is reassuring to be told that industry is infallible. I had not realized it was axiomatic. I guess Windows Update is just there to give users something to do with their spare cycles then?

> Where are the serious haskell, ml, clean, ... applications? The only academic language which is marginally used in industry I guess is Lisp - and I personally hope that dies off soon.

While there are surely respects in which Haskell, ML, etc. do not address all the needs of industry users, pointing to their lack of popularity in industry is no proof of their essential "unsuitability" for industrial applications. I could point out numerous ways in which your favorite languages (C, Java and Perl, it seems) are also "unsuitable" for industrial applications, and in their cases they really are essential issues; that you have not observed this, I suspect, is symptomatic of tunnel vision and inadequate education.

Philip Wadler wrote an article, "Why no one uses functional languages", for SIGPLAN Notices, arguing that FP languages are seldom used mostly because their implementations (as opposed to the languages themselves) lack some features. In a follow-up article, "An angry half dozen", he listed some applications of FP languages in industry. You can read both here and make of them what you will.

In your other post, you wrote:

> First, I am all for students learning as much of computing science as possible; and algorithms and combinatorics are a personal great love and hobby. Don't get me wrong on that topic.
>
> Second, my point exactly is that I am sure you can master .Net in a short amount of time but that doesn't help a lot to get stuff done. It is only technique, not engineering.... My point is not that computing science doesn't have its use, but that it is the math part which in practice often is the easy part.

I see. How many of your programs have you proved correct?

I don't know exactly what "math" in the context of programming means to you, but judging by your use of the words "algorithms and combinatorics" I would imagine it is very limited in scope. Perhaps you recognize computing the time-complexity of an algorithm as math, but when it comes to organizing and structuring your code, you regard it as an engineering problem (where, I gather, "math" and "engineering" are disjoint fields for you).

But I think you do not see the forest for the trees. In my field at least, we treat modularity issues, reuse, abstraction, code organization, code transformation, optimization, refactoring, specification, implementation and correctness mathematically. (This was what Dijkstra advocated: treating the program as an object in the mathematical domain, and consequently subject to formal methods and reasoning.)

That does not cover all the tasks one encounters in engineering, but I think it is enough to challenge the idea that math is the easy part.

> Consider the following thought experiment: what would have happened if Dijkstra had been the leader of the MS-Word project? What would the result be like: nothing, vi, Emacs, or something better than MS-Word? I wouldn't know. (Or rather, I think I know ;-)

Dijkstra wrote the software for the ARRA II, FERTA, ARMAC and X1 computers in the 50's. In the 60's he developed the THE Multiprogramming System (inventing the notion of "semaphore" in the process), a system that was so reliable that "no system malfunction ever gave rise to a spurious call for hardware maintenance" [EWD1243a]. At about the same time, IBM developed its OS/360, an operating system project so notorious for its unreliability, cost overruns, and time span (10 years) that it inspired its manager Fred Brooks to write a book about it.
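
For readers who know semaphores only as an API, a minimal sketch of Dijkstra's idea using Python's stdlib `threading.Semaphore` (the worker setup and counters are my own illustration): the P (acquire) operation waits for a free slot, V (release) returns it, so the semaphore bounds how many threads can occupy a critical region at once.

```python
import threading

# Dijkstra's semaphore: P (acquire) waits for a free slot, V (release)
# returns it.  Here it bounds how many threads are "inside" at once.
sem = threading.Semaphore(2)      # at most 2 threads inside at a time
lock = threading.Lock()           # protects the two counters below
inside = 0
max_inside = 0

def worker():
    global inside, max_inside
    sem.acquire()                 # Dijkstra's P operation
    with lock:
        inside += 1
        max_inside = max(max_inside, inside)
    with lock:
        inside -= 1
    sem.release()                 # Dijkstra's V operation

threads = [threading.Thread(target=worker) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(max_inside)  # never exceeds 2
```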

Dijkstra also helped design Algol, which is the basis for all your favorite languages, and he is in large part responsible for structured programming, which is probably so second-nature to you that you don't even realize it. He basically coined the words "vector", "stack" and "semaphore" in a computing context, and was one of the first people to call their profession "computer programmer". He introduced the Dining Philosophers Problem, and invented his famous shortest-path algorithm and the notions of guarded commands and predicate transformers. He won a Turing Award from the ACM, and two awards from the IEEE.
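
For the uninitiated, the shortest-path algorithm mentioned above fits in a dozen lines. A sketch using a binary heap (the example graph is made up; Dijkstra's original formulation predates this priority-queue variant):

```python
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbor, weight), ...]} with non-negative
    weights; returns {node: shortest distance from source}."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue              # stale heap entry; skip it
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd      # found a shorter route to v
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```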

So, do I think Dijkstra could have led a large software project? It seems to me that he was able to accomplish pretty much everything he put his mind to, including completing numerous large software projects.

For you to suggest that Dijkstra was an incapable practitioner is not only wrong, it is egregiously wrong. He was writing large programs before there was any idea of how to write large programs, before there was any literature on it; when he needed an idea, he invented it, and his ideas were so valuable in practice that they pervade the software engineering field now. You, and indeed all programmers, owe so much to him that your snide aspersions rather reflect on your own ignorance.

(The funny thing, actually, is that without Dijkstra, we would probably all be programming now in either assembly or functional languages...! :)

> I guess I am just saying that, yes, computing science is important but, hey, and that is where I disagree with Dijkstra, everybody in informatics is a part of a service industry and informatics is so much broader than computing science that I sometimes wonder if computing science will not become a marginal topic.

Computing science has applications outside programming, so you need not trouble yourself in that regard. But its impact on programming will become marginal only when software programmers start writing reliable and efficient programs quickly enough to satisfy the demands of customers. So far, I note, programmers have tackled this problem less by increasing their productivity than by decreasing the expectations of their customers.

### Program confidently with .NET in 24hrs!

> In the same way I am confident I can master .NET, even though I have never programmed with it. The point is that if you understand something from a theoretical viewpoint, you are equipped to understand its concrete instances.

This is true if the theoretical view is comprehensive and coherent and the concrete instances are conformant to the theory that is supposed to describe them. But, haha, .NET isn't like that. Actually "mastering" it means amassing knowledge of a large collection of inadequately-theorized particulars - I mean the behaviours, glitches and trouble-spots of a large library of modules designed or accreted with a wide variety of purposes in mind. Learning C# is dead easy, whether you're a scientist or not. But "mastering" a framework means getting bogged down in details: understanding the compromises made by the people who constructed the framework, and working out compromises of your own in order to create functioning software.

I don't consider this an argument against mathematics, theory, logic or abstraction - what I have of these things is my rod and my staff and my ever-present help in trouble. But the trouble itself is not mathematical in nature: it is empirical and fallen and he that touches it shall be defiled therewith.

### Issunboushi!

> Well, maybe I'm wrong, but my impression of the difference between an EE and an electrician is that, though an EE may not know the details necessary for doing an electrician's job, he can learn them quickly enough as they are just particular instances of what he already understands in the abstract. In the same way I am confident I can master .NET, even though I have never programmed with it.

First, I am all for students learning as much of computing science as possible; algorithms and combinatorics are a great personal love and hobby of mine. Don't get me wrong on that topic.

Second, my point is exactly that I am sure you can master .NET in a short amount of time, but that doesn't help a lot in getting stuff done. It is only technique, not engineering.

Consider the following thought experiment: what would have happened if Dijkstra had led the MS-Word project? What would the result have been: nothing, vi, Emacs, or something better than MS Word? I wouldn't know. (Or rather, I think I know ;-)

My point is not that computing science has no use, but that in practice the math is often the easy part.

I guess I am just saying that, yes, computing science is important, but, and that is where I disagree with Dijkstra, everybody in informatics is part of a service industry, and informatics is so much broader than computing science that I sometimes wonder whether computing science will become a marginal topic.

### Is this some theory vs. practice thing?

Consider the following thought experiment: what would have happened if Dijkstra had led the MS-Word project? What would the result have been: nothing, vi, Emacs, or something better than MS Word? I wouldn't know. (Or rather, I think I know ;-)

Do you think Dijkstra was never involved in "real-life" software development? A lot of his work (just like C.A.R. Hoare's) has to do with almost day-to-day problems encountered in "Software Engineering".

Maybe I'm inferring too much from your post, but opposing CS, as a harmless hobby for academics, to real down-to-earth software engineering is an easy and very dangerous path to follow.

### Yes, it is

I know Dijkstra did very important stuff and other people continue to do likewise.

However, I feel that a lot of his comments should be viewed as stemming from an era where computer science, computing science, and engineering were "almost the same thing", or at least practiced by the same people. In that respect, I think the field is maturing, and stuff he said is less true these days than it was in his time.

These days, computing scientists, information technologists, and engineers are very different people. Maybe, the same even starts to hold true for language developers and, say, type theorists.

As far as opposing goes, I would say that it is just as easy and dangerous, or at least an often-made mistake, to assume that being an expert in one field makes you an expert in another field; it doesn't.

Maybe my beef with Dijkstra's view on anthropomorphism is that he placed the two fields of programming/engineering, where I find anthropomorphism an ok-ish assumption, and CS too close together, and I hope that at some point this will be viewed as an immature (or obsolete) view. And yes, I do believe that the fact that engineering is so much more than programming is not recognized enough by academics. And yes, I believe the whole field suffers from that lack of recognition.

But ok, I will grant you that I am playing Devil's advocate by putting an extreme opposing view out there and taking CS and engineering apart again ;-). But hey, it makes a good dialogue ;-)

BTW: Discussions cannot be dangerous; they are healthy, especially in academia. Even if it means standing on somebody's toes from time to time.

### "math"

There are two senses of "mathematics": the topics studied by people in the math department, and the spirit, mindset, and style of mathematics.

A programmer does not need topology to shunt emails around. A programmer needs boolean algebra and Kleene algebra to reliably shunt emails around (cf. procmail).

In the latter sense, programming is a mathematical activity.
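To make the procmail point concrete: a mail-filtering rule is essentially a boolean combination of regular-expression matches, and regular expressions are precisely the terms of a Kleene algebra. A minimal sketch in Python (the addresses, header names, and folder names here are invented for illustration, not taken from any real configuration):

```python
import re

# Each predicate is a regex match over a header (Kleene algebra);
# the routing logic combines them with and/or/not (boolean algebra).
def is_mailing_list(headers: dict) -> bool:
    return "List-Id" in headers

def from_boss(headers: dict) -> bool:
    return re.search(r"boss@example\.com", headers.get("From", "")) is not None

def route(headers: dict) -> str:
    # A procmail-style recipe: first matching rule wins.
    if from_boss(headers) and not is_mailing_list(headers):
        return "inbox/urgent"
    if is_mailing_list(headers):
        return "lists"
    return "inbox"

print(route({"From": "boss@example.com"}))                     # inbox/urgent
print(route({"From": "dev@example.org", "List-Id": "dev"}))    # lists
```

The point is not the five lines of code but that their reliability can be reasoned about algebraically: each rule is a boolean formula over regular languages, so questions like "can a message fall through to the default?" have definite answers.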

Aside #0: A friend taught a complexity theory course (NP-completeness, the halting problem) for computer engineering students. Needless to say, the class was full of sketch proofs of statements of the form "forall ... exists ... forall ... stuff". A student commented, "I don't mind learning this, but THIS IS NOT MATH!" To a student in the engineering faculty, math means algebra, geometry, calculus, and nothing more. Will a computer engineer use calculus? Some won't. Will he/she use reasoning involving forall-exists statements? Probably always, without knowing it.

Aside #1: I once walked by the actuarial science department. A poster there advertised why you would want to be an actuary and what the job entails. One quote: "you don't need to be into math, but you need the aptitude". I think that applies to programming too.

### Anthropomorphism and LISP

"Both [G.J. Sussman and D.S. Wise] belonged very much to the LISP subculture, neither of the two proved a single theorem, both showed too much and made such heavy use of anthropomorphic terminology that they were painful to listen to."

- Edsger W. Dijkstra

### Isn't it your own interpretat

Isn't it your own interpretation to relate OOP with anthropomorphism, though? I always took this quotation as a joke (if Dijkstra ever said it; I've never found a source). After all, he wrote "Structured Programming" with Ole-Johan Dahl (and C.A.R. Hoare), creator of Simula together with Kristen Nygaard. He certainly knew Simula wasn't born in California.

### A foolish consistency is the hobgoblin of small minds.

Isn't it your own interpretation to relate OOP with anthropomorphism, though?

Certainly it is my interpretation; I don't think it is hard to connect the two, though. Just pick up any book on OOP, or consult Google. It is hard to find an introduction to OOP which resists the urge to compare objects to real-world objects, and nearly impossible to find one which explains OOP mathematically or even objectively. Given Dijkstra's attitudes on programming and the state of the software industry, I think it is not hard to discern what his attitude toward OOP might have been.

if Dijkstra ever said this, I've never found a source

Yes, it is "attributed to".

after all he wrote "Structured Programming" with [Simula co-creator] Ole-Johan Dahl

That was in 1972, before OOP was touted as a panacea and substitute for the sort of rigorous program development Dijkstra was enamored of. Anyway, I have also published a paper with an OO researcher (Dave Clarke), and I am not known for my love of objects either.

Also, the part of "Structured Programming" that mentions OOP (chapter 3) was apparently written by Dahl and Hoare.

### I think your points are perfe

I think your points are perfectly valid; I had kind of hoped you had a source for more Dijkstra discussion of OOP. I wouldn't be surprised to learn he was strongly against OOP; I was not trying to imply otherwise.

I mentioned that he wrote a book with Ole-Johan Dahl just because I think attributing OOP to California could have been a kind of joke, maybe even a joke directed at Ole-Johan Dahl.

### Dijkstra was convinced that p

Dijkstra was convinced that programming is an "unescapably" [sic] mathematical activity, but that the scale of the problem is so much larger than what we are accustomed to in ordinary mathematics, that the minor sins one regularly commits there are magnified into capital sins.

I have another view on this. Dijkstra is programming at a fine grain, writing and reasoning about individual language statements. At such a fine grain, programming is indeed a mathematical activity. But this changes completely at a bigger grain. At a large enough scale, computers are simulators: they are a substrate for modelling various worlds and simulating them. The faster they go, the more realistic the simulations become. The simulation algorithms are usually quite simple at heart (made more complex by speed optimization, but that's not my point). The complexity is not in the algorithms but in the descriptions of the worlds being simulated, and these descriptions live at a completely different level than the computer's own instructions. The approach of "programming as a mathematical activity" is irrelevant for the descriptions. It all depends on what kind of model or world we are simulating: wind tunnel, flight simulator, weather prediction, interactive game, MUD, haptic interface to four-dimensional space, Matrix-style virtual reality, word processor, spreadsheet, etc. Each model or world being simulated is a law unto itself.

### Ok, let's play Dijkstra

My second remark is that our intellectual powers are rather geared to master static relations and that our powers to visualize processes evolving in time are relatively poorly developed. For that reason we should do (as wise programmers aware of our limitations) our utmost to shorten the conceptual gap between the static program and the dynamic process, to make the correspondence between the program (spread out in text space) and the process (spread out in time) as trivial as possible.

So, that's Dijkstra summoning the "way humans think" to support his arguments against goto.

### The difference is that he's t

The difference is that he's talking about the way every human thinks, not the way "ordinary humans" or "everyday programmers" think. It is an entirely different argument. One is an argument for making a conscious effort towards simplicity; the other is an argument for adaptation to the "general public".

### I don't think so

I don't think there's a difference. When some OO proponent says that OO is good "because people think in terms of objects", he's saying everyone thinks that way, not just "ordinary humans". And here's Dijkstra saying that gotos are bad "because people are good at thinking about static relations, but not dynamic ones". It was the OP in the other thread who added "hey, other professionals learn how to operate in 'unnatural' ways, why should this be different for programmers", or, put another way, "programmers (should) spend years in training; they are not 'ordinary people', so we shouldn't base our opinions of what's natural on the untrained person's mindset". The argument, as I see it, is that we can overcome our limitations somewhat by means of training, thus separating the untrained, "ordinary" people from the trained programmers/computer scientists/whatever.

### A dove is a pigeon with good PR

The difference is that he's talking about the way every human thinks, not the way "ordinary humans" or "everyday programmers" think. It is an entirely different argument.

What mental tools does every person have? What mental tools do "ordinary humans" lack that "real programmers" have?

Without addressing these questions, how can you tell the difference?

### Dijkstra: It is the most c

Dijkstra:

It is the most common way of trying to cope with novelty: by means of metaphors and analogies we try to link the new to the old, the novel to the familiar... Coping with radical novelty requires an orthogonal method... one has to approach the radical novelty with a blank mind, consciously refusing to try to link it with what is already familiar, because the familiar is hopelessly inadequate... Coming to grips with a radical novelty amounts to creating and learning a new foreign tongue that can not be translated into one's mother tongue. (Any one who has learned quantum mechanics knows what I am talking about.)

Just for fun, contrast with this:

Men make their own history, but they do not make it as they please; they do not make it under self-selected circumstances, but under circumstances existing already, given and transmitted from the past. The tradition of all dead generations weighs like a nightmare on the brains of the living. And just as they seem to be occupied with revolutionizing themselves and things, creating something that did not exist before, precisely in such epochs of revolutionary crisis they anxiously conjure up the spirits of the past to their service, borrowing from them names, battle slogans, and costumes in order to present this new scene in world history in time-honored disguise and borrowed language. Thus Luther put on the mask of the Apostle Paul, the Revolution of 1789-1814 draped itself alternately in the guise of the Roman Republic and the Roman Empire, and the Revolution of 1848 knew nothing better to do than to parody, now 1789, now the revolutionary tradition of 1793-95. In like manner, the beginner who has learned a new language always translates it back into his mother tongue, but he assimilates the spirit of the new language and expresses himself freely in it only when he moves in it without recalling the old and when he forgets his native tongue.

Karl Marx, The Eighteenth Brumaire of Louis Napoleon