Dennis Ritchie passed away

I have just learned that Dennis Ritchie (1941-2011) has passed away. His contributions changed the computing world. As everyone here knows, dmr developed C, and with Brian Kernighan co-authored K&R, a book that served many of us in school and in our professional lives and remains a classic text in the field, if only for its style and elegance. He was also one of the central figures behind UNIX. Major programming languages, notably C++ and Java, are descendants of Ritchie's work; many other programming languages in use today show traces of his influences.

Update

Bjarne Stroustrup puts the C revolution in perspective: They said it couldn’t be done, and he did it.

Ratforz

For several significant years, it seemed the easiest way to peg a programmer was to ask whether he was a C person or a Pascal person. Not only did the two languages come to represent opposing programming styles (and programming language philosophies), but many schools also gradually switched from using Pascal in their introductory course to using C. Much as I liked C back then, I was definitely a Pascal person. This may explain why I always had a soft spot for Why Pascal is Not My Favorite Programming Language, Brian Kernighan's 1981 extended rant against Pascal. The essay reflected Kernighan's opinion of Pascal after he wrote Software Tools in Pascal (1981). That book, of course, is a descendant of Kernighan and P. J. Plauger's 1976 book, Software Tools, which used Fortran -- a language the authors deemed so defective for their purposes that they wrote in a preprocessor-implemented dialect instead: the infamous Ratfor.

The software tools Plauger and Kernighan develop in the book primarily work on text files, and Kernighan's criticism of Pascal was directed (fairly or not) at the way standard Pascal handled strings. Naturally, Fortran did not do much better. Be that as it may, a major problem with Pascal was that procedures that accepted strings as parameters had to specify the size of the strings they handled, since strings were simply represented as arrays of characters. Many solutions to this problem existed, from UCSD Pascal's variable-length strings (1978) to Ada's complicated type-system support for unconstrained arrays.

Null-terminated strings, one of the contributions of C, allowed C programmers to bypass the issue -- but only because of the way C handled array parameters more generally. Simply put, C left the burden of array bounds checking up to the programmer. This led to funny arguments about whether the C type system is unsound or merely bad... In any case, this approach became dominant. The best evidence for this was that CPU instruction sets began including instructions for handling null-terminated strings. I remember the day I found such instructions in an IBM mainframe Principles of Operation manual, mainframes being one of the places not yet influenced by UNIX at the time. I stayed up thinking how these instructions could be used to do clever things (IIRC I then learned that they were an added feature that had to be purchased separately). One could smell the microcode loop from a mile away, of course -- but there is no better vindication for a language designer than seeing his language affect hardware design.
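
To make the contrast concrete, here is a minimal sketch of what the convention buys you (my_strlen is a made-up name standing in for the standard strlen): the string parameter is just a pointer, with no length anywhere in its type, so the function works for strings of any size -- precisely what standard Pascal's fixed-size array parameters could not express.

#include <stddef.h>

/* The parameter is just a pointer; no string length appears in the type.
   The caller is trusted: nothing checks that the array really is
   null terminated or that we stay within its bounds. */
size_t my_strlen(const char *s)
{
    const char *p = s;
    while (*p != '\0')   /* scan until the terminating null byte */
        p++;
    return (size_t)(p - s);
}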

Community

People are bending over backwards to write C one-liners commemorating Ritchie. I think this is one of the best. (Naturally those that are inadvertently not K&R C are disqualified.)

into the void?

((void (*)(void))("ritchie"+8))()

While not a one-liner, I like this best....


#include <stdio.h>

int main()
{
    printf("goodbye, dad\n");
    return 0;
}

by Brian Raiter. Note the significance of the return value.

Ironically, I saw messages

Ironically, I saw messages of this kind in which main was declared to return void...

I must say that I expected

I must say that I expected more response.

Pyramids

Accomplishments like his that have so pervasively won are more or less taken for granted by younger generations. They're like the pyramids: huge accomplishments, but part of the landscape, almost like natural features too massive to be the work of mere men.

Fish don't know they're swimming in water.

But surely there are plenty

But surely there are plenty of old timers here of all places...

old-timers

I'm an old-timer, and yes I have huge respect for Dennis Ritchie. But most of my exposure to him was through the Pascal / C debate, and I was very much on the Pascal side of that, so I don't have much to contribute here. I'm enjoying the thread, though. Thanks!

Fish

I agree, I wouldn't know what to write? May we live forever with the occasional null pointer exception in his memory? He wrote a jolly good 'revolutionary' language and some fine books, there is no thanks for that? I still remember his firm handshake?

Clueless. Best be silent about it.

still a bit odd

Good points. But sociologically this is still a bit odd. People who revolutionise the landscape like Ritchie did often ARE NOT taken for granted. Obvious cases include Russell and Church in logic. Maybe there's something specially hidden about programmers.

Here's a Wired article for

Here's a Wired article for you. The end of the article:

As Kernighan and Pike describe him, Ritchie was an unusually private person. “I worked across the hall from him for more than 20 years, and yet I feel like I didn’t know him all that well,” Pike says. But this doesn’t quite explain his low profile. Steve Jobs was a private person, but his insistence on privacy only fueled the cult of personality that surrounded him.

Ritchie lived in a very different time and worked in a very different environment than someone like Jobs. It only makes sense that he wouldn’t get his due. But those who matter understand the mark he left. “There’s that line from Newton about standing on the shoulders of giants,” says Kernighan. “We’re all standing on Dennis’ shoulders.”

C was my favorite language for years

I don't feel qualified to comment on Ritchie's work, but his passing bothers me a lot. Much of C's design strikes me as elegant and admirable. I expect every system has quirks by necessity, so their presence in C says little. I think it's beautiful work.

I learned C in 1983 from K&R, after I tried Basic, Fortran, and Pascal in school. So it wasn't the first language I learned, but I liked it the most immediately, and I spent several years using it exclusively then. I loved C's focus on expressions.

I resented rules in Pascal constraining when I was allowed to use a construct. Imposed semantics rankled. In contrast, C seemed to say: make any expressions you want, as long as you stay responsible for what they mean. This suited me fine. I loved decomposing problems into what expressions mean, then assembling them as needed.

I remember being wild about function pointers as soon as I grasped them. There was something about the economy of syntax needed to directly manipulate primitive entities that made me feel in control of the code I wrote, instead of hoping the compiler had a model matching the one I had in mind. It was as though C aimed to get out of the way, instead of imposing layers of fuzzy ontology. I admire its focus on mechanism over policy.
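
A minimal sketch of that economy (the names apply and twice are invented for the example): a function pointer is just another value you can store in a variable and call through, with almost no extra syntax.

#include <stdio.h>

/* apply: calls whatever int -> int function it is handed. */
static int apply(int (*f)(int), int x)
{
    return f(x);                     /* call through the pointer */
}

static int twice(int x) { return 2 * x; }

int main(void)
{
    int (*op)(int) = twice;          /* a plain variable holding a function */
    printf("%d\n", apply(op, 21));   /* prints 42 */
    return 0;
}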

For the last year or so I've been using only C again in my day job, for the most part. So the non-optimal things folks can do with it loom large in my view. But I can't pin the problems on C, unless freedom itself is bad when folks do complex things without enough abstraction or a scientific approach to putting error bounds around acceptable outcomes.

I would not be surprised if C is still used heavily 100 years from now, at low levels. I hope other folks imitate spare and elegant design styles in tools.

Macros

I would not be surprised if C is still used heavily 100 years from now, at low levels.

Yes, there are few contenders, at least for microcontroller programming, which is something language designers usually don't care about (they are all in enterprise or web programming these days; some may also still do OS-level or scientific programming for rather large-scale systems). But what about its macro system? I know it can be separated from C and used as a universal preprocessor, but it's still tightly coupled with our pointer-oriented language of choice. It doesn't seem to be on equal footing.

text-based macros

Macros seem a hot button for many folks; I hope I don't incite excess macro evangelism. My comments aim to encourage new folks taking up C despite odd features in practical use.

The C preprocessor seems completely independent of the rest of C. I've heard almost none of its history, and I haven't studied a C spec to see how its preprocessor fits into requirements. So this is just a summary of my understanding.

A preprocessor define binds (source code) text to a symbol, then expands that symbol later by substituting the text definition. If the define has arguments in parentheses after the symbol, those are substituted as text into the definition body. The net effect is text-based source transformation before the C tokenizer runs. Substitution has no language semantics -- it's just text substitution. Text inserted by substitution is scanned again for other defines to expand as text. Basically expansion is input-pushdown, so substituted text is then processed as if it had been there in the first place.
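
A tiny example of that purely textual substitution (the SQUARE macro is invented for the example): the argument is pasted into the definition as text before the compiler proper ever sees it.

#define SQUARE(x) ((x) * (x))

int f(int i)
{
    int a = SQUARE(i);      /* expands to ((i) * (i)) before compilation */
    int b = SQUARE(i + 1);  /* expands to ((i + 1) * (i + 1))            */
    /* SQUARE(i++) would expand to ((i++) * (i++)): the side effect is
       pasted in twice (and the result is undefined) -- text substitution,
       not a function call. */
    return a + b;
}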

Calling this a macro is an old-fashioned way of describing things, and doesn't have much to do with language-based macros. So I have trouble thinking of this as a macro system.

(About ten years ago I interviewed at a company whose founder had a Lisp background; he asked me what I thought of macros. I asked what kind of macros did he mean? Text substitution macros like C before compile time? Or parse-tree based macros at compile time or runtime like Lisp? He was thrilled I knew the difference.)

Anyway, it sounds like maybe you suggest the preprocessor might change faster than the rest of C, since it's easily replaced. Sounds plausible. In general, you could write an entirely different language whose output is C source, which runs before the C compiler. This just sounds like an extension of the preprocessor.

I very rarely use preprocessor defines with arguments. I always use static inline methods instead, to better effect, unless it's necessary to declare symbols with global scope (which you can't do inside an inline method). In practice, I only use defines for compiler switches and named constants. This usage doesn't seem much like macros, so it seems I use C without macros.
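
A sketch of that usage pattern, with invented names: defines only as compiler switches and named constants, and a static inline function where a parameterized macro might otherwise appear (its arguments are typed and evaluated exactly once).

#define USE_FAST_PATH 1            /* compiler switch */
#define MAX_ITEMS     64           /* named constant  */

/* Instead of #define MIN(a,b) ((a) < (b) ? (a) : (b)): */
static inline int min_int(int a, int b)
{
    return a < b ? a : b;
}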

C preprocessor

The C preprocessor seems completely independent of the rest of C.

Unfortunately, it's not. Its semantics depend on the exact C tokenization rules.

Preprocessing tokens

Actually, no. The C preprocessor works in terms of preprocessor tokens, which are generally rather simpler and easier to recognize than proper tokens, and (if they survive preprocessing) are eventually converted into proper tokens. (It's undefined behaviour if a pp-token isn't actually a single valid token.)

The most striking example of the difference between preprocessor tokens and tokens in later compilation phases is in pp-numbers, which begin with a digit and contain further digits, letters, dots, and occurrences of `e' or `p' followed by a plus or minus sign. This therefore includes integer constants (decimal, octal and hex, including signedness and length suffixes), and floating-point constants; but it also includes other bizarre things. This has the perhaps surprising result that

int y=0xbcde+x;

is invalid: the trailing `e' of `0xbcde' followed by `+' extends the pp-number, so `0xbcde+x' is scanned as a single pp-number, which is not a valid token. (Writing `0xbcde + x', with spaces, is fine.)

Hardware independence

I worked on embedded systems back in the '80s and early '90s, so his work had a much more direct impact on me back then. Though I did a lot in Pascal at the time, when you really needed the freedom to move around different processors, nothing beat C - a thing of beauty for the level of abstraction it chose.

(Another article on Remembering Dennis Ritchie).

K&R C

Wonder if I should feel guilty that I borrowed the second edition of the K&R C book from a coworker in the late 80's and still haven't returned it?

I'm a little late to the

I'm a little late to the wake, but C was definitely a major marker in my development as a programmer and I learned it from K&R. I also find it fascinating to trace C's own lineage back through B and BCPL, finding myself at a shared ancestor with Haskell! Sadly I'm not posting this from a *nix system, but there's one in my pocket.

I'm not a proponent of Worse Is Better as a guiding principle, but I do think it's an evolutionary pattern fated to happen again and again (hands up everyone who thinks early LISP was a Worse Is Better lambda calculus?). We might consider the Von Neumann architecture to be a first instance of it (or the Turing machine when applied to any purpose other than convincing us that it is indeed a general computer) - and an enabler for an entire generation of both research and development. In the long run, Ritchie's work may be more significant.

Celebrate to code, then

What's the most remarkable piece of C code you encountered?

Duff's device has to be

Duff's device has to be mentioned first, I suppose.
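
For readers who haven't run into it, this is roughly its classic form (as Tom Duff posted it, the destination is a memory-mapped output register, which is why to is not incremented; it assumes count > 0):

/* Duff's device: loop unrolling by interleaving a switch with a do-while. */
void send(register short *to, register short *from, register int count)
{
    register int n = (count + 7) / 8;
    switch (count % 8) {
    case 0: do { *to = *from++;
    case 7:      *to = *from++;
    case 6:      *to = *from++;
    case 5:      *to = *from++;
    case 4:      *to = *from++;
    case 3:      *to = *from++;
    case 2:      *to = *from++;
    case 1:      *to = *from++;
            } while (--n > 0);
    }
}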

Lack of artifice

The most remarkable thing to me about C is how little it has, and yet how far it went. Yes, there are plenty of obfuscated C competitions - mostly through preprocessor abuse - but everything else is generally so explicit, mapping into machine code so clearly (of course this can be deceptive with optimization), that it generally dissuades the programmer from trying to get too clever with their abstractions.

I mean, in the absence of macros, I can look at a piece of C code, and know what it all generally means straight off the page - there can be some exceptions, e.g. it's not clear at the point of use whether a symbol is typed as a struct or a union, which can change semantics considerably - but generally it's plain as day. There are very few abstractions available, you don't need to worry about overloading, or custom control flow built out of libraries taking closure arguments, or spooky action at a distance. Non-error-recovery setjmp/longjmp is a relative rarity, so you can usually follow the code of a function from start to finish. There's very little magic in the language; none of Pascal's standard procedures, like the oddball Writeln format specifiers.

In fact, the subtle dance the variadic arguments feature does between platform-specificity and platform-independence might be my nomination. The details of how stdarg/varargs actually work are platform-specific hacks, and it all usually maps fairly closely to a reasonable stack argument passing convention. But to pick out this way of implementing printf, and to draw the abstraction boundary where it is, is very well judged for the language it lives in. Just pass the arguments in the usual way, and pass along a little program that, when interpreted, will figure out how to pull the arguments back out again.
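
A minimal sketch of that mechanism, with an invented function name: the fixed first argument plays the role of printf's format string, telling the callee how to pull the rest of the arguments back out.

#include <stdarg.h>
#include <stdio.h>

/* print_ints: prints 'count' int arguments, passed the usual way. */
static void print_ints(int count, ...)
{
    va_list ap;
    va_start(ap, count);                  /* point past the last fixed argument */
    for (int i = 0; i < count; i++)
        printf("%d ", va_arg(ap, int));   /* pull the next int back out */
    va_end(ap);
    putchar('\n');
}

/* usage: print_ints(3, 10, 20, 30); prints "10 20 30" */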

Anyone remember those ads

Anyone remember those ads showing a piece of C code and asking "do you know what this does??" Mostly, I think, the issues were related to operator precedence.

The nastiest bit (apart from

The nastiest bit (apart from what the preprocessor can do) is in the inside out, back to front declaration syntax, and how poorly it composes for data pointers, function pointers and arrays. Not difficult to read when you're used to it, but it takes a while to get there.

The many precedence levels (in comparison to Pascal) and distinction between bitwise and boolean operators make most expressions require fewer parentheses, but there are a couple of gotchas, e.g. x & y == z being parsed as x & (y == z).
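
A few examples of how the declarator syntax composes for data pointers, function pointers and arrays (reading them takes the inside-out spiral complained about above):

int *a[10];             /* a: array of 10 pointers to int                */
int (*b)[10];           /* b: pointer to an array of 10 ints             */
int (*f)(void);         /* f: pointer to a function returning int        */
int (*(*g)(void))[10];  /* g: pointer to a function returning a pointer
                           to an array of 10 ints                        */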

These are indeed "fun". I

These are indeed "fun". I always liked the example program in K&R which translates these declarations into English...

any order is o.k.

The nastiest bit […] is in the inside out, back to front declaration syntax, and how poorly it composes for data pointers, function pointers and arrays.

As I see it, only the ‘inside out’ part is to blame, because it involves precedence rules. (Ritchie must have been willing to reuse the already present rules for constructing expressions, and to avoid introducing new ones for declarations.) Neither the left-to-right nor the right-to-left order seems to introduce any difficulty, if followed strictly. For example, Algol 68 has strictly L-to-R declarations, [] proc (ref flex [] int) union(int,char) being read as ‘array of procedures, each with an argument a reference to a flexible array of integers and a result a union of int and char’.

I agree with your nomination...

...of the languages I know well, I actually came to C last. I learned it only after I knew several other languages well enough that I could generally predict what the compiler/interpreter would do. So C didn't really impress me the way it impressed you, since between SQL's EXPLAIN PLAN and OCaml's ultra-direct mapping to assembly, I had come to expect that kind of transparency as a basic design quality issue.

Except for varargs and printf. I wouldn't say they are only well-designed for the language they live in; they are well-designed, period. It's all three of simple, flexible and efficient, which is quite a rare combination.

Ars Technica

I'm surprised this took so long to get an article on Ars Technica, and it's only a very short article, but at least it's there. http://arstechnica.com/business/news/2011/10/dennis-ritchie-the-giant-whose-shoulders-we-stand-on.ars

On NPR

Mainstream media homage.

C has been in active use for

C has been in active use for forty years now – an unprecedented achievement. Its longevity alone is eloquent evidence of its merits. It is all the more impressive as C has remained essentially unchanged through the years. (In this respect, compare it to the much more recent C++ and Java, for example.)
For at least part of what C was designed for, it still has no rivals.
Not only that, but C has also acquired new uses, such as serving as a target language for translation from other languages, and being the medium through which programming languages interface with one another (a ‘standard’ way for languages to talk to each other is for all of them to talk to C).

C is known to have been influenced – in different respects – by BCPL and Algol 68. One way to appreciate how big an improvement C was over these languages – and thus how talented its designer was – is to write the same program in all three of them. From actually having tried this, I can say C was a huge winner in size and clarity. The same holds when comparing C to many other languages, including ones of more recent design.

Even if C were Ritchie's only contribution to computing, it alone would be huge and lasting.

Local optimum

It's easy to complain about C, but it's an incredibly difficult language to incrementally improve. Ritchie managed to find a very sweet spot in the design space: the pieces all fit together very well, there's very little extra (at least until C99), and the concepts are clear and perfectly matched to the semantics. I agree with others that declaration syntax is probably the only thing that could be easily and unambiguously improved. Even the much-maligned separate compilation based on automatic textual duplication is a really slick solution to a very hard problem. We can do better, for sure, but there's a cost.

And fundamentally, all of this is a reflection of the incredibly good taste of its designer (and maybe just a bit of luck).

I do believe (and fervently hope) that it's possible to do better than C in its niche. But it's very hard to see how to do it without an enormous amount more machinery of one kind or another. Just take a look at the C++ juggernaut, or the amount of type theory wizardry required by a system like ATS. Both of these are ostensible "improvements," but each pays a very heavy price.

Everybody knows the flaws of C, but for what it is, it's really a gem. Hard to cut, hard to change, hard to improve. I just can't quit you, C.

local minima

It does seem that C as a language facilitates small compilers and small runtimes, and hence fast self-hosted systems. GCC is big now, but there are small C systems.

C does seem especially useful as the bottom loop in your system: self-compiling, and something on which you can build other things. I would like to see something small and self-compiling that could replace it, though with some more linguistic safety guarantees. I can't think of anything right now that fits the bill besides C. Some Scheme systems, perhaps. I can think of larger self-hosted systems.

So yes, Ritchie did very well with C. Much respect!

Not so sure

Not so sure:
- C's undefined behaviour in the case of integer overflow: if we had, by default, a signal or a longjmp/exception (with the possibility of also having modulo 2^32 arithmetic), all CPUs would have instructions to trap on overflow (instead of only MIPS) and the world would be a better place.

- initializing variables by default to 0 or null or 0.0 (but providing a keyword to leave a specific variable uninitialized for performance reasons): this would be a big improvement in determinism, yet we would have nearly the same performance.

So these two improvements (on top of the syntax change you're writing about) don't seem to me to require a massive overhaul of the C language, yet they would provide quite a few benefits.

Interesting

I guess I concur with your second point. Initializing variables seems like a better default (as long as it's avoidable when desired).

I find your first point more interesting. I'm not convinced you're right, but it shows something that I really neglected: the feedback from the design of C back into instruction set design. I suppose a lot of what makes C seem like it's in such a sweet spot is that the environment literally adapted a niche for C. Shades of Lewontin's Triple Helix...

I always belonged to the

I always belonged to the school of thought that said that since relying on defaults is bad, it is better for uninitialized variables to contain junk values.

Doubtful

If you don't like defaults, then it should be a static error as in Java. It's cheap and easy enough to check, at least these days (and at least for most kinds of variable). I think junk values often punish the wrong person (end user when they're hacked, rather than the programmer).

Point taken, though I am not

Point taken, though I am not so sure it is so easy to check statically in the presence of aliasing.

Efficiency

The standard C argument against default values is one of efficiency; why initialize to zero what is immediately overwritten by a (responsible) programmer? While for scalars this has no cost, since the compiler will usually optimize away dead code, for aggregate structures and arrays allocated on the stack, this could be a significant penalty.

Not that I agree with that argument, though. I prefer to have a default for more deterministic programs over a dataflow analysis + error for locals like in Java. It seems that in my own code, all the uninitialized variable usages caught by the compiler I end up fixing with a default initialization at the declaration site anyway.

At least it crashed

Yes, don't let errors pass silently, unless you intentionally program with fault tolerance, recovery, redundancy ... in mind. As a colleague of mine once said, complaining about one program he had to fix and in favor of another: "at least it crashed".

True that. This is of course

True that. This is of course a classic complaint about C and the approach it takes. In fact, in some sense it represents the issue many have with C's design philosophy in toto.

Improving on C

I agree that it seems really difficult to improve on C's data structures without paying significantly higher costs of various sorts... despite the fact that the C view of data combined with its standard library has made buffer overflows a chronic security problem.

I disagree with respect to control structures; I think you could significantly improve on C at pretty low cost by adding a few things such as explicit tail calls, and maybe removing post-increment operators. And then you have things like the syntax of the for loop. While C's for loop is a minor annoyance to experienced users, it isn't terribly user-friendly especially compared to Pascal or Modula-3. And C's for loops tend to create a fair bit of pain for people who haven't developed the discipline to program in C effectively.

I'm a little unsure with respect to separate compilation; I don't really see why C's solution is particularly slick compared to say, Modula-2's modules.

for-loop

I'm a fan of C's for-loop compared to Pascal's for-loop, because it's flexible enough to work easily with both 1-based and 0-based iterations where the limit is the count of items (in Pascal, you have to subtract 1 from the upper bound - in C, you just change from less-equal to less-than); and it also works well for linked list traversal.

In Delphi, the most commonly used Pascal dialect, it's extremely common to see loops that look like "for I := 0 to MyList.Count - 1 do" - the "- 1" is the bit that's ugly. I just did a quick grep over the Delphi runtime library, and "- 1 do" occurs over 4,500 times. It's also a source of beginner mistakes.

This in turn is because 0-based indexes have come to dominate over 1-based indexes, in no small part because of C.

It's unfair to criticize

It's unfair to criticize Pascal for having to gracefully contend with a world full of zero based arrays inspired by C...[*] While the hacker in me loves the C for loop, I think it is way more error prone than -1 in Pascal for loops. Just look at all those checklists for C programmers that suggest you check your for loops (and let's not forget the many errors that arise in for loops when equality is mistakenly replaced by assignment).

[*] Ada's For I in Arr'range is more elegant than either of them.[**].

[**] And let's not forget to compare how each language treats the loop variable (specifically its value after the last iteration).

I don't really see why C's

I don't really see why C's solution is particularly slick compared to say, Modula-2's modules.

Because you can easily do simple things like this, or ridiculously ambitious things like this.

C's for

C's for loop […] isn't terribly user-friendly especially compared to Pascal […] C's for loops tend to create a fair bit of pain […]

I see it in quite the opposite way. C's for loop provides space for initialization (of whatever number and kinds of variables), as well as re-initialization for subsequent iterations. Most loops need these. Thus, a vast number of iterative processes are adequately (with proper structure) and uniformly (similar things done similarly) expressed. (Consider for example arithmetic and geometric progressions, list traversals, two-way and other concurrent traversals.)
For comparison, in Pascal one uses for to traverse an array, but while to do the very same with a list. In addition, in the latter case the pointer is initialized outside of the loop, thus not syntactically related to it, and updated within the loop's body, indistinguishable from the rest of that body.
Even array traversals become ugly in Pascal when one has to enumerate, say, only the elements at even positions.
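
A small illustration of that uniformity (the node type and names are invented for the example): the same construct covers a counted, 0-based traversal and a list traversal, with initialization and re-initialization tied syntactically to the loop.

struct node { int value; struct node *next; };

static int sum(const int *a, int count, const struct node *head)
{
    int total = 0;

    /* counted, 0-based array traversal */
    for (int i = 0; i < count; i++)
        total += a[i];

    /* list traversal: same shape; the pointer's initialization and update
       live in the loop header instead of being scattered around a while */
    for (const struct node *p = head; p != NULL; p = p->next)
        total += p->value;

    return total;
}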

C lessons

A few things that C taught us:

  • A highly optimising compiler is less important when the language is expressive on a low level.
  • Strong control over data representation matters, perhaps more than compiler optimisation. Not only for systems or machine-level programming, but also for general-purpose programming in a resource-constrained environment — the latter was the case for almost all applications for a long time, and still for a lot of things from databases to games.
  • Conciseness matters. Not to the extreme degree of APL, just the elimination of unnecessary verbosity. (Corollary: If the notation is concise for common constructs, then those constructs will remain common because of their handy notation.)
  • You need a language to implement all the new, much better ones. Don't be surprised if that implementation language just won't go away.
  • Qualities beneficial for initial success and continued survival often become liabilities later on (like the C preprocessor).
  • It seems difficult to get rid of the assumption that imitating the surface syntax of a successful language is a precondition for the success of another.

Agree

I agree with all your points, but I couldn't quite parse the last one. Are you saying that it is necessary to imitate the surface syntax, or merely that we assume (unjustifiably) that it is necessary?

My guess is that the word

My guess is that the word "assumption" was a bad choice. I took him to mean "escape the conclusion" rather than "get rid of the assumption".

Funny, I took it to mean

Funny, I took it to mean that we assume (unjustifiably) that it is necessary.

;-)

;-)

Clarification

The latter; sorry about the poor wording.

Pretty much for my entire

Pretty much for my entire childhood, I thought of "C" as being synonymous with programming and computers. A tremendous amount of the software that I use is written in C, and I'm sure that anyone who uses a POSIX OS can attest to that, including users of Windows.

For all of its flaws, many great applications have been written in the C language; OSes, Embedded Systems, Games, CLI applications, the list goes on and on. Dennis Ritchie will be missed.

Sidenote: The K&R style of indentation is awesome.

C had no real flaws given

C had no real flaws given its goal of basically being a portable high-level assembly language. I came of age toward the end of C's reign and only got to write two big projects with it (one was Kimera with Emin Gun Sirer, the other was a PalmOS JVM). However, I really appreciated its flexibility and had a lot of fun doing cool things like using arenas for memory management and rolling my own v-tables.

All of the languages that have tried to extend C to be more than it was meant to be have been, in my opinion, disasters. Too complicated, not beautiful or elegant. That it took Java's (more) fresh approach to make OO programming mainstream should be a lesson.

I expected to hear a lot

I expected to hear a lot more on the "structured assembly language" angle. There is (by the way) a whole line of these beasts, outside the C ancestry, which deserve more study, I think (PL/S, PL/X etc.)

Yeah

Yeah. I'm not an expert, but it's very very hard to believe that structured assembly language had to look like C. It does NOW, of course, because processors are all built on the assumption that their most important job is to run C efficiently. I think this is basically the same as Matt Hellige's point above.

People who know more about this than me might like to say something about more recent alternatives such as C-- (http://www.cminusminus.org).

There are two levels here.

There are two levels here. One is with the presumed machine model, as Matt mentioned (my anecdote about null terminated strings hinted in the same direction). The second level is that of the semantics provided. These should presumably match the hardware pretty closely, but this does not mean that they necessarily should match the decisions made in C. Getting to this conclusion requires discussing particular cases, which we haven't done.

Trying to develop a "portable high level assembly language" adds a critical component to the story, of course. PL/S, for example, allowed the programmer to choose the register used to store register allocated values. C did not allow that level of control. Portability, of course, also becomes easier when hardware is designed so as to support the language, rather than vice versa.

Interesting. Thanks.

Interesting. Thanks.

C-- is dead.

It really sounds like C-- is dead, unless you are hacking on GHC which uses Cmm, something resembling C--. And even then, it sounds like Cmm's days may be numbered, as apparently nobody wants to put significant effort into GHC's native code generator, and that the future seems to be an LLVM-only GHC. There is a quantity of hand-written Cmm in GHC though, so Cmm will live on (for at least a while) via translation to LLVM.

LLVM appears to be fairly close, in spirit, to C--.

C-- is a cold-blooded reptile

I checked discussions around C-- and LLVM recently, and came to a more nuanced conclusion. C-- is not exactly dead, but it is a research device that moves slowly, rather than an industry tool meant for immediate consumption.

The people working on cminusminus.org's C-- are Norman Ramsey and João Dias. They are doing active research related to compiler backends: on the web page linked above you'll find 2010 and 2011 POPL publications, both mentioning, if I remember correctly, software prototyping done "using C--".

The state of affairs seems to be the following: there is no coherent, solid, well-specified, well-implemented compiler backend named "C--" that you can reuse in your projects. There is a "C-- project" that drives research on how to do the "backend" part of compilation better, and fragile software artifacts that gravitate around it and provide disjoint implementations of some aspects of this research.

C-- is clearly not meant to be used as-is. That does not mean it's dead. Future compiler backends for high-level languages may well be more inspired by C-- than by the current state of LLVM, which is, in fact, mostly centered around compiling C-like languages.

Note that there are several things that would deserve the name "C--". I don't think Norman Ramsey's C-- is the same as GHC's possibly-dying¹ C--, though there are evidently strong interactions between Norman and Simon Peyton-Jones. There is also Compcert's "Cminor" language which plays a similar role inside a whole certified compiler infrastructure. It is also very much alive on a research front, and may or may not become an interesting target for people doing certified compilation of higher-level languages.

¹: I heard that GHC people are not totally satisfied with the LLVM backend. It's very efficient for numeric code, but still underperforming in some important cases -- numeric code is good, but it's *not* the usual application niche of functional languages; good results on numeric computation won't compensate for overhead in symbolic manipulation or heavily thunk-using algorithms. In the current state of affairs, I wouldn't bet too much on LLVM becoming the only backend. From an outsider's point of view, there seems to be a mismatch in direction between the large number of people working on LLVM and the much smaller group of GHC-backend hackers, with their different priorities: stability, efficient support for language runtimes...

Maybe not

I think the fit between processors and C is getting worse. For example, the #1 complexity for processors is cache misses. Organizing low-level code so that the cache can time its operations isn't handled by C-type languages at all. The result is terribly suboptimal low-level loops. For larger chunks of memory, i.e. C pointers and data structures, we have the same issue: C does little to avoid or manage page faults.

As we are finally moving towards CPUs with more registers, I don't see C taking advantage of that. Computations aren't organized to match machines with large numbers of registers. Arguably C is optimized for machines with 3 integer and 3 floating point registers.

C makes no use of hyper-threading or predictive execution. Code written to assist here could yield huge benefits. Finally C doesn't really support more than one core very well.

So right now I think the match between C and CPUs is fairly weak. There is room for another systems language to step in. I just can't think of a good candidate.

is it really?

I have always been curious as to how or why C got to be described as ‘high-level assembly language’. I am pretty sure this has nothing to do with its inventor or anyone from the original Unix crowd. I also believe that such a characterization is incorrect.
If anything should be called a ‘high-level assembly language’, then BCPL deserves the title more than any other language. It is typeless (or, perhaps more properly, contextually typed): the only kind of primitive data value in it is the contents of an abstract memory cell, which has no type by itself, but, depending on the operators used to process it, can be interpreted as a bit pattern, an integer, a character, a Boolean, or a pointer. No checking is performed to determine whether an actual value can be subjected to a particular operation. The only data structure supported in BCPL is the vector – a sequence of cells. A pointer points to a cell. That is all you have (apart from procedures, loops, if, and switchon).
This is all very much assembly-language like, but very different from C, which is a statically, semi-strongly typed language with user-definable, nestable datatypes, and typed pointers.

I think the emphasis is on *portable*

I think the emphasis is on portable rather than high-level. Ritchie himself said so in The Development of the C Language:

The language is also widely used as an intermediate representation (essentially, as a portable assembly language) for a wide variety of compilers, both for direct descendents like C++, and independent languages like Modula 3 [Nelson 91] and Eiffel [Meyer 88].

This paper is worth reading, as is almost anything published by Ritchie. Aside from the real history of C, it contains nuggets like "B can be thought of as C without types; more accurately, it is BCPL squeezed into 8K bytes of memory and filtered through Thompson's brain."

I agree and I think we

I agree and I think we should keep the distinction in mind. Thus, to the discussion of other portable compiler backends, we should add a separate discussion of high-level systems programming languages inspired by C. Cyclone (now dead, I think) and, of course, BitC, fall into this category.

***

C is indeed portable, but so is or was (especially in the era of word-oriented machines) BCPL. Ensuring portability of both the language and the compiler was an explicit goal of Richards.
Assembly language, I think, is associated with both code generation and a low level of expression/abstraction. If the latter connotation is to be avoided, it is perhaps better to say compiler target than assembly language. In this sense, BCPL is truly a high-level assembly language while C is ‘only’ a compiler target language.
On the other hand, it is interesting to know that C was really born to solve issues of efficiently coping with the then-evolving hardware. When I first read Ritchie's history article some years ago, the two key factors that I understood to be the main motivation for departing from BCPL and creating C were:
1. BCPL was designed for word-oriented machines, but those were already giving way to byte-addressable ones.
2. Introduction of types was considered necessary to ensure efficient addressing of data.
A notable quote from Ritchie in this respect: ‘Other issues, particularly type safety and interface checking, did not seem as important then as they became later’.

That's a good point

That's a good point regarding the systems programming angle. I was really referencing the criticisms brought up by the writers and fans of the "UNIX Haters Handbook," although I don't necessarily advocate their positions.

Do people remember when OS

Do people remember when OS APIs were specified using Pascal conventions? How quaint! (Though it looked absurd even at the time.)

Yes I do. Why did it look

Yes I do. Why did it look absurd?

Can you post a bit of code

Can you post a bit of code? (I can't locate anything now.) If I am not mistaken, taking a look should be enough...

Sorry: I can't locate

Sorry: I can't locate anything either, I gave away the relevant books years ago, and my memory isn't that good!

Inside Macintosh

Inside Macintosh (warning: 18MB PDF)

The sample program on page I-16 doesn't look too shabby.

Thanks. That's exactly what

Thanks. That's exactly what I was looking for.

Typical (arbitrarily chosen) API example (p.536):

PROCEDURE SFPGetFile (where: Point; prompt: Str255; fileFilter: ProcPtr;
                      numTypes: INTEGER; typeList: SFTypeList; dlgHook: ProcPtr;
                      VAR reply: SFReply; dlgID: INTEGER; filterProc: ProcPtr);

It's got those naughty fixed-length Pascal strings, but apart from that it looks OK to me. And beautifully documented.

Also used in OS/2, Windows 3

The Windows 3.x calling convention was Pascal, still used as 'stdcall', but with the arguments in reverse order (C order).

http://www.unixwiz.net/techtips/win32-callconv.html

The Win 3.x convention is

The Win 3.x convention is what I had in mind.

Since no one mentioned it:

Since no one mentioned it: Another holy war raged between the camp of the semicolon as statement terminator and the camp of the semicolon as statement separator. Those were the days!

studies of the semicolon

My impression is that there was a lot more concern about the usability of languages in those days, or at least in a more explicit form. Today, it has a smell of "software crisis" about it.

The "Why Pascal is not my fav" diatribe cites an article by J. D. Gannon and J. J. Horning on this subject and which I could not find online, but I did find this, which may or may not be representative.

Maybe the discussion today has moved from "how do we prevent programmers from making mistakes" to "how do we detect them", whether via tests, static typing, or formal proofs.

printf("goodbye, Dennis");

Economist obit. Very nice and tasteful.

Someone was complaining that Dennis Ritchie's passing didn't get as much press as Jobs's, but I think the converse is actually true: it is getting more attention than it would otherwise have gotten, given the increased awareness. I don't think that is a bad thing, but I also don't know if it will last.

O.K. we are now thankful to

O.K. we are now thankful to Jobs that he died because otherwise no one would have noticed Dennis Ritchie or John McCarthy.

dmr day