A Concept Design for C++

In the video A Concept Design for C++ and the related paper Design of Concept Libraries for C++, Bjarne Stroustrup and Andrew Sutton describe how they're going to avoid the problems that led to concepts getting voted out of C++11. In a nutshell, they seem to be focusing on the simplest thing that could possibly work for the STL (C++'s Standard Template Library).

C++ does not provide facilities for directly expressing what a function template requires of its set of parameters. This is a problem that manifests itself as poor error messages, obscure bugs, lack of proper overloading, poor specification of interfaces, and maintenance problems.

Many have tried to remedy this (in many languages) by adding sets of requirements, commonly known as "concepts." Many of these efforts, notably the C++0x concept design, have run into trouble by focusing on the design of language features.

This talk presents the results of an effort to first focus on the design of concepts and their use; only secondarily do we look at the design of language features to support the resulting concepts. We describe the problem, our approach to a solution, give examples of concepts for the STL algorithms and containers, and finally show an initial design of language features. We also show how we use a library implementation to test our design.

So far, this effort has involved more than a dozen people, including the father of the STL, Alex Stepanov, but we still consider it research in progress rather than a final design. This design has far fewer concepts than the C++0x design and far simpler language support. The design is mathematically well founded and contains extensive semantic specifications (axioms).


Research Paper

Here's the research paper for a deeper look at this experimental approach to solving the Concepts problem in C++.


Thanks, I'll add that to the post.

This thing has been under

This thing has been under discussion for ages. Just in case you thought technology and/or software were fast-moving fields...

Hard problem

Well, I suppose it is a hard problem to put a roof on a building whose statics can't even support its own exterior walls.

That is...

...an astoundingly high-quality pun. Well done!

Very well put, I agree. It's

Very well put, I agree.

It's funny how language designers learn from the products of other designers, not from their tribulations.

Care to elaborate?

Care to elaborate?

Only at the point of a gun

Only at the point of a gun. :) I had hoped that the metaphor was sufficiently self-evident.

Not really...

Your analogy is cute, but it seems to me you're certain of something that is in fact uncertain at this point... C++ isn't a perfect language..., but C++ Concepts - as described in the referenced paper and talk - do in fact stand a chance of being realized in some related form in the future. It's research at this point. Give it its due time... Further, and more in line with this PL community, the notion of Concepts in general is worth at least some meaningful discussion amongst PL design experts such as yourself. It's somewhat troublesome to me that this topic would elicit only a questionable analogy followed by "yeah, what he said!" remarks here. Really?

From the distance

I have followed the concepts development from a distance. But I'm sorry to say that I have never discovered anything there that had not long before been invented much more cleanly with type classes, higher-order modules, or structural object types (despite repeated claims that it is something completely different). The research mainly seems to be about retrofitting something like that into the peculiar structure of C++, but I'm not sure there's much to learn for language designers outside that realm. So to be honest, I'm not surprised that all you see is shrugs.

An add-on to an add-on to an add-on

As much as the world yearns for adding jet engines and wings to garbage trucks and converting them to run on french fry oil, should we give it to them? Maybe we should think of something better and invest our time and energies into migrating people to that better thing, rather than add to this language that is in my honest opinion, a monstrosity.

If 10 years ago all PL people had simply said "to hell with C++" and instead devoted their energies elsewhere, I think we'd all be better off. (To be fair, at least 90% of them already have).

My projection is that all language features, particularly the most powerful and expressive, cleanest and most beautiful, will eventually find their expression in C++ in the most broken, horrid, ugly, bastardized, barely recognizable form conceivable. Maybe I'm alone in my hope that C++ would meet its bitter end sooner rather than later, but the worrying trend seems to be that smart and well-meaning people keep falling into its seductive traps and perpetuating its farcical existence.

10 years is about right, it's

10 years is about right; it's when a vast majority of PL people gave up on C++. (Maybe much earlier - 15 years?)

Whenever I look at these proposals for C++, they always have to make way too many compromises in light of the language's cruft. Someone says: hey, "you should look at C++ again, it's much better today," and then after just 20 or 30 lines of code I'm already done with it; I don't have to put up with this cr*p anymore, so why should I? For many of us, C++ is the abusive ex-gf/bf who we don't think will ever change and never want to see again, especially if we aren't in an industry where C++ is the lingua franca.

However, there are still many C++ programmers out there, so there are still some PL researchers working on C++, and it is a nice/uncrowded niche if you can stomach it. But ya, most of your innovation time will be spent on creating workable bandages rather than on general innovation.


"If 10 years ago all PL people had simply said "to hell with C++" and instead devoted their energies elsewhere, I think we'd all be better off. (To be fair, at least 90% of them already have)."

Please explain.

Human effort

This is a pretty basic time/resource investment question. Suppose we work at a planetary scale.

Let's say that Mc is the effort spent maintaining and fixing bugs in C++ code every year, and Nc effort is spent writing new C++ code every year. Efforts to improve C++ as a language benefit mostly new code, so we might say that a p% productivity increase would reduce the new code effort to Nc' = ((100 - p)/100)Nc, but not really change Mc much initially. In the limit, Mc should go down as well because the new code is more easily maintainable. Suppose that it is close to p% as well. Mc continues to increase every year, but at a slower rate.

Given C++ has been around for about 30 years we might estimate the total amount of effort to write the existing C++ code if we were to rewrite it (now, from scratch) as 30Nc. This is a *huge* over-estimate because C++ maintainers claim that more new C++ code is being written now than ever before, not to mention all the C++ code that was written and has not survived.

Suppose instead that some huge, fixed effort Yn is required to make a better alternative to C++, with a big productivity boost, say a factor of 3, and a big maintenance boost, say factor of 2.

First, to replace all existing 30Nc effort in writing that C++ code, we need only 10Nc total effort. Suppose, as a planet, we stopped all C++ development and focused on doing that. Then we save 0 per year initially (since we still need to maintain the C++ code until it is replaced), ramping up to 0.5Mc per year savings when we are done (since we have to maintain the new code, which simply replaces the existing code). If we choose a timescale of R years to do the replacement, then we need to spend 10Nc/R per year to achieve that goal. Suppose that R = 10. Then we need only spend the same amount of effort we are expending writing new C++ code each year on writing using the new alternative to completely replace everything in 10 years. Over those 10 years we save a total of 2.5Mc effort in maintenance (integrate the savings of incremental replacement over 10 years) and then 0.5Mc per year afterwards in exchange for first paying the Yn fixed cost to develop the alternative.

That's the basic math. We could do the incremental thing and make C++ better, saving some small percent (say p%) every year, or instead pay a large up-front cost and a large switching cost and amortize them over a long timescale. (But even better, we can do both, though we become profitable sooner the more effort we put into switching to the new, better thing.)

I picked some numbers and put in some unknowns, but honestly I don't think my original estimate of 10 years was all that far off. Most C++ software, like most software, has already died, and I sincerely doubt that the total existing C++ codebase is more than 10Nc, in which case my argument is even more compelling.
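The figures in this argument can be written out as a small, purely illustrative calculation; every constant below is one of the comment's own assumptions, expressed in units of Nc (one year of current new-code effort) and Mc (one year of current maintenance effort):

```cpp
#include <cassert>

// All figures are the comment's own assumptions, in units of Nc
// (yearly new-code effort) and Mc (yearly maintenance effort).
constexpr double existing_code = 30.0;     // total C++ code, as 30 years' worth of Nc
constexpr double productivity = 3.0;       // new language writes code 3x faster
constexpr double maintenance_boost = 2.0;  // new code is 2x cheaper to maintain
constexpr double years = 10.0;             // timescale R for the rewrite

// Effort to rewrite everything in the new language, in Nc-years.
constexpr double rewrite_effort = existing_code / productivity;     // 10 Nc

// Yearly effort during the transition: same as today's new-code effort.
constexpr double yearly_effort = rewrite_effort / years;            // 1 Nc/year

// Maintenance savings ramp linearly from 0 to 0.5 Mc/year over R years,
// so the total saved during the transition is the area of a triangle.
constexpr double final_savings = 1.0 - 1.0 / maintenance_boost;     // 0.5 Mc/year
constexpr double transition_savings = 0.5 * final_savings * years;  // 2.5 Mc
```

This reproduces the numbers in the argument: a 10 Nc rewrite spread over 10 years costs 1 Nc per year, saving 2.5 Mc during the transition and 0.5 Mc per year thereafter.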

Can you phrase the "research

Can you phrase the "research question" in C++ neutral terms?

What do you mean?

The research here focuses on defining a machine-efficient design for implementing Concepts in C++. It's certainly a research problem.

Neutrally, I guess you could say it's computer science research into how best to design Concepts for native imperative statically-typed programming languages that employ template-based genericity.

I don't know. Something like that? Does that sound researchy enough? :)

How about

"How to retrofit something remotely reminiscent of a type system onto something that primarily is an untyped macro system by design, and proud of it?"

It is quite hard to keep out the sarcasm, and that's probably where the last half sentence is coming from. The rest, I claim, is technically accurate.

(Btw, just to be clear, I (have to) use C++ on a daily basis, and wouldn't mind fundamental improvements. However, from a language design perspective the whole thing is so far past any hope of repair that I have stopped caring long ago.)

At its heart templates are a

At its heart templates are a form of macro system. In C++, the language in which you write templates is different from C++ itself. There are two layers (two languages): the runtime C++ language and the compile-time template programming macro language. C++ manipulates runtime values (like "1") and the template programming language manipulates types (like "int"); i.e., C++ types are the template programming language's values. The problem is that this template programming language is untyped. So when you define a template (which is the template layer's analog of a function), it is untyped.

template<X> ... use X here ...

The body of the template will only be checked when the template is applied to a concrete X. Concepts are an attempt to make this template language typed:

template<C X> ... use X here ...

Now concept C is the "type" of X.

The language Magpie has something very similar to templates, but the distinction between the two levels is removed: both layers are the same language. Multi-stage programming generalizes the two stages to multiple stages, but is less general in other ways (e.g. types cannot be staged AFAIK, which is the whole point of Magpie's type system and of C++ templates). Also related are F#'s type providers.

All in all I think this issue of how to design such a macro/staged type system is an interesting research question and is C++ neutral. Of course the best way to approach it is *not* by starting from C++, but by starting with a simple language where you have multiple layers instead of two and where each layer is the same language instead of a completely different language. It doesn't seem unlikely that there will be exciting research that unifies and simplifies template-like metaprogramming, a Magpie-like type system, F#'s type providers and staged programming into a nice whole, because all of these feel like specific use cases of a more general feature.

What language design problem

What language design problem do you hope to see solved wrt typed staged programming that hasn't been addressed already by systems like MetaML and the like?

I'm not very familiar with

I'm not very familiar with MetaML but IIRC while it allows you to generate expressions, it does not allow you to generate types. This is something that Magpie, C++ templates and F# type providers do provide. That is, MetaML doesn't have any analog of this:

class Foo { ... }

(of course the C++ template language is rather limited, but the same feature in F# is much more powerful because you have all of F# available to generate the type you want)

Secondly, I don't think MetaML has a compile time stage like the others I mentioned (and like Lisp macros). Instead of main having type unit, it should have type <unit>. The compiler will then evaluate the value of main at compile time, and compile the resulting code.

Third, I'm not sure if MetaML is powerful enough to express all code generation patterns you want. For example a state machine like the following, where each of the stateN calls the others in tail position:

let state0 x = ...
let state1 x = ...
let stateN x = ...

You want this when you compile regular expressions to an efficient DFA.

I'm not an expert on staged

I'm not an expert on staged programming myself, so I cannot answer all of your comments. But the examples you give should be expressible in e.g. Template Haskell (but it is not fully typed at the macro stage). Meta[Oca]ML supports an arbitrary number of (typed) stages. Surprisingly many things are actually expressible with type classes or plain ML modules alone. And I'm sure there are plenty of other contenders that have more expressive power to offer than what template hackery provides.

It is Worthwhile

As a professional C++ developer and an amateur PL researcher, I would take issue with the claims that this is irrelevant/out of place.

C++ is hideous, I know that. But it will continue to be used for the foreseeable future. Sure, we have arguments about the number of libraries already written, etc - but the real reason is that people like us won't make the call. The decision will be made by someone who just checks "industry standard" off their list and moves on.

That said, as Andreas Rossberg put it well - template metaprogramming is basically an untyped macro system. The absolute purity of its functional nature (there are no escapes for IO, no monads, etc.) leads to frequent comparisons to Haskell. Would you want to be stuck working with a Haskell that's had its type system ripped out?

Having to work within the framework of C++ makes this challenging, and the fact that it basically amounts to humanitarian aid to C++ developers makes it worthwhile. :-)

Humanitarian aid for C++ developers

Now that's a fair and mostly accurate analogy :)

I can't agree with your "C++ is hideous" comment, but I guess it depends on what specific hideous characteristics you're alluding to. As an absolute statement about the language in general, it's a purely subjective remark (and wrong, which I guess is also subjective :)

"Untyped macro system" is fair. This speaks to the complexity faced by Bjarne et al at developing a reasonable and machine-efficient concepts facility for C++ templates. Bring on the aid!

I look forward to seeing how this research evolves and what emerges from it in the standard.

I'd also like to note that C++11 removes some of the hideousness you experience (just a guess, since you provided no examples, but the language has certainly grown cleaner, more expressive and more modern in its latest incarnation), though it will take some time for all compilers to catch up.

This speaks to the

This speaks to the complexity faced by Bjarne et al at developing a reasonable and machine-efficient concepts facility for C++ templates.

The supposition here is that the problems with templates (and everything else) are due to the low-level nature of C++. I keep hearing this sort of argument. But I fail to see the connection. Most of the issues with templates, and their type-checking, have very little to do with C++ being low-level, and everything with it being an arcane monument of accidental complexity (which can hardly be removed by adding more of it, btw).


By machine-efficient, I mean "it doesn't take longer to compile C++ code employing concepts than it takes to read James Joyce's Ulysses".

I'm not making the case that C++ concepts are hard to design correctly because C++ is a native programming language. As you stated earlier, adding typed-ness to an untyped macro-like functional system is really hard to get right. Further, any viable solution must guarantee that compilation is acceptably fast.

Over-simplified, of course. You'd be better served reading Bjarne's writings on the history and (as in the case of this thread) future of C++ Concepts.


Ah, sorry then, I thought you were talking about something else.

However, I have my sincere doubts that compilation complexity has ever been a serious concern in the design of C++, given that it is the language with -- by far! -- the most epic compile times I have ever worked with. Far past anything that would qualify as "acceptably fast" even in its current state. Case in point: a significant amount of the internal infrastructure at Google is devoted to vastly parallelising, globally caching, and otherwise improving build times for C++ projects. C++ would be flat out unusable at that scale without all this infrastructure -- because include files make compile times quadratic, and templates make them exponential.

And considering the direction in the concept design that BS has been championing (one with even more implicitness, and barely any structure), it looks to me like it is the most costly alternative imaginable.

(For the record, I've read quite a few of BS's musings. I don't always find them plausible.)

Quite fair. Specifically,

Quite fair.

Specifically, what I view as "ugly" about C++ is:
1. The inherited ability to subvert the type system wholesale. (Useful, but not pretty.)
2. Much of the template syntax (the rules on 'typename', for example)
3. The vector<vector<int>> parsing problem (fixed in '11)
4. The almost-equivalence of struct and class (close enough to violate TOOWTDI, different enough to cause occasional confusion)
5. The inability to access nested types in your CRTP parameters.

I guess my belief is that if C++ were thrown away, then rebuilt, it would be better. But satisfying the requirement that each iteration of the language be (at least mostly) backwards compatible with the last has led to design cruft like this.
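Item 5, for instance, can be shown with a small hypothetical sketch (the names Base and Counter are made up): at class scope the CRTP parameter is still an incomplete type, so its nested names are unusable there, while member-function bodies, instantiated later, are fine:

```cpp
template<class Derived>
struct Base {
    // typename Derived::value_type v;  // error: Derived is incomplete here

    int get() const {
        // OK: member-function bodies are only instantiated later,
        // once Derived is a complete type.
        return static_cast<const Derived*>(this)->value;
    }
};

struct Counter : Base<Counter> {
    using value_type = int;  // invisible to Base's class scope above
    int value = 42;
};
```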

Frankly, if those are your

Frankly, if those are your most serious concerns, then you should consider yourself a happy user.

I do. I said it was ugly, I

I do. I said it was ugly, I didn't say it was bad. Elegance is only one of many desirable properties of a system. For the work I'm doing now, C++ is very well suited, and it makes my job a lot easier.

Let me guess...

You work as a technical book publisher?

Haha, nope. Very

Haha, nope. Very low-latency systems. We work in a space where many modern comforts of Java-like languages (garbage collectors, tossing everything on the heap) are unavailable. And functional languages are not as useful as they are in other domains because the very fine-grained parallelism we need isn't well modeled without state.

What we really want is a better C. C because we need to interact with other C libraries (as a domain requirement). That said, we have built up libraries centered on respecting the type-system (not subverting it at every turn even though we could), and leveraging TMP to give ourselves more powerful types.

Contemporary computational fabric

You do systems software development. C++ (and of course C) are the main players here, the infrastructure ingredients of contemporary computing.

Google, as Andreas mentioned, is powered by C/C++. Microsoft is built primarily of C/C++ (for the most important systems and software technologies that power the business...). Apple and Facebook, too. And on and on...

The point is that there is programming language design research on the one hand, and then there's the computational fabric of the contemporary world on the other. Clearly, these are often at odds, and for understandable, mostly academic reasons - in a good way: computer science is science. The contemporary infrastructure languages can and do benefit from basic research.

Making what's here to stay better is a fine discipline, too.

I worry that students of CS aren't being afforded the opportunity to learn about - and help improve upon - the languages they will no doubt end up using to build software systems (like at Google and Microsoft, Facebook, Apple and.....).

Google spends a lot of time

Google spends a lot of time hacking around C++'s problems, right? Systems builders are most definitely involved in fixing C++ problems. Most people don't build systems or write games, and don't need to pay the complexity cost for the perf, hence the general lack of interest in C++. C++ is not universal anymore in terms of dev skills.


I worry that students of CS aren't being afforded the opportunity to learn about - and help improve upon - the languages they will no doubt end up using to build software systems (like at Google and Microsoft, Facebook, Apple and.....).

I have to say, statements like that really frustrate the hell out of me.

First of all, because they suggest, contrary to facts, that universities teach too little mainstream technology. I estimate that 97% of all CS students these days never see anything but mainstream languages at university, except perhaps as footnotes. So what you ask for is already the reality.

Second, this reality is sad at best. Most curricula are just rebreeding the same old mediocrity over and over again. Instead of actually teaching something people can't just as well pick up on the street. Instead of spreading the word that advances have been made in PL over the last 30+ years. Instead of contributing to having this knowledge trickle down into practice eventually. Instead, they just put out a streamlined, conservative labour force that applauds the umpteenth iteration over the state of the art of the 70s (be it named C++11, D, Go, or whatever) as innovation, because they don't know any better.

Universities aren't teaching too little mainstream technology, they are teaching too much mainstream technology! Heck, they are teaching too much "technology"! Which is why we keep stagnating where we are forever.

But architecture majors learn AutoCAD...

For the most part I agree. It's just that it makes sense to teach languages conducive to learning the material, that don't get in the way of teaching like C++ would. I can't imagine even thinking about using C++ in a class unless it was in an area where C++ is fairly necessary; e.g., graphics, OS, embedded systems. Actually, I think this is what happens in the better American CS programs.

BTW, I took my programming II (CSE 143 at UW) under David Notkin using C++ as the teaching language (after using C in 142). We basically didn't learn much about C++, just some basics, because knowing esoteric C++ features wasn't the point. I don't even remember getting into how to use the "new" operator!

Principles, not tools

I would put it this way: teaching languages for their own sake should never be the point of a university CS course. Just like teaching how to use a pocket calculator cannot be the point of a math class. It's a tool, nothing more. Universities should focus on teaching principles, not tools.

Having said that, all classes I have witnessed or heard of that tried to use or teach C++ were disasters. BS's repeated claims that this is just an indicator of everybody teaching it wrong is, well, daring.

Principles and tools to learn them

Actually, Bjarne's intro to programming course at Texas A&M, where C++ is the tool employed to learn fundamental principles, is better than a daring approach.

Sean, I can't agree with your assessment that teaching C++ only makes sense in the context of specific computing domains. I find it odd that it's OK to relegate native tools like C++ to "only areas where they make sense" but it's fine to teach Java and Python as a rule of thumb for undergrad CS majors. Why? What's the reasoning here? At any rate, let's have this conversation on a new thread and not pollute this one with off-topic ranting (like I'm doing :)

PS: My apologies for forking this thread. Let's zip it back up to C++ Concepts design.

I have to disagree

IMHO CS106B (Programming Abstractions) offered at Stanford does a pretty good job.

I also think that "teaching principles not tools" is too much of an extreme position -- not all of CS is TCS and in order to understand the theory it's often quite useful to experience the practice (which can serve as a pretty good motivation for theory and can help to identify actually relevant research problems even if one does choose the TCS path). Teach both (at the same time, sure, teaching solely the tools is mostly a job for vocational schools and/or professional courses). There's a point to learning the "low-level" early, too, though, c.f. http://www.joelonsoftware.com/articles/ThePerilsofJavaSchools.html

C++ has its warts, no argument there; however, what a pure-academician type might see as "unacceptable" deviations from The One and Only True PL Design Ideal is often necessary in practice (cf. backward compatibility with C) to achieve wide adoption in the market. If a PL is to be more than a mere academic exercise leading to a paper or two published in some forgotten collection of proceedings from an obscure conference and never to be heard of again (disclaimer: OK, OK, I'm exaggerating for effect here), compromises might sometimes be necessary.

Naturally, the starting point in time affects the compromises being made; e.g., a "compromise" Scala has chosen to make (the JVM as its platform) arguably still wasn't drastic enough to impact its elegance (which I think of positively, by the way). However, at the time C++ was being designed, the existing PLs simply didn't even consider, let alone solve, the problems C++ had to tackle (like *high-performance* abstractions with a generic programming paradigm).

Some of the choices it has made are IMHO a sign of brilliance -- instead of adopting 1959-style memory management (GC, undeservedly fashionable in certain circles IMHO -- thankfully, some recent developments I'm watching with interest, like Deca and Rust, seem to be abandoning this blind alley of PLT&D, or at least making its imposition optional), it adopted deterministic construction/destruction, which allowed for universal resource (i.e., not just memory) management in the form of RAII and smart pointers. Note that I am *NOT* arguing performance here -- I'm going much further (or just in a different direction): in my opinion, automatic resource management is in fact more elegant and more programmer-friendly (thinking of "corner cases" like try/catch/finally, with-clauses, etc. in C# or Java here for non-memory resource management, which have to be handled manually on the end-programmer side, while in the case of C++-style ARM the implementation happens once-and-only-once on the library-writer side). Note also that I'm most definitely not claiming perfection of this solution either (yes, I'm aware of cycles and weak pointers); I'm not even convinced that a perfect solution exists or can exist.
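The ARM point can be made concrete with a minimal, illustrative RAII wrapper for a non-memory resource (this File class is a sketch, not a library type): the cleanup is written once, in the destructor, and runs on every exit path, where Java/C# would need a finally or using at each use site:

```cpp
#include <cstdio>
#include <stdexcept>

// Illustrative RAII wrapper: the close() logic lives once in the
// destructor instead of in a finally-block at every use site.
class File {
    std::FILE* f_;
public:
    File(const char* path, const char* mode)
        : f_(std::fopen(path, mode)) {
        if (!f_) throw std::runtime_error("open failed");
    }
    ~File() { if (f_) std::fclose(f_); }  // runs on return *and* on throw
    File(const File&) = delete;           // one owner, deterministic release
    File& operator=(const File&) = delete;
    std::FILE* get() const { return f_; }
};
```

Every scope that constructs a File gets correct cleanup for free, on normal exit and on exceptions alike.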

At the risk of making a somewhat shocking/controversial (to some) statement, I happen to see these imperfections and impurities as the beauty and strength of C++ -- especially not being "purely-OO" (as was fashionable some time ago) nor "purely-FP" (as seems to be fashionable once again at the moment). This focus on multiparadigm programming is what I find often lacking in PLs considered to be the "competitors" of C++ (and is also the reason I think of Scala more highly than of, say, Java -- which IMHO didn't contribute anything at all to the state of PL design and is arguably even uglier (thinking of unavoidable 1990s-style OO boilerplate code here)). From the "academic" side the prominent example is Oz, but (sadly) it seems to be the exception rather than the rule.

AFAICS, nobody here has

AFAICS, nobody here has argued for the "The One and Only True PL Design Ideal". On the other hand, describing C++ as just having made a few "compromises" is One Big Understatement. To point out that myth once more: most of the mistakes in C++ have little to do with the overall goal it tried to address.

And if you really believe that hierarchical sequential memory and resource management via con/destructors is the ultimate way to go, then, perhaps, you never had to write a non-trivial concurrent program with that? Or made any interesting use of first-class functions?

If you've read my last two

If you've read my last two paragraphs, you should be aware that I don't really believe in "the ultimate way[s] to go"; in particular, recall: "[n]ote also that I'm most definitely not claiming perfection of this solution either (yes, I'm aware of cycles and the weak pointers), I'm not even convinced that a perfect solution even exists or can exist."

There's nothing mythical about the history and development of C++, it's quite well documented, cf. "The Design and Evolution of C++".
Out of curiosity, care to point out a few "mistakes" that get you particularly riled up? You seem to be mentioning them quite often, never being too specific, however.

Could you please clarify what you mean by "hierarchical sequential memory"? You appear to be the only person using this term in this context.

I do happen to rely quite heavily on concurrency in what I do, who knows though, perhaps it's all "trivial" and I've just never been told ;-)

Re "interesting use of first-class functions" -- "interesting" is a subjective term, but you got me intrigued, could you point out a use requiring GC? Closures and coroutines certainly don't.

Two examples

Two examples of what strikes me as particularly poor design:

1. Combining, in one language, user-defined implicit type coercions, ad-hoc overloading, and overlapping generic definitions (aka (partial) template specialisation). Each is already a questionable feature on its own, but throwing them together in the uncontrolled way C++ does is a solid recipe for naughty surprises and subtle bugs -- especially when generic code is also untyped. (The latest concept design makes no attempt to address this; rather, it proliferates the problem.)

2. The language and its class system are neither closed under template abstraction, nor does turning something into a template, if it works, necessarily maintain the meaning of contained constructs. I.e., templates are not compositional, not orthogonal, and the rules and restrictions surrounding them tend to be rather random. A major problem IME for generalising code.
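The first of these can be demonstrated without even involving templates; a user-defined conversion plus ad-hoc overloading already yields a classic surprise (the function name `which` is made up for illustration):

```cpp
#include <string>

const char* which(const std::string&) { return "string"; }
const char* which(bool)               { return "bool"; }

// which("hello") silently picks the bool overload: the standard
// conversion const char* -> bool outranks the user-defined
// conversion const char* -> std::string in overload resolution.
```

A caller who added the `bool` overload later may never notice that every string-literal call site quietly changed meaning.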

In general, C++ has been piling up far too many ad-hoc features too fast whose interaction was far too poorly understood by its makers. It's a collection of "features" not a design. It was driven by use cases, not by semantics. (The problem with the latter is that you can find cute use cases for any terrible feature.)

I actually own both D&E and ARM, bought when they first came out. IMHO they don't clear up the myth about the inevitability of C++'s design, they very much planted it -- as I said earlier, I don't always find BS's claims plausible.

Re "hierarchical & sequential": con/destructors and RAII ultimately tie memory and resource lifetimes to sequential control flow through hierarchically nested scopes. But with concurrency, lifetimes aren't sequential; with long-lived higher-order functions, they aren't hierarchical (well-nested). You either have to hoist them (=> abstraction breaches, memory overhead & potential space leaks) or reference-count them (=> runtime overhead, esp. with concurrency, & potential space leaks).

Edit: read "sequential" in the sense of single-threaded.

C++ is a big language, but...

If you have time, I'd recommend that you watch Bjarne's C++11 Style talk, the one he gave at the same C++ conference that initiated this thread.

It's certainly true that C++ is a big language with many features (some of which are outdated or showing their age). I do think it's unfair, however, to claim its "makers" have let features into the language willy-nilly. Have you attended any ISO C++ committee meetings? Getting a new C++ feature standardized takes several years for a reason - the process of standardization is exceedingly thorough. Breaking changes - or ratifying poorly understood language features - of any significance would, in fact, break the world... C++ is a big language.

In your characterization of C++ - Features Gone Wild - Concepts would be shipping in C++11 as a half-baked implementation. In reality, C++ may never get Concepts.

Much of the damage

Actually, much of the damage was done before there was an official committee, or it had a chance to intervene. E.g. the examples I have given. After that, only so much could be repaired (although sometimes, more could have been done). Yes, compatibility is an issue, once it's too late. That only reinforces my point.

The difficulties with concepts are exactly due to all the earlier mistakes that have been made. Thus my original pun.

Re BS's talk, I had read through the slides earlier, but if there was anything in them that I hadn't heard before then I missed it. Anything specific you think is worth noting?

His talk is net new...

There's a lot in his _talk_ that is net new. Specifically, well, broadly :), suggesting that C++11 is not just a new language version (it feels like a new language, doesn't it? Certainly modern...), but that it represents a new style of writing C++ code in _modern_ times.

It's worth not just looking at his slides, as he adds more information via speech. If you are already well aware of C++11 and Bjarne's new definition for the language, then my apologies for suggesting that you watch the presentation in addition to browsing the slides (yes, this is a time commitment, but you can download the media and watch when the right time presents itself).

Leaky abstractions

The main thrust seems to be that C++11 is all better because of the new abstractions in the language and its library. Using library abstractions is also the recipe he has advertised for teaching C++.

The problem with abstractions in C++, though, is that they are incredibly leaky. That means that (1) it is easy to make mistakes, and (2) as soon as you do, all the low-level bare metal blows up directly in your face, striking through all abstraction layers. So, (3) at that point, you have to understand all layers and their implementation details simultaneously.

In my experience, programming in C++ is about 30% coding and 70% debugging. The net win of "clever" but leaky abstractions can actually become negative under such circumstances.

C++ abstractions are leaky as a general rule?

Which C++ abstractions? All of them?

Of course, leakiness here is implementation dependent and isn't a property - or general rule - of a programming language... I'm sure there are plenty of examples of C++ library implementations that are leaky (like the canonical example, string classes...), but this is true for library abstractions written in other languages too...

Without doubt

In unsafe languages, all abstractions are leaky. But in my experience, the safety holes of C++ in particular are spacious and numerous enough to remind you of this fact on a daily (hourly?) basis.

I would put it this way:

I would put it this way: teaching languages for their own sake should never be the point of a university CS course.

Just like having a fluent English conversation shouldn't be the goal of learning English at school. That's how it is taught in many countries.

Principles or Tools

"Universities should focus on teaching principles, not tools."

OMG NO! Let them teach tools. This is much better than teaching all the wrong principles, which is what most of them do. I am thinking of OO, of course...

Let's be real. There's no tradition of science in so-called computer science faculties; indeed, in Australia most computing is now taught in business schools (IT) or engineering. Lacking this tradition, teachers speak of myths which in Physics would be akin to preaching the existence of Perpetual Motion Machines.

The best programmers I know came out of Maths, Engineering, Medicine, Physics .. or were self-taught: anything but CS.

Wrong principles or wrong tools

OMG NO! Let them teach tools. This is much better than teaching all the wrong principles

And teaching all the wrong tools would then be better how?

Low-latency doesn't mean you can't use FPLs

You just get the FPL to write the programs for you. See e.g. Atom.

Better C

If you want a better C try Felix. Bind C or C++ without executable glue (you need type glue, since the idea is to put a real type system on top of the C/C++ object model).

Oh, yes, it has "concepts" too. [Ermm .. actually they're basically Haskell style type classes with axioms thrown in]

Concepts should eliminate dependent names, but they still won't provide type recursion .. heck, C has type recursion, C++ is way behind.

Actually, I enjoy Pure ....

... which is ML with its type system ripped out and with lambda calculus expanded to general term rewriting.


The PL context would be generic programming

A C++ design paper by Stroustrup is definitely going to focus on issues that are mostly only relevant to C++. That point shouldn't be too surprising or debatable. To turn the topic towards a broader PL question, maybe the issue could be framed as follows.

If a language is going to support generic programming, it can either take the route that, by default, a generic function accepts any type without complaint so long as it defines operations which match the syntax of the generic function (this is what C++ does), or it can take the approach that the set of types which can be accepted always starts with some more explicit type lineage (this is what Haskell does).

In either case, one can see the utility of additional language mechanisms to restrict or permit a specific set of types to be used as particular arguments to a generic function. For languages without type variables, it's an interesting question what sort of syntax, semantics, and pragmatics work well for expressing these restrictions. Can anyone recommend some papers addressing this question in a way that isn't very specific to a particular language or implementation, or is the topic inevitably specific to a given language?

Stroustrup and Sutton spend most of their prose on a different question: given the type of generic programming which is most common in contemporary C++, how should one design a reusable library of type restricting expressions.

Concept maps

To avoid only contributing to the redditization of LtU, I'll ask a technical question that maybe someone can answer. I watched some of the video and one of the comments made was that they weren't implementing something that IIRC was called 'concept maps'. Does that mean they are giving up the ability to define an instance of a concept for every instance of some other concept? Is this equivalent to losing the constraints to the left of => in Haskell instances?

Concept maps