Arc in action (a.k.a. it's aliiiiive!)

Recent discussion of what software LtU is running, and what languages it relies on, reminded me of Paul Graham's language, Arc. Despite many rumors of its death, Arc's seed finally sprouted in February, when Paul's startup incubator, Y Combinator, launched a reddit-like social news site for startups, Y Combinator Startup News. In the launch announcement, it was mentioned that the site is built with Arc:

And of course another reason we made this site is that last summer we wrote the first reasonably efficient implementation of Arc, and we were looking for something to build with it.

This month, a new version of the news site was released, based on a new version of Arc with the innards redone by Robert Morris to reduce consing[*], resulting in a 2-3x speedup.

Arc still isn't available to the public, but painstaking NSA-class analysis of the related threads, Why we made this site and New, much faster version of News.YC reveals that Arc code in the current implementation compiles, mainly by macroexpansion, to MzScheme code. The news site's web server is also written in Arc.

There's not much more news than that, which is why I'm not posting this as a front-page story.

[*] Perhaps this could be considered an example of the hot new trend for zero-cons programming?

Arc still puzzles me

I'm all for people writing their own languages, but I still haven't figured out how Arc isn't just Common Lisp with most of the library cruft removed. Anyone?

Real fandom

I like particularly this comment on the YC comments site:

I am as big an Arc fan as the next guy, but I can't say it looks any more exciting than PHP.

Imagine someone saying: I really love [favourite-band-inserted], and I regularly scream and faint at their concerts, but I don't like their music.

Real fandom does not need causes, it just needs other fans.

Perhaps the implication is

Perhaps the implication is that the next guy is not a very big fan.

The next guy

Only the next guy is a big fan, and he is an even bigger one when he is virtual and absent. There are similar twists about knowledge and belief. Knowledge is the absence of the absent true believer. It's not so much that I don't believe, but that there is also no credible believer who gains true insights from his belief. If you want to save God, he must be proven, and reality, to be real, must be rationally constructed from certain unquestionable facts. The chain of authority is inevitably broken. This is also why science was once subversive and still is occasionally. But science has research interests and promising fields of study where progress is expected. So we don't have to be interested ourselves; it suffices that there is the next guy who is interested (and he is of course much, much smarter than I am).

I hate to call it vaporware...

But the only public information about Arc is that it is a Lisp variant. So it seems a little silly to even try to discuss its merits unless you're Paul Graham or Robert Morris.

Discussion points

Difficult to discuss the technical merits given the dearth of details. My impression is that it will be an opinionated dialect of Lisp.

OTOH, a discussion on personalities and cultures that surround the development of PLs might be interesting.

Lisp

Chris Rathman: My impression is that it will be an opinionated dialect of Lisp.

As contrasted with the bland, flavorless, Swiss-shamingly neutral Common Lisp and Scheme communities? Now you're scaring me. :-)

I'm afraid I have to count myself among those who fully expect Arc, if it ever actually sees public release, to be BAD: Broken As Designed. I have an extremely difficult time imagining a Lisp combining cleanliness and pragmatism that would offer anything sufficiently compelling that Bigloo, Chicken, and (native-code compilation aside) MzScheme don't already offer.

But this is one of those areas in which I would love to be proven wrong.

I blame it all on McCarthy...

...for never having finished the M-Expression version of Lisp:

Another reason for the initial acceptance of awkwardnesses in the internal form of LISP is that we still expected to switch to writing programs as M-expressions. The project of defining M-expressions precisely and compiling them or at least translating them into S-expressions was neither finalized nor explicitly abandoned. It just receded into the indefinite future, and a new generation of programmers appeared who preferred internal notation to any FORTRAN-like or ALGOL-like notation that could be devised.

My hand at the blame game

I blame it all on McCarthy...

...for never having finished the M-Expression version of Lisp:

I'll blame it on Apple for not going anywhere with Dylan. I wonder where the environments would be today if they had run with it.

Harlequin went open, is it really available/working?

The CMU and Harlequin versions went open source, no? Harlequin had an IDEish thing?

Functional Objects went open source

It was a couple years ago I guess. Everything is there. The IDE is windows only because DUIM (the toolkit) is Windows only at this time. Supposedly, some are working on a gtk+ backend.

It's just too bad that the community is so small. Besides some quirky verbosity, Dylan would be my favorite language.

But my point was that if Apple had stuck with Dylan, the community would have been much bigger, and everybody who hates Lots of Irritating Superfluous Parentheses would have gotten a quite powerful language in the Lisp family. Well... we still have the language, but the Java borg pretty much gobbled up Functional Objects' run at it.

[OT] Dylan on JVM

Worth somebody taking a stab at Dylan-to-Java-bytecodes? (Or give up and use something like Scala?)

Harlequin Dylan is open source

Harlequin Dylan is currently called Open Dylan, and available from here. The IDE is only working on Windows so far.

Single implementation glory

I don't need Arc to be technically superior to any given Scheme or Common Lisp implementation. Just to have a single community sharing a single implementation would make it much more appealing to me.

I'm totally flabbergasted that Schemers worry about how to define a record in their program and Common Lispers worry about how to open a socket, and everyone thinks that the solution is more standardisation.

Huh?

I really don't get your point. As a (PLT) Schemer, I really don't worry about how to define a record. Can you explain in more detail?

(I'm moderately glad that people are working on standardisation, but I secretly hope the increased complexity of R6RS will actually kill off some implementations.)

I will have to work to find

I will have to work to find the right words. My reasons are probably the same ones that make you hope for some Scheme implementations to die.

Common Lisp

I've now blogged my detailed thoughts on why I haven't been able to simply choose one Common Lisp implementation and ignore the rest (much as I'd like to). I would be interested to know how the Scheme experience compares.

Good point

"Many not altogether compatible implementations" is the number one reason why I avoid Lisp and Scheme. Once I got hooked on the tremendous stability of Perl & Python (and Erlang) across platforms, there's no going back.

Code isn't written in specifications

Thing is, you only need to choose one of those implementations. Then you can ignore the rest.

Only...

...if you're the only consumer of your code.

Also, code isn't written in specifications... unless it is (in Coq, Twelf, Isabelle...) :-)

Seriously, Coq's extraction ability (to Haskell, O'Caml, or Scheme!) is pretty awesome.

You could, yes

But then you wouldn't have the full power of the language community working on bugs and improvements and, most importantly, libraries.

The problem isn't standardization

The problem is inappropriate levels of standardization. (In some cases, there are also problems with vendors who intentionally subvert standardization processes, but I doubt this applies to the Lisp community).

Sometimes, things are standardized too strictly, making it very difficult if not impossible to produce a conforming implementation.

Sometimes, things are not standardized well enough--issues which affect large parts of the user base (like object models, or how to bind to a socket) are either left as implementation details, or are standardized by disjunction--any of several competing approaches is permitted. Users who wish to write portable code need to consider the mechanism chosen by EVERY implementation they wish to support.

The latter case, I suspect, is what affects the world of CL and Scheme. It also affects the C/C++ worlds to some extent; companies like Rogue Wave make money writing abstraction layers for the many different ways to do I/O in C/C++ (where many things, including networking, unbuffered or asynchronous IO, any multimedia or graphics, and any concurrency or synchronization, are completely ignored by the relevant standards).
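To make that concrete, here is a toy sketch of the kind of shim such vendors sell: one call papering over BSD sockets versus Winsock. The names are mine for illustration, not from any actual Rogue Wave product.

    // Toy portability shim: hide per-platform socket setup behind one
    // interface. Illustrative only -- not from any real vendor library.
    #ifdef _WIN32
      #include <winsock2.h>            // Winsock: its own types and lifecycle
      typedef SOCKET socket_t;
    #else
      #include <sys/socket.h>          // BSD sockets: plain file descriptors
      #include <unistd.h>
      typedef int socket_t;
    #endif

    socket_t open_tcp_socket() {
    #ifdef _WIN32
        WSADATA wsa;                   // Winsock requires explicit startup
        WSAStartup(MAKEWORD(2, 2), &wsa);
    #endif
        return socket(AF_INET, SOCK_STREAM, 0);
    }

    void close_tcp_socket(socket_t s) {
    #ifdef _WIN32
        closesocket(s);                // ...and its own close call
    #else
        close(s);
    #endif
    }

Multiply that by threading, IPC, graphics, and asynchronous I/O, and the abstraction-layer business model writes itself.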

A well-written standard CAN be an effective means to production of compatible but competing implementations. But far too often, it's in the interest of a supplier to make compliance a low priority, and/or aim for the production of weak standards which are easily conformed with, but which deliver minimal value to customers.

Will Arc fix this? If it's noticeably better than either CL or Scheme, such that it prompts migration from either of the two main LISP dialects, then maybe. More likely, it will end up like most attempts at unification--and result in further fragmentation, or else simply be ignored.

Graham, I think, needs to remember that the converse of "worse is better" is frequently true: "better" is often worse, especially when it disregards the established computing infrastructure. I don't know for sure what Arc will look like, but reading some of Graham's comments makes me think that it might be yet another attempt to Rewrite Everything From Scratch. Which we don't need.

Cynicism

I don't know for sure what Arc will look like, but reading some of Graham's comments makes me think that it might be yet another attempt to Rewrite Everything From Scratch. Which we don't need.

My lesser angel tells me that, for a certain kind of personality, the industry as a whole is better off if they do try to Rewrite Everything From Scratch. That way, they (and their egos, and their moral superiority) avoid getting in the way of those of us trying to get real work done. Whether this applies to Graham is a question for the reader.

Real work

those of us trying to get real work done

Is that a retaliatory round in the moral superiority battle?

Java was an example of Rewriting Everything From Scratch. From what I understand, you'd be doing different real work right now if it didn't exist. Compare Java to C++, which is a nice example of a language that wasn't rewritten from scratch. If you had to choose one over the other for all future code, which would it be?

We need more stuff to be rewritten from scratch.

Actually, much of Java

was (as far as your average enterprise coder is concerned) something new, rather than something rewritten from scratch. Java was simply written from scratch--and it was the first GC- and VM-enabled PL to be successfully marketed on such a wide scale. (A few waffle words in that sentence, I know--but technology-wise, Java wasn't terribly new.) And Java did have good reasons to shove the industry from compiled-to-native languages to an environment using a published IL; it probably helped Java survive the numerous attempts by a certain Northwest software vendor to kill it. :)

Graham, on the other hand, appears to consider the various industry-standard high-level VMs out there (the JVM, the .NET platform, Parrot, etc.)--which have been developed over the years and each successfully host numerous interesting languages--to be Not Good Enough. Reading between the lines, it appears that he is presently in the business of making a custom VM or other runtime environment (something which is a fun project to undertake, but probably not necessary for a production language) to go along with Arc. I can only speculate what cool new features or capabilities are in Arc which cannot be (reasonably) supported on one of the aforementioned platforms.

That said, industrial programmers trying to get their jobs done have little to fear from Graham, or anyone else who designs PLs for fun or profit. (Unless they work for the software vendor previously alluded to, one which Graham, ironically, recently proclaimed to be "dead".) Most such projects fail to be adopted (often it's the world's loss too), and those which succeed will eventually have a raft of books and such written about them. If an O'Reilly-published tome entitled Arc: The Definitive Guide should appear at a bookstore near you (with the Roman aqueduct on the cover, I suppose), then you might worry. Especially since the only significant coverage O'Reilly currently appears to give Lisp (of any sort) is in the context of Emacs.

O'Reilly is already preparing

Previously, O'Reilly's stance has been a firm No to new Lisp books. In an apparent effort to phase in a series of Arc books modelled after the Java bookshelf, they have softened the wording on their Proposing a Book page, even mentioning the word "revival" in the same breath:

If you're addressing a topic where good books have sold dismally in the past (for instance, LISP, LaTeX, or Web-based training), you have a much higher threshold to clear with your proposal. Convince us why there is a revival of interest in your topic, or why your approach to a deadly topic will provoke interest nonetheless.

To answer your question

I should note that if I had to choose between Java and C++ for all future software development (a Hobson's choice if there ever was one), I'd choose... C++.

Why?

It's a fair bit easier to Greenspun up a reasonable higher-level environment on top of C++ (download a GC here, throw in a bunch of stuff from Boost there, etc.) than it is to do reasonable systems programming in Java.
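(To illustrate the Greenspunning, a throwaway sketch--std::shared_ptr standing in for the downloaded GC, a bare cons cell standing in for the rest of the half-of-Common-Lisp; nobody's real library:)

    #include <iostream>
    #include <memory>
    #include <string>

    // Informally specified, bug-ridden half of a Lisp, living inside C++.
    // Reference counting plays the role of the downloaded GC.
    struct Cell;
    typedef std::shared_ptr<Cell> Value;   // an empty pointer acts as nil

    struct Cell {
        std::string car;    // a serious greenspin would use a tagged union
        Value cdr;
        Cell(const std::string& a, const Value& d) : car(a), cdr(d) {}
    };

    Value cons(const std::string& a, const Value& d) {
        return std::make_shared<Cell>(a, d);
    }

    int main() {
        // (list "worse" "is" "better"), reclaimed automatically
        Value l = cons("worse", cons("is", cons("better", Value())));
        for (Value p = l; p; p = p->cdr) std::cout << p->car << ' ';
        std::cout << '\n';
    }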

But were I to choose one single language for the world (and were my choice to immediately create the necessary tooling and other infrastructure that generally accompanies widely-used industrial languages), it wouldn't be either C++ or Java. Probably wouldn't be any Lisp dialect either. (If it isn't obvious, I'm not sure what it should be; I'd much rather be able to choose two or possibly three...)

best. term. ever.

"Greenspun up"

genius.

I realize "greenspun" is a person's name...

...but it seems to me that the present tense of that verb (in English) should be "greenspin".

After greenspinning around in circles several times in my career (and making C++ compilers do unholy things that Bjarne never anticipated), I do believe this latter form has the correct, er, spin.

Of course, greenspinning (or greenspunning) is often a temporary state, as greenspun systems which prove popular (or which have the backing of a key vendor, personality, or standards body) often get added to the definition of the language in question. C++, through the numerous libraries which have been written for it (Boost et al), now resembles a respectable HLL, at least on alternate Tuesdays. (Likewise, I might be ornery and point out that CLOS itself was originally a greenspin; as were Flavors, LOOPS, and all the other attempts to build an OO system on top of a certain dynamically-typed functional language).

It's all enough to make my green head spin.

Back to work...

scratching out rewrites

I loved seeing the topic veer to a favorite: Rewrite Everything From Scratch (REFS for short). I agree with Anton that we need more stuff rewritten from scratch. But it kinda ruins any chance of an objective stance. Not that I think anyone here has one, right? (I'm glad Graham isn't here, since writing more code is a much better use of his time.)

If it's hard to discuss Arc in the absence of hard info, maybe we can instead discuss why one would want to rewrite things from scratch, and still stay on topic. Of course, reasons for are often also reasons against, since nearly every quality is both a good and bad thing, depending on the context used as a metric. I can't hold Graham accountable for vaporware, since I'm a really bad vaporware offender myself, and probably can't do enough penance. (Maybe another seven years will be enough.)

Scott Johnson: It's a fair bit easier to Greenspun up a reasonable higher-level environment on top of C++ (download a GC here, throw in a bunch of stuff from Boost there, etc.) than it is to do reasonable systems programming in Java.

That's a fair description of what I've been doing recently, but as a REFS nut, naturally I have to rewrite everything from scratch, with Boost equivalents more suitable to the runtime design. But I really don't want to talk about what I'm doing. (I don't want to be coaxed; it would take too long.) It'd be a lot more fun to discuss specific things in how typical code is done that ought to be rewritten, with the right amount of pragmatism to cut better down to the right level of worse. However, it just takes forever to code everything from scratch; dramatic pragmatism is necessary to get anything done. (For example, I think the way I'm doing it will make most people flinch; it's not pretty and violates most of the norms I use at work.) A very useful shortcut is to assume much more basic stuff is obvious compared to normal approaches, and write code accordingly in very high density.

I just want to re-implement Lisp, Smalltalk, Python, etc. in the same runtime so I can borrow libraries from all of them, and use a re-writing infrastructure to target different preferred delivery languages. (This is how you avoid being left stranded when thriving communities churn out far more than you can write yourself.) It would be nice to end up with a bunch of stuff that could be self-hosting even after being translated into different languages. But I'll settle for the runtime being the sort you'd use to implement an efficient, scaling C++ server. The stuff I'm writing is actually the way I wish the infrastructure had been in servers I've done in recent years, but there was never enough time to rewrite enough from scratch. Even if I never host the dynamic languages I'd like to, it ought to be a lot less painful to control memory in more C++ apps.

(I've been working full time in C++ almost continuously the last 18 years, and certain parts of it have gotten pretty irritating to use again.)

We don't talk about what motivates our technology choices, and lots of mutually distinct reasons can co-exist and compete with one another. The most common motivation seems to be making money, and bolstering one's current choices that support that goal. Personally, I'm interested in control over whether my work is still useful and accessible, to me, in fifteen years. Value as a product doesn't have much to do with this.

(Every time I write something interesting here, I have to fend off job feelers from folks who seem interested in debriefing me. But I'm not interested in being debriefed. And I like my current job fairly well.)

Dynamic Language Runtime

I just want to re-implement Lisp, Smalltalk, Python, etc. in the same runtime so I can borrow libraries from all of them, and use a re-writing infrastructure to target different preferred delivery languages.

Sounds a bit like you are seeking a DLR.

freely available vs control

Kay Schluehr: Sounds a bit like you are seeking a DLR.

Except that I also want to keep doing what I already do, which is to completely control optimization of memory, threads, locks, etc. in Linux content delivery servers. I just want to use other languages as well, using a common runtime with the C++ code.

Decision-making parts of servers tend to be complex (async networking, continuations, and error recovery) but not resource intensive, and are perfectly suitable for much higher-level languages, since then control could be expressed without as many unnecessary low-level nuisances.

Someone has to make sure memory and threads do the right thing in user space, and professionally that person these days is me, since I'm usually the user-space runtime specialist who triages every weird thing. And it seems likely to stay that way, so I use tools I can instrument all the way down, letting me apply evidence-based reasoning and the scientific method to everything through direct inspection.

But the C++ runtime could make it even easier for me to do this, and some other ways of controlling memory would not only make C++ more efficient, but would also optimize dynamic language runtimes built on top, since they could rely upon these useful heap guarantees (Luke Gorrie is on the right track here).
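To give a flavor of the kind of heap guarantee I mean--and this is a generic textbook sketch, not my design--think of arena (region) allocation: allocation is a pointer bump, and a runtime built on top may assume that everything in a region dies together.

    #include <cstddef>
    #include <vector>

    // Generic arena sketch: bump-pointer allocation, wholesale release.
    class Arena {
        std::vector<char*> blocks_;
        char*  cur_;
        size_t left_;
        static const size_t kBlockSize = 64 * 1024;
    public:
        Arena() : cur_(0), left_(0) {}
        ~Arena() {                      // the guarantee: one region, one death
            for (size_t i = 0; i < blocks_.size(); ++i) delete[] blocks_[i];
        }
        void* alloc(size_t n) {
            n = (n + 7) & ~size_t(7);   // keep 8-byte alignment
            if (n > left_) {            // current block exhausted; grab another
                size_t sz = n > kBlockSize ? n : kBlockSize;
                blocks_.push_back(new char[sz]);
                cur_  = blocks_.back();
                left_ = sz;
            }
            void* p = cur_;
            cur_ += n; left_ -= n;
            return p;
        }
        // Deliberately no per-object free().
    };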

I just see an opportunity to line all my ducks up in a row, and get standard C++ performance in higher-level languages that mostly do the intensive things in C++ primitives, using a better standard library which mutates less, all in a smallish body of code written by one person that avoids corporate twister design.

(I say the phrase "corporate twister" to myself all the time to explain why code ends up the way it does. Derivation of the metaphor is pretty obvious.) [cf]

So you want a language that

So you want a language that provides you with low-level control for system programming when needed, and high-level abstractions when you don't care. Like BitC?

quid pro quo

I'm willing to trade conjecture and analysis (and similar conceptually synonymous things) on a one for one basis if I see some value, in a manner not unlike the way characters in noir stories trade information: temporary mutual cooperation. (Yes, it's easy to make cloak and dagger jokes at my expense.) When I first posted in this thread, I mentioned I wanted to discuss reasons to rewrite. But I'm not too narrow; analysis of layers is just as interesting.

Looking at DLR and BitC web pages is a bit vacuous, like shopping, which I don't enjoy. I'd be more interested in hearing your first-hand impressions of, say, the consequences of compiling BitC to ANSI C. I'd respond to that.

A longish response to your (very short) BitC question would be telling, but without my having found out much of anything. I write a lot of analysis at work, on the fly, just tossed off amid all the code I check in, usually in the form of wiki pages. I'm prolific, and I like doing it, but realistically I'm expensive, even if I do write hundreds of lines of both C++ and English prose every day.

On my own clock, lately, I use my expensive time to write code for my own purposes, and I don't have as much of that time as I'd like. Most people write about what they're doing as a kind of loss leader they can write off to advertising. But that's unlikely to be applicable for me, if ever, until sometime after I put code in other people's hands to play with.

I don't think you can pay for answers to questions like, "What do you think of X?" for various technologies. Answering to entice you to say more things I want to hear has to be more valuable to me than writing more code. (Some people answer to be liked, or more generally to enhance their reputations, but I'm too stuck in a work grind to benefit from being liked. Sorry if it makes me an asshole; maybe I'll be less of one if circumstances change.)

All the above is offered freely. A fair exchange for your link to BitC is my next question: What do you think of BitC's choice of targeting ANSI C as the bottom layer?

All the above is offered

All the above is offered freely. A fair exchange for your link to BitC is my next question: What do you think of BitC's choice of targeting ANSI C as the bottom layer?

It's the only choice they felt was reasonable at the time; they require manual control over execution and portability, so their only choices are a compiler back end/framework, or compilation to C.

Their ultimate goal is to build a formally verified compiler and runtime, and when I suggested LLVM as a backend, they passed because the LLVM effort was not aiming for a verifiable implementation (LLVM can compile down to C as well). They were fairly impressed with C--, but it was still too immature to use at the time.

They were also aware of MLRISC, but vetoed that for some reason that is probably documented on the mailing lists.

So the conclusion: C was the only fallback, and it makes for a quick bootstrap. They're using BitC to build the verified Coyotos operating system (whose predecessors were all written in C), so they have a good application for their new language, and you can expect a good language and runtime in a year or two by my estimate.

BitC has some interesting features, like intermingled theorem proving and programming, and compiler profiles to enable or disable manual/automatic memory management, so it will be interesting to see how well they achieve their goals of eventually supplanting C for systems programming.

reasonable choices, ultimate goals

Thanks, that's engaging, even if I didn't get a sense of the consequences you thought might follow from compilation to C. (During the 80's in school, I helped port a cfront version to my school's computers; I'd been interested in C++ since I bought Stroustrup's first book on it, so I was able to help with rough edges in the C code bootstrapping cfront. Reading the source was interesting.)

To leverage your terms (as a framework), I can focus on "reasonable choices" and "ultimate goals" to clarify my remarks. Every tech choice acquires some good and bad effects and loses others. (No matter how good a choice seems, it minimally narrows, and therefore loses, choices.) Whether effects seem to matter in context depends on end goals, and end goals are at least partially driven by an audience (or market) who will reward good choices and punish poor ones, according to what is perceived.

In other words, sociological effects are at least as important as technical ones in the success of tech. (I never say this at work. All my writing is very dry and technical and aimed at exactly what folks at work need to hear. No one in professional contexts cares to be lectured about abstractions.) What's most important is what an audience or market thinks of tech choices, because one usually has equally valid tech choices that vary in acceptability; by choosing well, one can avoid wasting time on audience rejection.

Professionally, I'd avoid compilation to C because replacing C++ with C would never be seriously considered in any C++ shop I've seen. That's not my war. There's no possibility I can offer a non-C++ solution to folks already using C++ that has any chance of acceptance (on my one-person budget). So I can't sell it.

Architecturally, I'd avoid compilation to C because I'm afraid of other C code linked against what I generate. I tend to use C++ for effects aiming to prevent low-level errors that are easier to make in C. So hardening would be more difficult.

Personally, I'd avoid compilation to C in my development stack because I want to end up with incremental compilation in one process without jumping through complex hoops. So it wouldn't be fun, and fun is big. (Please excuse my excessive use of parallel forms; it's such a habit, and it's terribly easy and clear.)

I really appreciate the safety goals of the BitC folks, with respect to security and theorem proving, but those aren't my goals. When a technology aims for something, it generates a certain amount of bureaucracy in service of those goals, and this bureaucracy interferes with code aiming for other things, if only passively. (I wish a clever acronym resulted from All Things Serve the Bureaucracy, the better to mock the syndromes involved.)

I'm aiming for something closer to the space of possibilities I usually need to reach easily, so I can modify a base system to fit a particular context. But I want the base system to already be a reasonable approximation without tuning. However, ultimately I consider the code disposable in the sense I expect it to be rewritten over and over by myself and others.

(Right here I had a metaphor I meant to use to describe a house, using terminology from a fairy tale. But it occurs to me the name of the occupant is colorful enough to use for a while, and would make an excellent domain name. And I see that name isn't registered yet, so I can't very easily say it without causing the domain to disappear. It combines two words, and it's not nonsense.)

I regret writing so much to no clear purpose; I'm unsure I have more for this thread. But thanks once again for the BitC info.

[...] I'd avoid compilation

[...] I'd avoid compilation to C because [...]

Compilation to C is simply easier because C is more widely available, the support is more standardized across platforms and architectures, the performance and overhead trade-offs are well known (particularly for kernel code), and the core language and runtime are just simpler; the C++ constructs which you may find attractive for application-level development are actually quite useless if you're just using the language as a backend. C also has much more code available to link against than C++ (or at least it links more easily), though SWIG mitigates that problem somewhat nowadays.

On the downside, neither C nor C++ provide tail calls and other optimizations needed in functional languages, so they must be "simulated" in some way.
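The usual simulation is a trampoline: each function returns its "tail call" as data, and a driver loop makes the calls, so the C stack never grows. A minimal sketch of the general trick (nothing to do with BitC's actual generated code):

    #include <iostream>

    // Trampolined factorial: the "tail call" is returned as a struct,
    // and the driver loop bounces until there is no next step.
    struct Step {
        Step (*next)(long long, long long);      // null when finished
        long long n, acc;
    };

    Step fact_step(long long n, long long acc) {
        if (n <= 1) { Step done = { 0, 0, acc }; return done; }
        Step again = { fact_step, n - 1, acc * n };   // no recursion here
        return again;
    }

    long long trampoline(Step s) {
        while (s.next) s = s.next(s.n, s.acc);   // constant stack space
        return s.acc;
    }

    int main() {
        std::cout << trampoline(fact_step(20, 1)) << "\n";   // prints 20!
    }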

Down the road, I'd say BitC or something like it is what you'll want for doing your low-level systems programming. Theorem proving is less compelling to some developers, but BitC's support for strongly, statically typed low-level primitives, and high-level features like type classes, make it an interesting competitor to C/C++. If they get the performance right, all those lispy brackets will be the only reason not to use it, but the BitC devs have expressed interest in other surface syntaxes.

I hope that gives you a more explicit overview of the tradeoffs in using C. :-)

you're mostly right

I'll stipulate those C benefits and many more besides. I don't mean to knock the idea of compiling to C generally; I only meant to note problematic aspects for my own plans. I've no persuasion agenda here.

I didn't say which C++ constructs I find attractive. :-) I don't use all of them, and often replace standard bits with work-alikes I can show work better or faster in some context. I'm a big proof-by-demonstration guy. I'd much rather show the results of a test than argue principles.

The C++ parts are not just for backend use. As I noted earlier, I'll still write C++ in addition to the other languages I map to the C++ backend. Over the last dozen years or so, a lot of my "hobby" C++ coding has later yielded small pieces directly solving a problem at work, when cherry-picked from the context I'd originally intended. In fact, most of my work optimizations derive from my older research, when it's not a new problem. (I love new problems; operations research puzzles are a lot of fun.) Most problems aren't new, though.

I've stopped expecting I'll someday stop cherry picking my research for use at work. So I actually code expecting it'll get ripped apart for new purposes. This is part of the pragmatic focus. If I don't think code has immediate application, even in pieces, I don't write it. I want it all to be useful even if I stop at an arbitrary spot.

If the Lisp, Smalltalk, and Python (etc.) parts ever get used at work, it will only be after some crushing proof-by-demonstration wins a point. But obviously, anything in those languages running on top of the C++ backend can be turned into C++ with little effort -- that might be the price of acceptance. (Actually, now that I think about it, there are some things at work that might use some scripting, so I might do another language to cover that too.)

Stackless infrastructure for tail calls and full continuations is something I'll add in C++ for the languages using them. (All of them, actually; once I have full continuations, I won't do without them.) I just wouldn't care to write much C++ code by hand using raw mechanisms. :-) One version of the code base (I expect a lot of forking) might migrate the entire system over to a stackless version of C++, but that would be based on code generation. I'm not crazy enough to write that all by hand. It'd be far too tedious to add backtrace and debugging support without automation.
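For anyone unsure what "stackless" buys you, a minimal sketch of the idea (emphatically not my code, and drastically simplified): keep an activation's state in an explicit heap frame rather than on the machine stack, so it can be suspended and resumed anywhere.

    #include <iostream>

    // An explicit "activation record": because all state lives in the
    // frame, the computation can yield and resume at will -- the property
    // that continuations and generated stackless code depend on.
    struct SumFrame {
        long long n, i, acc;
        bool done;
        SumFrame(long long lim) : n(lim), i(1), acc(0), done(false) {}

        void resume(int steps) {         // run a bounded slice, then yield
            while (steps-- > 0 && i <= n) acc += i++;
            done = (i > n);
        }
    };

    int main() {
        SumFrame f(1000000);
        while (!f.done) f.resume(1000);  // interleave other work between slices
        std::cout << f.acc << "\n";      // 500000500000
    }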

(It might seem I'm describing my stuff, when I said I wouldn't. But I'm not actually saying much -- nothing anyone would patent.)