Perlis Languages

I was wondering what people would consider Perlis languages, i.e. languages worth knowing because they should change how you think about programming. Obviously there is a lot of overlap between languages so what I'm looking for is a minimal set of established languages (by that I mean 10 years or older) that still provides a reasonably complete overview of different approaches to programming. My first try at a list would look something like:

Ada, C, Haskell, Java, Lisp, Smalltalk, Perl

What about TeX?

Its approach to macros (more or less the only control structure) is essentially meta-programming.

I didn't appreciate its effect on my macro comprehension until I explained to other people a set of C macros I wrote.

It also helps to blur the distinction between an application and a programming language, but this is true for all end-user programming languages.

Macros

I learned TeX, well, really LaTeX, after having done a lot of Lisp programming, so I didn't have the impression that TeX changed the way I thought about programming.

I wasn't really asking about interesting languages, as then my list would need to be a lot longer. I rather like Eiffel, for example, and always use Python rather than Perl, but didn't feel they contributed much that wasn't already to be found in the other languages.

TeX seconded, PS

It's well worth getting to know the gory innards of TeX's digestive system because of how many rules of programming language design it breaks. You will emerge a sadder but wiser programmer, more tolerant of the failings of other languages.

Learning PostScript, and then the algorithm for transforming PostScript to PDF, was an influence on me.

how many rules of programming language design TeX breaks

Just wondering...

is there a rant anywhere that talks about this? I would love to read it. I've been thinking about ranting about this myself, but if somebody has beaten me to it, then there are better things to focus energy on.

Syntax-semantics separation

TeX has a notion of catcode, which tells TeX how to handle the next item of input; catcodes can be changed at any time and may arbitrarily determine the meaning of the remaining code. This is widely used in TeX programming, and it makes static analysis of TeX difficult. Modern PLs no longer extend their syntax this way; it's just too hairy. Even m4 doesn't let you do this.

I mentioned this in What Are The Resolved Debates in General Purpose Language Design?

But is this a generally

But is this a generally known design smell? I know from my OO background that such contextual design is bad for a multitude of reasons. Although I never considered the static-analysis angle thoroughly, it makes perfect sense and aligns with the basic program- and system-building advice that books like Composite/Structured Design and Reliable Software Through Composite Design by Myers, and Program Design by Jackson, advocated in the '70s: passing context to a function turns a program into a tree of control objects, rather than collaborative peers with full independence.
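
To make that smell concrete, here is a tiny C sketch; the report functions are hypothetical, invented for illustration rather than taken from any of those books. Passing a control flag makes the callee's behavior depend on the caller's context, where two independent functions would be self-describing peers.

    /* Control coupling, sketched: the mode flag turns report() into a
     * control object whose meaning depends on the caller's context.
     * All names here are hypothetical, invented for illustration. */
    #include <stdio.h>

    enum mode { TERSE, VERBOSE };

    static void report(double temp_c, enum mode m) {
        if (m == VERBOSE)
            printf("temperature: %.1f degrees Celsius\n", temp_c);
        else
            printf("%.1f\n", temp_c);
    }

    /* The decoupled alternative: two independent peers, each with one
     * fixed, self-describing meaning. */
    static void report_terse(double temp_c)   { printf("%.1f\n", temp_c); }
    static void report_verbose(double temp_c) {
        printf("temperature: %.1f degrees Celsius\n", temp_c);
    }

    int main(void) {
        report(21.5, VERBOSE);  /* caller steers the callee's internals */
        report_terse(21.5);     /* callers state their intent directly */
        report_verbose(21.5);
        return 0;
    }

In structured-design terms this is control coupling: the flagged version forces every caller to know about the callee's internal cases.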

It seems even in the thread you point to that you acknowledge this isn't that well known.

I've been kind of thinking about this a lot lately, especially as I finally have had some free time to fairly assess VPRI's latest OMeta work.

Basically, I am hoping for a POPL paper or something that discusses a "don't pull the pin on this grenade" principle.

The syntax-semantics grenade

I don't know whether it is explicitly described as a design criterion anywhere authoritative, but my impression of the evolution of PLs suggests that they have got a lot better at maintaining program comprehensibility in how they handle the issue of syntax-semantics permeability.

I would love to hear of relevant literature.

I didn't appreciate its

I didn't appreciate its effect on my macro comprehension until I explained to other people a set of C macros I wrote.

Experience reports like this befuddle me, in part because I recognize how difficult it is to explain programming experiences. It's like when you awake from a dream and rush to tell people what happened, only to realize that because it was a dream you didn't actually have to pay attention to any of the details (the stuff that makes the dream cool to others), so you ultimately make up details.

Could you elaborate on some of the details here? What did the C macros do? How did you explain it? How did TeX help you explain it?

To me, if I can't answer questions like these, then I didn't learn anything from learning the language.

Assertion Macros

It was a set of macros for assertion handling. Basically, the requirement was that if assertions are turned off, they will all be compiled away, leaving no trace of the assertion system in the code. But we also wanted to be able to turn them on and off up to a certain level.

So you would have several types of assertions and several levels for each type, and you can turn off assertions of type X from level Y upwards.

Issues people found hard to comprehend were:

  • Making new macros through macro expansion, using the C concatenation operator (##). (In TeX you'll have \csname ... \endcsname)

  • Implementing control structures by expansion into nothing. (Since the C preprocessor only allows bounded expansion of macros, you cannot have arbitrary loops there).

TeX didn't help me explain these issues per se, but when I saw other people not getting what happens there, I realised that it was easier for me to think in those terms, since low-level TeX programming forces you to program this way.
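
To make those two tricks concrete, here is a minimal C sketch in the same spirit; the CAT, XCAT, and ASSERT_PRE names are hypothetical, not the original system. The public macro pastes its level argument into a macro name via ##, and levels that are switched off expand into nothing:

    /* Minimal sketch of level-filtered assertions; names are hypothetical. */
    #include <stdio.h>
    #include <stdlib.h>

    /* Two-step concatenation so arguments are macro-expanded before pasting. */
    #define CAT(a, b)  a##b
    #define XCAT(a, b) CAT(a, b)

    /* One macro per level of the PRE type; disabled levels expand away. */
    #define ASSERT_PRE_1(cond) \
        do { if (!(cond)) { fprintf(stderr, "PRE/1 failed: %s\n", #cond); \
                            abort(); } } while (0)
    #define ASSERT_PRE_2(cond) ASSERT_PRE_1(cond)
    #define ASSERT_PRE_3(cond) ((void)0)  /* off: no trace in the compiled code */

    /* ## builds the macro *name* ASSERT_PRE_<level>, which then expands. */
    #define ASSERT_PRE(level, cond) XCAT(ASSERT_PRE_, level)(cond)

    int main(void) {
        int x = 5;
        ASSERT_PRE(1, x > 0);   /* active: expands into a real check */
        ASSERT_PRE(3, x > 10);  /* compiled away entirely */
        return 0;
    }

The ((void)0) form keeps a disabled assertion usable as an ordinary statement followed by a semicolon, so the call sites look identical whether the level is on or off.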

Concurrency or logic?

None of the languages on your list encourage constructing programs as systems of concurrent processes. Ada, Haskell, and to a certain extent Java, do have good support for concurrency, but it's not the fundamental paradigm around which those languages were designed. So you might consider adding something like occam, Pict, or some other language based around process-oriented programming to your list. Erlang is another possibility, if you're looking for something a bit more mainstream (although my experience is that Erlang programs tend to use slightly less fine-grained concurrency than occam programs). Another option would be something like Esterel, which has more of an embedded-systems bent to it.

You also seem to be missing languages from the logic programming side of things. Prolog or Mercury seem likely candidates here.

Logic Programming Languages

I hesitated over adding Prolog, as I have used it. But when I think back I remember finding it interesting and in some cases very useful, but it really didn't affect my programming in other languages, as the paradigm was so different. However, it is possible I didn't reach a sufficient level of skill with Prolog for it to have any effect on me, so I would be really curious if someone else did.

Java didn't change how I thought about programming either, but I had done quite a bit of programming in Eiffel and C++. I surmise that had I not known Eiffel and C++, Java would have changed the way I thought about programming.

I'll look further into the process-oriented programming model.

Declarative programming

I remember one particular insight with Prolog which probably had a big influence on me: the necessity of declarative reasoning about the program.

Among the logic programming languages, Prolog straddles the fence between declarative and procedural programming. If you came from another procedural language, it's easy to see a Prolog predicate as a procedure that calls its subgoals and backtracks if necessary. It's possible for simple programs. But very soon you cross the threshold where your intelligence (or mine, at any rate) is not up to the task.

From that point on the only way to understand what's going on is to perform a mental switch from "this predicate does this..." to "this predicate is true if...". I still remember the shock.

The trouble with Prolog from that point on, of course, is that sometimes you can't reason declaratively...

Imperative Prolog

If you want to think of Prolog programs in imperative terms, one of the best ways is to understand their representation in terms of the Warren Abstract Machine, which is a hand-optimised bytecode and storage architecture with special primitives to deal with the syntax of Prolog terms and unification of logic variables. The original 1983 paper has good examples, and I'm pleased to discover that SRI have made the PDF available.

David H. D. Warren, 1983. An Abstract Prolog Instruction Set. SRI Technical Note 309.

Warren's Abstract Machine

There is also a book on my shelf somewhere about how to implement a WAM: Warren's Abstract Machine: A Tutorial Reconstruction. It is a very light, easy read. I think most freshman college students who have finished a course in data structures could easily grasp it.

Separately, I think everyone who has programmed in Prolog thinks of it in imperative terms. S&S even talk about this in their book, somewhere around page ~225 IIRC. Also, Dragon book author Jeffrey Ullman, in his other book Principles of Database and Knowledge-Base Systems, Vol. 1, talks about the Prolog variant Datalog as something you would likely reason about imperatively.

Also, if you use Perl, and Perl has influenced your programming but you feel "what you learned in Prolog" is just too different to make use of in other languages... well... guess what... you are out of excuses buddy. Check CPAN for AI::Prolog! There is also SWI-Prolog for Java. And Yield Prolog, which can target Python and C#, but looks butt ugly.

The link to the tutorial is

The link to the tutorial is broken.

Fixed

thanks

A few more

As well as the ones you've already listed, I'd add:

  • Python, for reintroducing me to simplicity and expressiveness in code after years spent writing Enterprise Java.
  • C++, for template metaprogramming.
  • Assembly language, because it requires a very different mindset to higher level programming.
  • Aldor, where I first encountered dependent types (and a lot of mathematics that was new to me too, thanks to the libraries!).
  • Oz, for showing an easier approach to concurrency (from what I've seen of it so far).

languages I left off the list

Some were on purpose and some were simply out of ignorance (although knowing more I might still leave them off)

  • Python: I really like Python and prefer it to Perl, but if I had to pick one in terms of how much it affected the way I think about programming it would be Perl. But I wouldn't find it unreasonable to substitute Python for Perl.

  • C++: it did change the way I think about programming but only because it was my first statically typed OO language so Java seemed like a reasonable replacement. Templates always seemed a poor cousin compared to Ada's support for generics.
  • Assembly Language: not sure where I stand on this one, it is so low level it is almost apples and oranges.
  • Aldor: don't know enough
  • Oz: to hear some people tell it all you need to know is Oz and you have everything covered

I also left off Eiffel. The design by contract approach was quite influential on me, but I picked that up just reading Bertrand Meyer's early book and didn't get around to writing code in Eiffel until years later.

Assembler

Assembly Language: not sure where I stand on this one, it is so low level it is almost apples and oranges.

I second this. The supported macro language is important.

If your type theory is up to scratch and you are interested in verifiably correct code, then I recommend Greg Morrisett's Typed Assembly Language. I'm surprised this doesn't seem to have had an LtU story.

VLIW assembly

Assembly Language: not sure where I stand on this one, it is so low level it is almost apples and oranges.

I've been studying TI's C64x+ ASM for use on the BeagleBoard recently, and it blew my mind. This arch is a clustered VLIW machine with delay slots on most of its data-processing instructions, in addition to its branches. Pipeline hazard management is the responsibility of the programmer: if you fail to avoid a hazard, then rather than a stalled pipeline, you get undefined results. The arch also includes special instructions for building software-pipelined loops.

You'll never think about instruction-level parallelism the same way again.

Typed Assembly Language

Just looking around, I didn't find anything more recent than 2000, although there is plenty to dig through. At the time they seemed to be pitching it as more suitable for multiple languages than Java bytecode. I'm guessing Java bytecode came out the winner. I'm not sure if bytecode would qualify as a "Perlis language", but I kind of doubt it.

There's more

Following up...

I've generally found Perl & Python to encourage quite different styles of programming so I wouldn't lump them in together, but each to their own I guess.

Regarding Aldor, it was really the concept of dependent typing that I found influential, rather than the language itself. Aldor's documentation has a page about its type system which does a fairly good job of explaining the concepts. There are probably better known, or more widely used, languages with dependent types (Epigram, possibly?); Aldor is just the one I happen to have used.

I was in the same position as you with Eiffel's design by contract - except that I never did get around to writing any Eiffel code afterwards. The concept was pretty appealing though, enough so that I started to apply it in my Java code.

Perl and Python

I learned Perl first and then switched to Python several years later. I remember being very impressed with Python, but I never got the kind of eureka experience I had with other languages, and I felt I had already picked up the really novel aspects in Perl or somewhere else. One example is generators in Python. I realized I had done similar things in other languages (specifically Ada with tasks and Java/C++ with iterators), but generators were quite simple and clean in comparison.
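
For contrast, here is a minimal C sketch (with hypothetical names) of the hand-rolled iterator pattern that a generator subsumes: the loop's state has to be reified into a struct and threaded through every call, which is exactly the bookkeeping a generator keeps in its suspended frame.

    /* Hand-rolled iterator sketch; hypothetical names, not from any
     * particular library. A Python generator hides all of this state. */
    #include <stdio.h>

    typedef struct {
        int next;   /* next value to yield */
        int limit;  /* one past the last value */
    } range_iter;

    static range_iter range(int lo, int hi) {
        range_iter it = { lo, hi };
        return it;
    }

    /* Returns 1 and stores a value while the sequence lasts, else 0. */
    static int range_next(range_iter *it, int *out) {
        if (it->next >= it->limit) return 0;
        *out = it->next++;
        return 1;
    }

    int main(void) {
        range_iter it = range(0, 5);
        int v;
        while (range_next(&it, &v))
            printf("%d\n", v);   /* prints 0..4, like Python's range(0, 5) */
        return 0;
    }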

I do agree that Perl and Python encourage different styles. I guess the category I put them in was dynamically typed scripting languages although both have evolved a lot since they were "just" scripting languages.

Oz: to hear some people tell

Oz: to hear some people tell it all you need to know is Oz and you have everything covered

What I found impressive about Mozart/Oz was something quite simple - you can sort-of nudge source-code step-by-step out of one "style" into another.

Usually when you move from one language to another there's a high barrier of somewhat incidental differences that takes your attention away from the "style" differences - in Mozart/Oz those incidentals aren't there to get in the way, so you get much more of a feel for the "style" differences.

Careful Design

Isaac's point, I think, validates the approach to evolving the kernel language taken in CTM, which is to only add a feature when that feature cannot be modeled by the existing kernel language without making non-local changes to code. That care in design shows up exactly as Isaac describes, in my experience.

/bin/sh

Another language from the "Thing of Pain and Beauty" class. More than any other, it is the language that taught me to think hard about whether I really understood what I had just written.

Postscript — Olin Shivers (1994), A Scheme Shell:

Shell programming terrifies me. There is something about writing a simple shell script that is just much, much more unpleasant than writing a simple C program, or a simple COMMON LISP program, or a simple Mips assembler program. Is it trying to remember what the rules are for all the different quotes? Is it having to look up the multi-phased interaction between filename expansion, shell variables, quotation, backslashes and alias expansion? Maybe it’s having to subsequently look up which of the twenty or thirty flags I need for my grep, sed, and awk invocations. Maybe it just gets on my nerves that I have to run two complete programs simply to count the number of files in a directory (ls | wc -l), which seems like several orders of magnitude more cycles than was really needed.

Whatever it is, it’s an object lesson in angst.

Coq

Haskell/C ~ Coq/Haskell ;-)

Let's not forget Basic...

...or whatever language drove the transition from non-programmer to programmer in your particular case. If that doesn't change "how you think about programming", nothing ever will.

Profound

For me, I think the first programming language I ever used was a DSL called Macro Shop that allowed you to create ASCII art. From there, I wondered how such a tool was ever built and learned Visual Basic, since that is what I heard the author used (When I asked, somebody mentioned a good clue was the use of a certain VB3 filetype, I can't remember, the predecessor to *.OCX).

Context

I think that what makes a new PL change the way you program is that it comes with both new concepts to use and new problems that those concepts can be put to use in solving. I used the CP/M ZCPR shell before I used UNIX sh, but I wasn't admin for that machine, and so I wasn't motivated to really learn the limits of the language.

I don't think that any of the languages that changed the way I programmed are languages I would hold up as examples of languages that are both well designed and that I would use today. The best example I can give is Olin Shivers's scsh, the finest demonstration I know of how to present UNIX services in a manner compatible with a higher-order, lexically-scoped programming style. Maybe Perl 6, if it ever gets finalised, will make me use parsing expression grammars in day-to-day shell programming — that would count.

First Language was BASIC

but in my opinion I easily could have skipped it. Clearly it did change the way I thought about programming, but I was trying to think of a small set of languages such that if you don't know one of them, or one very similar, that is probably something you should look into correcting.

Just thinking about a language with line numbers brings back unpleasant memories.

Low-level stuff

I hacked a lot of Amiga assembler as a teenager and I'm reliving this a bit with Forth development lately. Interacting with genuine pieces of hardware - a CPU and a bunch of chips and peripherals - is worth doing. Quite advanced ideas (like continuations, DSLs, self-modifying systems, etc.) are very understandable with the CPU (or Forth VM) as an operational model.

I would like to learn a hardware description language next.

APL?

APL?

The Quotations Page

Lambda the Ultimate Quotations

APL is a mistake, carried through to perfection. It is the language of the future for the programming techniques of the past: it creates a new generation of coding bums.
Edsger Dijkstra, How do we tell truths that might hurt?

Interesting Opinion

I looked at APL around 1995 and it did catch my interest in the way Lisp had five years earlier, but in the end it mostly seemed to be old stuff in a new package (new to me, that is). I think there are some good insights in APL that were not recognized for a long time, but I was under the impression that functional languages had pretty much picked up most of what was worth picking up. And if functional languages didn't pick it up, then "scripting" languages did.

Admittedly my knowledge of APL is superficial, based on reading a book and several articles and playing around briefly with an APL environment. I would be more than happy for someone with actual experience to disagree with my impression.

The main thing that APL actually did

First, it was one of the first really interactive programming environments (note I said "one of the first", not "the first" - I think that was JOSS). As such, many insights about interactive use were gained from it.

Second, it was one of the few experiments in using a new orthogonal textual operator set to describe programming operations. And, it used those textual operators in a unique way (i.e., the overstrike concept).

Finally, it was one of the first languages to take a primitive datatype to its "natural conclusion". It was the minimalist language of its day. In fact the use of mapping as a primitive operation makes much of its code look almost functional.

Whether one counts these items as useful or not, I believe that APL is still a language worth studying, because at least the second feature hasn't been replicated (where it still might make programs more understandable).

The Design Docs Page

Parrying the quote by Dijkstra: the actual design intent behind APL is also linked here, on another page, The Lambda the Ultimate Design Docs. See Iverson's A Personal View of APL.

The neat thing about APL is that it didn't require you to know how matrix multiplication works in order to do matrix multiplication. That is pretty powerful, when you consider how many people don't know this. Today, of course, most object-oriented kernel languages provide such operator overloading facilities.

UML

I am gonna get murdered here for saying this...

but easily the most profound languages I've learned are Lisp, Tutorial D, MetaOCaml, and UML.

Lisp: symbolic list processing
Tutorial D: it implements the Third Manifesto's ideals, enough said
MetaOCaml: I think of it as a multi-stage object-functional assembly language

UML: I think this is the highest-level language of them all (especially with the right MDA profile). Once you learn to read pictures and understand how pictures can be orthogonal, you just enter another realm of thinking about your programs -- like Tutorial D, it emphasizes knowing the difference between a representation and a Platonic ideal.

Lisp took me 3 years to understand. UML took me 7.

OT: Constraint diagrams

Once you learn to read pictures and understand how pictures can be orthogonal...

As you start by saying you expect to get murdered, perhaps you'd agree that UML can encourage some very bad habits and fuzzy thinking :-) I find it useful in informal discussions, but in my experience people inevitably begin to ascribe too much meaning to a series of relatively vague diagrams. You need pictures with unambiguous meaning before you can rely on what you read into them, and if you want them to express entire specifications they must have richer semantics than UML - OCL. As a formal, purely diagrammatic system designed to model operational constraints and static specs, I think constraint diagrams deserve more attention, even if they're not ready for widespread use.

Back on topic, I vote Coq/Agda for an approach that certainly changed the way I think about programming, or rather proofs that write programs that write proofs :-)

As you start by saying you

As you start by saying you expect to get murdered, perhaps you'd agree that UML can encourage some very bad habits and fuzzy thinking :-)

That is exactly why it took so long to learn. Learning UML's syntax takes a single college course in "Business Systems Analysis & Design" or "Classical & Object-Oriented Software Engineering Methods" (or whatever it is called elsewhere). UML has a lot of cruft that can encourage poorly designed systems. In college, we were actually only taught a very small subset and as my Rational IDE could print out that portion from the code, I never did up-front design. But I had really pretty printed diagrams.

However, everything in UML, if used correctly, should lead to a very orthogonal design that helps separate out various physical implementation details. Now it could take me another 7 years to learn how to explain this effectively...

I regularly use UML

but almost exclusively as part of communicating with non-programmers, or informally just sketching something with programmers. I've always used it, but I learned all my OO and structured techniques way before UML existed, so I don't actually use it up front when I'm designing.

I could see where it might have the same kind of effect as a programming language does, depending on when you are introduced to UML.

APL, ALGOL67 & Alan Perlis - (& Objective C)

From a historical perspective, Alan Perlis used APL to teach the introductory computer science course, CPSC 221, at Yale around 1976-1984, covering much the same material as Structure and Interpretation of Computer Programs. APL could be used in an interactive or compiled fashion, allowing one to build idioms and programs in a stepwise fashion. By the end of the course, one could write a full parser in APL. It had strong capabilities for mathematics, including multidimensional matrices. It also had a font set, which included Latin and Greek letters, and more symbols than you can shake a stick at. An APL program could rival any Perl program for its opaqueness and terseness.

I mention ALGOL-67, since Perlis was involved, along with Dijkstra, in developing one of the first modern structured computer languages, which eventually led to the C family of languages.

Today, I would consider Objective-C a revolutionary language, since Apple has been developing it into a powerful systems programming language, with recent additions including blocks to allow for management of concurrency. Along with the OpenCL spec, with its C-like language, it provides extensive tools for concurrent programming on multi-core systems.

Algol vs C

I mention ALGOL-67, since Perlis was involved, along with Dijkstra, in developing one of the first modern structured computer languages, which eventually led to the C family of languages.

You make it sound as if C was the evolutionary conclusion of the Algol development. That is very misleading - the Algol inventors would probably wince in pain if they heard this. C has its roots mostly in BCPL, and was meant to be a relatively low-level language. Algol had an almost opposite philosophy; it was the high-level language of its day. It is true that C took some ideas from Algol, but almost everybody did.

Also, I don't think there ever existed an Algol 67. The only official languages were Algol 60 and Algol 68. Neither Dijkstra nor Perlis had much part in Algol 68 (in fact, Dijkstra hated it), but that's the version that influenced C, according to Ritchie.

Language behind Language

Probably confusing it with Simula 67, an object-oriented extension of Algol 60.

Speaking of Algol 68, I picked up a couple of discounted books on the language a while back. Although I've been meaning to play with the language (Algol 68 Genie looks interesting), the main thing that struck me is that the explanation of Algol 68 uses a lot of vocabulary and terminology that is different from most other languages (e.g., mode, mode-indicant, denotation, collateral-clause, formulas, generators, stowed, etc.). Coming from the outside, it can be hard to tell when they are using different words for the same idea, or when the idea itself is different - subtly or innovatively.

Anyhow, register me as one that thinks that some languages are interesting simply because they are described in a very different fashion - the language behind the language.

Is Objective-C revolutionary?

my impression was that it was pretty much Smalltalk's vision of OO, integrated in a reasonably clean way with C, rather than the conceptually much different approach taken by C++. I didn't include C++ because I figured most innovations from C++ were done as well or better in Java. Similarly, I figured if you know C and Smalltalk there isn't much that is going to surprise you as new in Objective-C.

Hmm... I can't speak about

Hmm... I can't speak about Objective-C, but even without the will to really rant, I'd share:

I've never considered C++ to be a "revolutionary" OO language, of course. I don't think even Stroustrup himself ever thought of it (much less claimed it) that way: a way of extending C to "catch up on OO", in the footsteps of Simula 67 and, later on, Smalltalk, which between them had introduced, 20 years earlier in the case of Simula, probably more than two thirds of the OOPL concepts still in use today.

I believe that, given the set of constraints that C++ was willing to embrace and resolve, it was doomed from the start to end up with awfully complex semantics. Then, having to undergo quite heavy standardization processes by committees at ANSI and ISO rather quickly after its first inception and first implementations by AT&T didn't help either, I suppose.

Anyway. I had to use C++ every day from the beginning to the middle of my career (starting in the mid '90s). I never enjoyed coding in C++, while everybody else would require me to do so (because "it's the most powerful, they say, don't you know that?" -- "Mmm... 'k. Whatever. If you insist...").

And by that, I do mean: never enjoyed it at all, or only very exceptionally/circumstantially. As "sad" as it may sound to those who don't like C# either, for instance, I can say that I'm much happier having to cope with Anders's C-offspring than I was with Bjarne's!

Note: as I see it, .NET's managed C++/CLI is only rather remotely related to the "barefoot" C++ as we know it from the late '80s; I'm still fine doing some managed C++ in .NET when I'm required to (most of the time to bridge some interop code uninteresting to port to C#, but that's less and less frequent anyway).

Eventually, after some recollecting and reflecting, I figured that the problem I had with C++ for "so long" wasn't so much in the language's syntax/semantics design specifics themselves as in the way too big "impedance mismatch" between the language's overall complexity and the actual productivity I could hope to achieve with it through its implementations (Borland, MS, Watcom, etc.).

Unsurprisingly, once I no longer had to use it at all, I probably lost my C++ skills way faster than if I had enjoyed writing code in it more in the past.

Oh well, just my experience anyway. (YMMV)

[BTW: I'm really talking about C++ as I had to use it, with its OO features, and not just its legacy C subset; hence also having to deal with Borland's OWL or MS's MFC, then MS's ATL, etc. Goodness, glad I could move on to something else eventually... One of those things I'm absolutely fine feeling no nostalgia for.]

Perlis and APL

In the Fall of 1970, Alan Perlis taught APL at Carnegie-Mellon, in the S-601 course (second course in the Computer Science undergraduate sequence). There were a limited number of terminals available, so punched cards were used (I wish I were kidding). My impression when taking the course was that Prof Perlis had been using APL for a while; he was certainly enthusiastic about the language.

I later asked Prof Perlis why he'd switched to LISP (at Yale): he replied that it was much easier to organize team programming efforts in LISP than in APL (as I recall, this was at the First Hopkins Symposium on the Role of Language in Problem Solving, 1981).

Just one example I can give,

Just one example I can give, interesting if only in the way I evolved around it:

I think XSLT was, to me at least, one of the kind of languages you're alluding to... I say "was" because, indeed, it was back in the days when I'd regularly be watching/playing with everything XML-related.

Nowadays, XSLT is just yet another functional language, especially useful when my programs have to eat and tweak "non-trivial" trees that happen to have an XML representation (or where it's cheap enough to come up with either form -- serialized or not).

Much like XML itself, XSLT (1.0) is becoming (relatively) less and less interesting as a language (again, to me at least). Which doesn't mean I'll lose interest in efficient implementations of it, or of its successor, XSLT 2.0, any time soon. It's clearly a great computer language for its set of purposes, imho.

[Edit: goodness, this thread populates fast, btw.. :)]

I'll have to second that

I, too, was looking at XSLT way back when and still regard it as a really nice language. Perhaps it was the sudden shock of, having worked through the hideous XML syntax, finding an elegant little gem.

In any case, the changes in my thinking revolved around the idea of a tree-transformation language. I was never a big Lisp macro writer, although Common Lisp was my favorite programming language for quite a while. I suspect XSLT led to my easy and fanatical adoption of algebraic data types when I later ran across ML.

Yes, I think noticed that "a

Yes, I think I noticed that "a while ago" about D.

D's designers had better memory, say, "than average", then.

I guess some of us wish more languages had been in the same case, if only in that trend of languages. To be honest, I haven't dug enough into D so far -- lack of time, and/or energy, as usual.

But thanks for this reminder, btw: I just had a look again at DM's language features and I'm thinking it's definitely worth coming back to soon :)

[Edit: oops. sorry, confused the thread in replying... could someone move this to where it belongs?]