Google's Dart announced

A while back we learned about Google's thoughts on Javascript. Well, it seems Google's Dart Language is now live.

Dart is a new class-based programming language for creating structured web applications. Developed with the goals of simplicity, efficiency, and scalability, the Dart language combines powerful new language features with familiar language constructs into a clear, readable syntax.

The full specification (PDF)

A feature I find interesting is that you can add static types:

Dart programmers can optionally add static types to their code. Depending on programmer preference and stage of application development, the code can migrate from a simple, untyped experimental prototype to a complex, modular application with typing. Because types state programmer intent, less documentation is required to explain what is happening in the code, and type-checking tools can be used for debugging.

This will improve reliability and maintainability, I imagine, right?

tail calls?

No word in the spec about tail calls, it seems.

Probably not

There probably won't be support for tail calls. The language needs to compile to JavaScript, which doesn't support them.

JavaScript is TC

Meh. JavaScript is Turing complete, therefore it can support tail calls. Generate bytecode and interpret it if you must.

(I'm not saying this is a sane method of language translation, but lack of feature X in the target language does not mean the source language cannot have feature X.)
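To make that point concrete, here is a minimal sketch of the standard trampoline trick a compiler can use when the target language lacks tail calls. It's written in Python (itself a language without tail-call elimination); the function names are illustrative, not from Dart or any spec:

```python
# Trampoline sketch: a "tail call" is compiled to returning a thunk,
# and a small driver loop invokes thunks until a real value appears.
# The stack therefore stays at constant depth regardless of call depth.

def trampoline(f, *args):
    """Drive f to completion, unwrapping thunks returned in tail position."""
    result = f(*args)
    while callable(result):
        result = result()
    return result

def countdown(n):
    if n == 0:
        return "done"
    # The tail call `countdown(n - 1)` becomes a returned thunk.
    return lambda: countdown(n - 1)

# Far deeper than Python's default recursion limit (~1000), yet no overflow.
print(trampoline(countdown, 100_000))  # prints "done"
```

This is of course slower than native calls, which is part of why a compiler targeting JavaScript might still decline to do it.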

Tail calls in JavaScript

Tail calls have been (tentatively) voted into ES6.

Quick reactions

Quick reactions:

Good:

  • Erlang style process model. This will keep gc implementation complexity down.
  • Lexically-scoped, albeit with the stupid Java shadowing restriction. Javascript has really lowered the bar in terms of what will make me happy, though, so this is a model of sanity in comparison.

Bad:

  • No word on tail calls.
  • Null. In 2011.
  • Reified generics. This is tempting, but mandatory representation-passing gets ever more expensive as you add features like higher kinds.
  • Intentionally unsound type system, with covariant generics. On purpose, but still the wrong choice.
  • N-ary function types. Now you can't write a generic apply, joy.
  • Where are the tuples? I've given up hope on sum types, but there should at least be tuples.
  • Weird hackish void type.

Overall, it seems very derivative of C#.

Makes me wanna cry

This

The type system is unsound, due to the covariance of generic types. This is a deliberate choice (and undoubtedly controversial). Experience has shown that sound type rules for generics fly in the face of programmer intuition. It is easy for tools to provide a sound type analysis if they choose, which may be useful for tasks like refactoring.

makes me wanna cry. It's like saying that gravity flies into the face of astronauts. Let's violate basic laws of math simply because, allegedly, they are not "intuitive"? Also, the last sentence seems very naive -- if the type system is violently unsound then doing a useful type analysis after the fact is as hard as doing one for a language that isn't typed at all.
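To make the soundness complaint concrete, here is a small sketch in Python (not Dart; the class names are illustrative) of what covariant generics permit, with the bad write caught only by a runtime check of the kind a "checked mode" provides:

```python
# Why covariant generics are unsound for mutable containers: if a
# list-of-Cat may be used wherever a list-of-Animal is expected, then
# writing a Dog through the Animal-typed view corrupts the Cat list.
# A runtime-checked container can only report this when it happens.

class Animal: pass
class Cat(Animal): pass
class Dog(Animal): pass

class CheckedList:
    """A list that checks element types at runtime, like a checked mode."""
    def __init__(self, elem_type):
        self.elem_type = elem_type
        self.items = []

    def add(self, x):
        if not isinstance(x, self.elem_type):
            raise TypeError(f"expected {self.elem_type.__name__}, "
                            f"got {type(x).__name__}")
        self.items.append(x)

cats = CheckedList(Cat)
animals = cats        # the covariant step: List[Cat] used as List[Animal]
animals.add(Cat())    # fine
try:
    animals.add(Dog())  # accepted by a covariant type checker, fails here
except TypeError as e:
    print("caught at runtime:", e)
```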

Java's use-site variance

Java's use-site variance modifiers on generics have poisoned a large chunk of a generation of programmers on expressive static typing. The problem is that expressing type flow fully and explicitly is more difficult for most programmers than writing code that passes values around and deals with runtime type errors when and if they happen. The word chosen for this difference in difficulty is that the latter is more "intuitive" than the former - I don't think it's a particularly bad choice of word. The phenomenon is one of the biggest reasons dynamic languages have become a lot more popular over recent years, a rejection of complexity in specifying static types.

It's like there's another triangle tradeoff - expressive, sound, simple: choose any two for your type system. Almost everyone is unwilling to forgo expressiveness - the object graphs woven in modern software can be quite tangled indeed - while any language that hopes to have large-scale success cannot start out being anything but fairly simple. So they give up some measure of (statically-typed) soundness, and expect lots of runtime type errors during debugging and testing.

I think this is a common

I think this is a common trend in this "web 2.0 era": move complexity away from development time (improving readability and expressiveness) towards production time (making reliability, soundness and maintainability more costly).

It's all about time to market and cheap development costs, I imagine. After all users are becoming beta testers nowadays, aren't they?

Well, perhaps this equation

Well, perhaps this equation often wins; why are dynamic languages so successful? If you can make A cheaper by making B more costly, then you win if A is more important than B. I don't exactly see that here, but it makes sense why they wouldn't go with something like Scala (developers are intimidated by type tar pits?).

I find Dart to be boring, actually. I don't see it being very disruptive.

I've run a four nines service written in Java for several years

I've run a service with 99.99%+ availability for several years; a service with the codebase written in Java. Your estimation of where the errors come from does not mesh with my experience. While null references have caused a fair number of errors, I've never seen a problem occur due to unsound generics.

Same question

I'm sympathetic to the quoted concern, but I don't see how covariant typing helps. The problems I've had with type systems, and that I've seen other people having, are more about generics. For example, the use-site variance described elsewhere in the comments.

Powerful type systems should be unsound

This isn't directly related to Dart, and so is heading a little off topic, but IMO as we try to integrate more expressive types into programming languages, it will be important to let the user-facing type systems become "unsound". Consider that any time you leave a case off a pattern match, you're actually subverting the type system. Similarly, half of the things you do with phantom types in Haskell should be erasable from types; otherwise you're left doing theorem proving in what's a really poor environment for it.

On the other hand, I do think we want a sound reasoning system beneath the user-facing type system, and I won't bother checking to see if Dart has such a system.

I would disagree with that

I would disagree with that train of thought. I think it's necessary to have an "escape point" that allows you to purposefully break¹ the type system, such as Coq's "admit", OCaml's "Obj.magic" or Haskell's "unsafePerformIO" (and, dynamically, OCaml's "assert false" and Haskell's "error"). But that's a single escape point in an otherwise sound type system. This is enough to explain your example: an absent case in a pattern match actually means `| missing_case -> assert false`.

Taking liberties in the other parts of the system because "it's ok to be unsound" leads to an unsoundness in the *reasoning* that you can't contain and control anymore. I don't see how that would be a benefit.

I'm unsure what you mean by "phantom types erasable from types". I agree that hacking too much in the type system is not a good idea; to me, the solutions are to embed an external prover that keeps the whole system sound (by providing in-system certificates), or to rely on dynamic assumptions that are then injected² into the type system: you are discharged of the proof burden, but the soundness remains.

¹: it would be good if the dynamic semantics were safe in the presence of such an escape hatch; it could for example dynamically test that the unsafe cast is actually safe, or insert dynamic type checks in some way. That's not done in current-generation languages, which means that in this case you're on your own (segfaults, etc.). Degrading gracefully to dynamic checks would be better. Bracha's argument for "pluggable type systems" can be interpreted to say exactly this.

²: one specifically-considered way to do that is languages where assertions or conditional tests enrich the typing context (e.g. typestate systems such as Rust's, or F*). It can also be done "by hand" with only type abstraction:

  module Pos_int : sig
    type t = private int
    val pos : int -> t
  end = struct
    type t = int
    let pos n =
      assert (n > 0);
      n
  end

We're not too far apart, I think

I think it's necessary to have an "escape point" that allows you to purposefully break¹ the type system

This is just another way of saying that unsound type conversions should be explicit. I agree. If you want to characterize this as a sound type system with "escape hatches", that's fine with me, but as you allude to in your footnote [1], it would be better if these unsafe conversions were part of a more principled system.

I'm unsure what you mean by "phantom types erasable from types".

Just that where phantom types are just being used to pass along proof obligations, it should be possible to cast them away. Or stated another way, it should be possible to pull the required proof term "out of the air" by merely asserting the proposition to be true, producing a proof obligation for the ambient theorem prover to discharge. Or not. You should be able to ship an application that's reporting "4931 undischarged proof obligations."

Is proper variance compatible with objects/classes?

  • Intentionally unsound type system, with covariant generics. On purpose, but still the wrong choice.

There was a recent 'let's make a slightly better Java' language (Gosu) that made this same decision. And the cited advantage was that you could make an elem with the proper type. That is:

l : List T
l.elem : T -> Bool

Scala, for instance, gives l.elem the type Any -> Bool, which seems lax, since it'd be nice to statically rule out asking if an Int is in a list of Strings.

However, Scala actually can't do better. Ostensibly, List should be covariant in its argument, but if you try:


trait List[+A] {
  def elem(e: A): Boolean
  ...
}

You get a variance error. List is supposed to be covariant in A, but it contains an A in a negative position. It of course works if you define elem as a separate function on lists, and there are then, I think, tricks you can play to make 'l.elem' a valid term, but you cannot just use Scala's OO facilities.

Near as I can tell, the problem boils down to the OO practice of bundling the functions together with the data, and this being incompatible with proper variance of parameters. That is, an OO list is not just an element of the algebraic data type, but also carries a suite of methods to operate on that element. And then it becomes obvious that this conglomeration is not actually covariant. Switching to Haskell for a moment, we have something like:


data List a = Obj
  { uncons :: Maybe (a, List a)
  , elem :: a -> Bool
  , ...
  }

Which is clearly invariant. You'd have a similar problem if you tried to conflate data types with modules that operate on them in ML, for instance. The parameterized module may be invariant in the type parameter while the data type could be covariant.

So, could structuring programs this way be leading people to make these bad type system decisions? 'I know that lists should be covariant, but something weird is going on with my list class; I'll just make it covariant anyway.' Currently one seemingly has to choose between proper variance and proper code style.

Of course, I'd just get rid of classes/objects as an organizing unit. But that's not an option for someone trying to be an improved Java(Script). And it's probably not surprising that they haven't worked out a solution to the above dichotomy, either.

Wouldn't the Haskell type be more like...


data List a = Obj
  { uncons :: Maybe (a, List a)
  , elem :: Comparable a b => b -> Bool
  , ...
  }

It seems to me that this is closer to the "right" solution. The Scala approach amounts to allowing Any values to be compared.

The type of elem has nothing

The type of elem has nothing to do with the fact that it is packaged in lists. It is a direct consequence of covariance. Think about it. Assume List is covariant, i.e. List[T] is a subtype of List[Any], for any type T. Assume further that you have a polymorphic element test on lists, but now define it as a function outside lists:

def elem[T](xs: List[T], x: T): Boolean

Then

elem(List(1, 2, 3), "abc")

is well-typed. Just instantiate elem at type Any.

So to conclude: The "strange" type of the elem method is a simple consequence of covariance of lists. It does not matter whether you define elem as a method or as an external function.
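The same inference step can be reproduced in any checker with a top type; here's a sketch in Python's gradual typing (a checker such as mypy would infer T = object for the mismatched call, so it typechecks):

```python
# Martin's point, transposed: with a top type, a checker can always
# unify the list's element type and the query's type at the top, so
# the "nonsense" membership test is well-typed -- method or not.
from typing import List, TypeVar

T = TypeVar("T")

def elem(xs: List[T], x: T) -> bool:
    return any(y == x for y in xs)

assert elem([1, 2, 3], 2) is True
# A checker instantiates T = object here, so this passes type checking,
# even though the answer is necessarily False.
assert elem([1, 2, 3], "abc") is False
```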

But wouldn't you say...

That, ideally, contains() should take T? I mean, asking if a list of integers contains a string is nonsense, and almost certainly a programming error of the sort that a type system should prevent, if possible.

The solution I would prefer in a language like Scala would be the ability to mark a covariant type variable as "acting contravariantly in this position, trust me, I know what I'm doing, I won't write it to a memory location, pinky swear."

OTOH, I'm not a type theorist, nor do I play one on TV, so I don't know if this makes any sense at that level. I'm just looking at it with intuitive logic.

"Ideally"

That, ideally, contains() should take T? I mean, asking if a list of integers contains a string is nonsense, and almost certainly a programming error of the sort that a type system should prevent, if possible.

But if List is covariant as assumed, then you can just as easily cast List[Int] to List[Any] and ask if it contains a string in any case. So exactly what error do you want to be prevented? (If List is not covariant, then there's no problem to begin with.) As Martin says, it really is a natural consequence of covariance.

The solution I would prefer in a language like Scala would be the ability to mark a covariant type variable as "acting contravariantly in this position, trust me, I know what I'm doing, I won't write it to a memory location, pinky swear."

Ever used const/mutable in C++? Nobody can even agree what const ought to actually mean anymore, IMHO due in large part to all the pinky-swearing... Again IMHO, down that road lies only madness.

It also depends on other choices Scala has made

But if List is covariant as assumed, then you can just as easily cast List[Int] to List[Any] and ask if it contains a string in any case. So exactly what error do you want to be prevented? (If List is not covariant, then there's no problem to begin with.) As Martin says, it really is a natural consequence of covariance.

Only if you assume an ordering on Any and allow implicit casts to Any.

If you have any Any

Note that the more precise typing `elem : ∀(β≥α), Ord β ⇒ β → bool` doesn't rely on the existence of `Any` – not all flavours of subtyping have such a top type.

What are you getting at?

I agree with what you just wrote, but I'm not sure if you're agreeing with me or correcting me :).

The point carsongross made was that asking for int membership in a list of strings is likely an error. (I'll add that it also impedes type inference if you can only conclude the element to find is of type Any.) A simple way to get that error message in the presence of covariance is for the compiler to complain "I don't know how to compare Ints and Strings."

Correct.

Implicit in that discussion was the assumption that there is some universal equality on type Any. Universal equality is something that Scala inherited from Java, and I am less and less happy with it the longer I use it. Without universal equality the correct signature of Scala's contains would be:

    class List[A] {
      ...
      def contains[B >: A : Eq](x: B): Boolean
    }

We can still compare against values of supertypes but only as long as an equality Eq exists for the supertype.

So, yes, the whole covariance debate is a red herring. The problem lies with universal equality.
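A runtime simulation of that idea, sketched in Python (the Eq registry and error messages are hypothetical; in a static language the missing instance would be a compile-time error):

```python
# Membership guarded by an explicit Eq "type class" instead of
# universal equality: the query only proceeds when an equality
# instance exists for the (common) element type.

eq_instances = {
    int: lambda a, b: a == b,
    str: lambda a, b: a == b,
}

def contains(xs, x):
    # In a static language these raises would be compile-time errors:
    # "don't know how to compare Int and String".
    for y in xs:
        if type(y) is not type(x):
            raise TypeError(f"don't know how to compare "
                            f"{type(y).__name__} and {type(x).__name__}")
    if type(x) not in eq_instances:
        raise TypeError(f"no Eq instance for {type(x).__name__}")
    eq = eq_instances[type(x)]
    return any(eq(y, x) for y in xs)

assert contains([1, 2, 3], 2)
try:
    contains([1, 2, 3], "abc")   # the nonsense query is rejected
except TypeError as e:
    print("rejected:", e)
```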

I'm really not schizophrenic

After just posting an hour ago that getting rid of universal equality is a simple way to solve the problem, I have to say I don't think that's the ideal solution. Better, IMO, is getting rid of implicit unification to Any altogether. Because even if there isn't an equality constraint, as there isn't with append, I think it's preferable to error if the only unification is to Any. i.e. "Error: cannot append an Int to List[String] without an explicit cast to List[Any]".

In other words

In other words, you don't want a top type?

I didn't say that

I don't want implicit conversion to Top.

Well...

Well, then you don't want subtyping. ;)

Seriously, if you have subtyping, and a top type (which is meaningless without subtyping anyway), then anything has type Top. No way around it, except by replacing either subtyping or Top with something else (whatever that may be).

I think...

...you can have a type system which has a full subtype relation, but which doesn't apply subtype coercions unless explicitly asked to. You can even allow implicit casts only at certain types -- e.g., perhaps you only implicitly apply subtype coercions at base types, and require an explicit coercion at higher types.

F# has this, to some extent.

F# has this, to some extent. In some cases you have to explicitly use upcast. At least in the way F# does it, in my experience this causes more trouble than it's worth: you have to insert a lot of upcasts to make the type checker happy.

Predicate subtypes

The most important implicit conversion, IMO, is between predicate subtypes. Without implicit unification of {x:T | prop1} and {x:T | prop2}, they would be unwieldy.

Subsumption vs subtyping

...you can have a type system which has a full subtype relation, but which doesn't apply subtype coercions

Yes, I should have said subsumption instead of subtyping.

But if you want non-trivial subsumption (like Matt seems to), and at the same time an even more general subtyping relation, then you effectively have two different relations to deal with. That seems rather muddled, and not like a simplification for the programmer overall.

I think subtyping without

I think subtyping without subsumption significantly limits its expressiveness. In fact, I'd argue that without some kind of implicit subsumption rules, you don't really have subtyping at all.

I think the motivating List example is mostly a failure of type inference, and not variance problems. Virgil doesn't have variance for user type constructors but the example comes out:


class List<T> {
	value head: T;
	value tail: List<T>;
	new(head, tail) { }
}

class A {}
class B extends A {}

component Test {
	method find<T>(a: List<T>, b: T) -> bool {
		for (l = a; l != null; l = l.tail) {
			if (l.head == b) return true;
		}
		return false;
	}
	method test() {
		local a = List<A>.new(null, null);
		local b = List<B>.new(null, null);
		find(a, A.new());
		find(a, B.new());
		find(b, A.new()); // type infers <T=A>, ERROR
		find<B>(b, A.new()); // A not a subtype of B, ERROR
		find(b, B.new());
	}
}

Why do you even need to

Why do you even need to parameterize the contains method? Won't the normal type rules for applying functions suffice if the x parameter of the contains method is of type A? I.e. normal subsumption--applying list.contains(elem) is well typed if the type of elem is a subtype of list's type argument.

Definitely

Yes, sorry, I was taking that as assumed.

It seems to me that giving up unification to Any would be a really big change, and it's not clear to me what all the implications would be. Giving up universal equality seems more manageable and has some definite upsides, but I'm not sure how easy it would be to co-exist on the JVM without exposing reference equality in some form. I guess if we added an explicit AnyRef.getReference() to expose the underlying pointer (as an opaque value of some Reference type), then it would be possible and better to do away with universal value equality.

Well

I'll go back to the original point, which is that you shouldn't be able to ask a list of strings if it contains a boolean: that's just always a type error on the user's part. Sure, if you cast it to List of Object, all bets are off, but most people don't do that: they are usually working with particular concrete parameterized types.

As Martin says below, the "correct" signature here is something like:

  <B super T> boolean contains( B elt ) {
   ...
  }

(NB: not valid Java)

But, of course, a determined user can always cast a boolean to an object and test for membership. Oh well, I guess type systems are just destined to be either incomplete or insane.

And I also totally agree with Martin on equality: why in God's name should you be able to ask if a boolean equals a string?

Again, I'm no type theorist, just a dude trying to write some code, so this is all going to be desperately intuitive and uninformed.

Comparing Strings with Booleans

Indeed you could say that's always an error on the user's part, because the result is already determined at compile-time. It's like writing

    if (1 != 1) ...

But note that no type system will catch this expression as an error! In fact Scala's compiler will emit warnings when you try to compare elements of two types where the answer is always false, as in 1 == "abc". But that's an ad-hoc check. I think it would make sense to investigate systematic ways to generalize that to operations like contains on collections. But I don't believe in making the type system unsound to cater for these situations.

The right variance

You are right that the OO style of pre-applying functions to arguments makes precise types more delicate. You are not right, however, that it makes them "incompatible with proper variance": if you are willing to spend enough type sophistication, you can get precisely the desired variance.

  type α list = {
     elem : ∀ (β ≥ α), Ord β ⇒ β → bool;
  }

You can check that this definition is covariant: if α₁ ≤ α₂, then an element of type ∀(β ≥ α₁) β → bool can also be considered as an element of type ∀(β≥α₂) β → bool. Indeed, given a type β ≥ α₂, we also have β ≥ α₁ as α₂ ≥ α₁, and can get a bool out of β.

And this type can be implemented: given a List α and a β ≥ α with Ord β, you iterate through the elements of the list, coerce them from α to β, and then compare at type β.

Now, this type is complex, and this complexity clearly comes from the "pre-applied" flavor of OOP functionals. Indeed, consider two typings for the "full" elem function:

  elem₁ : ∀α, Ord α ⇒ List α → α → bool
  elem₂ : ∀α, List α → ∀(β≥α), Ord β ⇒ β → bool

The second type is more complex, but not more precise: elem₂ can be written in terms of elem₁ (assuming that List is covariant):

  elem₂ α (li : List α) (β ≥ α) (w : Ord β) (elem : β) :=
    elem₁ β (w : Ord β) (li : List β) (elem : β)

This "static complexity" that is characteristic of OOP style is indeed daunting. Some people argue that it should be hidden by abandoning the idea of expressing such fine-grained properties in the type system. Others argue that those static subtleties actually reflect the semantics of the situation, and that hiding them doesn't make them less important to write correct programs. See for example this older discussion on LtU.
I don't wish to take sides here, but I wanted to point out that correct variance is possible.

Lists should be covariant?

Why should a list be covariant? Because it is more frequently used to retrieve its contents than to add to them?

In my opinion variance depends on usage. Type inference can help here, as the type engine knows if update operations are used and if read operations are used. Typing manually, on the other hand...

Immutable lists should be covariant

I think everyone in this discussion assumed that the list was immutable. In the Array case (which is implicitly assumed to be mutable), it "should be invariant".

In the presence of subtyping, monotone (covariant or contravariant) data structures are simply much more flexible than invariant ones. Trying to get as much variance as possible out of a datatype has direct benefits for its reusability -- and safety, if the other option is 'perform dynamic type casting to a derived class'.

The option you suggest, that is, having an a-priori mutable data structure and allowing variance only when the mutating operations are not used, is vastly more difficult than it looks. Controlling sharing and ownership through types either makes strongly simplifying assumptions (e.g. Erlang's "no sharing") or demands sophisticated type features (linear types, etc.); this is still a research topic and I'm not aware of a satisfying, production-ready solution -- see however Plaid, F*, and the linear bestiary.

Should a mutable list really

Should a mutable list really be invariant? If I pass it to a function that will only read from the list it should be safe to pass a list of int to a function that accepts list of real. Similarly if a function only writes to the list, then it should be safe to pass a list of real to a function that writes to a list of int.

So perhaps a mutable List should have two type parameters: one indicating the type you can read from the list and another indicating the type you can write to the list. So List[+Read,-Write]. For example List[real,int] <: List[int,int] and List[real,int] <: List[real,real].

Source and Sink

There is an interesting discussion of this aspect, if I remember correctly, in Pierce's TAPL : you can decompose a mutable reference into a 'sink' (where you can only put) and a 'source' (from where you can only get). I don't recall the details, but you should look it up if you're interested in the specifics.
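A minimal sketch of that decomposition in Python (class names follow TAPL's terminology; this is illustrative, not a quote of the book): the read-only half is safely covariant, the write-only half safely contravariant, and only the full reference is invariant.

```python
# Splitting a mutable reference into a covariant Source (get only)
# and a contravariant Sink (set only). Code that needs just one
# capability can then be given the more flexibly-typed view.

class Ref:
    def __init__(self, value):
        self.value = value

class Source:
    """Read-only view of a Ref; with only get(), covariance is safe."""
    def __init__(self, ref):
        self._ref = ref
    def get(self):
        return self._ref.value

class Sink:
    """Write-only view of a Ref; with only set(), contravariance is safe."""
    def __init__(self, ref):
        self._ref = ref
    def set(self, value):
        self._ref.value = value

r = Ref(41)
reader, writer = Source(r), Sink(r)
writer.set(42)       # a Sink[Animal] could stand in where a Sink[Cat] is needed
print(reader.get())  # prints 42
```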

Good too

Good too:

  • big ints
  • optional typing

shifting in zeros from heaven

Bignums are nice - but then, what's the purpose of the >>> operator?

The bool semantics are odd, but it's hard to beat Javascript in that discipline. Some people may be surprised by the "anything other than true is false" rule; quite the opposite of the Lisp tradition.

Dart generics are not really

Dart generics are not really covariant; they are covariant in some places but both co- and contravariant in other places. To the is operator they are covariant, but elsewhere in the language they are somewhat co+contravariant, but not really.

For example this compiles (and runs) without type error:

    List<Object> x = new List<bool>();
    List<int> y = x;

But this produces a type error:

    List<int> y = new List<bool>();

So it appears that generics are unsafely treated as both co and contravariant, but only one of the two per assignment or per passing something to something else (or whatever you want to call that).

The is operator is only covariant however:

    foo(List<int> x){ print(x is List<int>); }

    foo(new List<Object>()); // no error, prints "false"

And:

    bar(List<Object> x){ x.add("foo"); }

    List<int> x = new List<int>();
    bar(x); // no error
    print(x is List<int>); // prints true
    print(x[0] is int); // prints false

But doing x.add("foo") instead of bar(x) does produce an error.

The hacker news discussion.

The hacker news discussion.

in the jungle

The top comment is an ad hominem attack on Gilad Bracha. Hacker news discussions are often very interesting, but I'm glad LtU has more respectful discussion policies.

De gustibus

While I agree that the comment, on the face of it, is an ad hominem, it is also true that Bracha is a very opinionated PL designer who fails to convey the motivation behind his opinions†. The message I get from his comments in that particular post is that, indeed, his own experience and pragmatic taste as an engineer trump theoretical considerations, to the point that he seems to advocate throwing out the baby with the bathwater (the motif being "types are weak static guarantees, so let's do without them and replace inference with pluggable runtime contracts"). This is an argumentum ad auctoritatem, hardly any more acceptable in the context, and for me at least it casts a pall on what Dart can actually offer over JavaScript.

† Not unlike Bob Harper; the difference being that Harper is a sound theorist and I happen to agree with where he comes from.

Conservative skin over JavaScript

It looks like a simple, typed language that will translate very easily to JavaScript. That's why a lot of interesting features aren't there, e.g. tail calls. It also explains the concurrency model, which sounds like it will map directly to worker threads. They've taken some of the more egregiously missing parts of JavaScript--e.g. classes and modules--and designed a simple solution for them that can be compiled away.

It's not clear that browsers will benefit from direct support for Dart. If they have a good JS VM, perhaps with a small extension or two, then developers might be just as well off compiling to JS like they do now.

Reified generics are a surprise. In addition to the performance problem Neel describes, they are also problematic for cross-language communication. Perhaps, though, Dart only uses reified generics in its checked mode? I'm unclear from the linked documents.

I have mixed feelings about null. It's helpful to have explicit options in the code, but it also blows up the code size. I have found that many APIs turn out to have optional values after all, and if I had had to use explicit options, I would have had to go back and make cascading type changes to change from String to Option[String] or whatnot. As well, I have written and maintained a fair amount of option-happy code at this point, and in practice I find a lot of code does a dereference without checking for the "none" option. Code like that means you pay the price in code size but don't get any extra checking.

I think that's the key point

The bad and the ugly of JavaScript are gone, and yet the language translates straightforwardly into JavaScript. No, it's not what one would want to see starting from scratch, but the Web browser environment is as far from "scratch" as could be.

Blub

I can't help but see it not as a fractional inching upwards from JavaScript's blub coordinate but as a sideways epsilon from it: a tepidly pragmatic papering over JS's sourest pain points.

Blubland is where we live

All Bulbs (generic term for non-Blub languages) have to deal with Blub somewhere, and that means writing either Blubby Bulb or Blub that is bizarre even for Blub. Just see any FFI you happen to think of.

Because most programming is

The type systems are flawed and untyped languages are popular because the vast majority of programming is done by non-programmers, no matter what their faux job title may be.

I remember a rather large government contractor project with gobs of PHB managers that declared that all variables must be global (!), because oodles of local variables "took up too much space" as if the lexical spellings for variables were "space hogs" - i.e., they had absolutely no understanding even of a simple runtime stack.

I'm not joking. And that was well before every young unemployed "creative" person in the mid-90s learned some scripting language and started stringing together copied/pasted "magical incantations" to make Web sites more "interactive".

Move along people, nothing new to see here on the flawed type system front ;-) IMHO, interesting languages will occupy shrinking niches: domains where correctness is important; domains where performance is very important and better correctness than C++ is highly desirable (what's happening in the D community?); languages or DSL's that really nail an interesting problem domain just right (basic Web site building will NOT be one of them) and are actually supported and deployed in the field; languages that resurrect some interesting language feature from the past and show how it makes many current problems solvable more easily or elegantly or safely or with greater performance (preferably all of the above).

Sorry, I can't sleep and am having a curmudgeonly morning.

Dynamically typed is not the

Dynamically typed is not the same as untyped!

"Creative" people need languages also. We should maybe try to cater to them in the right way. Figure what they find easy and what they find hard, looking at abstractions is a good start.

D correctness, D community

IMHO, interesting languages will occupy shrinking niches: domains where correctness is important; domains where performance is very important and better correctness than C++ is highly desirable (what's happening in the D community?);

D seems more interested in correctness than C++, but languages like SPARK, Ada, or even Haskell are far more interested in it. D has correctness/safety holes regarding integral overflows, nulls, weak typing of enums (which C++0x has fixed), array covariance troubles (http://d.puremagic.com/issues/show_bug.cgi?id=2095), undefined behaviours inherited from C's design like the order of sub-expression evaluation and of function-call arguments, and more. A few of these holes will probably be fixed (the last ones, about the C undefined behaviours), but most of them will not be. This is very different from the Ada approach.

At the moment the D community is going well, the discussions in the D newsgroups are lively and smart, important bugs and corner cases of the language are being fixed every day, small new features to improve language usage are discussed daily, and the standard library is shaping up well. Slowly D (V2) is becoming a practically usable language, a good language.

Another nice overview

This is bizarre direction

This is bizarre direction given that Adobe left the ECMAScript committee due to ActionScript already having gone much of the way in this direction (e.g., shared libraries and mixed static/dynamic types) and Google's and others' representatives stalling on cleaning up and incorporating these additions. Perhaps this is a good lesson on listening to corporate members of a standards committee, or, better, a lesson on 'natural' standards: implement first, standardize after.

Dart won't be standardized without multiple vendors

And multiple vendors won't come if they don't see either market pressure to support it, or an advantage-bone (a big one) thrown their way by Google. Market pressure will take time to build and may never rise beyond the "support CoffeeScript" or "support [Language X, currently mapped to JS]" level.

More interesting (since I'm supporting it within Mozilla): Emscripten, an LLVM-to-JS back end that already generates code within striking distance of native, where speed will only go up as it dances with the top open source JS VMs, and which can grow to support dynamic loading, multicore, and GPU optimizations.

This is a much shorter path to C and C++ cross-browser portability than NaCl, which requires the huge, chromium-only Pepper API in all browsers. That dog won't hunt -- as I said at JSConf.eu, "NaCl is not coming to the iPad. Ever."

/be

Emscripten looks fun! It

Emscripten looks fun! It reminds me of Scott Peterson's Alchemy project for C++ to ActionScript, which also led to some funny changes to the ASVM. Similar milestone of Python as well :) A useful next one was POSIX APIs.

I'm actually pretty excited to see the FreeType port demo. I've been playing with a code generated layout engine, so we're pretty close to a bare-bones browser-in-a-browser (e.g., others already pretty much have JS DOMs), but something like FreeType is important for really getting cross-platform compatibility.

The problem with NaCl as far

The problem with NaCl as far as I can tell is that it is non-portable. This applies not only to x86 machine code but also to LLVM IL. Things like endianness, pointer size and ABI are exposed in the LLVM IL, so the resulting code is not portable. LLVM IL is also rather large in space usage.

What would be a good compiler target is a portable low level bytecode, or JS with static types, integers, value types, tail calls etc.
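A miniature version of the "JS with integers" target described above already exists as an idiom: compilers that emit JavaScript (Emscripten among them) use the `| 0` coercion to mark 32-bit integer arithmetic for the VM. A small sketch (TypeScript notation for the annotations; the technique itself is plain JS):

```typescript
// JS numbers are IEEE doubles; `x | 0` coerces a value to a 32-bit signed
// integer. Compiler-generated code peppers arithmetic with `| 0` so that
// JS VMs can infer and compile genuine integer operations.
const asInt32 = (x: number): number => x | 0;

console.log(asInt32(3.7));            // -> 3 (truncates toward zero)
console.log(asInt32(2147483647 + 1)); // -> -2147483648 (wraps like a C int32)
```

The wrap-around in the last line is exactly the int32 semantics a low-level bytecode would give you directly, rather than through a coercion hint.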

Also, Emscripten is not within striking distance of native performance. The performance difference is over two orders of magnitude on their Python interpreter.

Emscripten Performance

Also, Emscripten is not within striking distance of native performance. The performance difference is over two orders of magnitude on their Python interpreter.

The Python interpreter is not a good benchmark of Emscripten for various reasons, mainly that it has not really been optimized for speed; no one that I know of has really focused on this. For example, even on repl.it, the Python interpreter has not had Closure Compiler advanced optimizations run on it. And it does not have any of the more advanced Emscripten optimizations applied to it (typed arrays, memory compression, etc.). Even with that, though, it is 40X slower than native code on Opera, so it isn't over two orders of magnitude - but yes, it is quite slow. If someone were to spend some time optimizing it, I believe that could be hugely improved, though.

Regarding Emscripten performance, the benchmarks I have run are typically in the range of 3X-4X slower than native code. I actually just ran the benchmarks on the latest JS engines, and they have improved a bit more, getting closer to 3X on average. So overall I would say that performance is quite good - not as good as native code obviously, but not that far from static languages that run on the JVM or .NET.

There are some caveats, however. First and foremost that performance varies between JS engines, so you have no guarantee of performance (no JS engine is best, btw - each is better at different benchmarks). Also, typically JS engines are not optimized for large JS files, so there are slowdowns caused by recompiling and things like that. But those things are being improved all the time, so I believe we will get very close to native speed.

Dart and ActionScript

I too find the similarities between the Dart and AS3 situations to be a little uncanny. It seems unlikely to me that many of the people working on Dart were very familiar with the path Adobe took, given that the Dart leads aren't TC-39 members and that essentially nobody outside of the Flash community pays attention to ActionScript.

It will be interesting to see whether or not Dart falls into some of the same traps as a result of these similarities. Already it seems like performance might be one of those issues. Flash Player has Tamarin/AVM+, and Dart has DartVM, but the (early) Dart benchmarks show mixed results at best, just as Tamarin has gone in cycles of being slower or faster than top JS engines since its release in 2006 despite the advantages promised for type-annotated code.

Like Adobe, Google also seems to be the only party, at this point, with much of any interest in delivering VM performance and language evolution. That has obvious implications for performance and less-clear implications for evolution. In the case of ActionScript, the language has essentially remained the same except for the addition of a single generic class, Vector.<T>. It would be a shame if some of the features in consideration for Dart, like reflection, were never added because Google stalled on it. Thankfully, Dart hasn't yet reached 'release'-level maturity.

ActionScript 3 suffered a lot by not being a clean break from ECMAScript (while being a messy break from the committee), since much of its syntax was designed to be compatible with the ES4 drafts. It seems that Dart will at least remedy that problem, although it's not entirely clear that JavaScript developers will appreciate the loss of the dynamism that they're used to. This could be why Google now seems to be marketing Dart as a web language that's more approachable for those coming from Java, despite the leaked memo which made it sound more like an attempt at JavaScript assassination.

I wonder what we would have seen if Adobe and Google had decided to team up on a shared language/VM. (if the gap between 2006 and 2011 had been significantly smaller)

AS3 and onwards are a pretty

ActionScript 3 suffered a lot by not being a clean break from ECMAScript (while being a messy break from the committee), since much of its syntax was designed to be compatible with the ES4 drafts.

AS3 and onwards are a pretty strong advance from ES of that period both syntactically and semantically (which is a Good Thing). As a fly on the wall at that time, the concerns seemed more about Flex and AS2 legacy :) I think it was actually a healthy decision as it gave them an empirically validated position on mixing static and dynamic code as they had to do it for many teams worth of code. I'm not sure how it suffered -- syntax wasn't too interesting (except for say E4X), and arguably the main semantic impact was that the static/dynamic integration was very conservative. Independent of legacy, a bigger issue seems to have been not having many type specialists help out, unlike what happened with say Java.

It seems unlikely to me that many of the people working on Dart were very familiar with the path Adobe took, given that the Dart leads aren't TC-39 members and that essentially nobody outside of the Flash community pays attention to ActionScript.

I can't really comment on that -- my point is more of what this tells us about language committees where members are corporations. Google investing in web language technology is important given the current economics of technology infrastructure, and clearly these technologies need improving; I'd be worried if there wasn't something like Dart :) A committee shouldn't be surprised either, and this sort of thing should be taken into account when making decisions, esp. on innovation. If one representative of an organization says to hold back, will another agree? And for how long?

Going back to Brendan's comments in another thread on this about closed development, the comments on types here are funny in terms of standards. Let's say Google announced an open project to add gradual/soft/etc. types to JS sometime 2 years ago, and drafted some of our favorite type theorists to help. Would more progress than Dart's be made? Would it be suitable for standardization? I don't know -- but in terms of natural standards, the current result of Dart (should it be picked up at least within Google) seems hopeful for pushing such ideas into ES. However, I'm not as clear on the theoretical quality of Dart's approach nor its chance of success -- the open approach seems less 'risky'. Luckily, it's still not out of the question :)

Lars Bak on Dart

Lars Bak answers a few questions after unveiling Dart:

When you design a language, you always have to design trade-offs into the language. What will you do to make sure that five years from now, some other “Lars and Gilad” will have to start all over and make yet another new language?

We try to be pretty inclusive in the trade-offs. There’s no such thing as an all-purpose language – you can’t take just any language and make it do everything.
Many of the constraints have to do with backwards compatibility. For us, being able to start from a brand new language makes a lot of things easier. With that in mind, we have done a lot to make sure that you can still run Dart programs in existing browsers.

Actually, I hope that in many years from now, someone will come along and say “we’ve got something even better”. The more attempts at innovation, the better the result.

Also there is Dart Inside, The Unofficial Google Dart Blog

The hubris in the last two

The hubris in the last two sentences is remarkable.

So...

Dash is designed with three perspectives in mind:

- Performance -- Dash is designed with performance characteristics in mind, so that it is possible to create VMs that do not have the performance problems that all EcmaScript VMs must have.

- Developer Usability -- Dash is designed to keep the dynamic,
easy-to-get-started, no-compile nature of Javascript that has made the web platform the clear winner for hobbyist developers.

- Ability to be Tooled -- Dash is designed to be more easily tooled (e.g. with optional types) for large-scale projects that require code-comprehension features such as refactoring and finding callsites. Dash, however, does not require tooling to be effective--small-scale developers may still be satisfied with a text editor.

(from the leaked memo)

So, do you think that what you see delivers the goods? I am still not sure how any of these motivates a new language; I'd appreciate being enlightened.

Why not covariant generics?

Why not covariant generics?, by Chung-chieh Shan.

Isn't he conflating one

Isn't he conflating one interpretation of soundness with completeness?

From the post:

But plenty of static type systems are unsound yet respectable. For example, many dependent type systems are unsound, either because their term languages are blatantly non-terminating or because they include the Type:Type rule. (I faintly recall that Conor McBride calls Type:Type “radical impredicativity”.) So it doesn’t seem that unsoundness per se should rile people up about Dart in particular.

Edit: and did I just conflate completeness with decidability? :)

Read on

Read on; he actually clarifies that in the next paragraph. And further down he suggests that covariance is a non-starter nevertheless (although I find his line of argument a bit convoluted).
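For readers who haven't run into it, the unsoundness of covariant generics is easy to reproduce in TypeScript, whose arrays are covariant in the same way Dart's generics are. A sketch (TypeScript, not Dart syntax):

```typescript
class Fruit { constructor(public name: string) {} }
class Apple extends Fruit {
  constructor() { super("apple"); }
  crunch(): string { return "crunch"; }
}

const apples: Apple[] = [new Apple()];
const fruit: Fruit[] = apples;    // accepted: arrays are covariant
fruit.push(new Fruit("banana"));  // also accepted by the checker...
// ...but now `apples` holds a plain Fruit, so this line would fail at run time:
// apples[1].crunch();            // TypeError: apples[1].crunch is not a function
console.log(apples.length);       // -> 2
```

The write through the covariant alias type-checks, and the error surfaces only when someone later uses `apples` at its declared type.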

Is Dart really what is

Is Dart really what is needed? Wouldn't it be more useful to have a more general VM in the browser that can be targeted by multiple languages? Dart doesn't seem to be a particularly good target for compilation.

There already are several languages that can target JavaScript, like Haxe, Fantom, CoffeeScript, ... . Some of them would probably profit from a VM.

Useful? Yes.


Wouldn't it be more useful to have a more general VM in the browser that can be targeted by multiple languages?

Useful? Yes. Practical? Not so much. The reason we're stuck with JavaScript (and indeed, HTML) in the first place is due to the quirks of market adoption.

Was it experience with V8

Was it experience with V8 that led to Dart?

Not just V8

Also Caja. Dart has much better support for sandboxing based on the experience of trying to "tame" JavaScript to make it capability-secure.
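To give a flavor of what "taming" for capability security means, here is a hypothetical sketch of a revocable facade in the Caja style -- not Caja's actual API, and using a modern Proxy where Caja itself relied on source rewriting:

```typescript
// A capability-style "tamed" reference: the holder gets a facade that can
// be revoked later, instead of direct access to the underlying object.
function makeRevocable<T extends object>(target: T) {
  let revoked = false;
  const facade = new Proxy(target, {
    get(t, prop) {
      if (revoked) throw new Error("capability revoked");
      return (t as any)[prop];
    },
  });
  return { facade: facade as T, revoke: () => { revoked = true; } };
}

const fileApi = { read: () => "contents" };
const { facade, revoke } = makeRevocable(fileApi);
console.log(facade.read()); // -> "contents"
revoke();
// facade.read();           // would now throw: "capability revoked"
```

Retrofitting this discipline onto all of JavaScript is what made taming hard; a language designed for it can bake the boundaries in.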

client side Java

More general VMs that can be targeted already exist: client-side Java applets, Flash, Silverlight... The whole AJAX push was to offer applications that didn't run inside a JVM.

I agree that the whole Javascript push is a mess. We are moving towards a web with massive quantities of highly layered code in an interpreted language performing complex operations... Some websites take several hundred megs of RAM for a page load.

That being said, Google was a primary proponent of the whole AJAX push. If the data is in a VM it can't be effectively indexed by search engines.

I anticipate "certified Dart

I anticipate "certified Dart developer" courses popping up :(

Why? Google currently

Why? Google currently doesn't offer developer certifications for anything else, right?

Strongtalk

Dart appears to be very strongly influenced by Lars and Gilad's previous work on Strongtalk, only with a Java-like syntax rather than Smalltalk-80.

Gilad Bracha on Dart

Interview with Gilad Bracha on Dart
He talks about using Google's power to study the use of javascript features in the wild to base the design of Dart.

we're trying to take a data oriented, googly approach to this to actually find out about.. That's a nice thing about the web. We've actually started to do these things. Looking at what features people use in javascript and how popular they are. That can sometimes help us with our decisions. At google we can find out just how many people are doing x in a javascript program on the planet.. We can actually base our decision on data. Which is kind of scary 'cause what if one of the data doesn't agree with me, well it's obviously wrong but :)

And people complain about

And people complain about design by committee...

To paraphrase Jobs: "we

To paraphrase Jobs: "we don't ask our users what they want, we create something that they will want, but have no idea they want now."

I'm not a big fan of user-driven design.

Pot/kettle

Do you think you're disagreeing? I'm not sure you are... It seems to me that Ehud's putting data mining-driven design at the end of a spectrum that goes something like:
cohesive design by a tastemaker -> design by committee -> user-driven design -> data-mining design

I was agreeing with Ehud.

I was agreeing with Ehud. Sorry if that wasn't clear.

Edit: I agree with you also. Here, the users choose what to do, which influences the data; I don't think this is data-driven design so much as user-driven design reflected through data.

I'm not sure how Gilad will deal with such an environment. I would think this is not his preferred way of working. It does sound like Google though.

I thought so :-)

I thought so :-)

Intel was the same, at least in the 1970s, early 80s

I saw the same thing at Intel in the 1970s, into the early 1980s.

Rarely did our customers signal to us what they wanted. It was our job to develop something, like a DRAM, PROM, Erasable PROM, microprocessor, that our technology was capable of creating but that they DID NOT KNOW THEY NEEDED.

More Dart, metaprogramming

Peter Bright has posted a nice article on Dart at Ars.

I'm wondering if we are losing something with Dart that is fundamental. One of the reasons Javascript took off was its powerful metaprogramming facilities that enabled all sorts of hackery in libraries. This was mainly enabled by dynamic typing and its prototype-based object system. Dart has none of that, and is more like a conventional PL.

I expect we are indeed

I expect we are indeed losing a lot of that, but I am quite sure the Dart designers consider that a feature, not a bug. The prototype chain is the main reason JavaScript is hard to optimize.

However, it will certainly make Dart almost totally uninteresting to the jQuery / Coffeescript users of the world, whose entire lives are based on JavaScript metaprogrammability.

Scala supports static metaprogrammability, but the lack of robust IDE support for it indicates that it may be pushing the abstraction lever too far. JavaScript supports dynamic metaprogrammability but completely abandons static analysis and tooling. Dart seems to have ditched metaprogrammability altogether. It would be nice if Dart moved just a bit in the Scala direction, but not far enough to become nigh un-analyzable....

Just not implemented

Dart seems to have ditched metaprogrammability altogether.

It hasn't ditched it; it just hasn't been implemented yet. The plan (unsurprisingly) is to have a mirrors-based reflection/metaprogramming API. Gilad et al. just haven't gotten to it yet.

Less protos and more object literals

This was mainly enabled by dynamic typing and its prototype-based object system.

I think most of JS's power comes from:

  1. Object literals, and a nice notation for them.
  2. Objects-as-named-property-bags, and the easy ability to add new properties to them.

I'm not an expert, but most JS I've seen in the wild, especially "modern" stuff like jQuery, isn't really that prototype-based; it's mostly just flat objects. From that angle, there isn't much difference between a prototype-based language and a class-based one: if you aren't inheriting to begin with, it doesn't matter how your inheritance scheme works.

Being able to easily create objects on the fly is something nice about JS that Dart doesn't have, though. Like most languages, it draws a sharper distinction between "objects" and "data structures" and has a distinct map type with map[key] syntax.

That will get a little better once doesNotUnderstand is completely implemented, but my expectation is that Dart will never feel as flexible regarding objects as JS does. Depending on your style, that may not be much of a loss.
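The two affordances listed above, in a short sketch (TypeScript with `any` standing in for JS's untyped objects):

```typescript
// 1. An object literal: lightweight notation for structured data.
//    (`any` here stands in for JS's untyped objects.)
const config: any = { host: "example.org", port: 80 };

// 2. A named property bag: new properties appear on plain assignment,
//    with no class declaration anywhere.
config.retries = 3;

// In Dart, the closest equivalent is an explicit Map with map[key] syntax.
console.log(config.retries);   // -> 3
console.log("host" in config); // -> true
```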

I think you are missing my

I think you are missing my point: the flexible dynamic object system wasn't really there to support application programmers (they benefited from simplicity, not flexibility); it supported library writers, who could then do crazy things to make life easier for users of their libraries. What Dart has done is focus on the library user, which is reasonable but loses the existing library ecosystem.

It's not clear to me that Dart is flexible enough for library writers to go in and redo what they did for Javascript. Of course, if Google is going to re-provide all of the libraries on its own, this is a moot point. But making things easier for library writers was probably the biggest advantage of JS, especially useful when you don't have the resources to create a complete ecosystem on your own.

Siek & Taha's Gradual Typing

Dart's ability to 'add type information' reminds me of Siek & Taha's gradual typing.

-j n smith.
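Siek & Taha's key idea -- typed and untyped code mixing freely in one program, with annotations added as the code matures -- can be sketched in TypeScript, a later language that adopted a similar optional-typing design (this is not Dart syntax):

```typescript
// Untyped prototype: `any` plays the role of the dynamic type in
// Siek & Taha's system -- every operation is allowed, and mistakes
// surface only at run time.
function area(shape: any) { return shape.w * shape.h; }

// Migrated, typed version: the annotation states intent and is checked
// statically, with no change to the run-time behaviour.
interface Rect { w: number; h: number; }
function areaTyped(shape: Rect): number { return shape.w * shape.h; }

console.log(area({ w: 3, h: 4 }));      // -> 12
console.log(areaTyped({ w: 3, h: 4 })); // -> 12
```

Both versions compute the same result; the typed one simply documents and enforces the shape of its input, which matches Dart's pitch of migrating from untyped prototype to typed application.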

Gilad comments on Alan Knight's "I wish Dart were Smalltalk"

Gilad comments on Alan Knight's "I wish Dart were Smalltalk"

Wrong direction. (Note: I'm

Wrong direction. (Note: I'm a security engineer by hobby, so it influences my preferences.) We don't need just another programming language. The problem is that the web browser is pretending to be an operating system, the web programming languages aren't as good as system languages, and complexity is getting too high for the simplest applications. We need a different approach that integrates the web applications into one tool or set of appropriate languages. Preferably, an approach that can be secured like regular applications.

I say we either minimize the browser's involvement or take things out of the browser. I'd, of course, much rather use languages like OCaml and properly distribute the application. However, if it has to be the Web, I think approaches like SIF and Opa are a bit better than what we currently have. There are always CASE tools like WinDev and AlphaFive, but I don't trust them for many things. Any ideas or good projects/products that I missed?

Rich web languages

You may be interested in Links and Ur/Web.

thanks

Forgot to thank you before. They were interesting. I was also considering practical uses for the SELinks system recently.