Java Generics, Arrays, and Comparables

Tim Bray is struggling with the way Java integrates generics and interfaces:

Some day I must find out why generic declarations extend rather than implement interfaces. Grr.

Kinda makes me understand what my students go through when they first encounter Ada's generic model (which is much easier to understand, I hasten to add).


you know it's a sad day...

...when a topic on VB draws more attention (2 posts) than a Java one...

I disagree. I think that Java

I disagree. I think that Java should get back in its box, and stop bothering us all by pretending that it is useful for anything other than glue code. Real algorithms are written in Fortran.

glue code by glue code

I'll take Perl, Java, assembly, then VB, in that order...

perhaps

...the rest of us are still puzzling over why Java uses "extends" for both interfaces and class inheritance in generics?

Extends vs. implements

When you inherit an interface from another one you also use "extends". Since you can instantiate a generic with an interface type, "extends" for type bounds makes as much or as little sense as the other choice.

The difference is superficial anyway: in a class declaration, both do the same thing, except that "implements" puts more constraints on its argument (interfaces are just a restricted form of class).
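
A minimal Java sketch of the cases being described (Shape, Resizable, Circle, and Box are invented names for illustration):

    // Interface-to-interface inheritance already uses "extends"...
    interface Shape { double area(); }
    interface Resizable extends Shape { void resize(double factor); }

    // ...while a class providing an interface uses "implements".
    class Circle implements Shape {
        public double area() { return Math.PI; }  // unit radius, for brevity
    }

    // In a type bound there is only "extends", and the bound may be an
    // interface (as here) or a class; the keyword doesn't distinguish them.
    class Box<T extends Shape> {
        private final T contents;
        Box(T contents) { this.contents = contents; }
        double contentArea() { return contents.area(); }
    }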

Core Calculus of Metaclasses

Maybe he should take a look at A Core Calculus of Metaclasses to get an idea of what more might be in store for Java. I was quite enlightened by that paper, but I'm not so sure that it was those ideas that influenced the design of Java generics.
why generic declarations extend rather than implement interfaces

It's been a while since I looked at Java, but could it be that if generics implemented the interfaces, then they would no longer be multiply-inheritable?
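
For reference, Java's bounds do allow multiple interfaces on one type parameter: a bound may name one class plus any number of interfaces, all introduced by "extends" and joined with "&". A rough sketch (SortedCell is an invented name):

    import java.io.Serializable;

    // T must be a Number that is also Comparable and Serializable:
    // one "extends" covers a class bound and two interface bounds,
    // so no class/interface keyword split is needed.
    class SortedCell<T extends Number & Comparable<T> & Serializable> {
        private T max;
        void offer(T v) {
            if (max == null || v.compareTo(max) > 0) max = v;
        }
        T max() { return max; }
    }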

They just picked one.

"extends" and "implements" are two ways of saying "is subtype of".

Though two keywords are not strictly necessary (the compiler can figure out automatically which one applies), I guess the language designers thought it would aid the readability of class definitions. But when specifying type parameter bounds, they decided it was unnecessary for the code to be explicit about the difference.

I think I can see where they're coming from. When defining a new class, inheritance is involved, which is different for classes and interfaces. When specifying type parameter bounds, inheritance isn't involved, so having different keywords for classes and interfaces would be pure noise.
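
A small sketch of that uniformity (the method names are made up): Number is a class and CharSequence is an interface, yet both bounds read the same way.

    class Bounds {
        // The bound Number is a class; the bound CharSequence is an
        // interface. Both are introduced with the single keyword "extends".
        static <T extends Number> double half(T t) { return t.doubleValue() / 2; }
        static <U extends CharSequence> char first(U u) { return u.charAt(0); }

        public static void main(String[] args) {
            System.out.println(half(42));       // T inferred as Integer
            System.out.println(first("java"));  // U inferred as String
        }
    }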

The C# language only has one way of specifying "is subtype of", but the Microsoft coding conventions require a prefix "I" for interface types, so the distinction is present everywhere.

nitpick

"Microsoft coding conventions require a prefix "I" for interface types, so the distinction is present everywhere."

Actually, that comes from the good Borland guys they bought to design them a good language. Delphi code uses this convention, along with TFoo for other types, to overcome the "limitation" that Pascal's syntax isn't case-sensitive, which meant they couldn't adopt the common convention of type names beginning with an uppercase letter and variable names with a lowercase one.

Actually, it's much more readable and clear...

Setting the type, err, kind

Actually, it's much more readable and clear...
Interesting: many scientific papers use different metavariables and different typesetting for different "kinds" of names. While the first practice reminds me of Fortran, some early Basics, and Rexx, and does not really work well for programming (except when the first letter is I or T ;-) ), the second one is promising.

It has two potential drawbacks: accessibility and encoding to plain text. Both can be solved by saying that there is an underlying textual syntax with explicit tagging of "kinds" (not unlike C struct, enum, etc.), while optionally they can be rendered using typesetting. So, in plain syntax: enum Foo = ...; struct Foo = ...; enum Foo Foo = ...; struct Foo Foo = ...; , in typeset syntax: Foo = ...; Foo = ...; Foo Foo = ...; Foo Foo = ...;

Uh, Drupal suspects <style> is malicious, so I had to fall back to <font>; I would definitely have used small caps and gothic script otherwise :-)

Seriously, even CTM uses typesetting to differentiate between kinds, so why don't we see more of this in real PLs?

because

"Seriously, even CTM uses typesetting to differentiate between kinds, so why don't we see more of this in real PLs?"

Perhaps because typesetting is a pain in the ass for real programming?

Your favorite document editor

Your favorite document editor begs to differ.

not really

"in plain syntax: enum Foo = ...; struct Foo = ...; enum Foo Foo = ...; struct Foo Foo = ...; , in typeset syntax: Foo = ...; Foo = ...; Foo Foo = ...; Foo Foo = ...;"

Emacs sure knows how to syntax-highlight automatically based on keywords. But to do something like the above, you'd need to typeset explicitly (perhaps via a bunch of key combos) in order to deal with it.

My Question would be...

Seriously, even CTM uses typesetting to differentiate between kinds, so why don't we see more of this in real PLs?
I believe the goal of embedding type information in a variable's name is to make it easier for someone else to read the code. In that sense I'd argue that the technique is used to make up for the deficiencies of common source code browsers/editors. The author of the code shouldn't be bothered to type any extra letters.

Better PLs through editors

Seems to be a neglected area of PL research. Individual preferences for code layout and highlighting should be a function of your editor, not necessarily the code repository. Like the offside rule as a nesting technique: set the editor to view the code with that parameter. Like squiggly brackets {} or begin/end: set it as a preference in your editor. There are lots of pretty-printers out there to make sure you get the view of the code in the one true way. The question is why editors don't just integrate configurable pretty viewers.

Of course, you have to worry about how to input the code (e.g. typeset code). But program editors could make it possible to view the code in numerous ways.

"Actually, that comes from th

"Actually, that comes from the good Borland guys they bought to design them a good language."

Some of it may have come from the Borland language designers, but I think Microsoft used the "I" prefix for their COM interfaces as well.