Fortress focuses on the needs of scientists

(via Lemonodor)

Guy Steele leads a small team of researchers in Burlington, Massachusetts, who are taking on an enormous challenge -- create a programming language better than Java... Think of it as Java for scientists, Java for the programmers of a peta-scale supercomputer.

You just have to love this quote from Steele: "I'm now not convinced that a single programming language can serve everyone's needs, because the needs are so diverse." Well, duh...

But obviously Steele knew this long ago. Anyway, Fortress seems to be concerned with notation ("with square root signs and exponents placed above the line"), in the hope that that's going to be enough to lure Fortran programmers. Time will tell.

As a potential end user...

I am probably one of the target users of such a language. I'm not into
supercomputing, but I do use a lot of maths and linear algebra in my
research. It's not uncommon for my code to take about a month to run
on a fast Xeon with lots of memory. Perhaps I should optimise it?

More seriously, I think that the main aim of the project---to improve
scientific programmer productivity by delivering fast numerics as well
as rapid software development---is important. I don't think that it's
possible to allow scientists to simply transcribe their equations into
highly sugared source code, because an equation describes intent,
which is usually very much divorced from a practical implementation.
Allowing scientists to work at a level of abstraction that is much
closer to their problem domain is obviously a good thing, but enabling
this in a meaningful way is going to be very difficult.

Allowing me to include a square root symbol in my code isn't going to
be good enough. Systems such as Mathematica already allow me
to do this [1]. Mathematica already does some of the run-time
optimisation that the article hinted at, for example it can
automatically choose from among several alternative implementations of
a particular function, depending upon the characteristics of the data
to be processed (e.g. the data might be a sparse matrix or a dense
symmetrical matrix, and it might alternate between these states during
the execution of an enclosing algorithm).
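
To make that kind of run-time selection concrete, here is a toy sketch in C. The kernels, the density check and the 10% threshold are all invented for illustration; this is not how Mathematica or any real numerics library actually does it, but it shows the idea of dispatching on a measured property of the data rather than on its declared type:

    #include <stddef.h>

    /* Plain dense kernel: y = A*x, with A stored row-major as an n-by-n array. */
    static void matvec_dense(size_t n, const double *a, const double *x, double *y) {
        for (size_t i = 0; i < n; i++) {
            double s = 0.0;
            for (size_t j = 0; j < n; j++)
                s += a[i * n + j] * x[j];
            y[i] = s;
        }
    }

    /* Stand-in "sparse" kernel: skips explicit zeros. A real library would switch
       to a compressed-sparse-row layout so the savings show up in memory traffic too. */
    static void matvec_sparse(size_t n, const double *a, const double *x, double *y) {
        for (size_t i = 0; i < n; i++) {
            double s = 0.0;
            for (size_t j = 0; j < n; j++)
                if (a[i * n + j] != 0.0)
                    s += a[i * n + j] * x[j];
            y[i] = s;
        }
    }

    /* Choose an implementation from the data itself; the threshold is arbitrary. */
    void matvec(size_t n, const double *a, const double *x, double *y) {
        size_t nonzeros = 0;
        for (size_t i = 0; i < n * n; i++)
            if (a[i] != 0.0)
                nonzeros++;
        if ((double)nonzeros < 0.1 * (double)(n * n))
            matvec_sparse(n, a, x, y);
        else
            matvec_dense(n, a, x, y);
    }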

Another important feature is error propagation, which I believe
Mathematica also has. This allows one to specify not only an input
value, but also the error associated with that value. These errors are
propagated through the operations, so that one can determine the error
associated with the end result. Most of the C/C++ research code that
I've seen doesn't do this. And this brings me onto the topic of the
end users.
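
For readers who haven't met it, first-order error propagation is simple enough to sketch in a few lines of C. The type and function names below are made up for illustration, and a real package would also track correlations between inputs, which this ignores:

    #include <math.h>
    #include <stdio.h>

    /* A measured value together with its (one-sigma) uncertainty. */
    typedef struct { double value; double err; } meas_t;

    /* For sums of independent quantities, absolute errors add in quadrature. */
    meas_t meas_add(meas_t a, meas_t b) {
        meas_t r = { a.value + b.value, sqrt(a.err * a.err + b.err * b.err) };
        return r;
    }

    /* For products, relative errors add in quadrature (to first order). */
    meas_t meas_mul(meas_t a, meas_t b) {
        double v = a.value * b.value;
        double rel = sqrt(pow(a.err / a.value, 2) + pow(b.err / b.value, 2));
        meas_t r = { v, fabs(v) * rel };
        return r;
    }

    int main(void) {
        meas_t x = { 10.0, 0.3 };            /* 10.0 +/- 0.3 */
        meas_t y = {  4.0, 0.1 };            /*  4.0 +/- 0.1 */
        meas_t area = meas_mul(x, y);
        printf("area = %.2f +/- %.2f\n", area.value, area.err);
        return 0;
    }

The point is that the uncertainty travels with the value through every operation, instead of being estimated by hand at the end.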

The "old-timers" in my research dept. won't touch software such as
Matlab or Mathematica and prefer to stick to C and C++, perceiving
them to be faster. The reality is that Matlab and Mathematica use
numerics libraries written by numerics experts, which are likely to be
not only faster, but of higher quality than ad hoc
implementations. The research students who study under the
"old-timers" are forced to use C or C++, and so it continues.

In order for new tools to be adopted, I believe that it will be
necessary to deliver not only much faster numerics performance, but to
allow the researchers who use the tools to obtain publishable results
more quickly. If a researcher gains a noticeable competitive
advantage, then the tools will be used. If not, they won't.

[1] I use Matlab over Mathematica, because I need very fast numerics
rather than symbolic manipulation, but I believe there is little
between them now in terms of number-crunching.

[Edit] I should probably point out that bottlenecks
can emerge from the use of general purpose numerical
libraries, and so C, C++ or Fortran can be used to
exploit a priori knowledge of the problem to yield
more efficient solutions. But, as always, profiling
is the way to discover when this is sensible.

enormous challenge

"taking on an enormous challenge -- create a programming language better than Java..."

I laughed when I read that. I think it would probably be much more challenging to make a language that is *worse* than Java.

worse than Java

Notice that the article was published by Sun. ;-)

Ha ha

Yeah, I also laughed. I thought "somebody should tell them that C++ has already been invented!"

Sisal?

The only serious attempt to design a Fortran replacement for supercomputing that I know of is Sisal (not counting newer Fortran versions or C++ hackery^Wlibraries), and it did not have the success its designers hoped for.

Does anyone else have experience with Sisal? I can understand that getting conservative HPC programmers to take the step from Fortran to a functional language with strict semantics was a futile task, especially if it did not have the large body of efficient Fortran libraries from the beginning.

Links to ...

Sisal is connected to declarative parallelism, array languages, etc. If you haven't already looked at these threads, they may be informative:

Since the Cell's sidekick CPUs don't do out-of-order execution, array languages and declarative parallelism will likely perform better than thread-based parallelism on that architecture.

--Shae Erisson - ScannedInAvian.com

ANSI C (1999)?

Quite a few folks, it seems, think that Fortran's eventual replacement is already here: C. Specifically, the recent revision of the ANSI standard, which equips C with a few things it has long lacked: an intrinsic complex number type (not a class/struct, as the C++ library provides) and "restrict"-qualified pointers, which the compiler may assume are unaliased when optimizing. One reason Fortran has long been able to outperform C is that C compilers cannot perform certain pointer optimizations across function calls, because the function might refer to the object/array through an alias.
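
For anyone who hasn't looked at C99 yet, here is a minimal sketch of the two features in question; the axpy routine is a made-up example rather than anything from a real library:

    #include <stddef.h>
    #include <stdio.h>
    #include <complex.h>

    /* "restrict" promises that, within this function, a and b never alias,
       so the compiler is free to reorder and vectorize the loop. */
    void axpy(size_t n, double alpha,
              const double * restrict a, double * restrict b) {
        for (size_t i = 0; i < n; i++)
            b[i] += alpha * a[i];
    }

    int main(void) {
        /* The intrinsic complex type gives infix arithmetic with no
           operator overloading and no library class. */
        double complex z = 3.0 + 4.0 * I;
        double complex w = z * conj(z);      /* |z|^2, as a complex value */
        printf("|z|^2 = %g\n", creal(w));

        double a[4] = { 1.0, 2.0, 3.0, 4.0 };
        double b[4] = { 0.0 };
        axpy(4, 2.0, a, b);
        printf("b[3] = %g\n", b[3]);         /* 0 + 2*4 = 8 */
        return 0;
    }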

Progress? Probably not. "restrict" is probably useful--it's an inherently unsafe construct (as the compiler cannot guarantee that a restricted pointer doesn't have aliases), but one that is not necessary for most programs (and which can be abolished by local coding standards as necessary, and likely will not be noticed by programmers who don't need it). I'm not sure what the intrinsic complex type gets you over and above the C++ way of doing it (with a class), other than the ability to use infix operators on complex numbers in C without introducing operator overloading to the language.

There still seems to be a lot of bias against C++ in the numerical community--but for (alleged) reasons of performance, rather than the low-level and unsafe nature of the language. For the most part, it's my experience that on mature compiler suites, such reasons are groundless (assuming one disables exceptions in the C++ case). It might be the case that certain high-performance compilers used for computationally intensive numerics still have better C backends than C++ backends (or no C++ support at all).

[Note. Got the year of the standard wrong initially; 1998 is the year the first ANSI C++ standard was released. Yet another off-by-one bug.. :) ]

Good Reasons

The numerical computing community has had good reasons to be skeptical about high-performance numerical computing in C++: the best available C++ libraries for the purpose use expression templates and operator overloading, and virtually all C++ compilers even today, with the exception of the KAI/Intel compiler, fail to optimize away the resulting temporary objects, even when they have no virtuals and can fit entirely in registers (the optimization sometimes called "small-object optimization" or "replacement of aggregates with scalars"). In particular, no version of GCC has ever performed such an optimization. However, GCC 4.0 introduces a new SSA-based IR that provides it. I expect the impact on libraries such as Blitz++ and POOMA to be dramatic.

C is not a programming language ---

it is an insult. Are you seriously suggesting that people use a badly designed macro assembler, one that lacks practically everything that supports good and error-free programming and ignores practically all of the advances of computer language design of the last 40 years or so, as a replacement for this language?

Beware of trolls...

Let's try to avoid another holy war...

J

Has anyone tried J? It's not in the category of high-performance computing, but definitely worth a look.