LtU Forum

LLVM 1.3 is released

This is possibly interesting for language implementors out there: LLVM 1.3 is now officially released.

-Chris

Cω vs. Erlang?

I know there was a Cω topic before, but that discussion really had more to do with Xen (XML and DB integration) than with the Polyphonic C# concurrency part. I'm an Erlang newbie myself, but I can already see that asynchronous communication is the common thread between the two languages, and that this approach is going to be big: it can be used for everything from embedded systems to business process orchestration.

Looking at the Santa Claus sample, the Cω code seems quite compact, and the syntax extension to C# is minimal. However, process/thread creation is implicit, which makes the code a little harder to reason about. I would imagine an Erlang solution would be clearer, but perhaps longer.

Any thoughts/pointers on the pros and cons of different concurrency approaches? We really should have a "great concurrency shootout".
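
To make the "asynchronous communication" point concrete, here is a minimal sketch in Haskell (my own illustration, in neither Erlang nor Cω) of the mailbox style both languages share: sends return immediately, and a separate worker consumes messages as they arrive.

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Concurrent.Chan (newChan, readChan, writeChan)
import Control.Monad (forever)

main :: IO ()
main = do
  mailbox <- newChan
  -- The worker plays the role of an Erlang process (or a Cω async
  -- method body): it blocks on its mailbox and handles each message.
  _ <- forkIO . forever $ do
    msg <- readChan mailbox
    putStrLn ("worker got: " ++ msg)
  -- Sends are asynchronous: writeChan returns immediately.
  mapM_ (writeChan mailbox) ["ho", "ho", "ho"]
  threadDelay 100000 -- crude: let the worker drain the mailbox before exit
```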

Type Theoretical Foundations for Data Structures, Classes, and Objects

Via MetaPRL

Type Theoretical Foundations for Data Structures, Classes, and Objects

We will show that records and objects can be defined in a powerful enough type theory. We will also show how to use these type constructors to define abstract data structures.
and
Here we extend the constructive type theory with a simpler type constructor, dependent intersection: the intersection of two types, where the second type may depend on elements of the first one (not to be confused with the intersection of a family of types). This new type constructor allows us to define dependent records in a very simple way.

Dependent intersection looks worthy of attention for many reasons, e.g. as a simpler alternative to very dependent functions. I especially liked how it simplifies the treatment of coincident labels (multiple inheritance, anyone? :).
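
Concretely (my paraphrase of the construction, not the paper's exact notation): an element inhabits the dependent intersection when it inhabits the first type and, viewed as such, also inhabits the second; dependent records then fall out as intersections of single-field record types.

```latex
% Membership in a dependent intersection (sketch):
\[
  a \in (x\!:\!A) \cap B[x] \iff a \in A \;\wedge\; a \in B[a]
\]
% A two-field dependent record as an intersection of one-field records,
% the second field's type referring to the first through the element itself:
\[
  \{\, l_1 : A;\; l_2 : B(l_1) \,\} \;\triangleq\;
  (r : \{ l_1 : A \}) \cap \{ l_2 : B(r.l_1) \}
\]
```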

Why compiler optimizations are interesting

    Compiler optimizations are interesting because they automatically improve the efficiency of programs. Hand-optimization by a programmer is a time consuming and notoriously error-prone activity, so it is of the utmost importance for the compiler to make correct optimizations automatically wherever possible. Compiler optimization also serves as an important link between theory and practice: many optimizations are made by proving certain properties of programs -- for instance, that an array index cannot be out of bounds or that a data structure is uniquely referenced -- and the relevant proof techniques are a valuable area of research in themselves.

    Ensuring that whole programs can be effectively optimized does impose certain design constraints on the compiler and on the programming language. The benefits of automatic optimization do, however, far outweigh these relatively minor restrictions.

To me that is a reasonable perspective and I think it's widely held. My own real perspective is quite different, but it's nothing novel and can probably be summarised using catch-phrases:

    Adequate performance really comes from algorithms, data structures, profilers, experience; the compiler itself is the program that most needs to be simple and fast; whole-program analysis is the enemy of interactive programming; the means seem to justify the ends for many optimizations presented at conferences; I must trust my own experience more than microbenchmarks and Socratic discourses on modern computer architecture.
Of course this has absolutely nothing to do with type systems.

New Paul Graham thing...

The Python Paradox


I mostly agree with the thrust of this piece, I think, but here's the most interesting bit (to me, at least):

Both languages [Python and Perl] are of course moving targets. But they share, along with Ruby (and Icon, and Joy, and J, and Lisp, and Smalltalk) the fact that they're created by, and used by, people who really care about programming. And those tend to be the ones who do it well.
It's interesting that all of those languages are dynamic and thus favored by Paul Graham. Does he really think that, e.g., Haskell and OCaml are created by people who don't "really care about programming"? Or is this just a cheap shot? Or are those languages simply off his radar?

LPFML, Xml language for linear programming

Sometimes you just stumble across a project that you have had in the back of your mind for a while: an XML standard for representing linear programming problem and solution instances.

Now I just have to find the time to write a solver.
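
For context (standard textbook material, not taken from the LPFML spec itself): a linear programming "problem instance" is just the data c, A, b of a program in standard form, so such a format essentially has to serialize one sparse matrix and two vectors.

```latex
\[
  \min_{x}\; c^{\mathsf{T}} x
  \quad \text{subject to} \quad
  A x \le b, \qquad x \ge 0
\]
```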

Call-by-what?

I know that this was discussed on LtU already, but as the "Why Types" discussion demonstrated, people can always come up with new ideas on old topics.

From a theoretical point of view, call-by-value and call-by-name are dual.

Does this mean that, in practice, PLs should support just one of them and get "two for the price of one"?

Or does usability demand supporting them both?

Is this decision similar to supporting just And-Not as opposed to the full set of (redundant) logical operations?

Does the decision depend on the type system, or is it fully orthogonal to it?
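
A small Haskell sketch of why one might want both in practice (my own example, not from the thread): Haskell's default call-by-need evaluation behaves like call-by-name for this purpose, while a bang pattern recovers call-by-value strictness, so a single language can expose both.

```haskell
{-# LANGUAGE BangPatterns #-}

-- Call-by-name-ish: the second argument is evaluated only if used.
constByName :: a -> b -> a
constByName x _ = x

-- Call-by-value-ish: the bang pattern forces the argument first.
constByValue :: a -> b -> a
constByValue x !_ = x

main :: IO ()
main = do
  -- Fine under lazy evaluation: 'undefined' is never forced.
  print (constByName 1 (undefined :: Int))
  -- Diverges if uncommented: the strict version forces 'undefined'.
  -- print (constByValue 1 (undefined :: Int))
```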

Constraint-Based Type Inference for Guarded Algebraic Data Types

Constraint-Based Type Inference for Guarded Algebraic Data Types

Quite a mouthful, but look:

Guarded algebraic data types, which subsume the concepts known in the literature as indexed types, guarded recursive datatype constructors, and phantom types, and are closely related to inductive types, have the distinguishing feature that, when typechecking a function defined by cases, every branch must be checked under different typing assumptions. This mechanism allows exploiting the presence of dynamic tests in the code to produce extra static type information.

I still have to grok this idea of "dynamic tests producing extra static type information" in its entirety, though.
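
A minimal Haskell sketch of that mechanism (my example; GHC's GADTs are the "guarded recursive datatype constructors" the abstract mentions): matching on a constructor is the dynamic test, and each branch is typechecked under the refined assumptions that test produces.

```haskell
{-# LANGUAGE GADTs #-}

-- A tiny typed expression language: the constructor used to build an
-- Expr records its result type.
data Expr a where
  IntLit  :: Int  -> Expr Int
  BoolLit :: Bool -> Expr Bool
  Add     :: Expr Int -> Expr Int -> Expr Int
  If      :: Expr Bool -> Expr a -> Expr a -> Expr a

-- In each branch, matching a constructor (a "dynamic test") tells the
-- typechecker what 'a' is, so every branch is checked under different
-- typing assumptions -- the mechanism the abstract describes.
eval :: Expr a -> a
eval (IntLit n)  = n                 -- here a ~ Int
eval (BoolLit b) = b                 -- here a ~ Bool
eval (Add x y)   = eval x + eval y   -- here a ~ Int
eval (If c t e)  = if eval c then eval t else eval e

main :: IO ()
main = print (eval (If (BoolLit True) (Add (IntLit 1) (IntLit 2)) (IntLit 0)))
```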

PLs and SE

it's never a language designer's job to define what's a sound software engineering practice. (on the Why Are Type Systems Interesting thread)

Setting aside the discussion of static vs. dynamic type checking, I must say that I find this statement rather puzzling. Aren't PL designers supposed to make SE easier? Isn't the best way to do this to evaluate and consider SE practices? Isn't that what happens in practice (e.g., mining patterns for language features, library building, etc.)?

When I created the software engineering department on LtU I wanted it to focus on language features and language design that are directly concerned with software engineering issues. But my view is perhaps even stronger than that: I think most PL issues are ultimately about SE.

Naturally, one approach is for the language to support whatever SE practices each programmer or team chooses. But isn't that simply an example of one attitude towards the best way to handle the complexity of SE?

I find it incredible that some would argue that PL designers should be agnostics when it comes to SE.

Why type systems are interesting - part II

The type systems thread is getting ridiculously long. Please continue the discussion here...
