J and K: One User's View
started 12/16/2001; 8:42:37 AM - last post 12/17/2001; 2:05:45 PM
Ehud Lamm - J and K: One User's View
12/16/2001; 8:42:37 AM (reads: 707, responses: 5)
J and K: One User's View
J and K share some common heritage: both are, at least in some sense, reconsiderations of APL. The languages thus share in the space that might, loosely, be called array processing languages, since both are built around highly efficient operations on whole arrays.
Talk about syntax...
Anyone honest enough to say that these languages are ugly should be given a hearing.
The weak point of this note is that it lacks code examples. These can be found by going to the languages' homepages (links are given at the bottom of the page).
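Since the note itself lacks examples, here is a loose sketch of the array-oriented style, written in plain Python as a stand-in for J or K: arithmetic in those languages applies elementwise to whole arrays, so the explicit loop below is exactly what they let you omit. (The J phrase in the comment is approximate.)

```python
# A loose illustration of the array-language style, in plain Python.

prices = [1.5, 2.0, 3.25]
quantities = [10, 4, 2]

# Loop style, as in most mainstream languages:
total = 0.0
for p, q in zip(prices, quantities):
    total += p * q

# Array style: express the whole computation as operations on arrays.
# In J this is roughly  +/ prices * quantities
total_array_style = sum(p * q for p, q in zip(prices, quantities))

assert total == total_array_style == 29.5
```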
Posted to general by Ehud Lamm on 12/16/01; 8:43:11 AM
Frank Atanassow - Re: J and K: One User's View
12/16/2001; 12:29:03 PM (reads: 730, responses: 0)
Anyone honest enough to say that these languages are ugly should be given a hearing.
I am reminded of the famous quote by Edsger Dijkstra: "APL is a mistake, carried through to perfection."
This apparently originated with his memo, How do we tell truths that might hurt?, which I happily found online. It sure isn't reading for the faint of heart, though! Even I wouldn't dare to be so heavy-handed. And you all know what a bastard I am. :)
Chris Rathman - Re: J and K: One User's View
12/16/2001; 1:27:21 PM (reads: 741, responses: 0)
I suppose if arrays are your thing, the APL family is the way to go. I've always considered it odd that most computer languages give up when it comes to string processing (they embed a parser for a sublanguage hidden inside strings, aka regular expressions). They also give up when it comes to database manipulation, preferring an embedded SQL language disguised as strings. Yet most languages are not willing to give up on array processing, preferring built-in mechanisms to handle arrays. I've never liked instructions for another language embedded as strings: strings acting as regular expressions or SQL commands are non-testable and tend to promote throwaway code. They are treated as data by the enclosing program, but they are definitely part and parcel of the program itself.

What I'd like is a seamless integration of modules and functions written in differing program notations. Instead of embedding a regular expression in a string, make a call to a function written in Icon. Instead of embedding SQL strings in function calls, call a procedure written in a syntax that resembles SQL.
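One minimal sketch of this idea in Python (the combinator names `lit`, `seq`, and `alt` are hypothetical, not any real library's API): instead of burying a pattern inside an opaque string, build it from ordinary functions, which can then be unit-tested and composed like any other code.

```python
# Hypothetical combinator-style matchers: patterns are plain
# functions from (sequence, position) to the new position on a
# match, or None on failure.

def lit(c):
    """Match the single element c."""
    def m(s, i):
        return i + 1 if i < len(s) and s[i] == c else None
    return m

def seq(*ms):
    """Match each sub-pattern in order."""
    def m(s, i):
        for sub in ms:
            i = sub(s, i)
            if i is None:
                return None
        return i
    return m

def alt(a, b):
    """Match either alternative."""
    def m(s, i):
        r = a(s, i)
        return r if r is not None else b(s, i)
    return m

# The regex "ab|ac" as a first-class value, not a string:
pattern = alt(seq(lit("a"), lit("b")), seq(lit("a"), lit("c")))
assert pattern("ac", 0) == 2
assert pattern("ax", 0) is None
```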
Ok, I'm off the beaten path, but it's really a shame that it's an all-or-nothing proposition. I'd love to be able to write certain parts of my program in APL or J. (I'm not familiar enough with K; even though the article above is informative, I still don't have a good feel for the language, but then I often don't feel comfortable with a language until I've had a chance to bang on it.) When it comes to other aspects of programming (such as string handling), these languages seem fairly useless; might as well program in XSL as do string handling in APL/J.
I think the problem with analyzing the syntax of these languages is that they are simultaneously very good and very bad. They are very good for certain classes of problems and certain datatypes, producing more readable code within those limited confines. Being a specialized form of syntax, however, they are nasty when you step outside that area.
From what I gather, J and K are mostly semantic twists on APL. The syntax of both differs from APL, but for the most part they keep its spirit. Which does bring up the question of whether syntax and semantics are tied together: if syntax is as easy to change as your clothes, why can't we come up with a skin for this class of languages that makes them more palatable?
Frank Atanassow - Regular expressions and FP
12/16/2001; 4:15:37 PM (reads: 727, responses: 1)
Instead of embedding a regular expression in a string, make a call to a function written in Icon.
You are talking about an alternative syntax for writing parsers, but I would like to say something about the semantics of regular expressions, that is, about regular sets. It will take me a while to get to my point, but try to bear with me.
It came as somewhat of a surprise to me, but regular sets popped up indirectly in my study of models for functional programming languages. If you decompose an FP language into linear and non-linear parts, "linear" in the sense of linear logic, then you find that the non-linearity is modeled by a comonad. You can think of this comonad as a functor which takes a type to the type of free comonoids on that type, where a comonoid on type T is basically an infinite stream over the alphabet whose symbols are all values of type T. (That's how you get non-linearity: you can produce many instances of the values.)
These comonoids are commutative, which means that the elements of the stream can commute with each other, and that is what you need for an FP language, because the free variable environment of a term is not ordered. But, it is not hard to imagine weakening the system so that the streams are ordered.
This is a bare-bones FP language, so it is intuitionistic. In classical linear logic, though, there is also a monad, which controls the non-linearity of continuations of functional terms: it is a functor which sends a type T to the set of multisets ("bags") on T. If you weaken this by removing the commutativity, you are left with the free monoid construction: the set of strings over alphabet T. Taken together with the coproduct (for alternation), you now have precisely a model of regular expressions.
So, my point is that there is a model of regular expressions buried in the semantics of functional programming languages, which should be available in a non-commutative linear FP language which has first-class continuations. This is the sort of language one might use as a core for an ML-like language, i.e., an FP language with imperative effects. I have been wondering if it is worth exploiting this. It has some unusual features, though. For example, there are two kinds of lambda-abstraction, since environments are ordered; in terms of regular expressions, they correspond to putting conditional constraints on the expression which say things like, "if ever A was matched then B matches". These sorts of regular expressions are also in another sense more general than the ones found in Perl: they are not restricted to strings of characters only, but strings of arbitrary types of elements. For example, if you had defined an abstract datatype of XML elements, you would get regular expression-matching on XML documents for free.
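The last point above, that such regular expressions match strings of arbitrary element types rather than only characters, can be sketched concretely. The following is a hypothetical Python illustration of that payoff alone, not of the linear-logic construction itself: the usual sequencing and repetition combinators, parameterized by a predicate on elements, matching over a stream of XML-ish tag tokens.

```python
# Hypothetical sketch: regular-expression matching where the
# "alphabet" is arbitrary Python values (here, tag tokens),
# not characters. Matchers map (sequence, position) to the new
# position on success, or None on failure.

def sym(pred):
    """Match one element satisfying pred."""
    return lambda s, i: i + 1 if i < len(s) and pred(s[i]) else None

def seq(a, b):
    """Match a, then b."""
    return lambda s, i: (lambda j: b(s, j) if j is not None else None)(a(s, i))

def star(a):
    """Match a zero or more times (greedy)."""
    def m(s, i):
        while True:
            j = a(s, i)
            if j is None or j == i:
                return i
            i = j
    return m

# A toy "XML" document as a list of tag tokens:
doc = [("open", "ul"), ("open", "li"), ("close", "li"),
       ("open", "li"), ("close", "li"), ("close", "ul")]

item = seq(sym(lambda t: t == ("open", "li")),
           sym(lambda t: t == ("close", "li")))

# <ul> ( <li></li> )* </ul>  -- matched over tokens, not characters:
pattern = seq(sym(lambda t: t == ("open", "ul")),
              seq(star(item), sym(lambda t: t == ("close", "ul"))))

assert pattern(doc, 0) == len(doc)
```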
Noel Welsh - Re: Regular expressions and FP
12/17/2001; 1:10:32 AM (reads: 731, responses: 0)
Frank, do you have any references for intuitionistic and classical linear logic? I'd like to learn what they're about.
Chris' point about syntax is well put. I think it addresses the same issues the little-languages guys and the macro guys have been looking at. The problem with syntax is that you have to learn it, and knowledge of one syntax often isn't transferable to another, whereas knowledge of one language often is transferable to another. Now, I couldn't mention macros without mentioning Scheme, so I'll just note that the regularity of s-expressions helps make user-defined syntax easier to grok. However, it would be really cool to have a sort of 'Intellisense' feature in Scheme programming environments that gives you help with macros.
Frank Atanassow - Re: J and K: One User's View
12/17/2001; 2:05:45 PM (reads: 660, responses: 0)
A good introduction to intuitionistic linear logic may be Philip Wadler's paper "A taste of linear logic". The paper is available from his linear logic papers page. It's very accessible, but you should be aware that the type system he uses (which has its own merits) is not the one with which ILL is usually presented.
On the other hand, you can go straight to the source and get the paper by Jean-Yves Girard, LL's inventor, Linear logic: its syntax and semantics. This one treats classical linear logic and is, remarkably for Girard, also quite readable. It is, however, less practical.
The most practical (in the sense of, "will be used for real", not "immediately understandable") presentation of ILL that I know of is Dual Intuitionistic Linear Logic, which is due to Andrew Barber. There is another version of this paper with Gordon Plotkin, but I can't seem to find it. Barber's thesis also deals with this subject in more depth, along with applications to Milner's action calculi. Ritter and de Paiva did some additional work on DILL for their xSLAM project, which treats the "single-reference" problem.
If you do a web search, you can find many other papers on LL, but many of them are pretty technical. (Especially the ones written by other Frenchmen. :)
Maybe it is also worth mentioning that Girard has a new thing going, called "ludics". In fact, he has written almost a whole book about it. I don't know quite what to make of it, but I've only read the beginning. It is, um, very, how shall I put it... avant-garde. But hey, Girard is a genius, so we have to give him some slack.