Exploring NLP in Oz/Mozart
started 9/22/2002; 2:16:21 AM - last post 9/24/2002; 11:19:54 PM
jon fernquest - Exploring NLP in Oz/Mozart
9/22/2002; 2:16:21 AM (reads: 1999, responses: 8)
Michael Vanier - Re: Exploring NLP in Oz/Mozart
9/22/2002; 6:21:41 PM (reads: 1197, responses: 0)
I haven't read the NLP stuff, but I found a really great 40-page overview paper on Oz called "Logic programming in the context of multiparadigm programming: the Oz experience" by Van Roy et al., which describes the language in quite a lot of detail. Unfortunately, I can't for the life of me find a link to the paper :-( The language looks very interesting, although the syntax has some silly warts (why don't people just use S-expressions as the base syntax?).
Isaac Gouy - Re: Exploring NLP in Oz/Mozart
9/22/2002; 7:29:50 PM (reads: 1171, responses: 0)
jon fernquest - Re: Exploring NLP in Oz/Mozart
9/23/2002; 12:06:38 AM (reads: 1148, responses: 0)
> The language looks very interesting, although the syntax
> has some silly warts (why don't people just use S-expressions
> as the base syntax?).
I had similar thoughts: why can't they accomplish the same thing with extensions to already existing languages like Scheme? There is, in fact, a constraint programming library for Scheme that I found that allows you to do much of the same things Oz does in all their cool examples.
I guess it is Scheme's *untyped nature* that allows it to add so many different features, including lazy evaluation and several different flavors of typing.
So many books of the late 1980s had working examples in more strongly typed Pascal, which in the long run didn't prove as malleable as Scheme.
"S-expressions as base syntax" also allows straightforward transformation to the significant-indentation syntax of Python that so many find readable. (Paul Graham advocates this for Arc.)
James Hague - Re: Exploring NLP in Oz/Mozart
9/23/2002; 7:50:39 AM (reads: 1153, responses: 0)
"Concepts, Techniques, and Models of Computer Programming" is an excellent text. It may prompt me to finally learn Oz properly. Until now, I've kept hitting a psychological barrier: Haskell, Erlang, and Python seem so clean and straightforward to me, but Oz has repeatedly set off the "This is Ugly!" light.
Paul Snively - Re: Exploring NLP in Oz/Mozart
9/23/2002; 10:58:58 AM (reads: 1141, responses: 0)
Michael Vanier: The language looks very interesting, although the syntax has some silly warts (why don't people just use S-expressions as the base syntax?)
Because the Lisp lesson is that this guarantees lossage of adoption, and I say this as a raving Lisp aficionado.
jon fernquest: I had similar thoughts: why can't they accomplish the same thing with extensions to already existing languages like Scheme? There is, in fact, a constraint programming library for Scheme that allows you to do much of the same things Oz does in all their cool examples.
I have to suggest, with all due respect, that you haven't looked that closely at Oz yet. Oz is multiparadigm, covering the constraint, distributed, concurrent, logic, functional, and imperative paradigms in an integrated, coherent fashion. While there's no doubt that you could implement, e.g., the Oz kernel language in Scheme, probably relying on call/cc to simulate threads, one of the points of Oz is precisely to provide a virtual machine that implements the kernel, and its touchpoints to the OS for things like networking, efficiently. Scheme doesn't even have a standard means of providing access to TCP/IP. There's also the question of shipping code around: Oz features transportable bytecodes. You'd need to do something equivalent for Scheme, perhaps leveraging an existing bytecoded Scheme implementation. I suspect that by the time you got done doing everything Oz requires, you'd find you'd have been better off writing a new virtual machine anyway.
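To make the thread-simulation idea concrete, here's a rough sketch using Python generators as a stand-in for Scheme continuations: each coroutine is a suspended computation, and a round-robin scheduler resumes it where it last yielded. (The names and scheduler here are purely illustrative, not from Oz or any Scheme implementation.)

```python
from collections import deque

# Cooperative "threads": each generator is a resumable computation,
# analogous to a continuation captured at the last yield point.
def scheduler(tasks):
    queue = deque(tasks)
    log = []
    while queue:
        task = queue.popleft()
        try:
            log.append(next(task))  # run the task until it yields
            queue.append(task)      # requeue it: round-robin scheduling
        except StopIteration:
            pass                    # task finished; drop it
    return log

def worker(name, steps):
    for i in range(steps):
        yield f"{name}:{i}"         # yield = voluntarily give up the CPU

print(scheduler([worker("a", 2), worker("b", 2)]))
# ['a:0', 'b:0', 'a:1', 'b:1']
```

The interleaved output shows the two "threads" alternating, which is roughly what a call/cc-based scheduler in Scheme would give you, minus the VM-level efficiency Paul is pointing at.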
jon: I guess it is Scheme's *untyped nature* that allows it to add so many different features including lazy evaluation and several different flavors of typing.
Scheme isn't lazily evaluated, but of course you can get most of the way there by using delay and force with macros. And of course you can express various type systems with Scheme, but I don't know that its own latent typing helps with this.
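For reference, the delay/force trick amounts to a memoized thunk; a minimal sketch in Python (class and function names here are illustrative, not any Scheme implementation's API):

```python
# A "promise" wraps an unevaluated expression (a thunk) and caches the
# result, so the suspended computation runs at most once -- the essence
# of Scheme's delay/force.
class Promise:
    def __init__(self, thunk):
        self._thunk = thunk
        self._done = False
        self._value = None

    def force(self):
        if not self._done:
            self._value = self._thunk()
            self._done = True
            self._thunk = None  # drop the closure once evaluated
        return self._value

def delay(thunk):
    return Promise(thunk)

p = delay(lambda: 6 * 7)  # nothing is computed yet
print(p.force())          # 42 (computed now)
print(p.force())          # 42 (cached, not recomputed)
```

In Scheme proper, `delay` is a macro so the programmer doesn't have to write the lambda by hand; that's the part Python can't hide.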
It seems to me that you're closest to your intended mark in observing that both Scheme and Oz can be considered "kernel languages," where a very small, simple kernel is used to derive the rest of the language. Scheme's is essentially an interpreter for the Lambda Calculus with considerable syntactic sugar and some fundamental types. Oz's is a concurrent-constraint kernel extended with logic variables. So already we see differing foundations even at the kernel level, and these differences might not matter in theory (Turing equivalence, and all that), but appear to matter a great deal pragmatically (although I've not undertaken the exercise of reimplementing Oz's semantics in Scheme).
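A rough sketch of what an Oz-style single-assignment logic variable buys you, in Python with OS threads (the class and names are illustrative, not Oz's actual API, and the sketch ignores some races a real implementation would handle):

```python
import threading

# An Oz-style dataflow variable: readers block until some thread binds
# it, and binding it twice is an error (single assignment).
class DataflowVar:
    def __init__(self):
        self._event = threading.Event()
        self._value = None

    def bind(self, value):
        if self._event.is_set():
            raise RuntimeError("variable is already bound")
        self._value = value
        self._event.set()

    def wait(self):
        self._event.wait()   # block until bound
        return self._value

x = DataflowVar()
results = []
reader = threading.Thread(target=lambda: results.append(x.wait() + 1))
reader.start()               # reader blocks: x is unbound
x.bind(41)                   # binding x unblocks the reader
reader.join()
print(results[0])            # 42
```

This implicit synchronize-on-read behavior is what the Oz kernel provides pervasively; bolting it onto Scheme is possible, which is rather the point of the disagreement above.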
jon fernquest - Re: Exploring NLP in Oz/Mozart
9/24/2002; 1:08:14 AM (reads: 1111, responses: 0)
>> jon: I guess it is Scheme's *untyped nature* that
>> allows it to add so
>> many different features including lazy evaluation
>> and several different flavors of typing.
>Scheme isn't lazily evaluated...
I am not talking about the *defined language Scheme*; I'm talking about extensions to the language. There are several instances of lazy interpreters written in Scheme; Similix has an efficient one. (SICP, EOPL, and LiSP all have toy versions.) There is also the Spineless Tagless G-machine, written in C, behind Haskell compilers, but I specifically mention Scheme because its first-class continuations make it a good candidate for implementing different evaluation strategies.
Several implementations of Scheme do have TCP/IP (PLT Scheme) and mobile bytecode (Tubes). PLT unit modules provide mutually recursive glue to build the system with.
I'm talking about sharing definitions and implementation code (Scheme or C) in the manner of SLIB and the Scheme SRFIs. Sharing in this manner keeps the threads of ideas implicit in code written in the language going for longer. (There are a lot of almost-lost ideas in Lisp.)
No matter how well integrated Oz's features are in the kernel, I doubt that this requires it all to be redesigned from scratch. Perhaps there are good reasons why a multi-paradigm programming system like Oz/Mozart can only be created as a monolithic, take-it-or-leave-it system, but creating such a system *severely limits its audience*. What are the odds that significant numbers of people are going to go off on a tangent, learn it, and create significant systems in it? Very low. Currently, I don't see much at all happening. If you start using a monolithic system like Oz/Mozart, you're also more likely to be locking yourself into something that won't be around tomorrow.
The tradition of Scheme meta-circular interpreters also makes it crystal clear what the language is doing under the hood, something that can't be said for Haskell for instance (I'm struggling with the Hugs type extensions behind modular monadic semantics right now).
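In that meta-circular spirit, here is a toy evaluator small enough to read in one sitting, with Python standing in for Scheme (the representation of expressions as nested lists of strings is an assumption for illustration, not any particular textbook's code):

```python
# A toy Scheme-style evaluator: variables are strings, literals are ints,
# and compound forms are nested lists. Just enough to see "under the
# hood" how lambda, application, and variable lookup work.
def evaluate(expr, env):
    if isinstance(expr, str):              # variable reference
        return env[expr]
    if isinstance(expr, int):              # self-evaluating literal
        return expr
    op, *args = expr
    if op == "lambda":                     # ["lambda", [params], body]
        params, body = args
        return lambda *vals: evaluate(
            body, {**env, **dict(zip(params, vals))})
    if op == "if":                         # ["if", test, conseq, alt]
        test, conseq, alt = args
        return evaluate(conseq if evaluate(test, env) else alt, env)
    fn = evaluate(op, env)                 # application
    return fn(*[evaluate(a, env) for a in args])

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
square = evaluate(["lambda", ["x"], ["*", "x", "x"]], env)
print(square(7))  # 49
```

The whole semantics fits on a screen, which is exactly the "crystal clear" property being claimed for the Scheme tradition.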
I appreciate your decomposition of what Oz and Scheme are and are not. If you know a good paper that describes the kernel in greater detail I'd like to read it.
You're right that I don't know a lot about Oz. I'd like to know more.
Syntax:
Someone has already remarked that the syntax is ugly; most people don't find the parentheses of Lisp very readable either, whereas the "significant indentation" syntax of Python is usually considered very readable.
There are straightforward *one-to-one* mappings between s-expressions and Python-style significant indentation, implying that you can easily provide alternative views of the code, the s-expression view being more of an Abstract Syntax Tree view. (This is the idea behind the Perspectives Project.)
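One direction of such a mapping is easy to sketch in Python: render each list's head on its own line with its children indented beneath it. (The rendering convention here is my own assumption for illustration, not the Perspectives Project's actual format.)

```python
# Render a nested s-expression (lists of strings) as significant
# indentation: the head of each list gets a line, children are indented.
def sexp_to_indent(sexp, depth=0):
    pad = "    " * depth
    if isinstance(sexp, list):
        head, *rest = sexp
        lines = [pad + str(head)]
        for child in rest:
            lines.extend(sexp_to_indent(child, depth + 1))
        return lines
    return [pad + str(sexp)]              # an atom is a single line

tree = ["define", ["square", "x"], ["*", "x", "x"]]
print("\n".join(sexp_to_indent(tree)))
```

Because indentation depth encodes nesting depth exactly, the inverse parse is equally mechanical, which is what makes the two views interchangeable; the one wrinkle is distinguishing an atom from a list whose head has no siblings, which a real mapping would need a convention for.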
Frank Atanassow - Re: Exploring NLP in Oz/Mozart
9/24/2002; 3:59:14 AM (reads: 1109, responses: 0)
"Concepts, Techniques, and Models of Computer Programming" is an excellent text.
I've been dipping through it, and I agree: it looks like a great book, reminiscent of SICP (a fact which the authors acknowledge).
Good examples too, as Jon mentioned.
Paul Snively - Re: Exploring NLP in Oz/Mozart
9/24/2002; 11:19:54 PM (reads: 1075, responses: 0)
Jon, thanks for the cogent reply; it most definitely clarifies your intent from your original post. While you make some excellent points, it's amusing to me to read about Oz "limiting its audience" vs. a collection of non-standard extensions to Scheme, no matter how nice SLIB and the SRFI mechanisms might be.
With that said, I do think that there are some very interesting parallels between Scheme and Oz, with, again, the most significant one being a firm grounding in a simple "kernel language." You'll find a brief description of Oz's kernel language at <http://www.mozart-oz.org/documentation/tutorial/node1.html>, Section 1.2. Of course, the book that everyone's raving about (see <http://www.info.ucl.ac.be/~pvr>) goes into considerable detail about Oz's kernel language, and in fact contradicts my assertion that Scheme is also a kernel language, apparently drawing a distinction between theoretical foundations such as the lambda or pi calculi and "a small number of programmer-relevant concepts." Personally, I find such a distinction meaningless at best and wrong at worst: if anything, I prefer the theoretical foundation for reasons of possible treatment via proof-theoretic means, and therefore I wonder about the possibility of a "validated" Oz implementation.
Nevertheless, I'm enjoying Oz, and I think the book, the system itself, and the NLP materials that began this thread do an excellent job of justifying the consortium's judgment in creating yet another language vs. attempting to extend any of the admittedly fine existing languages to be equivalent. Of course, your mileage may vary.