
A question I have been meaning to ask.

The lambda calculus is a logical language, and so are its derivatives such as Lisp, Scheme, and other functional languages. As such, they could in theory be used to prove facts, as in languages like Prolog, but I have never seen this done.

Moreover, there is an intuitionistic logic (could this also be the more classic synthetic logic?) which could be quite useful for dealing with "real" dynamic or uncertain environments. Yet I am not aware of any practical tools for this.

Might not the logical interpretation of a function be clearer if functions were written as rules and had only a works/doesn't-work return value?
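To make that concrete, here is roughly what I have in mind, sketched in C (purely illustrative; the function and names are my own): a membership test that returns only works/doesn't work, so it reads like the logic-programming rule member(X, Xs).

    /* member(x, xs) holds iff x occurs in xs[0..n-1].
     * The return value is only "works" (1) or "doesn't work" (0). */
    #include <stdio.h>

    int member(int x, const int *xs, int n)
    {
        for (int i = 0; i < n; i++)
            if (xs[i] == x)
                return 1;    /* works: a witness was found */
        return 0;            /* doesn't work: no derivation exists */
    }

    int main(void)
    {
        int xs[] = { 2, 3, 5, 7 };
        printf("member(5) -> %d\n", member(5, xs, 4));
        printf("member(4) -> %d\n", member(4, xs, 4));
        return 0;
    }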

Just wondering
Hank

What will Apple's move to Intel mean for Alternative Language Communities?

In light of yesterday's big news from Apple, I am left wondering what will become of the many fine language implementations we have seen emerging under OS X.

Will the transition present problems for PLT Scheme, Haskell, Frontier, J, Croquet, FScript, etc.?

Will the various VM engines and self-hosted native code generation capabilities survive with a few minor tweaks and a simple recompile, or are they predicated on the PowerPC architecture itself, forcing language designers back to the blackboard?

Finally, from a purely technical perspective vis-a-vis the chip sets in question, was this move a stroke of genius, a case of 'worse is better', a bad idea, or an overall wash?

data locality and data structures

In the past few months, I've been studying programming languages, compilers, etc. I'm surprised that the basic data structures used to implement various language constructs are not given much attention.


Many PL/compiler books mention tuples, records, trees, linked lists, and so on, but don't generally describe their performance trade-offs.


Simple choices, such as using a linked list that stores several keys per node instead of a basic one-key-per-node list, can greatly improve performance (and only one paper I've read mentions that most implementations do use such a technique).
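To illustrate what I mean (this is only a sketch with made-up names, not code from any of those papers): an "unrolled" list node that holds a small array of keys, so a traversal touches far fewer separately allocated nodes.

    /* Unrolled linked list sketch: each node stores several keys
     * contiguously instead of one key per heap allocation. */
    #include <stdio.h>
    #include <stdlib.h>

    #define CHUNK 8                    /* keys per node */

    struct unode {
        int keys[CHUNK];               /* keys stored contiguously */
        int count;                     /* slots in use */
        struct unode *next;
    };

    /* Prepend a key, starting a new node only when the head is full. */
    static struct unode *upush(struct unode *head, int key)
    {
        if (head == NULL || head->count == CHUNK) {
            struct unode *n = malloc(sizeof *n);
            n->count = 0;
            n->next = head;
            head = n;
        }
        head->keys[head->count++] = key;
        return head;
    }

    int main(void)
    {
        struct unode *list = NULL;
        long sum = 0;

        for (int i = 1; i <= 100; i++)
            list = upush(list, i);

        for (struct unode *n = list; n != NULL; n = n->next)
            for (int i = 0; i < n->count; i++)
                sum += n->keys[i];

        printf("sum = %ld\n", sum);    /* expect 5050 */
        return 0;
    }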


Huge performance gains in languages such as Fortran and APL seem to come from data locality as well (although I can't be sure, since I didn't get any straight answers from the relevant Usenet newsgroups... or I didn't know how to phrase the question correctly).
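A rough way to see the effect (a throwaway benchmark of my own; timings are machine-dependent): sum a million integers stored contiguously, then the same values scattered through a one-value-per-node linked list.

    /* Locality sketch: contiguous array vs. pointer-chased list.
     * Cleanup is omitted for brevity. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    struct cell { int value; struct cell *next; };

    int main(void)
    {
        enum { N = 1000000 };

        int *arr = malloc(N * sizeof *arr);   /* contiguous storage */
        if (!arr) return 1;
        struct cell *head = NULL;             /* one allocation per value */

        for (int i = 0; i < N; i++) {
            arr[i] = i;
            struct cell *c = malloc(sizeof *c);
            c->value = i;
            c->next = head;
            head = c;
        }

        clock_t t0 = clock();
        long long sum_a = 0;
        for (int i = 0; i < N; i++)
            sum_a += arr[i];

        clock_t t1 = clock();
        long long sum_l = 0;
        for (struct cell *c = head; c != NULL; c = c->next)
            sum_l += c->value;
        clock_t t2 = clock();

        printf("array: sum %lld in %ld ticks\n", sum_a, (long)(t1 - t0));
        printf("list:  sum %lld in %ld ticks\n", sum_l, (long)(t2 - t1));
        return 0;
    }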


Now I have collected enough books and papers that talk about lexing, parsing, type theory, functional vs. imperative, and so on, but nothing that goes into detail about choosing the right set of fundamental data structures.

For example: isn't it better to use a 'struct' to keep the attributes of an object together in memory (since they are likely to be used together), whereas the attributes of a large relation should be more like a 'linked' data structure, to allow quick addition and deletion of attributes (basically adding relational algebra as a first-class component of my mini-language)? And if I have a list of a million records (tuples with named attributes), isn't it much better to store the name/position correspondence once, as part of the list definition, rather than a million times, with each instance of the record?
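Here is the kind of layout difference I am asking about, sketched in C (the type and field names are just illustrative): one version keeps the schema in the type definition, the other repeats the attribute names inside every record.

    /* Schema-once vs. schema-per-record for a million (x, y, label) rows. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Schema stored once: field names/offsets live in the type. */
    struct point_rec {
        double x, y;
        char   label[16];
    };

    /* Schema repeated per record: every attribute carries its own name. */
    struct attr      { const char *name; double value; };
    struct loose_rec { struct attr attrs[2]; char label[16]; };

    int main(void)
    {
        enum { N = 1000000 };

        struct point_rec *fixed = malloc(N * sizeof *fixed);
        struct loose_rec *loose = malloc(N * sizeof *loose);
        if (!fixed || !loose) return 1;

        for (int i = 0; i < N; i++) {
            fixed[i] = (struct point_rec){ .x = i, .y = 2.0 * i };
            strcpy(fixed[i].label, "pt");

            loose[i].attrs[0] = (struct attr){ "x", (double)i };
            loose[i].attrs[1] = (struct attr){ "y", 2.0 * i };
            strcpy(loose[i].label, "pt");
        }

        printf("fixed record: %zu bytes, loose record: %zu bytes\n",
               sizeof(struct point_rec), sizeof(struct loose_rec));

        free(fixed);
        free(loose);
        return 0;
    }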


Data structures are, obviously, a separate subject, but surely they deserve more attention in compiler/PL books. Anyway, what do the experts on this forum think? Did I miss something obvious?

Reusing XML Processing Code in non-XML Applications

I'd like to introduce an article which might be of some interest:

Reusing XML Processing Code in non-XML Applications

[abstract]

XML can be considered a representation of hierarchical data, and the XML-related standards as methods for processing such data. We describe the benefits of an XML view of legacy data and its processing, and suggest a method for developing XML tools and making them reusable for different tree-like structures in different programming languages.

Our approach is to use virtual machine technology, in particular the Scheme programming language. We take the unusual step of using Scheme syntax itself as a native virtual machine language. Together with the SXML format and some tuning of Scheme implementations, this gives us the XML virtual machine (XML VM).

Reference implementations are ready for the Python and C languages. We describe a library for XSLT-like transformations of Python parse trees and a special version of the GNU find utility that supports XPath queries over the file system.

[/abstract]
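To give a flavor of what an "XML view" of non-XML data means, here is a toy sketch (my own, not code from the article): one directory level of the file system printed in an SXML-ish form, so tree-oriented tools could be pointed at it.

    /* Print one directory level as an SXML-style nested list (POSIX). */
    #include <stdio.h>
    #include <dirent.h>

    int main(int argc, char **argv)
    {
        const char *path = argc > 1 ? argv[1] : ".";
        DIR *d = opendir(path);
        if (!d) { perror("opendir"); return 1; }

        printf("(dir (@ (name \"%s\"))\n", path);
        struct dirent *e;
        while ((e = readdir(d)) != NULL)
            printf("  (entry (@ (name \"%s\")))\n", e->d_name);
        printf(")\n");

        closedir(d);
        return 0;
    }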

The article still needs some rework. Unfortunately, I don't know when I'll find time for it, so I am publishing it as is.