In the past few months, I've been studying programming languages, compilers, and related topics. I'm surprised that the basic data structures used to implement various language constructs are given so little attention.
Many PL/compiler books mention tuples, records, trees, linked lists... but they generally don't describe the performance trade-offs between them.
Simple choices, such as using a basic linked list vs. a linked list that stores several keys per node, can greatly affect performance (and only one paper I've read mentions that most implementations actually use such a technique).
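To make that concrete, here is a minimal sketch of the technique I mean (an "unrolled" linked list that packs several keys per node, so a traversal touches one heap cell per several keys instead of one per key). The node capacity and key type are arbitrary choices of mine, not taken from any particular implementation:

    #include <stdio.h>
    #include <stdlib.h>

    #define KEYS_PER_NODE 8  /* arbitrary; real implementations tune this
                                to the cache-line size */

    /* A node stores up to KEYS_PER_NODE keys contiguously. */
    struct unode {
        int           count;                /* keys currently stored */
        int           keys[KEYS_PER_NODE];  /* packed, contiguous keys */
        struct unode *next;
    };

    /* Append a key, starting a new node only when the tail is full. */
    static struct unode *append(struct unode *tail, int key)
    {
        if (tail == NULL || tail->count == KEYS_PER_NODE) {
            struct unode *n = malloc(sizeof *n);
            n->count = 0;
            n->next = NULL;
            if (tail) tail->next = n;
            tail = n;
        }
        tail->keys[tail->count++] = key;
        return tail;
    }

    /* Sum all keys; the inner loop runs over contiguous memory. */
    static long sum(const struct unode *head)
    {
        long total = 0;
        for (; head; head = head->next)
            for (int i = 0; i < head->count; i++)
                total += head->keys[i];
        return total;
    }

    int main(void)
    {
        struct unode *head = append(NULL, 0), *tail = head;
        for (int k = 1; k < 100; k++)
            tail = append(tail, k);
        printf("%ld\n", sum(head));  /* prints 4950 */
        return 0;
    }

The point is that the inner loop of sum() is a linear scan, so cache misses occur per node rather than per element.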
Huge performance gains in languages such as Fortran and APL seem to come from data locality as well (although I can't be sure, since I didn't get any straight answers from the relevant Usenet newsgroups... or perhaps I didn't know how to phrase the question correctly).
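As a rough illustration of what I mean by locality (a toy example of my own, not taken from any Fortran or APL implementation): summing a flat, contiguous vector, the way those languages store arrays, is a linear sweep through memory, while chasing a cons-cell list touches scattered heap nodes.

    #include <stdio.h>
    #include <stdlib.h>

    #define N 1000000

    /* One heap cell per element: each traversal step may be a cache miss. */
    struct cell { double value; struct cell *next; };

    int main(void)
    {
        /* Contiguous layout, as a Fortran/APL-style vector is stored. */
        double *vec = malloc(N * sizeof *vec);
        struct cell *list = NULL;

        for (long i = 0; i < N; i++) {
            vec[i] = 1.0;
            struct cell *c = malloc(sizeof *c);  /* scattered heap nodes */
            c->value = 1.0;
            c->next = list;
            list = c;
        }

        double s1 = 0.0, s2 = 0.0;
        for (long i = 0; i < N; i++)                 /* sequential scan */
            s1 += vec[i];
        for (struct cell *c = list; c; c = c->next)  /* pointer chasing */
            s2 += c->value;

        printf("%f %f\n", s1, s2);  /* same sums, very different memory traffic */
        return 0;
    }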
Now I have collected plenty of books and papers that talk about lexing, parsing, type theory, functional vs. imperative, etc., but nothing that goes into detail about choosing the right set of fundamental data structures. For example: isn't it better to use a 'struct' to keep the attributes of an object together in memory (since they are likely to be used together), whereas the attributes of a large relation should live in a more 'linked' structure to allow quick addition/deletion of attributes (basically, adding relational algebra as a first-class component of my mini-language)? And if I have a list of a million records (tuples with named attributes), isn't it much better to store the name/position correspondence once, as part of the list definition, rather than a million times with each instance of the record?
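Here is a minimal sketch of that last idea (the name/position mapping stored once in a shared descriptor, with each record instance holding only its values). All the field names here are hypothetical, purely for illustration:

    #include <stdio.h>
    #include <string.h>

    /* Shared descriptor: the name/position mapping is stored once for the
       whole list, not once per record. */
    struct descriptor {
        int         nfields;
        const char *names[8];  /* field name at each slot index */
    };

    /* An instance carries only its values; its shape lives in the descriptor. */
    struct record {
        double values[8];
    };

    /* Look up a field by name via the shared descriptor. */
    static double get(const struct descriptor *d, const struct record *r,
                      const char *name)
    {
        for (int i = 0; i < d->nfields; i++)
            if (strcmp(d->names[i], name) == 0)
                return r->values[i];
        return 0.0;  /* unknown field; a real implementation would signal an error */
    }

    int main(void)
    {
        struct descriptor d = { 2, { "price", "qty" } };  /* stored once */
        struct record rs[3] = {        /* a million of these would share 'd' */
            { { 9.99, 2 } }, { { 4.50, 1 } }, { { 1.25, 10 } }
        };
        for (int i = 0; i < 3; i++)
            printf("%.2f x %.0f\n", get(&d, &rs[i], "price"),
                                    get(&d, &rs[i], "qty"));
        return 0;
    }

With this layout each record costs only its payload, instead of carrying its own copy of the field names.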
Data structures are, obviously, a separate subject, but surely they deserve more attention in compiler/PL books. Anyway, what do the experts on this forum think? Did I miss something obvious?