LtU Forum

mechanics of designing and implementing a language

Hi all,

Anyway, I started out by writing out what my code might look like... but it seems I've been concentrating too much on syntax rather than the semantics. Then I switched to drawing abstract syntax trees, but they only seem to capture specific programs. (I learned some of these PLT-related words in the past few days, so excuse their abuse.)

I'd like to code a small 'core', then extend that core as I learn new concepts (message-passing concurrency, lambda expressions, etc.). Developing a syntax and messing around with parsers (as I mentioned earlier) seems to be more of a distraction at such an early stage. How should I approach this? I've been reading books and papers and looking at features of existing languages... I'm not sure what the most efficient way is of actually starting to implement all these (often overwhelming) ideas. I'm not sure if it is relevant, but I've been skipping around the following books:

Finally, I'll likely use Java or C# for parsing, dealing with the AST, code generation, etc. Eventually I'd like to either have this language interpreted by Parrot or have it compiled... by first translating it to C. Thanks!

Fun: HaWiki Quotes Page

I just want to point out the HaWiki Quotes Page to those who haven't seen it before. Most of the quotes are from the #haskell IRC channel on FreeNode, and most are chosen for their humor value. I imagine that everyone here, no matter where you stand on FP or Haskell, will find some entertaining quotes (and anyway many of them are only tenuously related to Haskell at best). Lest you get the idea that this is some roundabout Haskell advocacy, while there are quotes that reflect well on Haskell, there are also quotes like this scathing indictment:

I'm assuming people here know who Simon Peyton Jones is. :)

expressivity of "idiomatic C++"

The April issue of C/C++ Users Journal has an article called A New Solution To an Old Problem by Andrew Koenig and Barbara E. Moo. In it, they revisit the Hamming numbers problem from Dijkstra's A Discipline of Programming. They examine four different solutions:
The Haskell solution is the following:

scale n (x:xs) = (n * x) : (scale n xs)

merge xs [] = xs
merge [] ys = ys
merge (x:xs) (y:ys) =
    if x == y then
        x : (merge xs ys)
    else if x < y then
        x : (merge xs (y:ys))
    else
        y : (merge (x:xs) ys)

seq = 1 : (merge (scale 2 seq)
           (merge (scale 3 seq) (scale 5 seq)))
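(As printed, that last definition collides with the Prelude's own seq, so GHC will reject the unqualified uses as ambiguous. One way to try the stream out, using my own renaming rather than anything from the article, is to call it something else and take a prefix:)

-- Same stream as above, renamed so it does not collide with Prelude.seq.
hamming :: [Integer]
hamming = 1 : merge (scale 2 hamming)
                    (merge (scale 3 hamming) (scale 5 hamming))

-- ghci> take 10 hamming
-- [1,2,3,4,5,6,8,9,10,12]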
Their "idiomatic C++" solution uses ordered sets: set<int> seq; seq.insert(1); set<int>::const_iterator it = seq.begin(); int val = *it; seq.insert(val * 2); seq.insert(val * 3); seq.insert(val * 5); it++; In conclusion, they have this to say (emphasis mine),
I may be reading too much into this quote, but it sounds to me like Koenig and Moo consider it a bad thing to require a "totally different way of thinking about programming".

P.S. While googling for Hamming numbers, I came across this related paper: Expressivity of Functional-Logic Languages and Their Implementation by Juan José Moreno Navarro.

programmatic nature of subatomic makeup

OK, so I can't say I have much depth of understanding of subatomic makeup. However, the more I learn about programming and design, the more I see it mimicking what nature has been doing for aeons. Well-designed code, anyway. This leads me to reason that not only should biology be an analogue for programming, but it is likely that the physical world itself has characteristics that we are now convergently using in our code.

Take a 3D virtual-reality world... if it were possible (i.e., no limit on processing power or distributed processing), then the best way to code that world would be to build the world from atoms, putting all the info needed for physical behaviour etc. into each one of those atoms. OOPing at the lowest level. At least, that's how I would see the 'best' way to build that world.

We have seen that our newly developed methods of programming mimic what biology has achieved and found to be a 'good' general design strategy... so IMO it stands to reason that physics has done the same thing: that each atom has the physical constants and equations 'hardwired' into its subatomic makeup. From there it comes down to interaction between these particles and the cumulative result of that distributed processing... Any thoughts?

New Dan Friedman book coming up??
The Reasoned Schemer sounds like the title of a cool new book in the works, doesn't it?
A Scheme incarnation of the much-anticipated The Little Haskellist, perhaps?

Proper tail reflection?
I was reading some papers on reflective towers, and it occurred to me that levels in the towers look suspiciously similar to recursive calls (well, it was actually openly proclaimed by the authors, so probably "occurred" is not the right word).
For example, from A Tutorial on Behavioral Reflection and its Implementation: "In the same way well-defined recursions never require an infinite number of recursive calls, a well-defined reflective program never uses an infinite number of embedded reflective procedure calls."

What really occurred to me is: why not give reflective levels some kind of TCO? After a minute's joy, I decided to consult Google and found this, Intensions and Extensions in a Reflective Tower: "The key points obtained here are: a formal relation between the semantic domains of each level; a formal identification of reification and reflection; the visualisation of intensional snapshots of a tower of interpreters; a formal justification and a generalization of Brown's meta-continuation; a (structural) denotational semantics for a compositional subset of the model; the distinction between making continuations jumpy and pushy; the discovery of the tail-reflection property; and a Scheme implementation of a properly tail-reflective and single-threaded reflective tower."

So, on the negative side, somebody (actually, Danvy) discovered this long ago; on the positive side, I can now read it :)

Speed and semantics in CTM Chap. 1

Because it seems highly regarded among LtU members, I've started reading CTM. Already Chapter 1 has provoked some thoughts/questions. In Section 1.7 (Complexity), the authors introduce an optimization which, in simplified form, is to use L=expr instead of {Fn expr expr}. This is advertised as changing the enclosing algorithm from complexity 2^n to n^2. I have no doubt that in practice, i.e. using their implementation of the language (and many (all?) implementations of similar languages), this speedup holds. But my question is, what in the semantics of the language makes their speedup claim true? At this point in the book their language lacks side effects, so, in theory, couldn't a compiler take advantage of referential transparency and "hoist" (is that the right word? it seems like it is usually applied to loops) the evaluation of expr outside the function call, i.e. do the same optimization as the one suggested? In general, what (if anything) does the semantics of the language allow you to conclude about complexity? Perhaps an operational semantics allows you to make such conclusions, but I really don't see how a denotational one does. And, if an operational semantics allows you to make complexity conclusions, does this mean that an optimizing compiler could violate semantics (even if such a "violation" were welcome because it resulted in a speedup)?
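(To make the shape of that optimization concrete, here is a tiny sketch of my own, in Haskell rather than Oz and with a linear rather than quadratic fast case. The two definitions denote the same function, yet naming the shared recursive call changes the operational cost; a compiler doing common-subexpression elimination could in principle turn one into the other, but compilers are generally conservative about it.)

-- Recomputes the recursive call twice: on the order of 2^n calls.
slow :: Int -> Integer
slow 0 = 1
slow n = slow (n - 1) + slow (n - 1)

-- Binds the shared result to a name once (the L = expr idea): about n calls.
fast :: Int -> Integer
fast 0 = 1
fast n = let r = fast (n - 1) in r + r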
Script Vs Java

We have a product that comes with its own proprietary scripting language. This scripting language runs inside a JVM and can call out to Java classes. Like most scripting languages, this script is not object oriented. I am concerned about using this scripting language for heavy programming. The scripting code can only be stored in a database, and it is hard for multiple developers to work on it simultaneously. I am afraid of facing significant maintenance issues in the future. Before it becomes too late, I am thinking of writing the code in Java and calling it from this scripting language. I am interested in comparing the advantages/disadvantages of using a proprietary scripting language against using Java. Please give me your thoughts so I can make a better decision. I really appreciate your input. Thanks.

Best Common Lisp IDE?

As I am starting to learn Common Lisp, it would be great to do so using an easy-to-use environment/IDE. What would you recommend? Thanks.

Perl6 - PUGS

Well, there doesn't seem to be a plethora of Perl fans here (I hear Python this and Ruby that...) but I thought I'd mention Pugs. Pugs is an implementation of Perl6, written in Haskell. While, as mentioned in other recent articles, some languages are moving away from powerful abstractions, Perl continues to suck them in. One of the more interesting things they are putting into Perl6 is junctions (which I like to think of as quantum superpositions). Things like that, along with the formalization of the object system and other cleanups, should make Perl6 a language which gets mentioned here much more often. The Pugs implementation (in progress) is being written at breathtaking speed. Check it out and let's hear your thoughts!
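(As a rough illustration of what a junction buys you, here is a toy model in Haskell with names of my own invention, not anything from Pugs: an "any" junction holds several candidate values at once, and a predicate applied to it collapses them to a single answer, much like Perl6's if $x == any(1, 2, 3) {...}.)

-- Toy junction types: a bag of candidate values plus a collapse rule.
newtype AnyJ a = AnyJ [a]
newtype AllJ a = AllJ [a]

-- Applying a predicate collapses the junction to a single Bool.
satisfiesAny :: (a -> Bool) -> AnyJ a -> Bool
satisfiesAny p (AnyJ xs) = any p xs

satisfiesAll :: (a -> Bool) -> AllJ a -> Bool
satisfiesAll p (AllJ xs) = all p xs

-- Roughly Perl6's: $x == any(1, 2, 3)
example :: Bool
example = satisfiesAny (== 2) (AnyJ [1, 2, 3])   -- True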