How widespread are in-house DSLs?
A student asked me this question, and apart from saying that a large percentage of large organizations use in-house DSLs, I couldn't give any details, nor am I aware of any research. So if anyone has come across a survey or research report with useful (ideally current) information about DSL use, I'd be glad to hear about it.

Denotational semantics of a DSL?
I am really at a loss here... I was just trying to understand whether one can define, more or less formally, what a DSL is.

Is it the semantics of DSLs that differentiates them from general-purpose PLs? Or is it only pragmatics ("you can do the same in any GP language, but it's easier/faster/cheaper in this DSL")? In other words, can we define as DSLs those PLs whose semantic domains are, uh, domain-specific? By a semantic domain here I mean the range (or codomain) of a semantic function, whose domain is a syntactic domain... All these puns are not really intended :-(

After several unsuccessful attempts to pursue this path, I came to another "definition": DSLs are the PLs whose pragmatics massively dominate their semantics :-)

PS: I would be very interested in seeing any references to papers on the denotational semantics of some DSL.

Turing eXtender Language (TXL)

TXL has been mentioned briefly before on LtU. There is a recent paper in the LDTA 2004 proceedings about the motivations behind TXL, in which the language designers iteratively modified their grammar to suit the intuitive expectations of their users:

    Turing uses an asterisk (*) to denote the upper bound of a parameter array (as in array 1..* of int). Users therefore began to write s(3..*) to mean the substring from position 3 to the end of the string, s(1..*-1) to mean the substring from the first position to the second last, s(*-1..*) to mean the substring consisting of the last two characters, and so on. As these forms evolved, the language was modified to adapt to the users' expectations.

The approach above does sound fruitful: if we want to achieve higher programmer productivity, rapid iterative design of the tools we use has a real chance of making an impact, instead of waiting tens of years for a language to become a productive medium. Incidentally, TXL allows more than rapid prototyping of the Turing language itself. Here is an example of how one can override the Pascal grammar:
    % Trivial coalesced addition dialect of Pascal
    % Based on standard Pascal grammar
    include "Pascal.Grm"

    % Overrides to allow new statement forms
    redefine statement
        ...
      | [reference] += [expression]
    end redefine

    % Transform new forms to old
    rule main
        replace [statement]
            V [reference] += E [expression]
        by
            V := V + (E)
    end rule

The designers of TXL chose Lisp as the model for the underlying semantics; the language uses functional programming with full backtracking for both the parser and the transformer.

Links
You can see slides from the Links meeting here, and commentary and pictures here.
(Thanks to Ethan Aubin for already starting a thread under the former, and to Ehud Lamm for inviting me to guest blog.)
Ethan Aubin writes: So why do we need a new language? What cannot be accomplished with existing frameworks? There is a slide following this asking why you can't do this in Haskell or ML, but I don't know why they (or even Java/PHP/etc.) aren't enough.

Let me try to answer this. Links is aimed at doing certain specific things.

Is it a good enough argument? Is this enough of an advantage to get folk to move from PHP, Perl, or Python? Not clear. I suspect that if it is good enough, a major motivating factor is not going to be anything deep, but simply the fact that being able to write everything down in one language instead of three or four will make people's brains hurt less.

Ethan Aubin also writes: Wadler goes into the FP success stories: Kleisli, XDuce, PLT Scheme (continuations on the web), Erlang. If you take the benefits of these individually, you've got a language which solves the 3-tier problem better than what we have now, but I don't think it meets the criterion of "permitting its users to do something that cannot be done in any other way". So, I'd like to ask all the Perl/PHP/ASP/Pythonistas on LtU: what is the killer app that your language cannot handle?

I'd love to see answers to this question!

Premonoidal categories and notions of computation
I am currently working through Premonoidal categories and notions of computation.
Eugenio Moggi, in (Moggi 1991), advocated the use of monads, equivalently Kleisli triples, to model what he called notions of computation.

As my knowledge of monads and category theory is very limited, it's pretty tough... Does anybody have an opinion on the value of this paper and the importance of premonoidal categories to CS? Thanks!

Links Slides

The speakers at the Links meeting at ETAPS have posted slides from their talks. To me, Xavier Leroy's slides seem especially interesting, but there is something for everyone: transactions, XML, concurrency, types, object-orientation, etc.

JPred -- predicate dispatch for Java
JPred extends Java to include predicate dispatch.
This is quite cool -- predicate dispatch allows for the partial implementation of methods in abstract classes, so that the base class can provide common argument checking (nulls, negative values, etc.), as well as simplifying the maintenance of the jumbo if/case statements for dispatching based on the type of an object (e.g. event dispatch) or on field values.

As a language-design non-expert, predicate dispatch reminds me of template matching in XPath, which I have found incredibly useful for handling special cases like "Chapter 1 shouldn't have a blank page in front of it" without messing up a working template for the more general case. I'd expect that JPred will provide a similar capability.

JPred is implemented using Polyglot, and compiles JPred code into vanilla Java. Todd Millstein, the creator, used JPred to re-implement a complex chunk of event-handling code and eliminated several gaps in the original dispatch logic. I'm not able to appreciate the subtler parts of this work, I suspect, but the basic idea is simple enough that I could explain it to my 14-year-old son and he understood its value.

By meltsner at 2005-04-08 15:15 | LtU Forum
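To make the idea concrete without claiming anything about JPred's actual syntax (which attaches predicate clauses to Java methods and checks them statically), here is a toy sketch of predicate dispatch in Python: each implementation registers a guard predicate, and the first implementation whose guard accepts the arguments runs. All names here are illustrative, not part of JPred.

```python
# Toy predicate dispatch, an illustrative sketch of the idea behind JPred:
# implementations carry guard predicates; the first matching guard wins.

def predicated(fn):
    """Turn fn into a dispatcher with attachable guarded cases."""
    cases = []

    def dispatcher(*args):
        for pred, impl in cases:
            if pred(*args):
                return impl(*args)
        return fn(*args)  # fall back to the default body

    def when(pred):
        def register(impl):
            cases.append((pred, impl))
            return dispatcher
        return register

    dispatcher.when = when
    return dispatcher

@predicated
def area(shape):
    raise TypeError("no case matches %r" % (shape,))

# Dispatch on field values, not just on class -- the special case
# before the general one, like the XPath template example above:
@area.when(lambda s: s.get("kind") == "square")
def _square(s):
    return s["side"] ** 2

@area.when(lambda s: s.get("kind") == "rect" and s["w"] == s["h"])
def _degenerate_rect(s):
    return s["w"] ** 2

@area.when(lambda s: s.get("kind") == "rect")
def _rect(s):
    return s["w"] * s["h"]
```

So `area({"kind": "rect", "w": 2, "h": 3})` returns 6, while a rectangle with equal sides is caught by the more specific case first. Note that this sketch tries cases in registration order; the real JPred instead verifies statically that the predicates are exhaustive and unambiguous, which is the interesting part of the work.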
Why is erlang.org down

Does anybody know why the site is down? Also, where can I find active lists/forums on Erlang? Thanks!

Lisp-Stat does not seem to be in good health lately

The Journal of Statistical Software (http://www.jstatsoft.org/) has a special volume devoted to the topic: "Lisp-Stat, Past, Present and Future". In the world of statistics, it appears that XLISP-STAT (http://www.stat.uiowa.edu/~luke/xls/xlsinfo/xlsinfo.html) has lost out to the S family of languages, S / R / S-Plus:
    In fact, the S languages are not statistical per se; instead they provide an environment within which many classical and modern statistical techniques have been implemented.

An article giving an excellent overview of the special volume is "The Health of Lisp-Stat" (http://www.jstatsoft.org/v13/i10/v13i10.pdf). Some of the articles describe the declining user base of the language due to defections, whilst other articles describe active projects using XLisp-Stat, often leveraging the power of the language, in particular for producing dynamic graphics.

The S family of languages, originally developed at Bell Labs, has much to recommend it. S is an expression language with functional and class features. However, as the original creator and main developer of XLisp-Stat (and now R developer) Luke Tierney explains in "Some Notes on the Past and Future of Lisp-Stat" (http://www.jstatsoft.org/v13/i09/v13i09.pdf), ...
mechanics of designing and implementing a language

Hi all,

Anyway, I started out by writing out what my code might look like... but it seems I've been concentrating too much on syntax rather than semantics. Then I switched to drawing abstract syntax trees, but they only seem to capture specific programs. (I learned some of these PLT-related words in the past few days, so excuse their abuse.)

I'd like to code a small 'core', then extend the core as I learn new concepts (message-passing concurrency, lambda expressions, etc.). Developing a syntax and messing around with parsers seems, as I mentioned earlier, to be more of a distraction at such an early stage. How should I approach this? I've been reading books and papers and looking at features of existing languages... I'm not sure what's the most efficient way of actually starting to implement all these (often overwhelming) ideas. I'm not sure if it is relevant, but I've been skipping around the following books: ...

Finally, I'll likely use Java or C# for parsing, dealing with the AST, code generation, etc. Eventually I'd like to either have this language interpreted by Parrot or have it compiled... by first translating it to C. Thanks!
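One common answer to the "small core" question above is to skip concrete syntax entirely: represent the AST as plain data and write an evaluator over it, then grow the language one construct at a time. The poster mentions Java or C#; the sketch below uses Python purely for brevity, and every name in it is illustrative rather than prescribed.

```python
# A minimal language core: ASTs as nested tuples, an evaluator as the
# semantic function mapping syntax (plus an environment) to values.
# Extending the language means adding one more case to `evaluate`.

def evaluate(expr, env):
    kind = expr[0]
    if kind == "lit":                      # ("lit", 3) -> constant
        return expr[1]
    if kind == "var":                      # ("var", "x") -> look up in env
        return env[expr[1]]
    if kind == "add":                      # ("add", e1, e2)
        return evaluate(expr[1], env) + evaluate(expr[2], env)
    if kind == "let":                      # ("let", name, bound, body)
        _, name, bound, body = expr
        return evaluate(body, {**env, name: evaluate(bound, env)})
    if kind == "lam":                      # ("lam", name, body) -> closure
        _, name, body = expr
        return lambda v: evaluate(body, {**env, name: v})
    if kind == "app":                      # ("app", fn, arg)
        return evaluate(expr[1], env)(evaluate(expr[2], env))
    raise ValueError("unknown form: %r" % (kind,))

# let x = 2 in x + 3
prog = ("let", "x", ("lit", 2), ("add", ("var", "x"), ("lit", 3)))
```

Here `evaluate(prog, {})` gives 5. The attraction of this shape is exactly what the poster is after: parsing is deferred until the semantics are settled, and each new concept (closures above, message passing later, and so on) is one more case in the evaluator rather than a grammar change.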