Cilk, OpenMP, or what?

For SMP programming at the C level, Cilk's model is sweet and clean. The OpenMP model is "industrial-strength", but complex and error-prone. Does anyone have experience with both? What, if anything, can be done (in parallel) in OpenMP but not in Cilk? Cilk seems not to have had much recent development (indeed, the link to the ICFP Programming Contest they won is now dead). OpenMP, on the other hand, is now part of GCC. Is this a case of "doomed to success"?

Josh

Extensible Languages

I'm interested in some thoughts on issues related to syntactic extensions to programming languages. As mentioned, Felix now has DSSL (Domain Specific Sublanguage) support. The Felix grammar is now loaded dynamically from the library at run time. This looks like:

    syntax dssl1 {
      nt := sym1 sym2 .. =># "scheme code";
      ...
    }
    ..
    open syntax dssl1, dssl2;

Here every name used is taken as a nonterminal symbol. There is no error checking. A new GLR automaton is built where the open syntax statement is issued, and applied until the end of the containing scope; more precisely, the scope is determined directly by the grammar itself: state modifications to the parser effected in a reduction propagate to the end of the production parsing that nonterminal. Attempting to parse with an undefined nonterminal raises an exception.

Several DSSLs, including the current set, may already define nonterminals required by some other DSSL, and there is no way to specify such dependencies. The same DSSL can be loaded twice; no attempt is made to stop that, the grammar is just extended again. My syntax doesn't allow for handling merges: the parser engine takes the most recent rule and uses that, so an ambiguous parse of a nonterminal will be silently repaired.

Clearly, we'd like DSSLs to be modular and well typed, with the usual kinds of properties we expect from libraries and module constructs. How would you suggest fixing these problems to achieve those goals?
Whilst this question is aimed at programmers, I have one for the category theorists too. In a language like OCaml, an interesting technique termed open recursion can be used to extend types and functions covariantly and in a modular fashion. An otherwise recursive construction is made non-recursive by replacing inner calls with a parameter, forming an "open" version of it; a second construction then defines the desired closure by applying the symbol to itself (yes, I know that's a crude description). The parameter also allows covariant extension: several such open constructions can be composed, and then closed with respect to the composition.

To me there is an obvious and unexplored extension of this technique to grammars. We make a production non-recursive by replacing a recursion with a parameter, so now we have a new notion: a nonterminal parametrised by another nonterminal, and we form a closed term by applying the nonterminal to itself. If we add new productions to the nonterminal, we can form the closure of that too, obtaining a new language. An extension of this idea is to parametrise a whole DSSL (grammar package) with other DSSLs, much like an OCaml functor, and again form a closed language by self-application.

It seems to me a nonterminal is really just a type, and alternate productions are really constructors, so the analogy with open recursion for types is strong, and it would seem the user actions would similarly need to be open recursive. Is there any theoretical work on this idea that suggests how a statically typed grammar with open recursion would be typechecked? Or on the modular extension? Is there even a model of it?

To me this kind of thing is better than what I am doing now, which effectively adds productions to an existing nonterminal, making the old version unavailable. Although this is dynamically covariant, it is not pleasing, because it is a mutation rather than being purely functional and combinatorial.
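The "nonterminal = type, production = constructor" analogy above can be sketched concretely with OCaml polymorphic variants, which are the standard way to get open recursion over data types. This is a minimal illustration, not anything from Felix; all names here are made up for the example:

```ocaml
(* The "open" base grammar: the recursive occurrences of the nonterminal
   are replaced by the type parameter 'a. *)
type 'a base = [ `Num of int | `Add of 'a * 'a ]

(* The user action for the base productions, with the recursive call
   abstracted as the parameter [eval]. *)
let eval_base eval = function
  | `Num n -> n
  | `Add (a, b) -> eval a + eval b

(* A covariant extension: a new production is added without touching
   the base grammar or its action. *)
type 'a ext = [ 'a base | `Mul of 'a * 'a ]

let eval_ext eval = function
  | `Mul (a, b) -> eval a * eval b
  | #base as e -> eval_base eval e

(* Tie the knot: apply the extended "nonterminal" to itself.
   OCaml accepts the recursive type because it goes through a
   polymorphic variant. *)
let rec eval e = eval_ext eval e

let () =
  (* 1 + (2 * 3) *)
  assert (eval (`Add (`Num 1, `Mul (`Num 2, `Num 3))) = 7)
```

Note that `eval_base` remains usable on its own, so the old language is not made unavailable by the extension; closing over `ext` rather than `base` is a separate, purely functional step.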
Why Events Are A Bad Idea (for high-concurrency servers)
The authors take the example of a web server to show that, with compiler support for improved threads, they can achieve performance as good as or better than event-based applications in a high-concurrency context.

Multiscale Scheduling: Integrating Competitive and Cooperative Parallelism in Theory and in Practice
This proposed long-term research project (which, as far as I can tell, was not mentioned here before) is interesting on many levels. From a programming languages perspective, the main idea seems to be extending the NESL language, which is based on data parallelism, with concepts such as speculative parallelism that mesh well with multilevel scheduling.

Theorem proving support in programming language semantics

Yves Bertot, one of the coauthors of the Coq'Art book, has made available a tech report on Theorem proving support in programming language semantics, which describes how Coq can be used to support an exhaustive "views" approach to program semantics, using a toy imperative language from Glyn Winskel's book, The Formal Semantics of Programming Languages, and specifying a family of kinds of semantics for it, following Winskel's treatment. Bertot follows the work of Tobias Nipkow in Winskel is (almost) Right: Towards a Mechanized Semantics Textbook using Isabelle, but shows how Coq's support for reflection allows this treatment to be carried out in a lean, condensed manner. The 23-page paper covers Plotkin-style SOS semantics, Kahn's natural semantics, Dijkstra's weakest-precondition semantics, and a simple CPO denotational semantics that I assume is due to Winskel, and finally uses the denotational model as a basis for abstract interpretation.

The Pillars of Concurrency

Herb Sutter, The Pillars of Concurrency, Dr. Dobb's Journal, 2007-07-02
When I first read the abstract, I assumed that Sutter meant a mental model along the lines of the pi-calculus. But it turns out that what he was talking about has more to do with outlining the fundamental things for which concurrency is used, what requirements those things imply, and what kinds of solutions can be used to meet those requirements. Nothing startlingly new, although the overview Sutter provides is interesting.

From a language design perspective, it might be interesting to consider which of the "pillars" that Sutter identifies are supported by a given language, how well, and why the language developed that way. Many of the arguments about the "best" way to handle concurrency may simply boil down to differences in the application domains being considered. Also interesting to note is that much of this column was inspired by work by David Callahan on concurrency support for Visual Studio, so the things that Sutter discusses provide some insight into what Microsoft is thinking about for future development products.

Simply Easy! (An Implementation of a Dependently Typed Lambda Calculus)
[ANN] YARD 1.0: C++ Template Metaprogramming Parsing Framework

Three years after the initial announcement here at LtU of the YARD parsing framework, I've finally posted version 1.0 of YARD. Despite being self-promotional (I figure it is of forum quality, if not quite meritorious of front-page status), this post should still be relevant to LtU readers because of its overlap with other LtU topics:
Of course, many readers of LtU are language implementers and may find a new parsing framework useful or inspirational. Below is the official announcement of YARD: