Comonadic Functional Attribute Evaluation. Tarmo Uustalu and Varmo Vene.
We have previously demonstrated that dataflow computation is comonadic. Here we argue that attribute evaluation has a lot in common with dataflow computation and admits a similar analysis. We claim that this yields a new, modular way to organize both attribute evaluation programs written directly in a functional language and attribute grammar processors.
(This is an extended abstract. I believe a longer version is here, but I haven't read it.)
We've previously seen The Essence of Dataflow Programming, and in one sense this is a follow-up. Applying comonads to dataflow programming was not uncritically accepted as being "essential," as I recall. For attribute grammars, this approach seems elegant enough from a casual read, but they don't offer a detailed comparison with prior work on attribute grammars, and I'm nowhere near familiar enough to judge for myself.
Whether or not this is a significant step forward in that sense, it certainly has some nice examples of comonads. There's also this claim:
This does not seem to have been mentioned in the literature, but the type constructor CxtTree E is a comonad (just as LS is; in fact, the same is true of all zipper type constructors).
(Emphasis mine.) This seems to be a relatively well-known fact about zippers, but perhaps not?
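For the curious, here is what the claim amounts to for the simplest zipper, the list zipper. This is a minimal Haskell sketch of my own, not code from the paper:

    import Data.List (unfoldr)

    -- A list zipper: a focused element with its context on either side.
    data Zipper a = Zipper [a] a [a]

    left, right :: Zipper a -> Maybe (Zipper a)
    left  (Zipper (l:ls) x rs) = Just (Zipper ls l (x:rs))
    left  _                    = Nothing
    right (Zipper ls x (r:rs)) = Just (Zipper (x:ls) r rs)
    right _                    = Nothing

    -- The comonad interface, inlined rather than taken from a library.
    class Comonad w where
      extract   :: w a -> a
      duplicate :: w a -> w (w a)

    instance Comonad Zipper where
      extract (Zipper _ x _) = x
      -- duplicate refocuses the zipper at every position, so each
      -- position can see its whole context: the shape of an attribute
      -- (and dataflow) computation.
      duplicate z = Zipper (moves left z) z (moves right z)
        where moves step = unfoldr (fmap (\z' -> (z', z')) . step)

Co-Kleisli arrows Zipper a -> b then read naturally as computations of a value at the focus from its surrounding context, which is just what an attribute evaluation rule does.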
I found an interesting new paper by Edwin Brady on Ivor, his library for embedding a type-theory-based theorem prover in Haskell applications.
Abstract. Dependent type theory has several practical applications in the fields of theorem proving, program verification and programming language design. Ivor is a Haskell library designed to allow easy extending and embedding of a type theory based theorem prover in a Haskell application. In this paper, I give an overview of the library and show how it can be used to implement formal systems such as propositional logic. Furthermore, I sketch an implementation of a simple functional programming language using the library; by using type theory as a core representation, we can construct and evaluate terms and prove correctness properties of those terms within the same framework, ensuring consistency of the implementation and the theorem prover.
Software Extension and Integration with Type Classes. Ralf Lämmel and Klaus Ostermann.
The abilities to extend a software module and to integrate a software module into an existing software system without changing existing source code are fundamental challenges in software engineering and programming-language design. We show that these challenges can be tackled, at the level of language expressiveness, by using the language concept of type classes, as it is available in the functional programming language Haskell. A detailed comparison with related work shows that type classes provide a powerful framework in which solutions to known software extension and integration problems can be provided. We also pinpoint several limitations of type classes in this context.
We've had a number of papers lately with solutions to the expression problem and related extensibility challenges, using various techniques in various languages. Here's one exploring the expressiveness of Haskell's type classes.
It's extremely instructive to compare different approaches to these now-standard problems, and in fact I wonder whether this would make an interesting approach to a programming languages survey course: In CS 3xx we explore and compare a number of programming languages through idiomatic solutions to standard software engineering challenges.
The authors are looking for comments on this draft for a few more days.
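To make the flavor concrete, here is a minimal sketch of one type-class-based attack on the expression problem. It's my illustration in the general spirit of the paper, not the paper's actual encoding: variants are separate data types, operations are classes, and both dimensions grow without edits to existing code.

    -- Initial variants and one operation.
    data Lit = Lit Int
    data Add a b = Add a b

    class Eval e where
      eval :: e -> Int

    instance Eval Lit where
      eval (Lit n) = n

    instance (Eval a, Eval b) => Eval (Add a b) where
      eval (Add x y) = eval x + eval y

    -- Extension 1: a new variant, with no changes to the code above.
    data Neg a = Neg a

    instance Eval a => Eval (Neg a) where
      eval (Neg x) = negate (eval x)

    -- Extension 2: a new operation over old and new variants alike.
    class Render e where
      render :: e -> String

    instance Render Lit where
      render (Lit n) = show n

    instance (Render a, Render b) => Render (Add a b) where
      render (Add x y) = render x ++ " + " ++ render y

    instance Render a => Render (Neg a) where
      render (Neg x) = "-" ++ render x

    -- eval (Add (Lit 1) (Neg (Lit 2)))  ==>  -1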
Generics as a Library. Oliveira, Hinze and Löh.
A generic function is a function that is defined on the structure of data types: with a single definition, we obtain a function that works for many data types. In contrast, an ad-hoc polymorphic function requires a separate implementation for each data type. Previous work by Hinze on lightweight generic programming has introduced techniques that allow the definition of generic functions directly in Haskell. A severe drawback of these approaches is that generic functions, once defined, cannot be extended with ad-hoc behaviour for new data types, precluding the design of a customizable generic programming library based on these techniques. In this paper, we present a revised version of Hinze's Generics for the masses approach that overcomes this limitation. Using our new technique, writing a customizable generic programming library in Haskell 98 is possible.
Pushing forward the state of the generics art in Haskell 98. They also discuss the application of their technique to the expression problem.
(Thanks to Jacques Carette for pointing in this direction.)
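As a reminder of what the "Generics for the masses" style looks like, here is a stripped-down sketch of the original, non-extensible encoding as I understand it; the paper's contribution is precisely a revision that lets such functions be extended with ad-hoc cases for new data types.

    -- A generic function is an instance of a class indexed by the
    -- structural cases of data types.
    class Generic g where
      unit :: g ()
      int  :: g Int
      plus :: g a -> g b -> g (Either a b)
      prod :: g a -> g b -> g (a, b)

    -- One generic function: count the Ints occurring in a structure.
    newtype Count a = Count { count :: a -> Int }

    instance Generic Count where
      unit     = Count (\() -> 0)
      int      = Count (\_  -> 1)
      plus a b = Count (either (count a) (count b))
      prod a b = Count (\(x, y) -> count a x + count b y)

    -- Data types are handled via structural representations, e.g.:
    countPair :: Count (Int, Int)
    countPair = prod int int
    -- count countPair (3, 4)  ==>  2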
From the "Whoa!" files:
Concoqtion: Mixing Indexed Types and Hindley-Milner Type Inference
This paper addresses the question of how to extend OCaml’s Hindley-Milner type system with types indexed by logical propositions and proofs of the Coq theorem prover, thereby providing an expressive and extensible mechanism for ensuring fine-grained program invariants. We propose adopting the approach used by Shao et al. for certified binaries. This approach maintains a phase distinction between the computational and logical languages, thereby limiting effects and non-termination to the computational language, and maintaining the decidability of the type system. The extension subsumes language features such as impredicative first-class (higher-rank) polymorphism and type operators, which are notoriously difficult to integrate with the Hindley-Milner style of type inference that is used in OCaml. We make the observation that these features can be more easily integrated with type inference if the inference algorithm is free to adapt the order in which it solves typing constraints to each program. To this end we define a novel "order-free" type inference algorithm. The key enabling technology is a graph representation of constraints and a constraint solver that performs Hindley-Milner inference with just three graph rewrite rules.
Another tough-to-categorize one: dependent types, the Curry-Howard Correspondence, logic programming, theorem provers as subsystems of compilers, implementation issues... it's all in here.
Update: A prototype implementation is available here, but it took a bit of Google-fu to find, and it's brand new, so be gentle.
Update II: The prototype implementation isn't buildable out of the box, and includes a complete copy of both the Coq and O'Caml distributions, presumably with patches etc. already applied. So it's clearly extremely early days yet. But this feels very timely to me, perhaps because I've just started using Coq within the past couple of weeks, and got my copy of Coq'Art and am enjoying it immensely.
Update III: It occurs to me that this might also relate to Vesa Karvonen's comment about type-indexed functions, which occurs in the thread on statically-typed capabilities, so there might be a connection between this front-page story and the front-page story on lightweight static capabilities. That thought makes me happy; I love it when concepts converge.
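If "types indexed by" things is unfamiliar, the standard toy example, in Haskell GADT form rather than Concoqtion's OCaml-plus-Coq, is a list indexed by its length. Concoqtion's indices are far richer (arbitrary Coq terms and proofs), but the flavor is the same. A hedged sketch of my own:

    {-# LANGUAGE GADTs, EmptyDataDecls #-}

    -- Type-level naturals as empty phantom types.
    data Z
    data S n

    -- A list indexed by its length.
    data Vec n a where
      VNil  :: Vec Z a
      VCons :: a -> Vec n a -> Vec (S n) a

    -- Total head: the type rules out the empty-vector case entirely.
    vhead :: Vec (S n) a -> a
    vhead (VCons x _) = x

    -- vhead VNil is not a runtime error but a type error.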
In the continuing story called Epigram, a new contribution by James McKinna and Joel Wright.
Conventional approaches to compiler correctness, type safety and type preservation have focused on off-line proofs, either on paper or formalised with a machine, of existing compilation schemes with respect to a reference operational semantics. This pearl shows how the use of dependent types in programming, illustrated here in Epigram, allows us not only to build in these properties, but to write programs which guarantee them by design and subsequent construction.
We focus here on a very simple expression language, compiled into tree-structured code for a simple stack machine. Our purpose here is not to claim any sophistication in the source language being modelled, but to show off the metalanguage as a tool for writing programs for which the type preservation and progress theorems are self-evident by construction, and finally, whose correctness can be proved directly in the system.
In this simple setting we achieve the following:
• a type-preserving evaluation semantics, which takes typed expressions to typed values.
• a compiler, which takes typed expressions to stack-safe intermediate code.
• an interpreter for compiled code, which takes stack-safe intermediate code to a big-step stack transition.
• a compiler correctness proof, described via a function whose type expresses the equational correctness property.
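The same construction can be approximated in Haskell with GADTs. Here is a sketch of my own (the paper's actual development is in Epigram and goes further, notably the correctness proof, which the sketch omits):

    {-# LANGUAGE GADTs, EmptyDataDecls #-}

    -- Object-language types and type-level stack shapes (phantom indices).
    data TInt
    data TBool
    data Nil
    data Cons t s

    -- Typed values and expressions: the index is the object-language type.
    data Val t where
      VInt  :: Int  -> Val TInt
      VBool :: Bool -> Val TBool

    data Expr t where
      Lit    :: Val t -> Expr t
      Add    :: Expr TInt -> Expr TInt -> Expr TInt
      IsZero :: Expr TInt -> Expr TBool

    -- Type-preserving evaluation: typed expressions to typed values.
    eval :: Expr t -> Val t
    eval (Lit v)    = v
    eval (Add a b)  = case (eval a, eval b) of
                        (VInt x, VInt y) -> VInt (x + y)
    eval (IsZero a) = case eval a of VInt x -> VBool (x == 0)

    -- Stack-safe code, indexed by stack shape before and after execution.
    data Code pre post where
      PUSH   :: Val t -> Code s (Cons t s)
      ADDI   :: Code (Cons TInt (Cons TInt s)) (Cons TInt s)
      ISZERO :: Code (Cons TInt s) (Cons TBool s)
      Then   :: Code s s' -> Code s' s'' -> Code s s''

    -- The compiler cannot emit stack-unsafe code: the indices forbid it.
    compile :: Expr t -> Code s (Cons t s)
    compile (Lit v)    = PUSH v
    compile (Add a b)  = (compile b `Then` compile a) `Then` ADDI
    compile (IsZero a) = compile a `Then` ISZERO

    -- Typed stacks and an interpreter for compiled code.
    data Stack s where
      Empty :: Stack Nil
      Push  :: Val t -> Stack s -> Stack (Cons t s)

    exec :: Code s s' -> Stack s -> Stack s'
    exec (PUSH v)    st = Push v st
    exec ADDI        (Push (VInt x) (Push (VInt y) st))
                        = Push (VInt (x + y)) st
    exec ISZERO      (Push (VInt x) st) = Push (VBool (x == 0)) st
    exec (Then c c') st = exec c' (exec c st)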
P.S. I found it difficult to classify this paper correctly. Neither functional nor type theory seems the proper category for a dependently typed programming language, which is probably a separate paradigm.
Interesting draft paper on the History of Haskell by Simon Peyton Jones, Phil Wadler, Paul Hudak, and John Hughes.
This paper describes the history of Haskell, including its genesis and principles, technical contributions, implementations and tools, and applications and impact.
This paper is aimed at the History of Programming Languages conference (HOPL-III), to be held in June 2007.
In 1978, the first History of Programming Language Conference (HOPL) described the development of 13 computer programming languages, the people who participated in that work, and the context in which it was undertaken. In 1993, HOPL-II contained 14 papers on the genesis and evolution of programming languages. It is time for HOPL-III, to be held with FCRC 2007 in San Diego. Each HOPL-III paper should detail the early history or evolution of a specific programming language. Preliminary ideas about each language should have been documented by 1996 and each language should have been in use by 1998.
Which leaves the question: which PLs should take part in HOPL-III?
(I guess I need to go back and remember which were documented in I & II.)
Registration is now open for the 9th Annual ICFP Programming Contest. The ICFP contest is an event that traditionally raises interest in the LtU community.
A more detailed announcement found in the forum mentions that this year's theme is "computational archaeolinguistics." Intriguing.
We are seeking comments on the final draft of our ICFP 2006 paper: Delimited dynamic binding, by Oleg Kiselyov, Chung-chieh Shan, and Amr Sabry.
Dynamic binding and delimited control are useful together in many settings, including Web applications, database cursors, and mobile code. We examine this pair of language features to show that the semantics of their interaction is ill-defined yet not expressive enough for these uses.

We solve this open and subtle problem. We formalise a typed language DB+DC that combines a calculus DB of dynamic binding and a calculus DC of delimited control. We argue from theoretical and practical points of view that its semantics should be based on delimited dynamic binding: capturing a delimited continuation closes over part of the dynamic environment, rather than all or none of it; reinstating the captured continuation supplements the dynamic environment, rather than replacing or inheriting it. We introduce a type- and reduction-preserving translation from DB+DC to DC, which proves that delimited control macro-expresses dynamic binding. We use this translation to implement DB+DC in Scheme, OCaml, and Haskell.

We extend DB+DC with mutable dynamic variables and a facility to obtain not only the latest binding of a dynamic variable but also older bindings. This facility provides for stack inspection and (more generally) folding over the execution context as an inductive data structure.

The paper comes with a large amount of accompanying code in Scheme, OCaml, SML, and Haskell. The code (described in the paper's appendix) uses realistic examples to show how the joint behavior of delimited continuations and dynamic binding is ill-defined in various PL systems, and solves the problem by a full-fledged implementation of dynamic binding in all these systems. Any comments or suggestions would be very welcome!
Since we've been discussing hybrid type checking, dependent types, etc. recently...
Sage
Sage is a prototype functional programming language designed to provide high-coverage checking of expressive program specifications (types). Sage allows a programmer to specify not only simple types such as "Integers" and "Strings" but also arbitrary refinements from simple ranges such as "Positive Integers" to data structures with complex invariants such as "Balanced binary search trees." In addition to featuring these predicates upon types, Sage merges the syntactic categories of types and terms, in the spirit of Pure Type Systems, to express dependent types such as that of the infamous printf function.
Sage performs hybrid type checking of these specifications, proving or refuting as much as possible statically, and inserting runtime checks otherwise. For the complete details of the theory and empirical results, we direct you to the technical report.
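The runtime half of that story is easy to visualize. Here is a deliberately crude Haskell sketch, entirely my own and nothing to do with Sage's actual implementation (which involves a theorem prover and a database of failed refutations), of what an inserted dynamic check amounts to:

    -- A value tagged as having passed a refinement check.
    newtype Refined a = Refined a

    -- When the checker cannot prove a refinement statically, it can
    -- compile in a cast like this, checking the predicate at run time.
    refine :: (a -> Bool) -> String -> a -> Refined a
    refine p what x
      | p x       = Refined x
      | otherwise = error ("refinement failed: " ++ what)

    -- "Positive Integers" as a runtime-checked refinement.
    positive :: Int -> Refined Int
    positive = refine (> 0) "positive Int"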