Lambda Calculus

Parameterized Notions of Computation
Parameterized Notions of Computation, Robert Atkey, JFP 2008.
Once you've programmed with monads for a while, it's pretty common to start defining parameterized families of monads. For example, we might define a family of type constructors for IO, in which the program type additionally tracks which files the computation reads from and writes to. This is a very convenient programming pattern, but the theory of it is honestly a little sketchy: on what basis do we conclude that the indices we define actually track what we intend them to? And furthermore, why can we believe that (say) the monadic equational laws still apply?

That's the question Atkey lays out a nice solution to. He gives a nice categorical semantics for indexed, effectful computations, and then cooks up lambda calculi whose equational theory corresponds to the equations his semantics justifies. The application to delimited continuations is quite nice, and the type theories can also give a little insight into the basics of how something like Hoare Type Theory works (which uses parameterized monads, with a very sophisticated language of parameters).

On a slightly tangential note, this also raises in my mind a methodological point. Over the last n years, we've seen many people identify certain type constructors whose usage is pervasive, and greatly simplified with some syntactic extensions: monads, comonads, applicative functors, arrows, and so on. It's not credible to suggest that we have exhausted the list of interesting types, and so together they constitute a good argument for some kind of language extension mechanism, such as macros. However, all these examples also raise the bar for when a macro is a good idea, because what makes them compelling is precisely that the right syntax yields an interesting and pretty equational theory in the extended language.

By neelk at 2009-02-11 21:40 | Category Theory | Lambda Calculus | Semantics | Type Theory | 16 comments | other blogs | 12841 reads
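For readers who haven't seen the pattern Atkey is formalizing, here is a minimal Haskell sketch of a parameterized monad whose indices track a file-handle protocol. The `PMonad` class follows the shape of Atkey's definition (return stays at one index, bind composes index transitions), but the `FileM` type and operation names are my own toy illustration, not anything from the paper:

```haskell
{-# LANGUAGE DataKinds, KindSignatures, PolyKinds #-}

-- A parameterized monad: return is index-preserving (i to i), and
-- bind composes index transitions, i -> j then j -> k gives i -> k.
class PMonad m where
  preturn :: a -> m i i a
  pbind   :: m i j a -> (a -> m j k b) -> m i k b

-- Hypothetical protocol states tracked by the type indices.
data FileState = Open | Closed

-- A toy indexed computation: a state-threading function where the
-- phantom indices record the protocol state before and after.
newtype FileM (i :: FileState) (j :: FileState) a =
  FileM { runFileM :: String -> (a, String) }

instance PMonad FileM where
  preturn x         = FileM (\s -> (x, s))
  pbind (FileM f) k = FileM (\s -> let (a, s') = f s
                                   in runFileM (k a) s')

-- Operations whose types enforce open-before-read-before-close:
openF  :: FileM 'Closed 'Open ()
openF  = FileM (\_ -> ((), "contents"))

readF  :: FileM 'Open 'Open String
readF  = FileM (\s -> (s, s))

closeF :: FileM 'Open 'Closed ()
closeF = FileM (\_ -> ((), ""))

-- A well-typed session: Closed -> Open -> Open -> Closed.
-- Reordering these (e.g. reading after closeF) is a type error.
session :: FileM 'Closed 'Closed String
session = openF `pbind` \_ ->
          readF `pbind` \contents ->
          closeF `pbind` \_ ->
          preturn contents
```

The "sketchy theory" question in the post is exactly about this kind of definition: what licenses us to believe the indices mean what we think, and that the three monad laws still hold at the parameterized types.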
SEP entry on Combinatory Logic
There's a new entry on Combinatory Logic in the Stanford Encyclopedia of Philosophy. I haven't perused it yet, so please share opinions and comments.

Unchecked Exceptions can be Strictly More Powerful than Call/CC
Here's a little light reading for your day-after-Labor-Day (or whatever yesterday was where you live): Unchecked Exceptions can be Strictly More Powerful than Call/CC, Mark Lillibridge and Olivier Danvy, 1999, Higher-Order and Symbolic Computation.
I have to say that on seeing the title I was surprised: I cut my functional teeth on Scheme, and every baby Schemer sucks up the knowledge that call/cc lets you create all manner of flow control, including exceptions. But, as the paper makes clear, that's not necessarily the case in a statically-typed context.

Edit: CiteSeerX was not responding very well; here's an alternative URL for the paper.

By James Iry at 2008-09-02 15:39 | Functional | Lambda Calculus | Type Theory | 16 comments | other blogs | 19495 reads
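The familiar direction mentioned above, building exceptions out of call/cc, does go through even in a typed setting. Here is a hedged Haskell sketch using `callCC` from `Control.Monad.Cont` (in the mtl package); the names `tryCont` and `safeDiv` are mine, not from the paper, and this illustrates only the easy direction, not the paper's result about the converse:

```haskell
import Control.Monad.Cont

-- Exceptions encoded with callCC: `throw` runs the handler and then
-- invokes the captured exit continuation, abandoning the rest of the
-- protected block.
tryCont :: ((e -> Cont r a) -> Cont r a)  -- protected block, given a throw
        -> (e -> Cont r a)                -- handler
        -> Cont r a
tryCont block handler = callCC $ \exit ->
  block (\e -> handler e >>= exit)

-- A small client: division that throws on a zero divisor.
safeDiv :: Int -> Int -> Cont r (Either String Int)
safeDiv x y = tryCont
  (\throw -> if y == 0
               then throw "divide by zero"
               else return (Right (x `div` y)))
  (\msg -> return (Left msg))
```

For example, `runCont (safeDiv 10 2) id` yields `Right 5`, while `runCont (safeDiv 1 0) id` aborts the block and yields `Left "divide by zero"`.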
A located lambda calculus
A located lambda calculus. Ezra Cooper and Philip Wadler. Submitted to ICFP 2008.
This paper is technical, and I assume most LtU members will mainly read sections 1, 5 and 6. Figure 5 is the definition of the located lambda calculus.

By Ehud Lamm at 2008-05-04 19:25 | Lambda Calculus | Parallel/Distributed | 7 comments | other blogs | 8422 reads
Register Allocation by Proof Transformation
Register Allocation by Proof Transformation, Atsushi Ohori. ESOP 2003.
The idea that making uses of the structural rules explicit gives you proof terms to model register-memory moves is very pretty. Another fun thing to do would be to take a continuation calculus and apply the ideas here to see if it produces a good low-level IR.

EDIT: Ehud asked for some further exposition, so here goes. At a high level, you can think of the need to do register allocation as arising from a mismatch between a programming language and the hardware. In a language, we refer to data using variables, and in any given expression, we can use as many variables as we like. However, when a CPU does stuff, it wants the data to be in registers, and it has only a small, finite set of them. So when a program is compiled, some variables can be represented by registers, and the rest must be represented by locations in memory (usually on the stack). Whenever the CPU needs to use a variable in memory, there must be explicit code to move it from memory into a register.

The idea in this paper is to take the typing derivation of a program with an unbounded variable set, and then divide the context into two parts, one representing the register file and the other representing variables in memory. In terms of the implementation, moves between these two zones correspond to register-memory moves; and in terms of logic, this is a use of the structural rule of Exchange, which permutes the order of variables in a context. So this gives us a high-level, machine-independent characterization of the register allocation problem: take a one-zone derivation and convert it to a two-zone derivation.

But it gets even better: as long as the register allocation algorithm only adds uses of the structural rules in its transformation, we know that the meaning of the original program is unchanged, so this method also yields a simple way of proving that a register allocation pass is semantics-preserving. (The fact that this is an easy proof is one indication of the power of this idea.)
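As a toy illustration of the two-zone idea (my own sketch, not Ohori's formulation, which works on typing derivations rather than value environments), one can model the split context as a pair of environments and make the register-memory move an explicit operation that preserves the context's meaning as a finite map:

```haskell
-- Toy model of a two-zone context: a bounded register file and
-- unbounded memory (the stack). A "move" is the explicit exchange
-- step between the two zones.
type Var  = String
type Zone = [(Var, Int)]           -- variable -> value

data Ctx = Ctx { regs :: Zone, mem :: Zone } deriving Show

regLimit :: Int
regLimit = 2                        -- pretend the machine has 2 registers

-- Load a variable from memory into a register, spilling the
-- last register occupant back to memory if the file is full.
loadToReg :: Var -> Ctx -> Ctx
loadToReg x (Ctx rs ms) =
  case lookup x ms of
    Nothing -> Ctx rs ms            -- already in a register (or unbound)
    Just v  ->
      let ms'          = filter ((/= x) . fst) ms
          (rs', spill) = if length rs >= regLimit
                           then (init rs, [last rs])
                           else (rs, [])
      in Ctx ((x, v) : rs') (spill ++ ms')

-- The meaning of a context is the finite map it denotes; moves
-- preserve it, which is the semantics-preservation argument in miniature.
meaning :: Ctx -> Var -> Maybe Int
meaning (Ctx rs ms) x = lookup x (rs ++ ms)
```

The point of the miniature: `loadToReg` changes only which zone a binding lives in, never `meaning`, just as uses of Exchange in the paper change the derivation but not the program it denotes.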
By neelk at 2008-04-11 22:08 | Implementation | Lambda Calculus | Type Theory | 5 comments | other blogs | 7432 reads
History of Lambda-calculus and Combinatory Logic
F. Cardone and J. R. Hindley. History of Lambda-calculus and Combinatory Logic. To appear as a chapter in Volume 5 of the Handbook of the History of Logic.
From the introduction:
Seen in outline, the history of LC and CL splits into three main periods: first, several years of intensive and very fruitful study in the 1920s and '30s; next, a middle period of nearly 30 years of relative quiet; then in the late 1960s an upsurge of activity stimulated by developments in higher-order function theory, by connections with programming languages, and by new technical discoveries. The fruits of the first period included the first-ever proof that predicate logic is undecidable. The results of the second attracted very little non-specialist interest, but included completeness, cut-elimination and standardization theorems (for example) that found many uses later. The achievements of the third, from the 1960s onward, included constructions and analyses of models, development of polymorphic type systems, deep analyses of the reduction process, and many others probably well known to the reader. The high level of activity of this period continues today.

Beware: This is a long paper (but less than you might expect it to be by looking at the page count: about half the pages are dedicated to the bibliography).

In the announcement on the TYPES Forum the authors invited comments, suggestions and additional information on the topics of the paper, namely the development of lambda-calculi and combinatory logic from the prehistory (Frege, Peano and Russell) to the end of the 20th century.

By Ehud Lamm at 2008-02-19 19:21 | History | Lambda Calculus | Type Theory | 1 comment | other blogs | 10517 reads
Project LambdaCan
For those that are both language geeks and hardware geeks...

Relating Complexity and Precision in Control Flow Analysis
Relating Complexity and Precision in Control Flow Analysis, David Van Horn and Harry Mairson. ICFP 2007.
There's a ton of really good stuff in here; I was particularly fascinated by the fact that 0CFA is exact for multiplicatively linear programs (i.e., programs that use each variable at most once), because linearity guarantees that every lambda can flow to at most one use site.

By neelk at 2008-02-01 18:47 | Implementation | Lambda Calculus | Theory | 2 comments | other blogs | 8508 reads
Call-by-value Termination in the Untyped Lambda-calculus
To renew the discussion on Total Functional Programming, this paper is an alternative to Termination Checking with Types.

By Daniel Yokomizo at 2008-01-08 05:29 | Lambda Calculus | Semantics | 3 comments | other blogs | 7443 reads
Theorem proving support in programming language semantics
More work on mechanized metatheory with an eye towards naturalness of representation and automation. This seems to me to relate to Adam Chlipala's work on A Certified Type-Preserving Compiler from Lambda Calculus to Assembly Language, which relies in crucial ways on denotational semantics and proof by reflection. More generally, it seems to reinforce an important trend in using type-theory-based theorem provers to tackle programming language design from the semantic point of view (see also A Very Modal Model of a Modern, Major, General Type System and Verifying Semantic Type Soundness of a Simple Compiler).

I find the trend exciting, but of course I also wonder how far we can practically go with it today, given that the overwhelming majority of the literature, including our beloved Types and Programming Languages, is based on A Syntactic Approach to Type Soundness. Even the upcoming How to write your next POPL paper in Coq at POPL '08 centers on the syntactic approach. Is the syntactic approach barking up the wrong tree, in the long term? The semantic approach? Both? Neither?

By Paul Snively at 2007-12-27 22:21 | Functional | Implementation | Lambda Calculus | Semantics | 4 comments | other blogs | 11597 reads
