LC
A Security Kernel Based on the Lambda-Calculus
(via our really great discussion group)
This report describes Scheme 48, a programming environment whose design is guided by established principles of operating system security. Scheme 48's security kernel is small, consisting of the call-by-value lambda-calculus with a few simple extensions to support abstract data types, object mutation, and access to hardware resources. Each agent (user or subsystem) has a separate evaluation environment that holds objects representing privileges granted to that agent. Because environments ultimately determine availability of object references, protection and sharing can be controlled largely by the way in which environments are constructed.
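The key idea, that an agent can only use the object references reachable from the environment it was given, is essentially capability security. A toy OCaml sketch of that discipline (my illustration, not Scheme 48's actual kernel):

    (* Toy capability-style sketch (not Scheme 48's kernel): privileges are just
       values; an agent can only exercise the ones placed in the environment it
       is handed, because nothing else is reachable from its code. *)
    type privileges = {
      read_file : string -> string;        (* capability to read files *)
      log       : string -> unit;          (* capability to write to the log *)
    }

    (* The full environment held by the kernel. *)
    let full = {
      read_file = (fun path -> "contents of " ^ path);
      log       = prerr_endline;
    }

    (* An attenuated environment for an untrusted agent: logging only. *)
    let restricted = { full with read_file = (fun _ -> failwith "no file access") }

    (* Agents are constructed over the environment they were granted. *)
    let agent (env : privileges) () = env.log "agent ran"

    let () = agent restricted ()   (* can log, but could not have read files *)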
Posted to LC by Ehud Lamm on 5/21/04; 4:31:48 AM
Discuss
Y derived
I don't recall seeing this derivation of the applicative-order Y combinator (from Richard Gabriel) mentioned here before.
It's short and sweet.
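For those who want to play with the result outside a lambda-calculator, here is a minimal OCaml rendering of an applicative-order fixed-point combinator (the eta-expanded, Z-style Y). The mu wrapper exists only to satisfy OCaml's type checker and is not part of Gabriel's derivation:

    (* A strict (applicative-order) fixed-point combinator.  The Roll/unroll
       wrapper works around OCaml's ban on the self-application x x; it plays
       no semantic role. *)
    type 'a mu = Roll of ('a mu -> 'a)
    let unroll (Roll f) = f

    let z f =
      let g x = f (fun v -> unroll x x v) in   (* eta-expansion delays the self-application *)
      g (Roll g)

    (* Usage: factorial without explicit recursion. *)
    let fact = z (fun self n -> if n = 0 then 1 else n * self (n - 1))
    let () = assert (fact 5 = 120)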
Posted to LC by Ehud Lamm on 3/18/04; 1:29:42 AM
Discuss (1 response)
Lambda tutorial
A nice, executable LC tutorial from Chris Barker.
Seems to be a good resource for those trying to learn the LC by themselves.
The examples/exercises are not hard, but you should really try to solve them on your own before pressing the "reduce" button.
Posted to LC by Ehud Lamm on 10/2/03; 3:01:13 AM
Discuss
Set-Theoretic and Other Elementary Models of the LC
(postscript version)
May be a bit on the theoretical side for most tastes, but if you are interested in LC theory, you need to know about its models.
In these notes Gordon Plotkin covers the fundamental ideas, starting with the problems stemming from self-application and covering such things as Scott-Engeler algebras, Extended Applicative Type Structures, etc. The final section discusses some general properties of LC models.
I urge my fellow editors to post links to resources easier to digest, lest we all develop major headaches...
Posted to LC by Ehud Lamm on 8/26/03; 7:19:54 AM
Discuss
Constructive Computation Theory. Course notes on lambda calculus
G. Huet. Constructive Computation Theory. Course notes on lambda calculus, University of Bordeaux I, 2002.
We give in these notes an overview of computation theory, presented in the paradigm of λ-calculus. The descriptive formalism used is not the usual meta-language of Mathematics, but an actual programming language, Pidgin ML, a variant of Objective Caml, developed in INRIA Rocquencourt. This has the double advantage of being more rigorous, more constructive, and of allowing better understanding by the reader who may interactively execute all definitions and examples of the course. The corresponding programs may be downloaded from the site http://pauillac.inria.fr/~huet/FREE/LAMBDA.tar.gz. The programs may be executed with Objective Caml, version 3.06 (http://caml.inria.fr/ocaml/index.html).
I found the link to these nice lecture notes in a TYPES message, in regard to Mitch Wand's request for teaching materials for a freshman honors seminar in which he wants to take a look at computability theory, using lambda-calculus as a vehicle.
As LtU regulars know, I am a firm believer in using constructive methods in introductory computer science. But at the moment, Haskell is the language that gets my vote, not OCaml...
Posted to LC by Ehud Lamm on 6/29/03; 1:03:00 AM
Discuss (1 response)
Stochastic Lambda Calculus
Stochastic Lambda Calculus and Monads of Probability Distributions. Norman Ramsey and Avi Pfeffer. Conference Record of the 29th Annual ACM Symposium on Principles of Programming Languages (POPL'02), in SIGPLAN Notices 37(1):154--165.
Probability distributions form a monad, and the monadic definition leads to a simple, natural semantics for a stochastic lambda calculus, as well as simple, clean implementations of common queries. But the monadic implementation of the expectation query can be much less efficient than current best practices in probabilistic modeling. We therefore present a language of measure terms, which can not only denote discrete probability distributions but can also support the best known modeling techniques.
Some math background is needed for this paper, esp. measure theory.
QuickCheck, which was mentioned here a few days ago, may serve as a good introduction to some of the ideas analyzed in this paper, since it also discusses probabilistic monads.
It would be very interesting to see the results of the authors' promise to investigate Bayesian learning using a monadic framework.
It has been a while since we mentioned probabilistic languages, but this is a fascinating subject. If you have interesting links about such languages, please share!
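To get a taste of the "distributions form a monad" observation, here is a small OCaml sketch of a discrete probability monad. This is my illustration, not the paper's implementation, and the naive expectation query below is exactly the kind of thing the authors' measure terms are meant to improve on:

    (* A discrete probability distribution as a list of (value, probability) pairs. *)
    type 'a dist = ('a * float) list

    let return (x : 'a) : 'a dist = [ (x, 1.0) ]

    (* bind: run m, then feed each outcome to k, scaling probabilities. *)
    let bind (m : 'a dist) (k : 'a -> 'b dist) : 'b dist =
      List.concat_map (fun (x, p) -> List.map (fun (y, q) -> (y, p *. q)) (k x)) m

    let uniform xs =
      let p = 1.0 /. float_of_int (List.length xs) in
      List.map (fun x -> (x, p)) xs

    (* The expectation query, computed naively over the whole distribution. *)
    let expectation f m = List.fold_left (fun acc (x, p) -> acc +. p *. f x) 0.0 m

    (* Example: the sum of two fair dice. *)
    let two_dice = bind (uniform [1;2;3;4;5;6]) (fun a ->
                   bind (uniform [1;2;3;4;5;6]) (fun b -> return (a + b)))
    let () = Printf.printf "E[sum] = %g\n" (expectation float_of_int two_dice)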
Posted to LC by Ehud Lamm on 12/30/02; 3:01:00 PM
Discuss
Lazy vs. strict
Lazy vs. strict. Philip Wadler. ACM Computing Surveys, June 1996.
A short (4pp) and readable survey listing LC variants, leading to the call-by-need lambda calculus.
Posted to LC by Ehud Lamm on 10/24/02; 2:13:52 AM
Discuss
TinkerType
TinkerType is a framework for compact and modular description of formal systems. A family of related systems is broken into a set of individual inference rules and a set of features controlling the inclusion of rules in particular systems.
The TinkerType framework is applied to construct a substantial repository of typed lambda calculi, including systems with subtyping, polymorphism, type operators and kinding, computational effects, and dependent and recursive types. The repository describes both declarative and algorithmic aspects of the systems and can be used with our tool, the TinkerType assembler, to generate calculi either in the form of typeset collections of inference rules or executable ML type checkers.
From the author of the textbook Types and Programming Languages, which also has several OCaml implementations of type systems available.
[Presentation]
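The core idea (rules tagged with the features they require, and systems assembled by selecting feature sets) is easy to illustrate. The following OCaml toy shows only that assembly step and is not TinkerType itself:

    (* Toy illustration of feature-controlled rule assembly: each inference rule
       is tagged with the features it requires, and a "system" is obtained by
       selecting the rules whose requirements are all enabled. *)
    type feature = Subtyping | Polymorphism | TypeOperators | Effects

    type rule = { name : string; requires : feature list }

    let repository = [
      { name = "T-Abs";  requires = [] };
      { name = "T-App";  requires = [] };
      { name = "T-Sub";  requires = [Subtyping] };
      { name = "T-TAbs"; requires = [Polymorphism] };
      { name = "T-TApp"; requires = [Polymorphism] };
    ]

    (* Assemble a calculus: keep every rule whose requirements are enabled. *)
    let assemble enabled =
      List.filter (fun r -> List.for_all (fun f -> List.mem f enabled) r.requires) repository

    (* e.g. a System F-like selection: *)
    let system_f = assemble [Polymorphism]
    let () = List.iter (fun r -> print_endline r.name) system_f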
Posted to LC by jon fernquest on 9/30/02; 2:26:54 AM
Discuss (5 responses)
Symmetric difference in LC, P-numerals, and fold
To compute (diffs ck cl) we convert cl to a list of length l, and then apply a down-or-up operation k times to the list. The down-or-up operation removes a link from a list. If, however, it encounters an empty list, it "changes the direction" and starts adding links. Obviously, the result is the list whose length is abs(k-l). At the last step, we convert the list back to the corresponding numeral.
If you are familiar with the Peano axioms for natural numbers, you'll feel at home.
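The quoted algorithm is easy to replay at the meta level. The following OCaml sketch mirrors the down-or-up step on ordinary lists and machine integers; it is only an illustration of the idea, not the lambda-encoded (P-numeral) version from the article:

    (* Symmetric difference |k - l| via the quoted down-or-up idea, replayed on
       ordinary OCaml lists rather than lambda-encoded numerals. *)

    (* One step: remove a link if possible; on the empty list, change direction
       and start adding links.  The boolean tracks the current direction. *)
    let down_or_up = function
      | (true, _ :: rest) -> (true, rest)       (* still going down: drop a link *)
      | (true, [])        -> (false, [ () ])    (* hit the empty list: reverse direction *)
      | (false, xs)       -> (false, () :: xs)  (* going up: add a link *)

    let diffs k l =
      let start = (true, List.init l (fun _ -> ())) in   (* cl as a list of length l *)
      let rec iterate n st = if n = 0 then st else iterate (n - 1) (down_or_up st) in
      let (_, result) = iterate k start in
      List.length result                                 (* back to a numeral *)

    let () = assert (diffs 3 5 = 2 && diffs 5 3 = 2 && diffs 4 4 = 0)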
Posted to LC by Ehud Lamm on 8/31/02; 4:08:52 AM
Discuss (1 response)
On Understanding Types, Data Abstraction, and Polymorphism
In this classic survey of types and polymorphism, the typed lambda-calculus is extended "with type facilities that are more powerful and expressive than those of existing programming languages."
"Object-oriented languages provide both a framework and a motivation for exploring the interaction among the concepts of type, data abstraction, and polymorphism, since they extend the notion of type to data abstraction and since type inheritance is an important form of polymorphism. We develop a lambda-calculus based model for type systems that allows us to explore these interactions in a simple setting, unencumbered by complexities of production programming languages...
The evolution of languages from untyped universes to monomorphic and then polymorphic type systems is reviewed. Mechanisms for polymorphism such as overloading, coercion, subtyping, and parameterization are examined. A unifying framework for polymorphic type systems is developed in terms of the typed lambda-calculus augmented to include binding of types by quantification as well as binding of values by abstraction.
The typed lambda-calculus is augmented by universal quantification to model generic functions with type parameters, existential quantification and packaging (information hiding) to model abstract data types, and bounded quantification to model subtypes and type inheritance. In this way we obtain a simple and precise characterization of a powerful type system that includes abstract data types, parametric polymorphism, and multiple inheritance in a single consistent framework." (1)
Precise definitions and examples of the terminology behind polymorphism: universal, ad-hoc, parametric, inclusion, overloading, and coercion (4).
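The use of existential quantification and packaging to model abstract data types translates directly into modern ML. As a small illustration (mine, not the paper's notation), an existentially packaged counter in OCaml:

    (* "Existential quantification and packaging (information hiding) to model
       abstract data types": clients can use a counter only through the
       operations packaged with it; the representation type 'r stays hidden. *)
    type counter =
      | Counter : { init : 'r; step : 'r -> 'r; read : 'r -> int } -> counter

    (* Two packages with different hidden representations. *)
    let by_int  = Counter { init = 0;  step = (fun n -> n + 1);   read = (fun n -> n) }
    let by_list = Counter { init = []; step = (fun l -> () :: l); read = List.length }

    (* A generic client: it cannot inspect the representation, only run the ops. *)
    let run_twice = function
      | Counter { init; step; read } -> read (step (step init))

    let () = assert (run_twice by_int = 2 && run_twice by_list = 2)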
Posted to LC by jon fernquest on 8/11/02; 3:02:59 AM
Discuss (6 responses)
To Dissect a Mockingbird
Subtitled "A Graphical Notation for the Lambda Calculus with Animated Reduction."
A visual introduction to the lambda calculus.
Scroll slowly down the page and watch the animations.
(Source: Lambda Calculus <-> Category Theory Discussion)
As pointed out by Raymond Smullyan, the eminent logician and author of several puzzle books, the theory of combinators is an abstract science dealing with objects whose only important property is how they act upon each other. We are free to choose other properties of these objects in any way we like. In his delightful book "To Mock a Mockingbird", Smullyan (1985) chooses birds for his combinators, in memory of Haskell Curry, an early pioneer in the theory of combinators (1958) and an avid bird-watcher.
The story begins, "A certain enchanted forest is inhabited by talking birds. Given any birds A and B, if you call out the name of B to A, then A will respond by calling out the name of some bird to you; this bird we designate AB. Thus AB is the bird named by A upon hearing the name of B."
...
In the hope of making it even easier I introduce the following graphical notation by extending Smullyan's bird metaphor. We will go inside the birds' heads and see how to draw diagrams of the internal plumbing which connects their ear to their throat in order to produce the correct song in response to each song that they hear.
Posted to LC by jon fernquest on 8/9/02; 11:31:41 PM
Discuss (3 responses)
RhoStratego
The original rho-calculus or rewriting calculus aims to "integrate first-order rewriting, the lambda calculus, and non-determinism." RhoStratego [1, 2] is basically the lambda calculus extended with constructors, let bindings, and, from Stratego, "first class pattern matching and generic traversals." (32)
The semantics are well-organized: "We first present a set of rewrite rules on the language. By taking different subsets of these rules, we obtain lazy or strict semantics. ...all sets of rules as a whole gives a non-confluent calculus ...the lazy and strict subsets themselves are confluent. ... The fact that the semantics is given as a set of rewrite rules from the language to the language, i.e., as source-to-source transformation, means that they can be used directly in, e.g., an optimiser or an interpreter for the language. In fact, an interpreter based directly on these rewrite rules is given in appendix B." (23)
Another Stratego paper by the same author, Building Interpreters with Rewriting Strategies, includes lambda calculus interpreters with strategies including "normalization, eager evaluation, lazy evaluation, and lazy evaluation with updates. An extension with pattern matching and choice shows that such interpreters can easily be extended."
(Possible analogy: these papers are to rewriting interpreters as EOPL is to Scheme interpreters?)
Posted to LC by jon fernquest on 7/30/02; 2:14:46 AM
Discuss (1 response)
The Constructor Calculus
(via comp.lang.functional)
The constructor calculus is a variant of the lambda-calculus that supports generic programming of functions for mapping, folding, equality, addition, etc., based on a fresh approach to data types, their constructors, and pattern-matching.
The paper gives a nice intro to generic programming. I haven't had the time to reach any conclusions about the work presented.
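To give a flavor of what "generic" means here, the following OCaml toy defines equality once over a uniform constructor-plus-arguments view of data instead of once per data type. It is emphatically not the constructor calculus, just a hint of the style of programming the paper is after:

    (* A toy "generic programming" flavor: values are viewed uniformly as a
       constructor name applied to arguments, so operations such as equality
       can be written once, by recursion over that view. *)
    type data = C of string * data list

    let rec geq (C (c1, args1)) (C (c2, args2)) =
      c1 = c2
      && List.length args1 = List.length args2
      && List.for_all2 geq args1 args2

    (* Encodings of a few familiar values in the uniform view. *)
    let nil = C ("Nil", [])
    let cons x xs = C ("Cons", [x; xs])
    let zero = C ("Zero", [])
    let succ n = C ("Succ", [n])

    let () = assert (geq (cons zero nil) (cons zero nil))
    let () = assert (not (geq zero (succ zero)))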
Posted to LC by Ehud Lamm on 7/14/02; 8:06:45 AM
Discuss
Normal-order evaluation as bottom-up parsing
This message announces a normal-order lambda-calculator integrated with Haskell. The calculator implements what seems to be an efficient and elegant algorithm of normal order reductions. The algorithm is "more functional" than the traditionally used approach.
Nice integration with Haskell. In fact, you may argue that the LC evaluator is a DSL embedded in Haskell.
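For comparison, here is a textbook baseline: a plain normal-order (leftmost-outermost) reducer in OCaml, using de Bruijn indices to avoid variable capture. This is not the announced algorithm, just a conventional reducer to make the problem concrete:

    (* A textbook normal-order reducer over de Bruijn-indexed terms. *)
    type term = Var of int | Lam of term | App of term * term

    (* Shift free variables >= cutoff c by d. *)
    let rec shift d c = function
      | Var n -> Var (if n >= c then n + d else n)
      | Lam t -> Lam (shift d (c + 1) t)
      | App (t1, t2) -> App (shift d c t1, shift d c t2)

    (* Substitute s for variable j in t. *)
    let rec subst j s = function
      | Var n -> if n = j then s else Var n
      | Lam t -> Lam (subst (j + 1) (shift 1 0 s) t)
      | App (t1, t2) -> App (subst j s t1, subst j s t2)

    (* Beta-reduce (\ t) applied to s. *)
    let beta t s = shift (-1) 0 (subst 0 (shift 1 0 s) t)

    (* One leftmost-outermost step, if any redex remains. *)
    let rec step = function
      | App (Lam t, s) -> Some (beta t s)
      | App (t, s) ->
          (match step t with
           | Some t' -> Some (App (t', s))
           | None -> (match step s with Some s' -> Some (App (t, s')) | None -> None))
      | Lam t -> (match step t with Some t' -> Some (Lam t') | None -> None)
      | Var _ -> None

    let rec normalize t = match step t with Some t' -> normalize t' | None -> t

    (* Example: K applied to the identity reduces to \y. \z. z. *)
    let id = Lam (Var 0)
    let k  = Lam (Lam (Var 1))
    let () = assert (normalize (App (k, id)) = Lam (Lam (Var 0)))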
Posted to LC by Ehud Lamm on 5/18/02; 3:59:52 AM
Discuss
Lambda Calculus in Python
(via Daily Python-URL)
This module allows simple experimentation with the lambda calculus, first developed by Church. It understands the different types of lambda expressions, can extract lists of variables (both free and bound) and subterms, and can simplify complicated expressions by means of application.
(if you need an intro to the lambda calculus, simply check previous messages in LtU's LC department)
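The free/bound variable extraction the module mentions is the standard textbook recursion. Sketched here in OCaml rather than Python, over my own term type rather than the module's classes:

    (* Free variables of a lambda term with named variables; the usual recursion. *)
    module S = Set.Make (String)

    type term = Var of string | Lam of string * term | App of term * term

    let rec free_vars = function
      | Var x -> S.singleton x
      | Lam (x, body) -> S.remove x (free_vars body)     (* the binder captures x *)
      | App (t1, t2) -> S.union (free_vars t1) (free_vars t2)

    (* \x. x y  has y free and x bound. *)
    let () =
      let t = Lam ("x", App (Var "x", Var "y")) in
      assert (S.elements (free_vars t) = ["y"])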
Posted to LC by Ehud Lamm on 2/6/02; 2:02:33 PM
Discuss
Subtraction and division in lambda-calculus
This article will demonstrate basic arithmetic operations -- comparison, addition, subtraction, multiplication, and division -- on non-negative and negative integer numbers. Both the integers and the operations on them are represented as terms in the pure untyped lambda-calculus. The only building blocks are identifiers, abstractions and applications. No constants or delta-rules are used. Reductions follow only the familiar beta-substitution and eta-rules.
Implementing subtraction in the LC can be a bit challenging. Naturally, the problem is related to the method used to represent numbers as lambda-calculus terms. This rather subtle issue is explored and demonstrated.
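With the usual Church representation, the predecessor function already shows where the difficulty lies. Here is the classic trick in OCaml; note that this is just the standard construction, not the representation the article ends up advocating:

    (* Church numerals and the classic predecessor trick.  With Church numerals,
       pred (and hence subtraction) is awkward, which is part of why the choice
       of number representation matters so much. *)
    let zero f x = x
    let succ n f x = f (n f x)
    let to_int n = n (fun k -> k + 1) 0

    (* Kleene's predecessor: rebuild the numeral while discarding one application. *)
    let pred n f x = n (fun g h -> h (g f)) (fun _ -> x) (fun u -> u)

    let three f x = f (f (f x))
    let () =
      assert (to_int (succ three) = 4);
      assert (to_int (pred three) = 2);
      assert (to_int (pred zero) = 0)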
Posted to LC by Ehud Lamm on 6/5/01; 4:06:24 AM
Discuss
Computability and Recursion
Seems like a nice introduction and history.
As you may recall, the idea of computability, now often called the Church-Turing Thesis, was the result of discovering that the LC and Turing Machines are equally powerful computability models. This is quite an amazing result, until you think about it. Turing, Church and all that gang (Goedel, Kleene, etc.) are the ones who thought about it first...
All this work was the result of thinking about the foundations of mathematics, mainly attempts to settle Hilbert's Entscheidungsproblem (the decision problem). In short, it was part of an attempt to (a) formalize mathematics and (b) show how mathematical proofs can be done by mechanical means.
Goedel showed that complete formalization is impossible (the incompleteness theorems), and Turing and Church killed the mechanization dream.
In the process Church created the Lambda Calculus, which we use as the basis for much theoretical work in programming language research. Turing machines are, of course, one of the basic models of computer science.
I was trying to remember today where I first read about all these great ideas. I thought it was Goedel, Escher, Bach (a must read), but browsing it I can't find all the details I remember reading about all these years ago. Oh well...
Posted to LC by Ehud Lamm on 4/23/01; 1:00:56 PM
Discuss (1 response)
A practical lambda-calculator
This calculator implements a normal-order evaluator for the untyped lambda-calculus with shortcuts. Shortcuts are distinguished constants that represent terms. An association between a shortcut symbol and a term must be declared before any term that contains the shortcut can be evaluated. The declaration of a shortcut does not cause the corresponding term to be evaluated. Therefore a shortcut's term may contain other shortcuts -- or even yet-to-be-defined ones. Shortcuts make programming in lambda-calculus remarkably more convenient.
Besides playing a bit with the LC, I suggest downloading and reading the code. It is well commented, and the comments are quite useful if you are learning the lambda calculus.
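The shortcut mechanism described above (names bound to unevaluated terms, looked up only when a term is actually reduced) amounts to something like the following OCaml sketch. This is an illustration of the idea only, not the calculator's own code:

    (* A shortcut is a named, *unevaluated* term.  Declaring one just records
       the association; expansion happens only when a term mentioning it is
       processed, so shortcuts may refer to shortcuts declared later. *)
    type term =
      | Var of string
      | Lam of string * term
      | App of term * term
      | Shortcut of string

    let table : (string, term) Hashtbl.t = Hashtbl.create 16

    let declare name body = Hashtbl.replace table name body   (* no evaluation here *)

    (* Expand shortcuts on demand, e.g. as a pass before reduction. *)
    let rec expand = function
      | Shortcut name ->
          (match Hashtbl.find_opt table name with
           | Some body -> expand body
           | None -> failwith ("undefined shortcut: " ^ name))
      | Var x -> Var x
      | Lam (x, t) -> Lam (x, expand t)
      | App (t1, t2) -> App (expand t1, expand t2)

    (* A shortcut may be declared in terms of others that are declared later. *)
    let () =
      declare "skk" (App (App (Shortcut "s", Shortcut "k"), Shortcut "k"));
      declare "k" (Lam ("x", Lam ("y", Var "x")));
      declare "s" (Lam ("f", Lam ("g", Lam ("x", App (App (Var "f", Var "x"), App (Var "g", Var "x"))))));
      ignore (expand (Shortcut "skk"))   (* definitions are looked up only now *)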
Posted to LC by Ehud Lamm on 4/18/01; 3:59:32 AM
Discuss (1 response)
Optimizations
A thread on c.l.scheme. The nice thing is that it gives a bit more theoretical insight than the usual discussions of optimization, mostly dealing with eta reductions.
There is another related thread.
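For readers who have not met it: eta-reduction rewrites a term of the form \x. f x to f, provided x is not free in f. In a strict language with effects the two are not always interchangeable, which is one reason the topic matters to optimizers. A tiny OCaml illustration (mine, not from the thread):

    (* Eta: fun x -> e x versus e.  Equal in the pure calculus, but in a strict
       language with effects the eta-expanded form defers (and repeats) the
       effect of computing the function. *)
    let make_f () = print_endline "effect!"; fun x -> x + 1

    let g1 = make_f ()                (* "effect!" is printed immediately, once *)
    let g2 = fun x -> make_f () x     (* eta-expanded: the effect is deferred *)

    let () =
      print_endline "before any calls";
      ignore (g1 3);                  (* no further effect *)
      ignore (g2 3);
      ignore (g2 4)                   (* each call to g2 re-runs the effect *)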
Posted to LC by Ehud Lamm on 4/5/01; 3:23:13 AM
Discuss
Lambda Calculi: A Guide for Computer Scientists - Hankin
This book was mentioned on c.l.functional as an intro to the LC, and since I have the book open on my desk, I thought I'd mention it here.
This is a pretty short book (~160 pages), mostly dealing with the proof theory of the type-free LC.
The TOC:
- Introduction
- Notation and Basic Theory
- Reduction
- Combinatory Logic
- Semantics
- Computability
- Types
- Practical Issues
- Other Calculi
- Further Reading
The book includes proofs and provides exercises.
I found the presentation a bit cumbersome, since I think the idea of computation by reduction and the computability aspects of the LC should be introduced earlier.
The chapter on combinatory logic is very good, and this is a great strength, as combinatory logic is usually treated as an afterthought. The Further Reading section is also quite useful.
But while linking this item to Amazon, I noticed that the price of the book is prohibitive. I think many introductions available online (and which we linked to in the past) provide much the same material, almost for free.
Posted to LC by Ehud Lamm on 4/2/01; 8:57:09 AM
Discuss (1 response)
Introduction to Lambda Calculus
A detailed introduction to the LC. Pretty long (~ 50 pages).
The text covers quite a lot of ground, contains proofs of many theorems and includes exercises.
If you don't have a book and want to study the LC, you might give it a try.
Posted to LC by Ehud Lamm on 3/13/01; 12:25:11 PM
Discuss
The Impact of The Lambda Calculus...
...in logic and computer science.
The paper describes the history of the lambda calculus, and several of its uses. Along the way it discusses important notions like computability and classifies functional programming languages.
The paper also covers the use of lambda calculus for computer based proofs and reasoning.
Posted to LC by Ehud Lamm on 3/8/01; 11:20:41 AM
Discuss