theory
Towards a Natural Theory of Programming Languages 
I thought I would use this thread as a cover for writing up some of my recent fumblings about what imo programming languages are, why most of them stop being used, and why people get so gosh darn passionate about them. I call it Towards a Natural Theory of Programming Languages: towards because it is something of an outline, and natural because, well, you'll see. Hopefully it will find some resonance with folks, or at least generate a few more comments.
Stanley Lippman muses about our favorite topic.
Posted to theory by Dan Shappir on 6/2/04; 3:09:26 PM
Discuss


A Type Theory for Memory Allocation and Data Layout 
Recent discussions about machine language and the myth of mandatory HLL performance degradation prompted this note. The idea: HLL code can manipulate low-level constructs in a strongly typed regime.
High-level programming languages such as ML and Java allow programmers to program in terms of abstractions such as pairs, records, and objects, which have well-defined semantics but whose realizations in terms of the underlying concrete machine are left unspecified and unobservable. Sometimes, it is necessary to program without these [high-level] abstractions.
Traditionally, [such machine-oriented] needs have been addressed in an untyped, or a weakly typed, fashion. Languages such as C give programmers relatively precise control over data layout and initialization at the expense of type and memory safety. Traditional compilers represent programs internally using untyped languages, relying on the correctness of the compiler to preserve any safety properties enjoyed by the source program.
An important contribution of this work is ... the possibility ... that even programs requiring detailed control of memory layout could be written in a typed, highlevel language.
Posted to theory by Mark Evans on 5/12/04; 10:23:05 AM
Discuss (3 responses)


lambda-calculus and types 
Not quite my cup of tea, but I figured I'd be remiss if I didn't pass on the link to these lecture notes from Thorsten Altenkirch:
These notes are based on the view that typed lambda-calculi should be considered as fundamental because they can be used to model total functions, which are easier to understand than partial functions and which are ultimately more useful to construct solutions of problems. What is a (total) function? In Set Theory functions are represented as a relation which is total on the domain and deterministic on the codomain.
Posted to theory by Chris Rathman on 5/11/04; 11:58:33 AM
Discuss (3 responses)


Wolfram's Future Math 
Offerings include Turing machines and register machines, including "the smallest register machine with complex behavior."
Posted to theory by Mark Evans on 4/23/04; 10:10:48 PM
Discuss (10 responses)


A Modal Language for Effects 
Sungwoo Park and Robert Harper.
Building on a judgmental formulation of lax logic, we propose a modal language
which can be used as a framework for practical programming languages with
effects. Its characteristic feature is a syntactic distinction between terms
and expressions, where terms denote values and expressions denote computations.
We distinguish between control effects and world effects, and allow control
effects only in terms and world effects only in expressions. Therefore the
distinction between values and computations is made only with respect to world
effects. We give an explanation of the type system and the operational
semantics from a modal logic perspective. We also introduce a term construct
similar to Haskell's runST construct and augment the type system to
ensure its safety.
Modal logic techniques are becoming more and more important. I suggest following the references to Pfenning's papers for further enlightenment.
Posted to theory by Ehud Lamm on 4/23/04; 2:49:16 AM
Discuss


IJCAR 2004 Tutorial Program 
The tutorial program for the International Joint Conference on Automated Reasoning seems nice.
Though not directly PL related, theorem proving is used extensively in PL research (e.g., formal methods, static analysis, algebraic reasoning).
Some of the lecturers provide links and reference materials for their presentations, so you might want to check the individual pages even if you are not planning on attending the conference.
Posted to theory by Ehud Lamm on 4/18/04; 4:28:47 AM
Discuss


Proof Theory of Martin-Löf Type Theory - An Overview 
(by Anton Setzer)
We give an overview over the historic development of proof theory and
the main techniques used in ordinal theoretic proof theory. We
argue that, in a revised Hilbert's programme,
ordinal theoretic proof theory has to be supplemented
by a second step, namely the development of strong
equiconsistent constructive theories.
This is heavy stuff, and it's much too hot in Jerusalem at the moment for me to read this paper carefully.
This paper can serve as an overview but isn't a very good introduction. I wouldn't recommend this paper for those without a good CS/math background.
Posted to theory by Ehud Lamm on 3/3/04; 3:25:09 AM
Discuss (2 responses)


Environment Classifiers 
Environment Classifiers. Walid Taha, Michael Florentin Nielsen. Extended Abstract, POPL'03
This paper proposes and develops the basic theory for a new
approach to typing multi-stage languages based on a notion of
environment classifiers. This approach involves explicit but
lightweight tracking – at type-checking time – of the origination
environment for future-stage computations. Classification is less restrictive than the previously proposed notions of closedness, and allows for both a more expressive typing of the "run" construct and for a unifying account of typed multi-stage programming.
We mentioned code generation the other day, so I thought people here might be interested in this paper.
It is essential reading for those interested in typed staged computation.
Posted to theory by Ehud Lamm on 2/19/04; 2:57:04 AM
Discuss (1 response)


Playful, stream-like computation 
Pierre-Louis Curien. Playful, stream-like computation. Proc. Int. Symp. on Domain Theory, Chengdu, 2001, Kluwer.
A bit on the technical side, but should be interesting for those interested in semantics, and the relationship between programming languages and logic.
Posted to theory by Ehud Lamm on 1/25/04; 5:57:10 AM
Discuss


categories study group 
The study group for the moment consists of the book "Conceptual Mathematics, A first introduction to categories", by F. William Lawvere & Stephen H. Schanuel.
Some here may be interested in following the progress on this Haskell Wiki page.
The Moggi papers are, of course, worthy of study.
Posted to theory by Ehud Lamm on 1/11/04; 4:55:20 AM
Discuss


ML-like Inference for Classifiers 
Cristiano Calcagno, Eugenio Moggi, and Walid Taha.
Environment classifiers were recently proposed as a new approach to typing multi-stage languages. Safety was established in the simply-typed and let-polymorphic settings. While the motivation for the classifier approach was the feasibility of inference, this was in fact not established. This paper starts with the observation that inference for the full classifier-based system fails. We then identify a subset of the original system for which inference is possible. This subset, which uses implicit classifiers, retains significant expressivity (e.g. it can embed the calculi of Davies and Pfenning) and eliminates the need for classifier names in terms. Implicit classifiers were implemented in MetaOCaml, and no changes were needed to make an existing test suite acceptable by the new type checker.
Staging, which was discussed here quite a few times in the past, is an essential ingredient of macros, partial evaluation, and runtime code generation. From a programmer's perspective it seems like no more than a simple extension of regular macro facilities.
In the untyped setting, the behavior of staging constructs is indeed mostly straightforward and resembles the quasiquotation mechanisms of LISP and Scheme. In the statically typed setting, however, such quotation mechanisms may prohibit static type-checking.
This paper explores this issue, and presents some promising design approaches.
Posted to theory by Ehud Lamm on 1/7/04; 12:29:35 PM
Discuss


Thielecke: Contrasting exceptions and continuations 
A very useful paper, analyzing the semantic differences between continuations (a la call/cc) and exceptions.
The paper is a tour de force, and the details can be quite daunting. However, the basic results are well worth remembering:
- Exceptions cannot express continuations (Section 3).
- Conversely, continuations in the absence of state cannot express exceptions (Section 4).
- If exceptions and continuations are combined in the same language, they cannot express state (Section 5).
- Exceptions and state cannot express continuations (Section 6).
If you want to skip the details, direct your attention to corollaries 3.10 and 4.11.
The paper was mentioned in a very interesting thread in the LtU discussion group.
Posted to theory by Ehud Lamm on 11/23/03; 3:39:34 AM
Discuss (1 response)


Exercises in Coalgebraic Specification 
B. Jacobs, Exercises in Coalgebraic Specification. In: R. Backhouse, R. Crole and J. Gibbons (eds.), Algebraic and Coalgebraic Methods in the Mathematics of Program Construction (Springer LNCS 2297, 2002), pp. 237-280. [ps version]
Since we are discussing coalgebras and reading Bart Jacobs...
This text is meant to be used as teaching material, so it is perhaps more accessible.
Posted to theory by Ehud Lamm on 11/12/03; 3:37:39 AM
Discuss


Shape Theory 
A theory with some languages dangling from it, and then for Frank, more theory:
Novel type systems designed to support shape-based programming can be found in the various programming languages described below. For example, Functorial ML supports a class of functors used to support new forms of polymorphism, which go by the name of functorial or shape polymorphism, or polytypy. Also, FISh is an Algol-like language that supports data types of arrays, by having a distinct class of shape types.
The semantics of programming languages typically uses category theory to represent the denotations of programs, so that a program and its output have the same value. However, enriched categories allow for the expression of operational information. For example, order-enriched categories can express rewriting rules.
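Shape polymorphism is easy to taste even without FISh or Functorial ML. Here is a rough Python sketch (my own illustration, not code from those languages) of a map that works uniformly over any nesting while preserving its shape:

```python
def shape_map(f, x):
    """Apply f to every element of an arbitrarily nested list,
    preserving the nesting shape exactly."""
    if isinstance(x, list):
        return [shape_map(f, e) for e in x]
    return f(x)

def shape_of(x):
    """The 'shape' of a value: the same nesting with elements erased."""
    if isinstance(x, list):
        return [shape_of(e) for e in x]
    return None
```

The point of shape polymorphism is visible in the second function: mapping can change the elements but never the shape.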
Ehud I did due diligence with Google, and you asked for it...
Posted to theory by Mark Evans on 11/5/03; 8:24:04 PM
Discuss (2 responses)


A case study in class library verification: Java's vector class 
M. Huisman, Bart Jacobs, J. van den Berg, A case study in class library verification: Java's vector class. Software Tools for Technology Transfer 3(3), 2001, pp. 332-352.
This paper presents a verification of an invariant property for the Vector class from Java's standard library (API). The property says (essentially) that the actual size of a vector is less than or equal to its capacity. It is shown that this "safety" or "data integrity" property is maintained by all methods of the Vector class, and that it holds for all objects created by the constructors of the Vector class. The verification of the Vector class invariant is done with the proof tool PVS. It is based on a semantics of Java in higher order logic. The latter is incorporated in a special purpose compiler, the LOOP tool, which translates Java classes into logical theories. It has been applied to the Vector class for this case study. The actual verification takes into account the object-oriented character of Java: (non-final) methods may always be overridden, so that one cannot rely on a particular implementation. Instead, one has to reason from method specifications in such cases. This project demonstrates the feasibility of tool-assisted verification of non-trivial properties for non-trivial Java classes.
Not overly exciting, but a useful example none the less.
The paper shows that real class libraries can be verified using well known techniques, coupled with automatic tools. The Java Modeling Language (JML) is used for behavioural specification.
One interesting aspect of this work is the explicit reasoning regarding abrupt termination (e.g., exceptions).
Posted to theory by Ehud Lamm on 11/1/03; 11:27:29 PM
Discuss


Magic Omega and the Limits of Mathematics 
Apropos of the recent discussion on the relationship between software and mathematics, it seemed the right moment to introduce a curious piece from a curious and entertaining spirit named Gregory J. Chaitin. If you've never read his work before, it's a treat. I thought about posting it to the LtU fun house, but hey, the man's serious, too, you know! Gets a paycheck from IBM! And uses Lisp! And talks about Turing completeness! The gist of Chaitin's work is that abstract mathematics as we know it turns out to be a much more "ad hoc" or "pragmatic" endeavor than is commonly known, in his view. Recursive function theory is prominent.
Okay, so why is all this interesting? Well, some of you may know already, but let me wave my hands frantically in the last few minutes and say why....This looks like a kind of programming madness, right? ... Yes, programming can be an obsession, it can ruin your life. You can sit at your terminal watching your life fall apart as you stay there hacking away until 4 in the morning! But is there any other justification for this except as a kind of drug? Well, I think so! I think this has some philosophical significance.
What's the point about this incompleteness theorem, about this technical result? ... [W]hat this incompleteness result shows you is that ... essentially the only way to prove what a bit of Omega is, is to add the theorem that you want to prove as a new axiom. It's irreducible mathematical information. Now you can prove anything by adding it as a new axiom. The point here is that for Omega that's essentially the only way to do it. No compression is possible. So there is no structure, there is no pattern, Omega has maximum entropy, it mirrors independent tosses of a fair coin. Now to a physicist what I'm saying sounds pretty reasonable, right? Omega has maximum entropy, the bits of Omega are completely uncorrelated. But to a mathematician this all sounds weird!
My new Omega is a particular welldefined real number. It's a real number with a rather simple definition, I can even write the LISP program that defines Omega on one computer screen. So you believe, thinking in Platonic terms, that each bit is either a 0 or a 1, even if I'm never going to know which. But it's black or white, right? And what I'm saying is that I think that it's really better to think that it's grey. It's really better to think that each bit of Omega has probability onehalf of being 0 or of being 1—even though it's a particular welldetermined bit, because I've written out the program that defines this number.
Posted to theory by Mark Evans on 10/21/03; 10:30:50 PM
Discuss (2 responses)


A gentle introduction to TLG, the Curry-Howard correspondence, and cut-elimination 
Chris Barker. A gentle introduction to Type Logical Grammar, the Curry-Howard correspondence, and cut elimination
This paper aims to give an introduction to Type Logical Grammar (TLG) for the general linguist who may not have a background in logic or proof theory. This paper will briefly describe the logical heritage of TLG, sketch a simple TLG analysis, convey the spirit of the Curry-Howard result, discuss Gentzen's famous cut-elimination result, and motivate multimodal extensions of Lambek's logic.
A short introduction to some of the same topics discussed in Girard's book plus an introduction to Type Logical Grammar for those who are interested in linguistics.
Posted to theory by Ehud Lamm on 10/11/03; 2:59:14 AM
Discuss (2 responses)


Proofs and Types 
(via Frank, in the discussion group)
Proofs and Types by Jean-Yves Girard, translated and with appendices by Yves Lafont and Paul Taylor, 1989.
Based on a short graduate course on typed lambda-calculus given at the Université Paris VII in the autumn term of 1986-7.
A very interesting book that has gone out of print and is now available online.
A taste of the topics discussed in this smorgasbord: the Curry-Howard Isomorphism, Strong Normalisation, System F, Provably Total Functions and more!
Posted to theory by Ehud Lamm on 10/7/03; 11:30:26 AM
Discuss (4 responses)


Self-application as a fixpoint of call/cc 
The article basically shows that call/cc and Y are two sides of the
same coin. The article, however, makes the claims formal and proves them
as a theorem. The way of proving the theorem is interesting too: we
extensively rely on syntax-rules macros. Indeed, one macro does a
CPS transform, and another simple macro, which is actually a
lambda-calculator, normalizes the result of the CPS transform.
The article shows in which way (call/cc call/cc) is equivalent to (lambda (x) (x x)). If you ever felt the (call/cc call/cc) itch, you'll want to check this out.
The article makes clever use of macros. Continuations, fixpoints, CPS, higher-order syntax, syntax-rules macros and lambda-calculators all take part in this Scheme tour de force.
Great fun...
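The fixpoint half of the story can be tasted without Scheme macros. A minimal Python sketch (an illustration of the classical construction, not the article's proof): the same self-application u(u) that appears in (lambda (x) (x x)) drives the call-by-value Y combinator (often called Z):

```python
# Z combinator (call-by-value Y): the self-application u(u) is the engine.
Z = lambda f: (lambda u: u(u))(lambda u: f(lambda v: u(u)(v)))

# Tie the knot: factorial without any explicit recursion.
fact = Z(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
```

The eta-expansion `lambda v: u(u)(v)` is what keeps the self-application from looping under strict evaluation.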
Posted to theory by Ehud Lamm on 10/1/03; 12:50:17 AM
Discuss (1 response)


Iota and Jot: the simplest languages? 
I don't think we discussed these 'esoteric' languages before. Iota is an unambiguous Turing-complete language with just two symbols; Jot uses the techniques behind Iota's construction to provide a particularly elegant Gödel numbering.
Zot, a related language, has some sort of support for input/output. Zot's semantics are based on continuations.
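Iota's whole grammar is exp ::= "i" | "*" exp exp, where i denotes the universal combinator ι with ι f = f S K, so a complete interpreter fits on a screen. A Python sketch (standard S and K encodings as curried functions; my own code, not from the linked pages):

```python
# A minimal interpreter for Iota: exp ::= "i" | "*" exp exp
S = lambda x: lambda y: lambda z: x(z)(y(z))
K = lambda x: lambda y: x
iota = lambda f: f(S)(K)   # the single combinator: i f = f S K

def parse(s, pos=0):
    """Return (value, next_pos) for the Iota expression at s[pos:]."""
    if s[pos] == 'i':
        return iota, pos + 1
    assert s[pos] == '*', "malformed Iota program"
    f, pos = parse(s, pos + 1)
    x, pos = parse(s, pos)
    return f(x), pos

def run(s):
    v, pos = parse(s)
    assert pos == len(s)
    return v
```

Familiar combinators come out as short strings: *ii behaves as the identity I, and *i*i*ii behaves as K.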
Posted to theory by Ehud Lamm on 9/28/03; 3:37:04 AM
Discuss (1 response)


Proof methods for corecursive programs 
Proof methods for corecursive programs. Jeremy Gibbons and Graham Hutton. October 2000.
I was surprised to see that we haven't mentioned this paper before. It is quite useful for learning how to prove properties of functions producing infinite lists etc.
The paper presents four methods: fixpoint induction, the approximation lemma, coinduction, and fusion with unfold.
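The approximation lemma reduces equality of infinite lists to equality of all their finite approximations, and that idea doubles as a practical testing recipe. A small Python sketch (generators standing in for lazy lists; the names are mine), checking map f . iterate f = iterate f . f on finite prefixes:

```python
from itertools import islice

def iterate(f, x):
    """Corecursive stream: x, f(x), f(f(x)), ..."""
    while True:
        yield x
        x = f(x)

def map_stream(f, stream):
    """Lazily apply f to every element of a stream."""
    return (f(x) for x in stream)

def take(n, stream):
    """Finite approximation: the first n elements as a list."""
    return list(islice(stream, n))
```

Checking prefixes is of course a test, not a proof; the paper's four methods are what turn the intuition into one.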
Posted to theory by Ehud Lamm on 8/10/03; 12:56:05 AM
Discuss


Theorems for free! 
Philip Wadler. Theorems for free! 4th International Conference on Functional Programming and Computer Architecture, London, September 1989.
From the type of a polymorphic function we can derive a theorem that it satisfies. Every function of the same type satisfies the same theorem. This provides a free source of useful theorems, courtesy of Reynolds' abstraction theorem for the polymorphic lambda calculus.
I see that this useful paper wasn't mentioned here before, so here goes.
The paper formally proves an important metatheorem for deriving theorems about algebraic properties of functions, based solely on their types!
Example: Suppose a polymorphic function r with type forall a. [a] -> [a]; then r satisfies the following theorem: for all types A and A', and every total function a : A -> A', we have map a . r_A = r_A' . map a.
The intuitive explanation of this result is that r must
work on lists of X for any type X. Since r is provided
with no operations on values of type X all it can do is
rearrange such lists independent of the values contained
in them. Thus applying a to each element of a list and
then rearranging yields the same result as rearranging
and then applying a to each element.
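The theorem is quotable directly as a property check. A Python sketch (list comprehensions standing in for map and composition; my own illustration, not Wadler's notation):

```python
def free_theorem_holds(r, f, xs):
    """For r : forall a. [a] -> [a], the free theorem says
    map f . r  ==  r . map f  for every total f."""
    return [f(x) for x in r(xs)] == r([f(x) for x in xs])
```

Any purely rearranging r passes (reversal, rotation, duplication); a function that inspected its elements would not be parametric, and the check could fail.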
Posted to theory by Ehud Lamm on 7/31/03; 4:24:52 AM
Discuss (5 responses)


Point Free Style 
This style is particularly useful when deriving efficient programs by calculation, but it is good discipline in general. It helps the writer (and reader) think about composing functions (high level), rather than shuffling data (low level).
Condensed from a Haskell mailing list post, this also has a number of interestinglooking references.
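The flavour carries over to any language with first-class functions. A small Python sketch (my own example, not from the post) of moving a definition from pointful to point-free style:

```python
from functools import reduce

def compose(*fns):
    """compose(f, g, h)(x) == f(g(h(x))); compose() is the identity."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns, lambda x: x)

# Pointful:   def count_words(s): return len(s.split())
# Point-free: the data variable disappears; we compose functions instead.
count_words = compose(len, str.split)
```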
Posted to theory by andrew cooke on 6/19/03; 6:38:41 AM
Discuss (7 responses)


A syntactic approach to type soundness 
This tech report shows some of the standard techniques for proving type soundness, and introduces Felleisen's syntactic approach which is based on subject reduction.
This is an important technique, yet it seems this paper wasn't mentioned before on LtU.
Posted to theory by Ehud Lamm on 6/11/03; 3:13:45 AM
Discuss


Mixin Modules and Computational Effects 
D.Ancona, S.Fagorzi, E.Moggi, E.Zucca. Mixin Modules and Computational Effects. ICALP 2003.
We define a calculus for investigating the interactions between mixin modules
and computational effects, by combining the purely functional mixin calculus CMS with a
monadic metalanguage supporting the two separate notions of simplification (local rewrite
rules) and computation (global evaluation able to modify the store). This distinction is
important for smoothly integrating the CMS rules (which are all local) with the rules dealing
with the imperative features.
The paper is a bit technical, but should interest those that study parameterization and inheritance.
Posted to theory by Ehud Lamm on 5/30/03; 11:49:40 AM
Discuss


Description Logics in Data Management 
The fundamental observation underlying [Description Logics] is that there is a benefit to be gained if languages for talking about classes of individuals yield structured objects that can be reasoned with.
This came up in a Haskell discussion about automating licence handling in software. It looks interesting (it's a survey paper) and might expand a little on the RDF post (which I have yet to read).
The code examples are in CLASSIC, the latest version of which is implemented in C++, although they used to use Lisp (which just goes to show that (1) Lisp is dead and (2) there's no bias in this news item).
thanks to Graham Klyne
Posted to theory by andrew cooke on 5/23/03; 6:21:34 AM
Discuss (3 responses)


Putting Failure in Context: ADTs of Computations & Contexts 
Mitchell Wand. Putting Failure in Context: Abstract Data Types of Computations and Contexts. submitted for publication, March 2003.
There are two standard models of backtracking programs: a model in which the many possible answers are represented as a stream, and a model in which the action of the backtracking program is represented by two continuations: a success continuation and a failure continuation. These two models, both intuitively plausible, have never (to our knowledge) been formally related. We show how the two-continuation model can be derived from the stream model. We do this by treating computations and contexts as abstract data types. We start by representing computations as elements in a term algebra. We derive an operational semantics for this term algebra by representing it in an extension of call-by-value PCF. This allows us to build an algebra of contexts, in which the usual representation of contexts as terms with holes is an initial algebra. We then represent these contexts as a final algebra using two observers, one representing the action of the context on success, and one representing its action on failure. Finally, we regain a compositional semantics by representing computations by their action on contexts.
The two approaches for modelling backtracking were mentioned here in the past. Not only are they relevant for logic programming, they are also important for other situations requiring goal-directed evaluation (e.g., Icon).
Note that the paper makes use of quite a few theoretical tools (operational and denotational semantics, algebraic and coalgebraic techniques, monads, etc.), which may make it a bit difficult to read.
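The two models are easy to set side by side in miniature. A Python sketch (my own toy problem and names, nothing like the paper's PCF development): the same search written once as a stream of answers and once with success and failure continuations:

```python
# Stream model: a nondeterministic computation yields a stream of answers.
def pairs_stream(xs, ys):
    for x in xs:
        for y in ys:
            if x + y == 5:
                yield (x, y)

# Two-continuation model: success gets (answer, resume); failure is a thunk.
def pairs_cps(xs, ys, sk, fk):
    def try_xs(i):
        if i == len(xs):
            return fk()
        def try_ys(j):
            if j == len(ys):
                return try_xs(i + 1)      # exhausted ys: backtrack on xs
            x, y = xs[i], ys[j]
            if x + y == 5:
                return sk((x, y), lambda: try_ys(j + 1))
            return try_ys(j + 1)
        return try_ys(0)
    return try_xs(0)

def all_answers(xs, ys):
    out = []
    def sk(ans, resume):
        out.append(ans)
        return resume()                   # always backtrack for more answers
    pairs_cps(xs, ys, sk, lambda: None)
    return out
```

The paper's contribution is deriving the second representation from the first; here the two are merely observed to agree.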
Posted to theory by Ehud Lamm on 4/6/03; 12:39:10 PM
Discuss


J. McCarthy: Towards a Mathematical Science of Computation 
The bulk of the paper introduces denotational semantics of a program
as a function from a state vector (the values of registers and other
memory locations) to a state vector. However the paper does not
mention the phrase "denotational semantics". The paper then argues
that the denotational function can be represented as a composition
or a conditional expression of primitive operations (such as addition,
etc.). The denotational semantics presentation is conventional, but
it is expressed in unconventional language.
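The state-vector view is tiny to mock up: a command denotes a function from state vector to state vector, and programs compose those denotations, with conditionals denoting a choice between denotations. A Python sketch (dictionaries as state vectors; all names are mine):

```python
# A command's meaning: a function from state vector to state vector.
def assign(var, expr):
    return lambda s: {**s, var: expr(s)}

def seq(*meanings):
    """Sequencing is composition of the component meanings."""
    def run(s):
        for m in meanings:
            s = m(s)
        return s
    return run

def cond(test, then_m, else_m):
    """A conditional chooses between two meanings, state by state."""
    return lambda s: then_m(s) if test(s) else else_m(s)

# x := x + 1; if x > 1 then y := x else y := 0
prog = seq(
    assign('x', lambda s: s['x'] + 1),
    cond(lambda s: s['x'] > 1,
         assign('y', lambda s: s['x']),
         assign('y', lambda s: 0)),
)
```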
An interesting quote:
It should be possible almost to eliminate debugging. Debugging is
the testing of a program on cases one hopes are typical, until it
seems to work. This hope is frequently vain. Instead of debugging a
program, one should prove that it meets its specifications, and this
proof should be checked by a computer program. For this to be
possible, formal systems are required in which it is easy to write
proofs. There is a good prospect of doing this, because we can
require the computer to do much more work in checking each step than a
human is willing to do. Therefore, the steps can be bigger than with
present formal systems. The prospects for this are discussed in
[McC62].
The last section, "The Limitations of Machines And Men as
Problem-Solvers", is the best section. Here's its conclusion:
In conclusion, let me reassert that the question of whether there are
limitations in principle of what problems man can make machines solve for
him as compared to his own ability to solve problems, really is a technical
question in recursive function theory.
 Oleg.
Posted to theory by Ehud Lamm on 4/3/03; 3:49:23 AM
Discuss (9 responses)


An abstract view of programming languages 
Eugenio Moggi. An abstract view of programming languages. Tech. Report ECS-LFCS-90-113, Edinburgh Univ., 1989.
The aim of these course notes is to show that notions and ideas from Category
Theory can be useful tools in Computer Science, by illustrating some recent applications to the study of programming languages based on the principle "notions
of computation as monads". The main objective is to propose a unified approach to the denotational semantics of programming languages. The categorytheoretic notions introduced in the course will be presented with specific applications in mind (so that they will not sound too abstract) and guidelines for linking abstract
and concrete concepts will be stressed.
A useful introduction.
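The simplest instance of "notions of computation as monads" is the exception (Maybe) notion. A Python sketch (tagged tuples standing in for the categorical machinery; an illustration only, not Moggi's notation), with unit and bind as the two monad operations:

```python
# The exception notion of computation: a value, or a recorded failure.
def unit(x):
    return ('ok', x)

def bind(m, f):
    """Sequence: feed the value onward, or propagate the failure."""
    return f(m[1]) if m[0] == 'ok' else m

FAIL = ('fail', None)

def safe_div(x, y):
    return FAIL if y == 0 else unit(x / y)

# Sequencing threads the possibility of failure implicitly.
def average_scaled(total, count):
    return bind(safe_div(total, count), lambda a: unit(2 * a))
```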
Posted to theory by Ehud Lamm on 3/25/03; 6:32:14 AM
Discuss


Tarski's fixpoint theorem  history of use 
I'm curious to know the earliest paper that identifies Tarski's theorem
regarding the existence of least/greatest fixpoints of monotonic
functions on a complete lattice ("A Lattice Theoretical Fixpoint
Theorem and its Applications", Pacific J. of Math. 5, 285-309, 1955) as
the basis for termination of the standard dataflow analysis algorithms.
An interesting thread about the use of the KnasterTarski theorems for programming language semantics.
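The connection the thread asks about is concrete: a dataflow analysis is the least fixpoint of a monotone function on a finite lattice, computed by iterating from bottom, and Tarski's theorem is what guarantees the fixpoint exists. A Python sketch (reachability over the powerset lattice as the toy analysis; my own example):

```python
def least_fixpoint(f, bottom):
    """Kleene iteration: bottom, f(bottom), f(f(bottom)), ... until stable.
    Terminates whenever f is monotone and the lattice has no infinite
    ascending chains, e.g. any finite powerset lattice."""
    x = bottom
    while True:
        y = f(x)
        if y == x:
            return x
        x = y

# Dataflow-style example: reachable nodes of a graph as the least fixpoint
# of F(S) = {start} | successors(S) over the powerset lattice.
def reachable(start, edges):
    def step(s):
        return frozenset({start}) | frozenset(v for u, v in edges if u in s)
    return least_fixpoint(step, frozenset())
```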
Posted to theory by Ehud Lamm on 3/18/03; 11:02:26 AM
Discuss


Domain Theory 
S. Abramsky and A. Jung. Domain Theory. Handbook of Logic in Computer Science, Vol. III, Clarendon Press, 1994, pages 1168.
Related reading for the previous item.
Posted to theory by Ehud Lamm on 3/6/03; 7:21:16 AM
Discuss


Domains and Denotational Semantics: History, Open Problems 
M. P. Fiore and A. Jung and E. Moggi and P. O'Hearn
and J. Riecke and G. Rosolini and I. Stark. Domains and Denotational Semantics: History,Accomplishments and Open Problems. Bulletin of EATCS, 1996, vol. 59, 227256
In this collection we try to give an overview of some
selected topics in Domain Theory and Denotational
Semantics. In doing so, we first survey the mathematical
universes which have been used as semantic domains. The
emphasis is on those ordered structures which have been
introduced by Dana Scott in 1969 and which figure under
the name (Scott) domains. After surveying developments in
the concrete theory of domains we describe two newer
developments, the axiomatic and the synthetic approach. In
the second part we look at three computational phenomena
in detail, namely, sequential computation, polymorphism,
and mutable state, and at the challenges that these pose
for a mathematical model.
Long and dense, but surveys a set of important foundational problems.
Posted to theory by Ehud Lamm on 3/6/03; 7:01:05 AM
Discuss


Checking polynomial time complexity with types 
Patrick Baillot. Checking polynomial time complexity with types.
Foundations of Information Technology in the Era of Network and Mobile Computing (proceedings of TCS2002).
Several researchers have proposed programming languages with intrinsic complexity properties, for instance languages ensuring that all functions representable are PTIME without referring to an explicit time measure. This work follows the same approach.
By relegating the handling of the complexity property to the type system, the programming task becomes easier, since the programmer is freed from manually providing information needed for the complexity analysis. All well typed programs are PTIME.
But is the type system decidable? See section 4 for details!
More information can be found in the preprint Type inference for polynomial time complexity via constraints on words.
Posted to theory by Ehud Lamm on 2/20/03; 2:19:03 AM
Discuss (2 responses)


Bertrand Meyer: Proving Program Pointer Properties 
The present discussion proposes a mathematical theory for modeling pointer-rich object structures and proving their properties.
The model has two principal applications:
- The coarse-grained version of the model, considering only the existence or not of a reference between an object and another, gives a basis for discussing overall properties of the object structure, defining as a result the correctness constraints of memory management and especially garbage collection, full or incremental. Mathematically, this model uses a binary relation.
- The fine-grained version, based on functions which together make up the relation of the coarse-grained version, integrates the properties of individual object fields. As a result, it allows proving the correctness of classes describing structures with luxurious pointer foliage, from linked lists and graphs to B-trees and double-ended queues.
Makes for interesting reading, though I am not sure if these papers contain new results.
As you might expect, Meyer uses an Eiffel-like notation, and the preconditions, postconditions and invariants are used extensively...
Thanks Jussi!
Posted to theory by Ehud Lamm on 1/31/03; 3:00:15 AM
Discuss (1 response)


Recursion Theory and Joy 
Joy is a functional programming language which is not based on the application of functions to arguments but on the composition of functions. Many topics from the theory of computability are particularly easy to handle within Joy. They include the parameterisation theorem, the recursion theorem and Rice's theorem. Since programs are data, it is possible to define a Y combinator for recursion and several variants. It follows that there are self-reproducing and self-describing programs in Joy. Practical programs can be written without recursive definitions by using several general purpose recursion combinators which are more intuitive and more efficient than the classical ones.
A short discussion of such cool things as fixed point combinators, Kleene's S-m-n theorem, and Rice's theorem.
Along the way you are introduced to Gödel numbering and self-reproducing programs.
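Self-reproducing programs are not special to Joy: the S-m-n/recursion-theorem recipe works in any language that treats programs as data. The classic Python quine, for taste:

```python
# A self-reproducing program (quine): running `source` as a program
# prints exactly `source` back.
s = 's = %r\nprint(s %% s)'
source = s % s          # the full program text
```

The trick is the same program-as-data move the paper exploits: s is both the data being printed and (via %r) the template that reconstructs the program.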
Posted to theory by Ehud Lamm on 1/13/03; 5:12:15 AM
Discuss (5 responses)


Tony Hoare: Structured concurrent programming 
A nice discussion of the requirements for good programming building blocks (e.g., algebraic properties). Hoare defines three concurrency operators (or program structures): P first Q (run whichever starts first), P both Q (run them concurrently), and P any Q (nondeterministic choice). Hoare briefly talks about the properties of these operators.
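Two of the three operators have rough analogues in ordinary thread libraries. A Python sketch (my own reading of first as a race between two ready processes; an approximation for flavour, not Hoare's algebra):

```python
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait

def first(p, q):
    """P first Q, approximated: run both and keep whichever result
    arrives first."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        done, _ = wait({pool.submit(p), pool.submit(q)},
                       return_when=FIRST_COMPLETED)
        return next(iter(done)).result()

def both(p, q):
    """P both Q: run the two concurrently and deliver both results."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        fp, fq = pool.submit(p), pool.submit(q)
        return fp.result(), fq.result()
```

P any Q, nondeterministic choice, could likewise be sketched as picking one of the two at random before running it.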
Also check out the lecture given a week earlier: Concurrent programs wait faster.
Favorite quote: Now there are large sections of Microsoft code that contain no goto's.
Posted to theory by Ehud Lamm on 1/1/03; 2:09:07 PM
Discuss


A Lambda Calculus for Dynamic Binding 
Dami, Laurent, A Lambda-Calculus for Dynamic Binding, Theoretical Computer Science, 1997
Dynamic binding is a runtime lookup operation which extracts values corresponding to some "names" from some "environments" ... Many situations related with flexible software assembly involve dynamic binding: first-class modules, mobile code, object-oriented message passing. This paper proposes Lambda-N, a compact extension of the lambda-calculus to model dynamic binding, where variables are labelled by names, and where arguments are passed to functions along named channels. The resulting formalism preserves familiar properties of the lambda-calculus, has a Curry-style type inference system, and has a formal notion of compatibility for reasoning about extensible environments...
(via comp.lang.functional)
There's also an implementation of these ideas in the language HOP, and a number of other resources (including Dami's thesis) here.
Posted to theory by Bryn Keller on 12/27/02; 11:03:31 AM
Discuss


Generalising Monads to Arrows 
John Hughes, Generalising Monads to Arrows, in Science of Computer Programming 37, pp. 67-111, May 2000.
... The monad interface has been found to be suitable for many combinator libraries, and is now extensively used. Numerous benefits flow from using a common interface ... It is therefore a matter of some concern when libraries emerge which cannot, for fundamental reasons, use the monad interface ... I sought a generalization of the monad concept that could also offer a standardised interface to libraries [that could not support a monadic interface]. My proposal, which I call arrows, is the subject of this paper. Pleasingly, the arrow interface turned out to be applicable to other kinds of non-monadic library also...
This paper introduces arrows, which are a generalization of monads. I mentioned this before when we were discussing the Fruit GUI toolkit (which uses arrows), but it really deserves a topic of its own. Full of motivating examples and thorough explanations.
Also, there are some other arrow resources:
Posted to theory by Bryn Keller on 12/18/02; 11:30:09 AM
Discuss (2 responses)


Typing Dynamic Typing 
Arthur I. Baars and S. Doaitse Swierstra. Typing Dynamic Typing. Proc. ICFP '02, 2002.
We show how, by a careful use of existentially and universally quantified types, one may achieve the [effect of dynamic typing], without extending the language with new or unsafe features. The techniques explained are universally applicable, provided the core language is expressive enough; this is the case for the common implementations of Haskell. The techniques are used in the description of a type checking compiler that, starting from an expression term, constructs a typed function representing the semantics of that expression. In this function the overhead associated with the type checking is only once being paid for; in this sense we have thus achieved static type checking.
Frank: They construct a calculus of proofs of type equality. A dynamically typed value is a value paired with a representation of its type. It is eliminated by passing a representation of the expected type, and exhibiting a witness proof that shows the expected type is equal to the one stored with the value.
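The construction can be sketched in modern Haskell using a GADT of type representations; the paper itself predates GADTs and builds the equality proofs by hand from universally and existentially quantified types, so treat `Rep`, `Dyn` and `fromDyn` below as an illustrative recast, not the paper's code:

```haskell
{-# LANGUAGE GADTs #-}

-- A dynamic value pairs a value with a representation of its type.
data Rep a where
  IntR  :: Rep Int
  BoolR :: Rep Bool

data Dynamic where
  Dyn :: Rep a -> a -> Dynamic

-- Eliminating a Dynamic: match the expected representation against the
-- stored one; a successful match is the witness that the types coincide.
fromDyn :: Rep a -> Dynamic -> Maybe a
fromDyn IntR  (Dyn IntR  n) = Just n
fromDyn BoolR (Dyn BoolR b) = Just b
fromDyn _     _             = Nothing
```

`fromDyn IntR (Dyn IntR 3)` yields `Just 3`, while asking for the wrong type, as in `fromDyn BoolR (Dyn IntR 3)`, yields `Nothing`, with no unsafe coercion anywhere.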
Posted to theory by Ehud Lamm on 12/18/02; 2:53:59 AM
Discuss


The point of (any) semantics 
Matthias Felleisen:
1. I have done my share of semantics.
2. I am doing my share of systems building.
3. And I really wish that I could say 1 and 2 are related...
A very interesting discussion. Be careful not to miss the continuation of the discussion (with a new title, but this is just thread aliasing.)
Posted to theory by Ehud Lamm on 11/19/02; 5:20:28 AM
Discuss (1 response)


Harper: PL, The Essence of Computer Science 
Robert Harper. Programming Languages: The Essence of Computer Science. University of Washington Distinguished Lecture Series, October, 2002.
Another interesting talk from Robert Harper.
The first ten slides or so set the stage for what's to come, painting a background familiar to LtU readers: CS Is About Programming, Programming Is Linguistic, Languages Abound, It takes decades to go from research to practice, Type theory as the GUT of PL’s.
Harper then goes on to talk about type systems, types in compilation, typed intermediate languages, TAL, type for low and highlevel languages (including verifying invariants) and dependent types.
Packed full with insights.
Posted to theory by Ehud Lamm on 11/15/02; 3:17:18 AM
Discuss (1 response)


Robert Harper: Types and Programming Languages 
Scottfest 2002: Types and Programming Languages. Carnegie Mellon University, October, 2002.
A highly theoretical talk given in honor of Dana Scott. Constructive validity, constructivism in general, and how Dana Scott's work initiated the typetheoretic study of programming languages.
This talk would have made a great paper. The slides are a bit too dense for mortals.
One great quote: Logical type theory is the GUT of PL’s!
Alas, I could only find PPT slides.
Posted to theory by Ehud Lamm on 10/15/02; 5:36:05 AM
Discuss (6 responses)


Types in Programming Languages 
A survey paper that provides basic definitions for type systems and a broad overview of the whole subject. Covers types in denotational and operational semantics, as well as decidable and undecidable type systems (Cayenne and dependent types). Introduces Church's "simply typed lambda calculus" and defines simple types as well as product, sum, and recursive types. The section on polymorphism shows how "the mechanism used to define polymorphic functions in ML is a very simple and elegant extension of simply typed lambda calculus." Covers subtyping and object-oriented programming, constrained polymorphism, and abstract data types. Shows how constrained types are "adequate for expressing the types of overloaded symbols." Briefly overviews different kinds of constrained polymorphism: "subtype-bounded polymorphism", which uses subtype constraints, and an extension of it, "F-bounded polymorphism", which uses functions from types to types.
(Types in Programming Languages, Camarao, Figueiredo, and Pimentel, 1999)
Posted to theory by jon fernquest on 10/1/02; 2:11:24 AM
Discuss (3 responses)


Type Theory and Functional Programming 
Type Theory and Functional Programming. Simon Thompson.
ISBN 0-201-41667-0, Addison-Wesley, 1991.
This is now out of print. I had hoped to prepare a revised version before making it available online, but sadly this hasn't happened. So, you can get the original version, warts and all, from this page.
Another great resource. Includes several nice case studies, as well as the essential mathematical foundations.
Posted to theory by Ehud Lamm on 9/2/02; 12:08:47 PM
Discuss


Do we Need Dependent Types? 
(via the Haskell mailing list)
Daniel Fridlender, Mia Indrika. Do we Need Dependent Types? Appears in Journal of Functional Programming, 10(4):409-415, 2000.
Inspired by Danvy, we describe a technique for defining, within the Hindley-Milner type system, some functions which seem to require a language with dependent types. We illustrate this by giving a general definition of zipWith for which the Haskell library provides a family of functions, each member of the family having a different type and arity. Our technique consists in introducing ad hoc codings for natural numbers which resemble numerals in the lambda calculus.
Simple yet cool solution.
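The essence of the trick can be sketched as follows, with names that are mine rather than the paper's: a numeral distributes a list of partially applied functions over successive argument lists, so each arity gets its own ordinary Hindley-Milner type:

```haskell
-- Numeral-like combinators: (suc (suc zero)) plays the role of 2 and
-- turns zipWithN into the ordinary binary zipWith.
zero :: [a] -> [a]
zero = id

suc :: ([b] -> c) -> [a -> b] -> [a] -> c
suc n fs as = n (zipWith ($) fs as)

zipWithN :: ([a] -> b) -> a -> b
zipWithN n f = n (repeat f)
```

For example, `zipWithN (suc (suc zero)) (+) [1,2,3] [10,20,30]` evaluates to `[11,22,33]`, and the very same `zipWithN` works at every arity.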
Posted to theory by Ehud Lamm on 8/21/02; 2:54:52 AM
Discuss (1 response)


Putting Type Annotations to Work 
Putting Type Annotations to Work. Martin Odersky, Konstantin Läufer,
Conference Record of POPL '96
We study an extension of the HindleyMilner system with explicit type scheme annotations and type declarations. The system can express polymorphic function arguments, userdefined data types with abstract components, and structure types with polymorphic fields...
Heavy stuff, but important if you are interested in things like rank-2 polymorphism etc.
And you should be interested, since all programming is about choosing the appropriate level of polymorphism, and finding the programming technique that lets you achieve it.
Posted to theory by Ehud Lamm on 8/7/02; 8:19:57 AM
Discuss (2 responses)


Writing a counter function 
I'm trying to write a counter function that would return a tuple whose first element is the current value and whose second element is a new counter.
Our own Frank Atanassow offers a solution and gives an interesting discussion (including coalgebras and all that stuff).
(Thanks Oleg!)
Posted to theory by Ehud Lamm on 7/8/02; 5:47:22 AM
Discuss


Elephant 2000 
One stimulus for writing this article was a talk entitled "Programming languages of the year 2000" that I considered insufficiently ambitious.
Link above is to the discussion article posted by John Kozak (I'm just flagging this on the front page), which contains more comments and a link to the paper itself.
Posted to theory by andrew cooke on 6/21/02; 5:48:32 AM
Discuss


proof that (call/cc (lambda (c) (0 (c 1)))) => 1 
(via comp.lang.scheme)
William Clinger: In response to recent interest in the denotational semantics of Scheme, and as an example of its use, I have calculated that the expression
(call-with-current-continuation (lambda (c) (0 (c 1))))
either returns 1 to its continuation or runs out of memory.
Posted to theory by Ehud Lamm on 6/14/02; 8:10:28 AM
Discuss (3 responses)


"System I Experimentation Tool" 
(via comp.lang.functional)
This web application provides a friendly tool for experimenting with System I. System I allows inferring principal typings for pure terms of the lambda-calculus with intersection types for arbitrary finite ranks. It is particularly interesting because it can infer types for terms that cannot be typed in other commonly studied systems, and because, thanks to the principal typing property, type inference is compositional.
More information about this project is available.
More on principal typing.
Posted to theory by Ehud Lamm on 6/9/02; 8:05:19 AM
Discuss


When is a function a fold or an unfold? 
Jeremy Gibbons. When is a function a fold or an unfold? Workshop on Generic Programming, July 2001.
We present necessary and sufficient conditions for when a (partial or total) function can be expressed as a fold or an unfold.
The link above is for the slides. You may want to read the paper.
Consider the following two functions:
> either x y zs = elem x zs || elem y zs
> both x y zs = elem x zs && elem y zs
where
> elem x = foldr ((||) . (x ==)) False
Can either or both be expressed directly as a foldr?
(They would be more efficient that way.) The workshop had many other interesting presentations, by the way.
Posted to theory by Ehud Lamm on 5/28/02; 1:24:20 PM
Discuss


Calculating Functional Programs 
Jeremy Gibbons. Calculating Functional Programs. In Algebraic and Coalgebraic Methods in the Mathematics of Program Construction, Lecture Notes in Computer Science 2297, pp. 148-203, January 2002.
A detailed introduction to the category theory approach. You don't need to know what a CPO is to read this. But you must be willing to learn...
Not for people who hate math.
Posted to theory by Ehud Lamm on 5/2/02; 1:20:49 AM
Discuss (1 response)


Types and Programming Languages 
(via the Types Forum)
Types and Programming Languages by Benjamin C. Pierce. ISBN 0-262-16209-1
This text provides a comprehensive introduction both to type systems in computer science and to the basic theory of programming languages. The approach is pragmatic and operational; each new concept is motivated by programming examples and the more theoretical sections are driven by the needs of implementations.
Posted to theory by Ehud Lamm on 2/24/02; 1:32:51 AM
Discuss (1 response)


The Practice of Type Theory in Programming Languages 
Robert Harper.
Dagstuhl 10th Anniversary Symposium, Saarbruecken, Germany, August, 2000.
A good discussion of types, safety etc.
The focus is on refinement types, typed target languages and certifying compilers.
From the Fox Project (@CMU), which concentrates on the development of language support for building safe, highly composable, and reliable systems. Plenty of interesting stuff available.
The project was mentioned by Bryn Keller on the discussion group.
Posted to theory by Ehud Lamm on 2/16/02; 12:55:31 AM
Discuss


Modeling an Algebraic Stepper 
This paper describes the DrScheme stepper. The major attraction is the detailed presentation of the (semantics preserving) source transformations (based on continuation marks).
Alas, I don't think the code is available. I'd love to have a look...
A better link would be the Northeastern PLT Publications Page, which I am not able to access at the moment.
Posted to theory by Ehud Lamm on 1/31/02; 11:45:00 AM
Discuss (1 response)


Sound Generic Java type inference 
(via The Types Forum)
Martin Odersky: Alan Jeffrey has shown in a previous note to the types mailing list that the original scheme for local type inference used in GJ was unsound. The unsoundness was caused by the treatment of polymorphic methods like Nil(), where the method arguments alone are not sufficient to determine a type parameter instantiation. Prompted by this observation, I have been working on a new scheme for type parameter inference which does not share that problem.
Posted to theory by Ehud Lamm on 1/15/02; 6:41:16 AM
Discuss


Formal semantics for C 
(TYPES forum)
Formalizing the semantics of C is far from trivial. The language has many implementation dependent features, making any model that presupposes their behaviour inadequate.
This discussion contains examples of such features (e.g., out of bounds array access), and links to some brave attempts to provide some formal treatment of C semantics.
Posted to theory by Ehud Lamm on 12/29/01; 11:12:26 AM
Discuss (1 response)


Lectures on the Curry-Howard Isomorphism 
(via Frank's PLT Online)
Morten Heine B. Sørensen and Pawel Urzyczyn. Lectures on the Curry-Howard Isomorphism.
Warning: This set of lecture notes is long!
This is a very good introduction to an important idea: The Curry-Howard Isomorphism between systems of formal logic and computational type systems.
The isomorphism has many aspects, even at the syntactic level: formulas correspond to types, proofs correspond to terms, provability corresponds to inhabitation, proof normalization corresponds to term reduction etc. (from the preface)
The text is fairly self-contained, and starts from an explanation of the Lambda Calculus (LC) and the typed lambda calculus. It then goes on to talk about intuitionistic logic. Only after this preparation is the Curry-Howard isomorphism explored.
The text includes exercises.
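The smallest instance of the isomorphism can be sketched in Haskell, where a polymorphic type reads as a propositional tautology and a total program inhabiting it is the proof (the names proofK and proofS are mine):

```haskell
-- Propositions-as-types in miniature: read (->) as implication and a
-- type variable as an atomic proposition. A total term of the type
-- is a proof of the corresponding formula.
proofK :: a -> b -> a                          -- A implies (B implies A)
proofK x _ = x

proofS :: (a -> b -> c) -> (a -> b) -> a -> c  -- the S axiom of Hilbert systems
proofS f g x = f x (g x)
```

proofK and proofS are exactly the K and S combinators; normalizing a term built from them corresponds to simplifying the associated proof.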
Posted to theory by Ehud Lamm on 12/27/01; 11:27:49 AM
Discuss


Categorical Programming with Abstract Data Types 
Martin Erwig. 7th Int. Conf. on Algebraic Methodology and Software Technology (AMAST'98), LNCS 1548, pp. 406-421, 1998
We show how to define fold operators for abstract data types. The main idea is to represent an ADT by a bialgebra, that is, an algebra/coalgebra pair with a common carrier. A program operating on an ADT is given by a mapping to another ADT. Such a mapping, called metamorphism, is basically a composition of the algebra of the second with the coalgebra of the first ADT. We investigate some properties of metamorphisms, and we show that the metamorphic programming style offers farreaching opportunities for program optimization that cover and even extend those known for algebraic data types.
If constructors are best represented using algebras, and observers are best represented using coalgebras, it is only natural to combine the two into a bialgebra!
This paper explores Metamorphic Programming. Haskell code for doing metamorphic programming is available.
Posted to theory by Ehud Lamm on 12/22/01; 8:41:46 AM
Discuss


The Charity Language 
Charity was mentioned here in the past, but we never got the chance to really discuss it. Following the recent discussion of coalgebra, I thought I'd mention it here again.
For those unfamiliar with the ideas of category theory in general, and the notions of initial and final algebra, I highly recommend the tutorial suggested by Frank: A Tutorial on (Co)Algebras and (Co)Induction by Jacobs and Rutten.
Two points to get the discussion going. Algebras are good for modelling constructors, whereas coalgebras are good for modelling observers. ADTs obviously need both.
The line of work discussed in the Jacobs and Rutten tutorial is helpful for modelling ADTs. Objects in OOP are not the same as ADTs. Are there any works connecting the theoretical foundations of OO and coalgebras?
How well all this connects to Charity, I don't know. I can't even surf their site decently at the moment (server errors etc.). Is there any recent work on Charity? Can anyone share experiences with using it?
Posted to theory by Ehud Lamm on 12/21/01; 8:50:32 AM
Discuss (15 responses)


Generic Java type inference is unsound 
It is a variant of the "classic" problem with polymorphic references in SML, which resulted in the usual array of fixes: notably value polymorphism.
This code compiles, but produces a ClassCastException when executed, even though there are no explicit casts in the program.
Enough said.
Posted to theory by Ehud Lamm on 12/18/01; 3:43:29 AM
Discuss (6 responses)


Foundational ProofCarrying Code 
Andrew W. Appel. Foundational ProofCarrying Code, 16th Annual IEEE Symposium on Logic in Computer Science (LICS '01), June 2001.
Proof-carrying code is a framework for the mechanical verification of safety properties of machine language programs, but the problem arises of quis custodiet ipsos custodes: who will verify the verifier itself? Foundational proof-carrying code is verification from the smallest possible set of axioms, using the simplest possible verifier and the smallest possible runtime system. I will describe many of the mathematical and engineering problems to be solved in the construction of a foundational proof-carrying code system.
Very interesting survey paper. As an added bonus you can see examples of using Twelf (the metalogic framework), and an interesting approach to typing (section 5.2).
Warning: This paper is pretty dense.
Posted to theory by Ehud Lamm on 12/6/01; 3:57:04 AM
Discuss


Formalizing the Safety of Java, the Java Virtual Machine and Java Card 
P. H. Hartel and L. Moreau
Formalizing the Safety of Java, the Java Virtual Machine and Java Card, ACM Computing Surveys, to appear, 2001
We review the existing literature on Java safety, emphasizing formal approaches, and the impact of Java safety on small footprint devices such as smart cards. The conclusion is that while a lot of good work has been done, a more concerted effort is needed to build a coherent set of machine readable formal models of the whole of Java and its implementation. This is a formidable task but we believe it is essential to building trust in Java safety, and thence to achieve ITSEC level 6 or Common Criteria level 7 certification for Java programs.
Compare to this book.
Posted to theory by Ehud Lamm on 11/15/01; 2:07:56 PM
Discuss


HM type inference and imperative language features 
(via comp.lang.functional)
Andrew K. Wright. Simple imperative polymorphism. Lisp and Symbolic Computation, 8(4):343-356, December 1995.
HM type inference is unsound in the presence of imperative language features like references and exceptions.
This paper shows why, and describes a solution. Importantly, this solution has the property that equivalent abstractions have equivalent types, regardless of whether they are implemented functionally or imperatively. This is a desirable property if we want to separate interfaces from implementations, a very important SE goal.
Posted to theory by Ehud Lamm on 11/12/01; 11:52:24 AM
Discuss


Types vs. Sets 
Could someone explain the difference between types & sets? The maths texts I have available (undergrad comp. sci. stuff) state their equivalence explicitly, yet many writings I've looked up on the web differentiate between them.
Interesting discussion in the Types Forum. (follow the thread)
Posted to theory by Ehud Lamm on 11/11/01; 2:17:11 PM
Discuss


Handbook of Automated Reasoning 
This Handbook presents overviews of the fundamental notions, techniques, ideas, and methods developed and used in automated reasoning and its applications, which are used in many areas of computer science, including software and hardware verification, logic and functional programming, formal methods, knowledge representation, deductive databases, and artificial intelligence.
Some highlights from the TOC (links are to the paper abstracts):
Posted to theory by Ehud Lamm on 11/10/01; 11:26:51 AM
Discuss


Semantics with Applications 
The 240-page book is online, for our enjoyment.
The site also includes supplementary material.
The book talks about the three classic approaches to semantics (operational, axiomatic and denotational), and tries to show how they relate to each other.
Posted to theory by Ehud Lamm on 10/31/01; 7:06:19 AM
Discuss (2 responses)


SLAM project at Microsoft Research 
(via HtP)
The SLAM project at Microsoft Research investigates the relationships between Software (Specifications), Languages, Analysis, and Model checking. Our goal is to be able to check that software satisfies critical behavioral properties of the interfaces it uses, and to aid software engineers in designing interfaces and software to ensure reliable and correct functioning. The SLAM project is an effort of the Software Productivity Tools Research group.
Interesting work on automatic verification, including methods for applying model checking to software.
The site includes some recent papers and presentations. The presentation contains a good summary of the various approaches to software verification.
Posted to theory by Ehud Lamm on 9/26/01; 2:50:05 AM
Discuss


System CT 
System CT extends ML-like type inference to support overloading and polymorphic recursion. The simplicity of core-ML type inference is maintained, with no exception or special constructs included for coping with overloading. It uses a new decidable type inference algorithm for typing polymorphic recursion, extending the set of typeable terms of core-ML.
This work tries to deal with some classic restrictions (e.g., the monomorphism restriction). The site includes links to papers.
A Haskell implementation is provided.
Posted to theory by Ehud Lamm on 8/27/01; 5:41:28 AM
Discuss


VMlambda: A Functional Calculus for Scientific Discovery 
VML is a programming language proposed by discovery scientists for the purpose of assisting the process of knowledge discovery. It is a nontrivial extension of ML with _hypothetical views_. Operationally, a hypothetical view is a value with a representation that indicates how the value was created. The notion of hypothetical views has already been successful in the domain of genome analysis, and known to be useful in the process of knowledge discovery. However, VML as a programming language was only informally defined in English prose, and indeed found problematic both in theory and in practice. Thus, a proper definition and implementation of VML with formal foundations would be of great help to discovery science and hence corresponding domain sciences.
...We also present a real implementation of VMlambda, written in Camlp4 as a conservative translator into OCaml. This implementation makes extensive use of labeled arguments and polymorphic variants, two advanced features of OCaml that originate in OLabl.
Posted to theory by Ehud Lamm on 8/14/01; 2:34:24 PM
Discuss


Sequentiality and the Pi-Calculus 
We present a type discipline for the pi-calculus which precisely captures the notion of sequential functional computation as a specific class of name passing interactive behaviour. The typed calculus allows direct interpretation of both call-by-name and call-by-value sequential functions. The precision of the representation is demonstrated by way of a fully abstract encoding of PCF. The result shows how a typed pi-calculus can be used as a descriptive tool for a significant class of programming languages without losing the latter's semantic properties. Close correspondence with game semantics and process-theoretic reasoning techniques are together used to establish full abstraction.
Posted to theory by Ehud Lamm on 8/14/01; 12:34:02 AM
Discuss


Constrained Genericity 
One of the nice things about genericity in Ada is that it is by its very nature constrained. When you define generic parameters you must, in effect, tell the compiler and the human code reader what is required of the types passed as generic parameters. This is great for readability, but is also useful, naturally, for compile time checking and optimization.
This paper discusses the notion of constrained genericity in more detail.
Most object-oriented languages don't support constrained genericity directly. Some don't support it at all.
Consider a sort<T> routine in C++. T must support a comparison routine <. However, this is unclear from the routine's generic signature. The compiler may deduce, for example, that T must derive from a Comparable type, but this realization can only be achieved by compiling the routine's implementation.
In Ada, the compiler can compile the routine's declaration (spec), and then, independently of the routine's implementation, check the correctness of instantiations of the routine (i.e., that T has the required comparison function).
It seems to me, but perhaps I am mistaken, that the disciplined approach to constrained genericity isn't popular these days. The C++ approach is problematic, and it seems the proposals for adding genericity to Java and C# follow the same path.
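For comparison, Haskell's type classes, one of the disciplined alternatives, put the constraint in the routine's signature itself, so each instantiation can be checked against the declaration alone (a minimal sketch built on the standard library's insert):

```haskell
import Data.List (insert)

-- The "Ord a =>" constraint plays the role of Ada's generic formal
-- parameters: every use of insertionSort must supply a type with an
-- ordering, and the compiler checks this independently of the body.
insertionSort :: Ord a => [a] -> [a]
insertionSort = foldr insert []
```

A call like `insertionSort [f, g]` on functions is rejected at the call site, from the signature alone, rather than from errors deep inside the implementation.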
Posted to theory by Ehud Lamm on 6/30/01; 10:29:27 PM
Discuss (1 response)


Java is not typesafe 
Quote:
Let A and A' be two different classfiles defining a Java class with the same fully qualified name (FQN) N. In a running Java Virtual Machine (JVM) J, let A be loaded by a class loader L (possibly the "null" loader) producing a Class object C and A' by a class loader L' producing a Class object C'. Let v' be a variable of "type" (we will have more to say later about what is a "type" in Java) N in a class D loaded in L'.
Proposition: Any instance t of C can be stored in v'.
Proposition: J will (attempt to) execute any operation defined in A' on t. J will (attempt to) read/write any field defined in A' as if it existed in t.
end quote
This is not a new paper, but the result is worth looking into. Also, see this.
Posted to theory by Ehud Lamm on 6/26/01; 3:36:43 AM
Discuss


On the Unusual Effectiveness of Logic in Computer Science 
This very interesting paper discusses various relations between CS and logic. Some examples come from PL research (propositions-as-types, semantics etc.); the other examples, though less related to LtU, are also quite interesting (e.g., reasoning about knowledge, automatic verification).
Posted to theory by Ehud Lamm on 6/19/01; 4:05:59 AM
Discuss


Categorical Logic and Type Theory 
This book gives a survey of categorical logic and type theory starting from the unifying concept of a fibration. Its intended audience consists of logicians, type theorists, category theorists and (theoretical) computer scientists. To get an impression, the prospectus from the book is made available.
Quite theoretical, and in depth (see the ToC on the web page), but this stuff is important. A very warm review can be read here.
A special 25% discount offer, available until July 31, was offered on various mailing lists. It may be worthwhile looking for it, since the book is expensive.
Posted to theory by Ehud Lamm on 6/14/01; 1:36:45 PM
Discuss


Type-Based Analysis and Applications 
Type-based analysis is an approach to static analysis of programs that has been studied for more than a decade. A type-based analysis assumes that the program type checks, and the analysis takes advantage of that. This website is a resource for researchers of type-based analysis. It contains links to papers in the area, including a short survey paper. The survey paper examines the state of the art of type-based analysis, and it surveys some of the many software tools that use type-based analysis.
The page lists quite a few links, all pointing to interesting papers. The survey paper itself provides a good introduction to the field.
Posted to theory by Ehud Lamm on 6/10/01; 8:05:53 AM
Discuss


Templates (C++, FPLs) 
There's an interesting discussion going on in the Haskell mailing list. Despite the source, the focus is on C++ templates (rather than Lisp macros). You'll have to click around; some selected posts to get you started:
One subdiscussion (including posts not yet archived; the archive lags a day behind) is how to add something similar to Haskell.
There was also a reference to macros in Clean  anyone have more info?
We last discussed macros here.
Posted to theory by andrew cooke on 5/24/01; 12:35:13 AM
Discuss


Comprehending Monads 
Haskell uses Monads to (amongst other things) fit imperative IO into a functional framework. It's an interesting approach, but one that has a reputation for being difficult to fully understand (there's an argument that you don't need to understand monads to use Haskell, but I think it probably helps).
When someone asked for help with monads on the Haskell mailing list, this paper was mentioned several times. I'm still working my way through it, but it's already cleared up some misunderstandings I had.
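A tiny taste of the paper's point, that one comprehension-like notation covers many effects: the same generic code below runs in any monad, giving nondeterminism for lists and failure propagation for Maybe (the function name is mine):

```haskell
-- All pairs of draws from two computations, written once for any monad.
pairsM :: Monad m => m Int -> m Int -> m (Int, Int)
pairsM mx my = do
  x <- mx
  y <- my
  return (x, y)
```

`pairsM [1,2] [3,4]` yields all four pairs, while `pairsM (Just 1) Nothing` is `Nothing`: a failure anywhere aborts the whole computation.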
Posted to theory by andrew cooke on 5/20/01; 10:05:21 AM
Discuss


Adaptive Software 
Norvig (again), explaining what Adaptive Software is. We've covered this topic before, but I don't think I've seen a simple explanation like this.
In the second MIT talk someone characterised Adaptive programming as using "comeFrom" rather than "goto"  but I'm not sure if that's a help or not.
Thanks to Play With the Machine
Posted to theory by andrew cooke on 5/15/01; 8:16:59 AM
Discuss


Type Theory for Certified Code 
We present the type theory LTT, intended to form a basis for typed target languages, providing an internal notion of logical proposition and proof. The inclusion of explicit proofs allows the type system to guarantee properties that would otherwise be incompatible with decidable type checking. LTT also provides linear facilities for tracking ephemeral properties that hold only for certain program states.
Our type theory allows for reuse of type-checking software by casting a variety of type systems within a single language. We provide additional reuse with a framework for modular development of operational semantics. This framework allows independent type systems and their operational semantics to be joined together, automatically inheriting the type safety properties of those individual systems.
Posted to theory by Ehud Lamm on 5/5/01; 4:43:36 AM
Discuss


FunDeps (Type classes, databases and more) 
A wellwritten paper that explains some interesting ideas.
Type classes in Haskell allow programmers to define functions that can be used on a set of different types, with potentially different implementations in each case. [...] A commonly requested extension [...] has many potentially useful applications. [...] A particular novelty of this paper is the application of ideas from the theory of relational databases to the design of type systems.
I think you could understand and enjoy (much of) this even if you don't know Haskell. Apart from the technical ideas it includes a survey of other approaches that illustrates the kind of thing computer scientists spend their time thinking about and how current type systems are progressing. A great paper!
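The paper's central idea, that a dependency between class parameters resolves ambiguity much like a functional dependency between database columns, can be sketched in Haskell (the Collection class is the standard illustrative example; the member names are mine):

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies, FlexibleInstances #-}

-- "c -> e" declares that the collection type determines its element
-- type, exactly like a functional dependency in a relational schema.
-- Without it, many uses of these methods would be ambiguous.
class Collection c e | c -> e where
  empty  :: c
  insert :: e -> c -> c
  toList :: c -> [e]

instance Collection [a] a where
  empty  = []
  insert = (:)
  toList = id
```

Given a collection type such as `[Int]`, the dependency lets the type checker conclude the element type is `Int` with no further annotation.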
Posted to theory by andrew cooke on 4/16/01; 1:47:39 AM
Discuss (1 response)


Programming with pictures 
This paper [pdf] [ps] (link above is to the abstract) shows how pictures and text can be combined.
This came up in c.l.functional a while back when someone pointed out that the balance function in Okasaki's red/black code wasn't very readable. Compare that with figure 4 in the paper above.
Posted to theory by andrew cooke on 3/30/01; 12:20:42 AM
Discuss (1 response)


Sharing Code through Firstclass Environments 
While we are discussing distributed programming, in the context of EJBs, I thought I'd point to this interesting suggestion.
Basically the idea is to allow reifying the environment. After doing so, you can also import environments.
This approach is, evidently, closely related to reflection.
The paper uses Scheme.
There's a related discussion on comp.lang.scheme, under the title why isn't the environment explicit?, but I can't get to it via Google.
Posted to theory by Ehud Lamm on 3/26/01; 1:24:35 PM
Discuss (8 responses)


Metamorphic Programming 
One of the few things I have learnt so far from Algebra of Programming is how general the fold function is (it captures a standard pattern of recursion over a list).
Metamorphic programming is a way of extending this to other data structures via an intermediate representation.
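As a reminder of how much ground the list fold already covers before any generalisation to other structures, each of these familiar functions is a single foldr (a standard exercise, not code from the book):

```haskell
-- foldr op z replaces every (:) in a list with op and the final [] with z,
-- so a fold fixes the shape of the recursion once and for all.
sum' :: [Integer] -> Integer
sum' = foldr (+) 0

length' :: [a] -> Int
length' = foldr (\_ n -> n + 1) 0

map' :: (a -> b) -> [a] -> [b]
map' f = foldr (\x ys -> f x : ys) []
```

Metamorphisms then play the same role for ADTs that these folds play for lists.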
Posted to theory by andrew cooke on 3/25/01; 12:22:17 AM
Discuss (1 response)


