Type Theory

Automatic type inference via partial evaluation

Automatic type inference via partial evaluation. Aaron Tomb, Cormac Flanagan. PPDP’05.

Type checking and type inference are fundamentally similar problems. However, the algorithms for performing the two operations, on the same type system, often differ significantly. The type checker is typically a straightforward encoding of the original type rules. For many systems, type inference is performed using a two-phase, constraint-based algorithm. We present an approach that, given the original type rules written as clauses in a logic programming language, automatically generates an efficient, two-phase, constraint-based type inference algorithm. Our approach works by partially evaluating the type checking rules with respect to the target program to yield a set of constraints suitable for input to an external constraint solver. This approach avoids the need to manually develop and verify a separate type inference algorithm, and is ideal for experimentation with and rapid prototyping of novel type systems.

Also somewhat relevant to the discussions here about type checking as abstract interpretation.
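
To make that two-phase shape concrete, here is a minimal Haskell sketch of phase one: constraint generation for the simply typed lambda calculus. This is only an illustration of the general recipe, not the paper's partial-evaluation machinery, and all names (Term, Ty, constraints) are invented for the sketch. The emitted equalities would then go to a unification-based solver as phase two.

import Control.Monad.State (State, get, put, runState)

data Term = Var String | Lam String Term | App Term Term
data Ty   = TyVar Int | Arrow Ty Ty deriving Show
type Constraint = (Ty, Ty)   -- an equality the solver must satisfy

fresh :: State Int Ty
fresh = do { n <- get; put (n + 1); return (TyVar n) }

-- Phase one: walk the term, emitting constraints that mirror the
-- declarative typing rules.
constraints :: [(String, Ty)] -> Term -> State Int (Ty, [Constraint])
constraints env (Var x) =
  case lookup x env of
    Just t  -> return (t, [])
    Nothing -> error ("unbound variable: " ++ x)
constraints env (Lam x body) = do
  a <- fresh
  (b, cs) <- constraints ((x, a) : env) body
  return (Arrow a b, cs)
constraints env (App f e) = do
  (tf, cs1) <- constraints env f
  (te, cs2) <- constraints env e
  r <- fresh
  return (r, (tf, Arrow te r) : cs1 ++ cs2)

-- Phase two (not shown) unifies the equalities collected by, e.g.,
-- runState (constraints [] term) 0.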

A Typed, Compositional Logic for a Stack-Based Abstract Machine

A Typed, Compositional Logic for a Stack-Based Abstract Machine. Nick Benton. MSR-TR-2005-84. June 2005.

We define a compositional program logic in the style of Floyd and Hoare for a simple, typed, stack-based abstract machine with unstructured control flow, global variables and mutually recursive procedure calls. Notable features of the logic include a careful treatment of auxiliary variables and quantification and the use of substructural typing to permit local, modular reasoning about program fragments. Semantic soundness is established using an interpretation of types and assertions defined by orthogonality with respect to sets of contexts.

Also related to PCC, TAL, etc.
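
To fix intuitions, here is a toy stack machine in Haskell of roughly the shape the paper's logic targets, though this sketch handles only straight-line code and omits the unstructured control flow and procedures that make the paper interesting; all names are invented.

data Instr = Push Int | Add | Store String | Load String

type Stack   = [Int]
type Globals = [(String, Int)]

-- One step of the machine: operate on the stack and the global variables.
step :: Instr -> (Stack, Globals) -> (Stack, Globals)
step (Push n)  (s, g)         = (n : s, g)
step Add       (x : y : s, g) = ((x + y) : s, g)
step Add       (_, _)         = error "stack underflow"
step (Store v) (x : s, g)     = (s, (v, x) : g)
step (Store _) (_, _)         = error "stack underflow"
step (Load v)  (s, g)         = (maybe (error "unbound global") id (lookup v g) : s, g)

-- Run a straight-line program to completion.
run :: [Instr] -> (Stack, Globals) -> (Stack, Globals)
run code st = foldl (flip step) st code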

A Formulae-as-Types Interpretation of Subtractive Logic

A Formulae-as-Types Interpretation of Subtractive Logic. Tristan Crolard.

We present a formulae-as-types interpretation of Subtractive Logic (i.e. bi-intuitionistic logic). This presentation is two-fold: we first define a very natural restriction of the lambda-μ-calculus which is closed under reduction and whose type system is a constructive restriction of the Classical Natural Deduction. Then we extend this deduction system conservatively to Subtractive Logic. From a computational standpoint, the resulting calculus provides a type system for first-class coroutines (a restricted form of first-class continuations).

Yet another connection between subtractive logic and control. I remember the author being mentioned on LtU, but I cannot find any citations.

Generics are a mistake?

Generics Considered Harmful

Ken Arnold, "programmer and author who helped create Jini, JavaSpaces, Curses, and Rogue", writes that the usefulness of generics is outweighed by their complexity. Ken is talking about Java 5, but such critiques are well-known for C++, and C# is not immune either. Ken describes the Java case as follows:

So, I don't know how to ease into this gently. So I'll just spit it out.

Generics are a mistake.

This is not a problem based on technical disagreements. It's a fundamental language design problem.

Any feature added to any system has to pass a basic test: If it adds complexity, is the benefit worth the cost? The more obscure or minor the benefit, the less complexity it's worth. Sometimes this is referred to with the name "complexity budget". A design should have a complexity budget to keep its overall complexity under control.

[...]

Which brings up the problem that I always cite for C++: I call it the "Nth order exception to the exception rule." It sounds like this: "You can do x, except in case y, unless y does z, in which case you can if ..."

Humans can't track this stuff. They are always losing track of which exception to what other exception applies (or doesn't) in any given case.

[...]

Without [an explicit complexity] budget, it feels like the JSR process ran far ahead, without a step back to ask “Is this feature really necessary?” It seemed to just be understood that it was necessary.

It was understood wrong.

The article contains a few simple supporting examples, including the interesting definition of Java 5's Enum type as:

Enum<T extends Enum<T>>

...which "we're assured by the type theorists ... we should simply not think about too much, for which we are grateful."

If we accept the article's premise, here's a question with an LtU spin: do the more elegant, tractable polymorphic type systems with inference, as found in functional languages, improve on this situation enough to be a viable alternative that could address these complexity problems? In other words, are these problems a selling point for better type systems, or another barrier to adoption?
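
As one small data point for that question: the bounded, comparable enumeration that needs Java's self-referential bound falls out of a single deriving clause under Hindley-Milner-style typing. A Haskell sketch (Color is an invented example):

-- Comparable, enumerable, and bounded, with no recursive bound in sight.
data Color = Red | Green | Blue
  deriving (Show, Eq, Ord, Enum, Bounded)

allColors :: [Color]
allColors = [minBound .. maxBound]   -- [Red, Green, Blue]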

[Thanks to Perry Metzger for the pointer.]

TypeCase: A Design Pattern for Type-Indexed Functions

Bruno C. d. S. Oliveira and Jeremy Gibbons. TypeCase: A Design Pattern for Type-Indexed Functions. Submitted for publication, June 2005.

A type-indexed function is a function that is defined for each member of some family of types. Haskell's type class mechanism provides open type-indexed functions, in which the indexing family can be extended at any time by defining a new type class instance. The purpose of this paper is to present TypeCase: a design pattern that allows the definition of closed type-indexed functions. It is inspired by Cheney and Hinze's work on lightweight approaches to generic programming. We generalise their techniques as a design pattern. Furthermore, we show that type-indexed functions with type-indexed types, and consequently generic functions with generic types, can also be encoded in a lightweight manner, thereby overcoming one of the main limitations of the lightweight approaches.

A new paper in a field we follow quite closely (i.e., generic programming).

The paper starts with a useful summary of important previous results, which is worth reading even if you don't plan on studying the whole paper.
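
To give a flavour of the idea, here is a small sketch of a closed type-indexed function in the lightweight spirit the paper generalises, using a GADT of type representations rather than the paper's class-based encodings; Rep and pretty are invented names.

{-# LANGUAGE GADTs #-}
import Data.List (intercalate)

-- A closed family of type representations: extending the family means
-- editing this datatype, in contrast to an open type class.
data Rep a where
  RInt  :: Rep Int
  RBool :: Rep Bool
  RList :: Rep a -> Rep [a]

-- A type-indexed pretty-printer, defined by "typecase" on the representation.
pretty :: Rep a -> a -> String
pretty RInt      n  = show n
pretty RBool     b  = show b
pretty (RList r) xs = "[" ++ intercalate "," (map (pretty r) xs) ++ "]"

-- pretty (RList RInt) [1, 2, 3] evaluates to "[1,2,3]"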

From shift and reset to polarized linear logic

By now, shift/reset should be as popular as call/cc was ten years ago. Some think these control operators are even more important in practice than call/cc, and should be directly supported by PLs. I believe this paper by Chung-chieh Shan will be interesting to many who love logic and the Curry-Howard isomorphism.

From shift and reset to polarized linear logic

Abstract:

Griffin pointed out that, just as the pure lambda-calculus corresponds to intuitionistic logic, a lambda-calculus with first-class continuations corresponds to classical logic. We study how first-class delimited continuations, in the form of Danvy and Filinski’s shift and reset operators, can also be logically interpreted. First, we refine Danvy and Filinski’s type system for shift and reset to distinguish between pure and impure functions. This refinement not only paves the way for answer type polymorphism, which makes more terms typable, but also helps us invert the continuation-passing-style (CPS) transform: any pure lambda-term with an appropriate type is beta-eta-equivalent to the CPS transform of some shift-reset expression. We conclude that the lambda-calculus with shift and reset and the pure lambda-calculus have the same logical interpretation, namely good old intuitionistic logic. Second, we mix delimited continuations with undelimited ones. Informed by the preceding conclusion, we translate the lambda-calculus with shift and reset into a polarized variant of linear logic that integrates classical and intuitionistic reasoning. Extending previous work on the lambda-µ-calculus, this unifying intermediate language expresses computations with and without control effects, on delimited and undelimited continuations, in call-by-value and call-by-name settings.
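
For readers who haven't played with the operators themselves, here is a tiny Haskell taste using the Cont monad from the transformers library: reset delimits an evaluation context, and shift captures that context as an ordinary function k.

import Control.Monad.Trans.Cont (runCont, reset, shift)

-- k is the delimited context (1 +), so k (k 10) = 1 + (1 + 10) = 12.
twelve :: Int
twelve = runCont (reset (fmap (1 +) (shift (\k -> return (k (k 10)))))) id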

The Essence of Data Access in Cw

The Essence of Data Access in Cw: The power is in the dot! Gavin Bierman, Erik Meijer, and Wolfram Schulte.

In this paper we concentrate on the data dimension. Our aim is to describe the essence of these extensions; by which we mean we identify, exemplify and formalize their essential features. Our tool is a small core language FCw, which is a valid subset of the full Cw language... we are able to formalize the type system and operational semantics of the data access fragments of Cw.

If you have been following the discussions here of Cw, you already know about the language features discussed here, since the paper doesn't introduce new features. If you haven't seen Cw, section 2 is a short and readable introduction.

The rest of the paper is more formal, and unless you need to prove formal results regarding Cw, might not be all that interesting. It won't hurt to keep in mind that it exists, since some of us may need something like FCw at one point or another.
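
For a rough sense of the subtitle: the generalized "dot" of Cw lifts member access pointwise over streams, which in functional terms is essentially a map. A loose Haskell rendering, with Employee and names invented for illustration:

data Employee = Employee { name :: String, salary :: Int }

-- Roughly what Cw's dot gives you on a stream of employees:
-- in Cw, approximately employees.Name.
names :: [Employee] -> [String]
names = map name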

Generics: The Importance of Wildcards

Martin Bravenboer writes about generic wildcards in Java, and concludes that it is unfortunate that C# will not support wildcards or a similar mechanism.

Eric Gunnerson from Microsoft replies.

I was originally a type-erasure fan, but these days I am not so sure. I hope this turns into a fruitful discussion that helps me decide...

P.S.

The Java paper was mentioned on LtU before.

A type discipline for authorization policies

A type discipline for authorization policies. Cédric Fournet, Andrew D. Gordon, and Sergio Maffeis.

Distributed systems and applications are often expected to enforce high-level authorization policies. To this end, the code for these systems relies on lower-level security mechanisms such as, for instance, digital signatures, local ACLs, and encrypted communications. In principle, authorization specifications can be separated from code and carefully audited. Logic programs, in particular, can express policies in a simple, abstract manner. For a given authorization policy, we consider the problem of checking whether a cryptographic implementation complies with the policy. We formalize authorization policies by embedding logical predicates and queries within a spi-calculus. This embedding is new, simple, and general; it allows us to treat logic programs as specifications of code using secure channels, cryptography, or a combination. Moreover, we propose a new dependent type system for verifying such implementations against their policies. Using Datalog as an authorization logic, we show how to type several examples using policies and present a general schema for compiling policies.
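
As a toy illustration of the opening claim that logic programs can express policies simply, here is a hard-coded Datalog-flavoured check in Haskell; the predicates and rules are invented, not taken from the paper.

-- Facts, plus a query compiled by hand from two Datalog rules:
--   canRead(U,F) :- owns(U,F).
--   canRead(U,F) :- delegates(O,U), owns(O,F).
type Fact = (String, [String])

facts :: [Fact]
facts =
  [ ("owns",      ["alice", "report.pdf"])
  , ("delegates", ["alice", "bob"]) ]

canRead :: String -> String -> Bool
canRead u f =
     ("owns", [u, f]) `elem` facts
  || or [ ("owns", [o, f]) `elem` facts
        | ("delegates", [o, u']) <- facts, u' == u ]

-- canRead "bob" "report.pdf" is True, via the delegation rule.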

I guess it's dependent types day around here...

Why Dependent Types Matter

Why Dependent Types Matter. Thorsten Altenkirch, Conor McBride, and James McKinna.

We exhibit the rationale behind the design of Epigram, a dependently typed programming language and interactive program development system, using refinements of a well-known program, merge sort, as a running example. We discuss the relationship to other proposals to introduce aspects of dependent types into functional programming languages and sketch some topics for further work in this area.

Epigram, dependent types, general recursion, indexed datatypes - it's all here!
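
Those indexed datatypes can be approximated even in GHC Haskell, which gives a taste of what Epigram supports natively; this is only a sketch of the flavour, not Epigram code. The length index turns "appending adds lengths" into a fact the type checker verifies:

{-# LANGUAGE DataKinds, GADTs, TypeFamilies #-}

data Nat = Z | S Nat   -- type-level naturals via DataKinds

data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

type family Add (n :: Nat) (m :: Nat) :: Nat where
  Add 'Z     m = m
  Add ('S n) m = 'S (Add n m)

-- The result type records that the output length is the sum of the inputs'.
append :: Vec n a -> Vec m a -> Vec (Add n m) a
append VNil         ys = ys
append (VCons x xs) ys = VCons x (append xs ys)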

Enjoy.
