Fω^C: a symmetrically classical variant of System Fω

Lengrand & Miquel (2008). Classical Fω, orthogonality and symmetric candidates. Annals of Pure and Applied Logic 153:3-20.

We present a version of system Fω, called Fω^C, in which the layer of type
constructors is essentially the traditional one of Fω, whereas provability
of types is classical. The proof-term calculus accounting for the classical
reasoning is a variant of Barbanera and Berardi’s symmetric λ-calculus.
We prove that the whole calculus is strongly normalising. For the
layer of type constructors, we use Tait and Girard’s reducibility method
combined with orthogonality techniques. For the (classical) layer of terms,
we use Barbanera and Berardi’s method based on a symmetric notion of
reducibility candidate. We prove that orthogonality does not capture the
fixpoint construction of symmetric candidates.

We establish the consistency of Fω^C, and relate the calculus to the
traditional system Fω, also when the latter is extended with axioms for
classical logic.


Significance?

I am quite ready to believe that this is interesting, but I think it makes sense for the OP to justify a bit more why they thought this paper is significant for PLs. The most pressing question being: if a language were based on this new calculus, what would the advantages be?

OP? That'd be me...

Right: I see that it might not be obvious. This is not a programming language theory paper, and one should not be thinking along the lines of "what would a Haskell look like that included this typed lambda calculus?" Rather, this is a significant recent contribution to the so-called classical Curry-Howard correspondence. I think I can close the justification gap by providing an annotated bibliography of the most significant work since Curien and Herbelin's The Duality of Computation, but I haven't time to do that right now. I can make a few comments here, though:

  1. I don't accept that either this paper, or Michel Parigot's strangely neglected 1997 article Strong Normalization of Second Order Symmetric lambda-Calculus, provides a Curry-Howard correspondence, since they are non-confluent calculi (see the sketch after this list), and the Curry-Howard correspondence depends upon Prawitz's thesis, that the value of a proof is its normal form;
  2. However, among other reasons, these calculi are significant in that they give rise to many interesting Curry-Howard calculi by restricting their rewrite relation. Going from symmetric calculi to asymmetric calculi in this way is an active area of new, interesting work, and is the technology behind Wadler's Call-by-value is dual to call-by-name;
  3. And in general, these results are tricky: when you take an intuitionistic impredicative lambda calculus and add constructions and rewrite rules to make it classical, you necessarily obtain a calculus for which strong normalisation is harder to prove than for the original, and if you make that calculus symmetric, it becomes harder still. These are "mother" SN results, from which all kinds of other useful SN results follow by embedding translations.
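
To make the non-confluence in point 1 concrete, here is the standard critical pair, written in the λ̄μμ̃ syntax of Curien and Herbelin rather than in Barbanera and Berardi's notation (a sketch for orientation, not taken from the paper under discussion): a command in which a μ-abstraction faces a μ̃-abstraction can fire on either side,

    \langle \mu\alpha.c_1 \mid \tilde{\mu}x.c_2 \rangle \to_{\mu} c_1[\tilde{\mu}x.c_2/\alpha]
    \langle \mu\alpha.c_1 \mid \tilde{\mu}x.c_2 \rangle \to_{\tilde{\mu}} c_2[\mu\alpha.c_1/x]

and when α does not occur in c₁ and x does not occur in c₂, the two reducts are the arbitrary, unrelated commands c₁ and c₂. Restricting the rewrite relation so that μ takes priority yields call-by-value, and so that μ̃ takes priority yields call-by-name; this is the restriction move of point 2 and the engine of Wadler's duality paper.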

To summarise, this is a theory paper that does not directly address PL design, but does increase the resources available to PL designers who want good type theories for their languages.

Postscript: David and Nour's 2005 article Why the usual candidates of reducibility do not work for the symmetric λμ-calculus helps in understanding the difficulties of proving SN for such symmetric classical calculi; section 3.4 of my DPhil gives a different, earlier account: my calculus is not symmetric, but has some mu-bar-like rules for natural-number induction, and so runs into the same difficulties.

Yeah, but what is the result?

As I understand it, this is an SN result for a new type theory which is some kind of hybrid between an intuitionistic and a classical type theory.

Why did they take that route, and what does it buy us? Or maybe even, for those of us who are new to this: why is an SN result interesting?

Sorry, missed marco's question

marco asked: Why is an SN result interesting? I missed the question at the time...

Essentially, a term rewriting system needs CR if we are to consider it a sane candidate as the basis for a functional programming language. SN is not necessary for the language as a whole, but one can identify subsets, kernels if you like, of the rewriting system that provide the basis of the administrative core of the programming language: things like definitions, non-recursive control structures, non-recursive function applications, and so on; and it is for these kernels that SN is the property you want.

If you want to use some extension of the lambda calculus as the foundation for some FPL with first-class continuations, then first you need to identify a reduction strategy —or at least some constraint on possible rewrites— in the FPL that corresponds to a CR subsystem of your extended lambda calculus; and this subsystem needs to support everything in the FPL that deals with first-class continuations, and it needs to be SN.
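
To make that tangible, here is a throwaway Haskell sketch of my own (not drawn from any of the papers here), using a raised error as a degenerate stand-in for a control effect; in a calculus where abort discards its surrounding context by a rewrite rule, the term below genuinely has two distinct reducts, and the evaluation strategy decides which one you observe:

    module Main where

    -- Under call-by-name the erroneous argument is discarded unevaluated,
    -- so this reduces to 1:
    cbn :: Int
    cbn = const 1 (error "boom")

    -- Forcing the argument first (a call-by-value discipline) commits to
    -- the other reduct and aborts with "boom":
    cbv :: Int
    cbv = let x = error "boom" in x `seq` const 1 x

    main :: IO ()
    main = print cbn   -- prints 1; `print cbv` would instead abort

An abort only throws its context away; first-class continuations can also resume a captured context, but the moral is the same: once such effects are in the calculus, different rewrite orders give observably different answers, so one must pin down a CR subsystem before one has a language.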

Is that intelligible? Granting all that, it should follow that the CR subsystems of Lengrand & Miquel's calculus are very interesting to those who follow the "classical formulae-as-types" literature.

Classical Logic and Computation '06

Noam Zeilberger wrote to remind me that APAL #153 was a special issue, edited by Berardi & van Bakel, of six papers connected with the Classical Logic and Computation workshop, 2006; I had seen the CfP but quite forgotten about the possibility of proceedings. Many of the talks delivered there, or contributed later to the APAL CL&C special issue, look interesting, including Noam's contribution. Besides Lengrand & Miquel's contribution (originally titled A classical version of system F-omega), the talks were:

  • Yevgeniy Makarov. Simplifying Programs Extracted from Classical Proofs;
  • Kentaro Kikuchi. Call-by-Name Reduction and Cut-Elimination in Classical Logic;
  • Berardi & Yamagata. A sequent calculus for Limit Computable Mathematics;
  • Lutz Strassburger. What is a Boolean Category? (Strassburger did not have a paper in the special issue; cf. his What could a Boolean category be?).

Additionally, there were two papers in the special issue not delivered at CL&C'06:

  • Noam Zeilberger. On the unity of duality.
  • Monika Seisenberger. Programs from proofs using classical dependent choice.

I haven't found unencumbered links to many of these papers. There was a Classical Logic and Computation workshop, 2008 as well.

APAL CL&C sequel

Ulrich Berger has announced a follow-up special issue of APAL: Second special issue on classical logic & computation, CL&C 2008. The CfP explicitly invites papers addressing programming language design, so there may be some LtUers out there for whom this call is interesting.

I have a 10-year-old idea that still seems interesting; this might be the time to dust it off...