The Simple Essence of Algebraic Subtyping: Principal Type Inference with Subtyping Made Easy

The Simple Essence of Algebraic Subtyping: Principal Type Inference with Subtyping Made Easy, Lionel Parreaux, ICFP 2020.

There's also an introductory blog post and an online demo. Stephen Dolan's Algebraic Subtyping (discussion) unexpectedly provided a solution to the problem of combining type inference and subtyping, but used somewhat heavy and unusual machinery. Now Lionel Parreaux shows that the system can be implemented in a very straightforward and pleasing way; a sketch of the central idea follows below. Here's to hoping that it makes it into real languages!
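The core of Parreaux's "Simple-sub" algorithm is surprisingly small: type variables accumulate lower and upper bounds, and each new constraint is propagated against the bounds already recorded. Here is a minimal sketch of that step in Haskell rather than the paper's Scala; the names and representation are mine, and the real algorithm also caches already-seen constraints to guarantee termination:

```haskell
import qualified Data.Map.Strict as Map
import Control.Monad.State (State, gets, modify)

data Ty
  = Top | Bot
  | Fun Ty Ty
  | Prim String
  | TVar Int
  deriving (Eq, Show)

-- Each type variable carries the lower and upper bounds seen so far.
type Bounds = Map.Map Int ([Ty], [Ty])

-- Enforce lhs <: rhs by decomposing the constraint and recording
-- bounds on type variables as we go.
constrain :: Ty -> Ty -> State Bounds ()
constrain lhs rhs = case (lhs, rhs) of
  (_, Top) -> pure ()
  (Bot, _) -> pure ()
  (Prim a, Prim b) | a == b -> pure ()
  (Fun a1 r1, Fun a2 r2) -> do
    constrain a2 a1                -- argument side is contravariant
    constrain r1 r2                -- result side is covariant
  (TVar v, t) -> do
    (los, ups) <- gets (Map.findWithDefault ([], []) v)
    modify (Map.insert v (los, t : ups))
    mapM_ (`constrain` t) los      -- new upper bound meets old lower bounds
  (t, TVar v) -> do
    (los, ups) <- gets (Map.findWithDefault ([], []) v)
    modify (Map.insert v (t : los, ups))
    mapM_ (t `constrain`) ups      -- new lower bound meets old upper bounds
  _ -> error "type error: no subtyping relation"
```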
The Little Typer

A new introductory book about dependent types, involving some familiar names: by Daniel P. Friedman and David Thrane Christiansen. Foreword by Robert Harper. Afterword by Conor McBride.
Safe Dynamic Memory Management in Ada and SPARK

Safe Dynamic Memory Management in Ada and SPARK by Maroua Maalej, Tucker Taft, Yannick Moy:
For the systems programmers among you: there are some new developments in Ada, where the authors propose to add ownership types to Ada's pointer/access types, improving the flexibility of the programs that can be written while keeping their safety automatically verifiable. Automated verification of these safety properties is a key goal of the SPARK Ada subset. The discipline is in the same linear/affine family that Rust draws on; a toy illustration follows below.

By naasking at 2018-07-26 19:42 | Implementation | Type Theory
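To get a feel for the "single owner, consumed on move" discipline in a setting many readers here know, here is a loose analogue using GHC's LinearTypes extension. This is my analogy for illustration, not anything from the paper:

```haskell
{-# LANGUAGE LinearTypes #-}

-- An "owned" resource: a linear function (the %1 arrow) must consume
-- its argument exactly once, so a handle can be neither dropped nor
-- duplicated.
data Handle = Handle

-- Consuming the handle ends its lifetime.
close :: Handle %1 -> ()
close Handle = ()

-- Moving the handle transfers ownership to the caller.
move :: Handle %1 -> Handle
move h = h

ok :: ()
ok = close (move Handle)   -- fine: the handle is used exactly once

-- Rejected by the type checker, much like use-after-move in the
-- Ada/SPARK proposal or in Rust:
-- bad :: Handle %1 -> (Handle, Handle)
-- bad h = (h, h)
```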
The Gentle Art of Levitation
2010, by James Chapman, Pierre-Evariste Dagand, Conor McBride, and Peter Morris.

We present a closed dependent type theory whose inductive types are given not by a scheme for generative declarations, but by encoding in a universe. Each inductive datatype arises by interpreting its description—a first-class value in a datatype of descriptions. Moreover, the latter itself has a description. Datatype-generic programming thus becomes ordinary programming. We show some of the resulting generic operations and deploy them in particular, useful ways on the datatype of datatype descriptions itself. Surprisingly, this apparently self-supporting setup is achievable without paradox or infinite regress.

It's datatype descriptions all the way down. A non-dependent shadow of the construction is sketched below.

By Andris Birkmanis at 2018-05-11 19:26 | Semantics | Type Theory
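A dependently typed language is needed for the full story (in particular for descriptions to describe themselves), but the first rung of the ladder can be sketched in Haskell with type-level codes. This is my simplification, not the paper's actual universe:

```haskell
{-# LANGUAGE DataKinds, TypeFamilies, KindSignatures #-}
import Data.Kind (Type)

-- Codes for a small fragment of strictly positive functors.
data Desc = One | Id | Sum Desc Desc | Prod Desc Desc

-- Interpret a code as a functor on types.
type family Interp (d :: Desc) (x :: Type) :: Type where
  Interp 'One        x = ()
  Interp 'Id         x = x
  Interp ('Sum a b)  x = Either (Interp a x) (Interp b x)
  Interp ('Prod a b) x = (Interp a x, Interp b x)

-- The datatype denoted by a description is the least fixed point
-- of its interpretation.
newtype Mu (d :: Desc) = In (Interp d (Mu d))

-- Natural numbers arise from the code 1 + X.
type NatD = 'Sum 'One 'Id

zero :: Mu NatD
zero = In (Left ())

suc :: Mu NatD -> Mu NatD
suc n = In (Right n)
```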
Comprehending Ringads
2016, by Jeremy Gibbons.

Ringad comprehensions represent a convenient notation for expressing database queries. The ringad structure alone does not provide a good explanation or an efficient implementation of relational joins; but by allowing heterogeneous comprehensions, involving both bag and indexed table ringads, we show how to accommodate these too.

Indexed/parametric/graded monads are the key (read the paper to understand the pun).

By Andris Birkmanis at 2018-05-05 02:59 | Category Theory | Semantics | Type Theory
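A ringad is, roughly, a monad for collections that also has an empty collection and a union, which is exactly what comprehension notation needs. Haskell's MonadComprehensions extension gives a taste of queries-as-comprehensions; the tables below are made up for illustration, with bags approximated by lists:

```haskell
{-# LANGUAGE MonadComprehensions #-}

people :: [(String, Int)]
people = [("alice", 1), ("bob", 2), ("carol", 1)]

depts :: [(Int, String)]
depts = [(1, "engineering"), (2, "operations")]

-- An equijoin written as a comprehension. With MonadComprehensions the
-- same notation works over any suitable monad, not just lists; the
-- paper's point is that an efficient join needs more structure than
-- the bag ringad alone, namely indexed tables.
joined :: [(String, String)]
joined = [ (name, dept) | (name, d) <- people, (d', dept) <- depts, d == d' ]
-- joined == [("alice","engineering"),("bob","operations"),("carol","engineering")]
```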
Sequent Calculus as a Compiler Intermediate Language
By Andris Birkmanis at 2018-04-02 17:06 | Functional | General | Lambda Calculus | Semantics | Type Theory
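For readers who have not met the idea: a sequent-calculus intermediate language splits programs into producers (terms), consumers (continuations), and the cuts that connect them. Here is a rough sketch of such a core's shape, in the style of the lambda-mu-mu-tilde calculus; this is my sketch, not the paper's actual intermediate language:

```haskell
type Name = String

-- Producers: ordinary values, plus Mu, which captures its consumer.
data Term    = Var Name | Lit Int | Lam Name Term | Mu Name Command

-- Consumers: covariables, call stacks, plus MuTilde, which captures
-- the value it is given.
data Kont    = Covar Name | App Term Kont | MuTilde Name Command

-- A command cuts a producer against a consumer; the evaluation
-- strategy is fixed by deciding which side acts first.
data Command = Cut Term Kont
```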
The Syntax and Semantics of Quantitative Type Theory

The Syntax and Semantics of Quantitative Type Theory by Robert Atkey:
Resource-aware programming is a hot topic these days, with Rust exploiting affine and ownership types to scope and track resource usage, and with Ethereum requiring programs to spend "gas" to execute. Combining linear and dependent types has proven difficult, though, so making it easier to track and reason about resource usage in dependent type theories would be a huge benefit to making verification practical in domains where resources are limited.

By naasking at 2017-07-25 17:28 | Semantics | Theory | Type Theory
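For a taste of the system: QTT annotates every hypothesis in the context with an element of a resource semiring, and the conclusion with a usage marker that is either 0 (erased, type-level use only) or 1 (runtime use). Roughly, judgments take the following shape; this is my transcription, so see the paper for the actual rules:

```latex
x_1 \overset{\rho_1}{:} A_1, \ldots, x_n \overset{\rho_n}{:} A_n
  \vdash M \overset{\sigma}{:} B
\qquad \rho_i \in R \text{ (a semiring)}, \quad \sigma \in \{0, 1\}
```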
RustBelt: Securing the Foundations of the Rust Programming Language

RustBelt: Securing the Foundations of the Rust Programming Language by Ralf Jung, Jacques-Henri Jourdan, Robbert Krebbers, Derek Dreyer:
Rust is definitely pushing the envelope in a new direction, but there's always a little wariness around using libraries that rely on unsafe features, since "safety with performance" is the main reason people want to use Rust. So this is a great step in the right direction!

By naasking at 2017-07-10 15:14 | Functional | Object-Functional | Type Theory
Type Systems as Macros

Type Systems as Macros, by Stephen Chang, Alex Knauth, Ben Greenman:
This looks pretty awesome, considering it's not limited to simply typed languages but extends all the way to System F and F-omega! Even better, they can reuse previous type systems to define new ones, thereby reducing the effort needed to implement more expressive type systems. All code and further details are available here, and here's a blog post where Ben Greenman further discusses the related "type tailoring"; of course, both are directly related to Active Libraries. Taken to its extreme: why not have an assembler with a powerful macro system of this sort as your host language, with every high-level language built on top of it? I'm not sure this approach would extend that far, but it's an interesting idea. You'd need a cpp-like, highly portable macro tool, and porting to a new platform would consist of writing architecture-specific macros for some core language, like System F. This work may also conceptually dovetail with another thread discussing fexprs and compilation.

By naasking at 2017-04-19 23:38 | DSL | Functional | Lambda Calculus | Meta-Programming | Type Theory
Contextual Isomorphisms
This paper is based on a very simple idea that everyone familiar with type systems can enjoy. It then applies it in a technical setting in which it brings a useful contribution. I suspect that many readers will find that second part too technical, but they may still enjoy the idea itself. To facilitate this, I will rephrase the abstract above in a way that I hope makes it accessible to a larger audience.

The problem that the paper solves is: how do we know what it means for two types to be equivalent? For many languages there are reasonable definitions of equivalence (such as: there exists a bijection between these two types in the language), but for more advanced languages these definitions break down. For example, in the presence of hidden mutable state, one can build a pair of functions between the one-element type and the two-element type of booleans that compose to the identity in both directions (the functions cheat by communicating through a hidden reference cell), so the bijection-based definition wrongly counts these types as equivalent.

To define what it means for two program fragments to be equivalent, we have a "gold standard", which is contextual equivalence: two program fragments are contextually equivalent if we can substitute one for the other in any complete program without changing its behavior. This is simple to state, it is usually clear how to instantiate this definition for a new system, and it gives you a satisfying notion of equivalence. It may not be the most convenient one to work with, so people define other, more specific notions of equivalence (typically beta-eta-equivalence or logical relations); it is fine if these are more sophisticated, and their definitions harder to justify or understand, because they can always be compared to the simple definition to gain confidence.

The simple idea in the paper above is to use this exact same trick to define what it means for two types to be equivalent. Naively, one could say that two types are equivalent if, in any well-typed program, one can replace some occurrences of the first type by occurrences of the second type, all other things being unchanged. This does not quite work, as changing the types that appear in a program without changing its terms would create ill-typed terms. So instead, the paper proposes that two types are equivalent when we are told how to transform any program using the first type into a program using the second type, in a way that is bijective (invertible) and compositional; see the paper for details.

Then, the author can validate this definition by showing that, when instantiated to languages (simple or complex) where existing notions of equivalence have been proposed, this new notion of equivalence corresponds to the previous notions.

(Readers may find that even the warm-up part of the paper, namely sections 1 to 4, pages 1 to 6, is rather dense, with a compactly exposed general idea and arguably a lack of concrete examples that would help understanding. Surely this terseness is in large part a consequence of strict page limits; conference articles are the tweets of computer science research. A nice side effect (no pun intended) is that you can observe a carefully chosen formal language at work, designed to expose the setting and perform the relevant proofs in minimal space: category theory, and in particular the concept of naturality, is the killer space-saving measure here.)
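To make the "gold standard" concrete, contextual equivalence of terms is typically stated along these lines (one common formulation, writing C for a program context and the downward arrow for observable termination):

```latex
M \simeq_{\mathrm{ctx}} N
  \quad\iff\quad
  \forall C.\ C[M]\Downarrow \;\Longleftrightarrow\; C[N]\Downarrow
```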