Lambda the Ultimate - Semantics
http://lambda-the-ultimate.org/taxonomy/term/29/0
Implementing Algebraic Effects in C
http://lambda-the-ultimate.org/node/5457
<p ><a href="https://www.microsoft.com/en-us/research/publication/implementing-algebraic-effects-c/">Implementing Algebraic Effects in C</a> by Daan Leijen:</p>
<blockquote ><p >We describe a full implementation of algebraic effects and handlers as a library in standard and portable C99, where effect operations can be used just like regular C functions. We use a formal operational semantics to guide the C implementation at every step where an evaluation context corresponds directly to a particular C execution context. Finally we show a novel extension to the formal semantics to describe optimized tail resumptions and prove that the extension is sound. This gives two orders of magnitude improvement to the performance of tail resumptive operations (up to about 150 million operations per second on a Core i7@2.6GHz).</p></blockquote>
<p >Another great paper by Daan Leijen, this time on a C library with immediate practical applications at Microsoft. The applicability is much wider though, since it's an ordinary C library for defining and using arbitrary algebraic effects. It looks pretty usable and is faster and more general than most of the C coroutine libraries that already exist.</p>
<p >It's a nice addition to your toolbox for creating language runtimes in C, particularly since it provides a unified, structured way of creating and handling a variety of sophisticated language behaviours, like async/await, in ordinary C with good performance. There has been considerable discussion here of C and low-level languages with green threads, coroutines and so on, so hopefully others will find this useful!</p>EffectsImplementationLambda CalculusSemanticsThu, 27 Jul 2017 13:50:17 +0000The Syntax and Semantics of Quantitative Type Theory
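For readers who want to play with the idea without the C machinery, the tail-resumptive case the paper optimizes (a handler that services an operation and resumes exactly once, immediately) can be sketched with Python generators. The `handle` loop and the 'get'/'put' operation names below are an illustrative toy, not the paper's C API:

```python
# A sketch of tail-resumptive effect handling using Python generators.
# The computation yields (operation, argument) requests; the handler
# services each request and resumes the generator exactly once with the
# result, which is exactly the "tail resumptive" case the paper optimizes.

def handle(computation, handlers):
    """Run a generator-based computation under a dict of operation handlers."""
    try:
        request = computation.send(None)                   # start the computation
        while True:
            op, arg = request
            request = computation.send(handlers[op](arg))  # resume with the result
    except StopIteration as done:
        return done.value

def program():
    """An effectful computation using abstract 'get'/'put' state operations."""
    x = yield ('get', None)
    yield ('put', x + 1)
    y = yield ('get', None)
    return y

state = {'val': 41}
result = handle(program(), {
    'get': lambda _: state['val'],
    'put': lambda v: state.__setitem__('val', v),
})
print(result)  # 42
```

Generators give only one-shot resumptions; the paper's library supports general resumptions (and shows how to make the one-shot case fast).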
http://lambda-the-ultimate.org/node/5453
<p ><a href="http://bentnib.org/quantitative-type-theory.html">The Syntax and Semantics of Quantitative Type Theory</a> by Robert Atkey:</p>
<blockquote ><p >Type Theory offers a tantalising promise: that we can program and reason within a single unified system. However, this promise slips away when we try to produce efficient programs. Type Theory offers little control over the intensional aspect of programs: how are computational resources used, and when can they be reused. Tracking resource usage via types has a long history, starting with Girard's Linear Logic and culminating with recent work in contextual effects, coeffects, and quantitative type theories. However, there is conflict with full dependent Type Theory when accounting for the difference between usages in types and terms. Recently, McBride has proposed a system that resolves this conflict by treating usage in types as a zero usage, so that it doesn't affect the usage in terms. This leads to a simple expressive system, which we have named Quantitative Type Theory (QTT).</p>
<p >McBride presented a syntax and typing rules for the system, as well as an erasure property that exploits the difference between “not used” and “used”, but does not do anything with the finer usage information. In this paper, we present a semantic interpretation of a variant of McBride's system, where we fully exploit the usage information. We interpret terms simultaneously as having extensional (compile-time) content and intensional (runtime) content. In our example models, extensional content is set-theoretic functions, representing the compile-time or type-level content of a type-theoretic construction. Intensional content is given by realisers for the extensional content. We use Abramsky et al.'s Linear Combinatory Algebras as realisers, yielding a large range of potential models from Geometry of Interaction, graph models, and syntactic models. Read constructively, our models provide a resource sensitive compilation method for QTT.</p>
<p >To rigorously define the structure required for models of QTT, we introduce the concept of a Quantitative Category with Families, a generalisation of the standard Category with Families class of models of Type Theory, and show that this class of models soundly interprets Quantitative Type Theory.</p></blockquote>
<p >Resource-aware programming is a hot topic these days, with Rust exploiting affine and ownership types to scope and track resource usage, and with <a href="http://lambda-the-ultimate.org/node/5003">Ethereum</a> requiring programs to spend "gas" to execute. Combining linear and dependent types has proven difficult though, so making it easier to track and reason about resource usage in dependent type theories would then be a huge benefit to making verification more practical in domains where resources are limited.</p>SemanticsTheoryType TheoryTue, 25 Jul 2017 17:28:17 +0000Philip Wadler: Category Theory for the Working Hacker
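To get a feel for what usage annotations buy you, here is a toy checker, with made-up term syntax, that counts syntactic occurrences of each bound variable against a declared multiplicity of 0, 1, or omega. Real QTT threads usages through full dependent typing judgements; this sketch only illustrates the bookkeeping:

```python
# A toy usage checker for lambda terms with multiplicity annotations,
# loosely in the spirit of quantitative typing: each binder declares how
# often its variable may be used: 0 (erased), 1 (linear), or 'w' (omega,
# unrestricted). Terms: ('var', x), ('app', f, a), ('lam', x, mult, body).

def uses(term, name):
    """Count free occurrences of `name` in a term."""
    tag = term[0]
    if tag == 'var':
        return 1 if term[1] == name else 0
    if tag == 'app':
        return uses(term[1], name) + uses(term[2], name)
    if tag == 'lam':
        _, x, _, body = term
        return 0 if x == name else uses(body, name)   # inner binder shadows
    raise ValueError(tag)

def check(term):
    """True if every binder's annotation matches its variable's usage."""
    tag = term[0]
    if tag == 'var':
        return True
    if tag == 'app':
        return check(term[1]) and check(term[2])
    if tag == 'lam':
        _, x, mult, body = term
        ok = True if mult == 'w' else uses(body, x) == mult
        return ok and check(body)
    raise ValueError(tag)

identity = ('lam', 'x', 1, ('var', 'x'))
dup      = ('lam', 'x', 1, ('app', ('var', 'x'), ('var', 'x')))  # uses x twice
erased   = ('lam', 'x', 0, ('lam', 'y', 1, ('var', 'y')))
print(check(identity), check(dup), check(erased))  # True False True
```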
http://lambda-the-ultimate.org/node/5366
<p ><a href="https://www.infoq.com/presentations/category-theory-propositions-principle">Nothing you don't already know</a>, if you are into this sort of thing (and many if not most LtU-ers are), but a quick way to get the basic idea if you are not. Wadler has papers that explain Curry-Howard better, and the category theory content here is very basic -- but it's an easy listen that will give you the fundamental points if you still wonder what this category thing is all about.</p>
<p >To make this a bit more fun for those already in the know: what is totally missing from the talk (understandable given time constraints) is why this should interest the "working hacker". So how about pointing out a few cool uses/ideas that discerning hackers will appreciate? Go for it!</p>Category TheoryLambda CalculusSemanticsSun, 07 Aug 2016 17:26:26 +0000Fully Abstract Compilation via Universal Embedding
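To start the ball rolling: one concrete crossover point from the talk's territory is currying, which is the product/exponential adjunction and, under Curry-Howard, the logical equivalence between (A ∧ B → C) and (A → B → C). A minimal demonstration:

```python
# Currying as the product/exponential adjunction: functions A×B → C
# correspond one-to-one to functions A → (B → C). Under Curry-Howard the
# same isomorphism reads as the equivalence (A ∧ B → C) ↔ (A → B → C).

def curry(f):
    """Hom(A×B, C) → Hom(A, C^B)"""
    return lambda a: lambda b: f((a, b))

def uncurry(g):
    """Hom(A, C^B) → Hom(A×B, C)"""
    return lambda ab: g(ab[0])(ab[1])

add_pair = lambda ab: ab[0] + ab[1]
print(curry(add_pair)(2)(3))             # 5
print(uncurry(curry(add_pair))((2, 3)))  # 5
```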
http://lambda-the-ultimate.org/node/5364
<p ><a href="https://www.williamjbowman.com/resources/fabcc-paper.pdf">Fully Abstract Compilation via Universal Embedding</a> by Max S. New, William J. Bowman, and Amal Ahmed:</p>
<blockquote ><p >A <em >fully abstract</em> compiler guarantees that two source components are observationally equivalent in the source language if and only if their translations are observationally equivalent in the target. Full abstraction implies the translation is secure: target-language attackers can make no more observations of a compiled component than a source-language attacker interacting with the original source component. Proving full abstraction for realistic compilers is challenging because realistic target languages contain features (such as control effects) unavailable in the source, while proofs of full abstraction require showing that every target context to which a compiled component may be linked can be back-translated to a behaviorally equivalent source context.</p>
<p >We prove the first full abstraction result for a translation whose target language contains exceptions, but the source does not. Our translation—specifically, closure conversion of simply typed λ-calculus with recursive types—uses types at the target level to ensure that a compiled component is never linked with attackers that have more distinguishing power than source-level attackers. We present a new back-translation technique based on a deep embedding of the target language into the source language at a dynamic type. Then boundaries are inserted that mediate terms between the untyped embedding and the strongly-typed source. This technique allows back-translating non-terminating programs, target features that are untypeable in the source, and well-bracketed effects.</p></blockquote>
<p >Potentially a promising step forward to secure multilanguage runtimes. We've previously discussed security vulnerabilities caused by full abstraction failures <a href="http://lambda-the-ultimate.org/node/3830">here</a> and <a href="http://lambda-the-ultimate.org/node/1588">here</a>. The paper also provides a comprehensive review of associated literature, like various means of protection, back translations, embeddings, etc.</p>Lambda CalculusSemanticsTheoryType TheoryWed, 27 Jul 2016 15:57:02 +0000Simon Peyton Jones elected into the Royal Society Fellowship
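To see the problem full abstraction solves, consider a toy version of the exceptions scenario. The two components below are observationally equivalent against pure, total callbacks, since addition commutes, but a context that can raise an exception observes evaluation order and tells them apart. This is only an illustration of the failure mode, not the paper's construction:

```python
# Why target-level control effects threaten full abstraction: these two
# components are equivalent when linked only against pure, total callbacks,
# but a context that can raise observes the order of calls.

def component_a(g):
    return g(1) + g(2)

def component_b(g):
    return g(2) + g(1)

class Probe(Exception):
    pass

def first_call_seen(component):
    """An 'attacker' context: abort on the first callback invocation."""
    seen = []
    def g(x):
        seen.append(x)
        raise Probe()
    try:
        component(g)
    except Probe:
        pass
    return seen[0]

print(first_call_seen(component_a), first_call_seen(component_b))  # 1 2
```

A fully abstract translation must rule such contexts out, which is what the paper's type-directed back-translation achieves.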
http://lambda-the-ultimate.org/node/5332
<p ><A href='http://research.microsoft.com/en-us/people/simonpj/'>Simon Peyton Jones</A> has been <A href="https://royalsociety.org/fellows/fellows-directory/#?earliestelectedyear=2016">elected</A> as a <A href='https://royalsociety.org/people/simon-peyton-jones-12889/'>Fellow of the Royal Society</A>. The Royal Society biography reads:<BLOCKQUOTE ><br >
Simon's main research interest is in functional programming languages, their implementation, and their application. He was a key contributor to the design of the now-standard functional language Haskell, and is the lead designer of the widely-used Glasgow Haskell Compiler (GHC). He has written two textbooks about the implementation of functional languages.</p>
<p >More generally, Simon is interested in language design, rich type systems, compiler technology, code generation, runtime systems, virtual machines, and garbage collection. He is particularly motivated by direct use of principled theory to practical language design and implementation -- that is one reason he loves functional programming so much.</p>
<p >Simon is also chair of Computing at School, the grass-roots organisation that was at the epicentre of the 2014 reform of the English computing curriculum.<br >
</BLOCKQUOTE></p>
<p >Congratulations SPJ!</p>FunctionalGeneralImplementationParadigmsSemanticsSoftware EngineeringSpotlightTeaching & LearningTheorySat, 30 Apr 2016 19:44:24 +0000Running Probabilistic Programs Backwards
http://lambda-the-ultimate.org/node/5215
<p >I saw this work presented at ESOP 2015 by <a href="http://www.cs.umd.edu/~ntoronto/">Neil Toronto</a>, and the talk was excellent (<a href="http://www.cs.umd.edu/~ntoronto/papers/toronto-2015esop-slides.pdf">slides</a>).</p>
<p ><a href="http://arxiv.org/abs/1412.4053">Running Probabilistic Programs Backwards</a><br >
Neil Toronto, Jay McCarthy, David Van Horn<br >
2015</p>
<blockquote >
<p >Many probabilistic programming languages allow programs to be run under constraints in order to carry out Bayesian inference. Running programs under constraints could enable other uses such as rare event simulation and probabilistic verification---except that all such probabilistic languages are necessarily limited because they are defined or implemented in terms of an impoverished theory of probability. Measure-theoretic probability provides a more general foundation, but its generality makes finding computational content difficult.</p>
<p >We develop a measure-theoretic semantics for a first-order probabilistic language with recursion, which interprets programs as functions that compute preimages. Preimage functions are generally uncomputable, so we derive an abstract semantics. We implement the abstract semantics and use the implementation to carry out Bayesian inference, stochastic ray tracing (a rare event simulation), and probabilistic verification of floating-point error bounds.</p>
</blockquote>
<p >(also on <a href="https://scirate.com/arxiv/1412.4053">SciRate</a>)</p>
<p >The introduction sells the practical side of the work a bit better than the abstract.</p>
<blockquote >
<p >Stochastic ray tracing [30] is one such rare-event simulation task. As illustrated in Fig. 1, to carry out stochastic ray tracing, a probabilistic program simulates a light source emitting a single photon in a random direction, which is reflected or absorbed when it hits a wall. The program outputs the photon’s path, which is constrained to pass through an aperture. Millions of paths that meet the constraint are sampled, then projected onto a simulated sensor array.</p>
<p >The program’s main loop is a recursive function with two arguments: <code >path</code>,
the photon’s path so far as a list of points, and <code >dir</code>, the photon’s current direction.</p>
<code ><pre >
simulate-photon path dir :=
case (find-hit (fst path) dir) of
absorb pt −→ (pt, path)
reflect pt norm −→ simulate-photon (pt, path) (random-half-dir norm)
</pre></code>
<p >Running <code >simulate-photon (pt, ()) dir</code>, where <code >pt</code> is the light source’s location and
<code >dir</code> is a random emission direction, generates a photon path. The <code >fst</code> of the path
(the last collision point) is constrained to be in the aperture. The remainder of
the program is simple vector math that computes ray-plane intersections.</p>
<p >In contrast, hand-coded stochastic ray tracers, written in general-purpose languages, are much more complex and divorced from the physical processes they simulate, because they must interleave the advanced Monte Carlo algorithms that ensure the aperture constraint is met.</p>
<p >Unfortunately, while many probabilistic programming languages support random real numbers, none are capable of running a probabilistic program like <code >simulate-photon</code> under constraints to carry out stochastic ray tracing. The reason is not lack of engineering or weak algorithms, but is theoretical at its core: they are all either defined or implemented using [density functions]. [...] Programs whose outputs are deterministic functions of random values and programs with recursion generally cannot denote density functions. The program <code >simulate-photon</code> exhibits both characteristics.</p>
<p >Measure-theoretic probability is a more powerful alternative to this naive probability theory based on probability mass and density functions. It not only subsumes naive probability theory, but is capable of defining any computable probability distribution, and many uncomputable distributions. But while even the earliest work [15] on probabilistic languages is measure-theoretic, the theory’s generality has historically made finding useful computational content difficult. We show that measure-theoretic probability can be made computational by</p>
<ol >
<li >Using measure-theoretic probability to define a compositional, denotational semantics that gives a valid denotation to every program.</li>
<li >Deriving an abstract semantics, which allows computing answers to questions about probabilistic programs to arbitrary accuracy.</li>
<li >Implementing the abstract semantics and efficiently solving problems.</li></ol>
<p >In fact, our primary implementation, Dr. Bayes, produced <a href="http://s13.postimg.org/7fki8zeg3/figure1.jpg?noCache=1438446927">Fig. 1b</a> by running a probabilistic program like <code >simulate-photon</code> under an aperture constraint.</p>
</blockquote>SemanticsSat, 01 Aug 2015 16:49:24 +0000Cakes, Custard, and Category Theory
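To contrast with the paper's preimage-based approach, the naive density-free baseline, rejection sampling, is easy to sketch. The toy below (a 2D photon, a wall at y = 1, an aperture interval, all invented for illustration) simply guesses directions and discards paths that miss:

```python
# A naive take on "running a program under a constraint": rejection
# sampling. A toy 2D photon is emitted from the origin in a uniformly
# random upward direction and hits the wall y = 1; we keep only samples
# whose hit point lies in an aperture. This is the guess-and-discard
# baseline that preimage computation avoids.
import math
import random

def photon_hit(rng):
    """Emit a photon from (0, 0); return the x-coordinate where it hits y = 1."""
    theta = rng.uniform(0.2, math.pi - 0.2)   # upward direction, not grazing
    return math.cos(theta) / math.sin(theta)  # ray (cos θ, sin θ)·t meets y = 1

def sample_through_aperture(n, lo=-0.1, hi=0.1, seed=0):
    rng = random.Random(seed)
    hits = []
    while len(hits) < n:
        x = photon_hit(rng)
        if lo <= x <= hi:                     # the aperture constraint
            hits.append(x)
    return hits

hits = sample_through_aperture(100)
print(min(hits) >= -0.1 and max(hits) <= 0.1)  # True
```

Rejection works here because the aperture has decent probability mass; for genuinely rare events (or recursive programs like `simulate-photon`) it degrades badly, which is the paper's motivation.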
http://lambda-the-ultimate.org/node/5200
<p ><A href="http://www.eugeniacheng.com/">Eugenia Cheng</A>'s new popular science book is out, in the U.K. under the title <A href="http://www.amazon.co.uk/Cakes-Custard-Category Theory-understanding/dp/1781252874/ref=sr_1_1?ie=UTF8&qid=1422244697&sr=8-1&keywords=eugenia+cheng">Cakes, Custard and Category Theory: Easy recipes for understanding complex maths</A>, and in the U.S. under the title <A href="http://www.amazon.com/How-Bake-Pi-Exploration-Mathematics/dp/0465051715/ref=sr_1_1?ie=UTF8&qid=1419352340&sr=8-1keywords=eugenia+cheng">How to Bake Pi: An Edible Exploration of the Mathematics of Mathematics</A>:</p>
<blockquote ><p >
Most people imagine maths is something like a slow cooker: very useful, but pretty limited in what it can do. Maths, though, isn't just a tool for solving a specific problem - and it's definitely not something to be afraid of. Whether you're a maths glutton or have forgotten how long division works (or never really knew in the first place), the chances are you've missed what really makes maths exciting. Calling on a baker's dozen of entertaining, puzzling examples and mathematically illuminating culinary analogies - including chocolate brownies, iterated Battenberg cakes, sandwich sandwiches, Yorkshire puddings and Möbius bagels - brilliant young academic and mathematical crusader Eugenia Cheng is here to tell us why we should all love maths.</p>
<p >From simple numeracy to category theory ('the mathematics of mathematics'), Cheng takes us through the joys of the mathematical world. Packed with recipes, puzzles to surprise and delight even the innumerate, Cake, Custard & Category Theory will whet the appetite of maths whizzes and arithmophobes alike. (Not to mention aspiring cooks: did you know you can use that slow cooker to make clotted cream?) This is maths at its absolute tastiest.<br >
</BLOCKQUOTE></p>
<p >Cheng, one of <A href="https://www.youtube.com/user/TheCatsters">the Catsters</A>, gives a guided tour through mathematical thinking and research activities, and through the core philosophy underlying category theory. This is the kind of book you can give to your grandma and grandpa so they can boast to their friends about what their grandchildren are doing (and bake you a nice dessert when you come to visit :) ). Pleasant weekend reading.</p>Category TheoryCritiquesFunGeneralSemanticsTheoryFri, 17 Jul 2015 16:47:21 +0000Conservation laws for free!
http://lambda-the-ultimate.org/node/5078
<p >In this year's <A href="http://popl.mpi-sws.org/2014/">POPL</A>, <A href="http://bentnib.org">Bob Atkey</A> made a splash by showing how to get <A href="http://bentnib.org/conservation-laws.pdf">from parametricity to conservation laws, via Noether's theorem</A>:</p>
<blockquote ><p >
Invariance is of paramount importance in programming languages and in physics. In programming languages, John Reynolds’ theory of relational parametricity demonstrates that parametric polymorphic programs are invariant under change of data representation, a property that yields “free” theorems about programs just from their types. In physics, Emmy Noether showed that if the action of a physical system is invariant under change of coordinates, then the physical system has a conserved quantity: a quantity that remains constant for all time. Knowledge of conserved quantities can reveal deep properties of physical systems. For example, the conservation of energy, which by Noether’s theorem is a consequence of a system’s invariance under time-shifting.</p>
<p > In this paper, we link Reynolds’ relational parametricity with Noether’s theorem for deriving conserved quantities. We propose an extension of System Fω with new kinds, types and term constants for writing programs that describe classical mechanical systems in terms of their Lagrangians. We show, by constructing a relationally parametric model of our extension of Fω, that relational parametricity is enough to satisfy the hypotheses of Noether’s theorem, and so to derive conserved quantities for free, directly from the polymorphic types of Lagrangians expressed in our system.
</p></blockquote>Category TheoryFunFunctionalLambda CalculusScientific ProgrammingSemanticsTheoryType TheoryTue, 28 Oct 2014 07:52:46 +0000Seemingly impossible programs
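The "free theorems from types" half of the story can be demonstrated even without a typed language. Any parametric function of type [a] → [a] must commute with map; Python cannot enforce parametricity, but we can watch the theorem hold for a candidate:

```python
# A "free theorem" in miniature: any parametric [a] -> [a] function
# (one that never inspects its elements) must satisfy
#   map f . poly == poly . map f
# for every f. We check the property at sample inputs.

def rev(xs):                 # a parametric [a] -> [a] candidate
    return xs[::-1]

def free_theorem_holds(poly, f, xs):
    """map f . poly == poly . map f, instantiated at one sample input."""
    return [f(x) for x in poly(xs)] == poly([f(x) for x in xs])

print(free_theorem_holds(rev, str, [1, 2, 3]))       # True
print(free_theorem_holds(rev, lambda n: n * n, []))  # True
```

Atkey's paper plays the same game one level up: parametricity of Lagrangians, rather than list functions, yields Noether-style conserved quantities.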
http://lambda-the-ultimate.org/node/5074
<p >In case this one went under the radar, at <A href="http://www.cse.psu.edu/popl/12/">POPL'12</A>, <A href="http://www.cs.bham.ac.uk/~mhe/">Martín Escardó</A> gave a tutorial on <A href="http://www.cs.bham.ac.uk/~mhe/.talks/popl2012/escardo-popl2012.pdf">seemingly impossible functional programs</A>:</p>
<blockquote ><p >
Programming language semantics is typically applied to<br >
prove compiler correctness and allow (manual or automatic) program<br >
verification. Certain kinds of semantics can also be applied to<br >
discover programs that one wouldn't have otherwise thought of. This is<br >
the case, in particular, for semantics that incorporate topological<br >
ingredients (limits, continuity, openness, compactness). For example,<br >
it turns out that some function types (X -> Y) with X infinite (but<br >
compact) do have decidable equality, contradicting perhaps popular<br >
belief, but certainly not (higher-type) computability theory. More<br >
generally, one can often check infinitely many cases in finite time.</p>
<p >I will show you such programs, run them fast in surprising instances,<br >
and introduce the theory behind their derivation and working. In<br >
particular, I will study a single (very high type) program that (i)<br >
optimally plays sequential games of unbounded length, (ii) implements<br >
the Tychonoff Theorem from topology (and builds finite-time search<br >
functions for infinite sets), (iii) realizes the double-negation shift<br >
from proof theory (and allows us to extract programs from classical<br >
proofs that use the axiom of countable choice). There will be several<br >
examples in the languages Haskell and Agda.
</p></blockquote>
<p >A <A href="http://math.andrej.com/2007/09/28/seemingly-impossible-functional-programs/">shorter version</A> (coded in Haskell) appears in Andrej Bauer's blog.</p>Category TheoryFunFunctionalParadigmsSemanticsTheoryWed, 22 Oct 2014 09:57:47 +0000sml-family.org
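The flavour of the result is easy to reproduce. Below is a Python transliteration (with explicit thunks standing in for Haskell's laziness) of Escardó's search operator over the Cantor space of infinite bit sequences, which yields decidable equality for total functions out of that space, provided each predicate inspects only finitely many bits of any given input:

```python
# "Seemingly impossible" exhaustive search over infinite bit sequences,
# with thunks simulating laziness. A sequence is a function int -> {0, 1};
# find(p) returns a sequence satisfying p whenever one exists.

def cons(b, tail_thunk):
    """Prepend bit b to a lazily computed sequence."""
    return lambda n: b if n == 0 else tail_thunk()(n - 1)

def find(p):
    """Return a bit sequence satisfying p, if any sequence does."""
    def extend(b):
        return cons(b, lambda: find(lambda s: p(cons(b, lambda: s))))
    return extend(0) if p(extend(0)) else extend(1)

def forsome(p):
    return p(find(p))

def forevery(p):
    return not forsome(lambda s: not p(s))

def equal(f, g):
    """Decide f == g for total functions on infinite bit sequences."""
    return forevery(lambda s: f(s) == g(s))

f = lambda s: 2 * s(1) + s(2)
g = lambda s: s(2) + s(1) * 2
h = lambda s: s(0)
print(equal(f, g), equal(f, h))  # True False
```

Without memoization this recomputes heavily, so it is only practical for predicates touching a few bits; Escardó's Haskell versions show how to make it fast.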
http://lambda-the-ultimate.org/node/5058
<p >In his <A href="http://existentialtype.wordpress.com/2014/09/26/sml-family-org-up-and-running/">blog</A>, <A href="http://www.cs.cmu.edu/~rwh/">Bob Harper</A>, in joint effort with <A href="http://people.cs.uchicago.edu/~dbm/">Dave MacQueen</A> and <A href="http://www.lars.com/">Lars Bergstrom</A>, announces the launch of <A href="http://sml-family.org">sml-family.org</A>:</p>
<blockquote ><p >
The Standard ML Family project provides a home for online versions of various formal definitions of Standard ML, including the "Definition of Standard ML, Revised" (Standard ML 97). The site also supports coordination between different implementations of the Standard ML (SML) programming language by maintaining common resources such as the documentation for the <A href="http://sml-family.org/Basis/index.html">Standard ML Basis Library</A> and standard test suites. The goal is to increase compatibility and resource sharing between Standard ML implementations. </p>
<p >The site includes a <A href="http://sml-family.org/#History">history section</A> devoted to the history of ML, and of Standard ML in particular. This section will contain a collection of original source documents relating to the design of the language.
</p></blockquote>FunFunctionalHistoryImplementationParadigmsSemanticsTheoryTue, 30 Sep 2014 19:27:43 +0000Inferring algebraic effects
http://lambda-the-ultimate.org/node/5055
<p ><a href="http://www.lmcs-online.org/ojs/viewarticle.php?id=1469&layout=abstract">Logical Methods in Computer Science</A> just published <a href="http://matija.pretnar.info/">Matija Pretnar</A>'s <A href="http://arxiv.org/pdf/1312.2334.pdf">latest take on algebraic effects and handlers</A>:</p>
<blockquote ><p >
We present a complete polymorphic effect inference algorithm for an ML-style language with handlers of not only exceptions, but of any other algebraic effect such as input & output, mutable references and many others. Our main aim is to offer the programmer a useful insight into the effectful behaviour of programs. Handlers help here by cutting down possible effects and the resulting lengthy output that often plagues precise effect systems. Additionally, we present a set of methods that further simplify the displayed types, some even by deliberately hiding inferred information from the programmer.
</p></blockquote>
<p >Pretnar and <A href="http://andrej.com/">Bauer</A>'s <A href="http://eff-lang.org/">Eff</A> has made <A href="http://lambda-the-ultimate.org/node/4090">previous</A> <A href="http://lambda-the-ultimate.org/node/4481">appearances</A> here on LtU. Apart from the new fangled polymorphic effect system, this paper also contains an Eff tutorial. </p>FunctionalImplementationParadigmsSemanticsTheorySat, 27 Sep 2014 23:16:37 +0000Luca Cardelli Festschrift
http://lambda-the-ultimate.org/node/5044
<p >Earlier this week Microsoft Research Cambridge organised a <A href="http://research.microsoft.com/en-us/events/lucacardellifest/">Festschrift</A> for Luca Cardelli. The preface from the <A href="http://research.microsoft.com/pubs/226237/Luca-Cardelli-Fest-MSR-TR-2014-104.pdf">book</A>:</p>
<blockquote ><p >
Luca Cardelli has made exceptional contributions to the world of programming<br >
languages and beyond. Throughout his career, he has re-invented himself every<br >
decade or so, while continuing to make true innovations. His achievements span<br >
many areas: software; language design, including experimental languages;<br >
programming language foundations; and the interaction of programming languages<br >
and biology. These achievements form the basis of his lasting scientific leadership<br >
and his wide impact.<br >
...<br >
Luca is always asking "what is new", and is always looking to<br >
the future. Therefore, we have asked authors to produce short pieces that would<br >
indicate where they are today and where they are going. Some of the resulting<br >
pieces are short scientific papers, or abridged versions of longer papers; others are<br >
less technical, with thoughts on the past and ideas for the future. We hope that<br >
they will all interest Luca.
</p></blockquote>
<p >Hopefully the videos will be posted soon.</p>Category TheoryLambda CalculusMisc BooksSemanticsTheoryType TheoryFri, 12 Sep 2014 10:10:08 +0000Dependently-Typed Metaprogramming (in Agda)
http://lambda-the-ultimate.org/node/4804
<p ><A href="http://www.strictlypositive.org">Conor McBride</A> gave an 8-lecture summer course on <A href="http://www.cl.cam.ac.uk/~ok259/agda-course-13/">Dependently typed metaprogramming (in Agda)</A> at the <A href="http://www.cl.cam.ac.uk/">Cambridge University Computer Laboratory</A>:</p>
<blockquote ><p >
Dependently typed functional programming languages such as Agda are capable of expressing very precise types for data. When those data themselves encode types, we gain a powerful mechanism for abstracting generic operations over carefully circumscribed universes. This course will begin with a rapid dependently-typed programming primer in Agda, then explore techniques for and consequences of universe constructions. Of central importance are the “pattern functors” which determine the node structure of inductive and coinductive datatypes. We shall consider syntactic presentations of these functors (allowing operations as useful as symbolic differentiation), and relate them to the more uniform abstract notion of “container”. We shall expose the double-life containers lead as “interaction structures” describing systems of effects. Later, we step up to functors over universes, acquiring the power of inductive-recursive definitions, and we use that power to build universes of dependent types.
</p></blockquote>
<p >The <A href="https://github.com/pigworker/MetaprogAgda/blob/master/notes.pdf?raw=true">lecture notes</A>, <A href="https://github.com/pigworker/MetaprogAgda">code</A>, and <A href="http://www.youtube.com/playlist?list=PL_shDsyy0xhKhsBUaVXTJ2uJ78EGBpvQa">video captures</A> are available online. </p>
<p >As with his <A href="http://www.cl.cam.ac.uk/~ok259/agda-course/">previous course</A>, the notes contain many(!) mind-expanding exploratory exercises, some of which are quite challenging.</p>Category TheoryFunctionalLambda CalculusMeta-ProgrammingParadigmsSemanticsTeaching & LearningTheoryType TheoryFri, 30 Aug 2013 07:34:49 +0000Milner Symposium 2012
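The "pattern functor" idea from the course abstract can be shrunk to a dynamically typed sketch: a datatype is the fixed point of a functor describing one layer of nodes, and a single generic fold then works for every algebra over that layer. The tuple encoding of the list functor F(X) = 1 + A × X below is illustrative only:

```python
# Pattern functors in miniature: one layer of a list is either ('nil',)
# or ('cons', element, substructure). A generic catamorphism tears down
# the fixed point layer by layer; the algebras themselves are not recursive.

def fmap(h, layer):
    """Apply h to the recursive positions of one F-layer."""
    if layer[0] == 'nil':
        return layer
    _, a, x = layer
    return ('cons', a, h(x))

def cata(alg, t):
    """Generic fold over the fixed point of the functor."""
    return alg(fmap(lambda sub: cata(alg, sub), t))

def from_list(xs):
    t = ('nil',)
    for a in reversed(xs):
        t = ('cons', a, t)
    return t

length = lambda layer: 0 if layer[0] == 'nil' else 1 + layer[2]
total  = lambda layer: 0 if layer[0] == 'nil' else layer[1] + layer[2]

t = from_list([1, 2, 3])
print(cata(length, t), cata(total, t))  # 3 6
```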
http://lambda-the-ultimate.org/node/4620
<p >The <A href='http://events.inf.ed.ac.uk/Milner2012/'>Milner Symposium 2012</A> was held in Edinburgh this April in memory of the <A href='http://lambda-the-ultimate.org/node/3863'>late Robin Milner</A>. </p>
<blockquote ><p >The Milner Symposium is a celebration of the life and work of one of the world's greatest computer scientists, Robin Milner. The symposium will feature leading researchers whose work is inspired by Robin Milner.</p></blockquote>
<p >The <A href='http://events.inf.ed.ac.uk/Milner2012/programme.html'>programme</A> consisted of academic talks by colleagues and past students. The <A href='http://events.inf.ed.ac.uk/Milner2012/videos.html'>talks</A> and <A href='http://events.inf.ed.ac.uk/Milner2012/slides.html'>slides</A> are available online.</p>
<p >I particularly liked the interleaving of the personal and human narrative underlying the scientific journey. A particularly good example is Joachim Parrow's <A href='http://events.inf.ed.ac.uk/Milner2012/J_Parrow-html5-mp4.html'>talk</A> on the origins of the pi calculus. Of particular interest to LtU members is the <A href='http://events.inf.ed.ac.uk/Milner2012/Monday_Panel-html5-mp4.html'>panel</A> on the future of functional programming languages, consisting of Phil Wadler, Xavier Leroy, David MacQueen, Martin Odersky, Simon Peyton-Jones, and Don Syme.</p>FunctionalGeneralHistoryParallel/DistributedSemanticsTheoryTue, 16 Oct 2012 17:31:48 +0000Mechanized Î»<sub>JS</sub>
http://lambda-the-ultimate.org/node/4555
<p ><a href="http://brownplt.github.com/2012/06/04/lambdajs-coq.html">Mechanized λ<sub >JS</sub></a><br >
The Brown PLT Blog, 2012-06-04</p>
<blockquote ><p >
In an earlier post, we introduced λ<sub >JS</sub>, our operational semantics for JavaScript. Unlike many other operational semantics, λ<sub >JS</sub> is no toy, but strives to correctly model JavaScript's messy details. To validate these claims, we test λ<sub >JS</sub> with randomly generated tests and with portions of the Mozilla JavaScript test suite.</p>
<p >Testing is not enough. Despite our work, other researchers found a missing case in λ<sub >JS</sub>. Today, we're introducing Mechanized λ<sub >JS</sub>, which comes with a machine-checked proof of correctness, using the Coq proof assistant.
</p></blockquote>
<p >More work on mechanizing the actual, implemented semantics of a real language, rather than a toy.</p>FunctionalJavascriptLambda CalculusSemanticsWed, 27 Jun 2012 15:28:37 +0000
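For a sense of what an operational semantics in this style looks like in miniature, here is a small-step semantics for the pure call-by-value λ-calculus; λ<sub>JS</sub> scales this kind of definition up to JavaScript's messy details. The toy below uses naive substitution, so examples should be closed terms with distinct bound names:

```python
# A small-step operational semantics for call-by-value λ-calculus.
# Terms: ('var', x), ('lam', x, body), ('app', f, a). Substitution is
# naive (no capture avoidance), fine for closed terms with distinct names.

def subst(t, x, v):
    tag = t[0]
    if tag == 'var':
        return v if t[1] == x else t
    if tag == 'lam':
        return t if t[1] == x else ('lam', t[1], subst(t[2], x, v))
    return ('app', subst(t[1], x, v), subst(t[2], x, v))

def is_value(t):
    return t[0] == 'lam'

def step(t):
    """One reduction step, left-to-right call-by-value."""
    assert t[0] == 'app'
    f, a = t[1], t[2]
    if not is_value(f):
        return ('app', step(f), a)
    if not is_value(a):
        return ('app', f, step(a))
    return subst(f[2], f[1], a)           # β-reduction

def evaluate(t):
    while not is_value(t):
        t = step(t)
    return t

identity = ('lam', 'x', ('var', 'x'))
const_k  = ('lam', 'a', ('lam', 'b', ('var', 'a')))
prog = ('app', ('app', const_k, identity), ('lam', 'z', ('var', 'z')))
print(evaluate(prog))  # ('lam', 'x', ('var', 'x'))
```

Mechanizing such a semantics in Coq means writing `step` as an inductive relation and proving properties like progress and preservation about it, which is what the Brown PLT work does at JavaScript scale.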