meta: September "propose a post" post (proposal)

I propose that once per month (or so), someone remember to create a "meta: <MONTH> 'propose a post' post (proposal)".

There is no need for more than one "propose a post" topic per month (or so).

Within the comments of those monthly "propose a post" posts, people should follow the rules of a game.

The (voluntary) "rules" of the game:

(1) Top-level comments propose "(new topic)" LtU posts that have not yet been made.

(2) 2nd-level comments speak solely to the virtue of creating the proposed topic.

(3) No 3rd-level comments, and be brief, please.

(Just a proposal.)

I think an interesting topic

I think an interesting topic to discuss is how PLT is progressing.

  • Lisp: 1958
  • ISWIM: 1966
  • ML: 1973
  • Martin-Löf type theory: ~1970

From 1958 to 1973 is 15 years. We're now an additional 42 years later. Is progress slowing down?

Imagine PLT like a tree with many branches, and progress is extending the leaves of that tree. What does progress look like? Will different branches of the tree continue to be successful, or will all but one branch die out or converge? To what extent is PLT designed, to what extent is it discovered?

More concretely, there's probably lots of interesting work from ICFP to discuss.

could be interesting

That could be quite interesting to discuss. You never really know till you try, of course; sometimes a promising topic just doesn't spark. But it seems to me to have some potential.

Seconded

I like the historical perspective, and the way that it ties neatly into (many) views of what is interesting right now.

Somebody really needs to

Somebody really needs to volunteer to become a PLT historian, but most of us are too busy chasing our pink elephants to bother.

suggestion: short bibliographies re milestones

Perhaps this could be broken up over several topics, each of which contains a brief bibliography of materials available on-line.

For example, for lisp: early papers; papers typical of what they were reacting to; some noteworthy retrospective papers.

It might be too much to ask, but a pie-in-the-sky wish would be for this to result in a big pile of PDFs that can be freely shared.

When did Haskell become hard?

I'm putting this here because it might fit as one specific sub-branch looking at how Haskell has changed between 1998 and 2010. It seems hard to pin down a specific point in time, but some (very vague) trends would be:
* Monads were once an 'exotic' topic, literally at the back of the textbook that we had as undergrads. Now a monad transformer stack seems to be the standard approach to any non-pure problem.
* Hitting the limits of expressibility in universal types, particularly in domains that language hackers care about, i.e. writing interpreters and compilers.
* The number of "Haskells" seems to have multiplied, because many language extensions seem to have become necessary for different domain problems.
* Ten years ago there were interesting research directions around SYB on one hand and tagless interpreters on the other. Where did it go next?
I'm writing this because, sitting down to write an interpreter in Haskell for the first time in about a decade, I'm quite shocked by how much it has changed. I'm wondering what happened to the research direction of handling the genericity of ASTs cleanly within Haskell. Did it reach a definite ending point as a solved problem (and if so, where is the best place to look for a tutorial on how to do it), or did it just sort of peter out after reaching a state that was "good enough" that people moved on to other things? Roughly speaking, this could be one strand within "good stuff at ICFP and other venues" over the years in question.
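
For a concrete reference point, here is a minimal sketch of the "tagless" style mentioned above (the tagless-final encoding of Carette, Kiselyov and Shan); the class ExpSym and the two toy interpreters are illustrative names of my own, not from any particular tutorial. Terms are written once against a type class, and each interpreter is just another instance, so the "AST" stays open to new interpretations:

  -- A minimal tagless-final sketch (plain Haskell, no extensions).
  -- The "AST" is a type class; each interpretation is an instance.
  class ExpSym repr where
    lit :: Int -> repr Int
    add :: repr Int -> repr Int -> repr Int

  -- Interpreter 1: evaluation.
  newtype Eval a = Eval { runEval :: a }
  instance ExpSym Eval where
    lit n   = Eval n
    add x y = Eval (runEval x + runEval y)

  -- Interpreter 2: pretty-printing (the type parameter is phantom).
  newtype Pretty a = Pretty { runPretty :: String }
  instance ExpSym Pretty where
    lit n   = Pretty (show n)
    add x y = Pretty ("(" ++ runPretty x ++ " + " ++ runPretty y ++ ")")

  -- One term, two interpretations:
  -- runEval ex == 6, runPretty ex == "(1 + (2 + 3))".
  ex :: ExpSym repr => repr Int
  ex = add (lit 1) (add (lit 2) (lit 3))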

Those things are still being

Those things are still being researched. The main change in Haskell the language seems to be its convergence toward dependent types (driven, in part, by typed interpreters -> GADTs). In the community, things that were once hard are now ordinary. This is how it goes with many topics: results that were at the frontier of mathematics and physics not so long ago are now taught to undergraduates. Even in programming this has happened before: long ago, OOP was like monads, thought to be hard and exotic and known only to experts.
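
To make the "typed interpreters -> GADTs" point concrete, here is the standard few-line example (the names are illustrative, not from any specific source): the GADT indexes each expression by the type of value it evaluates to, so the interpreter needs no runtime tags and no failure cases.

  {-# LANGUAGE GADTs #-}

  -- Each constructor fixes the result type of the expression it builds,
  -- so ill-typed terms like Add (BoolLit True) (IntLit 1) are rejected
  -- by the Haskell type checker rather than at interpretation time.
  data Expr a where
    IntLit  :: Int  -> Expr Int
    BoolLit :: Bool -> Expr Bool
    Add     :: Expr Int -> Expr Int -> Expr Int
    If      :: Expr Bool -> Expr a -> Expr a -> Expr a

  -- Total, tag-free evaluation: the type index does the work that a
  -- universal "Value" datatype and runtime checks used to do.
  eval :: Expr a -> a
  eval (IntLit n)  = n
  eval (BoolLit b) = b
  eval (Add x y)   = eval x + eval y
  eval (If c t e)  = if eval c then eval t else eval e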

Here are some major advances

Here are some major advances in semantics, from about 1980 to about 2000. I'll stop there because the list is already getting long....

  • Rely/guarantee reasoning: 1981
  • The effective topos: 1982
  • Reynolds' parametricity: 1983
  • Girard's linear logic: 1987
  • Wadler and Blott's typeclasses: 1989
  • PER models of System F: 1990
  • Moggi's monadic semantics of effects: 1991
  • Wright/Felleisen syntactic type soundness: 1994
  • Proof-carrying code: 1996
  • O'Hearn and Reynolds' Separation logic: 1999
  • Step-indexed logical relations: 2001

This is a very partial list: I've left out a lot of stuff, such as denotational advances like the application of categorical methods to domain theory, game semantics, advances in proof theory like focalization, anything to do with concurrency and the pi-calculus, anything to do with model checking or SMT, and anything to do with implementation (whether of model checkers, compilers, or proof assistants).

Anyway, IMO the real excitement doesn't come from any single advance. It comes from the fact that enough knowledge has been built up that we can actually cash the checks that verification/semantics wrote in the 1970s: completely correct system software, with machine-checked proofs all the way from specs to machine code, is now feasible to build. Milner, Hoare, Dijkstra and co. were right, and we can prove it.

back to the future

So where is my commercially viable, fully formally specified and checked operating system? And where is my hover board? It's 2015 already!

seL4

So where is my commercially viable, fully formally specified and checked operating system?

seL4!

My answer is a bit tongue-in-cheek, but I could not resist. It is true that seL4 is only a kernel, and would probably not run as-is on the computers you are interested in. On the other hand, it was successfully transferred to industry and eventually found its way into a bunch of phones, if I remember correctly. It's tangible proof that, at least in some contexts, formally verified software is feasible and useful.

And where is my hover board? It's 2015 already!

Science takes time.

Hover board

The question: Which one?

From Lexus:
http://www.people.com/article/hoverboard-real-back-to-the-future

or Hendo: http://www.digitaltrends.com/cool-tech/hendo-hover-board-kickstarter-back-to-the-future/

:-)

Heiko

I really like this list,

I really like this list, thanks!

State of the art

I would like to see posts that pick a topic and attempt to survey the research lines that address that topic, identify which are state of the art or failed/superseded, and give arguments for which research directions are most likely to "win". Examples of topics I personally find interesting (in no particular order) are: FRP, module systems, how to design a numeric library (and units), high-level vs. low-level language tension, dependent types in the mainstream, strictness vs. laziness, extensible syntax, libraries vs. services, and very generally how to organize the concepts of a programming language.

POPL

After a brief search I could not find a post discussing POPL this year; a link would be appreciated if I missed it. What I have in mind is roughly:
* What is going on this year, the aerial view of PL research according to POPL?
* What's important in this year's proceedings - which results are incremental/substantial or interesting/necessary?
* Why are any of this year's papers important - how do they relate to the important problems in PL research?

Experience suggests that often these things degenerate into less interesting topics that could be avoided:
* Why POPL ignores my subfield.
* Why I don't like POPL.
* The problems that the "POPL crowd" are chasing are irrelevant to me.

It would be nice to discover if LtU can still hold a discussion about what the major conference in many of our fields can tell us about the direction of our subject this year.

Looking at the accepted

Looking at the accepted paper list...wow. They are up there with PLDI and OOPSLA.

  1. A Scalable, Correct Time-Stamped Stack
  2. From Communicating Machines to Graphical Choreographies
  3. Equations, contractions, and unique solutions
  4. Formal verification of a C static analyzer
  5. Space-Efficient Manifest Contracts
  6. Quantitative Interprocedural Analysis
  7. Integrating Linear and Dependent Types
  8. Functors are type refinement systems
  9. Safe and Efficient Gradual Typing for TypeScript
  10. Sound Modular Verification of C Code Executing in an Unverified Context
  11. Program Boosting: Program Synthesis via Crowd-Sourcing
  12. Programming up to Congruence
  13. Deep Specifications and Certified Abstraction Layers
  14. A Meta Lambda Calculus with Cross-Level Computation
  15. Specification Inference Using Context-Free Language Reachability
  16. Runtime enforcement of security policies on black box reactive programs
  17. Faster Algorithms for Algebraic Path Properties in RSMs with Constant Treewidth
  18. Iris: Monoids and Invariants as an Orthogonal Basis for Concurrent Reasoning
  19. DReX: A Declarative Language for Efficiently Computable Regular String Transformations
  20. K-Java: A Complete Semantics of Java
  21. Higher Inductive Types as Homotopy-Initial Algebras
  22. A Calculus for Relaxed Memory
  23. Compositional CompCert
  24. Abstract Symbolic Automata
  25. Analyzing Program Analyses
  26. Self-Representation in Girard's System U
  27. Conjugate Hylomorphisms: The Mother of All Structured Recursion Schemes
  28. Decentralizing SDN Policies
  29. Polymorphic Functions with Set-Theoretic Types. Part 2: Local Type Inference and Type Reconstruction
  30. Principal Type Schemes for Gradual Programs
  31. Full Abstraction for Signal Flow Graphs
  32. Dependent Information Flow Types
  33. Common compiler optimisations are invalid in the C11 memory model and what we can do about it
  34. Manifest Contracts for Datatypes
  35. Summary-Based Context-Sensitive Data-Dependence Analysis in Presence of Callbacks
  36. Leveraging Weighted Automata to Compositional Reasoning of Probabilistic Programs
  37. Algebraic effects, linearity, and quantum programming languages
  38. Higher-Order Approximate Relational Refinement Types for Mechanism Design and Differential Privacy
  39. Probabilistic Termination
  40. Tractable Refinement Checking for Concurrent Objects
  41. From Network Interface to Multithreaded Web Applications: A Case Study in Modular Program Verification
  42. Ur/Web: A Simple Model for Programming the Web
  43. Differential Privacy: Now it's Getting Personal
  44. Fiat: Deductive Synthesis of Abstract Data Types in a Proof Assistant
  45. Symbolic Algorithms for Language Equivalence and Kleene Algebra with Tests
  46. Succinct Representation of Concurrent Trace Sets
  47. Predicting Program Properties from Big Code
  48. On Characterizing the Data Access Complexity of Programs
  49. A Coalgebraic Decision Procedure for NetKAT
  50. Proof Spaces for Unbounded Parallelism
  51. Towards the Essence of Hygiene
  52. Data Parallel String Manipulating Programs

I can't see anything that suggests a topic change or an emerging hot topic, but POPL isn't my thing.

Important at POPL

Topics that were big at POPL this year:

* Verified compiler correctness/compilation to relaxed memory
* Gradual typing/dynamic languages
* SDNs
* Program synthesis
* Big semantics

This is, of course, a subset of "what's hot in academic PL in the last 5 years".