Reverend Bayes, meet Countess Lovelace: Probabilistic Programming for Machine Learning

Andy Gordon's talk, "Reverend Bayes, meet Countess Lovelace: Probabilistic Programming for Machine Learning", is online.


We propose a marriage of probabilistic functional programming with Bayesian reasoning. Infer.NET Fun turns the simple, succinct syntax of F# into an executable modeling language: you can code up the conditional probability distributions of Bayes' rule using F# array comprehensions with constraints. Write your model in F#. Run it directly to synthesize test datasets and to debug models. Or compile it with Infer.NET for efficient statistical inference. Hence, efficient algorithms for a range of regression, classification, and specialist learning tasks can be derived by probabilistic functional programming.
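As a taste of that style, here is a minimal sketch in plain F# (illustrative only; this is ordinary F#, not the Infer.NET Fun API): a tiny generative model run forwards to synthesize a test dataset.

    // Illustrative sketch in plain F#, not the Infer.NET Fun API:
    // a tiny generative model, run forwards to synthesize test data.
    let rng = System.Random()
    let bernoulli p = rng.NextDouble() < p

    // Generative model: two biased coins and their conjunction.
    let model () =
        let coin1 = bernoulli 0.7
        let coin2 = bernoulli 0.7
        (coin1, coin2, coin1 && coin2)

    // Run the model directly to synthesize a test dataset.
    let dataset = [ for _ in 1 .. 100 -> model () ]

Under Infer.NET Fun, a model written in this style, with observations added, would instead be compiled for efficient inference; the point here is just that the model itself is ordinary executable F#.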


The recent developments

The recent developments around non-deterministic and probabilistic programming (a.k.a. weighted non-determinism) are very exciting. There are many different implementation techniques with different strengths and weaknesses, such as importance sampling, Gibbs sampling, and belief propagation.

On the other hand, techniques from SAT solving, such as unit propagation, nogood learning, and symmetry reduction, have not yet been applied to probabilistic programming (as far as I know). Also, frequentist inference techniques like maximum likelihood and Bayesian loss-function minimization can probably be handled in the continuous case with automatic differentiation. Perhaps the bag of implementation techniques will continue to expand, or perhaps they will be unified into a single technique that combines their different strengths.

In any case, this direction of research is very exciting, because the way you specify a generative model is often even higher level than the English prose you find in machine learning papers, yet you can automatically and generically do inference with it, instead of having to hire a statistician and a computer scientist to develop inference algorithms. If the inference algorithms become good enough, that could transform how business decision making and experimental science are done.
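For a toy illustration of the "weighted non-determinism" reading (my own sketch in plain F#, not tied to any particular system): each run of the model carries a weight, here the simplest rejection-style 0/1 weight for a hard observation.

    // Toy sketch of weighted non-determinism: estimate
    // P(coin1 | coin1 || coin2) by weighting each run of the model
    // by the observation (0/1 weights, the rejection-sampling case).
    let rng = System.Random()
    let bernoulli p = rng.NextDouble() < p

    // One run: the latent variable of interest, plus the run's weight.
    let run () =
        let coin1 = bernoulli 0.5
        let coin2 = bernoulli 0.5
        (coin1, (if coin1 || coin2 then 1.0 else 0.0))

    let runs = [ for _ in 1 .. 100000 -> run () ]
    let numer = runs |> List.sumBy (fun (c, w) -> if c then w else 0.0)
    let denom = runs |> List.sumBy snd
    printfn "P(coin1 | coin1 || coin2) ~ %f" (numer / denom)
    // Prints roughly 0.667, matching the exact answer 2/3.

Roughly speaking, importance sampling generalizes this by allowing fractional weights, while Gibbs sampling and belief propagation avoid wasting the rejected runs in different ways.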

The elephant in the Bayesian living room

Almost nobody knows how to handle queries like

P[X | Y = Z^2]

generally. Getting probabilistic and logical inference together could start to address this.

It's not like Bayesians don't know it's a problem. It shows up every time they want to observe the output of a deterministic function of random variables. But they can't say what those kinds of queries mean, because the conditioning event typically has probability zero, and Bayes' law for densities can't express conditioning on a probability-zero event. For now, they just model around it.
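Concretely (my gloss, not part of the original comment): for continuous Y and Z the elementary definition degenerates, since the conditioning event has probability zero:

    P[X ∈ A | Y = Z^2] = P[X ∈ A, Y = Z^2] / P[Y = Z^2] = 0/0.

The usual workaround conditions on a limit of shrinking events,

    lim_{ε→0} P[X ∈ A, |Y − Z^2| < ε] / P[|Y − Z^2| < ε],

and the Borel-Kolmogorov paradox shows the limit can depend on how those events shrink, so the query has no canonical meaning without extra structure.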

FWIW, I'm working on this, and having some initial success with an arrow type whose inhabitants calculate conservative approximations of images and preimages. It's a fairly direct translation of the measure-theoretic approach.
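For a rough sense of what conservative images and preimages might look like (my own interval-arithmetic sketch in F#; the arrow type mentioned above is presumably more general), sets can be over-approximated by closed intervals:

    // Interval-arithmetic sketch (illustrative only): sets are
    // over-approximated by closed intervals [Lo, Hi].
    type Interval = { Lo : float; Hi : float }

    // Conservative image of an interval under x -> x * x.
    let imageSquare (iv : Interval) =
        let lo, hi = iv.Lo, iv.Hi
        if lo >= 0.0 then { Lo = lo * lo; Hi = hi * hi }
        elif hi <= 0.0 then { Lo = hi * hi; Hi = lo * lo }
        else { Lo = 0.0; Hi = max (lo * lo) (hi * hi) }

    // Conservative preimage under x -> x * x, nonnegative branch only;
    // the full preimage is this interval together with its negation.
    let preimageSquareNonneg (iv : Interval) =
        if iv.Hi < 0.0 then None // empty preimage
        else Some { Lo = sqrt (max iv.Lo 0.0); Hi = sqrt iv.Hi }

Observing Y = Z^2 then amounts to intersecting the program's state with the preimage of a small interval around zero under (y, z) -> y − z^2, shrinking the interval to refine the approximation.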

Are you sure it's an

Are you sure it's an elephant? According to various Bayesian algorithms, it might be a pillar, a rope, a tree branch, a hand fan, a wall, or a solid pipe.

Time to become a

Time to become a Contributing Editor?

Contributing Editor

You should just force that privilege on people you deem worthy.

I just force their hand...

I just force their hand...

Texts?

Are there any comprehensive texts people here would recommend on probabilistic programming as a general topic? Is this something that's sitting in the primary literature at the moment?