POPL 2017
Sun 15 - Sat 21 January 2017
Wed 18 Jan 2017 14:45 - 15:10 at Amphitheater 44 - Probabilistic Programming. Chair(s): Marco Gaboardi

Bayesian inference, of posterior knowledge from prior knowledge and observed evidence, is typically defined by Bayes’s rule, which says the posterior multiplied by the probability of an observation equals a joint probability. But the observation of a continuous quantity usually has probability zero, in which case Bayes’s rule says only that the unknown times zero is zero. To infer a posterior distribution from a zero-probability observation, the statistical notion of disintegration tells us to specify the observation as an expression rather than a predicate, but does not tell us how to compute the posterior. We present the first method of computing a disintegration from a probabilistic program and an expression of a quantity to be observed, even when the observation has probability zero. Because the method produces an exact posterior term and preserves a semantics in which monadic terms denote measures, it composes with other inference methods in a modular way—without sacrificing accuracy or performance.
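
The identities the abstract alludes to can be sketched in standard textbook notation; the densities p, joint measure \mu, marginal \nu, and kernel \kappa below are illustrative choices, not the paper's own formalism. Bayes's rule, stated as "posterior times the probability of the observation equals the joint", reads

  \[ p(x \mid y)\, p(y) \;=\; p(x, y) \]

When the observation y is continuous, p(y) = 0, so this identity says only that an unknown times zero is zero. A disintegration instead factors the joint measure through the marginal of the observed quantity,

  \[ \mu(A \times B) \;=\; \int_{B} \kappa(y, A)\,\nu(\mathrm{d}y) \]

and the kernel \kappa(y, \cdot) plays the role of the posterior given the observation y, which remains well defined even when each individual observation has probability zero.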

Wed 18 Jan

14:20 - 16:00: POPL - Probabilistic Programming at Amphitheater 44
Chair(s): Marco Gaboardi (SUNY Buffalo, USA)
14:20 - 14:45
Talk
Leonidas Lampropoulos (University of Pennsylvania), Diane Gallois-Wong (Inria Paris, ENS Paris), Cătălin Hriţcu (Inria Paris), John Hughes (Chalmers University of Technology), Benjamin C. Pierce (University of Pennsylvania), Li-yao Xia (ENS Paris)
Pre-print
14:45 - 15:10
Talk
Chung-chieh Shan (Indiana University, USA), Norman Ramsey
Pre-print
15:10 - 15:35
Talk
Krishnendu Chatterjee (IST Austria), Petr Novotny (IST Austria), Djordje Zikelic (University of Cambridge)
15:35 - 16:00
Talk