Tue 21 Jan 2020 14:00 - 14:30 at St Claude - C

Universal probabilistic programming systems (PPSs) provide a powerful framework for specifying rich and complex probabilistic models. However, this expressiveness comes at the cost of substantially complicating the process of drawing inferences from the model. In particular, inference can become challenging when the support of the model varies between executions. Though general-purpose inference engines have been designed to operate in such settings, they are typically inefficient, often relying on proposing from the prior to make transitions. To address this, we introduce a new inference framework: Divide, Conquer, and Combine (DCC). DCC divides the program into separate straight-line sub-programs, each of which has a fixed support, allowing more powerful inference algorithms to be run locally before their outputs are recombined in a principled fashion. We show how DCC can be implemented as an automated and general-purpose PPS inference engine, and empirically confirm that it can provide substantial performance improvements over previous approaches.
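
As a rough illustration of the pipeline described in the abstract, the Python sketch below applies the divide/conquer/combine steps to a toy model with stochastic support. The model, the variable names, and the use of self-normalised importance sampling for the local inference step are illustrative assumptions, not the authors' actual implementation.

# Minimal sketch of the DCC idea on a toy model with stochastic support.
# The model and the local inference method (importance sampling from the
# prior) are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
y_obs = 2.5  # hypothetical observation

def log_lik(x):
    # y ~ Normal(x, 1): Gaussian log-likelihood of the observation
    return -0.5 * (y_obs - x) ** 2 - 0.5 * np.log(2 * np.pi)

# Divide: the program branches on a Bernoulli(0.7) choice, giving two
# straight-line sub-programs (SLPs), each with a fixed support over x:
#   branch 0 (prob 0.3): x ~ Normal(0, 1)
#   branch 1 (prob 0.7): x ~ Normal(5, 2)
slps = {
    0: dict(log_p_branch=np.log(0.3), sample_x=lambda m: rng.normal(0.0, 1.0, m)),
    1: dict(log_p_branch=np.log(0.7), sample_x=lambda m: rng.normal(5.0, 2.0, m)),
}

# Conquer: run local inference within each SLP's fixed support. Here we use
# self-normalised importance sampling from the local prior; any fixed-support
# method (HMC, SMC, ...) could be substituted.
n = 10_000
local = {}
for k, slp in slps.items():
    xs = slp["sample_x"](n)
    logw = log_lik(xs)  # proposal == prior, so weights are just likelihoods
    log_Z = slp["log_p_branch"] + np.logaddexp.reduce(logw) - np.log(n)
    w = np.exp(logw - logw.max())
    local[k] = dict(log_Z=log_Z, xs=xs, w=w / w.sum())

# Combine: weight each SLP's local posterior by its estimated local
# marginal likelihood, giving posterior probabilities over the branches.
log_Zs = np.array([local[k]["log_Z"] for k in slps])
pi = np.exp(log_Zs - np.logaddexp.reduce(log_Zs))
post_mean = sum(p * np.sum(local[k]["w"] * local[k]["xs"]) for p, k in zip(pi, slps))
print("SLP weights:", dict(zip(slps, np.round(pi, 3))))
print("Posterior mean of x:", round(post_mean, 3))

The key point the sketch tries to convey is that each sub-program has fixed support, so the local inference step never has to propose jumps between supports; all cross-support reasoning is deferred to the final reweighting.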

Tue 21 Jan

Displayed time zone: Saskatchewan, Central America

14:00 - 15:05
14:00
30m
Talk
Divide, Conquer, and Combine: a New Inference Strategy for Probabilistic Programs with Stochastic Support
LAFI
Yuan Zhou University of Oxford, Hongseok Yang KAIST, Yee Whye Teh University of Oxford, Tom Rainforth Department of Statistics, University of Oxford
14:32
15m
Talk
MetaPPL: Inference Algorithms as First-Class Generative Models
LAFI
Alexander K. Lew Massachusetts Institute of Technology, USA, Benjamin Sherman Massachusetts Institute of Technology, USA, Marco Cusumano-Towner MIT-CSAIL, Austin Garrett MIT, Ben Zinberg MIT, Vikash K. Mansinghka MIT, Michael Carbin Massachusetts Institute of Technology
File Attached
14:49
16m
Talk
Monte Carlo Semantic Differencing of Probabilistic Programs
LAFI