Variational Bayesian Optimal Experimental Design
Adam Foster† Martin Jankowiak‡ Eli Bingham‡ Paul Horsfall‡ Yee Whye Teh† Tom Rainforth† Noah D. Goodman‡§
†Oxford Statistics, ‡Uber AI, §Stanford
Spotlight, NeurIPS 2019
[Figure: the experimentation loop. The experimenter controls the experimental setup via the design d; a response y is sampled (data generated); the model is fitted to the data to infer the latent variable θ.]
Which would you prefer?
We model the experiment with a prior p(θ), a likelihood p(y | θ, d), and a posterior p(θ | y, d), where θ is the latent variable of interest, d is the design, and y is the data.
[Figure: the design → observation → inference cycle; some designs yield low information gain, others high information gain.]
Which would you prefer?
The expected information gain (EIG) is the expected reduction in entropy from the prior to the posterior (Lindley, 1956):

EIG(d) = H[p(θ)] − E_{p(y|d)}[ H[p(θ | y, d)] ]
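Expanding both entropy terms shows that the EIG is equivalently an expected log posterior-to-prior ratio (the mutual information between θ and y at design d); a sketch of the rearrangement:

```latex
\mathrm{EIG}(d)
  = H[p(\theta)] - \mathbb{E}_{p(y|d)}\big[H[p(\theta \mid y, d)]\big]
  = \mathbb{E}_{p(\theta)\,p(y \mid \theta, d)}
    \left[\log \frac{p(\theta \mid y, d)}{p(\theta)}\right]
```

The second form follows by writing each entropy as an expected negative log density and taking both expectations under the joint p(θ, y | d).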
Estimating the EIG is "doubly intractable": we can simulate samples from the prior and likelihood, but the posterior p(θ | y, d) and the marginal density p(y | d) are both unavailable in closed form, so the integrand itself must be approximated (e.g. via an approximate marginal density) inside the outer expectation.
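As a concrete illustration (not from the slides), here is a minimal nested Monte Carlo estimator of the EIG for a toy conjugate Gaussian model where the true EIG has a closed form; the model and all names are illustrative:

```python
import math
import random

def nmc_eig(d, n_outer=1000, n_inner=1000, seed=0):
    """Nested Monte Carlo EIG estimate for the toy model
    theta ~ N(0, 1),  y | theta, d ~ N(d * theta, 1)."""
    rng = random.Random(seed)

    def log_lik(y, theta):
        return -0.5 * math.log(2 * math.pi) - 0.5 * (y - d * theta) ** 2

    total = 0.0
    for _ in range(n_outer):
        theta = rng.gauss(0.0, 1.0)        # theta_n ~ p(theta)
        y = rng.gauss(d * theta, 1.0)      # y_n ~ p(y | theta_n, d)
        # Inner average approximates the intractable marginal p(y_n | d)
        inner = [math.exp(log_lik(y, rng.gauss(0.0, 1.0)))
                 for _ in range(n_inner)]
        log_marginal = math.log(sum(inner) / n_inner)
        total += log_lik(y, theta) - log_marginal
    return total / n_outer

# For this conjugate model the exact EIG is 0.5 * log(1 + d^2)
print(nmc_eig(1.0), 0.5 * math.log(2.0))
```

The nested inner loop is exactly what makes this estimator expensive: its bias shrinks only as the inner sample size grows.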
We introduce four variational EIG estimators — posterior, marginal, VNMC, and marginal + likelihood — and assess each on two criteria: does it handle implicit models, and is it consistent?
[Figure: estimator error versus computational cost T; markers are independent point estimates. NMC = Nested Monte Carlo.]
[Table: our estimators (Marginal + likelihood, VNMC, Marginal, Posterior) against baselines (NMC, Laplace, LFIRE, DV); "n/a" marks estimator–model combinations that are not applicable.]
[Figure: parameter recovery error (RMSE).]
Implementation in Pyro: docs.pyro.ai/en/stable/contrib.oed.html
Full paper: papers.nips.cc/paper/9553-variational-bayesian-optimal-experimental-design.pdf