Laplace approximation and robust Monte Carlo for Bayesian inference with informative data
![Thumbnail](http://memento.epfl.ch/image/19118/1440x810.jpg)
Event details
- Date: 20.05.2021
- Hour: 16:15–17:15
- Speaker: Bjoern Sprungk (TU Bergakademie Freiberg)
- Location: Online
- Category: Conferences - Seminars
In Bayesian statistics and Bayesian inverse problems, sampling from or integrating with respect to the posterior distribution can become computationally challenging, for instance, if the parameter space is high-dimensional or if the posterior is highly concentrated. For high-dimensional Bayesian inference, much effort has been devoted in recent years to deriving dimension-independent sampling methods. However, the challenge of a concentrated posterior resulting from large or informative data has drawn less attention so far, despite its importance in practice. In this talk, we exploit the well-known Laplace approximation as a suitable Gaussian reference measure for importance sampling as well as for Markov chain Monte Carlo sampling of the posterior. We analyse the statistical efficiency of these methods for decaying observational noise in the data, i.e., for an increasing concentration of the posterior measure. Besides convergence in Hellinger distance of the Laplace approximation to the posterior (a result closely related to the Bernstein-von Mises theorem), we also show that the proposed Laplace-based Monte Carlo methods perform robustly with respect to increasing concentration of the posterior, whereas prior-based sampling methods do not.
This is joint work with Claudia Schillings, Daniel Rudolf, and Philipp Wacker.
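To make the general idea concrete, the following is a minimal, self-contained sketch (not the speaker's code) of Laplace-based importance sampling for a toy one-dimensional Bayesian inverse problem with a standard Gaussian prior and a hypothetical forward map `G(u) = u**3`; all names, the noise level, and the numerical choices are illustrative assumptions. The Laplace approximation (a Gaussian centred at the MAP point with covariance given by the inverse Hessian of the negative log posterior) serves as the importance-sampling reference measure.

```python
# Minimal sketch: Laplace approximation as reference measure for importance sampling.
# Toy setup (illustrative assumptions): y = G(u) + noise, noise ~ N(0, sigma^2), prior u ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(0)

def G(u):
    return u**3  # hypothetical nonlinear forward model

sigma = 0.05                      # small observational noise -> concentrated posterior
u_true = 0.8
y = G(u_true) + sigma * rng.normal()

def neg_log_post(u):
    # negative log posterior, up to an additive constant
    return 0.5 * ((y - G(u)) / sigma) ** 2 + 0.5 * u**2

# Laplace approximation: MAP point via grid search, curvature via finite differences
# (kept dependency-free for illustration).
grid = np.linspace(-3.0, 3.0, 20001)
u_map = grid[np.argmin(neg_log_post(grid))]
h = 1e-4
hess = (neg_log_post(u_map + h) - 2 * neg_log_post(u_map) + neg_log_post(u_map - h)) / h**2
lap_std = 1.0 / np.sqrt(hess)     # Laplace covariance = inverse Hessian at the MAP

# Importance sampling with the Laplace approximation N(u_map, lap_std^2) as proposal.
n = 50_000
samples = u_map + lap_std * rng.normal(size=n)
log_w = -neg_log_post(samples) + 0.5 * ((samples - u_map) / lap_std) ** 2  # log(target/proposal), up to constants
w = np.exp(log_w - log_w.max())
w /= w.sum()                      # self-normalised weights

post_mean = np.sum(w * samples)
ess = 1.0 / np.sum(w**2)          # effective sample size
print(f"posterior mean ~ {post_mean:.4f}, ESS = {ess:.0f} of {n}")
```

In this sketch the effective sample size stays close to `n` even as `sigma` shrinks, because the Laplace proposal tracks the concentrating posterior; replacing the proposal by the prior N(0, 1) instead makes the weights degenerate, which illustrates the robustness contrast described in the abstract.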
Practical information
- General public
- Free
Organizer
- Fabio Nobile
Contact
- Nicolas Boumal