Exploiting Independence for Optimal Gaussian Importance Sampling
16.02.2024, 09:00 - 10:00
Room 2.9.1.22
SFB-Seminar
Stefan Heyder, TU Ilmenau
Importance sampling is a Monte Carlo technique that estimates posterior expectations in Bayesian computation by sampling from a tractable proposal distribution instead of the intractable posterior. Its performance crucially depends on how well the proposal approximates the true posterior. As the optimal proposal is usually inaccessible, parametric proposals, e.g., Gaussian ones, are popular. If, in addition, the prior is Gaussian, the Laplace approximation to the posterior provides a reasonable proposal.
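For concreteness (notation chosen here for illustration, not taken from the talk): given a posterior density p(x | y) ∝ p(y | x) p(x) and a proposal q, the self-normalized importance sampling estimator of E[h(X) | y] is

\[
\widehat{E}[h(X)\mid y] \;=\; \frac{\sum_{i=1}^N w_i\, h(x_i)}{\sum_{i=1}^N w_i},
\qquad
w_i \;=\; \frac{p(y\mid x_i)\, p(x_i)}{q(x_i)},
\qquad
x_i \sim q \ \text{i.i.d.},
\]

and the Laplace approximation takes q = N(x̂, Σ̂), where x̂ is the posterior mode and Σ̂ the inverse of the negative Hessian of log p(x | y) at x̂.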
This talk consists of two parts. First, I will derive a spatio-temporal state space model for reported COVID-19 infections in German counties (Landkreise), where the infections are modelled by a negative binomial distribution, rendering the posterior analytically intractable. In this model I show how one may exploit the available structure, i.e. conditional independence, to obtain importance sampling proposals for the full posterior, enabling frequentist inference and predictions. As the Laplace approximation is only local, whereas the quality of importance sampling depends on the global proximity of the proposal to the posterior, better Gaussian approximations are called for; I show how they may be obtained within the efficient importance sampling framework. This part of the talk is joint work with my PhD advisor, Thomas Hotz (Ilmenau).
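As a rough illustration of the efficient importance sampling idea, the following is a minimal one-dimensional sketch, with a Gaussian prior, a single negative binomial observation, and parameter values chosen purely for illustration; the spatio-temporal model in the talk is considerably richer. A Gaussian proposal is refined iteratively by weighted least squares on the log-likelihood.

import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(0)

y, r = 7, 10.0  # observed count and NB dispersion (illustrative values)

def log_lik(x):
    # negative binomial log-likelihood with mean exp(x) and dispersion r
    mu = np.exp(x)
    return (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
            + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu)))

def log_prior(x):
    # standard Gaussian prior on the latent state
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

# start from the prior as proposal; the Laplace approximation would also work
mean, var = 0.0, 1.0
N = 2000
for _ in range(10):
    x = mean + np.sqrt(var) * rng.standard_normal(N)
    log_q = -0.5 * (x - mean)**2 / var - 0.5 * np.log(2 * np.pi * var)
    log_w = log_lik(x) + log_prior(x) - log_q
    w = np.exp(log_w - log_w.max())  # importance weights (unnormalized)
    # EIS step: weighted regression of the log-likelihood on (1, x, x^2)
    X = np.column_stack([np.ones_like(x), x, x**2])
    beta, *_ = np.linalg.lstsq(np.sqrt(w)[:, None] * X,
                               np.sqrt(w) * log_lik(x), rcond=None)
    # combine the fitted quadratic with the Gaussian prior: new proposal
    prec = 1.0 - 2.0 * beta[2]   # prior precision 1 plus -2 * quadratic coefficient
    var = 1.0 / prec
    mean = var * beta[1]         # prior mean 0, so only the linear term contributes

print(f"EIS proposal: N({mean:.3f}, {var:.3f})")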
Subsequently, I will show how one can exploit similar structures in more general Bayesian inverse problems by iteratively improving on the Laplace approximation to efficiently obtain an optimal Gaussian approximation to the posterior. Finally, I present simulation studies demonstrating that improving on the Laplace approximation is worth the additional computational effort.
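One way to make "optimal Gaussian approximation" precise, in the spirit of efficient importance sampling (the exact criterion used in the talk may differ), is to choose the Gaussian proposal that minimizes the variance of the log importance weights,

\[
(\mu^\ast, \Sigma^\ast) \;\in\; \arg\min_{\mu,\Sigma}\ \operatorname{Var}_{q_{\mu,\Sigma}}\!\left[\log\frac{p(y\mid X)\,p(X)}{q_{\mu,\Sigma}(X)}\right],
\qquad X \sim q_{\mu,\Sigma},
\]

with the Laplace approximation serving as the natural starting point of the iteration.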