MCMC-driven importance samplers

(2105.02579)
Published May 6, 2021 in stat.CO and cs.LG

Abstract

Monte Carlo sampling methods are the standard procedure for approximating complicated integrals of multidimensional posterior distributions in Bayesian inference. In this work, we focus on the class of Layered Adaptive Importance Sampling (LAIS) schemes, a family of adaptive importance samplers in which Markov chain Monte Carlo algorithms are employed to drive an underlying multiple importance sampling scheme. The modular nature of LAIS allows for different possible implementations, yielding a variety of performance and computational cost trade-offs. We propose several enhancements of the classical LAIS setting in order to increase the efficiency and reduce the computational cost of both the upper and lower layers. The different variants address computational challenges arising in real-world applications, for instance with highly concentrated posterior distributions. Furthermore, we introduce strategies for designing cheaper schemes, for instance by recycling samples generated in the upper layer and reusing them in the final estimators of the lower layer. Numerical experiments, considering several challenging scenarios, show the benefits of the proposed schemes compared with benchmark methods from the literature.
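
To make the two-layer structure described in the abstract concrete, below is a minimal Python sketch of a basic LAIS loop: an upper-layer Metropolis-Hastings chain targeting the posterior supplies location parameters, and a lower layer performs multiple importance sampling with Gaussian proposals centred at those locations, using deterministic-mixture weights. The toy target, function names (`upper_layer_mh`, `lower_layer_mis`), and tuning values are illustrative assumptions, not the paper's implementation, and the sketch omits the enhancements (e.g. sample recycling) that the paper proposes.

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_target(x):
    # Hypothetical unnormalized log-posterior: a bimodal 2D Gaussian mixture.
    return np.logaddexp(
        multivariate_normal.logpdf(x, mean=[-2.0, -2.0], cov=np.eye(2)),
        multivariate_normal.logpdf(x, mean=[2.0, 2.0], cov=np.eye(2)),
    )

def upper_layer_mh(log_target, x0, n_steps, prop_std=1.0, rng=None):
    """Upper layer: random-walk Metropolis-Hastings chain on the target.
    The chain states become the location parameters of the lower-layer proposals."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    lp = log_target(x)
    states = []
    for _ in range(n_steps):
        x_new = x + prop_std * rng.standard_normal(x.shape)
        lp_new = log_target(x_new)
        if np.log(rng.uniform()) < lp_new - lp:  # MH accept/reject
            x, lp = x_new, lp_new
        states.append(x.copy())
    return np.array(states)

def lower_layer_mis(log_target, locations, cov, samples_per_proposal, rng=None):
    """Lower layer: multiple importance sampling with Gaussian proposals centred
    at the chain states, weighted with the deterministic-mixture denominator."""
    rng = np.random.default_rng() if rng is None else rng
    samples, logw = [], []
    for mu in locations:
        z = rng.multivariate_normal(mu, cov, size=samples_per_proposal)
        # Deterministic mixture: average of all proposal densities at each sample.
        log_mix = np.stack(
            [multivariate_normal.logpdf(z, mean=m, cov=cov) for m in locations]
        )
        log_den = np.logaddexp.reduce(log_mix, axis=0) - np.log(len(locations))
        logw.append(log_target(z) - log_den)
        samples.append(z)
    samples = np.concatenate(samples)
    logw = np.concatenate(logw)
    w = np.exp(logw - logw.max())            # stabilized, self-normalized weights
    return samples, w / w.sum()

rng = np.random.default_rng(0)
mus = upper_layer_mh(log_target, x0=[0.0, 0.0], n_steps=200, rng=rng)
samples, weights = lower_layer_mis(
    log_target, mus[::10], cov=np.eye(2), samples_per_proposal=20, rng=rng
)
posterior_mean = weights @ samples           # importance-sampling estimate of E[x]
print(posterior_mean)
```

In this sketch the proposal covariance and the thinning of the upper-layer chain are fixed by hand; the variants discussed in the paper concern precisely such choices, trading off the cost of the upper and lower layers.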
