
MCMC-driven importance samplers (2105.02579v4)

Published 6 May 2021 in stat.CO and cs.LG

Abstract: Monte Carlo sampling methods are the standard procedure for approximating complicated integrals of multidimensional posterior distributions in Bayesian inference. In this work, we focus on the class of Layered Adaptive Importance Sampling (LAIS) schemes, a family of adaptive importance samplers in which Markov chain Monte Carlo algorithms are employed to drive an underlying multiple importance sampling scheme. The modular nature of LAIS allows for different possible implementations, yielding a variety of performance levels and computational costs. In this work, we propose several enhancements of the classical LAIS setting that increase efficiency and reduce the computational cost of both the upper and lower layers. The different variants address computational challenges arising in real-world applications, for instance with highly concentrated posterior distributions. Furthermore, we introduce strategies for designing cheaper schemes, for instance by recycling samples generated in the upper layer and reusing them in the final estimators of the lower layer. Numerical experiments covering several challenging scenarios show the benefits of the proposed schemes compared with benchmark methods from the literature.
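
To make the two-layer structure concrete, below is a minimal sketch of a generic LAIS-style sampler: an upper layer of random-walk Metropolis chains adapts the location parameters of Gaussian proposals, and a lower layer performs multiple importance sampling with deterministic-mixture weights. The toy Gaussian target, step sizes, proposal scales, and function names are illustrative assumptions, not the paper's specific implementation or any of its proposed variants.

```python
# Minimal LAIS-style sketch (illustrative, not the authors' exact algorithm).
# Upper layer: parallel Metropolis-Hastings chains produce proposal means.
# Lower layer: multiple importance sampling with deterministic-mixture weights.
import numpy as np

def log_target(x):
    # Hypothetical unnormalized log-posterior: standard bivariate Gaussian.
    return -0.5 * np.sum(x**2)

def upper_layer(n_chains=4, n_iters=50, step=0.5, dim=2, rng=None):
    """Run parallel random-walk Metropolis chains; their visited states
    become the location parameters (means) of the lower-layer proposals."""
    rng = np.random.default_rng(rng)
    means = []
    for _ in range(n_chains):
        x = rng.normal(size=dim)
        for _ in range(n_iters):
            y = x + step * rng.normal(size=dim)
            if np.log(rng.uniform()) < log_target(y) - log_target(x):
                x = y
            means.append(x.copy())
    return np.array(means)

def lower_layer(means, m_per_proposal=5, sigma=1.0, rng=None):
    """Draw from Gaussian proposals centered at the adapted means and weight
    the samples with the deterministic-mixture (balance heuristic) rule."""
    rng = np.random.default_rng(rng)
    n, dim = means.shape
    samples = means[:, None, :] + sigma * rng.normal(size=(n, m_per_proposal, dim))
    samples = samples.reshape(-1, dim)
    # Mixture proposal log-density at every sample (up to additive constants).
    d2 = ((samples[:, None, :] - means[None, :, :]) ** 2).sum(-1)
    log_mix = -0.5 * d2 / sigma**2 - dim * np.log(sigma)
    log_q = np.logaddexp.reduce(log_mix, axis=1) - np.log(n)
    log_p = np.array([log_target(x) for x in samples])
    w = np.exp(log_p - log_q)
    return samples, w / w.sum()   # self-normalized importance weights

means = upper_layer(rng=0)
samples, weights = lower_layer(means, rng=1)
print("Self-normalized IS estimate of E[x]:", weights @ samples)
```

Because the weights are self-normalized, unknown normalizing constants of the target cancel; the recycling and cost-reduction strategies discussed in the paper modify how the upper-layer states are reused and how the mixture denominator is evaluated.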

Citations (10)
