
An adaptive mixture-population Monte Carlo method for likelihood-free inference (2112.00420v1)

Published 1 Dec 2021 in math.NA, cs.NA, math.ST, and stat.TH

Abstract: This paper focuses on variational inference with intractable likelihood functions that can be unbiasedly estimated. A flexible variational approximation based on Gaussian mixtures is developed by adopting the mixture population Monte Carlo (MPMC) algorithm of Cappé et al. (2008). MPMC iteratively updates the parameters of the mixture distribution using importance sampling computations, instead of the complicated gradient estimation of the optimization objective required in standard variational Bayes. Since MPMC uses a fixed number of mixture components, which is difficult to choose in advance for real applications, we further propose an automatic component-updating procedure that derives an appropriate number of components. The resulting adaptive MPMC algorithm can find good approximations of multi-modal posterior distributions even when initialized with a standard Gaussian, as demonstrated in our numerical experiments.
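
The central idea in the abstract — refitting a Gaussian mixture proposal from self-normalized importance weights rather than estimating gradients of a variational objective — can be sketched in one dimension as follows. This is a minimal illustrative sketch of an MPMC-style update, not the paper's exact algorithm; the function name, the weighted EM-style refit, and the toy target are assumptions for demonstration.

```python
import numpy as np

def mpmc_update(rng, log_target, weights, means, stds, n_samples=5000):
    """One importance-sampling update of a 1-D Gaussian mixture proposal.

    Sketch of the mixture PMC idea: draw from the current mixture,
    weight each draw by target/proposal, then refit every component
    with a weighted EM-style step. log_target may be unnormalized.
    """
    K = len(weights)
    # Draw component labels, then samples from the chosen components.
    comp = rng.choice(K, size=n_samples, p=weights)
    x = rng.normal(means[comp], stds[comp])
    # Per-component (weighted) densities and the full mixture density q(x).
    dens = np.zeros((n_samples, K))
    for k in range(K):
        dens[:, k] = weights[k] * np.exp(
            -0.5 * ((x - means[k]) / stds[k]) ** 2
        ) / (stds[k] * np.sqrt(2.0 * np.pi))
    q = dens.sum(axis=1)
    # Self-normalized importance weights w_i ∝ target(x_i) / q(x_i).
    w = np.exp(log_target(x) - np.log(q))
    w /= w.sum()
    # Responsibilities: probability each sample came from component k.
    resp = dens / q[:, None]
    # Weighted EM-style refit of mixture weights, means, and stds.
    wk = w[:, None] * resp
    new_weights = wk.sum(axis=0)
    new_means = (wk * x[:, None]).sum(axis=0) / new_weights
    new_stds = np.sqrt(
        (wk * (x[:, None] - new_means) ** 2).sum(axis=0) / new_weights
    )
    return new_weights / new_weights.sum(), new_means, new_stds
```

For example, iterating this update with a single-component start against an unnormalized Gaussian target centered at 2 moves the proposal mean toward 2 and its scale toward the target's; no gradient of the KL objective is ever computed, which is the contrast with standard variational Bayes that the abstract draws. The paper's adaptive extension would additionally add or remove mixture components between such updates.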

Citations (2)