f-Divergence Variational Inference (2009.13093v4)

Published 28 Sep 2020 in cs.LG, cs.IT, math.IT, and stat.ML

Abstract: This paper introduces $f$-divergence variational inference ($f$-VI), which generalizes variational inference to all $f$-divergences. Built on minimizing a surrogate $f$-divergence that is statistically consistent with the original $f$-divergence, the $f$-VI framework not only unifies a number of existing VI methods, e.g. Kullback-Leibler VI, R\'{e}nyi's $\alpha$-VI, and $\chi$-VI, but also offers a standardized toolkit for VI subject to arbitrary divergences from the $f$-divergence family. A general $f$-variational bound is derived, providing a sandwich estimate of the marginal likelihood (or evidence). The development of $f$-VI proceeds with a stochastic optimization scheme that utilizes the reparameterization trick, importance weighting, and Monte Carlo approximation; a mean-field approximation scheme that generalizes the well-known coordinate ascent variational inference (CAVI) is also proposed for $f$-VI. Empirical examples, including variational autoencoders and Bayesian neural networks, demonstrate the effectiveness and wide applicability of $f$-VI.
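
To make the recipe in the abstract concrete, below is a minimal sketch (not the paper's reference implementation) of a reparameterized Monte Carlo estimate of a surrogate objective $\mathbb{E}_{q}[f(p(x,z)/q(z))]$ on a toy Gaussian model. For a convex generator $f$, Jensen's inequality gives $\mathbb{E}_q[f(w)] \ge f(\mathbb{E}_q[w]) = f(p(x))$ with $w = p(x,z)/q(z)$, so the sampled objective bounds $f$ of the evidence, and the choice $f(t) = -\log t$ recovers the negative ELBO of standard KL-VI. The toy model, the variational family, and all names (`log_joint`, `f_kl`, `f_chi2`, `surrogate_f_bound`) are illustrative assumptions, not the authors' API.

```python
# Hedged sketch: one Monte Carlo evaluation of an f-VI-style surrogate objective.
# Assumptions (not from the paper's code): toy model p(z) = N(0, 1),
# p(x | z) = N(z, 1); Gaussian variational family q(z) = N(mu, sigma^2).
import math
import torch

LOG_2PI = math.log(2.0 * math.pi)
torch.manual_seed(0)

def log_joint(x, z):
    """log p(x, z) = log N(z; 0, 1) + log N(x; z, 1)."""
    log_prior = -0.5 * (z ** 2 + LOG_2PI)
    log_lik = -0.5 * ((x - z) ** 2 + LOG_2PI)
    return log_prior + log_lik

# Two convex generators from the f-divergence family, written as functions
# of the log importance weight for numerical convenience.
def f_kl(log_w):
    return -log_w                         # f(t) = -log t: negative ELBO

def f_chi2(log_w):
    return torch.exp(2.0 * log_w) - 1.0   # f(t) = t^2 - 1: chi^2-type objective

def surrogate_f_bound(f, x, mu, log_sigma, n_samples=64):
    """Monte Carlo estimate of E_q[f(p(x, z) / q(z))] via reparameterization."""
    eps = torch.randn(n_samples)          # reparameterization: z = mu + sigma * eps
    z = mu + log_sigma.exp() * eps
    log_q = -0.5 * (eps ** 2 + LOG_2PI) - log_sigma  # log q(z) at the samples
    log_w = log_joint(x, z) - log_q       # log importance weights log p(x,z)/q(z)
    return f(log_w).mean()

# One manual gradient step on the KL member, i.e., ordinary ELBO maximization.
x = torch.tensor(1.5)
mu = torch.zeros((), requires_grad=True)
log_sigma = torch.zeros((), requires_grad=True)
loss = surrogate_f_bound(f_kl, x, mu, log_sigma)
loss.backward()
with torch.no_grad():
    mu -= 0.1 * mu.grad
    log_sigma -= 0.1 * log_sigma.grad
print(f"surrogate (KL): {loss.item():.3f}, updated mu: {mu.item():.3f}")
```

Swapping `f_kl` for `f_chi2`, or any other convex generator with $f(1)=0$, changes the divergence being targeted without touching the estimator, which illustrates the plug-and-play property the abstract claims for the $f$-VI framework.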

Citations (30)
