
Variational Inference via Transformations on Distributions (1707.02510v1)

Published 9 Jul 2017 in stat.ML and cs.LG

Abstract: Variational inference methods often focus on the problem of efficient model optimization, with little emphasis on the choice of the approximating posterior. In this paper, we review and implement the various methods that enable us to develop a rich family of approximating posteriors. We show that one particular method, employing transformations on distributions, yields very rich and complex posterior approximations. We analyze its performance on the MNIST dataset by implementing it with a Variational Autoencoder and demonstrate its effectiveness in learning better posterior distributions.
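The "transformations on distributions" referred to in the abstract are, in this line of work, typically normalizing flows: a simple initial posterior q0(z|x) is pushed through a chain of invertible maps, and the density of the transformed sample is tracked with the change-of-variables formula. The sketch below is a hypothetical illustration using planar flows in NumPy; the abstract does not specify which transformation the authors use, so the flow form, parameter names, and constants here are assumptions rather than the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def planar_flow(z, u, w, b):
    """Apply a single planar flow f(z) = z + u * tanh(w^T z + b) and
    return the transformed samples with the log|det Jacobian| term."""
    # z: (batch, dim); u, w: (dim,); b: scalar.
    activation = np.tanh(z @ w + b)                 # (batch,)
    f_z = z + np.outer(activation, u)               # (batch, dim)
    psi = np.outer(1.0 - activation ** 2, w)        # derivative of tanh term, (batch, dim)
    log_det = np.log(np.abs(1.0 + psi @ u) + 1e-9)  # (batch,)
    return f_z, log_det

# Start from the usual diagonal-Gaussian VAE posterior q0(z|x) = N(mu, sigma^2 I).
dim, batch = 2, 5
mu, log_sigma = np.zeros(dim), np.zeros(dim)
z0 = mu + np.exp(log_sigma) * rng.standard_normal((batch, dim))
log_q0 = -0.5 * np.sum(
    (z0 - mu) ** 2 / np.exp(2 * log_sigma) + 2 * log_sigma + np.log(2 * np.pi), axis=1
)

# Chain a few flows; by the change-of-variables formula,
# log qK(zK) = log q0(z0) - sum_k log|det J_k|.
zk, log_q = z0, log_q0
for _ in range(3):
    # Random parameters for illustration only; in practice these are learned,
    # and u is reparameterized so that w^T u >= -1 to keep the flow invertible.
    u, w, b = rng.standard_normal(dim), rng.standard_normal(dim), rng.standard_normal()
    zk, log_det = planar_flow(zk, u, w, b)
    log_q = log_q - log_det

print(zk.shape, log_q.shape)  # (5, 2) (5,)
```

In a VAE this transformed log-density replaces the Gaussian log q(z|x) term in the evidence lower bound, which is what allows the approximating posterior to become richer than a diagonal Gaussian.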

Authors (3)
  1. Siddhartha Saxena (1 paper)
  2. Shibhansh Dohare (6 papers)
  3. Jaivardhan Kapoor (6 papers)
Citations (2)
