Latent Variables on Spheres for Autoencoders in High Dimensions (1912.10233v2)

Published 21 Dec 2019 in cs.LG, cs.CV, and stat.ML

Abstract: The Variational Auto-Encoder (VAE) has been widely applied as a fundamental generative model in machine learning. For complex samples such as imagery objects or scenes, however, VAE suffers from a dimensional dilemma between reconstruction precision, which needs high-dimensional latent codes, and probabilistic inference, which favors a low-dimensional latent space. By virtue of high-dimensional geometry, we propose a very simple algorithm, called the Spherical Auto-Encoder (SAE), that is completely different from existing VAEs and addresses this issue. SAE is in essence the vanilla autoencoder with spherical normalization on the latent space. We analyze the unique characteristics of random variables on spheres in high dimensions and argue that such variables are agnostic to various prior distributions and data modes when the dimension is sufficiently high. Therefore, SAE can harness a high-dimensional latent space to improve the inference precision of latent codes while maintaining the property of stochastic sampling from priors. Experiments on sampling and inference validate our theoretical analysis and the superiority of SAE.
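
The abstract describes SAE as a vanilla autoencoder whose latent codes are projected onto the unit sphere. Below is a minimal sketch of that idea, assuming a PyTorch setup; the MLP encoder/decoder, the latent dimension of 128, and the MSE reconstruction loss are illustrative placeholders, not details taken from the paper. Only the spherical normalization of latent codes and the sphere-projected sampling reflect the recipe stated in the abstract.

```python
# Sketch of a Spherical Auto-Encoder (SAE): an ordinary autoencoder whose
# latent codes are L2-normalized onto the unit hypersphere.
# Architecture, latent_dim, and loss are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SphericalAutoEncoder(nn.Module):
    def __init__(self, input_dim=784, latent_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 512), nn.ReLU(),
            nn.Linear(512, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 512), nn.ReLU(),
            nn.Linear(512, input_dim),
        )

    def encode(self, x):
        z = self.encoder(x)
        # Spherical normalization: project latent codes onto the unit sphere.
        return F.normalize(z, dim=-1)

    def forward(self, x):
        z = self.encode(x)
        return self.decoder(z), z

def sample(model, n, latent_dim=128):
    # Draw Gaussian noise and normalize it; the normalized samples lie on the
    # same sphere as the encoded latent codes.
    z = F.normalize(torch.randn(n, latent_dim), dim=-1)
    return model.decoder(z)

if __name__ == "__main__":
    model = SphericalAutoEncoder()
    x = torch.rand(16, 784)             # dummy batch of flattened inputs
    recon, z = model(x)
    loss = F.mse_loss(recon, x)         # plain autoencoder reconstruction loss
    print(loss.item(), z.norm(dim=-1))  # every latent norm equals 1
```

A normalized isotropic Gaussian is uniform on the sphere, which is one way to read the abstract's claim that spherical latent variables in high dimensions are agnostic to the choice of prior used for sampling.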

Citations (10)
