
Abstract

Deep generative models are universal tools for learning data distributions on high-dimensional data spaces via a mapping to lower-dimensional latent spaces. We provide a study of latent space geometries, building on and extending previous results on Riemannian metrics. We show how a class of heuristic measures gives more flexibility in finding meaningful, problem-specific distances, and how it can be applied to diverse generator types, such as the autoregressive generators commonly used in language and other sequence modeling. We further demonstrate how a diffusion-inspired transformation previously studied in cartography can be used to smooth out latent spaces, stretching them according to a chosen measure. In addition to providing more meaningful distances directly in latent space, this also provides a unique tool for novel kinds of data visualization. We believe the proposed methods can be a valuable tool for studying the structure of latent spaces and the learned data distributions of generative models.
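The "previous results on Riemannian metrics" the abstract refers to concern the metric a generator induces on its latent space by pulling back distances from data space. As a rough illustration only (not the paper's code), the sketch below computes this induced metric M(z) = J(z)^T J(z) from the Jacobian J of a toy decoder and uses it to measure the length of a latent curve; the decoder, its weights, and the curve_length helper are all hypothetical stand-ins for a trained generator.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy decoder standing in for a trained generator g: R^2 -> R^3.
W = jnp.array([[1.0, 0.0],
               [0.0, 2.0],
               [1.0, 1.0]])

def decoder(z):
    # Smooth nonlinear map from latent space to data space.
    return jnp.tanh(W @ z)

def pullback_metric(z):
    # Riemannian metric the generator induces on latent space:
    # M(z) = J(z)^T J(z), with J the decoder Jacobian at z.
    J = jax.jacfwd(decoder)(z)
    return J.T @ J

def curve_length(zs):
    # Approximate length of a discretized latent curve by summing
    # sqrt(dz^T M dz) over segments, with M evaluated at midpoints.
    total = 0.0
    for a, b in zip(zs[:-1], zs[1:]):
        dz = b - a
        M = pullback_metric((a + b) / 2.0)
        total += jnp.sqrt(dz @ M @ dz)
    return total

# Length of a straight latent path under the induced (non-Euclidean) metric.
z_path = jnp.linspace(jnp.array([0.0, 0.0]), jnp.array([1.0, 1.0]), 16)
print(curve_length(z_path))
```

Under such a metric, straight lines in latent space are generally not shortest paths: geodesics bend toward regions where the generator stretches space least, which is what makes latent distances reflect the learned data distribution.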

