Compositional uncertainty in deep Gaussian processes (1909.07698v3)

Published 17 Sep 2019 in stat.ML and cs.LG

Abstract: Gaussian processes (GPs) are nonparametric priors over functions. Fitting a GP implies computing a posterior distribution of functions consistent with the observed data. Similarly, deep Gaussian processes (DGPs) should allow us to compute a posterior distribution of compositions of multiple functions giving rise to the observations. However, exact Bayesian inference is intractable for DGPs, motivating the use of various approximations. We show that the application of simplifying mean-field assumptions across the hierarchy leads to the layers of a DGP collapsing to near-deterministic transformations. We argue that such an inference scheme is suboptimal, not taking advantage of the potential of the model to discover the compositional structure in the data. To address this issue, we examine alternative variational inference schemes allowing for dependencies across different layers and discuss their advantages and limitations.
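The compositional uncertainty discussed in the abstract can be illustrated with a minimal prior-sampling sketch (not the paper's inference method): draw hidden-layer functions from a GP prior, then pass each draw through an independently sampled second-layer GP. Kernel choices, lengthscales, and the jitter constant below are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel on 1-D inputs.
    d = x[:, None] - y[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp(z, n_samples, rng, lengthscale=1.0, jitter=1e-6):
    # Draw n_samples function values at inputs z from a zero-mean GP prior.
    K = rbf_kernel(z, z, lengthscale) + jitter * np.eye(len(z))
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal((len(z), n_samples))

rng = np.random.default_rng(42)
x = np.linspace(-2.0, 2.0, 50)

# Layer 1: several independent draws f1 ~ GP prior, evaluated at x.
f1 = sample_gp(x, n_samples=5, rng=rng)           # shape (50, 5)

# Layer 2: for each f1 draw, sample an independent f2 and evaluate f2(f1(x)).
composite = np.stack(
    [sample_gp(f1[:, s], n_samples=1, rng=rng)[:, 0] for s in range(f1.shape[1])],
    axis=1,
)                                                  # shape (50, 5)

# The spread across composite samples at each input reflects compositional
# uncertainty: many distinct (f1, f2) pairs are consistent a priori. A
# mean-field posterior that collapses f1 to a near-deterministic map would
# discard exactly this ambiguity over the decomposition.
print(composite.shape, float(composite.std(axis=1).mean()))
```

Under a mean-field approximation, each layer's posterior would concentrate, so repeated posterior draws of the composition would vary far less than these prior draws suggest.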

Authors (6)
  1. Ivan Ustyuzhaninov
  2. Ieva Kazlauskaite
  3. Markus Kaiser
  4. Erik Bodin
  5. Neill D. F. Campbell
  6. Carl Henrik Ek
Citations (22)