Stochastic Collapsed Variational Inference for Structured Gaussian Process Regression Network (2106.00719v2)

Published 1 Jun 2021 in cs.LG and stat.ML

Abstract: This paper presents an efficient variational inference framework for deriving a family of structured Gaussian process regression network (SGPRN) models. The key idea is to incorporate auxiliary inducing variables in the latent functions and to jointly treat both the distributions of the inducing variables and the hyper-parameters as variational parameters. We then propose structured variational distributions and marginalize the latent variables, which yields a tractable, decomposable variational lower bound amenable to stochastic optimization. Our inference approach can model data in which the outputs do not share a common input set, with a computational complexity independent of the size of the inputs and outputs, and therefore easily handles datasets with missing values. We illustrate the performance of our method on synthetic and real datasets and show that our model generally provides better imputation results on missing data than the state-of-the-art. We also provide a visualization approach for the time-varying correlations across outputs in electrocorticography data, and these estimates provide insight into the neural population dynamics.
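
The abstract highlights three computational ingredients: sparse GP approximations via inducing variables, variational distributions whose lower bound decomposes over data points, and stochastic (minibatch) optimization. The sketch below is a rough, hypothetical illustration of these ideas for a generic GP regression network, not the authors' implementation: it assumes PyTorch and GPyTorch, a Monte Carlo estimate of the expected log likelihood, RBF kernels, and made-up names such as SVGPNode and GPRN.

```python
# Hypothetical sketch (not the paper's code): a GP regression network
# y_d(x) ~ sum_q W_dq(x) f_q(x) + noise, where every latent function f_q and
# every mixing weight W_dq is a sparse variational GP with its own inducing
# variables, trained by maximizing a minibatch Monte Carlo ELBO.
import torch
import gpytorch


class SVGPNode(gpytorch.models.ApproximateGP):
    """One sparse variational GP (used for each latent function and each weight)."""

    def __init__(self, inducing_points):
        var_dist = gpytorch.variational.CholeskyVariationalDistribution(inducing_points.size(0))
        var_strat = gpytorch.variational.VariationalStrategy(
            self, inducing_points, var_dist, learn_inducing_locations=True
        )
        super().__init__(var_strat)
        self.mean_module = gpytorch.means.ZeroMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(self.mean_module(x), self.covar_module(x))


class GPRN(torch.nn.Module):
    """D outputs, Q latent functions, and D*Q mixing-weight GPs (all sparse)."""

    def __init__(self, D, Q, inducing_points):
        super().__init__()
        self.D, self.Q = D, Q
        self.latents = torch.nn.ModuleList([SVGPNode(inducing_points.clone()) for _ in range(Q)])
        self.weights = torch.nn.ModuleList([SVGPNode(inducing_points.clone()) for _ in range(D * Q)])
        self.raw_noise = torch.nn.Parameter(torch.tensor(-2.0))  # unconstrained observation noise

    def elbo(self, x, y, num_data, num_samples=8):
        # Monte Carlo estimate of E_q[log p(y | f, W)] on a minibatch,
        # rescaled to the full dataset, minus the KL terms for all inducing variables.
        f = torch.stack([gp(x).rsample(torch.Size([num_samples])) for gp in self.latents], dim=-1)  # (S, B, Q)
        w = torch.stack([gp(x).rsample(torch.Size([num_samples])) for gp in self.weights], dim=-1)  # (S, B, D*Q)
        w = w.reshape(num_samples, x.size(0), self.D, self.Q)
        mean = torch.einsum('sbdq,sbq->sbd', w, f)                                                  # (S, B, D)
        noise = torch.nn.functional.softplus(self.raw_noise)
        log_lik = torch.distributions.Normal(mean, noise.sqrt()).log_prob(y).sum(dim=-1).mean()
        kl = sum(gp.variational_strategy.kl_divergence() for gp in list(self.latents) + list(self.weights))
        return num_data * log_lik - kl


if __name__ == "__main__":
    # Toy usage: two correlated outputs on a shared 1-D input, minibatch training.
    torch.manual_seed(0)
    N, D, Q, M = 200, 2, 1, 16
    x = torch.linspace(0, 1, N).unsqueeze(-1)
    y = torch.stack([torch.sin(6 * x[:, 0]), torch.cos(6 * x[:, 0])], dim=-1) + 0.1 * torch.randn(N, D)
    model = GPRN(D, Q, inducing_points=torch.rand(M, 1))
    model.train()
    opt = torch.optim.Adam(model.parameters(), lr=0.01)
    for step in range(200):
        idx = torch.randint(0, N, (32,))              # random minibatch -> stochastic optimization
        loss = -model.elbo(x[idx], y[idx], num_data=N)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Because the minibatch term is an unbiased estimate of the full-data expected log likelihood, the negative ELBO can be optimized with standard stochastic gradient methods, and the per-iteration cost depends on the minibatch and inducing-point sizes rather than on the full input and output dimensions. Missing outputs could, in principle, be handled by masking the corresponding log-likelihood entries, though the paper's actual treatment may differ.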
