Multivariate Gaussian Variational Inference by Natural Gradient Descent (2001.10025v2)

Published 27 Jan 2020 in stat.ML, cs.LG, cs.RO, math.ST, and stat.TH

Abstract: This short note reviews so-called Natural Gradient Descent (NGD) for multivariate Gaussians. The Fisher Information Matrix (FIM) is derived for several different parameterizations of Gaussians. Careful attention is paid to the symmetric nature of the covariance matrix when calculating derivatives. We show that there are some advantages to choosing a parameterization comprising the mean and inverse covariance matrix and provide a simple NGD update that accounts for the symmetric (and sparse) nature of the inverse covariance matrix.
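For context, a minimal NumPy sketch of a generic natural-gradient Gaussian variational inference iteration in the mean/inverse-covariance parameterization the abstract highlights. It uses the widely used form in which the inverse covariance is blended toward the expected Hessian of the negative log-target and the mean takes a Newton-like step; the toy quadratic target, step size rho, and sample count are illustrative assumptions, and the sketch is not taken from the note itself (which focuses on the careful treatment of symmetry and sparsity in the update).

```python
# Illustrative sketch (not the paper's exact update): natural-gradient VI for a
# Gaussian q(x) = N(mu, inv(S)), parameterized by the mean mu and the inverse
# covariance S. The inverse covariance is moved toward the Monte Carlo estimate
# of the expected Hessian of the negative log-target; the mean takes a
# Newton-like step using the updated S. Target, step size, and sample count
# below are assumptions chosen so the example is self-contained and runnable.
import numpy as np

def neg_log_target_grad_hess(x, A, b):
    """Gradient and Hessian of the toy negative log-target 0.5*x^T A x - b^T x."""
    return A @ x - b, A

def ngd_gaussian_vi(A, b, iters=50, rho=0.5, n_samples=20, seed=0):
    rng = np.random.default_rng(seed)
    d = b.size
    mu = np.zeros(d)   # mean
    S = np.eye(d)      # inverse covariance (kept symmetric explicitly)
    for _ in range(iters):
        # Monte Carlo estimates of E_q[gradient] and E_q[Hessian] under q = N(mu, S^{-1})
        Sigma = np.linalg.inv(S)
        xs = rng.multivariate_normal(mu, Sigma, size=n_samples)
        g = np.zeros(d)
        H = np.zeros((d, d))
        for x in xs:
            gi, Hi = neg_log_target_grad_hess(x, A, b)
            g += gi / n_samples
            H += Hi / n_samples
        # Natural-gradient step: blend S toward the expected Hessian, then move the mean
        S = (1.0 - rho) * S + rho * H
        S = 0.5 * (S + S.T)                    # enforce symmetry numerically
        mu = mu - rho * np.linalg.solve(S, g)  # Newton-like mean update
    return mu, S

if __name__ == "__main__":
    A = np.array([[2.0, 0.3], [0.3, 1.0]])
    b = np.array([1.0, -1.0])
    mu, S = ngd_gaussian_vi(A, b)
    print("mean:", mu)                 # approaches A^{-1} b for this quadratic target
    print("inverse covariance:\n", S)  # approaches A
```

For the quadratic target used here the fixed point is exact (mean A^{-1} b, inverse covariance A), which makes it a convenient sanity check; for general targets the gradient and Hessian would come from the model rather than a closed form.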

Authors (1)
  1. Timothy D. Barfoot (89 papers)
Citations (11)
