Kernel Stein Discrepancy on Lie Groups: Theory and Applications

(arXiv:2305.12551)
Published May 21, 2023 in math.ST, math.PR, and stat.TH

Abstract

Distributional approximation is a fundamental problem in machine learning, with numerous applications across science and engineering. The key challenge in most approximation methods is the intractable normalization constant of the parametrized distributions used to model the data. In this paper, we present a novel Stein operator on Lie groups leading to a kernel Stein discrepancy (KSD), which is a normalization-free loss function. We present several theoretical results characterizing the properties of this new KSD on Lie groups and of its minimizer, namely the minimum KSD estimator (MKSDE). Proofs of several properties of the MKSDE are presented, including strong consistency, a central limit theorem (CLT), and a closed form of the MKSDE for the von Mises-Fisher distribution on SO(N). Finally, we present experimental evidence demonstrating the advantages of minimizing the KSD over maximum likelihood estimation.
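
The paper's Stein operator is developed for general Lie groups such as SO(N), but the normalization-free idea behind the KSD is easy to illustrate on the simplest compact Lie group, the circle SO(2), where the von Mises-Fisher distribution reduces to the von Mises distribution. The sketch below is our own illustration under stated assumptions, not the paper's construction: it uses the standard score-based Stein kernel with an assumed periodic kernel k(x, y) = exp(cos(x - y)), and recovers the concentration parameter by a grid-search MKSDE. The score -kappa*sin(x - mu) involves no Bessel-function normalizing constant.

```python
import numpy as np

def ksd_vonmises_circle(theta, mu, kappa):
    """U-statistic estimate of KSD^2 between angle samples `theta` on the
    circle group SO(2) and a von Mises(mu, kappa) model.

    Only the score s(x) = d/dx log p(x) = -kappa * sin(x - mu) is used,
    so the intractable normalizing constant never appears.
    Kernel (an assumed choice): periodic k(x, y) = exp(cos(x - y)).
    """
    d = theta[:, None] - theta[None, :]          # pairwise x - y, shape (n, n)

    k = np.exp(np.cos(d))                        # k(x, y)
    dk_dx = -np.sin(d) * k                       # dk/dx
    dk_dy = np.sin(d) * k                        # dk/dy
    d2k = (np.cos(d) - np.sin(d) ** 2) * k       # d^2 k / dx dy

    sx = -kappa * np.sin(theta[:, None] - mu)    # score at x, shape (n, 1)
    sy = -kappa * np.sin(theta[None, :] - mu)    # score at y, shape (1, n)

    # Score-based Stein kernel u_p(x, y); on the circle the periodicity of k
    # makes the boundary terms in the integration by parts vanish.
    u = sx * sy * k + sx * dk_dy + sy * dk_dx + d2k

    n = len(theta)
    np.fill_diagonal(u, 0.0)                     # U-statistic: drop diagonal terms
    return u.sum() / (n * (n - 1))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.vonmises(mu=0.0, kappa=2.0, size=500)
    # MKSDE for kappa by grid search (mu fixed at its true value for brevity)
    grid = np.linspace(0.1, 5.0, 50)
    kappa_hat = min(grid, key=lambda kp: ksd_vonmises_circle(samples, 0.0, kp))
    print(f"MKSDE estimate of kappa: {kappa_hat:.2f}")
```

Minimizing this U-statistic over the model parameters yields a minimum KSD estimate without ever evaluating the normalizer, illustrating the normalization-free property the abstract highlights; the paper's SO(N) construction generalizes this beyond the one-dimensional circle case.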
