On Sibson's $\alpha$-Mutual Information (2202.03951v1)
Published 8 Feb 2022 in cs.IT, math.IT, math.ST, and stat.TH
Abstract: We explore a family of information measures that stems from Rényi's $\alpha$-Divergences with $\alpha<0$. In particular, we extend the definition of Sibson's $\alpha$-Mutual Information to negative values of $\alpha$ and show several properties of these objects. Moreover, we highlight how this family of information measures is related to functional inequalities that can be employed in a variety of fields, including lower bounds on the Risk in Bayesian Estimation Procedures.
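For context, a standard way to write these objects (for $\alpha>0$, $\alpha\neq 1$, over discrete alphabets; the notation below is a sketch and not necessarily the paper's) uses Rényi's $\alpha$-Divergence and the Sibson form of $\alpha$-Mutual Information:

$$D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},$$

$$I_\alpha(X;Y) = \min_{Q_Y} D_\alpha\big(P_{XY}\,\big\|\,P_X \times Q_Y\big) = \frac{\alpha}{\alpha-1}\log \sum_{y}\Big(\sum_{x} P_X(x)\,P_{Y|X}(y\mid x)^{\alpha}\Big)^{1/\alpha}.$$

The paper's contribution is to study what happens to these definitions and their properties when $\alpha$ is allowed to take negative values.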