Finite-sample concentration of the empirical relative entropy around its mean (2203.00800v1)

Published 2 Mar 2022 in math.ST, cs.IT, math.IT, math.PR, and stat.TH

Abstract: In this note, we show that the relative entropy of an empirical distribution of $n$ samples drawn from a set of size $k$ with respect to the true underlying distribution is exponentially concentrated around its expectation, with central moment generating function bounded by that of a gamma distribution with shape $2k$ and rate $n/2$. This improves on recent work of Bhatt and Pensia (arXiv 2021) on the same problem, who showed a similar bound with an additional polylogarithmic factor of $k$ in the shape, and also confirms a recent conjecture of Mardia et al. (Information and Inference 2020). The proof proceeds by reducing the case $k>3$ of the multinomial distribution to the simpler case $k=2$ of the binomial, for which the desired bound follows from standard results on the concentration of the binomial.
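The following is a minimal Monte Carlo sketch (not from the paper) of the quantity the abstract describes: it draws multinomial samples, computes the empirical relative entropy $D(\hat p_n \| p)$ for each draw, and compares its centred moment generating function numerically against that of a Gamma(shape $2k$, rate $n/2$) variable, the bounding distribution stated in the abstract. The uniform true distribution, the choices $k=5$ and $n=200$, and the helper `empirical_kl` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_kl(counts, p):
    """Relative entropy D(p_hat || p) of the empirical distribution given by `counts`."""
    n = counts.sum()
    p_hat = counts / n
    mask = p_hat > 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / p[mask])))

# Illustrative parameters (assumed, not from the paper): alphabet size k, sample size n.
k, n, trials = 5, 200, 20_000
p = np.full(k, 1.0 / k)                      # true distribution (uniform, for illustration)

counts = rng.multinomial(n, p, size=trials)  # trials x k matrix of multinomial counts
kl = np.array([empirical_kl(c, p) for c in counts])
centred = kl - kl.mean()

# Centred MGF of Y ~ Gamma(shape=2k, rate=n/2):
#   E[exp(t (Y - E Y))] = exp(-t * 4k/n) * (1 - 2t/n)^(-2k),   valid for t < n/2.
shape, rate = 2 * k, n / 2
for t in [rate / 10, rate / 4, rate / 2]:
    emp_mgf = float(np.exp(t * centred).mean())
    gamma_mgf = np.exp(-t * shape / rate) * (1.0 - t / rate) ** (-shape)
    print(f"t={t:6.1f}  empirical centred MGF={emp_mgf:8.3f}  gamma centred MGF={gamma_mgf:8.3f}")
```

With these settings the empirical centred MGF should sit well below the gamma curve, which illustrates (but of course does not prove) the concentration claimed in the note.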

Citations (1)


Authors (1)
