Finite-sample concentration of the empirical relative entropy around its mean (2203.00800v1)

Published 2 Mar 2022 in math.ST, cs.IT, math.IT, math.PR, and stat.TH

Abstract: In this note, we show that the relative entropy of an empirical distribution of $n$ samples drawn from a set of size $k$ with respect to the true underlying distribution is exponentially concentrated around its expectation, with central moment generating function bounded by that of a gamma distribution with shape $2k$ and rate $n/2$. This improves on recent work of Bhatt and Pensia (arXiv 2021) on the same problem, who showed a similar bound with an additional polylogarithmic factor of $k$ in the shape, and it confirms a recent conjecture of Mardia et al. (Information and Inference 2020). The proof proceeds by reducing the case $k > 2$ of the multinomial distribution to the simpler case $k = 2$ of the binomial, for which the desired bound follows from standard results on the concentration of the binomial.
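
On one natural reading of the abstract, with $\hat{P}_n$ the empirical distribution of $n$ i.i.d. samples from a distribution $P$ on a set of size $k$ and $D_n = D(\hat{P}_n \,\|\, P)$, the claim is that the moment generating function of $D_n - \mathbb{E}[D_n]$ is bounded by that of $G - \mathbb{E}[G]$, where $G \sim \mathrm{Gamma}(\text{shape } 2k,\ \text{rate } n/2)$. As a quick, informal Monte Carlo sketch (not from the paper; the alphabet size, the uniform choice of $P$, the sample sizes, and the deviation threshold are arbitrary illustrative assumptions), one can observe the concentration numerically: both the mean and the spread of $D_n$ shrink on the order of $k/n$, and large upward deviations become rare.

```python
# Minimal Monte Carlo sketch (not from the paper): observe how D(P_hat || P)
# concentrates around its mean as n grows. The alphabet size k, the uniform
# choice of P, the sample sizes, and the deviation threshold 10*k/n are all
# illustrative assumptions, not values taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def empirical_kl(counts, p):
    """Relative entropy D(P_hat || P) for one multinomial draw, with 0*log(0) = 0."""
    p_hat = counts / counts.sum()
    mask = p_hat > 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / p[mask])))

k = 5
p = np.full(k, 1.0 / k)  # true distribution P (uniform here, purely as an example)
for n in (100, 1000, 10000):
    draws = rng.multinomial(n, p, size=20000)             # 20000 empirical distributions
    kls = np.array([empirical_kl(c, p) for c in draws])   # D(P_hat || P) for each draw
    tail = float((kls - kls.mean() > 10 * k / n).mean())  # rate of large upward deviations
    print(f"n={n:6d}  mean={kls.mean():.5f}  std={kls.std():.5f}  "
          f"P(D - mean > 10k/n) = {tail:.4f}")
```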

Citations (1)
