
Monotonic Convergence in an Information-Theoretic Law of Small Numbers (arXiv:0810.5203v4)

Published 29 Oct 2008 in cs.IT, math.IT, and math.PR

Abstract: An "entropy increasing to the maximum" result analogous to the entropic central limit theorem (Barron 1986; Artstein et al. 2004) is obtained in the discrete setting. This involves the thinning operation and a Poisson limit. Monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions. Overall we extend the parallel between the information-theoretic central limit theorem and law of small numbers explored by Kontoyiannis et al. (2005) and Harremoës et al. (2007, 2008). Ingredients in the proofs include convexity, majorization, and stochastic orders.
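
For orientation, the thinning operation in the abstract is standard Rényi thinning, and the results can be paraphrased in the following notation (assumed here for illustration; the paper's own notation may differ):

T_\alpha X = \sum_{i=1}^{X} B_i, \qquad B_1, B_2, \dots \ \text{i.i.d.} \ \mathrm{Bernoulli}(\alpha),

i.e., each of the X "points" is retained independently with probability \alpha. For i.i.d. nonnegative integer-valued X_1, \dots, X_n with mean \lambda, the law of small numbers gives T_{1/n}(X_1 + \cdots + X_n) \to \mathrm{Po}(\lambda), and the monotonicity results read:

D\bigl( T_{1/n}(X_1 + \cdots + X_n) \,\big\|\, \mathrm{Po}(\lambda) \bigr) \ \text{is nonincreasing in } n

for general discrete distributions, while for ultra-log-concave X the Shannon entropy H\bigl( T_{1/n}(X_1 + \cdots + X_n) \bigr) is nondecreasing in n. Since the Poisson law maximizes entropy among ultra-log-concave distributions of fixed mean, the latter is the "entropy increasing to the maximum" statement.

As a quick sanity check of the relative-entropy statement in the simplest case: for X ~ Bernoulli(p), the n-fold convolution is Binomial(n, p), and 1/n-thinning of a Binomial(n, p) variable yields Binomial(n, p/n), so the claim specializes to D(Binomial(n, p/n) || Poisson(p)) decreasing in n. The short Python sketch below (an illustration written for this page, not code from the paper) verifies this numerically with SciPy:

import numpy as np
from scipy.stats import binom, poisson

def kl_to_poisson(n: int, p: float) -> float:
    """D( Binomial(n, p/n) || Poisson(p) ), summed over the full support 0..n."""
    k = np.arange(n + 1)
    q = binom.pmf(k, n, p / n)   # 1/n-thinned n-fold convolution of Bernoulli(p)
    r = poisson.pmf(k, p)        # Poisson limit with matching mean
    mask = q > 0                 # use the 0 * log 0 = 0 convention
    return float(np.sum(q[mask] * np.log(q[mask] / r[mask])))

p = 0.3
for n in (1, 2, 4, 8, 16, 32, 64):
    print(n, kl_to_poisson(n, p))

The printed divergences decrease monotonically toward 0 as n grows, consistent with the monotone convergence described in the abstract.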

Citations (30)
