Monotonic Convergence in an Information-Theoretic Law of Small Numbers
arXiv:0810.5203

Abstract
An "entropy increasing to the maximum" result analogous to the entropic central limit theorem (Barron 1986; Artstein et al. 2004) is obtained in the discrete setting. This involves the thinning operation and a Poisson limit. Monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions. Overall we extend the parallel between the information-theoretic central limit theorem and law of small numbers explored by Kontoyiannis et al. (2005) and Harremoës et al. (2007, 2008). Ingredients in the proofs include convexity, majorization, and stochastic orders.
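The thinning operation central to this law of small numbers replaces a count x by a Binomial(x, α) variable, i.e. each of the x "points" survives independently with probability α. A minimal Monte Carlo sketch (not from the paper; the choice of base distribution, sample size, and n are illustrative assumptions) shows the defining phenomenon: thinning the n-fold convolution of a mean-λ distribution by 1/n yields something close to Poisson(λ).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative base distribution: X uniform on {0,...,4}, so E[X] = 2.
lam = 2.0
n = 100          # number of convolved copies
num = 50_000     # Monte Carlo samples

X = rng.integers(0, 5, size=(num, n))   # n iid copies of X per sample
S = X.sum(axis=1)                       # the n-fold convolution X*...*X
thinned = rng.binomial(S, 1.0 / n)      # thinning T_{1/n}: keep each point w.p. 1/n

# The thinned convolution should be approximately Poisson(lam):
print(thinned.mean())                # close to lam = 2
print((thinned == 0).mean())         # close to exp(-2) ≈ 0.1353
```

The paper's monotonicity results sharpen this picture: along the sequence n = 1, 2, 3, ..., the relative entropy to the Poisson limit decreases, and (for ultra-log-concave X) the Shannon entropy of the thinned convolution increases.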