Optimal rates of entropy estimation over Lipschitz balls

(arXiv:1711.02141)
Published Nov 6, 2017 in math.ST, cs.IT, math.IT, stat.ME, and stat.TH

Abstract

We consider the problem of minimax estimation of the entropy of a density over Lipschitz balls. Dropping the usual assumption that the density is bounded away from zero, we obtain the minimax rates $(n\ln n)^{-s/(s+d)} + n^{-1/2}$ for $0<s\leq 2$ for densities supported on $[0,1]^d$, where $s$ is the smoothness parameter and $n$ is the number of independent samples. We generalize the results to densities with unbounded support: given an Orlicz function $\Psi$ of rapid growth (such as the sub-exponential and sub-Gaussian classes), the minimax rates for densities with bounded $\Psi$-Orlicz norm increase to $(n\ln n)^{-s/(s+d)} (\Psi^{-1}(n))^{d(1-d/p(s+d))} + n^{-1/2}$, where $p$ is the norm parameter in the Lipschitz ball. We also show that the integral-form plug-in estimators with kernel density estimates fail to achieve the minimax rates, and characterize their worst-case performance over the Lipschitz ball. One of the key steps in analyzing the bias relies on a novel application of the Hardy-Littlewood maximal inequality, which also leads to a new inequality on the Fisher information that may be of independent interest.
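
To make the plug-in approach concrete, below is a minimal one-dimensional sketch of an integral-form plug-in entropy estimator, the style of estimator the abstract shows to be rate-suboptimal. It is not the paper's minimax procedure: the function name `plugin_entropy`, the Gaussian kernel with SciPy's default (Scott's rule) bandwidth, the grid width, and the padding are all illustrative choices.

```python
import numpy as np
from scipy.stats import gaussian_kde

def plugin_entropy(samples, n_grid=2048, pad=1.0):
    """Integral-form plug-in entropy estimate: H(f_hat) = -int f_hat * ln(f_hat).

    Fits a Gaussian kernel density estimate f_hat to 1-D samples, then
    approximates the entropy integral by a Riemann sum on a uniform grid.
    """
    kde = gaussian_kde(samples)          # f_hat: KDE, bandwidth via Scott's rule
    grid = np.linspace(samples.min() - pad, samples.max() + pad, n_grid)
    f = np.clip(kde(grid), 1e-300, None) # guard against ln(0) from underflow
    dx = grid[1] - grid[0]
    return -np.sum(f * np.log(f)) * dx

# Sanity check: the entropy of N(0,1) is 0.5*ln(2*pi*e) ~ 1.4189
rng = np.random.default_rng(0)
print(plugin_entropy(rng.normal(size=10_000)))
```

Per the abstract's negative result, no tuning of an estimator of this form attains the minimax rate over the Lipschitz ball once the density is allowed to approach zero; the paper characterizes exactly how badly such estimators can perform in the worst case.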
