Minimax Estimation of Discrete Distributions under $\ell_1$ Loss (1411.1467v3)
Abstract: We analyze the problem of discrete distribution estimation under $\ell_1$ loss. We provide non-asymptotic upper and lower bounds on the maximum risk of the empirical distribution (the maximum likelihood estimator), and on the minimax risk in regimes where the alphabet size $S$ may grow with the number of observations $n$. We show that among distributions with bounded entropy $H$, the asymptotic maximum risk for the empirical distribution is $2H/\ln n$, while the asymptotic minimax risk is $H/\ln n$. Moreover, we show that a hard-thresholding estimator, oblivious to the unknown upper bound $H$, is asymptotically minimax. However, if we constrain the estimates to lie in the simplex of probability distributions, then the asymptotic minimax risk is again $2H/\ln n$. We draw connections between our work and the literature on density estimation, entropy estimation, total variation distance ($\ell_1$ divergence) estimation, joint distribution estimation in stochastic processes, normal mean estimation, and adaptive estimation.
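The abstract contrasts two estimators: the empirical distribution (MLE) and a hard-thresholding estimator that zeroes out small empirical masses. Below is a minimal sketch of both under $\ell_1$ loss. The threshold $c \ln(n)/n$ and the choice not to renormalize the thresholded estimate are illustrative assumptions; the abstract only states that the estimator is oblivious to $H$ and that constraining estimates to the simplex changes the asymptotic risk.

```python
import numpy as np

def empirical_distribution(samples, alphabet_size):
    """Maximum likelihood estimate: empirical frequencies p_hat[i] = n_i / n."""
    counts = np.bincount(samples, minlength=alphabet_size)
    return counts / counts.sum()

def hard_threshold_estimate(samples, alphabet_size, c=1.0):
    """Hard-thresholding estimator: zero out small empirical masses.

    The threshold c * ln(n) / n is a hypothetical choice for illustration;
    the exact threshold used in the paper is not specified in the abstract.
    The result is deliberately NOT renormalized onto the simplex, mirroring
    the abstract's distinction between unconstrained and simplex-constrained
    estimates.
    """
    n = len(samples)
    p_hat = empirical_distribution(samples, alphabet_size)
    thresh = c * np.log(n) / n
    return np.where(p_hat >= thresh, p_hat, 0.0)

def l1_loss(p, q):
    """ell_1 distance between two (sub)probability vectors."""
    return np.abs(np.asarray(p) - np.asarray(q)).sum()

# Example: estimate a distribution on S = 1000 symbols from n = 5000 draws.
rng = np.random.default_rng(0)
S, n = 1000, 5000
p = rng.dirichlet(np.ones(S))          # unknown true distribution
samples = rng.choice(S, size=n, p=p)   # n i.i.d. observations

print("MLE         ell_1 risk:", l1_loss(p, empirical_distribution(samples, S)))
print("Thresholded ell_1 risk:", l1_loss(p, hard_threshold_estimate(samples, S)))
```

This regime, with $S$ comparable to $n$, is where the gap between the maximum risk of the MLE and the minimax risk described in the abstract becomes relevant.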