Minimax Estimation of the $L_1$ Distance

(1705.00807)
Published May 2, 2017 in math.ST, cs.IT, math.IT, and stat.TH

Abstract

We consider the problem of estimating the $L_1$ distance between two discrete probability measures $P$ and $Q$ from empirical data in a nonasymptotic, large-alphabet setting. When $Q$ is known and one obtains $n$ samples from $P$, we show that for every $Q$, the minimax rate-optimal estimator with $n$ samples achieves performance comparable to that of the maximum likelihood estimator (MLE) with $n\ln n$ samples. When both $P$ and $Q$ are unknown, we construct minimax rate-optimal estimators whose worst-case performance is essentially that of the known-$Q$ case with $Q$ uniform, implying that a uniform $Q$ is essentially the most difficult case. The \emph{effective sample size enlargement} phenomenon, identified in Jiao \emph{et al.} (2015), holds both in the known-$Q$ case for every $Q$ and in the unknown-$Q$ case. However, the construction of optimal estimators for $\|P-Q\|_1$ requires new techniques and insights beyond the approximation-based method of functional estimation in Jiao \emph{et al.} (2015).
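The baseline throughout is the plug-in estimator, which substitutes the empirical frequencies (the MLE of the pmf) into the distance functional. Below is a minimal Python sketch of the plug-in estimator in both the known-$Q$ and unknown-$Q$ settings; the helper names and the uniform-$Q$ simulation are illustrative assumptions, not taken from the paper. It shows why the plug-in approach degrades in the large-alphabet regime, where the alphabet size $S$ is comparable to or larger than the sample size $n$:

```python
import numpy as np

def empirical_pmf(samples, alphabet_size):
    """Empirical frequencies (the MLE of the pmf) over {0, ..., S-1}."""
    counts = np.bincount(samples, minlength=alphabet_size)
    return counts / len(samples)

def plugin_l1_known_q(samples_p, q):
    """Plug-in estimate of ||P - Q||_1 when Q is known exactly."""
    p_hat = empirical_pmf(samples_p, len(q))
    return np.abs(p_hat - q).sum()

def plugin_l1_unknown_q(samples_p, samples_q, alphabet_size):
    """Plug-in estimate of ||P - Q||_1 from samples of both P and Q."""
    p_hat = empirical_pmf(samples_p, alphabet_size)
    q_hat = empirical_pmf(samples_q, alphabet_size)
    return np.abs(p_hat - q_hat).sum()

# Illustrative large-alphabet regime where the plug-in estimator is badly biased:
# n samples from P = Q = uniform on S symbols, with n much smaller than S.
rng = np.random.default_rng(0)
S, n = 10_000, 1_000
q = np.full(S, 1.0 / S)
samples_p = rng.integers(0, S, size=n)
print(plugin_l1_known_q(samples_p, q))  # about 2 * (1 - n / S) = 1.8, though the true distance is 0
```

This bias is what the paper's rate-optimal constructions overcome: their estimators with $n$ samples match the plug-in's performance with $n\ln n$ samples.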
