Communication Complexity of Estimating Correlations

(1901.09100)
Published Jan 25, 2019 in cs.IT, cs.LG, math.IT, math.ST, stat.ML, and stat.TH

Abstract

We characterize the communication complexity of the following distributed estimation problem. Alice and Bob observe infinitely many iid copies of $\rho$-correlated unit-variance (Gaussian or $\pm1$ binary) random variables, with unknown $\rho\in[-1,1]$. By interactively exchanging $k$ bits, Bob wants to produce an estimate $\hat\rho$ of $\rho$. We show that the best possible performance (optimized over interaction protocol $\Pi$ and estimator $\hat\rho$) satisfies $\inf_{\Pi, \hat\rho}\sup_{\rho} \mathbb{E}[|\rho-\hat\rho|^2] = \tfrac{1}{k}\left(\frac{1}{2 \ln 2} + o(1)\right)$. Curiously, the number of samples in our achievability scheme is exponential in $k$; by contrast, a naive scheme exchanging $k$ samples achieves the same $\Omega(1/k)$ rate but with a suboptimal prefactor. Our protocol achieving optimal performance is one-way (non-interactive). We also prove the $\Omega(1/k)$ bound even when $\rho$ is restricted to any small open sub-interval of $[-1,1]$ (i.e., a local minimax lower bound). Our proof techniques rely on symmetric strong data-processing inequalities and various tensorization techniques from information-theoretic interactive common-randomness extraction. Our results also imply an $\Omega(n)$ lower bound on the information complexity of the Gap-Hamming problem, for which we show a direct information-theoretic proof.
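The naive baseline mentioned in the abstract is easy to simulate. Below is a minimal sketch (not code from the paper; all names and parameters are illustrative): in the $\pm1$ binary case, Alice sends $k$ of her raw samples verbatim (one bit each), and Bob estimates $\rho$ by the empirical correlation with his own samples, using $\mathbb{E}[XY]=\rho$. Since $XY\in\{-1,1\}$ with mean $\rho$, the estimator's MSE is $(1-\rho^2)/k$, so its worst-case $k\cdot\mathrm{MSE}$ is $1$ (at $\rho=0$), versus the optimal prefactor $\tfrac{1}{2\ln 2}\approx 0.72$.

```python
import numpy as np

def naive_protocol(rho, k, rng):
    """Naive one-way scheme (illustrative, not the paper's optimal protocol):
    Alice transmits k of her +/-1 samples verbatim (one bit per sample);
    Bob returns the empirical correlation with his own samples."""
    # Draw k rho-correlated +/-1 pairs: P(Y = X) = (1 + rho) / 2, so E[XY] = rho.
    x = rng.choice([-1, 1], size=k)
    flip = rng.random(k) < (1 - rho) / 2
    y = np.where(flip, -x, x)
    # Bob's estimator, clipped to the known range [-1, 1].
    return np.clip(np.mean(x * y), -1.0, 1.0)

rng = np.random.default_rng(0)
k = 1000
for rho in (0.0, 0.5, 0.9):
    sq_errs = [(naive_protocol(rho, k, rng) - rho) ** 2 for _ in range(2000)]
    print(f"rho={rho:+.1f}  k*MSE ~ {k * np.mean(sq_errs):.3f}  (1 - rho^2 = {1 - rho**2:.3f})")
```

Running this prints $k\cdot\mathrm{MSE}$ close to $1-\rho^2$ for each $\rho$, consistent with the $\Omega(1/k)$ rate but a suboptimal worst-case constant; achieving the sharp $\tfrac{1}{2\ln 2}$ prefactor requires the paper's scheme, whose sample count is exponential in $k$.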
