Emergent Mind

Sharp indistinguishability bounds from non-uniform approximations

(2103.07842)
Published Mar 14, 2021 in cs.CC

Abstract

We study the problem of distinguishing between two symmetric probability distributions over $n$ bits by observing $k$ bits of a sample, subject to the constraint that all $(k-1)$-wise marginals of the two distributions are identical. Previous works of Bogdanov et al. and of Huang and Viola established approximately tight results on the maximal statistical distance when $k$ is at most a small constant fraction of $n$, and Naor and Shamir gave a tight bound for all $k$ in the case of distinguishing with the OR function. In this work we provide sharp upper and lower bounds on the maximal statistical distance that hold for all $k$. Upper bounds on the statistical distance have typically been obtained by providing uniform low-degree polynomial approximations to certain higher-degree polynomials; the sharpness and wider applicability of our result stem from the construction of suitable non-uniform approximations.
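As a concrete illustration of the setting (not an example from the paper itself), consider the classical parity pair: the uniform distribution over even-Hamming-weight strings versus the uniform distribution over odd-weight strings. Both are symmetric, all their $(n-1)$-wise marginals coincide (each is uniform), yet the full distributions have disjoint supports, so an observer who sees all $n$ bits distinguishes them perfectly. The sketch below, with hypothetical helper names `symmetric_dist`, `marginal`, and `stat_distance`, computes the statistical distance of the $k$-bit marginals by brute force:

```python
from itertools import product
from math import comb

def symmetric_dist(n, weight_probs):
    """Symmetric distribution over {0,1}^n: weight_probs[w] is the total
    probability of the weight-w slice, split evenly over its C(n, w) strings."""
    return {x: weight_probs[sum(x)] / comb(n, sum(x))
            for x in product((0, 1), repeat=n)}

def marginal(dist, k):
    """Marginal on the first k coordinates (by symmetry, any k coordinates)."""
    m = {}
    for x, p in dist.items():
        m[x[:k]] = m.get(x[:k], 0.0) + p
    return m

def stat_distance(p, q):
    """Total variation (statistical) distance between finite distributions."""
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in keys)

n = 5
# D0: uniform over even-Hamming-weight strings; D1: uniform over odd-weight ones.
even = [comb(n, w) / 2 ** (n - 1) if w % 2 == 0 else 0.0 for w in range(n + 1)]
odd = [comb(n, w) / 2 ** (n - 1) if w % 2 == 1 else 0.0 for w in range(n + 1)]
D0, D1 = symmetric_dist(n, even), symmetric_dist(n, odd)

for k in range(n + 1):
    print(k, stat_distance(marginal(D0, k), marginal(D1, k)))
# every k < n gives distance 0; k = n gives distance 1
```

This extreme example shows why the interesting regime is intermediate $k$: the paper's bounds quantify exactly how large the distinguishing advantage can be as $k$ grows, for arbitrary symmetric pairs with matching $(k-1)$-wise marginals.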
