Testing Closeness of Multivariate Distributions via Ramsey Theory

(arXiv:2311.13154)

Published Nov 22, 2023 in cs.DS, cs.IT, cs.LG, math.IT, math.ST, stat.ML, and stat.TH

Abstract

We investigate the statistical task of closeness (or equivalence) testing for multidimensional distributions. Specifically, given sample access to two unknown distributions $\mathbf p, \mathbf q$ on $\mathbb R^d$, we want to distinguish between the case that $\mathbf p=\mathbf q$ versus $\|\mathbf p-\mathbf q\|_{\mathcal A_k} > \epsilon$, where $\|\mathbf p-\mathbf q\|_{\mathcal A_k}$ denotes the generalized $\mathcal A_k$ distance between $\mathbf p$ and $\mathbf q$ -- measuring the maximum discrepancy between the distributions over any collection of $k$ disjoint, axis-aligned rectangles. Our main result is the first closeness tester for this problem with {\em sub-learning} sample complexity in any fixed dimension and a nearly-matching sample complexity lower bound. In more detail, we provide a computationally efficient closeness tester with sample complexity $O\left((k^{6/7}/\mathrm{poly}_d(\epsilon)) \log^d(k)\right)$. On the lower bound side, we establish a qualitatively matching sample complexity lower bound of $\Omega(k^{6/7}/\mathrm{poly}(\epsilon))$, even for $d=2$. These sample complexity bounds are surprising because the sample complexity of the problem in the univariate setting is $\Theta(k^{4/5}/\mathrm{poly}(\epsilon))$. This has the interesting consequence that the jump from one to two dimensions leads to a substantial increase in sample complexity, while increases beyond that do not. As a corollary of our general $\mathcal A_k$ tester, we obtain $d_{\mathrm{TV}}$-closeness testers for pairs of $k$-histograms on $\mathbb R^d$ over a common unknown partition, and pairs of uniform distributions supported on the union of $k$ unknown disjoint axis-aligned rectangles. Both our algorithm and our lower bound make essential use of tools from Ramsey theory.
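To illustrate the quantity being tested, here is a minimal NumPy sketch (all names hypothetical; this is not the paper's tester) of the empirical discrepancy between two sample sets over one fixed collection of axis-aligned rectangles. The $\mathcal A_k$ distance itself is the supremum of this quantity over all collections of $k$ disjoint rectangles:

```python
import numpy as np

def rect_mass(samples, rect):
    """Fraction of sample points inside a closed axis-aligned rectangle.

    samples: (n, d) array; rect: (lo, hi) pair of length-d arrays.
    """
    lo, hi = rect
    inside = np.all((samples >= lo) & (samples <= hi), axis=1)
    return inside.mean()

def ak_discrepancy(p_samples, q_samples, rects):
    """Empirical discrepancy of p vs. q over a fixed collection of
    disjoint rectangles: sum of |p(R) - q(R)| over the rectangles.
    The A_k distance is the sup of this over all size-k collections."""
    return sum(abs(rect_mass(p_samples, r) - rect_mass(q_samples, r))
               for r in rects)

# Toy example in d = 2: p spreads mass across [0,1]^2, while q
# concentrates entirely on the right half x >= 0.5 (made-up data).
p_samples = np.array([[0.1, 0.1], [0.2, 0.5], [0.7, 0.3], [0.9, 0.9]])
q_samples = np.array([[0.6, 0.2], [0.7, 0.7], [0.8, 0.4], [0.95, 0.9]])
rects = [(np.array([0.0, 0.0]), np.array([0.5, 1.0])),   # left half
         (np.array([0.5, 0.0]), np.array([1.0, 1.0]))]   # right half
disc = ak_discrepancy(p_samples, q_samples, rects)  # 0.5 + 0.5 = 1.0
```

With these two rectangles, p places mass 0.5 in each half while q places 0 and 1, so the discrepancy is $|0.5-0| + |0.5-1| = 1$. The hardness the paper addresses comes from the supremum over rectangle collections, which this fixed-collection sketch does not attempt.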
