Polynomial-Time Sum-of-Squares Can Robustly Estimate Mean and Covariance of Gaussians Optimally (2110.11853v1)
Abstract: In this work, we revisit the problem of estimating the mean and covariance of an unknown $d$-dimensional Gaussian distribution in the presence of an $\varepsilon$-fraction of adversarial outliers. The pioneering work of [DKK+16] gave a polynomial time algorithm for this task with optimal $\tilde{O}(\varepsilon)$ error using $n = \textrm{poly}(d, 1/\varepsilon)$ samples. On the other hand, [KS17b] introduced a general framework for robust moment estimation via a canonical sum-of-squares relaxation that succeeds for the more general class of certifiably subgaussian and certifiably hypercontractive [BK20] distributions. When specialized to Gaussians, this algorithm obtains the same $\tilde{O}(\varepsilon)$ error guarantee as [DKK+16] but incurs a super-polynomial sample complexity ($n = d^{O(\log(1/\varepsilon))}$) and running time ($n^{O(\log(1/\varepsilon))}$). This cost appears inherent to their analysis as it relies only on sum-of-squares certificates of upper bounds on directional moments, while the analysis in [DKK+16] relies on lower bounds on directional moments inferred from algebraic relationships between moments of Gaussian distributions. We give a new, simple analysis of the same canonical sum-of-squares relaxation used in [KS17b, BK20] and show that for Gaussian distributions, their algorithm achieves the same error, sample complexity, and running time guarantees as the specialized algorithm in [DKK+16]. Our key innovation is a new argument that allows using moment lower bounds without having sum-of-squares certificates for them. We believe that our proof technique will likely be useful in developing further robust estimation algorithms.
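To make the problem setup concrete, the sketch below illustrates robust mean estimation under an $\varepsilon$-fraction of adversarial outliers with a simple iterative spectral filter in the spirit of [DKK+16]. This is a toy illustration of the corruption model and the role of directional second moments, not the paper's sum-of-squares relaxation; the function name, stopping threshold, and removal fraction are illustrative choices, not taken from the paper.

```python
import numpy as np

def robust_mean_filter(X, eps, iters=10):
    """Toy spectral filter for robust mean estimation (illustrative sketch,
    not the paper's sum-of-squares algorithm). While the top eigenvalue of
    the empirical covariance is much larger than the identity-covariance
    baseline of 1, remove the points that deviate most along the
    corresponding eigenvector."""
    X = np.asarray(X, dtype=float)
    for _ in range(iters):
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        # eigh returns eigenvalues in ascending order; take the top pair.
        w, V = np.linalg.eigh(cov)
        v = V[:, -1]
        # Heuristic stopping rule: the remaining sample looks Gaussian
        # enough once no direction has noticeably inflated variance.
        if w[-1] <= 1 + 10 * eps:
            break
        # Score each point by its deviation along the worst direction
        # and drop the eps-fraction with the largest scores.
        scores = np.abs((X - mu) @ v)
        k = max(1, int(eps * len(X)))
        X = X[np.argsort(scores)[:-k]]
    return X.mean(axis=0)

# Demo: N(0, I_d) inliers plus an eps-fraction of outliers planted
# at a single far-away point, which shifts the naive empirical mean.
rng = np.random.default_rng(0)
d, n, eps = 10, 5000, 0.05
inliers = rng.standard_normal((n, d))
outliers = np.full((int(eps * n), d), 5.0)
X = np.vstack([inliers, outliers])
print(np.linalg.norm(X.mean(axis=0)))           # naive mean: pulled toward the outliers
print(np.linalg.norm(robust_mean_filter(X, eps)))  # filtered mean: close to the true mean 0
```

Outliers concentrated in one direction inflate the empirical variance along that direction, which is what the filter detects; the paper's contribution is an analysis showing that the canonical sum-of-squares relaxation matches this kind of specialized guarantee for Gaussians at polynomial sample complexity.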