SQ Lower Bounds for Learning Bounded Covariance GMMs (2306.13057v1)

Published 22 Jun 2023 in cs.LG, cs.DS, math.ST, stat.ML, and stat.TH

Abstract: We study the complexity of learning mixtures of separated Gaussians with common unknown bounded covariance matrix. Specifically, we focus on learning Gaussian mixture models (GMMs) on $\mathbb{R}^d$ of the form $P = \sum_{i=1}^k w_i \mathcal{N}(\boldsymbol{\mu}_i, \mathbf{\Sigma}_i)$, where $\mathbf{\Sigma}_i = \mathbf{\Sigma} \preceq \mathbf{I}$ and $\min_{i \neq j} \|\boldsymbol{\mu}_i - \boldsymbol{\mu}_j\|_2 \geq k^\epsilon$ for some $\epsilon > 0$. Known learning algorithms for this family of GMMs have complexity $(dk)^{O(1/\epsilon)}$. In this work, we prove that any Statistical Query (SQ) algorithm for this problem requires complexity at least $d^{\Omega(1/\epsilon)}$. In the special case where the separation is on the order of $k^{1/2}$, we additionally obtain fine-grained SQ lower bounds with the correct exponent. Our SQ lower bounds imply similar lower bounds for low-degree polynomial tests. Conceptually, our results provide evidence that known algorithms for this problem are nearly best possible.
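
As a concrete illustration of the GMM family defined in the abstract (not taken from the paper), the following Python sketch builds a mixture with shared covariance $\mathbf{\Sigma} \preceq \mathbf{I}$ and pairwise mean separation at least $k^\epsilon$, then samples from it. All parameter values (d, k, epsilon, the means, Sigma, the weights) are illustrative assumptions.

```python
# Minimal sketch of the model class: P = sum_i w_i N(mu_i, Sigma) on R^d,
# with a common covariance Sigma <= I and min_{i != j} ||mu_i - mu_j||_2 >= k^eps.
import numpy as np

rng = np.random.default_rng(0)
d, k, eps = 10, 4, 0.5          # dimension, number of components, separation exponent
sep = k ** eps                  # required minimum pairwise mean separation k^eps

# Scaled standard basis vectors as means: for i != j,
# ||mu_i - mu_j||_2 = sqrt(2) * sep >= sep, so the separation condition holds.
mus = sep * np.eye(k, d)

# Shared covariance Sigma with Sigma <= I (all eigenvalues in (0, 1]).
Sigma = 0.5 * np.eye(d)

# Mixing weights w_i (uniform here), summing to 1.
w = np.full(k, 1.0 / k)

# Verify the separation condition min_{i != j} ||mu_i - mu_j||_2 >= k^eps.
gaps = [np.linalg.norm(mus[i] - mus[j]) for i in range(k) for j in range(i + 1, k)]
assert min(gaps) >= sep

# Draw n samples from the mixture P.
n = 1000
labels = rng.choice(k, size=n, p=w)
samples = np.array([rng.multivariate_normal(mus[z], Sigma) for z in labels])
print(samples.shape)  # (1000, 10)
```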
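The abstract's lower bound is stated in the Statistical Query model, in which a learner never sees raw samples but instead submits bounded queries $q : \mathbb{R}^d \to [-1, 1]$ and receives $\mathbb{E}_P[q(x)]$ up to an adversarial tolerance $\tau$. The sketch below simulates such an oracle for intuition only; the function name `sq_oracle`, the use of a sample mean as a stand-in for the true expectation, and the value of `tau` are all assumptions for illustration, not the paper's construction.

```python
# Minimal sketch of an SQ oracle: answer a bounded query with the expectation
# E_P[q(x)] (approximated here by a sample mean) plus an error of size <= tau.
import numpy as np

rng = np.random.default_rng(1)

def sq_oracle(query, samples, tau):
    """Return an estimate of E_P[query(x)] perturbed by at most tau."""
    true_value = np.mean([query(x) for x in samples])
    return true_value + rng.uniform(-tau, tau)

# Example: query the mean of the first coordinate, clipped to [-1, 1].
samples = rng.multivariate_normal(np.zeros(3), np.eye(3), size=500)
answer = sq_oracle(lambda x: np.clip(x[0], -1.0, 1.0), samples, tau=0.01)
print(answer)
```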
