We study the complexity of learning mixtures of separated Gaussians with
common unknown bounded covariance matrix. Specifically, we focus on learning
Gaussian mixture models (GMMs) on ℝ^d of the form P = ∑_{i=1}^k w_i N(μ_i, Σ_i), where Σ_i = Σ ⪯ I and min_{i≠j} ∥μ_i − μ_j∥_2 ≥ k^ε for some ε > 0. Known learning
algorithms for this family of GMMs have complexity (dk)^{O(1/ε)}. In
this work, we prove that any Statistical Query (SQ) algorithm for this problem
requires complexity at least d^{Ω(1/ε)}. In the special case
where the separation is on the order of k^{1/2}, we additionally obtain
fine-grained SQ lower bounds with the correct exponent. Our SQ lower bounds
imply similar lower bounds for low-degree polynomial tests. Conceptually, our
results provide evidence that known algorithms for this problem are nearly best
possible.
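
To make the model class concrete, the following is a minimal illustrative sketch (not from the paper) of sampling from a GMM with a common covariance Σ ⪯ I whose component means satisfy the pairwise separation condition min_{i≠j} ∥μ_i − μ_j∥_2 ≥ k^ε. The specific values of d, k, ε, and the mean placement are arbitrary choices for the example.

```python
import numpy as np

def sample_gmm(weights, means, cov, n, rng):
    """Draw n samples from P = sum_i w_i N(mu_i, Sigma) with a shared covariance."""
    k = len(weights)
    components = rng.choice(k, size=n, p=weights)
    return np.array([rng.multivariate_normal(means[c], cov) for c in components])

rng = np.random.default_rng(0)
d, k, eps = 5, 3, 0.5  # arbitrary example parameters

# Means spaced 2.0 apart along the first coordinate, so every pairwise
# distance is at least 2.0 >= k^eps = sqrt(3) ~ 1.73.
means = np.array([2.0 * i * np.eye(d)[0] for i in range(k)])
cov = 0.5 * np.eye(d)          # common covariance Sigma, with Sigma <= I
weights = np.full(k, 1.0 / k)  # uniform mixing weights

X = sample_gmm(weights, means, cov, n=1000, rng=rng)

# Check the separation condition from the abstract for these means.
sep = min(np.linalg.norm(means[i] - means[j])
          for i in range(k) for j in range(i + 1, k))
assert sep >= k ** eps
print(X.shape)
```

The shared covariance is what distinguishes this family from general GMMs with per-component covariances; the lower bounds above apply already in this restricted setting.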