Coresets for Gaussian Mixture Models of Any Shape
An $\varepsilon$-coreset for a given set of points is usually a
small weighted set, such that querying the coreset \emph{provably} yields a
$(1\pm\varepsilon)$-factor approximation to the original (full) dataset, for a
given family of queries. Using existing techniques, coresets can be maintained
for streaming, dynamic (insertions/deletions), and distributed data in parallel,
e.g., on a network, GPU, or cloud.
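As a minimal illustration of the coreset property described above (and not the paper's construction), the following Python sketch builds a toy weighted "coreset" by uniform sampling and compares its answer to the full dataset's answer on sum-of-squared-distance queries; the uniform-sampling scheme, sample size, and variable names are our own assumptions for illustration:

```python
import random

random.seed(0)

# Full dataset: n points on the real line.
n = 10_000
points = [random.gauss(0.0, 1.0) for _ in range(n)]

def cost(weighted_pts, q):
    """Sum of weighted squared distances to a query center q (a 1-mean query)."""
    return sum(w * (x - q) ** 2 for x, w in weighted_pts)

full = [(x, 1.0) for x in points]

# Toy "coreset": a uniform sample of m points, each weighted n/m so that the
# weighted cost is an unbiased estimate of the full cost.  Real coreset
# constructions use importance (sensitivity) sampling to obtain worst-case
# (1 +/- eps) guarantees over ALL queries; uniform sampling is illustration only.
m = 500
coreset = [(x, n / m) for x in random.sample(points, m)]

for q in (-1.0, 0.0, 2.5):
    c_full, c_core = cost(full, q), cost(coreset, q)
    print(q, c_full, c_core, abs(c_core - c_full) / c_full)
```

The point of the sketch is only the query interface: the same `cost` function is evaluated on the small weighted set instead of the full data, and the relative error printed in the last column stays small.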
We suggest the first coresets that approximate the negative log-likelihood
for $k$-Gaussian Mixture Models (GMMs) of arbitrary shapes (i.e., with arbitrary ratios between the
eigenvalues of their covariance matrices). For example, for any input set
of $n$ points whose coordinates are bounded integers and any fixed $\varepsilon > 0$, the coreset size is small, and it can be computed in
time near-linear in $n$, with high probability. The optimal GMM may then be
approximated quickly by learning on the small coreset.
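To make the negative log-likelihood claim concrete, here is a hedged Python sketch (again using uniform sampling rather than the paper's algorithm) that evaluates the weighted negative log-likelihood of a fixed 1-D mixture on a toy coreset and on the full data; the very different component variances stand in for the "arbitrary shape" setting, and all parameter values are assumptions for illustration:

```python
import math
import random

random.seed(1)

# Data drawn from two 1-D Gaussians with very different scales
# (in 1-D, "arbitrary shape" reduces to unequal variances).
n = 10_000
pts = [random.gauss(-2.0, 0.1) if random.random() < 0.5 else random.gauss(3.0, 4.0)
       for _ in range(n)]

def gmm_nll(weighted_pts, params):
    """Weighted negative log-likelihood of a 1-D Gaussian mixture.
    params: list of (mixing weight, mean, std) triples."""
    total = 0.0
    for x, w in weighted_pts:
        lik = sum(p / (math.sqrt(2.0 * math.pi) * sd)
                  * math.exp(-0.5 * ((x - mu) / sd) ** 2)
                  for p, mu, sd in params)
        total -= w * math.log(lik)
    return total

full = [(x, 1.0) for x in pts]

# Toy coreset: uniform sample reweighted by n/m (illustration only; a real
# construction needs importance sampling for a worst-case guarantee).
m = 1_000
coreset = [(x, n / m) for x in random.sample(pts, m)]

# A fixed candidate model; fitting (e.g., EM) would be run on the coreset.
model = [(0.5, -2.0, 0.1), (0.5, 3.0, 4.0)]
print(gmm_nll(full, model), gmm_nll(coreset, model))
```

Since the weighted likelihood on the coreset tracks the full-data likelihood, any learner (e.g., EM) can be run on the small weighted set instead of the full dataset.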
Previous results [NIPS'11, JMLR'18] suggested such small coresets only for the
case of semi-spherical unit Gaussians, i.e., where the corresponding
eigenvalues are bounded between two constants.
Our main technique is a reduction between coresets for $k$-GMMs and
projective clustering problems. We implemented our algorithms, and provide
open-source code and experimental results. Since our coresets are generic, with no special
dependency on GMMs, we hope that they will be useful for many other functions.