Sketching Algorithms for Sparse Dictionary Learning: PTAS and Turnstile Streaming

Abstract

Sketching algorithms have recently proven to be a powerful approach both for designing low-space streaming algorithms and for designing fast polynomial-time approximation schemes (PTAS). In this work, we develop new techniques to extend the applicability of sketching-based approaches to the sparse dictionary learning and Euclidean $k$-means clustering problems. In particular, we initiate the study of the challenging setting where the dictionary/clustering assignment for each of the $n$ input points must be output, which has surprisingly received little attention in prior work. On the fast algorithms front, we obtain a new approach for designing PTASs for the $k$-means clustering problem, which generalizes to the first PTAS for the sparse dictionary learning problem. On the streaming algorithms front, we obtain new upper and lower bounds for dictionary learning and $k$-means clustering. In particular, given a design matrix $\mathbf A\in\mathbb R^{n\times d}$ in a turnstile stream, we show an $\tilde O(nr/\epsilon^2 + dk/\epsilon)$ space upper bound for $r$-sparse dictionary learning of size $k$, an $\tilde O(n/\epsilon^2 + dk/\epsilon)$ space upper bound for $k$-means clustering, as well as an $\tilde O(n)$ space upper bound for $k$-means clustering on random-order row-insertion streams under a natural "bounded sensitivity" assumption. On the lower bounds side, we obtain a general $\tilde\Omega(n/\epsilon + dk/\epsilon)$ lower bound for $k$-means clustering, as well as an $\tilde\Omega(n/\epsilon^2)$ lower bound for algorithms that can estimate the cost of a single fixed set of candidate centers.

Comment: To appear in NeurIPS 202
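For reference, a minimal sketch of the two objectives referred to above, written in the abstract's notation; the exact constraints (e.g., row-wise sparsity of the coefficient matrix, normalization of dictionary atoms) are assumed from the standard formulations of these problems and are not taken from this page.

% r-sparse dictionary learning of size k (assumed standard formulation):
% find a dictionary D with k atoms and row-wise r-sparse coefficients X
% approximating the rows of A.
\[
\min_{\substack{\mathbf D \in \mathbb R^{k\times d},\ \mathbf X \in \mathbb R^{n\times k} \\ \|\mathbf X_{i,*}\|_0 \le r \ \forall i \in [n]}}
\ \|\mathbf A - \mathbf X\mathbf D\|_F^2
\]

% Euclidean k-means (assumed standard formulation): the rows of A are the n
% input points, and each is assigned to its nearest of k candidate centers.
\[
\min_{c_1,\dots,c_k \in \mathbb R^{d}}
\ \sum_{i=1}^{n} \min_{j \in [k]} \|\mathbf A_{i,*} - c_j\|_2^2
\]

In this view, $k$-means can be read as the special case in which each row of $\mathbf X$ has a single nonzero entry equal to one, which is consistent with the abstract treating the two problems with a common set of techniques.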
