
    Mixtures of Gaussians are Privately Learnable with a Polynomial Number of Samples

    We study the problem of estimating mixtures of Gaussians under the constraint of differential privacy (DP). Our main result is that $\tilde{O}(k^2 d^4 \log(1/\delta) / \alpha^2 \varepsilon)$ samples are sufficient to estimate a mixture of $k$ Gaussians up to total variation distance $\alpha$ while satisfying $(\varepsilon, \delta)$-DP. This is the first finite sample complexity upper bound for the problem that does not make any structural assumptions on the GMMs. To solve the problem, we devise a new framework which may be useful for other tasks. At a high level, we show that if a class of distributions (such as Gaussians) is (1) list decodable and (2) admits a "locally small" cover (Bun et al., 2021) with respect to total variation distance, then the class of its mixtures is privately learnable. The proof circumvents a known barrier indicating that, unlike Gaussians, GMMs do not admit a locally small cover (Aden-Ali et al., 2021b).
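    As a quick sanity check on how the bound above scales, the dominant term can be evaluated directly. This is only a sketch of the stated expression, ignoring the polylogarithmic factors hidden in the $\tilde{O}$; the function name is ours, not from the paper:

    ```python
    import math

    def gmm_dp_sample_bound(k, d, alpha, eps, delta):
        """Evaluate k^2 * d^4 * log(1/delta) / (alpha^2 * eps), the leading
        term of the paper's sample complexity upper bound for learning a
        mixture of k Gaussians in d dimensions to TV distance alpha under
        (eps, delta)-DP. Hidden polylog factors are omitted."""
        return k**2 * d**4 * math.log(1 / delta) / (alpha**2 * eps)

    # Example: a 3-component mixture in 10 dimensions, alpha = 0.1,
    # eps = 1, delta = 1e-6. Note the d^4 term dominates the growth.
    n = gmm_dp_sample_bound(k=3, d=10, alpha=0.1, eps=1.0, delta=1e-6)
    ```

    The quadratic dependence on $k$ and quartic dependence on $d$ make the dimension the dominant cost here; the privacy parameters enter only through the mild $\log(1/\delta)/\varepsilon$ factor.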

    Some Applications of Coding Theory in Computational Complexity

    Error-correcting codes and related combinatorial constructs play an important role in several recent (and old) results in computational complexity theory. In this paper we survey results on locally-testable and locally-decodable error-correcting codes, and their applications to complexity theory and to cryptography. Locally decodable codes are error-correcting codes with sub-linear time error-correcting algorithms. They are related to private information retrieval (a type of cryptographic protocol), and they are used in average-case complexity and to construct "hard-core predicates" for one-way permutations. Locally testable codes are error-correcting codes with sub-linear time error-detection algorithms, and they are the combinatorial core of probabilistically checkable proofs.
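    The idea of sub-linear time decoding can be made concrete with the Hadamard code, the classic 2-query locally decodable code: any single message bit can be recovered by reading just two codeword positions, and because those positions are uniformly distributed, the decoder succeeds with high probability even when a small constant fraction of the codeword is corrupted. A minimal sketch (a standard textbook construction, not necessarily the presentation used in this survey):

    ```python
    import random

    def hadamard_encode(x):
        """Encode an n-bit message x as its Hadamard codeword: the inner
        product <x, a> mod 2 for every mask a in {0,1}^n, so the codeword
        has length 2^n (exponential blow-up is the price of 2-query decoding)."""
        n = len(x)
        return [sum(xi & ((a >> i) & 1) for i, xi in enumerate(x)) % 2
                for a in range(2 ** n)]

    def local_decode(codeword, i, n, rng=random):
        """Recover message bit x_i with only 2 queries: pick a uniformly
        random mask a and return C[a] XOR C[a XOR e_i]. By linearity this
        equals <x, e_i> = x_i; since each query is uniform, a delta-fraction
        of corruptions flips the answer with probability at most 2*delta."""
        a = rng.randrange(2 ** n)
        e_i = 1 << i
        return codeword[a] ^ codeword[a ^ e_i]

    # Example: encode a 4-bit message and locally decode each bit.
    x = [1, 0, 1, 1]
    C = hadamard_encode(x)          # 16 codeword bits
    bits = [local_decode(C, i, 4) for i in range(4)]
    ```

    The two-query trick is exactly the structure exploited by 2-server private information retrieval: each server answers one of the two queries, and neither learns the index $i$ in isolation.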