
    Polynomial-time Tensor Decompositions with Sum-of-Squares

    We give new algorithms based on the sum-of-squares method for tensor decomposition. Our results improve the best known running times from quasi-polynomial to polynomial for several problems, including decomposing random overcomplete 3-tensors and learning overcomplete dictionaries with constant relative sparsity. We also give the first robust analysis for decomposing overcomplete 4-tensors in the smoothed analysis model. A key ingredient of our analysis is to establish small spectral gaps in moment matrices derived from solutions to sum-of-squares relaxations. To enable this analysis we augment sum-of-squares relaxations with spectral analogs of maximum entropy constraints. (Comment: to appear in FOCS 2016.)
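
    To make the input object concrete, here is a minimal NumPy sketch of the kind of random overcomplete 3-tensor the abstract refers to: a sum of cubes of m > n random unit vectors. The dimensions n and m are illustrative choices of mine; this constructs the problem instance, not the paper's algorithm.

```python
import numpy as np

# A random overcomplete 3-tensor T = sum_l a_l (x) a_l (x) a_l,
# with more components (m) than the ambient dimension (n).
rng = np.random.default_rng(0)
n, m = 20, 60                          # m > n: the "overcomplete" regime
A = rng.standard_normal((n, m))
A /= np.linalg.norm(A, axis=0)         # unit-norm components a_1, ..., a_m

# T[i, j, k] = sum_l A[i, l] * A[j, l] * A[k, l]
T = np.einsum('il,jl,kl->ijk', A, A, A)

# Tensor decomposition asks to recover the columns of A (up to signs and
# permutation) from T alone; simple spectral methods typically need m <= n,
# and the paper gives polynomial-time sum-of-squares guarantees beyond that.
print(T.shape)  # (20, 20, 20)
```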

    The power of sum-of-squares for detecting hidden structures

    We study planted problems (finding hidden structures in random noisy inputs) through the lens of the sum-of-squares semidefinite programming hierarchy (SoS). This family of powerful semidefinite programs has recently yielded many new algorithms for planted problems, often achieving the best known polynomial-time guarantees in terms of accuracy of recovered solutions and robustness to noise. One theme in recent work is the design of spectral algorithms which match the guarantees of SoS algorithms for planted problems. Classical spectral algorithms are often unable to accomplish this: the twist in these new spectral algorithms is the use of spectral structure of matrices whose entries are low-degree polynomials of the input variables. We prove that for a wide class of planted problems, including refuting random constraint satisfaction problems, tensor and sparse PCA, densest-k-subgraph, community detection in stochastic block models, planted clique, and others, eigenvalues of degree-d matrix polynomials are as powerful as SoS semidefinite programs of roughly degree d. For such problems it is therefore always possible to match the guarantees of SoS without solving a large semidefinite program. Using related ideas on SoS algorithms and low-degree matrix polynomials (and inspired by recent work on SoS and the planted clique problem by Barak et al.), we prove new nearly-tight SoS lower bounds for the tensor and sparse principal component analysis problems. Our lower bounds for sparse principal component analysis are the first to suggest that going beyond existing algorithms for this problem may require sub-exponential time.
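
    The abstract's central object, a matrix whose entries are low-degree polynomials of the input, can be illustrated on tensor PCA. The sketch below is my own toy example, not taken from the paper: it detects a planted rank-one spike in a random 3-tensor via the top eigenvalue of a matrix with degree-2 polynomial entries (the standard flattening baseline), with n and the signal strength lam chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, lam = 30, 60.0   # dimension and an illustrative signal strength

def top_eig_stat(T):
    """Flatten T to an n x n^2 matrix M; the entries of M @ M.T are
    degree-2 polynomials of the input. Return the top eigenvalue."""
    M = T.reshape(n, n * n)
    return np.linalg.eigvalsh(M @ M.T)[-1]

# Null input: pure Gaussian noise.
noise = rng.standard_normal((n, n, n))

# Planted input: rank-one spike lam * v^(x)3 plus the same kind of noise.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
spike = lam * np.einsum('i,j,k->ijk', v, v, v)
planted = spike + rng.standard_normal((n, n, n))

print(top_eig_stat(noise), top_eig_stat(planted))
# When lam is large enough relative to n, the planted statistic separates
# from the null, giving a detection algorithm that needs only an eigenvalue
# computation rather than a large semidefinite program.
```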

    Third Powers of Quadratics are generically Identifiable up to quadratic Rank

    Full text link
    We consider the inverse problem for the map $(S^2(\mathbb{C}^n))^m \to S^6(\mathbb{C}^n)$, $(q_1, \ldots, q_m) \mapsto \sum_{i=1}^m q_i^3$ $(n, m \in \mathbb{N})$, which captures the moment problem for mixtures of centered Gaussians in the smallest interesting degree. We show that for any $n \in \mathbb{N}$, this map is generically one-to-one (up to permutations of $q_1, \ldots, q_m$) as long as $m \le \binom{n}{2} + 1$, thus proving generic identifiability for mixtures of centered Gaussians from their (exact) moments of degree at most $6$ up to rank $\binom{n}{2} + 1$. We rely on the study of tangent spaces of secant varieties and the contact locus. (Comment: 14 pages. Code for the base case computations can be found on GitHub.)
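
    The tangent-space side of this argument can be probed numerically. The sketch below is my own illustration, not the paper's code: it forms the Jacobian of the map $(q_1, \ldots, q_m) \mapsto \sum_i q_i^3$ at a random point and checks that it has full column rank, which certifies only that the map is generically finite-to-one; the paper's contact-locus argument is what upgrades this to genuine generic one-to-one identifiability.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(2)
n = 4
m = comb(n, 2) + 1            # the paper's rank bound: (n choose 2) + 1
d = n * (n + 1) // 2          # parameters per quadratic, dim S^2(C^n)

# Random symmetric matrices representing the quadratics q_i(x) = x^T Q_i x.
Qs = []
for _ in range(m):
    B = rng.standard_normal((n, n))
    Qs.append((B + B.T) / 2)

# Index pairs (a, b) with a <= b parametrize a symmetric matrix.
pairs = [(a, b) for a in range(n) for b in range(a, n)]

# The derivative of sum_i q_i(x)^3 w.r.t. (Q_i)_{ab} is 3 q_i(x)^2 x_a x_b,
# doubled off the diagonal because (Q_i)_{ab} = (Q_i)_{ba}.  Evaluating at
# many random points x determines the degree-6 output form generically.
N = 4 * comb(n + 5, 6)        # comfortably more points than dim S^6(C^n)
X = rng.standard_normal((N, n))
J = np.zeros((N, m * d))
for r in range(N):
    x = X[r]
    for i, Q in enumerate(Qs):
        q = x @ Q @ x
        for c, (a, b) in enumerate(pairs):
            w = 1.0 if a == b else 2.0
            J[r, i * d + c] = 3 * q**2 * w * x[a] * x[b]

# Full column rank m * d at a random (hence generic) point is expected.
print(np.linalg.matrix_rank(J), m * d)
```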