Progress in the mathematical theory of quantum disordered systems
We review recent progress in the mathematical theory of quantum disordered
systems: the Anderson transition (joint work with Domingos Marchetti), the
(quantum and classical) Edwards-Anderson (EA) spin glass model and return to
equilibrium for a class of spin glass models, which includes the EA model
initially in a very large transverse magnetic field. Comment: 25 pages, 1 figure, based on lectures at Bressanone and G\"ottingen.
Dictionary Learning and Tensor Decomposition via the Sum-of-Squares Method
We give a new approach to the dictionary learning (also known as "sparse
coding") problem of recovering an unknown $n \times m$ matrix $A$ (for $m \geq n$) from examples of the form $y = Ax + e$, where $x$ is a random vector in
$\mathbb{R}^m$ with at most $\tau m$ nonzero coordinates, and $e$ is a random
noise vector in $\mathbb{R}^n$ with bounded magnitude. For the case $m = O(n)$,
our algorithm recovers every column of $A$ within arbitrarily good constant
accuracy in time $m^{O(\log m/\log(\tau^{-1}))}$, in particular achieving
polynomial time if $\tau = m^{-\delta}$ for any $\delta > 0$, and time $m^{O(\log m)}$ if $\tau$ is (a sufficiently small) constant. Prior algorithms with
comparable assumptions on the distribution required the vector $x$ to be much
sparser---at most $O(\sqrt{n})$ nonzero coordinates---and there were intrinsic
barriers preventing these algorithms from applying for denser $x$.
We achieve this by designing an algorithm for noisy tensor decomposition that
can recover, under quite general conditions, an approximate rank-one
decomposition of a tensor $T$, given access to a tensor $T'$ that is
$\tau$-close to $T$ in the spectral norm (when considered as a matrix). To our
knowledge, this is the first algorithm for tensor decomposition that works in
the constant spectral-norm noise regime, where there is no guarantee that the
local optima of $T$ and $T'$ have similar structures.
Our algorithm is based on a novel approach to using and analyzing the Sum of
Squares semidefinite programming hierarchy (Parrilo 2000, Lasserre 2001), and
it can be viewed as an indication of the utility of this very general and
powerful tool for unsupervised learning problems.
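The generative model in the abstract is easy to simulate. The sketch below draws examples $y = Ax + e$ with a sparse coefficient vector $x$; the sizes ($n=64$, $m=128$), the sparsity level $\tau = 0.1$, the symmetric $\pm 1$ coefficient distribution, and the noise scale are all illustrative assumptions, not the paper's exact setting.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, tau = 64, 128, 0.1           # illustrative sizes and sparsity
A = rng.standard_normal((n, m))
A /= np.linalg.norm(A, axis=0)     # unit-norm dictionary columns

def sample(num_examples):
    """Draw examples y = A x + e, where x has at most tau*m nonzero
    coordinates and e is small bounded-magnitude noise."""
    k = int(tau * m)
    Y = np.empty((n, num_examples))
    for t in range(num_examples):
        x = np.zeros(m)
        support = rng.choice(m, size=k, replace=False)
        x[support] = rng.choice([-1.0, 1.0], size=k)  # symmetric coefficients
        e = 0.01 * rng.uniform(-1.0, 1.0, size=n)     # bounded noise
        Y[:, t] = A @ x + e
    return Y

Y = sample(1000)
```

Recovering $A$ from `Y` alone is the hard part; the paper's sum-of-squares algorithm does so even at this constant relative sparsity, where earlier algorithms required far sparser $x$.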
Efficient Algorithms for Sparse Moment Problems without Separation
We consider the sparse moment problem of learning a $k$-spike mixture in
high-dimensional space from its noisy moment information in any dimension. We
measure the accuracy of the learned mixtures using transportation distance.
Previous algorithms either assume certain separation conditions, require more
moments for recovery, or run in (super-)exponential time. Our algorithm for the
one-dimensional problem (also called the sparse Hausdorff moment problem) is a
robust version of the classic Prony's method, and our contribution mainly lies
in the analysis. We adopt a global and much tighter analysis than previous work
(which analyzes the perturbation of the intermediate results of Prony's
method). A useful technical ingredient is a connection between the linear
system defined by the Vandermonde matrix and the Schur polynomial, which allows
us to provide a tight perturbation bound independent of the separation and may be
useful in other contexts. To tackle the high-dimensional problem, we first
solve the two-dimensional problem by extending the one-dimensional algorithm
and analysis to complex numbers. Our algorithm for the high-dimensional case
determines the coordinates of each spike by aligning a 1d projection of the
mixture to a random vector and a set of 2d projections of the mixture. Our
results have applications to learning topic models and Gaussian mixtures,
implying improved sample complexity results or running time over prior work.
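The abstract builds on classic Prony's method, whose noiseless version is short enough to sketch: the moments of a $k$-spike mixture satisfy a length-$k$ linear recurrence whose characteristic polynomial has the spike locations as roots. The function name and example spikes below are illustrative; the paper's contribution is the robust perturbation analysis, not this exact recovery step.

```python
import numpy as np

def prony(moments, k):
    """Recover k spike locations and weights from the first 2k exact
    moments m_j = sum_i w_i * a_i**j (noiseless classic Prony)."""
    m = np.asarray(moments, dtype=float)
    # The moments obey the recurrence
    #   m_{j+k} + c_{k-1} m_{j+k-1} + ... + c_0 m_j = 0,
    # where x^k + c_{k-1} x^{k-1} + ... + c_0 has the spikes as roots.
    H = np.array([[m[j + l] for l in range(k)] for j in range(k)])  # Hankel
    c = np.linalg.solve(H, -m[k:2 * k])
    locs = np.sort(np.roots(np.concatenate(([1.0], c[::-1]))).real)
    # Weights solve the Vandermonde system V w = (m_0, ..., m_{k-1}).
    V = np.vander(locs, k, increasing=True).T  # V[j, i] = locs[i]**j
    weights = np.linalg.solve(V, m[:k])
    return locs, weights

# Illustrative example: three spikes, exact moments.
a_true = np.array([0.2, 0.5, 0.9])
w_true = np.array([0.3, 0.5, 0.2])
moments = [np.sum(w_true * a_true ** j) for j in range(6)]
locs, wts = prony(moments, 3)
```

With noisy moments, the Hankel and Vandermonde solves become ill-conditioned when spikes are close, which is exactly why the paper's separation-free perturbation bounds (via the Schur-polynomial connection) are needed.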
Polynomial-time Tensor Decompositions with Sum-of-Squares
We give new algorithms based on the sum-of-squares method for tensor
decomposition. Our results improve the best known running times from
quasi-polynomial to polynomial for several problems, including decomposing
random overcomplete 3-tensors and learning overcomplete dictionaries with
constant relative sparsity. We also give the first robust analysis for
decomposing overcomplete 4-tensors in the smoothed analysis model. A key
ingredient of our analysis is to establish small spectral gaps in moment
matrices derived from solutions to sum-of-squares relaxations. To enable this
analysis we augment sum-of-squares relaxations with spectral analogs of maximum
entropy constraints. Comment: to appear in FOCS 2016.
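For contrast with the sum-of-squares machinery in the abstract, the easy regime of tensor decomposition (orthogonal components, no noise, rank below the dimension) is already handled by classical tensor power iteration, sketched below with illustrative sizes. The overcomplete and noisy regimes the paper addresses are precisely where this simple local method breaks down.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 20, 5
# Orthonormal components: the easy, noiseless, undercomplete case.
Q = np.linalg.qr(rng.standard_normal((n, r)))[0]
T = np.einsum('ia,ja,ka->ijk', Q, Q, Q)  # T = sum_a q_a (x) q_a (x) q_a

def tensor_power_iteration(T, iters=100):
    """Converge to one component of an orthogonally decomposable 3-tensor."""
    u = rng.standard_normal(T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, u, u)  # u <- T(I, u, u)
        u /= np.linalg.norm(u)
    return u

u = tensor_power_iteration(T)
overlap = np.max(np.abs(Q.T @ u))  # should be close to 1
```

The iteration converges (quadratically, for generic initialization) to the component whose initial correlation is largest; with overcomplete or spectrally noisy tensors, local optima need not correspond to components at all, which is the regime the SoS relaxations above are designed for.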