Stable low-rank matrix recovery via null space properties
The problem of recovering a matrix of low rank from an incomplete and possibly noisy set of linear measurements arises in a number of areas. In order to derive rigorous recovery results, the measurement map is usually modeled probabilistically. We derive sufficient conditions on the minimal number of measurements ensuring recovery via convex optimization. We establish our results via certain properties of the null space of the measurement map. In the setting where the measurements are realized as Frobenius inner products with independent standard Gaussian random matrices, we show that $m \geq 10\, r (n_1 + n_2)$ measurements are enough to uniformly and stably recover an $n_1 \times n_2$ matrix of rank at most $r$. We then significantly generalize this result by only requiring independent mean-zero, variance-one entries with four finite moments, at the cost of replacing the constant $10$ by some universal constant. We also study the case of recovering Hermitian rank-$r$ matrices from measurement matrices proportional to rank-one projectors. For $m \geq C r n$ rank-one projective measurements onto independent standard Gaussian vectors, we show that nuclear norm minimization uniformly and stably reconstructs Hermitian rank-$r$ matrices with high probability. Next, we partially de-randomize this by establishing an analogous statement for projectors onto independent elements of a complex projective 4-design, at the cost of a slightly higher sampling rate $m \geq C r n \log n$. Moreover, if the Hermitian matrix to be recovered is known to be positive semidefinite, then we show that the nuclear norm minimization approach may be replaced by minimizing the $\ell_2$-norm of the residual subject to the positive semidefinite constraint. No estimate of the noise level is then required a priori. We discuss applications in quantum physics and the phase retrieval problem.
Comment: 26 pages
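As a concrete illustration of the convex recovery program this abstract analyzes (a minimal sketch, not the authors' code), the following Python example solves noise-constrained nuclear norm minimization from Gaussian Frobenius-product measurements. The problem sizes, the noise bound eta, and the use of CVXPY are assumptions made for the example.

import numpy as np
import cvxpy as cp

# Example sizes (assumed). With the abstract's rate m >= 10*r*(n1+n2),
# small instances are nearly saturated, so we use r = 1 to stay undersampled.
n1, n2, r = 30, 30, 1
m = 10 * r * (n1 + n2)                     # 600 measurements < 900 unknowns

rng = np.random.default_rng(0)
X0 = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
A = rng.standard_normal((m, n1 * n2))      # row i holds vec(A_i)
noise = 0.01 * rng.standard_normal(m)
# CVXPY's vec() stacks columns (Fortran order), so match it on the data side.
y = A @ X0.ravel(order="F") + noise
eta = np.linalg.norm(noise)                # noise bound, assumed known here

# min ||X||_* subject to ||A(X) - y||_2 <= eta
X = cp.Variable((n1, n2))
prob = cp.Problem(cp.Minimize(cp.normNuc(X)),
                  [cp.norm(A @ cp.vec(X) - y, 2) <= eta])
prob.solve()
print("relative error:", np.linalg.norm(X.value - X0) / np.linalg.norm(X0))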
Sparse recovery on Euclidean Jordan algebras
This paper is concerned with the problem of sparse recovery on Euclidean Jordan algebras (SREJA), which includes the sparse signal recovery problem and the low-rank symmetric matrix recovery problem as special cases. We introduce the notions of the restricted isometry property (RIP), the null space property (NSP), and s-goodness for linear transformations in SREJA, all of which provide sufficient conditions for s-sparse recovery via nuclear norm minimization on Euclidean Jordan algebras. Moreover, we show that both s-goodness and the NSP are necessary and sufficient conditions for exact s-sparse recovery via nuclear norm minimization on Euclidean Jordan algebras. Applying these characterizations, we establish exact and stable recovery results for solving SREJA problems via nuclear norm minimization.
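For orientation, both abstracts above revolve around null space properties; one standard form of the stable rank null space property of order $s$ (textbook notation assumed here, not either paper's exact statement) reads:

\[
  \|M_s\|_* \;\le\; \rho\, \|M - M_s\|_*
  \qquad \text{for all } M \in \ker(\mathcal{A}) \setminus \{0\},
\]

where $\mathcal{A}$ is the measurement map, $M_s$ is a best rank-$s$ approximation of $M$, and $0 < \rho < 1$. Under this condition, every matrix of rank at most $s$ is the unique nuclear norm minimizer among all matrices consistent with its measurements.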
Augmented L1 and Nuclear-Norm Models with a Globally Linearly Convergent Algorithm
This paper studies the long-existing idea of adding a nice smooth function to "smooth" a non-differentiable objective function in the context of sparse optimization, in particular, the minimization of $\|x\|_1 + \frac{1}{2\alpha}\|x\|_2^2$, where $x$ is a vector, as well as the minimization of $\|X\|_* + \frac{1}{2\alpha}\|X\|_F^2$, where $X$ is a matrix and $\|X\|_*$ and $\|X\|_F$ are the nuclear and Frobenius norms of $X$, respectively. We show that they can efficiently recover sparse vectors and low-rank matrices. In particular, they enjoy exact and stable recovery guarantees similar to those known for minimizing $\|x\|_1$ and $\|X\|_*$ under conditions on the sensing operator such as its null-space property, restricted isometry property, spherical section property, or RIPless property. To recover a (nearly) sparse vector $x^0$, minimizing $\|x\|_1 + \frac{1}{2\alpha}\|x\|_2^2$ returns (nearly) the same solution as minimizing $\|x\|_1$ almost whenever $\alpha \geq 10\|x^0\|_\infty$. The same relation also holds between minimizing $\|X\|_* + \frac{1}{2\alpha}\|X\|_F^2$ and minimizing $\|X\|_*$ for recovering a (nearly) low-rank matrix $X^0$, if $\alpha \geq 10\|X^0\|_2$. Furthermore, we show that the linearized Bregman algorithm for minimizing $\|x\|_1 + \frac{1}{2\alpha}\|x\|_2^2$ subject to $Ax = b$ enjoys global linear convergence as long as a nonzero solution exists, and we give an explicit rate of convergence. The convergence property requires neither a sparse solution nor any properties of $A$. To our knowledge, this is the best known global convergence result for first-order sparse optimization algorithms.
Comment: arXiv admin note: text overlap with arXiv:1207.5326 by other authors
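Since the abstract's convergence claim concerns the linearized Bregman algorithm, here is a minimal NumPy sketch of that iteration for min ||x||_1 + (1/(2*alpha))||x||_2^2 subject to Ax = b, written as gradient ascent on the dual. The step size rule, stopping test, and synthetic instance are assumptions for the example, not the paper's exact parameters.

import numpy as np

def shrink(v, t):
    """Soft-thresholding: componentwise sign(v) * max(|v| - t, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def linearized_bregman(A, b, alpha, n_iters=20000, tol=1e-10):
    """Linearized Bregman iteration for
    min ||x||_1 + 1/(2*alpha)*||x||_2^2  s.t.  A x = b.
    The dual gradient is (alpha*||A||^2)-Lipschitz, so the assumed
    step size h below is a safe (not necessarily optimal) choice."""
    m, n = A.shape
    h = 1.0 / (alpha * np.linalg.norm(A, 2) ** 2)
    v = np.zeros(n)                      # accumulates A^T times residuals
    x = np.zeros(n)
    for _ in range(n_iters):
        v = v + h * (A.T @ (b - A @ x))
        x = alpha * shrink(v, 1.0)       # primal iterate from dual variable
        if np.linalg.norm(A @ x - b) <= tol * np.linalg.norm(b):
            break
    return x

# Small synthetic instance (assumed sizes); alpha >= 10*||x0||_inf per the abstract.
rng = np.random.default_rng(0)
m, n, s = 60, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
b = A @ x0
x_hat = linearized_bregman(A, b, alpha=10 * np.abs(x0).max())
print("recovery error:", np.linalg.norm(x_hat - x0))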