Guaranteed Rank Minimization via Singular Value Projection
Minimizing the rank of a matrix subject to affine constraints is a
fundamental problem with many important applications in machine learning and
statistics. In this paper we propose SVP (Singular Value Projection), a simple
and fast algorithm for rank minimization with affine constraints (ARMP), and
show that SVP recovers the minimum-rank solution for affine constraints that
satisfy the "restricted isometry property"; we also establish the robustness of
our method to noise. Our results improve upon recent breakthroughs by Recht,
Fazel and Parrilo (RFP07) and Lee and Bresler (LB09) in three significant ways:
1) our method (SVP) is significantly simpler to analyze and easier to
implement,
2) we give recovery guarantees under strictly weaker isometry assumptions, and
3) we give geometric convergence guarantees for SVP even in the presence of
noise.
Moreover, as demonstrated empirically, SVP is significantly faster on
real-world and synthetic problems.
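As context for the projected-gradient idea the abstract describes, here is a minimal sketch of SVP specialized to matrix completion: a gradient step on the observed entries followed by a truncated-SVD projection back onto the rank-r matrices. The step size, iteration count, and problem sizes below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def svp(b, mask, rank, step, iters=200):
    """Sketch of Singular Value Projection for matrix completion
    (a special case of ARMP): projected gradient descent in which
    each iterate is projected onto the set of rank-`rank` matrices
    by a truncated SVD. Step size and iteration count are
    illustrative choices, not taken from the paper."""
    X = np.zeros_like(b)
    for _ in range(iters):
        grad = mask * (X - b)  # gradient of 0.5 * ||P_Omega(X) - b||_F^2
        U, s, Vt = np.linalg.svd(X - step * grad, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # rank-r projection
    return X

# Usage: recover a 30x30 rank-2 matrix from ~60% of its entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(M.shape) < 0.6
X_hat = svp(mask * M, mask, rank=2, step=1.0 / mask.mean())
```

The 1/p step scaling (with p the sampling rate) compensates for the fact that the gradient only sees the observed fraction of entries.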
In addition, we address the practically important problem of low-rank matrix
completion (MCP), which can be seen as a special case of ARMP. We empirically
demonstrate that our algorithm recovers low-rank incoherent matrices from an
almost optimal number of uniformly sampled entries. We make partial progress
towards proving exact recovery and provide some intuition for the strong
performance of SVP applied to matrix completion by showing a more restricted
isometry property. Our algorithm outperforms existing methods, such as those of
\cite{RFP07,CR08,CT09,CCS08,KOM09,LB09}, for ARMP and the matrix-completion
problem by an order of magnitude and is also significantly more robust to
noise.
Comment: An earlier version of this paper was submitted to NIPS-2009 on June 5, 2009.
Improving compressed sensing with the diamond norm
In low-rank matrix recovery, one aims to reconstruct a low-rank matrix from a
minimal number of linear measurements. Within the paradigm of compressed
sensing, this is made computationally efficient by minimizing the nuclear norm
as a convex surrogate for rank.
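The nuclear-norm surrogate mentioned above is commonly minimized with a proximal-gradient (singular value thresholding) iteration. The sketch below illustrates this for matrix completion; the regularization weight, step size, and iteration count are illustrative assumptions, not choices made in the paper.

```python
import numpy as np

def nuclear_norm_completion(b, mask, lam=0.1, step=1.0, iters=300):
    """Proximal-gradient sketch of nuclear-norm-regularized matrix
    completion: min_X 0.5 * ||P_Omega(X) - b||_F^2 + lam * ||X||_*.
    The proximal map of the nuclear norm soft-thresholds the
    singular values. lam, step, and iters are illustrative."""
    X = np.zeros_like(b)
    for _ in range(iters):
        G = X - step * (mask * (X - b))      # gradient step on the data-fit term
        U, s, Vt = np.linalg.svd(G, full_matrices=False)
        s = np.maximum(s - step * lam, 0.0)  # soft-threshold singular values
        X = (U * s) @ Vt
    return X

# Usage: complete a low-rank matrix via the convex surrogate.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(M.shape) < 0.6
X_hat = nuclear_norm_completion(mask * M, mask)
```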
In this work, we identify an improved regularizer based on the so-called
diamond norm, a concept imported from quantum information theory. We show that,
for a class of matrices saturating a certain norm inequality, the descent cone
of the diamond norm is contained in that of the nuclear norm. This suggests
superior reconstruction properties for these matrices. We explicitly
characterize this set of matrices. Moreover, we demonstrate numerically that
the diamond norm indeed outperforms the nuclear norm in a number of relevant
applications: these include signal analysis tasks such as blind matrix
deconvolution or the retrieval of certain unitary basis changes, as well as the
quantum information problem of process tomography with random measurements.
The diamond norm is defined for matrices that can be interpreted as order-4
tensors and it turns out that the above condition depends crucially on that
tensorial structure. In this sense, this work touches on an aspect of the
notoriously difficult tensor completion problem.
Comment: 25 pages + Appendix, 7 Figures, published version.
A Deterministic Theory for Exact Non-Convex Phase Retrieval
In this paper, we analyze the non-convex framework of Wirtinger Flow (WF) for
phase retrieval and identify a novel sufficient condition for universal exact
recovery through the lens of low rank matrix recovery theory. Via a perspective
in the lifted domain, we show that the convergence of the WF iterates to a true
solution is attained geometrically under a single condition on the lifted
forward model. As a result, a deterministic relationship between the accuracy
of spectral initialization and the validity of the regularity condition is
derived. In particular, we determine that a certain concentration property on
the spectral matrix must hold uniformly with a sufficiently tight constant.
This culminates in a sufficient condition that is equivalent to a restricted
isometry-type property over rank-1, positive semi-definite matrices, and
amounts to a less stringent requirement on the lifted forward model than those
of prominent low-rank-matrix-recovery methods in the literature. We
characterize the performance limits of our framework in terms of the tightness
of the concentration property via novel bounds on the convergence rate and on
the signal-to-noise ratio such that the theoretical guarantees are valid using
the spectral initialization at the proper sample complexity.
Comment: In revision for IEEE Transactions on Signal Processing.
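To make the "spectral initialization plus gradient iterates" pipeline concrete, here is a minimal real-valued Wirtinger Flow sketch: the initializer is the leading eigenvector of the measurement-weighted covariance, followed by gradient descent on the quartic loss. The step size, iteration count, and sampling model are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def wirtinger_flow(A, y, step=0.1, iters=500):
    """Minimal real-valued Wirtinger Flow sketch: spectral
    initialization followed by gradient descent on the quartic loss
    f(x) = (1/4m) * sum_k ((a_k^T x)^2 - y_k)^2. The step size and
    iteration count are illustrative, not from the paper."""
    m, n = A.shape
    # Spectral initialization: leading eigenvector of (1/m) sum_k y_k a_k a_k^T,
    # scaled to match the average measurement energy.
    Y = (A.T * y) @ A / m
    eigvals, eigvecs = np.linalg.eigh(Y)
    x = eigvecs[:, -1] * np.sqrt(y.mean())
    for _ in range(iters):
        Ax = A @ x
        x = x - step * (A.T @ ((Ax ** 2 - y) * Ax)) / m  # gradient of f
    return x

# Usage: recover a unit vector from m = 8n quadratic measurements (up to sign).
rng = np.random.default_rng(2)
n, m = 20, 160
x_true = rng.standard_normal(n)
x_true /= np.linalg.norm(x_true)
A = rng.standard_normal((m, n))
y = (A @ x_true) ** 2
x_hat = wirtinger_flow(A, y)
```

Since the measurements only observe squared inner products, the signal is recoverable only up to a global sign, which is why any check must compare against both x_true and -x_true.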