Convergence of fixed-point continuation algorithms for matrix rank minimization
The matrix rank minimization problem has applications in many fields such as
system identification, optimal control, low-dimensional embedding, etc. As this
problem is NP-hard in general, its convex relaxation, the nuclear norm
minimization problem, is often solved instead. Recently, Ma, Goldfarb and Chen
proposed a fixed-point continuation algorithm for solving the nuclear norm
minimization problem. By incorporating an approximate singular value
decomposition technique into this algorithm, one usually obtains a solution to
the matrix rank minimization problem. In this paper, we study the
convergence/recoverability properties of the fixed-point continuation algorithm
and its variants for matrix rank minimization. Heuristics for determining the
rank of the matrix when its true rank is not known are also proposed. Some of
these algorithms are closely related to greedy algorithms in compressed
sensing. Numerical results for these algorithms for solving affinely
constrained matrix rank minimization problems are reported.
Comment: Conditions on the RIP constant for approximate recovery are improved.
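The fixed-point continuation scheme referenced above alternates a gradient step on the data-fidelity term with singular value soft-thresholding (the proximal operator of the nuclear norm). A minimal NumPy sketch, assuming the affine map is given explicitly as a matrix A acting on vec(X); the function names are illustrative, not from the paper:

```python
import numpy as np

def svt(Y, tau):
    """Singular value soft-thresholding: prox of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def fpc_step(X, A, b, mu, tau):
    """One fixed-point iteration for min_X mu*||X||_* + 0.5*||A vec(X) - b||^2:
    a gradient step on the quadratic term, then shrinkage of singular values."""
    m, n = X.shape
    grad = (A.T @ (A @ X.ravel() - b)).reshape(m, n)  # gradient of the data term
    return svt(X - tau * grad, tau * mu)
```

Iterating fpc_step while decreasing mu along a continuation path gives the basic scheme; replacing the exact SVD with an approximate one yields the variant the paper studies.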
Randomized Low-Memory Singular Value Projection
Affine rank minimization algorithms typically rely on calculating the
gradient of a data error followed by a singular value decomposition at every
iteration. Because these two steps are expensive, heuristic approximations are
often used to reduce computational burden. To this end, we propose a recovery
scheme that merges the two steps with randomized approximations, and as a
result, operates on space proportional to the degrees of freedom in the
problem. We theoretically establish the estimation guarantees of the algorithm
as a function of approximation tolerance. While the theoretical approximation
requirements are overly pessimistic, we demonstrate that in practice the
algorithm performs well on the quantum tomography recovery problem.
Comment: 13 pages. This version has a revised theorem and new numerical
experiments.
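The per-iteration SVD that the abstract identifies as a bottleneck can be replaced by a randomized approximation that only sketches the matrix's range. A generic randomized-SVD sketch in NumPy (a Halko-Martinsson-Tropp-style range finder, offered as an illustration rather than the authors' exact scheme):

```python
import numpy as np

def randomized_svd(Y, rank, n_oversample=10, seed=None):
    """Approximate truncated SVD: sketch the range of Y with a Gaussian
    test matrix, then do a small exact SVD on the projected problem."""
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    k = min(rank + n_oversample, min(m, n))
    Omega = rng.standard_normal((n, k))    # random test matrix
    Q, _ = np.linalg.qr(Y @ Omega)         # orthonormal basis for range(Y @ Omega)
    B = Q.T @ Y                            # small (k x n) projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]
```

The memory footprint is dominated by the k-column sketch rather than a full factorization, which is what makes such approximations attractive inside iterative recovery schemes.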
Guaranteed Rank Minimization via Singular Value Projection
Minimizing the rank of a matrix subject to affine constraints is a
fundamental problem with many important applications in machine learning and
statistics. In this paper we propose a simple and fast algorithm, SVP (Singular
Value Projection), for rank minimization with affine constraints (ARMP), and
show that SVP recovers the minimum-rank solution for affine constraints that
satisfy the "restricted isometry property"; we also show that our method is
robust to noise.
Our results improve upon a recent breakthrough by Recht, Fazel and Parrilo
(RFP07) and Lee and Bresler (LB09) in three significant ways:
1) our method (SVP) is significantly simpler to analyze and easier to
implement,
2) we give recovery guarantees under strictly weaker isometry assumptions, and
3) we give geometric convergence guarantees for SVP even in the presence of
noise.
Moreover, as demonstrated empirically, SVP is significantly faster on
real-world and synthetic problems.
In addition, we address the practically important problem of low-rank matrix
completion (MCP), which can be seen as a special case of ARMP. We empirically
demonstrate that our algorithm recovers low-rank incoherent matrices from an
almost optimal number of uniformly sampled entries. We make partial progress
towards proving exact recovery and provide some intuition for the strong
performance of SVP applied to matrix completion by showing a more restricted
isometry property. Our algorithm outperforms existing methods, such as those of
\cite{RFP07,CR08,CT09,CCS08,KOM09,LB09}, for ARMP and the matrix-completion
problem by an order of magnitude and is also significantly more robust to
noise.
Comment: An earlier version of this paper was submitted to NIPS-2009 on June
5, 200
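SVP is projected gradient descent: take a gradient step on the least-squares data term, then project onto the set of rank-r matrices by keeping the top r singular triplets. A minimal sketch, assuming the affine constraints are supplied as a matrix A acting on vec(X) and using a fixed step size (both illustrative choices):

```python
import numpy as np

def svp(A, b, shape, rank, step, n_iter=100):
    """Singular Value Projection for
    min_X 0.5*||A vec(X) - b||^2  subject to  rank(X) <= rank."""
    m, n = shape
    X = np.zeros((m, n))
    for _ in range(n_iter):
        G = (A.T @ (A @ X.ravel() - b)).reshape(m, n)    # gradient of data term
        U, s, Vt = np.linalg.svd(X - step * G, full_matrices=False)
        X = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]  # best rank-r approximation
    return X
```

Under restricted isometry assumptions like those in the abstract, a suitable fixed step size makes this iteration contract geometrically toward the minimum-rank solution.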
PEAR: PEriodic And fixed Rank separation for fast fMRI
In functional MRI (fMRI), faster acquisition via undersampling of data can
improve the spatial-temporal resolution trade-off and increase statistical
robustness through increased degrees-of-freedom. High quality reconstruction of
fMRI data from undersampled measurements requires proper modeling of the data.
We present an fMRI reconstruction approach based on modeling the fMRI signal as
a sum of periodic and fixed rank components, for improved reconstruction from
undersampled measurements. We decompose the fMRI signal into a component which
has a fixed rank and a component consisting of a sum of periodic signals which
is sparse in the temporal Fourier domain. Data reconstruction is performed by
solving a constrained problem that enforces a fixed, moderate rank on one of
the components, and a limited number of temporal frequencies on the other. Our
approach is coined PEAR - PEriodic And fixed Rank separation for fast fMRI.
Experimental results include purely synthetic simulation, a simulation with
real timecourses and retrospective undersampling of a real fMRI dataset.
Evaluation was performed both quantitatively and visually versus ground truth,
comparing PEAR to two additional recent methods for fMRI reconstruction from
undersampled measurements. Results demonstrate PEAR's improvement in estimating
the timecourses and activation maps versus the methods compared against at
acceleration ratios of R=8,16 (for simulated data) and R=6.66,10 (for real
data). PEAR results in reconstruction with higher fidelity than when using a
fixed-rank based model or a conventional Low-rank+Sparse algorithm. We have
shown that splitting the functional information between the components leads to
better modeling of fMRI than state-of-the-art methods.
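Setting aside the undersampled acquisition and data-consistency constraint (which are central to the actual method), the two-component model can be illustrated by alternating projections on a voxels-by-frames matrix: a best fixed-rank approximation for one component, and retention of a few temporal-Fourier coefficients for the other. The names and the alternating scheme below are illustrative assumptions, not the paper's solver:

```python
import numpy as np

def _keep_largest(F, n_keep):
    """Zero all but the n_keep largest-magnitude entries of F."""
    if n_keep >= F.size:
        return F
    thresh = np.partition(np.abs(F).ravel(), -n_keep)[-n_keep]
    return np.where(np.abs(F) >= thresh, F, 0)

def periodic_plus_rank_split(Y, rank, n_keep, n_iter=20):
    """Toy split of Y (voxels x frames) into a fixed-rank part L and a
    part P that is sparse in the temporal Fourier domain."""
    L = np.zeros_like(Y)
    P = np.zeros_like(Y)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Y - P, full_matrices=False)
        L = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]       # rank-r projection
        F = _keep_largest(np.fft.fft(Y - L, axis=1), n_keep)  # few temporal frequencies
        P = np.real(np.fft.ifft(F, axis=1))
    return L, P
```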
Robust Principal Component Analysis on Graphs
Principal Component Analysis (PCA) is the most widely used tool for linear
dimensionality reduction and clustering. However, it is highly sensitive to
outliers and does not scale well with the number of data samples.
Robust PCA solves the first issue with a sparse penalty term. The second issue
can be handled with the matrix factorization model, which is however
non-convex. Besides, PCA based clustering can also be enhanced by using a graph
of data similarity. In this article, we introduce a new model called "Robust
PCA on Graphs" which incorporates spectral graph regularization into the Robust
PCA framework. Our proposed model benefits from 1) the robustness of principal
components to occlusions and missing values, 2) enhanced low-rank recovery, 3)
improved clustering property due to the graph smoothness assumption on the
low-rank matrix, and 4) convexity of the resulting optimization problem.
Extensive experiments on 8 benchmark, 3 video and 2 artificial datasets with
corruptions clearly reveal that our model outperforms 10 other state-of-the-art
models on clustering and low-rank recovery tasks.
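Assuming the common formulation of such models (a nuclear norm on the low-rank part, an l1 penalty on sparse outliers, and a graph-Laplacian smoothness term; the exact weights, variable names, and which side the Laplacian acts on are illustrative assumptions here), the convex objective can be evaluated as:

```python
import numpy as np

def rpcag_objective(L, S, Lap, lam, gamma):
    """Objective value ||L||_* + lam*||S||_1 + gamma*tr(L Lap L^T)
    for a split X = L + S, with graph Laplacian Lap over the samples."""
    nuclear = np.linalg.svd(L, compute_uv=False).sum()  # low-rank-promoting term
    sparse = np.abs(S).sum()                            # robustness to outliers
    smooth = np.trace(L @ Lap @ L.T)                    # graph smoothness of L
    return nuclear + lam * sparse + gamma * smooth
```

Because all three terms are convex in (L, S), the overall problem is convex, which is the fourth benefit the abstract lists.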