A Simplified Approach to Recovery Conditions for Low Rank Matrices
Recovering sparse vectors and low-rank matrices from noisy linear
measurements has been the focus of much recent research. Various reconstruction
algorithms have been studied, including l1 and nuclear norm minimization
as well as lp minimization with p < 1. These algorithms are known to
succeed if certain conditions on the measurement map are satisfied. Proofs of
robust recovery for matrices have so far been much more involved than in the
vector case.
In this paper, we show how several robust classes of recovery conditions can
be extended from vectors to matrices in a simple and transparent way, leading
to the best known restricted isometry and nullspace conditions for matrix
recovery. Our results rely on the ability to "vectorize" matrices through the
use of a key singular value inequality.
Comment: 6 pages. This is a modified version of a paper submitted to ISIT 2011; Proc. Intl. Symp. Info. Theory (ISIT), Aug 201
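The abstract above concerns recovery conditions for nuclear norm minimization. As a hedged illustration only (this is a standard computational primitive, not the paper's contribution), the proximal operator of the nuclear norm is singular value thresholding, the workhorse step inside proximal-gradient solvers for low-rank matrix recovery; the function name svt and all parameters here are ours:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of tau * ||.||_*
    (nuclear norm). Soft-thresholds the singular values of M by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_thr = np.maximum(s - tau, 0.0)   # shrink each singular value toward 0
    return U @ np.diag(s_thr) @ Vt

# Thresholding just above the smaller singular value of a rank-2 matrix
# leaves a rank-1 matrix (generically, for a random instance).
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 5))  # rank 2
s = np.linalg.svd(M, compute_uv=False)
X = svt(M, tau=s[1] + 1e-9)  # kills the second singular value
```

Soft-thresholding singular values is the matrix analogue of the scalar soft-thresholding used in l1 minimization, which mirrors the abstract's theme of "vectorizing" matrix arguments.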
Compressed Sensing with Coherent and Redundant Dictionaries
This article presents novel results concerning the recovery of signals from
undersampled data in the common situation where such signals are not sparse in
an orthonormal basis or incoherent dictionary, but in a truly redundant
dictionary. This work thus bridges a gap in the literature and shows not only
that compressed sensing is viable in this context, but also that accurate
recovery is possible via an L1-analysis optimization problem. We introduce a
condition on the measurement/sensing matrix, which is a natural generalization
of the now well-known restricted isometry property, and which guarantees
accurate recovery of signals that are nearly sparse in (possibly) highly
overcomplete and coherent dictionaries. This condition imposes no incoherence
restriction on the dictionary and our results may be the first of this kind. We
discuss practical examples and the implications of our results on those
applications, and complement our study by demonstrating the potential of
L1-analysis for such problems.
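As a minimal sketch of the L1-analysis formulation discussed above (the function name, the Lagrangian variant, and the example dictionary are ours, not taken from the article), the objective penalizes the analysis coefficients D^T x rather than a synthesis representation:

```python
import numpy as np

def l1_analysis_objective(x, A, y, D, lam):
    """Lagrangian form of the L1-analysis problem:
       0.5 * ||A x - y||_2^2 + lam * ||D^T x||_1,
    where D is a (possibly redundant, coherent) dictionary and the
    penalty acts on the analysis coefficients D^T x."""
    residual = A @ x - y
    return 0.5 * residual @ residual + lam * np.abs(D.T @ x).sum()

# For an orthonormal D this reduces to the usual synthesis L1 penalty;
# a redundant D (here, the identity and a random orthonormal basis
# stacked side by side) is where the two models genuinely differ.
rng = np.random.default_rng(0)
n = 8
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
D = np.hstack([np.eye(n), Q])          # 2x-overcomplete dictionary
A = rng.standard_normal((4, n))
x = rng.standard_normal(n)
y = A @ x                              # noiseless measurements of x
val = l1_analysis_objective(x, A, y, D, lam=0.1)
```

Since y = A x exactly here, the data-fit term vanishes and val equals the weighted analysis norm alone, which makes the role of each term easy to check.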
Subspace Methods for Joint Sparse Recovery
We propose robust and efficient algorithms for the joint sparse recovery
problem in compressed sensing, which simultaneously recover the supports of
jointly sparse signals from their multiple measurement vectors obtained through
a common sensing matrix. In a favorable situation, the unknown matrix, which
consists of the jointly sparse signals, has linearly independent nonzero rows.
In this case, the MUSIC (MUltiple SIgnal Classification) algorithm, originally
proposed by Schmidt for the direction of arrival problem in sensor array
processing and later proposed and analyzed for joint sparse recovery by Feng
and Bresler, provides a guarantee with the minimum number of measurements. We
focus instead on the unfavorable but practically significant case of
rank defect or ill-conditioning. This situation arises with a limited number
of measurement vectors, or with highly correlated signal components. In this
case, MUSIC fails, and in practice none of the existing methods can consistently
approach the fundamental limit. We propose subspace-augmented MUSIC (SA-MUSIC),
which improves on MUSIC so that the support is reliably recovered under such
unfavorable conditions. Combined with subspace-based greedy algorithms also
proposed and analyzed in this paper, SA-MUSIC provides a computationally
efficient algorithm with a performance guarantee. The performance guarantees
are given in terms of a version of the restricted isometry property. In
particular, we also present a non-asymptotic perturbation analysis of the
signal subspace estimation that was missing in previous studies of MUSIC.
Comment: submitted to IEEE Transactions on Information Theory, revised version
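To make the favorable case described above concrete, here is a hedged sketch of MUSIC-style support recovery (our own minimal implementation, not the SA-MUSIC algorithm of the paper): when the k nonzero rows of X in Y = A X are linearly independent, the range of Y equals the span of the support columns of A, so support columns have zero distance to the estimated signal subspace:

```python
import numpy as np

def music_support(A, Y, k):
    """MUSIC-style support estimate for joint sparse recovery.
    Assumes the k nonzero rows of X (with Y = A X) are linearly
    independent, so range(Y) = span of the support columns of A."""
    U, s, _ = np.linalg.svd(Y, full_matrices=False)
    Us = U[:, :k]                              # signal subspace basis
    An = A / np.linalg.norm(A, axis=0)         # normalize columns of A
    # distance of each column to the signal subspace
    resid = np.linalg.norm(An - Us @ (Us.T @ An), axis=0)
    return np.sort(np.argsort(resid)[:k])      # k closest columns

# Example: k = 3 jointly sparse signals on support {1, 4, 7},
# observed through a common random sensing matrix.
rng = np.random.default_rng(1)
m, n, k = 12, 20, 3
A = rng.standard_normal((m, n))
support = np.array([1, 4, 7])
X = np.zeros((n, 3))
X[support] = rng.standard_normal((k, 3))       # full-row-rank signal block
Y = A @ X
est = music_support(A, Y, k)
```

In the unfavorable rank-deficient regime the paper targets, rank(Y) < k and this plain subspace test breaks down, which is exactly the gap SA-MUSIC and the subspace-augmented greedy methods are designed to close.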