12 research outputs found
Exact Recovery Conditions for Sparse Representations with Partial Support Information
We address the exact recovery of a k-sparse vector in the noiseless setting
when some partial information on the support is available. This partial
information takes the form of either a subset of the true support or an
approximate subset including wrong atoms as well. We derive a new sufficient
and worst-case necessary (in some sense) condition for the success of some
procedures based on lp-relaxation, Orthogonal Matching Pursuit (OMP) and
Orthogonal Least Squares (OLS). Our result is based on the coherence "mu" of
the dictionary and relaxes the well-known condition mu<1/(2k-1) ensuring the
recovery of any k-sparse vector in the non-informed setup. It reads
mu<1/(2k-g+b-1) when the informed support is composed of g good atoms and b
wrong atoms. We emphasize that our condition is complementary to some
restricted-isometry based conditions by showing that none of them implies the
other.
Because this mutual coherence condition is common to all procedures, we carry
out a finer analysis based on the Null Space Property (NSP) and the Exact
Recovery Condition (ERC). Connections are established regarding the
characterization of lp-relaxation procedures and OMP in the informed setup.
First, we emphasize that the truncated NSP enjoys an ordering property when p
is decreased. Second, the partial ERC for OMP (ERC-OMP) implies in turn the
truncated NSP for the informed l1 problem, and the truncated NSP for p<1.
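The coherence quantity and the informed condition above can be checked numerically. A minimal sketch, assuming a generic random dictionary; `mutual_coherence` and `informed_condition` are illustrative names, not from the paper:

```python
import numpy as np

def mutual_coherence(D):
    # mu: largest absolute inner product between distinct
    # l2-normalized columns (atoms) of the dictionary D.
    Dn = D / np.linalg.norm(D, axis=0)
    G = np.abs(Dn.T @ Dn)
    np.fill_diagonal(G, 0.0)
    return float(G.max())

def informed_condition(mu, k, g, b):
    # Sufficient condition mu < 1/(2k - g + b - 1) from the abstract:
    # g good atoms and b wrong atoms in the informed support.
    # With g = b = 0 it reduces to the classical mu < 1/(2k - 1).
    return mu < 1.0 / (2 * k - g + b - 1)

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
mu = mutual_coherence(D)
print(mu, informed_condition(mu, k=3, g=2, b=0))
```

Knowing g good atoms loosens the bound, while each wrong atom b tightens it again, which matches the abstract's reading of the condition.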
On Probability of Support Recovery for Orthogonal Matching Pursuit Using Mutual Coherence
In this paper we present a new coherence-based performance guarantee for the
Orthogonal Matching Pursuit (OMP) algorithm. A lower bound for the probability
of correctly identifying the support of a sparse signal with additive white
Gaussian noise is derived. Compared to previous work, the new bound takes into
account the signal parameters such as dynamic range, noise variance, and
sparsity. Numerical simulations show significant improvements over previous
work and a closer match to empirically obtained results of the OMP algorithm. Comment: Submitted to IEEE Signal Processing Letters.
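For concreteness, the OMP selection rule analyzed in this line of work can be sketched in the noiseless case (a simplified illustration, not the paper's noisy-setting analysis; `omp` is a hypothetical helper name):

```python
import numpy as np

def omp(D, y, k):
    # Greedy OMP sketch: pick the atom most correlated with the
    # residual, then re-fit by least squares on the selected support.
    support = []
    residual = y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # Orthogonal projection of y onto the span of selected atoms
        # keeps the residual orthogonal to already-chosen columns.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x, sorted(support)
```

In the noisy setting the abstract considers, the fixed iteration count would be replaced by a residual-based stopping rule, and the probability bound quantifies when the selected support matches the true one.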
Relaxed Recovery Conditions for OMP/OLS by Exploiting both Coherence and Decay
We propose extended coherence-based conditions for exact sparse support
recovery using orthogonal matching pursuit (OMP) and orthogonal least squares
(OLS). Unlike standard uniform guarantees, we embed some information about the
decay of the sparse vector coefficients in our conditions. As a result, the
standard condition mu<1/(2k-1) (where mu denotes the mutual coherence and k
the sparsity level) can be weakened as soon as the non-zero coefficients
obey some decay, both in the noiseless and the bounded-noise scenarios.
Furthermore, the resulting condition relaxes further for strongly
decaying sparse signals. Finally, in the noiseless setting, we prove that the
proposed conditions are the tightest achievable guarantees based on mutual
coherence.
Recovery of Sparse Signals Using Multiple Orthogonal Least Squares
We study the problem of recovering sparse signals from compressed linear
measurements. This problem, often referred to as sparse recovery or sparse
reconstruction, has generated a great deal of interest in recent years. To
recover the sparse signals, we propose a new method called multiple orthogonal
least squares (MOLS), which extends the well-known orthogonal least squares
(OLS) algorithm by allowing multiple indices to be chosen per iteration.
Owing to inclusion of multiple support indices in each selection, the MOLS
algorithm converges in far fewer iterations and is more computationally
efficient than the conventional OLS algorithm. Theoretical analysis shows that
MOLS performs exact recovery of all k-sparse signals within a bounded number of
iterations if the measurement matrix satisfies the restricted isometry property
(RIP) with a sufficiently small isometry constant. The recovery performance of MOLS in the noisy scenario is also
studied. It is shown that stable recovery of sparse signals can be achieved
with the MOLS algorithm when the signal-to-noise ratio (SNR) scales linearly
with the sparsity level of input signals.
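A simplified sketch of the MOLS selection idea follows: each iteration scores every candidate atom by the residual power left after adding it, then admits several atoms at once. This brute-force scoring is for illustration only (an efficient implementation would update projections incrementally), and the names are hypothetical:

```python
import numpy as np

def mols(D, y, k, L=2):
    # Multiple OLS sketch: score each unselected atom by the residual
    # norm after adding it (alone) to the current support, then admit
    # the L best-scoring atoms per iteration.
    # For simplicity this sketch assumes L divides k.
    n = D.shape[1]
    support = []
    while len(support) < k:
        candidates = [j for j in range(n) if j not in support]
        scores = []
        for j in candidates:
            cols = support + [j]
            coef, *_ = np.linalg.lstsq(D[:, cols], y, rcond=None)
            scores.append(np.linalg.norm(y - D[:, cols] @ coef))
        for i in np.argsort(scores)[:L]:
            support.append(candidates[i])
    coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
    x = np.zeros(n)
    x[support] = coef
    return x, sorted(support)
```

Setting L=1 recovers plain OLS; larger L trades per-iteration cost against the number of iterations, which is the efficiency gain the abstract describes.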
Joint Block-Sparse Recovery Using Simultaneous BOMP/BOLS
We consider the greedy algorithms for the joint recovery of high-dimensional
sparse signals based on the block multiple measurement vector (BMMV) model in
compressed sensing (CS). To this end, we first put forth two versions of
simultaneous block orthogonal least squares (S-BOLS) as the baseline for the
OLS framework. Their cornerstone is to sequentially evaluate candidate support
blocks and select the one minimizing the residual power. Then, parallel performance
analysis for the existing simultaneous block orthogonal matching pursuit
(S-BOMP) and the two proposed S-BOLS algorithms is developed. It indicates that
under the conditions based on the mutual incoherence property (MIP) and the
decaying magnitude structure of the nonzero blocks of the signal, the
algorithms select all the significant blocks before possibly choosing incorrect
ones. In addition, we further consider the problem of sufficient data volume
for reliable recovery, and provide its MIP-based bounds in closed-form. These
results together highlight the key role of the block characteristic in
addressing the weak-sparse issue, i.e., the scenario where the overall sparsity
is too large. The derived theoretical results are also universally valid for
conventional block-greedy algorithms and non-block algorithms by setting the
number of measurement vectors and the block length to 1, respectively. Comment: This work has been submitted to the IEEE for possible publication.
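One selection step of the block-OLS idea described above (pick the support block that minimizes the residual power across all measurement vectors) can be sketched as follows, assuming equal-length blocks of d consecutive atoms; `sbols_step` is an illustrative name, not from the paper:

```python
import numpy as np

def sbols_step(D, Y, support_blocks, d):
    # One S-BOLS-style selection step: for each candidate block of d
    # consecutive atoms, jointly fit the current support plus that
    # block to all measurement vectors Y (columns), and keep the
    # block whose residual power is smallest.
    n_blocks = D.shape[1] // d
    best, best_score = None, np.inf
    for b in range(n_blocks):
        if b in support_blocks:
            continue
        cols = [c for blk in support_blocks + [b]
                for c in range(blk * d, blk * d + d)]
        coef, *_ = np.linalg.lstsq(D[:, cols], Y, rcond=None)
        score = np.linalg.norm(Y - D[:, cols] @ coef)
        if score < best_score:
            best, best_score = b, score
    return best
```

With block length d=1 and a single column in Y this degenerates to ordinary OLS selection, mirroring the abstract's remark that the results specialize to non-block and single-vector algorithms.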