On Probability of Support Recovery for Orthogonal Matching Pursuit Using Mutual Coherence
In this paper we present a new coherence-based performance guarantee for the
Orthogonal Matching Pursuit (OMP) algorithm. A lower bound for the probability
of correctly identifying the support of a sparse signal with additive white
Gaussian noise is derived. Compared to previous work, the new bound takes into
account the signal parameters such as dynamic range, noise variance, and
sparsity. Numerical simulations show significant improvements over previous
work and a closer match to empirically obtained results of the OMP algorithm.
Comment: Submitted to IEEE Signal Processing Letters. arXiv admin note: substantial text overlap with arXiv:1608.0038
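As background for the abstract above, a minimal NumPy sketch of the standard OMP greedy loop (this is the textbook algorithm, not the paper's probabilistic analysis; the toy dictionary, seed, and support are illustrative):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x from y ~ A x.

    Each iteration picks the column most correlated with the residual,
    then refits the coefficients on the selected support by least squares.
    """
    residual = y.copy()
    support = []
    x_hat = np.zeros(A.shape[1])
    for _ in range(k):
        # Column with the largest absolute correlation with the residual.
        idx = int(np.argmax(np.abs(A.T @ residual)))
        support.append(idx)
        # Least-squares refit on the current support.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat[support] = coef
    return x_hat, sorted(support)

# Noiseless sanity check on a random Gaussian dictionary (illustrative sizes).
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 128))
A /= np.linalg.norm(A, axis=0)          # unit-norm columns
true_support = [5, 40, 99]
x = np.zeros(128)
x[true_support] = [3.0, -2.0, 1.5]      # moderate dynamic range
x_hat, support = omp(A, A @ x, k=3)
```

In the noiseless, well-conditioned regime shown here OMP recovers the support exactly; the paper's contribution concerns how often this succeeds once additive white Gaussian noise is present.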
Noise Folding in Compressed Sensing
The literature on compressed sensing has focused almost entirely on settings
where the signal is noiseless and the measurements are contaminated by noise.
In practice, however, the signal itself is often subject to random noise prior
to measurement. We briefly study this setting and show that, for the vast
majority of measurement schemes employed in compressed sensing, the two models
are equivalent with the important difference that the signal-to-noise ratio is
divided by a factor proportional to p/n, where p is the dimension of the signal
and n is the number of observations. Since p/n is often large, this leads to
noise folding, which can have a severe impact on the SNR.
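The p/n folding factor is easy to verify numerically. A small sketch, assuming a measurement matrix with i.i.d. N(0, 1/n) entries (a common normalization; the abstract's claim covers a broader class of measurement schemes):

```python
import numpy as np

# Signal noise "folds" into the measurements: y = A(x + w) = Ax + Aw.
# For A with i.i.d. N(0, 1/n) entries (n measurements, p-dim signal),
# each entry of Aw has variance (p/n) * sigma^2, so the effective SNR
# is divided by a factor proportional to p/n.
rng = np.random.default_rng(1)
n, p, sigma = 100, 2000, 0.1
A = rng.standard_normal((n, p)) / np.sqrt(n)
w = sigma * rng.standard_normal((p, 2000))   # many independent noise draws
folded = A @ w                               # measurement-domain noise
empirical_var = folded.var()
predicted_var = (p / n) * sigma**2           # here: 20 * 0.01 = 0.2
```

With p/n = 20, signal noise of variance 0.01 reappears in the measurements with variance about 0.2, which is exactly the severe SNR loss the abstract describes.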
Conditioning of Random Block Subdictionaries with Applications to Block-Sparse Recovery and Regression
The linear model, in which a set of observations is assumed to be given by a
linear combination of columns of a matrix, has long been the mainstay of the
statistics and signal processing literature. One particular challenge for
inference under linear models is understanding the conditions on the dictionary
under which reliable inference is possible. This challenge has attracted
renewed attention in recent years since many modern inference problems deal
with the "underdetermined" setting, in which the number of observations is much
smaller than the number of columns in the dictionary. This paper makes several
contributions for this setting when the set of observations is given by a
linear combination of a small number of groups of columns of the dictionary,
termed the "block-sparse" case. First, it specifies conditions on the
dictionary under which most block subdictionaries are well conditioned. This
result is fundamentally different from prior work on block-sparse inference
because (i) it provides conditions that can be explicitly computed in
polynomial time, (ii) the given conditions translate into near-optimal scaling
of the number of columns of the block subdictionaries as a function of the
number of observations for a large class of dictionaries, and (iii) it suggests
that the spectral norm and the quadratic-mean block coherence of the dictionary
(rather than the worst-case coherences) fundamentally limit the scaling of
dimensions of the well-conditioned block subdictionaries. Second, this paper
investigates the problems of block-sparse recovery and block-sparse regression
in underdetermined settings. Near-optimal block-sparse recovery and regression
are possible for certain dictionaries as long as the dictionary satisfies
easily computable conditions and the coefficients describing the linear
combination of groups of columns can be modeled through a mild statistical
prior.
Comment: 39 pages, 3 figures. A revised and expanded version of the paper published in IEEE Transactions on Information Theory (DOI: 10.1109/TIT.2015.2429632); this revision includes corrections in the proofs of some of the results.
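To make "well-conditioned block subdictionaries" concrete, a toy check on a Gaussian dictionary with unit-norm columns and contiguous blocks (dimensions and block size are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# A dictionary with p columns split into contiguous blocks of size m; draw
# a few blocks at random and check that the resulting block subdictionary
# is well conditioned (singular values clustered around 1).
rng = np.random.default_rng(2)
n, p, m = 256, 1024, 4                   # 1024 columns in 256 blocks of 4
A = rng.standard_normal((n, p))
A /= np.linalg.norm(A, axis=0)           # unit-norm columns
blocks = rng.choice(p // m, size=8, replace=False)   # 8 random blocks
cols = np.concatenate([np.arange(b * m, (b + 1) * m) for b in blocks])
sub = A[:, cols]                         # 256 x 32 block subdictionary
s = np.linalg.svd(sub, compute_uv=False)
condition_number = s[0] / s[-1]
```

For generic random dictionaries most such subdictionaries have a small condition number, which is the "most block subdictionaries are well conditioned" phenomenon the paper characterizes via the spectral norm and quadratic-mean block coherence.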
Relaxed Recovery Conditions for OMP/OLS by Exploiting both Coherence and Decay
We propose extended coherence-based conditions for exact sparse support
recovery using orthogonal matching pursuit (OMP) and orthogonal least squares
(OLS). Unlike standard uniform guarantees, we embed some information about the
decay of the sparse vector coefficients in our conditions. As a result, the
standard condition μ < 1/(2k - 1) (where μ denotes the mutual coherence and
k the sparsity level) can be weakened as soon as the non-zero coefficients
obey some decay, both in the noiseless and the bounded-noise scenarios.
Furthermore, the resulting condition approaches μ < 1/(k - 1) for strongly
decaying sparse signals. Finally, in the noiseless setting, we prove that the
proposed conditions, in particular the bound μ < 1/(k - 1), are the tightest
achievable guarantees based on mutual coherence.
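A short sketch of how such coherence thresholds can be evaluated for a given dictionary; the inequalities μ < 1/(2k - 1) (the classical uniform OMP guarantee) and μ < 1/(k - 1) (the limit under strong coefficient decay) are the standard forms from the coherence-based OMP literature, and the random dictionary below is purely illustrative:

```python
import numpy as np

def mutual_coherence(A):
    """Largest |inner product| between distinct unit-norm columns of A."""
    G = np.abs(A.T @ A)
    np.fill_diagonal(G, 0.0)
    return G.max()

rng = np.random.default_rng(3)
A = rng.standard_normal((200, 400))
A /= np.linalg.norm(A, axis=0)           # unit-norm columns
mu = mutual_coherence(A)
k = 3
standard_ok = mu < 1 / (2 * k - 1)       # uniform guarantee: mu < 1/(2k-1)
relaxed_ok = mu < 1 / (k - 1)            # decay-exploiting limit: mu < 1/(k-1)
```

The gap between the two thresholds (0.2 versus 0.5 for k = 3) is exactly the relaxation the abstract attributes to coefficient decay: a dictionary too coherent for the uniform guarantee can still admit exact recovery of strongly decaying signals.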