
    Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit

    This paper demonstrates theoretically and empirically that a greedy algorithm called Orthogonal Matching Pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal. This is a massive improvement over previous results, which require O(m^2) measurements. The new results for OMP are comparable with recent results for another approach called Basis Pursuit (BP). In some settings, the OMP algorithm is faster and easier to implement, so it is an attractive alternative to BP for signal recovery problems.
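    The greedy iteration described in this abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's reference code; it assumes a measurement matrix A with (approximately) unit-norm columns and exact measurements y = A @ x for an m-sparse x.

    ```python
    import numpy as np

    def omp(A, y, m):
        """Sketch of Orthogonal Matching Pursuit: greedily recover an
        m-sparse signal from linear measurements y = A @ x."""
        residual = y.copy()
        support = []
        for _ in range(m):
            # Select the column most correlated with the current residual.
            j = int(np.argmax(np.abs(A.T @ residual)))
            support.append(j)
            # Orthogonal step: least-squares fit on all selected columns.
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef
        x_hat = np.zeros(A.shape[1])
        x_hat[support] = coef
        return x_hat
    ```

    The "orthogonal" in OMP refers to the least-squares refit over all selected atoms, which keeps the residual orthogonal to the span of the chosen columns and prevents the same atom from being selected twice.
    
    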

    Relaxed Recovery Conditions for OMP/OLS by Exploiting both Coherence and Decay

    We propose extended coherence-based conditions for exact sparse support recovery using orthogonal matching pursuit (OMP) and orthogonal least squares (OLS). Unlike standard uniform guarantees, we embed some information about the decay of the sparse vector coefficients in our conditions. As a result, the standard condition μ < 1/(2k−1) (where μ denotes the mutual coherence and k the sparsity level) can be weakened as soon as the non-zero coefficients obey some decay, both in the noiseless and the bounded-noise scenarios. Furthermore, the resulting condition approaches μ < 1/k for strongly decaying sparse signals. Finally, in the noiseless setting, we prove that the proposed conditions, in particular the bound μ < 1/k, are the tightest achievable guarantees based on mutual coherence.
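    The mutual coherence μ that these conditions are stated in terms of is simply the largest absolute inner product between distinct unit-norm columns of the dictionary. A minimal sketch of how one might compute it (the function name is illustrative, not from the paper):

    ```python
    import numpy as np

    def mutual_coherence(A):
        """Largest |<a_i, a_j>| over distinct normalized columns of A."""
        G = A / np.linalg.norm(A, axis=0)  # normalize each column
        gram = np.abs(G.T @ G)             # pairwise |inner products|
        np.fill_diagonal(gram, 0.0)        # exclude self-products
        return gram.max()
    ```

    For an orthonormal dictionary μ = 0, while two identical columns give μ = 1; the classical guarantee μ < 1/(2k−1) then says that OMP/OLS recover any k-sparse support when all atoms are sufficiently uncorrelated.
    
    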

    Recovery of Sparse Signals Using Multiple Orthogonal Least Squares

    We study the problem of recovering sparse signals from compressed linear measurements. This problem, often referred to as sparse recovery or sparse reconstruction, has generated a great deal of interest in recent years. To recover the sparse signals, we propose a new method called multiple orthogonal least squares (MOLS), which extends the well-known orthogonal least squares (OLS) algorithm by allowing L indices to be chosen per iteration. Owing to the inclusion of multiple support indices in each selection, the MOLS algorithm converges in far fewer iterations and improves the computational efficiency over the conventional OLS algorithm. Theoretical analysis shows that MOLS (L > 1) performs exact recovery of all K-sparse signals within K iterations if the measurement matrix satisfies the restricted isometry property (RIP) with isometry constant δ_LK < √L / (√K + 2√L). The recovery performance of MOLS in the noisy scenario is also studied. It is shown that stable recovery of sparse signals can be achieved with the MOLS algorithm when the signal-to-noise ratio (SNR) scales linearly with the sparsity level of input signals.
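    The multiple-selection idea can be sketched as follows. Note this is a simplified illustration, not the paper's algorithm: the selection rule here ranks atoms by residual correlation, whereas exact OLS/MOLS selection ranks atoms by how much each would reduce the residual after projection.

    ```python
    import numpy as np

    def mols(A, y, K, L=2):
        """Sketch of MOLS-style recovery: pick L atoms per iteration
        (here by residual correlation, a simplification of the OLS
        criterion), then refit by least squares on the full support."""
        residual, support = y.copy(), []
        while len(support) < K:
            corr = np.abs(A.T @ residual)
            corr[support] = -np.inf            # never reselect an atom
            picks = np.argsort(corr)[-L:]      # take the L best at once
            support.extend(int(j) for j in picks)
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef
        x_hat = np.zeros(A.shape[1])
        x_hat[support] = coef
        return x_hat
    ```

    Because L atoms enter the support per iteration, the loop can terminate after roughly K/L iterations instead of K, which is the source of the computational savings the abstract describes; any superfluous atoms picked along the way receive (near-)zero coefficients in the final least-squares fit.
    
    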

    Coherence-based Partial Exact Recovery Condition for OMP/OLS

    We address the exact recovery of the support of a k-sparse vector with Orthogonal Matching Pursuit (OMP) and Orthogonal Least Squares (OLS) in a noiseless setting. We consider the scenario where OMP/OLS have selected good atoms during the first l iterations (l < k) and derive a new sufficient and worst-case necessary condition for their success in k steps. Our result is based on the coherence μ of the dictionary and relaxes Tropp's well-known condition μ < 1/(2k−1) to the case where OMP/OLS have a partial knowledge of the support.