
    Coherence-Based Performance Guarantees of Orthogonal Matching Pursuit

    In this paper, we present coherence-based performance guarantees of Orthogonal Matching Pursuit (OMP) for both support recovery and signal reconstruction of sparse signals when the measurements are corrupted by noise. In particular, two variants of OMP, either with a known sparsity level or with a stopping rule, are analyzed. It is shown that if the measurement matrix $X\in\mathbb{C}^{n\times p}$ satisfies the strong coherence property, then with $n\gtrsim\mathcal{O}(k\log p)$, OMP recovers a $k$-sparse signal with high probability. Notably, the performance guarantees obtained here separate the properties required of the measurement matrix from the properties required of the signal, with the latter depending critically on the minimum signal-to-noise ratio rather than on the power profile of the signal. We also provide performance guarantees for partial support recovery. Comparisons are given with other performance guarantees for OMP based on worst-case analysis and with the sorted one-step thresholding algorithm.
    Comment: appeared at the 2012 Allerton Conference
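    The two OMP variants analyzed above share the same greedy loop; a minimal NumPy sketch is given below for illustration (it is not the authors' implementation, and the function and parameter names are assumptions). Passing k uses the known-sparsity variant, while passing tol uses the residual-norm stopping rule.

```python
import numpy as np

def omp(X, y, k=None, tol=None):
    """Orthogonal Matching Pursuit (illustrative sketch).

    At each step, pick the column of X most correlated with the current
    residual, refit the signal on the enlarged support by least squares,
    and update the residual. Stop after k steps (known sparsity level)
    or once the residual norm falls below tol (stopping rule).
    """
    n, p = X.shape
    support = []
    residual = y.copy()
    max_iter = k if k is not None else n
    for _ in range(max_iter):
        correlations = np.abs(X.conj().T @ residual)
        correlations[support] = 0.0          # never reselect an index
        support.append(int(np.argmax(correlations)))
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ coef
        if tol is not None and np.linalg.norm(residual) <= tol:
            break
    x_hat = np.zeros(p, dtype=coef.dtype)
    x_hat[support] = coef
    return x_hat, support
```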

    Sparse Support Recovery with Non-smooth Loss Functions

    In this paper, we study the support recovery guarantees of underdetermined sparse regression using the $\ell_1$-norm as a regularizer and a non-smooth loss function for data fidelity. More precisely, we focus in detail on the cases of $\ell_1$ and $\ell_\infty$ losses, and contrast them with the usual $\ell_2$ loss. While these losses are routinely used to account for either sparse ($\ell_1$ loss) or uniform ($\ell_\infty$ loss) noise models, a theoretical analysis of their performance is still lacking. In this article, we extend the existing theory from the smooth $\ell_2$ case to these non-smooth cases. We derive a sharp condition which ensures that the support of the vector to recover is stable to small additive noise in the observations, as long as the loss constraint size is tuned proportionally to the noise level. A distinctive feature of our theory is that it also explains what happens when the support is unstable. While the support is not stable anymore, we identify an "extended support" and show that this extended support is stable to small additive noise. To exemplify the usefulness of our theory, we give a detailed numerical analysis of the support stability/instability of compressed sensing recovery with these different losses. This highlights different parameter regimes, ranging from total support stability to progressively increasing support instability.
    Comment: in Proc. NIPS 201
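    For illustration, the constrained form of this estimator (minimize the $\ell_1$-norm subject to a data-fidelity constraint whose size is tuned to the noise level) can be written with a generic convex solver. The sketch below uses CVXPY by assumption; it is not the authors' code, and the names l1_recovery and tau are illustrative.

```python
import cvxpy as cp

def l1_recovery(A, y, tau, loss="linf"):
    """Solve  min ||x||_1  s.t.  ||A x - y|| <= tau  (illustrative sketch),
    where the data-fidelity norm is l1, l2 or l_inf and tau plays the
    role of the loss-constraint size tuned proportionally to the noise.
    """
    p = A.shape[1]
    x = cp.Variable(p)
    fit = cp.norm(A @ x - y, {"l1": 1, "l2": 2, "linf": "inf"}[loss])
    problem = cp.Problem(cp.Minimize(cp.norm(x, 1)), [fit <= tau])
    problem.solve()
    return x.value
```

    Comparing the recovered supports for loss="l1", "l2" and "linf" at a fixed noise level is one way to reproduce the kind of stability/instability experiments described above.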

    Lorentzian Iterative Hard Thresholding: Robust Compressed Sensing with Prior Information

    Commonly employed reconstruction algorithms in compressed sensing (CS) use the $L_2$ norm as the metric for the residual error. However, it is well known that least squares (LS) based estimators are highly sensitive to outliers in the measurement vector, leading to poor performance when the noise no longer follows the Gaussian assumption but is instead better characterized by heavier-than-Gaussian-tailed distributions. In this paper, we propose a robust iterative hard thresholding (IHT) algorithm for reconstructing sparse signals in the presence of impulsive noise. To address this problem, we use a Lorentzian cost function instead of the $L_2$ cost function employed by the traditional IHT algorithm. We also modify the algorithm to incorporate prior signal information in the recovery process. Specifically, we study the case of CS with partially known support. The proposed algorithm is a fast method with computational load comparable to the LS-based IHT, while having the advantage of robustness against heavy-tailed impulsive noise. Sufficient conditions for stability are studied and a reconstruction error bound is derived. We also derive sufficient conditions for stable sparse signal recovery with partially known support. Theoretical analysis shows that including prior support information relaxes the conditions for successful reconstruction. Simulation results demonstrate that the Lorentzian-based IHT algorithm significantly outperforms commonly employed sparse reconstruction techniques in impulsive environments, while providing comparable performance in less demanding, light-tailed environments. Numerical results also demonstrate that including the partially known support improves the performance of the proposed algorithm, thereby requiring fewer samples to yield an approximate reconstruction.
    Comment: 28 pages, 9 figures, accepted in IEEE Transactions on Signal Processing
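    A schematic sketch of a Lorentzian-weighted IHT iteration with partially known support is shown below. It is not the paper's implementation: the re-weighting gamma**2 / (gamma**2 + r**2) is one common way to realize a gradient-type step on a Lorentzian cost, and the names gamma, mu and support_prior are assumptions used only for illustration.

```python
import numpy as np

def lorentzian_iht(A, y, s, gamma, mu=1.0, n_iters=100, support_prior=None):
    """Iterative hard thresholding with a Lorentzian data-fidelity term
    (illustrative sketch). The residual is re-weighted so that large,
    impulsive errors contribute little to the update, then the estimate
    is hard-thresholded to its s largest entries; indices in
    support_prior (the partially known support) are always retained.
    """
    n, p = A.shape
    x = np.zeros(p)
    for _ in range(n_iters):
        r = y - A @ x
        weights = gamma**2 / (gamma**2 + r**2)   # Lorentzian re-weighting
        x = x + mu * (A.T @ (weights * r))       # weighted gradient step
        keep = np.argsort(np.abs(x))[-s:]        # s largest magnitudes
        if support_prior is not None:
            keep = np.union1d(keep, support_prior)
        mask = np.zeros(p, dtype=bool)
        mask[keep] = True
        x[~mask] = 0.0                           # hard threshold
    return x
```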

    Successful Recovery Performance Guarantees of SOMP Under the L2-norm of Noise

    The simultaneous orthogonal matching pursuit (SOMP) is a popular greedy approach for common support recovery of a row-sparse matrix. However, compared to the noiseless scenario, the performance analysis of noisy SOMP is still nascent, especially in the scenario of unbounded noise. In this paper, we present a new study based on the mutual incoherence property (MIP) for the performance analysis of noisy SOMP. Specifically, when the noise is bounded, we provide the condition under which exact support recovery is guaranteed in terms of the MIP. When the noise is unbounded, we instead derive a bound on the successful recovery probability (SRP) that depends on the specific distribution of the $\ell_2$-norm of the noise matrix. We then focus on the common case of random Gaussian noise and show that the lower bound on the SRP follows the Tracy-Widom law. The analysis reveals the number of measurements, the noise level, the number of sparse vectors, and the mutual coherence required to guarantee a predefined recovery performance. Theoretically, we show that the mutual coherence of the measurement matrix must decrease proportionally to the noise standard deviation, and the number of sparse vectors needs to grow proportionally to the noise variance. Finally, we extensively validate the derived analysis through numerical simulations.
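    For reference, a minimal NumPy sketch of the SOMP selection rule is given below (illustrative only, not the authors' code): at each step, the column of the measurement matrix whose correlations with the residual matrix have the largest $\ell_2$-norm across the measurement vectors is added to the common support, and all coefficient columns are then refit jointly.

```python
import numpy as np

def somp(X, Y, k):
    """Simultaneous OMP (illustrative sketch) for a row-k-sparse
    coefficient matrix: Y is approximated by X @ B with B having at
    most k nonzero rows shared across all columns of Y.
    """
    n, p = X.shape
    support = []
    R = Y.copy()
    for _ in range(k):
        scores = np.linalg.norm(X.conj().T @ R, axis=1)  # per-column l2 score
        scores[support] = 0.0                            # no reselection
        support.append(int(np.argmax(scores)))
        B, *_ = np.linalg.lstsq(X[:, support], Y, rcond=None)
        R = Y - X[:, support] @ B
    B_hat = np.zeros((p, Y.shape[1]), dtype=B.dtype)
    B_hat[support, :] = B
    return B_hat, support
```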