2,455 research outputs found

    Coherence-Based Performance Guarantees of Orthogonal Matching Pursuit

    In this paper, we present coherence-based performance guarantees of Orthogonal Matching Pursuit (OMP) for both support recovery and signal reconstruction of sparse signals when the measurements are corrupted by noise. In particular, two variants of OMP, either with known sparsity level or with a stopping rule, are analyzed. It is shown that if the measurement matrix $X \in \mathbb{C}^{n \times p}$ satisfies the strong coherence property, then with $n \gtrsim \mathcal{O}(k \log p)$, OMP will recover a $k$-sparse signal with high probability. Notably, the performance guarantees obtained here separate the properties required of the measurement matrix from the properties required of the signal, which depend critically on the minimum signal-to-noise ratio rather than on the power profile of the signal. We also provide performance guarantees for partial support recovery. Comparisons are given with other performance guarantees for OMP using worst-case analysis and with the sorted one-step thresholding algorithm.
    Comment: appeared at the 2012 Allerton conference
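    As a rough illustration of the algorithm this abstract analyzes, here is a minimal NumPy sketch of OMP with a known sparsity level $k$ (the function name, interfaces, and comments are ours, not the paper's; the stopping-rule variant would replace the fixed loop count with a residual-norm test):

    ```python
    import numpy as np

    def omp(X, y, k):
        """Orthogonal Matching Pursuit with known sparsity level k.
        A sketch only: X is an (n, p) measurement matrix with unit-norm
        columns, y the (n,) noisy observations. Returns the recovered
        support (sorted) and the coefficient estimate."""
        n, p = X.shape
        residual = y.copy()
        support = []
        for _ in range(k):
            # Greedy step: pick the column most correlated with the residual.
            correlations = np.abs(X.T @ residual)
            correlations[support] = 0.0          # never re-select an atom
            support.append(int(np.argmax(correlations)))
            # Orthogonal projection: least-squares fit on the current support.
            coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
            residual = y - X[:, support] @ coef
        beta = np.zeros(p)
        beta[support] = coef
        return sorted(support), beta
    ```

    The coherence-based guarantees above say, roughly, when this greedy loop provably picks only true atoms; in the noiseless, low-coherence regime it recovers the support exactly.
    
    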

    Relaxed Recovery Conditions for OMP/OLS by Exploiting both Coherence and Decay

    We propose extended coherence-based conditions for exact sparse support recovery using orthogonal matching pursuit (OMP) and orthogonal least squares (OLS). Unlike standard uniform guarantees, we embed some information about the decay of the sparse vector coefficients in our conditions. As a result, the standard condition $\mu < 1/(2k-1)$ (where $\mu$ denotes the mutual coherence and $k$ the sparsity level) can be weakened as soon as the non-zero coefficients obey some decay, in both the noiseless and the bounded-noise scenarios. Furthermore, the resulting condition approaches $\mu < 1/k$ for strongly decaying sparse signals. Finally, in the noiseless setting, we prove that the proposed conditions, in particular the bound $\mu < 1/k$, are the tightest achievable guarantees based on mutual coherence.
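    The mutual coherence $\mu$ appearing in these conditions is simple to compute. A small sketch (function names are ours) of $\mu$ and the standard uniform check $\mu < 1/(2k-1)$ that this paper relaxes:

    ```python
    import numpy as np

    def mutual_coherence(X):
        """Mutual coherence mu(X): the largest absolute inner product
        between two distinct unit-normalized columns of X."""
        Xn = X / np.linalg.norm(X, axis=0)   # normalize columns
        G = np.abs(Xn.T @ Xn)                # Gram matrix magnitudes
        np.fill_diagonal(G, 0.0)             # ignore self-correlations
        return G.max()

    def uniform_omp_guarantee(mu, k):
        """Standard worst-case condition mu < 1/(2k - 1) for exact
        k-sparse recovery by OMP/OLS (the baseline relaxed above)."""
        return mu < 1.0 / (2 * k - 1)
    ```

    An orthonormal dictionary has $\mu = 0$ and trivially passes the check for any $k$; a dictionary with two parallel columns has $\mu = 1$ and fails it for every $k \ge 1$.
    
    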

    Submodular meets Spectral: Greedy Algorithms for Subset Selection, Sparse Approximation and Dictionary Selection

    We study the problem of selecting a subset of k random variables from a large set, in order to obtain the best linear prediction of another variable of interest. This problem can be viewed in the context of both feature selection and sparse approximation. We analyze the performance of widely used greedy heuristics, using insights from the maximization of submodular functions and spectral analysis. We introduce the submodularity ratio as a key quantity to help understand why greedy algorithms perform well even when the variables are highly correlated. Using our techniques, we obtain the strongest known approximation guarantees for this problem, both in terms of the submodularity ratio and the smallest k-sparse eigenvalue of the covariance matrix. We further demonstrate the wide applicability of our techniques by analyzing greedy algorithms for the dictionary selection problem, and significantly improve the previously known guarantees. Our theoretical analysis is complemented by experiments on real-world and synthetic data sets; the experiments show that the submodularity ratio is a stronger predictor of the performance of greedy algorithms than other spectral parameters.
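    A minimal sketch of the forward greedy heuristic this paper analyzes, phrased in terms of covariances (the function name and the R² selection criterion written this way are our framing of the standard forward-selection step, not code from the paper):

    ```python
    import numpy as np

    def greedy_subset_selection(C, b, k):
        """Forward greedy selection of k variables for linear prediction.
        A sketch: C is the (p, p) covariance matrix of the candidate
        variables, b the length-p vector of covariances between each
        candidate and the target. At each step, add the variable that
        most increases the explained variance of the least-squares fit."""
        p = C.shape[0]
        chosen = []
        for _ in range(k):
            best_idx, best_r2 = None, -np.inf
            for j in range(p):
                if j in chosen:
                    continue
                S = chosen + [j]
                # Explained variance b_S^T C_SS^{-1} b_S of the fit on S.
                coef = np.linalg.solve(C[np.ix_(S, S)], b[S])
                r2 = b[S] @ coef
                if r2 > best_r2:
                    best_idx, best_r2 = j, r2
            chosen.append(best_idx)
        return chosen
    ```

    The submodularity ratio introduced above quantifies how far this greedy gain can fall short of additivity when the candidates are correlated, which is what drives the approximation guarantee.
    
    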

    On Probability of Support Recovery for Orthogonal Matching Pursuit Using Mutual Coherence

    In this paper, we present a new coherence-based performance guarantee for the Orthogonal Matching Pursuit (OMP) algorithm. A lower bound on the probability of correctly identifying the support of a sparse signal under additive white Gaussian noise is derived. Compared to previous work, the new bound takes into account signal parameters such as dynamic range, noise variance, and sparsity. Numerical simulations show significant improvements over previous work and a closer match to empirically obtained results for the OMP algorithm.
    Comment: Submitted to IEEE Signal Processing Letters. arXiv admin note: substantial text overlap with arXiv:1608.0038

    Exact Recovery Conditions for Sparse Representations with Partial Support Information

    We address the exact recovery of a k-sparse vector in the noiseless setting when some partial information on the support is available. This partial information takes the form of either a subset of the true support or an approximate subset that may include wrong atoms as well. We derive a new sufficient and (in some sense) worst-case necessary condition for the success of several procedures based on $\ell_p$-relaxation, Orthogonal Matching Pursuit (OMP), and Orthogonal Least Squares (OLS). Our result is based on the coherence $\mu$ of the dictionary and relaxes the well-known condition $\mu < 1/(2k-1)$ ensuring the recovery of any k-sparse vector in the non-informed setup. It reads $\mu < 1/(2k - g + b - 1)$ when the informed support is composed of g good atoms and b wrong atoms. We emphasize that our condition is complementary to some restricted-isometry-based conditions by showing that neither implies the other. Because this mutual coherence condition is common to all procedures, we carry out a finer analysis based on the Null Space Property (NSP) and the Exact Recovery Condition (ERC). Connections are established regarding the characterization of $\ell_p$-relaxation procedures and OMP in the informed setup. First, we emphasize that the truncated NSP enjoys an ordering property as p is decreased. Second, the partial ERC for OMP (ERC-OMP) implies in turn the truncated NSP for the informed $\ell_1$ problem, and the truncated NSP for p < 1.
    Comment: arXiv admin note: substantial text overlap with arXiv:1211.728
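    One natural way to use partial support information with OMP, sketched below, is to warm-start the greedy loop with the informed atoms. This is our own simple illustration of the informed setting, not the exact procedure analyzed above; note in particular that this variant never discards wrong informed atoms, whereas the condition $\mu < 1/(2k - g + b - 1)$ accounts for b wrong atoms explicitly:

    ```python
    import numpy as np

    def omp_with_prior_support(X, y, k, informed):
        """OMP warm-started with an informed support (a sketch).
        `informed` atoms are selected up front; the greedy loop then
        adds atoms until k columns of X have been chosen."""
        support = list(informed)
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ coef
        while len(support) < k:
            # Pick the unused column most correlated with the residual.
            corr = np.abs(X.T @ residual)
            corr[support] = 0.0
            support.append(int(np.argmax(corr)))
            coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
            residual = y - X[:, support] @ coef
        return sorted(support)
    ```
    
    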
