    On the Fundamental Recovery Limit of Orthogonal Least Squares

    Orthogonal least squares (OLS) is a classic algorithm for sparse recovery, function approximation, and subset selection. In this paper, we analyze the performance guarantee of the OLS algorithm. Specifically, we show that OLS guarantees the exact reconstruction of any $K$-sparse vector in $K$ iterations, provided that the sensing matrix has unit $\ell_{2}$-norm columns and satisfies the restricted isometry property (RIP) of order $K+1$ with
    \begin{align*} \delta_{K+1} < C_{K} = \begin{cases} \frac{1}{\sqrt{K}}, & K=1, \\ \frac{1}{\sqrt{K+\frac{1}{4}}}, & K=2, \\ \frac{1}{\sqrt{K+\frac{1}{16}}}, & K=3, \\ \frac{1}{\sqrt{K}}, & K \ge 4. \end{cases} \end{align*}
    Furthermore, we show that the proposed guarantee is optimal in the sense that if $\delta_{K+1} \ge C_{K}$, then there exists a counterexample for which OLS fails to recover the vector.
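
    To make the greedy selection rule concrete, below is a minimal numpy sketch of OLS under the assumptions above (unit $\ell_{2}$-norm columns, known sparsity $K$). The function name `ols`, the stopping rule, and the numerical tolerance are illustrative choices, not taken from the paper.

```python
# A minimal sketch of orthogonal least squares (OLS) for sparse recovery,
# assuming a sensing matrix A with unit l2-norm columns and known K.
import numpy as np

def ols(A, y, K):
    """Recover a K-sparse vector x from y = A @ x via OLS (illustrative)."""
    m, n = A.shape
    support = []
    residual = y.copy()
    for _ in range(K):
        scores = np.full(n, -np.inf)
        for j in range(n):
            if j in support:
                continue
            # Project column j onto the orthogonal complement of the
            # span of the already selected columns.
            if support:
                As = A[:, support]
                proj = As @ np.linalg.lstsq(As, A[:, j], rcond=None)[0]
                a_perp = A[:, j] - proj
            else:
                a_perp = A[:, j]
            norm = np.linalg.norm(a_perp)
            if norm > 1e-12:
                # OLS picks the column giving the largest residual
                # reduction, i.e. maximizing |<a_perp, r>| / ||a_perp||.
                scores[j] = abs(a_perp @ residual) / norm
        support.append(int(np.argmax(scores)))
        # Re-estimate the signal on the enlarged support by least squares.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat, sorted(support)
```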

    Joint Sparse Recovery Using Signal Space Matching Pursuit

    In this paper, we put forth a new joint sparse recovery algorithm called signal space matching pursuit (SSMP). The key idea of the proposed SSMP algorithm is to sequentially investigate the support of jointly sparse vectors to minimize the subspace distance to the residual space. Our performance guarantee analysis indicates that SSMP accurately reconstructs any row $K$-sparse matrix of rank $r$ in the full row rank scenario if the sampling matrix $\mathbf{A}$ satisfies $\text{krank}(\mathbf{A}) \ge K+1$, which meets the fundamental minimum requirement on $\mathbf{A}$ to ensure exact recovery. We also show that SSMP guarantees exact reconstruction in at most $K-r+\lceil \frac{r}{L} \rceil$ iterations, provided that $\mathbf{A}$ satisfies the restricted isometry property (RIP) of order $L(K-r)+r+1$ with
    $$\delta_{L(K-r)+r+1} < \max \left\{ \frac{\sqrt{r}}{\sqrt{K+\frac{r}{4}}+\sqrt{\frac{r}{4}}}, \frac{\sqrt{L}}{\sqrt{K}+1.15\sqrt{L}} \right\},$$
    where $L$ is the number of indices chosen in each iteration. This implies that the requirement on the RIP constant becomes less restrictive as $r$ increases. Such behavior seems natural but has not been reported for most conventional methods. We further show that if $r=1$, then by running more than $K$ iterations, the performance guarantee of SSMP can be improved to $\delta_{\lfloor 7.8K \rfloor} \le 0.155$. In addition, we show that under a suitable RIP condition, the reconstruction error of SSMP is upper bounded by a constant multiple of the noise power, which demonstrates the stability of SSMP under measurement noise. Finally, through extensive numerical experiments, we show that SSMP outperforms conventional joint sparse recovery algorithms in both noiseless and noisy scenarios.
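
    As a rough illustration of the key idea (per iteration, adding the $L$ indices whose projected columns align best with the residual space), below is a hedged numpy sketch for the multiple measurement vector setting $\mathbf{Y} = \mathbf{A}\mathbf{X}$ with row $K$-sparse $\mathbf{X}$. The selection score, names, and stopping rule are simplifications, not the paper's exact procedure.

```python
# A hedged sketch of a signal-space-matching-pursuit-style algorithm:
# per iteration, pick the L unselected columns whose projections onto
# the orthogonal complement of the current support best align with an
# orthonormal basis of the residual space.
import numpy as np

def ssmp(A, Y, K, L=2, n_iter=None):
    """Estimate a row K-sparse X from Y = A @ X (illustrative)."""
    m, n = A.shape
    support = []
    R = Y.copy()                      # residual matrix
    n_iter = n_iter if n_iter is not None else K
    for _ in range(n_iter):
        if len(support) >= K:
            break
        # Orthonormal basis of the residual space span(R).
        Q, _ = np.linalg.qr(R)
        scores = np.full(n, -np.inf)
        for j in range(n):
            if j in support:
                continue
            a = A[:, j]
            if support:
                As = A[:, support]
                a = a - As @ np.linalg.lstsq(As, a, rcond=None)[0]
            na = np.linalg.norm(a)
            if na > 1e-12:
                # Alignment of the projected, normalized column with
                # the residual space.
                scores[j] = np.linalg.norm(Q.T @ (a / na))
        picks = np.argsort(scores)[-L:]
        support.extend(int(p) for p in picks)
        # Least-squares re-estimation on the enlarged support.
        coef, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        R = Y - A[:, support] @ coef
    X_hat = np.zeros((n, Y.shape[1]))
    X_hat[support] = coef
    return X_hat, sorted(support)
```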