On the Fundamental Recovery Limit of Orthogonal Least Squares
Orthogonal least squares (OLS) is a classic algorithm for sparse recovery,
function approximation, and subset selection. In this paper, we analyze the
performance guarantee of the OLS algorithm. Specifically, we show that OLS
guarantees the exact reconstruction of any $K$-sparse vector in $K$ iterations,
provided that the sensing matrix has unit $\ell_{2}$-norm columns and satisfies
the restricted isometry property (RIP) of order $K+1$ with \begin{align*}
\delta_{K+1} &<C_{K} = \begin{cases} \frac{1}{\sqrt{K}}, & K=1, \\
\frac{1}{\sqrt{K+\frac{1}{4}}}, & K=2, \\ \frac{1}{\sqrt{K+\frac{1}{16}}}, &
K=3, \\ \frac{1}{\sqrt{K}}, & K \ge 4. \end{cases} \end{align*} Furthermore, we
show that the proposed guarantee is optimal in the sense that if $\delta_{K+1}
\ge C_{K}$, then there exists a counterexample for which OLS fails the
recovery.
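To make the greedy selection concrete, here is a minimal OLS sketch in Python/NumPy (an illustration written for this summary, not code from the paper): at each iteration, the index whose inclusion minimizes the norm of the residual after orthogonal projection onto the chosen columns is added to the support.

```python
import numpy as np

def ols(A, y, K):
    """Orthogonal least squares sketch: greedily grow a support of size K.

    Each iteration adds the column index that minimizes the residual norm
    after orthogonally projecting y onto the span of the selected columns.
    """
    m, n = A.shape
    support = []
    for _ in range(K):
        best_j, best_res = None, np.inf
        for j in range(n):
            if j in support:
                continue
            As = A[:, support + [j]]
            # Least-squares fit on the trial support
            coef, *_ = np.linalg.lstsq(As, y, rcond=None)
            res = np.linalg.norm(y - As @ coef)
            if res < best_res:
                best_res, best_j = res, j
        support.append(best_j)
    x_hat = np.zeros(n)
    x_hat[support], *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    return x_hat, sorted(support)
```

When the sensing matrix has orthonormal columns, the RIP constant is zero, so the recovery condition above is trivially satisfied and OLS recovers any $K$-sparse vector in $K$ iterations.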
Joint Sparse Recovery Using Signal Space Matching Pursuit
In this paper, we put forth a new joint sparse recovery algorithm called
signal space matching pursuit (SSMP). The key idea of the proposed SSMP
algorithm is to sequentially investigate the support of jointly sparse vectors
to minimize the subspace distance to the residual space. Our performance
guarantee analysis indicates that SSMP accurately reconstructs any row
$K$-sparse matrix of rank $K$ in the full row rank scenario if the sampling
matrix $\mathbf{A}$ satisfies $\operatorname{krank}(\mathbf{A}) \ge K+1$, which meets
the fundamental minimum requirement on $\mathbf{A}$ to ensure exact recovery.
We also show that SSMP guarantees exact reconstruction in at most
$K-r+\lceil \frac{r}{L} \rceil$ iterations, provided that $\mathbf{A}$ satisfies the
restricted isometry property (RIP) of order $L(K-r)+r+1$ with \begin{align*}
\delta_{L(K-r)+r+1} < \max \left\{ \frac{\sqrt{r}}{\sqrt{K+\frac{r}{4}}+\sqrt{\frac{r}{4}}},
\frac{\sqrt{L}}{\sqrt{K}+1.15\sqrt{L}} \right\}, \end{align*} where $L$ is the number of
indices chosen in each iteration. This implies that the requirement on the RIP
constant becomes less restrictive as $L$ increases. Such behavior seems
natural but has not been reported for most conventional methods. We further
show that if $r=1$, then by running more than $K$ iterations, the performance
guarantee of SSMP can be improved to $\delta_{\lfloor 7.8K \rfloor} \le 0.155$.
In addition, we show that under a suitable RIP condition, the reconstruction
error of SSMP is upper bounded by a constant multiple of the noise power, which
demonstrates the stability of SSMP under measurement noise. Finally, from
extensive numerical experiments, we show that SSMP outperforms conventional
joint sparse recovery algorithms in both noiseless and noisy scenarios.
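The greedy support-identification idea can be sketched in Python/NumPy. This is a simplified illustration written for this summary, not the paper's exact SSMP procedure: it fixes $L=1$ and measures the distance to the residual space by the Frobenius norm of the projection residual of the measurement matrix.

```python
import numpy as np

def ssmp_sketch(A, Y, K):
    """Simplified joint sparse recovery sketch in the spirit of SSMP.

    Greedily grows the row support: each iteration adds the single index
    (L = 1 here) whose inclusion minimizes the Frobenius norm of the
    residual after projecting Y onto the span of the selected columns.
    Illustrative stand-in, not the paper's exact selection rule.
    """
    m, n = A.shape
    support = []
    for _ in range(K):
        best_j, best_res = None, np.inf
        for j in range(n):
            if j in support:
                continue
            As = A[:, support + [j]]
            coef, *_ = np.linalg.lstsq(As, Y, rcond=None)
            res = np.linalg.norm(Y - As @ coef)  # Frobenius norm
            if res < best_res:
                best_res, best_j = res, j
        support.append(best_j)
    X_hat = np.zeros((n, Y.shape[1]))
    X_hat[support, :], *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
    return X_hat, sorted(support)
```

With multiple measurement vectors, each candidate index is scored against the whole residual matrix at once, which is why joint recovery can succeed where recovering each column independently would fail.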