Oracle-order Recovery Performance of Greedy Pursuits with Replacement against General Perturbations
Applying the theory of compressive sensing in practice requires taking various
kinds of perturbations into consideration. In this paper, the recovery
performance of greedy pursuits with replacement for sparse recovery is analyzed
when both the measurement vector and the sensing matrix are contaminated with
additive perturbations. Specifically, greedy pursuits with replacement include
three algorithms, compressive sampling matching pursuit (CoSaMP), subspace
pursuit (SP), and iterative hard thresholding (IHT), where the support
estimate is evaluated and updated in each iteration. Based on the restricted
isometry property, a unified form of the error bounds of these recovery
algorithms is derived under general perturbations for compressible signals. The
results reveal that the recovery performance is stable against both
perturbations. In addition, these bounds are compared with that of oracle
recovery, i.e., the least squares solution computed with the locations of some
of the largest entries in magnitude known a priori. The comparison shows that
the error bounds of these algorithms differ only in coefficients from the lower
bound of oracle recovery for certain signals and perturbations, which reveals
that oracle-order recovery performance of greedy pursuits with replacement is
guaranteed. Numerical simulations are performed to verify the conclusions.
Comment: 27 pages, 4 figures, 5 tables
Global and Quadratic Convergence of Newton Hard-Thresholding Pursuit
Algorithms based on the hard thresholding principle have been well studied,
with sound theoretical guarantees, in compressed sensing and more general
sparsity-constrained optimization. It is widely observed in existing empirical
studies that when a restricted Newton step is used as the debiasing step,
hard-thresholding algorithms tend to meet their halting conditions in a
significantly smaller number of iterations and are very efficient. Hence, the
resulting Newton hard-thresholding algorithms call for stronger theoretical
guarantees than their simple hard-thresholding counterparts. This paper
provides a theoretical justification for the use of the restricted Newton step.
We build our theory and algorithm, Newton Hard-Thresholding Pursuit (NHTP), for
the sparsity-constrained optimization. Our main result shows that NHTP is
quadratically convergent under the standard assumption of restricted strong
convexity and smoothness. We also establish its global convergence to a
stationary point under a weaker assumption. In the special case of
compressive sensing, NHTP effectively reduces to some of the existing
hard-thresholding algorithms with a Newton step. Consequently, our fast
convergence result explains why those algorithms perform better with the
Newton step than without it. The efficiency of NHTP is demonstrated on both
synthetic and real data in compressed sensing and sparse logistic regression.
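In the compressed sensing special case, the loss is quadratic, so a Newton step restricted to a candidate support is exactly a least-squares solve over the corresponding columns of the sensing matrix. The NumPy sketch below shows a generic hard-thresholding iteration with this restricted-Newton debiasing step (in the style of hard thresholding pursuit); it is an assumed illustration of the mechanism, not the paper's NHTP implementation, and the problem sizes are invented for the demo.

```python
import numpy as np

def ht_newton(A, y, s, n_iter=50, step=1.0):
    """Hard thresholding followed by a restricted Newton (debiasing) step.

    For the quadratic loss 0.5 * ||y - A x||^2, the Newton step restricted
    to a support S is the least-squares solution over the columns A[:, S].
    """
    n = A.shape[1]
    x = np.zeros(n)
    support = np.array([], dtype=int)
    for _ in range(n_iter):
        u = x + step * A.T @ (y - A @ x)            # gradient step
        new_support = np.sort(np.argsort(np.abs(u))[-s:])
        if np.array_equal(new_support, support):    # support settled: stop
            break
        support = new_support
        x = np.zeros(n)                             # restricted Newton step:
        x[support] = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
    return x

# Illustrative sizes (assumptions, not from the paper).
rng = np.random.default_rng(1)
m, n, s = 40, 60, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
y = A @ x_true
x_hat = ht_newton(A, y, s)
```

Because each debiasing step solves the restricted least-squares problem exactly, the iteration terminates as soon as the support estimate stops changing, which is consistent with the empirical observation above that Newton-type debiasing sharply reduces the number of iterations.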