Uniform Uncertainty Principle and signal recovery via Regularized Orthogonal Matching Pursuit
This paper seeks to bridge the two major algorithmic approaches to sparse
signal recovery from an incomplete set of linear measurements --
L_1-minimization methods and iterative methods (Matching Pursuits). We find a
simple regularized version of the Orthogonal Matching Pursuit (ROMP) which has
advantages of both approaches: the speed and transparency of OMP and the strong
uniform guarantees of L_1-minimization. Our algorithm ROMP reconstructs a
sparse signal in a number of iterations linear in the sparsity (in practice
even logarithmic), and the reconstruction is exact provided the linear
measurements satisfy the Uniform Uncertainty Principle.
Comment: This is the final version of the paper, including referee suggestions.
Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
This paper demonstrates theoretically and empirically that a greedy algorithm called Orthogonal Matching Pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal. This is a massive improvement over previous results, which require O(m^2) measurements. The new results for OMP are comparable with recent results for another approach called Basis Pursuit (BP). In some settings, the OMP algorithm is faster and easier to implement, so it is an attractive alternative to BP for signal recovery problems.
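The greedy selection loop that OMP performs can be sketched as follows. This is a minimal illustrative implementation assembled from the standard description of the algorithm, not code from any of the papers above; the function name, parameters, and demo dimensions are our own choices.

```python
import numpy as np

def omp(A, y, sparsity):
    """Greedily recover a sparse x with A @ x ~ y (illustrative OMP sketch)."""
    m, n = A.shape
    residual = y.copy()
    support = []
    for _ in range(sparsity):
        # Select the column of A most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-solve least squares over the chosen support: the "orthogonal" step
        # that distinguishes OMP from plain matching pursuit.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x = np.zeros(n)
    x[support] = coeffs
    return x

# Demo: a 3-sparse vector in dimension 64 from 32 Gaussian measurements
# (dimensions chosen arbitrarily for illustration).
rng = np.random.default_rng(0)
n_dim, n_meas, k = 64, 32, 3
A = rng.standard_normal((n_meas, n_dim)) / np.sqrt(n_meas)
x_true = np.zeros(n_dim)
x_true[rng.choice(n_dim, k, replace=False)] = rng.standard_normal(k)
x_hat = omp(A, A @ x_true, k)
print(np.linalg.norm(x_hat - x_true))
```

Each iteration adds at most one index to the support, so the iteration count is linear in the sparsity, matching the kind of guarantee discussed for ROMP above.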
Greedy Signal Recovery Review
The two major approaches to sparse recovery are L1-minimization and greedy
methods. Recently, Needell and Vershynin developed Regularized Orthogonal
Matching Pursuit (ROMP) that has bridged the gap between these two approaches.
ROMP is the first stable greedy algorithm providing uniform guarantees.
Even more recently, Needell and Tropp developed the stable greedy algorithm
Compressive Sampling Matching Pursuit (CoSaMP). CoSaMP provides uniform
guarantees and improves upon the stability bounds and RIC requirements of ROMP.
CoSaMP offers rigorous bounds on computational cost and storage. In many cases,
the running time is just O(N log N), where N is the ambient dimension of the
signal. This review summarizes these major advances.
A* Orthogonal Matching Pursuit: Best-First Search for Compressed Sensing Signal Recovery
Compressed sensing is a developing field aiming at reconstruction of sparse
signals acquired in reduced dimensions, which makes the recovery process
under-determined. The required solution is the one with minimum L_0 norm
due to sparsity; however, it is not practical to solve the L_0 minimization
problem. Commonly used techniques include L_1 minimization, such as Basis
Pursuit (BP) and greedy pursuit algorithms such as Orthogonal Matching Pursuit
(OMP) and Subspace Pursuit (SP). This manuscript proposes a novel semi-greedy
recovery approach, namely A* Orthogonal Matching Pursuit (A*OMP). A*OMP
performs A* search to look for the sparsest solution on a tree whose paths grow
similar to the Orthogonal Matching Pursuit (OMP) algorithm. Paths on the tree
are evaluated according to a cost function, which should compensate for
different path lengths. For this purpose, three different auxiliary structures
are defined, including novel dynamic ones. A*OMP also incorporates pruning
techniques which enable practical applications of the algorithm. Moreover, the
adjustable search parameters provide means for a complexity-accuracy trade-off.
We demonstrate the reconstruction ability of the proposed scheme on both
synthetically generated data and images using Gaussian and Bernoulli
observation matrices, where A*OMP yields less reconstruction error and higher
exact recovery frequency than BP, OMP and SP. Results also indicate that novel
dynamic cost functions provide improved results as compared to a conventional
choice.
Comment: accepted for publication in Digital Signal Processing.
Orthogonal Matching Pursuit: A Brownian Motion Analysis
A well-known analysis of Tropp and Gilbert shows that orthogonal matching
pursuit (OMP) can recover a k-sparse n-dimensional real vector from 4 k log(n)
noise-free linear measurements obtained through a random Gaussian measurement
matrix with a probability that approaches one as n approaches infinity. This
work strengthens this result by showing that a lower number of measurements, 2
k log(n - k), is in fact sufficient for asymptotic recovery. More generally,
when the sparsity level satisfies k_min <= k <= k_max but is unknown, 2 k_max
log(n - k_min) measurements are sufficient. Furthermore, this number of
measurements is also sufficient for detection of the sparsity pattern (support)
of the vector with measurement errors provided the signal-to-noise ratio (SNR)
scales to infinity. The scaling 2 k log(n - k) exactly matches the number of
measurements required by the more complex lasso method for signal recovery with
a similar SNR scaling.
Comment: 11 pages, 2 figures.
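As a quick sanity check on the improvement this abstract claims, the two measurement counts can be compared numerically. The regime below (k = 10, n = 10000) is our own illustrative choice, not taken from the paper.

```python
import math

# Sufficient measurement counts for OMP recovery of a k-sparse vector
# in dimension n, per the two analyses summarized above.
k, n = 10, 10_000
tropp_gilbert = 4 * k * math.log(n)      # earlier Tropp-Gilbert bound
brownian = 2 * k * math.log(n - k)       # sharper bound from this analysis

print(round(tropp_gilbert), round(brownian))  # prints: 368 184
```

In this regime the sharper bound roughly halves the required number of measurements, consistent with the 2 k log(n - k) versus 4 k log(n) scaling stated in the abstract.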