Deterministic Sampling of Sparse Trigonometric Polynomials
One can recover sparse multivariate trigonometric polynomials from few
randomly taken samples with high probability (as shown by Kunis and Rauhut). We
give a deterministic sampling of multivariate trigonometric polynomials
inspired by Weil's exponential sum. Our sampling can produce a deterministic
matrix satisfying the statistical restricted isometry property, and also nearly
optimal Grassmannian frames. We show that one can exactly reconstruct every
-sparse multivariate trigonometric polynomial with fixed degree and of
length from the deterministic sampling , using orthogonal matching
pursuit, where # X is a prime number greater than . This result is
almost optimal within the factor. Simulations show that the
deterministic sampling can offer reconstruction performance similar to that of
random sampling.
Comment: 9 pages
Accuracy guarantees for L1-recovery
We discuss two new methods for the recovery of sparse signals from noisy
observations based on L1-minimization. They are closely related to
well-known techniques such as Lasso and Dantzig Selector; however, these
estimators come with efficiently verifiable guarantees of performance. By
optimizing these bounds with respect to the method parameters we are able to
construct estimators that possess better statistical properties than the
commonly used ones. We also show how these techniques allow one to provide
efficiently computable accuracy bounds for Lasso and Dantzig Selector. We link
our performance estimates to well-known results of Compressive Sensing
and justify the proposed approach with an oracle inequality that links the
properties of the recovery algorithms to the best estimation performance
achievable when the signal support is known. We demonstrate how the estimates
can be computed using the Non-Euclidean Basis Pursuit algorithm.
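To make the L1-minimization machinery concrete, here is a minimal sketch of noiseless Basis Pursuit cast as a linear program. This is the classical formulation, not the paper's Non-Euclidean variant; the Gaussian sensing matrix and seed are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """min ||x||_1  subject to  A x = y, solved as a linear program.

    Split x = u - v with u, v >= 0, so ||x||_1 = sum(u) + sum(v)
    and the equality constraint reads [A, -A] @ [u; v] = y."""
    n = A.shape[1]
    res = linprog(np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=(0, None), method="highs")
    return res.x[:n] - res.x[n:]

# Demo on a small Gaussian sensing matrix (illustrative, not from the paper).
rng = np.random.default_rng(1)
m, n, s = 20, 40, 3
A = rng.standard_normal((m, n))
x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
y = A @ x
x_hat = basis_pursuit(A, y)
```

The returned solution is feasible and has L1-norm no larger than that of the true sparse signal, since the true signal is itself feasible for the LP.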
Verifiable conditions of L1-recovery of sparse signals with sign restrictions
We propose necessary and sufficient conditions for a sensing matrix to be
"s-semigood" -- to allow for exact L1-recovery of sparse signals with at
most s nonzero entries under sign restrictions on part of the entries. We
express the error bounds for imperfect L1-recovery in terms of the
characteristics underlying these conditions. Furthermore, we demonstrate that
these characteristics, although difficult to evaluate, lead to verifiable
sufficient conditions for exact sparse L1-recovery and to efficiently
computable upper bounds on those s for which a given sensing matrix is
s-semigood. We concentrate on the properties of the proposed verifiable
sufficient conditions for s-semigoodness and describe their limits of
performance.
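The simplest example of an efficiently verifiable sufficient condition for exact sparse L1-recovery is the classical mutual-coherence bound s < (1 + 1/mu(A))/2 (Donoho and Elad). The paper's LP-based characteristics are sharper, but the coherence check below illustrates the idea of certifying recovery directly from the matrix; the example matrix is a made-up toy.

```python
import numpy as np

def coherence_bound(A):
    """Mutual coherence mu(A) and the classical verifiable sufficient
    condition: every s-sparse signal is recovered exactly by
    L1-minimization whenever s < (1 + 1/mu) / 2."""
    G = A / np.linalg.norm(A, axis=0)       # unit-norm columns
    M = np.abs(G.T @ G)
    np.fill_diagonal(M, 0.0)                # ignore self-correlations
    mu = M.max()
    return mu, (1.0 + 1.0 / mu) / 2.0

# Example: two canonical basis vectors plus their (unnormalized) sum.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
mu, s_max = coherence_bound(A)   # mu = 1/sqrt(2), s_max ~ 1.21
```

Here the bound only certifies 1-sparse recovery; this conservatism is exactly what motivates the sharper verifiable conditions studied in the abstract above.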
TV-min and Greedy Pursuit for Constrained Joint Sparsity and Application to Inverse Scattering
This paper proposes a general framework for compressed sensing of constrained
joint sparsity (CJS) which includes total variation minimization (TV-min) as an
example. TV- and 2-norm error bounds, independent of the ambient dimension, are
derived for the CJS version of Basis Pursuit and Orthogonal Matching Pursuit.
As an application, the results extend Candès, Romberg and Tao's proof of exact
recovery of piecewise constant objects with noiseless incomplete Fourier data
to the case of noisy data.
Comment: Mathematics and Mechanics of Complex Systems (2013)
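A minimal sketch of the joint-sparsity flavor of greedy pursuit is simultaneous OMP over multiple measurement vectors that share one support. This is only an illustrative special case of the constrained joint sparsity framework (the paper's CJS version of OMP, and its TV instance, are more general); sizes and seed are assumptions for the demo.

```python
import numpy as np

def somp(A, Y, s):
    """Simultaneous OMP: recover X from Y = A @ X when the columns of X
    share a common s-sparse row support (joint sparsity).

    Each step picks the column of A with the largest aggregate
    correlation against all residual channels."""
    n = A.shape[1]
    support = []
    R = Y.copy()
    for _ in range(s):
        scores = np.linalg.norm(A.T @ R, axis=1)   # pool over channels
        scores[support] = 0.0                      # never re-pick a column
        support.append(int(np.argmax(scores)))
        C, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        R = Y - A[:, support] @ C
    X_hat = np.zeros((n, Y.shape[1]))
    X_hat[support, :] = C
    return X_hat

# Demo: three channels with a shared 2-sparse row support (illustrative).
rng = np.random.default_rng(2)
m, n, s, k = 15, 30, 2, 3
A = rng.standard_normal((m, n))
X = np.zeros((n, k))
X[rng.choice(n, size=s, replace=False), :] = rng.standard_normal((s, k))
Y = A @ X
X_hat = somp(A, Y, s)
```

Pooling correlations across channels is what enforces the shared support: a row is selected once for all channels, rather than independently per channel.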