
    Deterministic Sampling of Sparse Trigonometric Polynomials

    One can recover sparse multivariate trigonometric polynomials from few randomly taken samples with high probability (as shown by Kunis and Rauhut). We give a deterministic sampling of multivariate trigonometric polynomials inspired by Weil's exponential sum. Our sampling can produce a deterministic matrix satisfying the statistical restricted isometry property, as well as nearly optimal Grassmannian frames. We show that one can exactly reconstruct every $M$-sparse multivariate trigonometric polynomial of fixed degree and length $D$ from the deterministic sampling $X$ using orthogonal matching pursuit, where $\# X$ is a prime number greater than $(M \log D)^2$. This result is optimal up to the $(\log D)^2$ factor. Simulations show that the deterministic sampling offers reconstruction performance similar to that of random sampling. Comment: 9 pages
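    The paper's Weil-sum sampling set is not reproduced below; as an illustrative stand-in, this minimal sketch builds another classic deterministic low-coherence matrix (the $p \times p^2$ chirp construction, which also satisfies a statistical RIP) and recovers an $M$-sparse coefficient vector from $\# X = p$ deterministic samples with orthogonal matching pursuit. All names and parameter choices here are assumptions for illustration, not the paper's construction.

```python
import numpy as np

def chirp_matrix(p):
    """Deterministic p x p^2 sensing matrix: column (a, b) has entries
    exp(2*pi*i*(a*j^2 + b*j)/p) / sqrt(p). Its coherence is 1/sqrt(p),
    which makes greedy recovery provable for small sparsity."""
    j = np.arange(p)
    cols = [np.exp(2j * np.pi * (a * j**2 + b * j) / p)
            for a in range(p) for b in range(p)]
    return np.stack(cols, axis=1) / np.sqrt(p)

def omp(A, y, sparsity):
    """Orthogonal matching pursuit: pick the column most correlated with
    the residual, then re-fit all picked columns by least squares."""
    support = []
    residual = y.copy()
    for _ in range(sparsity):
        scores = np.abs(A.conj().T @ residual)
        scores[support] = 0.0                 # do not reselect a column
        support.append(int(np.argmax(scores)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1], dtype=complex)
    x[support] = coef
    return x

p, M = 61, 3                                  # p prime; M < (1 + sqrt(p))/2
A = chirp_matrix(p)                           # 61 x 3721, fully deterministic
rng = np.random.default_rng(0)
x_true = np.zeros(p * p, dtype=complex)
idx = rng.choice(p * p, size=M, replace=False)
x_true[idx] = rng.standard_normal(M) + 1j * rng.standard_normal(M)
y = A @ x_true                                # 61 samples of a 3721-dim vector
print("max error:", np.abs(omp(A, y, M) - x_true).max())
```

    With coherence $1/\sqrt{p}$, OMP provably recovers any support of size below $(1 + \sqrt{p})/2$, which is why $M = 3$ succeeds exactly here.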

    Accuracy guarantees for L1-recovery

    We discuss two new methods for the recovery of sparse signals from noisy observations based on $\ell_1$-minimization. They are closely related to well-known techniques such as the Lasso and the Dantzig Selector; however, these estimators come with efficiently verifiable guarantees of performance. By optimizing these bounds with respect to the method parameters, we are able to construct estimators that possess better statistical properties than the commonly used ones. We also show how these techniques allow one to provide efficiently computable accuracy bounds for the Lasso and the Dantzig Selector. We link our performance estimates to well-known results of compressive sensing and justify the proposed approach with an oracle inequality relating the properties of the recovery algorithms to the best estimation performance achievable when the signal support is known. We demonstrate how the estimates can be computed using the Non-Euclidean Basis Pursuit algorithm.
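    The verifiable bounds and the Non-Euclidean Basis Pursuit solver themselves are beyond a short snippet, but the underlying $\ell_1$ estimator is easy to demonstrate. Below is a minimal, self-contained sketch of the Lasso solved by plain iterative soft-thresholding (ISTA); the problem sizes, noise level, and regularization weight `lam` are illustrative assumptions, not values from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (elementwise shrinkage)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Iterative soft-thresholding for the Lasso:
    min_x 0.5*||A x - y||_2^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - (A.T @ (A @ x - y)) / L, lam / L)
    return x

# Toy noisy recovery: n = 80 measurements of an s-sparse signal in R^200.
rng = np.random.default_rng(0)
n, d, s, sigma = 80, 200, 5, 0.01
A = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
x_true[rng.choice(d, s, replace=False)] = rng.choice([-1.0, 1.0], s)
y = A @ x_true + sigma * rng.standard_normal(n)

x_hat = ista(A, y, lam=0.02)
print("l2 error:", np.linalg.norm(x_hat - x_true))
```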

    Verifiable conditions of $\ell_1$-recovery of sparse signals with sign restrictions

    We propose necessary and sufficient conditions for a sensing matrix to be "$s$-semigood" -- to allow exact $\ell_1$-recovery of sparse signals with at most $s$ nonzero entries under sign restrictions on part of the entries. We express the error bounds for imperfect $\ell_1$-recovery in terms of the characteristics underlying these conditions. Furthermore, we demonstrate that these characteristics, although difficult to evaluate, lead to verifiable sufficient conditions for exact sparse $\ell_1$-recovery and to efficiently computable upper bounds on those $s$ for which a given sensing matrix is $s$-semigood. We concentrate on the properties of the proposed verifiable sufficient conditions of $s$-semigoodness and describe their limits of performance.
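    To make the sign-restricted recovery problem concrete, here is a minimal sketch (not the paper's verifiable-conditions machinery) that solves $\min \|x\|_1$ subject to $Ax = y$ and $x_i \ge 0$ on a known index set, via the standard $x = u - v$ splitting and an off-the-shelf LP solver. The matrix size, sparsity, and restricted index set are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def l1_recover_signed(A, y, nonneg_idx):
    """min ||x||_1  s.t.  A x = y and x_i >= 0 for i in nonneg_idx,
    as an LP in x = u - v with u, v >= 0 (v_i pinned to 0 on nonneg_idx)."""
    n, d = A.shape
    nonneg = set(nonneg_idx)
    c = np.ones(2 * d)                        # ||x||_1 = 1'u + 1'v
    A_eq = np.hstack([A, -A])                 # A(u - v) = y
    bounds = [(0, None)] * d + [
        (0, 0) if i in nonneg else (0, None) for i in range(d)
    ]
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    u, v = res.x[:d], res.x[d:]
    return u - v

# Toy example: a sparse signal whose first half is known to be nonnegative.
rng = np.random.default_rng(0)
n, d, s = 40, 100, 5
A = rng.standard_normal((n, d)) / np.sqrt(n)
x_true = np.zeros(d)
supp = rng.choice(d, s, replace=False)
x_true[supp] = np.abs(rng.standard_normal(s))   # consistent with restriction
x_hat = l1_recover_signed(A, A @ x_true, nonneg_idx=range(d // 2))
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

    Pinning $v_i = 0$ on the restricted indices is exactly how the sign information enters: those coordinates can then only take nonnegative values $u_i$.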

    TV-min and Greedy Pursuit for Constrained Joint Sparsity and Application to Inverse Scattering

    This paper proposes a general framework for compressed sensing of constrained joint sparsity (CJS) which includes total variation minimization (TV-min) as an example. TV- and 2-norm error bounds, independent of the ambient dimension, are derived for the CJS versions of Basis Pursuit and Orthogonal Matching Pursuit. As an application, the results extend Candès, Romberg and Tao's proof of exact recovery of piecewise constant objects from noiseless incomplete Fourier data to the case of noisy data. Comment: Mathematics and Mechanics of Complex Systems (2013)
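    The single-channel TV-min special case with 1-D noiseless Fourier data is small enough to write out as a linear program; the joint-sparsity framework and the paper's noisy-data bounds are not reproduced. In this sketch the signal length, jump locations, and sampled frequency set are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def tv_min_fourier(freqs, b, d):
    """min ||D x||_1  s.t.  (DFT rows at `freqs`) @ x = b, for real x.
    Variables: x free, plus u, v >= 0 with D x = u - v."""
    D = np.diff(np.eye(d), axis=0)            # (d-1) x d finite differences
    F = np.exp(-2j * np.pi * np.outer(freqs, np.arange(d)) / d)
    k = d - 1
    A_eq = np.block([
        [D, -np.eye(k), np.eye(k)],           # D x - u + v = 0
        [F.real, np.zeros((len(freqs), 2 * k))],   # Re(F x) = Re(b)
        [F.imag, np.zeros((len(freqs), 2 * k))],   # Im(F x) = Im(b)
    ])
    b_eq = np.concatenate([np.zeros(k), b.real, b.imag])
    c = np.concatenate([np.zeros(d), np.ones(2 * k)])   # minimize 1'u + 1'v
    bounds = [(None, None)] * d + [(0, None)] * (2 * k)
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    return res.x[:d]

# Piecewise-constant signal, sampled at 20 of 64 Fourier frequencies.
rng = np.random.default_rng(0)
d = 64
x_true = np.zeros(d)
x_true[10:30], x_true[45:60] = 1.0, -0.5
freqs = np.sort(rng.choice(d, size=20, replace=False))
b = np.exp(-2j * np.pi * np.outer(freqs, np.arange(d)) / d) @ x_true
x_hat = tv_min_fourier(freqs, b, d)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```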