
    A probabilistic and RIPless theory of compressed sensing

    This paper introduces a simple and very general theory of compressive sensing. In this theory, the sensing mechanism simply selects sensing vectors independently at random from a probability distribution F; it includes all models discussed in the literature - e.g. Gaussian, frequency measurements - and also provides a framework for new measurement strategies. We prove that if the probability distribution F obeys a simple incoherence property and an isotropy property, one can faithfully recover approximately sparse signals from a minimal number of noisy measurements. The novelty is that our recovery results do not require the restricted isometry property (RIP) - they make use of a much weaker notion - or a random model for the signal. As an example, the paper shows that a signal with s nonzero entries can be faithfully recovered from about $s \log n$ Fourier coefficients that are contaminated with noise. Comment: 36 pages
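    The recovery setup described above can be sketched numerically (a minimal illustration, not the paper's construction: a randomly subsampled orthonormal DCT stands in for the Fourier measurements, the noise is dropped, and the problem sizes are arbitrary; basis pursuit is solved as a linear program with scipy):

```python
import numpy as np
from scipy.fft import dct
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, s, m = 128, 4, 50  # signal length, sparsity, measurements (~ s log n with slack)

# s-sparse test signal
x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)

# sensing: m rows drawn uniformly at random from an orthonormal DCT matrix
# (a real-valued stand-in for the Fourier measurements in the abstract)
Psi = dct(np.eye(n), norm="ortho", axis=0)
A = Psi[rng.choice(n, size=m, replace=False)]
b = A @ x

# basis pursuit  min ||z||_1  s.t.  Az = b,  as an LP with z = u - v, u, v >= 0
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b)
z = res.x[:n] - res.x[n:]
err = np.max(np.abs(z - x))  # recovery error for this random draw
```

    With these (assumed) sizes the sparse signal is recovered exactly up to solver tolerance.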

    Nonuniform Sparse Recovery with Subgaussian Matrices

    Compressive sensing predicts that sufficiently sparse vectors can be recovered from highly incomplete information. Efficient recovery methods such as $\ell_1$-minimization find the sparsest solution to certain systems of equations. Random matrices have become a popular choice for the measurement matrix. Indeed, near-optimal uniform recovery results have been shown for such matrices. In this note we focus on nonuniform recovery using Gaussian random matrices and $\ell_1$-minimization. We provide a condition on the number of samples in terms of the sparsity and the signal length which guarantees that a fixed sparse signal can be recovered with a random draw of the matrix using $\ell_1$-minimization. The constant 2 in the condition is optimal, and the proof is rather short compared to a similar result due to Donoho and Tanner.
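    Nonuniform recovery with a Gaussian draw can be sketched as follows (a hedged illustration: the sample-count rule below is only a rough reading of a condition of the shape $m > 2s\ln(n/s)$, with an extra safety factor added for this small, non-asymptotic example; it is not the paper's exact condition):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, s = 64, 3
# a condition of the shape m > 2 s ln(n/s); the extra factor 2 is a
# safety margin for this small example, not part of the stated condition
m = int(np.ceil(2 * 2 * s * np.log(n / s)))

# one fixed sparse signal (nonuniform setting: the signal is fixed first)
x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)

# one random Gaussian draw of the measurement matrix
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x

# l1-minimization as a linear program
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b)
z = res.x[:n] - res.x[n:]
```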

    Sparsity and Parallel Acquisition: Optimal Uniform and Nonuniform Recovery Guarantees

    The problem of multiple sensors simultaneously acquiring measurements of a single object can be found in many applications. In this paper, we present optimal recovery guarantees for the recovery of compressible signals from multi-sensor measurements using compressed sensing. In the first half of the paper, we present both uniform and nonuniform recovery guarantees for the conventional sparse signal model in a so-called distinct sensing scenario. In the second half, using the so-called sparse and distributed signal model, we present nonuniform recovery guarantees which effectively broaden the class of sensing scenarios for which optimal recovery is possible, including the so-called identical sampling scenario. To verify our recovery guarantees we provide several numerical results including phase transition curves and numerically-computed bounds. Comment: 13 pages and 3 figures
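    The distinct sensing scenario can be sketched by stacking per-sensor measurement matrices into one system (a toy illustration under assumed sizes; the Gaussian per-sensor matrices are a stand-in, not the paper's measurement model):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, s, C, mc = 64, 3, 4, 10  # signal length, sparsity, C sensors, mc rows each

x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)

# distinct sensing: each sensor draws its own independent measurement matrix
A_list = [rng.standard_normal((mc, n)) / np.sqrt(C * mc) for _ in range(C)]
A = np.vstack(A_list)  # stacked multi-sensor system with C*mc rows total
b = A @ x              # all sensors observe the same object x

# joint recovery by l1-minimization over the stacked system
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b)
z = res.x[:n] - res.x[n:]
```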

    Sampling by blocks of measurements in compressed sensing

    Various acquisition devices impose sampling by blocks of measurements. A typical example is parallel magnetic resonance imaging (MRI), where several radio-frequency coils simultaneously acquire a set of Fourier modulated coefficients. We study a new random sampling approach that consists of selecting a set of blocks that are predefined by the application of interest. We provide theoretical results on the number of blocks that are required for exact sparse signal reconstruction. We finish by illustrating these results on various examples, and discuss their connection to the literature on compressed sensing.
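    Block sampling can be sketched as follows (an assumed toy setup: predefined blocks of consecutive rows of an orthonormal DCT stand in for the application-defined blocks, and the block count is chosen generously rather than from the paper's bounds):

```python
import numpy as np
from scipy.fft import dct
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, s = 128, 4
block_size, blocks_kept = 4, 16  # 32 predefined blocks of 4 rows; keep 16

x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)

# predefined blocks: consecutive groups of rows of an orthonormal DCT
Psi = dct(np.eye(n), norm="ortho", axis=0)
blocks = np.arange(n).reshape(-1, block_size)
kept = rng.choice(len(blocks), size=blocks_kept, replace=False)
A = Psi[blocks[kept].ravel()]  # m = blocks_kept * block_size rows
b = A @ x

# sparse reconstruction by l1-minimization over the block-sampled system
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b)
z = res.x[:n] - res.x[n:]
```

    The point of the block model is that rows are drawn block-by-block rather than one at a time, which matches what devices like parallel MRI coils actually acquire.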

    Necessary and sufficient conditions of solution uniqueness in $\ell_1$ minimization

    This paper shows that the solutions to various convex $\ell_1$ minimization problems are unique if and only if a common set of conditions is satisfied. This result applies broadly to the basis pursuit model, the basis pursuit denoising model, the Lasso model, as well as other $\ell_1$ models that either minimize $f(Ax-b)$ or impose the constraint $f(Ax-b)\leq\sigma$, where $f$ is a strictly convex function. For these models, this paper proves that, given a solution $x^*$ and defining $I=\mathrm{supp}(x^*)$ and $s=\mathrm{sign}(x^*_I)$, $x^*$ is the unique solution if and only if $A_I$ has full column rank and there exists $y$ such that $A_I^Ty=s$ and $|a_i^Ty|<1$ for $i\notin I$. This condition was previously known to be sufficient for the basis pursuit model to have a unique solution supported on $I$. Indeed, it is also necessary, and applies to a variety of other $\ell_1$ models. The paper also discusses ways to recognize unique solutions and verify the uniqueness conditions numerically. Comment: 6 pages; revised version; submitted
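    The two conditions can be checked numerically along these lines (a sketch, not the paper's verification procedure: the minimum-norm $y$ solving $A_I^Ty=s$ is only one candidate certificate, so if the strict inequality fails for it, uniqueness is not disproved - a different $y$ might still exist and could be sought, e.g., by linear programming; the matrix and support below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 60, 120
A = rng.standard_normal((m, n)) / np.sqrt(m)

x = np.zeros(n)
x[[5, 40, 77]] = [1.5, -2.0, 0.7]  # a hypothetical solution x*
I = np.flatnonzero(x)              # I = supp(x*)
sgn = np.sign(x[I])                # s = sign(x*_I)

# condition 1: A_I has full column rank
A_I = A[:, I]
full_rank = np.linalg.matrix_rank(A_I) == len(I)

# condition 2: a candidate dual certificate, the minimum-norm y with A_I^T y = s
y = A_I @ np.linalg.solve(A_I.T @ A_I, sgn)
off = np.setdiff1d(np.arange(n), I)
strict = np.max(np.abs(A[:, off].T @ y)) < 1.0  # |a_i^T y| < 1 off the support

unique = full_rank and strict
```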

    RIPless compressed sensing from anisotropic measurements

    Compressed sensing is the art of reconstructing a sparse vector from its inner products with respect to a small set of randomly chosen measurement vectors. It is usually assumed that the ensemble of measurement vectors is in isotropic position, in the sense that the associated covariance matrix is proportional to the identity matrix. In this paper, we establish bounds on the number of required measurements in the anisotropic case, where the ensemble of measurement vectors possesses a non-trivial covariance matrix. Essentially, we find that the required sampling rate grows proportionally to the condition number of the covariance matrix. In contrast to other recent contributions to this problem, our arguments do not rely on any restricted isometry properties (RIPs), but rather on ideas from convex geometry which have been systematically studied in the theory of low-rank matrix recovery. This allows for a simple argument and slightly improved bounds, but may lead to a worse dependency on noise (which we do not consider in the present paper). Comment: 19 pages. To appear in Linear Algebra and its Applications, Special Issue on Sparse Approximate Solution of Linear Systems
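    An anisotropic measurement ensemble can be sketched as follows (a toy illustration under assumed parameters: rows drawn as $a_i \sim N(0,\Sigma)$ with a diagonal $\Sigma$ of condition number 4, and a measurement count chosen generously rather than from the paper's bounds, which scale with that condition number):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, s, m = 64, 3, 50

x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)

# anisotropic ensemble: rows a_i ~ N(0, Sigma) with a non-trivial covariance
eigvals = np.linspace(0.25, 1.0, n)       # eigenvalues of Sigma
kappa = eigvals.max() / eigvals.min()     # condition number of Sigma = 4
Sigma_half = np.diag(np.sqrt(eigvals))
A = rng.standard_normal((m, n)) @ Sigma_half / np.sqrt(m)
b = A @ x

# sparse recovery by l1-minimization despite the anisotropy
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b)
z = res.x[:n] - res.x[n:]
```

    With this mild anisotropy recovery still succeeds; the abstract's point is that the required m grows as the covariance becomes more ill-conditioned.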