
    New Bounds for Restricted Isometry Constants

    In this paper we show that if the restricted isometry constant $\delta_k$ of the compressed sensing matrix satisfies $\delta_k < 0.307$, then $k$-sparse signals are guaranteed to be recovered exactly via $\ell_1$ minimization when no noise is present, and $k$-sparse signals can be estimated stably in the noisy case. It is also shown that the bound cannot be substantively improved. An explicit example is constructed in which $\delta_k = \frac{k-1}{2k-1} < 0.5$, but it is impossible to recover certain $k$-sparse signals.
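    The $\ell_1$ recovery discussed in this abstract can be sketched numerically. Below is a minimal, illustrative basis-pursuit experiment; the problem sizes, the random seed, and the reduction to a linear program solved with `scipy.optimize.linprog` are my own choices for illustration, not taken from the paper:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 50, 25, 3  # signal length, measurements, sparsity (illustrative sizes)

# k-sparse ground-truth signal
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)

# Gaussian measurement matrix and noiseless measurements y = A x
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

# Basis pursuit: min ||x||_1 s.t. Ax = y, rewritten as an LP with
# x = u - v, u >= 0, v >= 0, and objective sum(u) + sum(v)
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

recovery_error = np.max(np.abs(x_hat - x_true))
print(recovery_error)  # tiny when l1 minimization recovers x_true exactly
```

    With $m$ comfortably larger than the sparsity $k$, Gaussian matrices typically satisfy the RIC conditions under which such exact recovery is guaranteed.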

    Limits on Sparse Data Acquisition: RIC Analysis of Finite Gaussian Matrices

    One of the key issues in the acquisition of sparse data by means of compressed sensing (CS) is the design of the measurement matrix. Gaussian matrices have been proven to be information-theoretically optimal in terms of minimizing the required number of measurements for sparse recovery. In this paper we provide a new approach for the analysis of the restricted isometry constant (RIC) of finite dimensional Gaussian measurement matrices. The proposed method relies on the exact distributions of the extreme eigenvalues for Wishart matrices. First, we derive the probability that the restricted isometry property is satisfied for a given sufficient recovery condition on the RIC, and propose a probabilistic framework to study both the symmetric and asymmetric RICs. Then, we analyze the recovery of compressible signals in noise through the statistical characterization of stability and robustness. The presented framework determines limits on various sparse recovery algorithms for finite size problems. In particular, it provides a tight lower bound on the maximum sparsity order of the acquired data allowing signal recovery with a given target probability. Also, we derive simple approximations for the RICs based on the Tracy-Widom distribution. Comment: 11 pages, 6 figures, accepted for publication in IEEE Transactions on Information Theory
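    The connection between the RIC and the extreme eigenvalues of Wishart-type matrices can be illustrated with a crude Monte Carlo estimate: for each random support $S$ of size $k$, the eigenvalues of $A_S^\top A_S$ measure how far $A$ is from an isometry on that support. The sketch below (sizes, seed, and trial count are my own illustrative choices; it gives only an empirical lower bound on $\delta_k$, not the paper's exact distributional analysis) is:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k, trials = 64, 128, 4, 200

# Gaussian measurement matrix, columns scaled so E[A_S^T A_S] = I
A = rng.standard_normal((m, n)) / np.sqrt(m)

# Monte Carlo lower bound on the symmetric RIC delta_k: over random
# supports S, record the worst deviation of the eigenvalues of the
# k x k Wishart-type matrix A_S^T A_S from 1.
delta_lb = 0.0
for _ in range(trials):
    S = rng.choice(n, size=k, replace=False)
    eigs = np.linalg.eigvalsh(A[:, S].T @ A[:, S])
    delta_lb = max(delta_lb, max(1.0 - eigs.min(), eigs.max() - 1.0))

print(round(delta_lb, 3))  # empirical lower bound on delta_k
```

    Enumerating all $\binom{n}{k}$ supports would give $\delta_k$ exactly but is infeasible at realistic sizes, which is why distributional results for Wishart extreme eigenvalues, as in this paper, are useful.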

    Sparse approximation property and stable recovery of sparse signals from noisy measurements

    In this paper, we introduce a sparse approximation property of order $s$ for a measurement matrix ${\bf A}$: $\|{\bf x}_s\|_2 \le D \|{\bf A}{\bf x}\|_2 + \beta \frac{\sigma_s({\bf x})}{\sqrt{s}}$ for all ${\bf x}$, where ${\bf x}_s$ is the best $s$-sparse approximation of the vector ${\bf x}$ in $\ell^2$, $\sigma_s({\bf x})$ is the $s$-sparse approximation error of the vector ${\bf x}$ in $\ell^1$, and $D$ and $\beta$ are positive constants. The sparse approximation property for a measurement matrix can be thought of as a weaker version of its restricted isometry property and a stronger version of its null space property. In this paper, we show that the sparse approximation property is an appropriate condition on a measurement matrix under which to consider stable recovery of any compressible signal from its noisy measurements. In particular, we show that any compressible signal can be stably recovered from its noisy measurements via solving an $\ell^1$-minimization problem if the measurement matrix has the sparse approximation property with $\beta \in (0,1)$, and conversely the measurement matrix has the sparse approximation property with $\beta \in (0,\infty)$ if any compressible signal can be stably recovered from its noisy measurements via solving an $\ell^1$-minimization problem. Comment: To appear in IEEE Trans. Signal Processing, 201
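    The two quantities on either side of the inequality in this abstract are easy to compute for a concrete vector: the best $s$-sparse approximation ${\bf x}_s$ keeps the $s$ largest-magnitude entries, and $\sigma_s({\bf x})$ is the $\ell^1$ norm of the discarded tail. A small sketch (the example vector is my own, purely for illustration):

```python
import numpy as np

def best_s_sparse(x, s):
    """Best s-sparse approximation in l2: keep the s largest-magnitude entries."""
    xs = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    xs[idx] = x[idx]
    return xs

def sigma_s(x, s):
    """s-sparse approximation error in l1: l1 norm of the discarded entries."""
    return np.sum(np.abs(x - best_s_sparse(x, s)))

x = np.array([5.0, -3.0, 0.5, 0.1, -0.2])
print(best_s_sparse(x, 2))  # keeps 5.0 and -3.0, zeros the rest
print(sigma_s(x, 2))        # |0.5| + |0.1| + |-0.2| = 0.8
```

    For exactly $s$-sparse vectors, $\sigma_s({\bf x}) = 0$ and the property reduces to a null-space-style bound $\|{\bf x}\|_2 \le D\|{\bf A}{\bf x}\|_2$, which is one way to see why it sits between the restricted isometry property and the null space property.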