5,387 research outputs found

    Sparsity Order Estimation from a Single Compressed Observation Vector

    We investigate the problem of estimating the unknown degree of sparsity from compressive measurements without the need to carry out a sparse recovery step. While the sparsity order can be directly inferred from the effective rank of the observation matrix in the multiple snapshot case, this appears to be impossible in the more challenging single snapshot case. We show that specially designed measurement matrices allow the measurement vector to be rearranged into a matrix whose effective rank coincides with the effective sparsity order. In fact, we prove that matrices composed of a Khatri-Rao product of smaller matrices generate measurements from which the sparsity order can be inferred. Moreover, if some samples are used more than once, one of the matrices needs to be Vandermonde. These structural constraints reduce the degrees of freedom in choosing the measurement matrix, which may incur a degradation in the achievable coherence. We therefore also address suitable choices of the measurement matrices. In particular, we analyze Khatri-Rao and Vandermonde matrices in terms of their coherence and provide a new design for Vandermonde matrices that achieves a low coherence.
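    A minimal numpy sketch of the rank idea described above (the Gaussian sub-matrices, dimensions, and rank threshold are illustrative assumptions, not the paper's exact estimator or matrix design): with a Khatri-Rao structured measurement matrix, the single observation vector can be reshaped into a matrix whose effective rank reveals the sparsity order.

```python
import numpy as np

rng = np.random.default_rng(0)
m1, m2, n, k = 8, 10, 64, 3          # assumed sub-matrix sizes, signal length, sparsity

B = rng.standard_normal((m1, n))     # illustrative generic sub-matrices; the paper
C = rng.standard_normal((m2, n))     # also considers Vandermonde factors

# Column-wise Khatri-Rao product: column i of A is kron(B[:, i], C[:, i])
A = np.einsum('ik,jk->ijk', B, C).reshape(m1 * m2, n)

x = np.zeros(n)                      # k-sparse test signal
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

y = A @ x                            # single compressed observation vector

# Rearranging y into an m1 x m2 matrix gives Y = B @ diag(x) @ C.T,
# whose rank (number of significant singular values) equals the sparsity order.
Y = y.reshape(m1, m2)
sv = np.linalg.svd(Y, compute_uv=False)
k_hat = int(np.sum(sv > 1e-8 * sv[0]))
print(k_hat)                         # 3
```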

    Deterministic Constructions of Binary Measurement Matrices from Finite Geometry

    Deterministic constructions of measurement matrices in compressed sensing (CS) are considered in this paper. The constructions are inspired by the recent discovery of Dimakis, Smarandache and Vontobel that parity-check matrices of good low-density parity-check (LDPC) codes can be used as provably good measurement matrices for compressed sensing under ℓ1-minimization. The performance of the proposed binary measurement matrices is mainly analyzed theoretically with the help of analysis methods and results from (finite geometry) LDPC codes. In particular, several lower bounds on the spark (i.e., the smallest number of columns that are linearly dependent, which completely characterizes the recovery performance of ℓ0-minimization) of general binary matrices and finite geometry matrices are obtained, and they improve the previously known results in most cases. Simulation results show that the proposed matrices perform comparably to, and sometimes even better than, the corresponding Gaussian random matrices. Moreover, the proposed matrices are sparse, binary, and most of them have a cyclic or quasi-cyclic structure, which makes hardware realization convenient and easy.
    Comment: 12 pages, 11 figures
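    As a point of reference for the spark bounds discussed above, the sketch below applies the generic coherence-based bound spark(A) ≥ 1 + 1/μ(A) to a small quasi-cyclic binary matrix built from circulant permutation blocks. The block sizes and the array-type construction are illustrative assumptions, not one of the paper's finite geometry constructions (whose bounds are tighter).

```python
import numpy as np

def circulant_shift(p, s):
    """p x p circulant permutation matrix: identity with columns cyclically shifted by s."""
    return np.roll(np.eye(p, dtype=int), s, axis=1)

p, rows, cols = 7, 3, 5                      # assumed block parameters (p prime)
H = np.block([[circulant_shift(p, (i * j) % p) for j in range(cols)]
              for i in range(rows)])         # (rows*p) x (cols*p) binary matrix

# Coherence of the l2-normalized columns
A = H / np.linalg.norm(H, axis=0)
G = np.abs(A.T @ A)
np.fill_diagonal(G, 0.0)
mu = G.max()

print(f"coherence mu = {mu:.3f}, spark(H) >= {1 + 1/mu:.1f}")
```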

    Structured random measurements in signal processing

    Compressed sensing and its extensions have recently triggered interest in randomized signal acquisition. A key finding is that random measurements provide sparse signal reconstruction guarantees for efficient and stable algorithms with a minimal number of samples. While this was first shown for (unstructured) Gaussian random measurement matrices, applications require certain structure in the measurements, leading to structured random measurement matrices. Near-optimal recovery guarantees for such structured measurements have been developed over the past years in a variety of contexts. This article surveys the theory in three scenarios: compressed sensing (sparse recovery), low-rank matrix recovery, and phaseless estimation. The random measurement matrices considered include random partial Fourier matrices, partial random circulant matrices (subsampled convolutions), matrix completion, and phase estimation from magnitudes of Fourier-type measurements. The article concludes with a brief discussion of the mathematical techniques for the analysis of such structured random measurements.
    Comment: 22 pages, 2 figures
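    As a small illustration of one structured measurement model from this survey, the sketch below applies a partial random circulant matrix (a subsampled convolution) to a sparse test signal via the FFT; the Rademacher generator, dimensions, and random row subset are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 128, 40, 5                          # signal length, measurements, sparsity

g = rng.choice([-1.0, 1.0], size=n)           # random generator of the circulant matrix
omega = rng.choice(n, size=m, replace=False)  # random subset of rows (samples) to keep

def partial_circulant(x):
    """Subsampled circular convolution y = (g * x)[omega], computed with the FFT."""
    full = np.fft.ifft(np.fft.fft(g) * np.fft.fft(x)).real
    return full[omega]

x = np.zeros(n)                               # k-sparse test signal
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

y = partial_circulant(x)                      # m structured random measurements of x
print(y.shape)                                # (40,)
```

    Recovering x from y would then proceed with a standard sparse recovery algorithm such as ℓ1-minimization, for which the survey discusses near-optimal guarantees.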