6 research outputs found

    On Sparse Vector Recovery Performance in Structurally Orthogonal Matrices via LASSO

    In this paper, we consider the compressed sensing problem of reconstructing a sparse signal from an undersampled set of noisy linear measurements. The regularized least squares or least absolute shrinkage and selection operator (LASSO) formulation is used for signal estimation. The measurement matrix is assumed to be constructed by concatenating several random orthogonal bases, which we refer to as structurally orthogonal matrices. Such a measurement matrix is highly relevant to large-scale compressed sensing applications because it facilitates rapid computation and parallel processing. Using the replica method from statistical physics, we derive the mean-squared-error (MSE) formula of reconstruction over the structurally orthogonal matrix in the large-system regime. Extensive numerical experiments are provided to verify the analytical result. We then use the analytical result to investigate the MSE behavior of the LASSO over structurally orthogonal matrices, with an emphasis on performance comparisons with matrices having independent and identically distributed (i.i.d.) Gaussian entries. We find that structurally orthogonal matrices are at least as good as their i.i.d. Gaussian counterparts. Thus, the use of structurally orthogonal matrices is attractive in practical applications.
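    The setup described above can be illustrated with a small numerical sketch: build a measurement matrix by concatenating randomly drawn orthogonal bases, then recover a sparse vector with the LASSO. The Python/NumPy snippet below is a minimal illustration, not the paper's analysis; the block count, dimensions, sparsity, noise level, and regularization parameter are arbitrary choices, and the LASSO is solved with plain ISTA rather than any solver used by the authors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: n measurements, N = L * n unknowns built from L orthogonal blocks.
n, L, k = 128, 4, 16
N = L * n

# Structurally orthogonal measurement matrix: concatenate L random orthogonal bases.
blocks = [np.linalg.qr(rng.standard_normal((n, n)))[0] for _ in range(L)]
A = np.hstack(blocks) / np.sqrt(L)          # scale so each row has unit norm

# k-sparse ground truth and noisy measurements y = A x + w.
x_true = np.zeros(N)
x_true[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)
sigma = 0.05
y = A @ x_true + sigma * rng.standard_normal(n)

# LASSO via ISTA: minimize 0.5 * ||y - A x||^2 + lam * ||x||_1.
def ista(A, y, lam, n_iter=500):
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - y))                        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x

x_hat = ista(A, y, lam=0.05)
print("per-coefficient MSE:", np.mean((x_hat - x_true) ** 2))
```

    Here the orthogonal blocks are formed by explicit QR factorizations, so A is applied as a dense matrix; with structured bases such as DFT or Hadamard matrices the products A @ x and A.T @ y could instead be applied in O(N log N), which is the computational advantage the abstract alludes to.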

    On the Performance of Turbo Signal Recovery with Partial DFT Sensing Matrices

    This letter studies the performance of the turbo signal recovery (TSR) algorithm for compressed sensing with partial discrete Fourier transform (DFT) sensing matrices. Based on state evolution analysis, we prove that TSR with a partial DFT sensing matrix outperforms the well-known approximate message passing (AMP) algorithm with an independent and identically distributed (IID) sensing matrix.
    Comment: to appear in IEEE Signal Processing Letters
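    As a concrete illustration of the sensing model, the sketch below (an assumed setup, not the letter's experiments) forms a partial DFT operator by keeping a random subset of rows of the unitary DFT, applies it via the FFT, and checks the row-orthogonality property A A^H = I that turbo-type recovery exploits. The TSR and AMP recovery iterations themselves are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes: length-N signal, n measurements, k nonzeros.
N, n, k = 512, 128, 16
rows = rng.choice(N, size=n, replace=False)   # random row subset of the unitary DFT

def sense(x):
    """y = S F x: unitary DFT followed by row selection, applied in O(N log N)."""
    return np.fft.fft(x, norm="ortho")[rows]

def sense_adjoint(y):
    """A^H y = F^H S^H y: zero-fill the kept rows, then inverse unitary DFT."""
    full = np.zeros(N, dtype=complex)
    full[rows] = y
    return np.fft.ifft(full, norm="ortho")

# Sparse complex signal and noisy measurements.
x_true = np.zeros(N, dtype=complex)
support = rng.choice(N, size=k, replace=False)
x_true[support] = rng.standard_normal(k) + 1j * rng.standard_normal(k)
y = sense(x_true) + 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# Row orthogonality of the partial DFT: A A^H = I_n, so sense(sense_adjoint(y)) == y.
print(np.allclose(sense(sense_adjoint(y)), y))   # True
```

    This orthogonality, together with the fast FFT-based application, is the structural property that the letter's state-evolution analysis leverages when comparing TSR against AMP with an IID sensing matrix.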

    Isotropically Random Orthogonal Matrices: Performance of LASSO and Minimum Conic Singular Values

    Recently, the precise performance of the Generalized LASSO algorithm for recovering structured signals from compressed noisy measurements obtained via i.i.d. Gaussian matrices has been characterized. The analysis is based on a framework introduced by Stojnic and relies heavily on Gordon's Gaussian min-max theorem (GMT), a comparison principle for Gaussian processes. As a result, corresponding characterizations for other ensembles of measurement matrices have not been developed. In this work, we analyze the corresponding performance of the ensemble of isotropically random orthogonal (i.r.o.) measurements. We consider the constrained version of the Generalized LASSO and derive a sharp characterization of its normalized squared error in the large-system limit. Our result analytically confirms that the i.r.o. ensemble outperforms its Gaussian counterpart. Our second result derives an asymptotic lower bound on the minimum conic singular values of i.r.o. matrices; this bound is larger than the corresponding bound for Gaussian matrices. To prove our results, we express i.r.o. matrices in terms of Gaussians and show that, with some modifications, the GMT framework is still applicable.
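    The comparison between the i.r.o. and Gaussian ensembles can also be probed empirically at small scale. The sketch below is an illustrative experiment under assumed parameters, not the paper's asymptotic GMT analysis: it draws one isotropically random (Haar) partial orthogonal matrix and one i.i.d. Gaussian matrix with matched scaling, solves the (unconstrained) LASSO with ISTA for each, and reports the normalized squared error.

```python
import numpy as np

rng = np.random.default_rng(2)
n, N, k, sigma, lam = 128, 512, 16, 0.05, 0.05   # illustrative parameters

def ista(A, y, lam, n_iter=1000):
    """Plain ISTA for 0.5 * ||y - A x||^2 + lam * ||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - y))
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x

# i.r.o. ensemble: n rows of a Haar-distributed N x N orthogonal matrix.
G = rng.standard_normal((N, N))
Q, R = np.linalg.qr(G)
Q = Q * np.sign(np.diag(R))                 # sign fix so Q is Haar distributed
A_iro = np.sqrt(N / n) * Q[:n, :]           # scaled so E||A x||^2 = ||x||^2

# Gaussian ensemble with matched scaling: i.i.d. N(0, 1/n) entries.
A_gauss = rng.standard_normal((n, N)) / np.sqrt(n)

# Common k-sparse signal; a separate noise draw for each ensemble.
x_true = np.zeros(N)
x_true[rng.choice(N, size=k, replace=False)] = rng.standard_normal(k)

for name, A in [("i.r.o.", A_iro), ("Gaussian", A_gauss)]:
    y = A @ x_true + sigma * rng.standard_normal(n)
    x_hat = ista(A, y, lam)
    nse = np.sum((x_hat - x_true) ** 2) / np.sum(x_true ** 2)
    print(f"{name}: normalized squared error = {nse:.4f}")
```

    On any single small instance either ensemble may come out ahead; the paper's contribution is the sharp large-system characterization showing the i.r.o. ensemble is superior.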
