Asymptotic Analysis of Complex LASSO via Complex Approximate Message Passing (CAMP)
Recovering a sparse signal from an undersampled set of random linear
measurements is the main problem of interest in compressed sensing. In this
paper, we consider the case where both the signal and the measurements are
complex. We study the popular reconstruction method of ℓ1-regularized
least squares, or LASSO. While several studies have shown that the LASSO
algorithm offers desirable solutions under certain conditions, the precise
asymptotic performance of this algorithm in the complex setting is not yet
known. In this paper, we extend the approximate message passing (AMP) algorithm
to the complex signals and measurements and obtain the complex approximate
message passing algorithm (CAMP). We then generalize the state evolution
framework recently introduced for the analysis of AMP, to the complex setting.
Using the state evolution, we derive accurate formulas for the phase transition
and noise sensitivity of both LASSO and CAMP.
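The CAMP-style iteration described above, AMP with a complex soft-thresholding nonlinearity and an Onsager correction, can be sketched roughly as follows. The threshold policy (proportional to the residual norm) and the activity-fraction approximation of the Onsager term are illustrative assumptions, not the paper's exact derivation.

```python
import numpy as np

def complex_soft_threshold(u, tau):
    """Complex soft thresholding: shrink the magnitude, keep the phase."""
    mag = np.abs(u)
    scale = np.maximum(mag - tau, 0.0) / np.where(mag > 0, mag, 1.0)
    return u * scale

def camp(y, A, alpha=2.0, iters=50):
    """A minimal CAMP-style sketch. The threshold is set to alpha times
    an empirical residual-noise estimate, and the Onsager term is
    approximated by the fraction of active coordinates (assumptions
    made for illustration)."""
    n, N = A.shape
    x = np.zeros(N, dtype=complex)
    z = y.astype(complex).copy()
    for _ in range(iters):
        tau = alpha * np.linalg.norm(z) / np.sqrt(n)
        u = x + A.conj().T @ z          # pseudo-data: estimate + matched filter
        x_new = complex_soft_threshold(u, tau)
        onsager = (np.count_nonzero(x_new) / n) * z
        z = y - A @ x_new + onsager     # corrected residual
        x = x_new
    return x

# small demo: recover a sparse complex signal from noiseless measurements
rng = np.random.default_rng(0)
N, n, k = 200, 100, 5
A = (rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N))) / np.sqrt(2 * n)
x0 = np.zeros(N, dtype=complex)
idx = rng.choice(N, k, replace=False)
x0[idx] = rng.standard_normal(k) + 1j * rng.standard_normal(k)
y = A @ x0
x_hat = camp(y, A)
err = np.linalg.norm(x_hat - x0) / np.linalg.norm(x0)
```

The adaptive threshold makes the iteration self-stabilizing: a large residual raises the threshold, which in turn suppresses spurious coordinates.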
Maximin Analysis of Message Passing Algorithms for Recovering Block Sparse Signals
We consider the problem of recovering a block (or group) sparse signal from
an underdetermined set of random linear measurements, which appear in
compressed sensing applications such as radar and imaging. Recent results of
Donoho, Johnstone, and Montanari have shown that approximate message passing
(AMP) in combination with Stein's shrinkage outperforms group LASSO for large
block sizes. In this paper, we prove that, for a fixed block size and in the
strong undersampling regime (i.e., having very few measurements compared to the
ambient dimension), AMP cannot improve upon group LASSO, thereby complementing
the results of Donoho et al.
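The block (group) soft-thresholding operator that underlies the group-LASSO side of this comparison shrinks each block by its Euclidean norm rather than shrinking coordinates individually. A minimal sketch (block size and test vector are illustrative assumptions):

```python
import numpy as np

def block_soft_threshold(u, tau, B):
    """Group soft thresholding: shrink each length-B block of u by tau
    in Euclidean norm, zeroing blocks whose norm falls below tau."""
    v = u.reshape(-1, B)
    norms = np.linalg.norm(v, axis=1, keepdims=True)
    # guard against division by zero; zero-norm blocks are zero anyway
    scale = np.maximum(1.0 - tau / np.where(norms > 0, norms, 1.0), 0.0)
    return (v * scale).reshape(-1)

u = np.array([3.0, 4.0, 0.1, -0.1])   # two blocks of size 2
out = block_soft_threshold(u, tau=1.0, B=2)
# first block has norm 5, so it is scaled by (1 - 1/5) = 0.8 -> [2.4, 3.2]
# second block has norm ~0.14 < 1, so it is zeroed
```

The key contrast with coordinatewise soft thresholding is that a block survives or dies as a unit, which is what exploits the block-sparsity structure.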
On Sparse Vector Recovery Performance in Structurally Orthogonal Matrices via LASSO
In this paper, we consider the compressed sensing problem of reconstructing a sparse signal from an undersampled set of noisy linear measurements. The ℓ1-regularized least squares or least absolute shrinkage and selection operator (LASSO) formulation is used for signal estimation. The measurement matrix is assumed to be constructed by concatenating several randomly orthogonal bases, which we refer to as structurally orthogonal matrices. Such a measurement matrix is highly relevant to large-scale compressive sensing applications because it facilitates rapid computation and parallel processing. Using the replica method from statistical physics, we derive the mean-squared-error (MSE) formula of reconstruction over the structurally orthogonal matrix in the large-system regime. Extensive numerical experiments are provided to verify the analytical result. We then use the analytical result to investigate the MSE behavior of the LASSO over the structurally orthogonal matrix, with an emphasis on performance comparisons with matrices with independent and identically distributed (i.i.d.) Gaussian entries. We find that structurally orthogonal matrices are at least as good as their i.i.d. Gaussian counterparts. Thus, the use of structurally orthogonal matrices is attractive in practical applications.
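One way to form the structurally orthogonal measurement matrices described above is to concatenate several random orthogonal bases. In the sketch below each basis is drawn via QR factorization of a Gaussian matrix, which is an illustrative assumption; in practice one would typically use randomized fast transforms (e.g. DCT with random signs and permutations) to get the rapid-computation benefit the abstract mentions.

```python
import numpy as np

def structurally_orthogonal(n, L, rng):
    """Concatenate L random n x n orthogonal bases into an n x (L*n)
    measurement matrix. Drawing each basis from the QR factorization
    of a Gaussian matrix is an illustrative choice."""
    blocks = []
    for _ in range(L):
        Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
        blocks.append(Q)
    return np.concatenate(blocks, axis=1)

rng = np.random.default_rng(1)
A = structurally_orthogonal(64, 3, rng)
first = A[:, :64]
# each n x n block is orthogonal: first.T @ first is the identity
```

Because each block is orthogonal, applying A or its transpose reduces to L independent orthogonal transforms, which is what enables parallel processing in the large-scale setting.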