On the Performance of Turbo Signal Recovery with Partial DFT Sensing Matrices
This letter studies the performance of the turbo signal recovery (TSR)
algorithm for compressed sensing with partial discrete Fourier transform
(DFT) sensing matrices. Based on a state evolution analysis, we prove that TSR with a
partial DFT sensing matrix outperforms the well-known approximate message
passing (AMP) algorithm with an independent identically distributed (IID)
sensing matrix. Comment: to appear in IEEE Signal Processing Letters
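As context for the comparison above, the baseline AMP algorithm with an IID Gaussian sensing matrix can be sketched as follows. This is a generic illustration, not the TSR algorithm from the letter; the problem sizes, the threshold schedule (twice the estimated noise level), and the noiseless setting are arbitrary assumptions made for the sketch:

```python
import numpy as np

def soft_threshold(x, tau):
    """Elementwise soft-thresholding, the denoiser AMP uses for sparse signals."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def amp(A, y, n_iter=30):
    """Basic AMP for y = A x + w with an IID Gaussian sensing matrix A (m x n)."""
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()  # residual, kept approximately Gaussian by the Onsager term
    for _ in range(n_iter):
        tau = 2.0 * np.sqrt(np.mean(z ** 2))  # threshold = 2x estimated noise level
        r = x + A.T @ z                       # pseudo-data: x plus ~Gaussian noise
        x_new = soft_threshold(r, tau)
        # Onsager correction: residual scaled by the fraction of active coordinates
        onsager = (z / m) * np.count_nonzero(x_new)
        z = y - A @ x_new + onsager
        x = x_new
    return x

rng = np.random.default_rng(0)
n, m, k = 400, 200, 20                                 # signal length, measurements, sparsity
A = rng.normal(scale=1.0 / np.sqrt(m), size=(m, n))    # IID sensing matrix
x_true = np.zeros(n)
x_true[:k] = rng.normal(size=k)
y = A @ x_true                                         # noiseless measurements
x_hat = amp(A, y)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Replacing the IID matrix here with a partial DFT matrix is precisely the setting where the letter's state evolution analysis shows TSR doing better.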
Mutual information for symmetric rank-one matrix estimation: A proof of the replica formula
Factorizing low-rank matrices has many applications in machine learning and
statistics. For probabilistic models in the Bayes optimal setting, a general
expression for the mutual information has been proposed using heuristic
statistical physics computations, and proven in a few specific cases. Here, we
show how to rigorously prove the conjectured formula for the symmetric rank-one
case. This allows us to express the minimal mean-square error and to characterize
the detectability phase transitions in a large set of estimation problems
ranging from community detection to sparse PCA. We also show that for a large
set of parameters, an iterative algorithm called approximate message-passing is
Bayes optimal. There remains, however, a gap between what currently known
polynomial-time algorithms can do and what is expected information-theoretically.
Additionally, the proof technique has an interest of its own and exploits three
essential ingredients: the interpolation method introduced in statistical
physics by Guerra, the analysis of the approximate message-passing algorithm
and the theory of spatial coupling and threshold saturation in coding. Our
approach is generic and applicable to other open problems in statistical
estimation where heuristic statistical physics predictions are available.
Asymptotic Analysis of Complex LASSO via Complex Approximate Message Passing (CAMP)
Recovering a sparse signal from an undersampled set of random linear
measurements is the main problem of interest in compressed sensing. In this
paper, we consider the case where both the signal and the measurements are
paper, we consider the case where both the signal and the measurements are
complex. We study the popular reconstruction method of ℓ1-regularized
least squares, or LASSO. While several studies have shown that the LASSO
algorithm offers desirable solutions under certain conditions, the precise
asymptotic performance of this algorithm in the complex setting is not yet
known. In this paper, we extend the approximate message passing (AMP) algorithm
to complex signals and measurements, obtaining the complex approximate
message passing (CAMP) algorithm. We then generalize the state evolution
framework, recently introduced for the analysis of AMP, to the complex setting.
Using state evolution, we derive accurate formulas for the phase transition
and noise sensitivity of both LASSO and CAMP.
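The key change in extending AMP to the complex setting is replacing the real soft-thresholding denoiser with its complex counterpart, which shrinks the magnitude of each entry while preserving its phase. A minimal sketch (the function name and the test values are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def complex_soft_threshold(z, tau):
    """Shrink |z| by tau and keep the phase: eta(z) = (z/|z|) * max(|z| - tau, 0)."""
    mag = np.abs(z)
    # Guard the division at z = 0; those entries are mapped to 0 regardless.
    scale = np.where(mag > tau, (mag - tau) / np.maximum(mag, 1e-300), 0.0)
    return z * scale

z = np.array([3 + 4j, 0.5 + 0.5j, -2j])
print(complex_soft_threshold(z, 1.0))
```

Note that, unlike applying real soft-thresholding separately to the real and imaginary parts, this operation is phase-equivariant, which is what makes the complex state evolution analysis tractable.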
Dynamical Functional Theory for Compressed Sensing
We introduce a theoretical approach for designing generalizations of the
approximate message passing (AMP) algorithm for compressed sensing which are
valid for large observation matrices that are drawn from an invariant random
matrix ensemble. By design, the fixed points of the algorithm obey the
Thouless-Anderson-Palmer (TAP) equations corresponding to the ensemble. Using a
dynamical functional approach we are able to derive an effective stochastic
process for the marginal statistics of a single component of the dynamics. This
allows us to design memory terms in the algorithm in such a way that the
resulting fields become Gaussian random variables allowing for an explicit
analysis. The asymptotic statistics of these fields are consistent with the
replica ansatz of the compressed sensing problem. Comment: 5 pages, accepted for ISIT 201