2,873 research outputs found

    Mutual Information and Optimality of Approximate Message-Passing in Random Linear Estimation

    We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projections. A few examples where this problem is relevant are compressed sensing, sparse superposition codes, and code division multiple access. A number of works have considered the mutual information for this problem using the replica method from statistical physics. Here we put these considerations on a firm rigorous basis. First, we show, using a Guerra-Toninelli type interpolation, that the replica formula yields an upper bound on the exact mutual information. Second, for many relevant practical cases, we present a converse lower bound via a method that uses spatial coupling, state evolution analysis and the I-MMSE theorem. This yields a single-letter formula for the mutual information and the minimum mean-square error for random Gaussian linear estimation of all discrete bounded signals. In addition, we prove that the low-complexity approximate message-passing algorithm is optimal outside of the so-called hard phase, in the sense that it asymptotically reaches the minimum mean-square error. In this work spatial coupling is used primarily as a proof technique. However, our results also prove two important features of spatially coupled noisy linear random Gaussian estimation. First, there is no algorithmically hard phase, meaning that for such systems approximate message-passing always reaches the minimum mean-square error. Second, in a proper limit the mutual information associated with such systems is the same as that of uncoupled linear random Gaussian estimation.
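    For illustration, here is a minimal sketch of the kind of AMP iteration the abstract refers to, for y = A x + w with an i.i.d. Gaussian sensing matrix and a ±1 (Rademacher) signal prior. The denoiser, dimensions and noise level are illustrative assumptions, not taken from the paper.

    import numpy as np

    def amp_rademacher(y, A, n_iter=30):
        # AMP sketch for y = A x + w with a +-1 (Rademacher) prior; illustrative only
        m, n = A.shape
        delta = m / n                               # measurement ratio
        x = np.zeros(n)                             # current signal estimate
        z = y.copy()                                # corrected residual
        for _ in range(n_iter):
            tau2 = max(np.mean(z ** 2), 1e-12)      # effective noise variance (state-evolution proxy)
            r = x + A.T @ z                         # pseudo-data: signal plus ~Gaussian noise
            x_new = np.tanh(r / tau2)               # posterior-mean denoiser for a +-1 prior
            onsager = np.mean((1.0 - x_new ** 2) / tau2) / delta
            z = y - A @ x_new + onsager * z         # Onsager correction keeps the field Gaussian
            x = x_new
        return x

    # toy example with illustrative dimensions and noise level
    rng = np.random.default_rng(0)
    n, m = 500, 400
    A = rng.normal(scale=1.0 / np.sqrt(m), size=(m, n))
    x_true = rng.choice([-1.0, 1.0], size=n)
    y = A @ x_true + 0.1 * rng.normal(size=m)
    print("MSE:", np.mean((amp_rademacher(y, A) - x_true) ** 2))

    The Onsager correction in the residual update is what keeps the effective noise approximately Gaussian across iterations, which is what makes the state evolution (tracking tau2) exact in the large-system limit.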

    On Capacity Optimality of OAMP: Beyond IID Sensing Matrices and Gaussian Signaling

    This paper investigates a large unitarily invariant system (LUIS) involving a unitarily invariant sensing matrix, an arbitrarily fixed signal distribution, and forward error control (FEC) coding. A universal Gram-Schmidt orthogonalization is considered for the construction of orthogonal approximate message passing (OAMP), which renders the results applicable to general prototypes without the differentiability restriction. For OAMP with Lipschitz continuous local estimators, we develop two variational single-input-single-output transfer functions, based on which we analyze the achievable rate of OAMP. Furthermore, when the state evolution of OAMP has a unique fixed point, we show that OAMP with matched FEC coding reaches the constrained capacity of the LUIS predicted by the replica method, for an arbitrary signal distribution. The replica method is rigorous for a LUIS with Gaussian signaling and for certain sub-classes of LUIS with arbitrary signal distributions. Several area properties are established based on the variational transfer functions of OAMP. In addition, we elaborate a coding principle for approaching the replica constrained capacity of a LUIS, based on which irregular low-density parity-check (LDPC) codes are optimized for binary signaling in the simulation results. We show that OAMP with the optimized codes yields a significant performance improvement over the un-optimized ones and over the well-known Turbo linear MMSE algorithm. For quadrature phase-shift keying (QPSK) modulation, bit error rate (BER) performance approaching the replica constrained capacity is observed under various channel conditions. Comment: Single column, 34 pages, 9 figures.
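    As a rough illustration of the OAMP structure discussed above (a de-correlated LMMSE linear estimator followed by a divergence-free nonlinear estimator), the sketch below uses a differentiable MMSE denoiser for an uncoded ±1 signal. It does not implement the paper's universal Gram-Schmidt construction or the FEC-coded setting; all names and parameters are illustrative assumptions.

    import numpy as np

    def oamp_pm1(y, A, sigma2, n_iter=20):
        # OAMP-style iteration for y = A x + w with a +-1 prior; illustrative sketch only
        m, n = A.shape
        x = np.zeros(n)                              # extrinsic input to the linear estimator
        v = 1.0                                      # its error variance
        AAT = A @ A.T
        for _ in range(n_iter):
            # linear estimator (LE): LMMSE matrix, normalized so that tr(W A) / n = 1
            W_hat = v * A.T @ np.linalg.inv(v * AAT + sigma2 * np.eye(m))
            W = (n / np.trace(W_hat @ A)) * W_hat
            s = x + W @ (y - A @ x)
            B = np.eye(n) - W @ A
            tau2 = (v * np.trace(B @ B.T) + sigma2 * np.trace(W @ W.T)) / n
            # nonlinear estimator (NLE): MMSE denoiser made divergence-free
            eta = np.tanh(s / tau2)                  # posterior mean for a +-1 prior
            mmse = np.mean(1.0 - eta ** 2)           # average posterior variance
            div = mmse / tau2                        # average divergence of the denoiser
            x = (eta - div * s) / (1.0 - div)        # orthogonal (divergence-free) output
            v = max(1.0 / (1.0 / mmse - 1.0 / tau2), 1e-9)
        return eta                                   # soft posterior-mean estimate

    A hard decision np.sign(oamp_pm1(y, A, sigma2)) would correspond to uncoded detection; in the coded setting of the paper the nonlinear estimator is coupled to an FEC decoder, which is not modeled here.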

    Dynamical Functional Theory for Compressed Sensing

    We introduce a theoretical approach for designing generalizations of the approximate message passing (AMP) algorithm for compressed sensing that are valid for large observation matrices drawn from an invariant random matrix ensemble. By design, the fixed points of the algorithm obey the Thouless-Anderson-Palmer (TAP) equations corresponding to the ensemble. Using a dynamical functional approach, we derive an effective stochastic process for the marginal statistics of a single component of the dynamics. This allows us to design memory terms in the algorithm in such a way that the resulting fields become Gaussian random variables, allowing for an explicit analysis. The asymptotic statistics of these fields are consistent with the replica ansatz of the compressed sensing problem. Comment: 5 pages, accepted for ISIT 201
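    Schematically, the resulting algorithms have the shape sketched below: the field passed to a component-wise (TAP-style) denoiser is corrected with memory terms built from all previous residuals. The coefficients gamma[t][s] and the denoiser are placeholders here; in the paper they are fixed by the dynamical functional analysis of the invariant matrix ensemble so that the fields are asymptotically Gaussian.

    import numpy as np

    def memory_amp_sketch(y, A, denoiser, gamma, n_iter=10):
        # Schematic AMP-like iteration with memory terms; coefficients are placeholders
        m, n = A.shape
        x_hist = [np.zeros(n)]                       # past estimates x^0, x^1, ...
        z_hist = []                                  # past corrected residuals
        for t in range(n_iter):
            x_t = x_hist[-1]
            z_t = y - A @ x_t                        # plain residual
            for s, z_s in enumerate(z_hist):
                z_t = z_t + gamma[t][s] * z_s        # memory terms over all past residuals
            z_hist.append(z_t)
            x_hist.append(denoiser(x_t + A.T @ z_t)) # component-wise denoising of the field
        return x_hist[-1]

    Standard AMP for an i.i.d. Gaussian matrix is recovered when gamma[t][s] vanishes except for s = t-1, where it equals the usual Onsager coefficient; for general invariant ensembles the whole history enters.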