We consider the estimation of a signal from the knowledge of its noisy linear
random Gaussian projections. A few examples where this problem is relevant are
compressed sensing, sparse superposition codes, and code division multiple
access. A number of works have considered the mutual information for this
problem using the replica method from statistical physics.
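For concreteness, the measurement model in question can be summarized, with illustrative notation and one common choice of scaling (not necessarily the exact normalization used in the proofs), as
\[
\mathbf{y} = \mathbf{\Phi}\,\mathbf{s} + \mathbf{z}, \qquad
\Phi_{\mu i} \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0, 1/n), \qquad
\mathbf{z} \sim \mathcal{N}(\mathbf{0}, \Delta\,\mathbf{I}_m),
\]
where \(\mathbf{s}\in\mathbb{R}^{n}\) is the signal, \(\mathbf{y}\in\mathbb{R}^{m}\) the vector of observations, \(\Delta\) the noise variance, and \(\alpha = m/n\) the measurement rate.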
Here we put these considerations on a firm rigorous basis. First, we show, using a
Guerra-Toninelli-type interpolation, that the replica formula yields an upper
bound on the exact mutual information. Secondly, for many relevant practical
cases, we present a converse lower bound via a method that uses spatial
coupling, state evolution analysis, and the I-MMSE theorem. This yields a
single-letter formula for the mutual information and the minimum mean-square
error (MMSE) for random Gaussian linear estimation of all discrete bounded signals.
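The I-MMSE theorem invoked here is the standard Guo-Shamai-Verdú relation between mutual information and estimation error in Gaussian channels; for a scalar Gaussian channel it reads
\[
\frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\big(X;\sqrt{\mathrm{snr}}\,X + Z\big) \;=\; \tfrac{1}{2}\,\mathrm{mmse}(\mathrm{snr}), \qquad Z\sim\mathcal{N}(0,1),
\]
in nats. Differentiating the single-letter mutual information formula through this identity is what turns it into a companion formula for the MMSE.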
In addition, we prove that the low-complexity approximate message-passing (AMP)
algorithm is optimal outside of the so-called hard phase, in the sense that it
asymptotically reaches the MMSE.
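As an illustration only, a minimal AMP iteration for this model might look like the sketch below; the soft-thresholding denoiser and all parameter values are placeholders and stand in for the Bayes-optimal denoiser to which the optimality statement actually refers.

import numpy as np

def soft_threshold(x, theta):
    # Illustrative scalar denoiser (placeholder for the Bayes-optimal one).
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def amp(y, Phi, theta=0.1, n_iter=30):
    # Sketch of approximate message-passing for y = Phi @ s + z.
    m, n = Phi.shape
    x = np.zeros(n)   # current estimate of the signal s
    r = y.copy()      # Onsager-corrected residual
    for _ in range(n_iter):
        pseudo = x + Phi.T @ r                       # effective scalar AWGN observations of s
        x_new = soft_threshold(pseudo, theta)        # componentwise denoising
        onsager = (n / m) * np.mean(x_new != 0) * r  # (1/alpha) * <denoiser derivative> * r
        r = y - Phi @ x_new + onsager                # residual with Onsager reaction term
        x = x_new
    return x

With the posterior-mean denoiser of the signal prior in place of soft thresholding, the performance of this iteration is tracked by state evolution, which is the form of AMP relevant to the analysis here.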
In this work, spatial coupling is used primarily as a proof technique. However,
our results also establish two important features of spatially coupled noisy
random Gaussian linear estimation. First, there is no algorithmically hard
phase, meaning that for such systems AMP always reaches the MMSE. Secondly, in
a proper limit, the mutual information associated with such systems is the same
as that of uncoupled random Gaussian linear estimation.
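Roughly speaking, a spatially coupled measurement matrix is a block matrix whose blocks have i.i.d. Gaussian entries with a band-diagonal variance profile. The sketch below builds such a matrix with purely illustrative parameters; the block sizes, coupling window, and normalization are placeholders, not the exact construction used in the proofs.

import numpy as np

def coupled_sensing_matrix(L=8, w=2, rows_per_block=50, cols_per_block=100, seed=0):
    # Illustrative spatially coupled Gaussian matrix made of L x L blocks.
    # Block (r, c) has i.i.d. N(0, J[r, c]) entries, with J nonzero only in a
    # band |r - c| <= w around the diagonal (all parameters are placeholders).
    rng = np.random.default_rng(seed)
    J = np.zeros((L, L))
    for r in range(L):
        for c in range(L):
            if abs(r - c) <= w:
                # Roughly unit total variance per row (placeholder normalization).
                J[r, c] = 1.0 / ((2 * w + 1) * cols_per_block)
    blocks = [[np.sqrt(J[r, c]) * rng.standard_normal((rows_per_block, cols_per_block))
               for c in range(L)] for r in range(L)]
    return np.block(blocks)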