Support Recovery with Sparsely Sampled Free Random Matrices
Consider a Bernoulli-Gaussian complex $n$-vector whose components are $X_i B_i$, with $X_i \sim \mathcal{CN}(0, \mathcal{P}_x)$ and $B_i$ binary, mutually independent
and iid across $i$. This random $q$-sparse vector is multiplied by a square
random matrix $\mathbf{U}$, and a randomly chosen subset, of average size $np$, $0 \le p \le 1$, of the resulting vector components is then observed in additive
Gaussian noise. We extend the scope of conventional noisy compressive sampling
models where $\mathbf{U}$ is typically the identity or a matrix with iid
components, to allow $\mathbf{U}$ satisfying a certain freeness condition. This class
of matrices encompasses Haar matrices and other unitarily invariant matrices.
We use the replica method and the decoupling principle of Guo and Verd\'u, as
well as a number of information theoretic bounds, to study the input-output
mutual information and the support recovery error rate in the limit of $n \to \infty$. We also extend the scope of the large deviation approach of Rangan,
Fletcher and Goyal and characterize the performance of a class of estimators
encompassing thresholded linear MMSE and $\ell_1$ relaxation.
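As a concrete illustration of the sampling model described in this abstract, the following NumPy sketch draws a Bernoulli-Gaussian vector, multiplies it by a Haar-distributed unitary matrix (one member of the unitarily invariant class mentioned above), and observes a random subset of the components in Gaussian noise. All parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, q, p, Px, noise_var = 256, 0.1, 0.5, 1.0, 0.01  # illustrative values

# Bernoulli-Gaussian n-vector: X_i ~ CN(0, Px), B_i ~ Bernoulli(q), independent
X = np.sqrt(Px / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
B = rng.random(n) < q
v = X * B                       # q-sparse on average

# Haar-distributed unitary via QR of an iid complex Gaussian matrix,
# with column phases fixed so the distribution is exactly Haar
G = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, R = np.linalg.qr(G)
d = np.diagonal(R)
U = Q * (d / np.abs(d))

# Observe a randomly chosen subset of average size n*p in additive Gaussian noise
mask = rng.random(n) < p
m = int(mask.sum())
w = np.sqrt(noise_var / 2) * (rng.standard_normal(m) + 1j * rng.standard_normal(m))
y = (U @ v)[mask] + w           # the observed measurements
```

Support recovery then amounts to estimating the binary pattern $B$ from $y$ and knowledge of $\mathbf{U}$ and the sampled index set.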
RSB Decoupling Property of MAP Estimators
The large-system decoupling property of a MAP estimator is studied when it
estimates the i.i.d. vector $\boldsymbol{x}$ from the observation
$\boldsymbol{y} = \mathbf{A}\boldsymbol{x} + \boldsymbol{z}$, with $\mathbf{A}$
being chosen from a wide range of matrix ensembles, and the noise vector $\boldsymbol{z}$
being i.i.d. and Gaussian. Using the replica method, we show
that the marginal joint distribution of any two corresponding input and output
symbols converges to a deterministic distribution which describes the
input-output distribution of a single user system followed by a MAP estimator.
Under the RSB assumption, the single user system is a scalar channel with
additive noise where the noise term is given by the sum of an independent
Gaussian random variable and correlated interference terms. As the RSB
assumption reduces to RS, the interference terms vanish which results in the
formerly studied RS decoupling principle.
Comment: 5 pages, presented at the Information Theory Workshop 201
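To make the estimation setup concrete, the sketch below simulates the linear model $\boldsymbol{y} = \mathbf{A}\boldsymbol{x} + \boldsymbol{z}$ and computes the MAP estimate for the special case of an i.i.d. Gaussian prior, where MAP has a closed form (it coincides with the linear MMSE / ridge solution). The ensemble, dimensions, and prior here are illustrative assumptions, not the general setting analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, Px, noise_var = 64, 48, 1.0, 0.1   # illustrative values

# Linear observation model y = A x + z: i.i.d. x, Gaussian i.i.d. noise z,
# A drawn here from an iid Gaussian ensemble as one admissible example
x = np.sqrt(Px) * rng.standard_normal(n)
A = rng.standard_normal((m, n)) / np.sqrt(n)
z = np.sqrt(noise_var) * rng.standard_normal(m)
y = A @ x + z

# MAP under an i.i.d. Gaussian prior: argmax p(x|y) solves a regularized
# least-squares problem with ratio noise_var / Px as the regularizer
x_map = np.linalg.solve(A.T @ A + (noise_var / Px) * np.eye(n), A.T @ y)
```

The decoupling principle asserts that, as $m, n \to \infty$ at fixed ratio, the joint law of each pair $(x_i, \hat{x}_i)$ converges to that of a scalar channel followed by a scalar MAP estimator; the simulation above only sets up the finite-dimensional model, not that asymptotic claim.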