Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory in which we depart from the requirement that the error probability vanishes as the blocklength grows. Instead, the error probabilities for the various problems are bounded above by a non-vanishing constant, and the spotlight is shone on achievable coding rates as functions of the growing blocklength. This is the study of asymptotic estimates with non-vanishing error probabilities.
In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing, in which the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with a growing number of independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and, in some cases, third-order asymptotic expansions for point-to-point communication. In Part III, we consider network information theory problems for which the second-order asymptotics are known. These problems include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. Finally, we discuss avenues for further research.
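For a sense of the expansions developed in Part II, the canonical second-order (normal approximation) result for a discrete memoryless channel can be written as

    \log M^*(n, \varepsilon) = nC - \sqrt{nV}\, Q^{-1}(\varepsilon) + O(\log n),

where M^*(n, \varepsilon) is the maximum code size achievable at blocklength n with error probability at most \varepsilon, C is the channel capacity, V is the channel dispersion, and Q^{-1} is the inverse of the standard Gaussian complementary CDF; the exact constants and third-order terms treated in the monograph vary by problem.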
Mutual Information and Minimum Mean-square Error in Gaussian Channels
This paper deals with arbitrarily distributed finite-power input signals
observed through an additive Gaussian noise channel. It shows a new formula
that connects the input-output mutual information and the minimum mean-square
error (MMSE) achievable by optimal estimation of the input given the output.
That is, the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input
statistics. This relationship holds for both scalar and vector signals, as well
as for discrete-time and continuous-time noncausal MMSE estimation. This
fundamental information-theoretic result has an unexpected consequence in
continuous-time nonlinear estimation: For any input signal with finite power, the causal filtering MMSE achieved at a given SNR is equal to the average of the noncausal smoothing MMSE achieved with a channel whose signal-to-noise ratio is uniformly distributed between 0 and that SNR.
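As a concrete check of the relation, the following minimal Python sketch uses the textbook closed forms for a standard Gaussian input (an assumption made here for illustration, not a detail taken from the abstract): for Y = sqrt(snr)·X + N with X, N ~ N(0, 1), I(snr) = (1/2) ln(1 + snr) nats and MMSE(snr) = 1/(1 + snr), so a finite-difference derivative of I should match half the MMSE.

    import numpy as np

    # I-MMSE sketch for the scalar Gaussian channel Y = sqrt(snr) * X + N,
    # with X ~ N(0, 1) and N ~ N(0, 1) independent (Gaussian input assumed).
    def mutual_information(snr):
        return 0.5 * np.log1p(snr)      # I(snr) in nats

    def mmse(snr):
        return 1.0 / (1.0 + snr)        # MMSE of estimating X from Y

    snr = np.linspace(0.1, 10.0, 400)
    dI_dsnr = np.gradient(mutual_information(snr), snr)   # numerical dI/dsnr

    # dI/dsnr should equal MMSE/2 up to discretization error.
    print(np.max(np.abs(dI_dsnr - 0.5 * mmse(snr))))

The identity itself holds for arbitrary input statistics, which is the point of the paper; a non-Gaussian input would simply require Monte Carlo estimates of both quantities.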
Statistical eigen-inference from large Wishart matrices
We consider settings where the observations are drawn from a zero-mean
multivariate (real or complex) normal distribution with the population
covariance matrix having eigenvalues of arbitrary multiplicity. We assume that
the eigenvectors of the population covariance matrix are unknown and focus on
inferential procedures that are based on the sample eigenvalues alone (i.e.,
"eigen-inference"). Results found in the literature establish the asymptotic
normality of the fluctuation in the trace of powers of the sample covariance
matrix. We develop concrete algorithms for analytically computing the limiting
quantities and the covariance of the fluctuations. We exploit the asymptotic
normality of the trace of powers of the sample covariance matrix to develop
eigenvalue-based procedures for testing and estimation. Specifically, we
formulate a simple test of hypotheses for the population eigenvalues and a
technique for estimating the population eigenvalues in settings where the
cumulative distribution function of the (nonrandom) population eigenvalues has
a staircase structure. Monte Carlo simulations are used to demonstrate the
superiority of the proposed methodologies over classical techniques and the
robustness of the proposed techniques in high-dimensional, (relatively) small
sample size settings. The improved performance stems from the fact that the proposed inference procedures are "global" (in a sense that we describe) and exploit "global" information, thereby overcoming the inherent biases that cripple classical inference procedures, which are "local" and rely on "local" information.
Comment: Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org), DOI: http://dx.doi.org/10.1214/07-AOS583
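To illustrate the kind of statistic the abstract refers to, the following Monte Carlo sketch (a generic illustration under an assumed identity population covariance, not the authors' inference algorithm) samples Gaussian data, forms the sample covariance matrix, and records tr(S^k) computed from the sample eigenvalues alone; the spread of these values over repetitions is the fluctuation whose asymptotic normality the eigen-inference procedures exploit.

    import numpy as np

    rng = np.random.default_rng(0)

    # Monte Carlo sketch: fluctuation of tr(S^k) for the sample covariance S
    # of n i.i.d. N(0, I_p) observations (identity population covariance assumed).
    p, n, trials = 50, 200, 500
    traces = {1: [], 2: []}

    for _ in range(trials):
        X = rng.standard_normal((n, p))
        S = X.T @ X / n                      # p x p sample covariance matrix
        eigvals = np.linalg.eigvalsh(S)      # eigen-inference uses only these
        for k in traces:
            traces[k].append(np.sum(eigvals ** k))   # tr(S^k) via eigenvalues

    for k, vals in traces.items():
        vals = np.asarray(vals)
        print(f"tr(S^{k}): mean = {vals.mean():.3f}, std = {vals.std():.3f}")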