Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
This monograph presents a unified treatment of single- and multi-user
problems in Shannon's information theory where we depart from the requirement
that the error probability decays asymptotically in the blocklength. Instead,
the error probabilities for various problems are bounded above by a
non-vanishing constant and the spotlight is shone on achievable coding rates as
functions of the growing blocklengths. This represents the study of asymptotic
estimates with non-vanishing error probabilities.
In Part I, after reviewing the fundamentals of information theory, we discuss
Strassen's seminal result for binary hypothesis testing where the type-I error
probability is non-vanishing and the rate of decay of the type-II error
probability with a growing number of independent observations is characterized.
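To make the Part I result concrete: when testing a distribution $P$ against $Q$ with $n$ i.i.d. observations and type-I error probability at most $\epsilon \in (0,1)$, the smallest attainable type-II error $\beta_n(\epsilon)$ satisfies Strassen's expansion (the notation here is ours, though standard in this literature):
\[
-\log \beta_n(\epsilon) = n D(P\|Q) + \sqrt{n\,V(P\|Q)}\,\Phi^{-1}(\epsilon) + O(\log n),
\]
where $D(P\|Q)$ is the relative entropy, $V(P\|Q)$ is the relative entropy variance, and $\Phi^{-1}$ is the inverse of the standard Gaussian CDF. The first-order term recovers the Chernoff-Stein lemma; the $\sqrt{n}$ term quantifies the effect of tolerating the fixed type-I error $\epsilon$.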
In Part II, we use this basic hypothesis testing result to develop second- and,
in some cases, third-order asymptotic expansions for point-to-point
communication. In Part III, we consider network information theory
problems for which the second-order asymptotics are known. These problems
include some classes of channels with random state, the multiple-encoder
distributed lossless source coding (Slepian-Wolf) problem and special cases of
the Gaussian interference and multiple-access channels. Finally, we discuss
avenues for further research.
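As an illustration of the expansions developed in Part II, the maximum number of messages $M^*(n,\epsilon)$ transmissible over a DMC in $n$ channel uses with average error probability at most $\epsilon$ obeys the well-known second-order (normal) approximation (notation ours):
\[
\log M^*(n,\epsilon) = nC - \sqrt{nV}\,Q^{-1}(\epsilon) + O(\log n),
\]
where $C$ is the channel capacity, $V$ is the channel dispersion, and $Q^{-1}$ is the inverse of the complementary Gaussian CDF.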
The Reliability Function of Lossy Source-Channel Coding of Variable-Length Codes with Feedback
We consider transmission of discrete memoryless sources (DMSes) across
discrete memoryless channels (DMCs) using variable-length lossy source-channel
codes with feedback. The reliability function (optimum error exponent) is shown
to be equal to $B\bigl(1 - R(D)/C\bigr)$, where $R(D)$ is the rate-distortion
function of the source, $B$ is the maximum relative entropy between output
distributions of the DMC, and $C$ is the Shannon capacity of the channel. We
show that, in this setting and in this asymptotic regime, separate
source-channel coding is, in fact, optimal.
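As a small numerical sketch of how the three quantities in the exponent interact, consider a Bernoulli(1/2) source under Hamming distortion sent over a binary symmetric channel BSC($\delta$). The closed forms for $R(D)$, $C$, and $B$ below are standard; the exponent line simply evaluates $B(1 - R(D)/C)$ as reconstructed above, so treat this as illustrative rather than as the paper's own computation.

import math

def h(p):
    """Binary entropy in nats; h(0) = h(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def kl(p, q):
    """Relative entropy D((p, 1-p) || (q, 1-q)) in nats."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def exponent(delta, dist):
    """B * (1 - R(D)/C) for a BSC(delta) and a Bernoulli(1/2)
    source under Hamming distortion dist in (0, 1/2)."""
    C = math.log(2) - h(delta)         # BSC capacity, nats per channel use
    B = kl(1 - delta, delta)           # max divergence between the two output distributions
    RD = math.log(2) - h(dist)         # rate-distortion function, nats per source symbol
    return max(0.0, B * (1 - RD / C))  # no positive exponent once R(D) >= C

# E.g., a fairly clean channel and a loose distortion target:
print(f"exponent ~ {exponent(delta=0.1, dist=0.25):.3f} nats per channel use")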
Robust Gaussian Joint Source-Channel Coding Under the Near-Zero Bandwidth Regime
The minimum power required to achieve a distortion-noise profile, i.e., a
function indicating the maximum allowed distortion value for each noise level,
is studied for the transmission of Gaussian sources over Gaussian channels in
the regime of bandwidth approaching zero. A simple but instrumental lower
bound on the minimum required power for a given profile is presented. For an
upper bound, a dirty-paper-based coding scheme is proposed and its
power-distortion tradeoff is analyzed. Finally, the upper and lower bounds on
the minimum power are evaluated and compared for specific distortion-noise
profiles, namely rational profiles of orders one and two.
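To fix ideas, the objective can be written compactly (the notation below is ours, introduced only to formalize the abstract's wording):
\[
P^{*} = \min\bigl\{\, P \;:\; d(P, \sigma^2) \le D(\sigma^2) \ \text{for every admissible noise level } \sigma^2 \,\bigr\},
\]
where $D(\cdot)$ is the given distortion-noise profile and $d(P, \sigma^2)$ denotes the distortion the scheme attains with transmit power $P$ when the actual noise variance is $\sigma^2$.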