Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
This monograph presents a unified treatment of single- and multi-user
problems in Shannon's information theory where we depart from the requirement
that the error probability decays asymptotically in the blocklength. Instead,
the error probabilities for various problems are bounded above by a
non-vanishing constant and the spotlight is shone on achievable coding rates as
functions of the growing blocklengths. This represents the study of asymptotic
estimates with non-vanishing error probabilities.
In Part I, after reviewing the fundamentals of information theory, we discuss
Strassen's seminal result for binary hypothesis testing where the type-I error
probability is non-vanishing and the rate of decay of the type-II error
probability with growing number of independent observations is characterized.
In Part II, we use this basic hypothesis testing result to develop second- and
sometimes, even third-order asymptotic expansions for point-to-point
communication. Finally in Part III, we consider network information theory
problems for which the second-order asymptotics are known. These problems
include some classes of channels with random state, the multiple-encoder
distributed lossless source coding (Slepian-Wolf) problem and special cases of
the Gaussian interference and multiple-access channels. Finally, we discuss
avenues for further research.
Comment: Further comments welcome
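For context, Strassen's expansion referred to in the abstract above can be stated in standard notation (the symbols below are conventional and not drawn from the abstract itself). With n i.i.d. observations, type-I error at most ε, and β_n(ε) the minimal type-II error probability:

```latex
% Strassen's second-order expansion for binary hypothesis testing
% between product distributions P^n and Q^n:
-\log \beta_n(\varepsilon)
  = n\,D(P \,\|\, Q)
  + \sqrt{n\,V(P \,\|\, Q)}\;\Phi^{-1}(\varepsilon)
  + O(\log n)
```

Here D(P‖Q) is the relative entropy, V(P‖Q) the relative entropy variance, and Φ⁻¹ the inverse of the standard Gaussian CDF; this is the basic result that Part I reviews and Parts II–III build on.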
The Dispersion of Nearest-Neighbor Decoding for Additive Non-Gaussian Channels
We study the second-order asymptotics of information transmission using
random Gaussian codebooks and nearest neighbor (NN) decoding over a
power-limited stationary memoryless additive non-Gaussian noise channel. We
show that the dispersion term depends on the non-Gaussian noise only through
its second and fourth moments, thus complementing the capacity result
(Lapidoth, 1996), which depends only on the second moment. Furthermore, we
characterize the second-order asymptotics of point-to-point codes over
K-sender interference networks with non-Gaussian additive noise.
Specifically, we assume that each user's codebook is Gaussian and that NN
decoding is employed, i.e., that interference from the unintended users
(Gaussian interfering signals) is treated as noise at each decoder. We show
that while the first-order term in the asymptotic expansion of the maximum
number of messages depends on the power of the interfering codewords only
through their sum, this does not hold for the second-order term.
Comment: 12 pages, 3 figures, IEEE Transactions on Information Theory
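The "dispersion term" in the abstract above is the constant V in the standard second-order (normal-approximation) expansion of channel coding, shown here in generic notation rather than the specific non-Gaussian setting of the paper:

```latex
% Maximum number of messages M^*(n,\varepsilon) at blocklength n
% with error probability at most \varepsilon:
\log M^*(n,\varepsilon)
  = n\,C
  + \sqrt{n\,V}\;\Phi^{-1}(\varepsilon)
  + O(\log n)
```

Here C is the capacity and V the channel dispersion; since Φ⁻¹(ε) < 0 for ε < 1/2, the second-order term quantifies the backoff from capacity at finite blocklength, and the paper's contribution is to identify V for NN decoding over non-Gaussian noise.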
A Hierarchy of Information Quantities for Finite Block Length Analysis of Quantum Tasks
We consider two fundamental tasks in quantum information theory, data
compression with quantum side information as well as randomness extraction
against quantum side information. We characterize these tasks for general
sources using so-called one-shot entropies. We show that these
characterizations - in contrast to earlier results - enable us to derive tight
second order asymptotics for these tasks in the i.i.d. limit. More generally,
our derivation establishes a hierarchy of information quantities that can be
used to investigate information theoretic tasks in the quantum domain: The
one-shot entropies most accurately describe an operational quantity, yet they
tend to be difficult to calculate for large systems. We show that they
asymptotically agree up to logarithmic terms with entropies related to the
quantum and classical information spectrum, which are easier to calculate in
the i.i.d. limit. Our techniques also naturally yield bounds on operational
quantities for finite block lengths.
Comment: See also arXiv:1208.1400, which independently derives part of our
result: the second order asymptotics for binary hypothesis testing
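As a sketch of the tight second-order expansions mentioned above, the data-compression task takes the following shape in standard notation (the symbols are conventional, not drawn from the abstract, and the exact logarithmic term varies across derivations):

```latex
% Minimal length m^*(n,\varepsilon) of the compressed string for
% n i.i.d. copies of \rho_{AB}, given quantum side information B
% at the decoder and error at most \varepsilon:
m^*(n,\varepsilon)
  = n\,H(A|B)_\rho
  - \sqrt{n\,V(A|B)_\rho}\;\Phi^{-1}(\varepsilon)
  + O(\log n)
```

Here H(A|B) is the conditional von Neumann entropy, V(A|B) the corresponding conditional entropy variance, and Φ⁻¹ the inverse Gaussian CDF; for small ε the second-order term is positive, so the compression rate sits above the entropy by an amount shrinking as 1/√n.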