
    Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

    This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory in which we depart from the requirement that the error probability decays asymptotically in the blocklength. Instead, the error probabilities for the various problems are bounded above by a non-vanishing constant, and the spotlight is shone on achievable coding rates as functions of the growing blocklength. This constitutes the study of asymptotic estimates with non-vanishing error probabilities. In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing, in which the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with a growing number of independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and, sometimes, even third-order asymptotic expansions for point-to-point communication. Finally, in Part III, we consider network information theory problems for which the second-order asymptotics are known. These problems include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. We conclude by discussing avenues for further research.
    Comment: Further comments welcome.
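    For orientation, the Strassen expansion referenced in Part I is commonly stated in the following form; the notation below ($\beta_\epsilon$, $D(P\|Q)$, $V(P\|Q)$, $\Phi^{-1}$) is supplied here for illustration and is not quoted from the abstract:
    \[
      -\log \beta_\epsilon\!\left(P^{\times n}, Q^{\times n}\right)
        = n\, D(P\|Q) + \sqrt{n\, V(P\|Q)}\; \Phi^{-1}(\epsilon) + \frac{1}{2}\log n + O(1),
    \]
    where $\beta_\epsilon$ is the smallest type-II error probability achievable with type-I error at most $\epsilon$ over $n$ independent observations, $D(P\|Q)$ is the relative entropy, $V(P\|Q)$ is the relative entropy variance, and $\Phi^{-1}$ is the inverse of the standard Gaussian cdf.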

    Second-Order Asymptotics for the Discrete Memoryless MAC with Degraded Message Sets

    This paper studies the second-order asymptotics of the discrete memoryless multiple-access channel with degraded message sets. For a fixed average error probability $\epsilon \in (0,1)$ and an arbitrary point on the boundary of the capacity region, we characterize the speed of convergence of rate pairs that converge to that point for codes that have asymptotic error probability no larger than $\epsilon$, thus complementing an analogous result given previously for the Gaussian setting.
    Comment: 5 pages, 1 figure. Follow-up paper of http://arxiv.org/abs/1310.1197. Accepted to ISIT 201
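    As a rough sketch of the typical shape of such a second-order characterization (the constraint pattern, user indexing, and the symbols $\mathbf{V}$ and $\Psi^{-1}$ below are illustrative assumptions, not quoted from the abstract): rate pairs $(R_{1,n}, R_{2,n})$ achievable with average error at most $\epsilon$ near a boundary point of the degraded-message-set capacity region satisfy, up to logarithmic corrections,
    \[
      \begin{bmatrix} R_{1,n} \\ R_{1,n} + R_{2,n} \end{bmatrix}
      \in
      \begin{bmatrix} I(X_1; Y \mid X_2) \\ I(X_1, X_2; Y) \end{bmatrix}
      + \frac{\Psi^{-1}(\mathbf{V}, \epsilon)}{\sqrt{n}}
      + O\!\left(\frac{\log n}{n}\right)\mathbf{1},
    \]
    where $\mathbf{V}$ is a dispersion (information-variance) matrix evaluated at the chosen input distribution and $\Psi^{-1}(\mathbf{V}, \epsilon) := \{ \mathbf{z} \in \mathbb{R}^2 : \Pr[\mathbf{Z} \le \mathbf{z}] \ge 1 - \epsilon \}$ with $\mathbf{Z} \sim \mathcal{N}(\mathbf{0}, \mathbf{V})$.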

    The Third-Order Term in the Normal Approximation for the AWGN Channel

    This paper shows that, under the average error probability formalism, the third-order term in the normal approximation for the additive white Gaussian noise channel with a maximal or equal power constraint is at least $\frac{1}{2}\log n + O(1)$. This matches the upper bound derived by Polyanskiy-Poor-Verdú (2010).
    Comment: 13 pages, 1 figure
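    For context, the normal approximation being refined here is usually written as follows (the notation $M^*(n,\epsilon)$, $V(P)$, and $\Phi^{-1}$ is supplied here for illustration and is not quoted from the abstract); in nats, for an AWGN channel with signal-to-noise ratio $P$:
    \[
      \log M^*(n, \epsilon)
        = \frac{n}{2}\log(1+P) + \sqrt{n\, V(P)}\; \Phi^{-1}(\epsilon) + \frac{1}{2}\log n + O(1),
      \qquad
      V(P) = \frac{P(P+2)}{2(1+P)^2},
    \]
    where $M^*(n,\epsilon)$ is the maximum codebook size at blocklength $n$ and error probability $\epsilon$, and $\Phi^{-1}$ is the inverse standard Gaussian cdf; the paper's contribution is the lower bound $\frac{1}{2}\log n + O(1)$ on the third-order term, matching the Polyanskiy-Poor-Verdú upper bound.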