12,883 research outputs found

    Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

    This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory in which we depart from the requirement that the error probability decays asymptotically in the blocklength. Instead, the error probabilities for various problems are bounded above by a non-vanishing constant, and the spotlight is shone on achievable coding rates as functions of the growing blocklength. This is the study of asymptotic estimates with non-vanishing error probabilities. In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing, where the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with a growing number of independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and, in some cases, third-order asymptotic expansions for point-to-point communication. Finally, in Part III, we consider network information theory problems for which the second-order asymptotics are known. These problems include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. We close by discussing avenues for further research.
    Comment: Further comments welcome
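    A representative instance of the second-order expansions mentioned above (the standard normal-approximation form associated with Strassen and with Polyanskiy, Poor, and Verdú, stated here for orientation rather than quoted from the monograph) reads, for a discrete memoryless channel with capacity $C$ and dispersion $V$,

    $$\log M^*(n,\epsilon) = nC - \sqrt{nV}\,Q^{-1}(\epsilon) + O(\log n),$$

    where $M^*(n,\epsilon)$ is the maximum code size achievable at blocklength $n$ with error probability at most $\epsilon$, and $Q^{-1}$ is the inverse of the Gaussian complementary CDF.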

    Data Processing Bounds for Scalar Lossy Source Codes with Side Information at the Decoder

    In this paper, we introduce new lower bounds on the distortion of scalar fixed-rate codes for lossy compression with side information available at the receiver. These bounds are derived by presenting the relevant random variables as a Markov chain and applying generalized data processing inequalities à la Ziv and Zakai. We show that by replacing the logarithmic function with other functions in the data processing theorem we formulate, we obtain new lower bounds on the distortion of scalar coding with side information at the decoder. The usefulness of these results is demonstrated for uniform sources and the convex function $Q(t)=t^{1-\alpha}$, $\alpha>1$. The bounds in this case are shown to be better than those one can obtain from the Wyner-Ziv rate-distortion function.
    Comment: 35 pages, 9 figures
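    For orientation (a standard fact rather than an excerpt from the paper), the classical data processing inequality for a Markov chain $X \to Y \to Z$ states that

    $$I(X;Z) \le I(X;Y),$$

    and the generalized inequalities referred to above retain this Markov structure while replacing the logarithm underlying mutual information with another convex function, such as the $Q(t)=t^{1-\alpha}$, $\alpha>1$, considered in the paper.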

    A Resource Framework for Quantum Shannon Theory

    Quantum Shannon theory is loosely defined as a collection of coding theorems, such as classical and quantum source compression, noisy channel coding theorems, entanglement distillation, etc., which characterize the asymptotic properties of quantum and classical channels and states. In this paper, we advocate a unified approach to an important class of problems in quantum Shannon theory, consisting of those that are bipartite, unidirectional and memoryless. We formalize two principles that have long been tacitly understood. First, we describe how the Church of the larger Hilbert space allows us to move flexibly between states, channels, ensembles and their purifications. Second, we introduce finite and asymptotic (quantum) information processing resources as the basic objects of quantum Shannon theory and recast the protocols used in direct coding theorems as inequalities between resources. We develop the rules of a resource calculus that allows us to manipulate and combine resource inequalities. This framework simplifies many coding theorem proofs and provides structural insights into the logical dependencies among coding theorems. We review the above-mentioned basic coding results and show how a subset of them can be unified into a family of related resource inequalities. Finally, we use this family to find optimal trade-off curves for all protocols involving one noisy quantum resource and two noiseless ones.
    Comment: 60 pages
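    Two standard examples of such resource inequalities, given here for illustration rather than excerpted from the paper, are teleportation and superdense coding, written in the common notation where $[c \to c]$ denotes one noiseless classical bit channel, $[q \to q]$ one noiseless qubit channel, and $[qq]$ one ebit of shared entanglement:

    $$2[c \to c] + [qq] \geq [q \to q], \qquad [q \to q] + [qq] \geq 2[c \to c].$$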