
    Alphabet Sizes of Auxiliary Variables in Canonical Inner Bounds

    The alphabet sizes of the auxiliary random variables in our canonical description are derived. Our analysis improves upon estimates known in special cases and generalizes to an arbitrary multiterminal setup. The salient steps include decomposing the constituent rate polytopes into orthants, translating a hyperplane until it becomes tangent to the achievable region at an extreme point, and deriving minimum auxiliary alphabet sizes from Caratheodory's theorem. Comment: 20 pages, no figures, explanation of a part of an impending IEEE IT submission
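    The Caratheodory step in this abstract rests on Caratheodory's theorem: any point in the convex hull of a set in R^d is a convex combination of at most d+1 of its points, which is what caps the auxiliary alphabet size. A minimal numerical sketch (not from the paper; the function name and NumPy-based reduction are illustrative assumptions):

```python
import numpy as np

def caratheodory_reduce(points, weights, tol=1e-9):
    """Reduce a convex combination of m points in R^d to one supported on
    at most d + 1 points, without moving the represented point."""
    w = np.asarray(weights, dtype=float).copy()
    d = points.shape[1]
    while True:
        support = np.flatnonzero(w > tol)
        if len(support) <= d + 1:
            return w
        # Find an affine dependence c: sum_i c_i p_i = 0 and sum_i c_i = 0.
        A = np.vstack([points[support].T, np.ones(len(support))])
        c = np.linalg.svd(A)[2][-1]      # null-space direction of A
        if c.max() <= tol:               # make sure some entries are positive
            c = -c
        # Shift weights along c until the first weight reaches zero.
        pos = c > tol
        t = np.min(w[support][pos] / c[pos])
        w[support] -= t * c
        w[w < tol] = 0.0
```

    Running the reduction on, say, ten random points in the plane returns a combination supported on at most three of them, mirroring how alphabet-size bounds are obtained from the dimension of the constraint set plus one.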

    A Generalized Typicality for Abstract Alphabets

    A new notion of typicality for arbitrary probability measures on standard Borel spaces is proposed, which encompasses the classical notions of weak and strong typicality as special cases. Useful lemmas about strongly typical sets, including the conditional typicality lemma, the joint typicality lemma, and the packing and covering lemmas, which are fundamental tools for deriving many inner bounds for various multi-terminal coding problems, are obtained in terms of the proposed notion. This enables us to directly generalize many results on finite-alphabet problems to general problems involving abstract alphabets, without any complicated additional arguments. For instance, a quantization procedure is no longer necessary to achieve such generalizations. Another fundamental lemma, the Markov lemma, is also obtained, but its scope of application is quite limited compared to the others. An alternative theory of typical sets for Gaussian measures, free from this limitation, is also developed. Some remarks on the possibility of generalizing the proposed notion to sources with memory are also given. Comment: 44 pages; submitted to IEEE Transactions on Information Theory
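    For intuition about what such a typicality notion must capture, here is the classical finite-alphabet weak-typicality test (a toy sketch only; the paper's notion is defined on standard Borel spaces, which this code does not cover): a sequence x^n is eps-weakly typical when its normalized negative log-probability is within eps of the entropy H(X).

```python
import math
import random

def entropy(p):
    """Shannon entropy H(X) in bits of a finite pmf given as a dict."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def is_weakly_typical(seq, p, eps):
    """True iff |-(1/n) log2 p(x^n) - H(X)| <= eps for the i.i.d. pmf p."""
    n = len(seq)
    neg_log_prob = -sum(math.log2(p[x]) for x in seq)
    return abs(neg_log_prob / n - entropy(p)) <= eps

random.seed(1)
pmf = {"a": 0.7, "b": 0.3}
sample = random.choices(list(pmf), weights=list(pmf.values()), k=100_000)
```

    By the law of large numbers, a long i.i.d. sample is eps-typical with probability approaching one; that concentration property is what the generalized notion preserves on abstract alphabets.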

    Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

    This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory in which we depart from the requirement that the error probability decays asymptotically in the blocklength. Instead, the error probabilities for various problems are bounded above by a non-vanishing constant, and the spotlight is shone on achievable coding rates as functions of the growing blocklength. This represents the study of asymptotic estimates with non-vanishing error probabilities. In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing, where the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with a growing number of independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and, sometimes, even third-order asymptotic expansions for point-to-point communication. In Part III, we consider network information theory problems for which the second-order asymptotics are known. These problems include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. Finally, we discuss avenues for further research. Comment: Further comments welcome
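    The second-order expansions discussed in Part II take the normal-approximation form (1/n) log M*(n, eps) ≈ C - sqrt(V/n) Q^{-1}(eps), where C is the capacity and V the channel dispersion. A sketch for the binary symmetric channel using the standard capacity and dispersion formulas (the function names are ours, not the monograph's):

```python
import math
from statistics import NormalDist

def h2(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_normal_approx(p, n, eps):
    """Normal approximation C - sqrt(V/n) * Q^{-1}(eps) to the maximal
    coding rate of a BSC(p) at blocklength n and error probability eps,
    with C = 1 - h2(p) and dispersion V = p(1-p) (log2((1-p)/p))^2."""
    C = 1 - h2(p)
    V = p * (1 - p) * (math.log2((1 - p) / p)) ** 2
    q_inv = NormalDist().inv_cdf(1 - eps)   # Q^{-1}(eps)
    return C - math.sqrt(V / n) * q_inv

# Example: a BSC with crossover 0.11 at blocklength 1000 and eps = 1e-3.
rate = bsc_normal_approx(p=0.11, n=1000, eps=1e-3)
```

    At these parameters the approximation backs off roughly 0.09 bits from the capacity of about 0.5 bits per channel use, illustrating the sqrt(V/n) penalty for a non-vanishing error probability.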

    Classical capacity of bosonic broadcast communication and a new minimum output entropy conjecture

    Previous work on the classical information capacities of bosonic channels has established the capacity of the single-user pure-loss channel, bounded the capacity of the single-user thermal-noise channel, and bounded the capacity region of the multiple-access channel. The latter is a multi-user scenario in which several transmitters seek to simultaneously and independently communicate to a single receiver. We study the capacity region of the bosonic broadcast channel, in which a single transmitter seeks to simultaneously and independently communicate to two different receivers. It is known that the tightest available lower bound on the capacity of the single-user thermal-noise channel is that channel's capacity if, as conjectured, the minimum von Neumann entropy at the output of a bosonic channel with additive thermal noise occurs for coherent-state inputs. Evidence in support of this minimum output entropy conjecture has accumulated, but a rigorous proof has not been obtained. In this paper, we propose a new minimum output entropy conjecture that, if proved correct, will establish that the capacity region of the bosonic broadcast channel equals the inner bound achieved using a coherent-state encoding and optimum detection. We provide some evidence that supports this new conjecture, but again a full proof is not available. Comment: 13 pages, 7 figures
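    The pure-loss result mentioned first has a compact closed form: the capacity of the single-user pure-loss channel with transmissivity eta and mean input photon number N is g(eta N), where g(x) = (x+1) log2(x+1) - x log2 x is the von Neumann entropy of a thermal state with mean photon number x. A small sketch of that formula (function names are ours; this is an illustration, not the paper's code):

```python
import math

def g(x):
    """Entropy in bits of a thermal state with mean photon number x:
    g(x) = (x+1) log2(x+1) - x log2(x), with g(0) = 0."""
    if x <= 0:
        return 0.0
    return (x + 1) * math.log2(x + 1) - x * math.log2(x)

def pure_loss_capacity(eta, nbar):
    """Capacity (bits per channel use) of the single-user pure-loss
    bosonic channel: g(eta * nbar)."""
    return g(eta * nbar)
```

    The broadcast inner bound discussed in the abstract is built from the same g function, which is why the proposed minimum output entropy conjecture would pin the region down exactly.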