
    Random Access Channel Coding in the Finite Blocklength Regime

    Consider a random access communication scenario over a channel whose operation is defined for any number of possible transmitters. Inspired by the model recently introduced by Polyanskiy for the Multiple Access Channel (MAC) with a fixed, known number of transmitters, we assume that the channel is invariant to permutations on its inputs and that all active transmitters employ identical encoders. Unlike Polyanskiy, we consider a scenario where neither the transmitters nor the receiver know which transmitters are active. We refer to this agnostic communication setup as the Random Access Channel, or RAC. Scheduled feedback of a finite number of bits is used to synchronize the transmitters. The decoder is tasked with determining from the channel output the number of active transmitters ($k$) and their messages, but not which transmitter sent which message. The decoding procedure occurs at a time $n_t$ depending on the decoder's estimate $t$ of the number of active transmitters $k$, thereby achieving a rate that varies with the number of active transmitters. Single-bit feedback at each time $n_i$, $i \leq t$, enables all transmitters to determine the end of one coding epoch and the start of the next. The central result of this work demonstrates the achievability on a RAC of performance that is first-order optimal for the MAC in operation during each coding epoch. While prior multiple access schemes for a fixed number of transmitters require $2^k - 1$ simultaneous threshold rules, the proposed scheme uses a single threshold rule and achieves the same dispersion.
    Comment: Presented at ISIT'18, submitted to IEEE Transactions on Information Theory
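    The epoch structure described above can be illustrated with a short sketch. The Python fragment below captures only the timing logic stated in the abstract (decoding attempts at scheduled times $n_1 < n_2 < \dots$, one feedback bit broadcast at each $n_i$, and the epoch ending at the first time $n_t$ at which decoding succeeds); the helper try_decode and all identifiers are hypothetical placeholders, not the paper's construction.

        from typing import Callable, List, Optional, Tuple

        def run_epoch(
            y: List[float],                        # channel output observed so far
            decode_times: List[int],               # scheduled times n_1 < n_2 < ...
            try_decode: Callable[[List[float], int], Optional[List[int]]],
        ) -> Tuple[Optional[int], Optional[List[int]], List[int]]:
            """Run one coding epoch of the sketched protocol.

            At the i-th scheduled time n_i the decoder hypothesizes i active
            transmitters and attempts to decode; it then broadcasts a single
            feedback bit (1 = epoch over, 0 = keep transmitting). The epoch ends
            at the first time n_t at which decoding succeeds, so the realized
            blocklength, and hence the rate, depends on the estimate t.
            """
            feedback_bits: List[int] = []
            for t, n_t in enumerate(decode_times, start=1):
                messages = try_decode(y[:n_t], t)  # hypothetical decoder call
                bit = 1 if messages is not None else 0
                feedback_bits.append(bit)          # single-bit feedback at time n_t
                if messages is not None:
                    return t, messages, feedback_bits
            return None, None, feedback_bits       # no attempt succeeded: erasure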

    Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

    This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory where we depart from the requirement that the error probability decays asymptotically in the blocklength. Instead, the error probabilities for various problems are bounded above by a non-vanishing constant and the spotlight is shone on achievable coding rates as functions of the growing blocklengths. This represents the study of asymptotic estimates with non-vanishing error probabilities. In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing, where the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with a growing number of independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and sometimes even third-order asymptotic expansions for point-to-point communication. In Part III, we consider network information theory problems for which the second-order asymptotics are known. These problems include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. Finally, we discuss avenues for further research.
    Comment: Further comments welcome
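    Two standard expansions of the kind the monograph develops (restated here for illustration, not quoted from it) make the non-vanishing-error setting concrete. For binary hypothesis testing between i.i.d. samples from $P$ and $Q$ with type-I error at most $\epsilon$, Strassen's result characterizes the optimal type-II error $\beta_{1-\epsilon}(P^n, Q^n)$ as

    $$ -\log \beta_{1-\epsilon}(P^n, Q^n) \;=\; n D(P\|Q) + \sqrt{n\,V(P\|Q)}\,\Phi^{-1}(\epsilon) + O(\log n), $$

    where $D(P\|Q)$ is the relative entropy, $V(P\|Q)$ the relative entropy variance, and $\Phi^{-1}$ the inverse of the standard Gaussian CDF. The analogous second-order (normal) approximation for the maximum code size $M^*(n,\epsilon)$ of a well-behaved point-to-point channel with capacity $C$ and dispersion $V$ reads

    $$ \log M^*(n,\epsilon) \;=\; n C + \sqrt{n V}\,\Phi^{-1}(\epsilon) + O(\log n), $$

    so for $\epsilon < 1/2$ the backoff from capacity at blocklength $n$ is on the order of $\sqrt{V/n}$.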

    Joint source-channel coding with feedback

    This paper quantifies the fundamental limits of variable-length transmission of a general (possibly analog) source over a memoryless channel with noiseless feedback, under a distortion constraint. We consider excess distortion, average distortion, and guaranteed distortion ($d$-semifaithful codes). In contrast to the asymptotic fundamental limit, a general conclusion is that allowing variable-length codes and feedback leads to a sizable improvement in the fundamental delay-distortion tradeoff. In addition, we investigate the minimum energy required to reproduce $k$ source samples with a given fidelity after transmission over a memoryless Gaussian channel, and we show that the required minimum energy is reduced with feedback and an average (rather than maximal) power constraint.
    Comment: To appear in IEEE Transactions on Information Theory
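    As a point of reference for the energy question (a classical benchmark, not a result of this paper): without a bandwidth constraint, the minimum energy needed to convey one bit reliably over an AWGN channel with noise variance $N_0/2$ per dimension is $N_0 \ln 2$ (about $-1.59$ dB of $E_b/N_0$), so reproducing $k$ source samples within distortion $d$ asymptotically requires energy at least

    $$ E^*(k, d) \;\approx\; k\, R(d)\, N_0 \ln 2, $$

    where $R(d)$ is the source's rate-distortion function in bits per sample. The abstract's claims concern how this limit is approached at finite $k$ and how feedback and an average power constraint reduce the energy required.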

    Finite-Block-Length Analysis in Classical and Quantum Information Theory

    Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite-size effects. The present paper reviews finite-size effects in classical and quantum information theory with respect to various topics, including applied aspects.

    Can Negligible Cooperation Increase Network Reliability?

    In network cooperation strategies, nodes work together with the aim of increasing transmission rates or reliability. This paper demonstrates that enabling cooperation between the transmitters of a two-user multiple access channel, via a cooperation facilitator that has access to both messages, always results in a network whose maximal- and average-error sum-capacities are the same, even when those capacities differ in the absence of cooperation and the information shared with the encoders is negligible. From this result, it follows that if a multiple access channel with no transmitter cooperation has different maximal- and average-error sum-capacities, then the maximal-error sum-capacity of the network consisting of this channel and a cooperation facilitator is not continuous with respect to the output edge capacities of the facilitator. This shows that there exist networks where sharing even a negligible number of bits per channel use with the encoders yields a non-negligible benefit.
    Comment: 27 pages, 3 figures. Submitted to the IEEE Transactions on Information Theory
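    The distinction at the heart of this result is between the two standard error criteria for a two-user MAC with message sets of sizes $M_1$ and $M_2$ (textbook definitions, restated here for clarity; notation is ours, not the paper's):

    $$ P_{e,\mathrm{avg}} \;=\; \frac{1}{M_1 M_2}\sum_{w_1=1}^{M_1}\sum_{w_2=1}^{M_2} \Pr\big[(\hat W_1,\hat W_2)\neq(w_1,w_2)\mid (W_1,W_2)=(w_1,w_2)\big], $$

    $$ P_{e,\max} \;=\; \max_{w_1,w_2} \Pr\big[(\hat W_1,\hat W_2)\neq(w_1,w_2)\mid (W_1,W_2)=(w_1,w_2)\big]. $$

    Achievability under the maximal-error criterion requires every message pair to be decoded reliably, a stronger demand than reliability on average; the abstract's claim is that negligible-rate cooperation through the facilitator closes the gap between the corresponding sum-capacities.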

    Lecture Notes on Network Information Theory

    These lecture notes have been converted to a book titled Network Information Theory, published recently by Cambridge University Press. This book provides a significantly expanded exposition of the material in the lecture notes as well as problems and bibliographic notes at the end of each chapter. The authors are currently preparing a set of slides based on the book that will be posted in the second half of 2012. More information about the book can be found at http://www.cambridge.org/9781107008731/. The previous (and obsolete) version of the lecture notes can be found at http://arxiv.org/abs/1001.3404v4/