
    Capacity and Random-Coding Exponents for Channel Coding with Side Information

    Capacity formulas and random-coding exponents are derived for a generalized family of Gel'fand-Pinsker coding problems. These exponents yield asymptotic upper bounds on the achievable log-probability of error. In our model, information is to be reliably transmitted through a noisy channel with finite input and output alphabets and a random state sequence, and the channel is selected by a hypothetical adversary. Partial information about the state sequence is available to the encoder, adversary, and decoder. The design of the transmitter is subject to a cost constraint. Two families of channels are considered: 1) compound discrete memoryless channels (CDMC), and 2) channels with arbitrary memory, subject to an additive cost constraint or, more generally, to a hard constraint on the conditional type of the channel output given the input. The two problems are closely connected. The random-coding exponent is achieved using a stacked binning scheme and a maximum penalized mutual information (MPMI) decoder, which may be thought of as an empirical generalized maximum a posteriori decoder. For channels with arbitrary memory, the random-coding exponents are larger than their CDMC counterparts. Applications of this study include watermarking, data hiding, communication in the presence of partially known interferers, and problems such as broadcast channels, all of which involve the fundamental idea of binning.
    Comment: to appear in IEEE Transactions on Information Theory, without Appendices G and
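    To make the decoding rule concrete, here is a minimal sketch of empirical mutual information decoding over finite alphabets, in which the decoder scores each candidate codeword by the mutual information of its joint type with the channel output. This is not the paper's exact construction: the MPMI decoder subtracts an additional penalty term, represented here only by a hypothetical `penalty` hook, and all function names are illustrative.

```python
import numpy as np

def empirical_mutual_information(x, y, nx, ny):
    """Empirical mutual information (in nats) of the joint type of (x, y),
    where x and y are integer sequences over alphabets of sizes nx and ny."""
    n = len(x)
    joint = np.zeros((nx, ny))
    for a, b in zip(x, y):
        joint[a, b] += 1.0
    joint /= n                      # joint type (empirical joint distribution)
    px = joint.sum(axis=1)          # empirical marginal of x
    py = joint.sum(axis=0)          # empirical marginal of y
    mi = 0.0
    for a in range(nx):
        for b in range(ny):
            if joint[a, b] > 0:
                mi += joint[a, b] * np.log(joint[a, b] / (px[a] * py[b]))
    return mi

def mmi_decode(codebook, y, nx, ny, penalty=None):
    """Return the index of the codeword with the largest (penalized)
    empirical mutual information with the received sequence y."""
    best_m, best_score = 0, -np.inf
    for m, x in enumerate(codebook):
        score = empirical_mutual_information(x, y, nx, ny)
        if penalty is not None:
            score -= penalty(x)     # hypothetical penalty hook (cost/rate term)
        if score > best_score:
            best_m, best_score = m, score
    return best_m
```

    Because the score depends only on the joint type of the codeword and the received sequence, the rule requires no knowledge of the actual channel law, which is what makes this style of decoding suited to compound and adversarial channel families.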

    Zero-rate feedback can achieve the empirical capacity

    The utility of limited feedback for coding over an individual sequence of DMCs is investigated. This study complements recent results showing how limited or noisy feedback can boost the reliability of communication. A strategy with fixed input distribution P is given that asymptotically achieves rates arbitrarily close to the mutual information induced by P and the state-averaged channel. When the capacity-achieving input distribution is the same over all channel states, this strategy achieves rates at least as large as the capacity of the state-averaged channel, sometimes called the empirical capacity.
    Comment: Revised version of paper originally submitted to IEEE Transactions on Information Theory, Nov. 2007. This version contains further revisions and clarifications.
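    The central quantity here is the mutual information induced by the fixed input distribution P and the state-averaged channel. Below is a minimal sketch of that computation, assuming finite alphabets and row-stochastic channel matrices; function names are illustrative, and in the paper the averaging weights would come from the empirical state frequencies along the individual sequence rather than the uniform weights used in the example.

```python
import numpy as np

def mutual_information(P, W):
    """I(P, W) in nats for input distribution P and channel matrix W,
    where W[x, y] = Pr(y | x)."""
    q = P @ W                       # output distribution induced by P
    mi = 0.0
    for x in range(len(P)):
        for y in range(len(q)):
            if P[x] > 0 and W[x, y] > 0:
                mi += P[x] * W[x, y] * np.log(W[x, y] / q[y])
    return mi

def state_averaged_channel(channels, weights):
    """Weighted average of per-state channel matrices (shape: states x nx x ny)."""
    return np.tensordot(weights, np.asarray(channels), axes=1)

# Example: two binary-symmetric channel states occurring equally often.
W0 = np.array([[0.9, 0.1], [0.1, 0.9]])
W1 = np.array([[0.7, 0.3], [0.3, 0.7]])
W_bar = state_averaged_channel([W0, W1], np.array([0.5, 0.5]))
print(mutual_information(np.array([0.5, 0.5]), W_bar))  # rate target for uniform P
```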

    On error exponents for arbitrarily varying channels

    The minimum probability of error achievable by random codes on the arbitrarily varying channel (AVC) is investigated. New exponential error bounds are found and applied to the AVC with and without input and state constraints. Also considered is a simple subclass of random codes, called randomly modulated codes, in which the encoding and decoding operations are separate from the code randomization. A universal coding theorem is proved which shows the existence of randomly modulated codes that achieve the same error bounds as “fully” random codes for all AVCs.
    Index Terms: Arbitrarily varying channels, error exponents, random codes, jamming.
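    As a rough illustration of the randomly modulated idea, here is a sketch in which the code randomization is an outer masking layer shared between transmitter and receiver, kept separate from the fixed encoder and decoder. The additive mod-q mask and all names are assumptions of this sketch, not the paper's construction; for a general AVC the decoder would decode against the key-shifted codebook rather than literally subtracting the key.

```python
import numpy as np

Q = 4       # channel input alphabet size (assumption for this sketch)
N = 8       # block length

def modulate(codeword, key):
    """Outer masking layer: applied after the fixed encoder runs."""
    return (codeword + key) % Q

def demodulate(received, key):
    """Strip the mask before the fixed decoder runs. Literal subtraction is
    only valid under a mod-additive channel model; in general the decoder
    would instead decode against the key-shifted codebook."""
    return (received - key) % Q

rng = np.random.default_rng(0)
codeword = rng.integers(0, Q, size=N)   # stand-in for a fixed encoder's output
key = rng.integers(0, Q, size=N)        # common randomness shared in advance

tx = modulate(codeword, key)
# ... tx would pass through the arbitrarily varying channel here ...
rx = demodulate(tx, key)                # noiseless round trip, for illustration
assert np.array_equal(rx, codeword)
```

    The point of the separation is practical: the same deterministic encoder and decoder can be reused unchanged, with all randomization concentrated in the shared key, yet the construction still attains the error bounds of fully random codes.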