
    Partial Strong Converse for the Non-Degraded Wiretap Channel

    We prove the partial strong converse property for the discrete memoryless non-degraded wiretap channel, for which we require the leakage to the eavesdropper to vanish but allow an asymptotic error probability $\epsilon \in [0,1)$ at the legitimate receiver. We show that when the transmission rate is above the secrecy capacity, the probability of correct decoding at the legitimate receiver decays to zero exponentially. Therefore, the maximum transmission rate is the same for every $\epsilon \in [0,1)$, and the partial strong converse property holds. Our work is inspired by a recently developed technique based on the information spectrum method and the Chernoff-Cramér bound for evaluating the exponent of the probability of correct decoding.
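    As a rough sketch of the claim (with notation assumed here rather than taken from the paper), if $C_s$ denotes the secrecy capacity, the result says that for any sequence of codes of rate $R > C_s$ with vanishing leakage,

    $$\Pr[\text{correct decoding}] \le \exp\{-n E(R)\} \quad \text{for some } E(R) > 0,$$

    so the error probability tends to one and the maximum achievable rate is insensitive to the target $\epsilon \in [0,1)$.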

    The invalidity of a strong capacity for a quantum channel with memory

    The strong capacity of a particular channel can be interpreted as a sharp limit on the amount of information which can be transmitted reliably over that channel. To evaluate the strong capacity of a particular channel one must prove both the direct part of the channel coding theorem and the strong converse for the channel. Here we consider the strong converse theorem for the periodic quantum channel and show some rather surprising results. We first show that the strong converse does not hold in general for this channel and therefore the channel does not have a strong capacity. Instead, we find that there is a scale of capacities corresponding to error probabilities between integer multiples of the inverse of the periodicity of the channel. A similar scale also exists for the random channel. Comment: 7 pages, double column. Comments welcome. Repeated equation removed and one reference added.
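    For orientation, the strong converse property invoked here can be phrased generically (this formulation reflects standard usage and is not quoted from the paper): every sequence of codes with rate $R$ above the capacity $C$ must satisfy

    $$\lim_{n \to \infty} p_{\mathrm{err}}(n) = 1 \quad \text{whenever } R > C,$$

    whereas a weak converse only keeps $p_{\mathrm{err}}(n)$ bounded away from zero. The abstract reports that this dichotomy fails for the periodic channel, producing a scale of $\epsilon$-dependent capacities.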

    Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

    This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory where we depart from the requirement that the error probability decays asymptotically in the blocklength. Instead, the error probabilities for various problems are bounded above by a non-vanishing constant and the spotlight is shone on achievable coding rates as functions of the growing blocklengths. This represents the study of asymptotic estimates with non-vanishing error probabilities. In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing where the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with growing number of independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and sometimes even third-order asymptotic expansions for point-to-point communication. In Part III, we consider network information theory problems for which the second-order asymptotics are known. These problems include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem and special cases of the Gaussian interference and multiple-access channels. Finally, we discuss avenues for further research. Comment: Further comments welcome.
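    A representative second-order expansion of the type developed in Part II is the normal approximation for a discrete memoryless channel with capacity $C$ and dispersion $V > 0$ (stated here as the standard result rather than quoted from the monograph):

    $$\log M^*(n, \epsilon) = nC + \sqrt{nV}\,\Phi^{-1}(\epsilon) + O(\log n),$$

    where $M^*(n,\epsilon)$ is the maximum code size at blocklength $n$ and average error probability $\epsilon$, and $\Phi^{-1}$ is the inverse of the standard Gaussian CDF.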

    Second-Order Coding Rates for Channels with State

    We study the performance limits of state-dependent discrete memoryless channels with a discrete state available at both the encoder and the decoder. We establish the epsilon-capacity as well as necessary and sufficient conditions for the strong converse property for such channels when the sequence of channel states is not necessarily stationary, memoryless or ergodic. We then seek a finer characterization of these capacities in terms of second-order coding rates. The general results are supplemented by several examples including i.i.d. and Markov states and mixed channels.
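    For context, the $\epsilon$-capacity referred to here is the usual operational quantity (a generic definition, assumed rather than quoted from the paper):

    $$C_\epsilon = \sup\bigl\{ R : \exists \text{ codes of rate } R \text{ with } \limsup_{n\to\infty} p_{\mathrm{err}}(n) \le \epsilon \bigr\},$$

    and the strong converse property holds precisely when $C_\epsilon$ is constant over $\epsilon \in (0,1)$.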

    Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach

    This paper establishes that the strong converse holds for some classes of discrete memoryless multimessage multicast networks (DM-MMNs) whose corresponding cut-set bounds are tight, i.e., coincide with the set of achievable rate tuples. The strong converse for these classes of DM-MMNs implies that all sequences of codes with rate tuples belonging to the exterior of the cut-set bound have average error probabilities that necessarily tend to one (and are not simply bounded away from zero). Examples in the classes of DM-MMNs include wireless erasure networks, DM-MMNs consisting of independent discrete memoryless channels (DMCs) as well as single-destination DM-MMNs consisting of independent DMCs with destination feedback. Our elementary proof technique leverages properties of the Rényi divergence. Comment: Submitted to IEEE Transactions on Information Theory, Jul 18, 2014. Revised on Jul 31, 201
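    The Rényi divergence underlying the proof technique is, in its standard form for distributions $P$ and $Q$ on a finite alphabet and order $\alpha \in (0,1) \cup (1,\infty)$ (standard definition, not quoted from the paper):

    $$D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha} Q(x)^{1-\alpha},$$

    which recovers the Kullback-Leibler divergence $D(P\|Q)$ in the limit $\alpha \to 1$.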