
    Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

    This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory in which we depart from the requirement that the error probability decays asymptotically in the blocklength. Instead, the error probabilities for the various problems are bounded above by a non-vanishing constant, and the spotlight is shone on the achievable coding rates as functions of the growing blocklength. This is the study of asymptotic estimates with non-vanishing error probabilities. In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing, where the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with a growing number of independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and, sometimes, even third-order asymptotic expansions for point-to-point communication. In Part III, we consider network information theory problems for which the second-order asymptotics are known. These problems include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. Finally, we discuss avenues for further research. (Comment: further comments welcome.)
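
    The flavor of these expansions can be sketched compactly in standard notation (assumed here, not necessarily the monograph's): Strassen's expansion for the minimum type-II error $\beta_{1-\epsilon}$ of binary hypothesis testing when the type-I error is at most $\epsilon$, and the resulting second-order (normal) approximation for the maximum code size $M^*(n,\epsilon)$ over a point-to-point channel with capacity $C$ and dispersion $V$.

```latex
% Strassen-type expansion for binary hypothesis testing between P and Q;
% D(P||Q) is the relative entropy, V(P||Q) the relative entropy variance,
% and Q^{-1} the inverse of the standard Gaussian complementary cdf.
-\log \beta_{1-\epsilon}\!\left(P^{\otimes n}, Q^{\otimes n}\right)
  = n\,D(P\|Q) + \sqrt{n\,V(P\|Q)}\; Q^{-1}(\epsilon) + O(\log n)

% Resulting normal approximation for point-to-point channel coding:
\log M^{*}(n,\epsilon) = n\,C - \sqrt{n\,V}\; Q^{-1}(\epsilon) + O(\log n)
```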

    Nonasymptotic noisy lossy source coding

    This paper shows new general nonasymptotic achievability and converse bounds and performs their dispersion analysis for the lossy compression problem in which the compressor observes the source through a noisy channel. While this problem is asymptotically equivalent to a noiseless lossy source coding problem with a modified distortion function, nonasymptotically there is a noticeable gap in how fast their minimum achievable coding rates approach the common rate-distortion function, as evidenced both by the refined asymptotic analysis (dispersion) and the numerical results. The size of the gap between the dispersions of the noisy problem and the asymptotically equivalent noiseless problem depends on the stochastic variability of the channel through which the compressor observes the source. (Comment: IEEE Transactions on Information Theory, 201)
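
    As a point of reference (standard notation assumed, not the paper's), the dispersion analysis mentioned here refines the rate-distortion function $R(d)$ into a blocklength-dependent approximation; the noisy and the surrogate noiseless problems share the same $R(d)$ but differ in the dispersion term $V(d)$, which is what creates the gap described above.

```latex
% Gaussian (dispersion) approximation to the minimum coding rate at
% blocklength k, distortion level d and excess-distortion probability eps:
R(k, d, \epsilon) \approx R(d) + \sqrt{\frac{V(d)}{k}}\; Q^{-1}(\epsilon)
```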

    Quantum soft-covering lemma with applications to rate-distortion coding, resolvability and identification via quantum channels

    We propose a quantum soft-covering problem for a given general quantum channel and one of its output states, which consists of finding the minimum rank of an input state needed to approximate the given channel output. We then prove a one-shot quantum covering lemma in terms of smooth min-entropies by leveraging decoupling techniques from quantum Shannon theory. This covering result is shown to be equivalent to a coding theorem for rate distortion under a posterior (reverse) channel distortion criterion [Atif, Sohail, Pradhan, arXiv:2302.00625]. Both one-shot results directly yield corollaries about the i.i.d. asymptotics, in terms of the coherent information of the channel. The power of our quantum covering lemma is demonstrated by two additional applications: first, we formulate a quantum channel resolvability problem and provide one-shot as well as asymptotic upper and lower bounds; second, we provide new upper bounds on the unrestricted and simultaneous identification capacities of quantum channels, in particular separating for the first time the simultaneous identification capacity from the unrestricted one, proving a long-standing conjecture of the last author. (Comment: 29 pages, 3 figures; v2 fixes an error in Definition 6.1 and various typos and minor issues throughout.)
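
    For readers less familiar with the quantities involved, the asymptotic rates mentioned above are phrased in terms of the coherent information, whose standard definition is sketched below (notation assumed, not taken from the paper): for a channel $\mathcal{N}\colon A \to B$ with input state $\rho_A$ purified by $\psi_{RA}$, and $\sigma_{RB} = (\mathrm{id}_R \otimes \mathcal{N})(\psi_{RA})$,

```latex
% Coherent information of the channel N for input rho, with S(.) the von Neumann
% entropy evaluated on the marginals of sigma_{RB}:
I_c(\rho, \mathcal{N}) = S(B)_{\sigma} - S(RB)_{\sigma}
```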

    Variable-length compression allowing errors

    This paper studies the fundamental limits of the minimum average length of lossless and lossy variable-length compression, allowing a nonzero error probability $\epsilon$ in the lossless case. We give non-asymptotic bounds on the minimum average length in terms of Erokhin's rate-distortion function, and we use those bounds to obtain a Gaussian approximation of the speed of approach to the limit, which is quite accurate for all but small blocklengths: $(1 - \epsilon)\, k H(\mathsf{S}) - \sqrt{\frac{k V(\mathsf{S})}{2 \pi}}\, e^{-\frac{(Q^{-1}(\epsilon))^2}{2}}$, where $Q^{-1}(\cdot)$ is the functional inverse of the standard Gaussian complementary cdf and $V(\mathsf{S})$ is the source dispersion. A nonzero error probability thus not only reduces the asymptotically achievable rate by a factor of $1 - \epsilon$, but this asymptotic limit is approached from below, i.e., larger source dispersions and shorter blocklengths are beneficial. Variable-length lossy compression under an excess distortion constraint is shown to exhibit similar properties.
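
    As a numerical illustration of the displayed approximation, the sketch below evaluates it for a hypothetical i.i.d. Bernoulli(p) source; the choice of source, p, ϵ and blocklengths are assumptions made purely for illustration, with all logarithms in bits.

```python
# Gaussian approximation to the minimum average compressed length:
#   (1 - eps) * k * H(S) - sqrt(k * V(S) / (2*pi)) * exp(-Qinv(eps)**2 / 2)
import math
from statistics import NormalDist


def bernoulli_H_V(p: float) -> tuple[float, float]:
    """Entropy H(S) and varentropy (source dispersion) V(S) of a Bernoulli(p) source, in bits."""
    H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    V = p * (1 - p) * math.log2((1 - p) / p) ** 2
    return H, V


def avg_length_approx(k: int, eps: float, p: float) -> float:
    """Approximate minimum average length at blocklength k and error probability eps."""
    H, V = bernoulli_H_V(p)
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps) = Phi^{-1}(1 - eps)
    return (1 - eps) * k * H - math.sqrt(k * V / (2 * math.pi)) * math.exp(-q_inv ** 2 / 2)


if __name__ == "__main__":
    for k in (100, 1_000, 10_000):
        print(f"k = {k:6d}: approx. average length = {avg_length_approx(k, eps=0.1, p=0.11):.1f} bits")
```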

    Successive Refinement of Shannon Cipher System Under Maximal Leakage

    We study the successive refinement setting of the Shannon cipher system (SCS) under a maximal leakage constraint for discrete memoryless sources with bounded distortion measures. Specifically, we generalize the threat model for the point-to-point rate-distortion setting of Issa, Wagner and Kamath (T-IT 2020) to the multiterminal successive refinement setting. Under mild conditions that correspond to partial secrecy, we characterize the asymptotically optimal normalized maximal leakage region under both the joint excess-distortion probability (JEP) and the expected distortion reliability constraints. Under JEP, in the achievability part, we propose a type-based coding scheme, analyze its reliability guarantee for JEP, and bound the leakage of the information source through its compressed versions. In the converse part, by analyzing a guessing scheme of the eavesdropper, we prove the optimality of our achievability result. Under expected distortion, the achievability part is established similarly to its JEP counterpart, while the converse proof proceeds by generalizing the corresponding results for the rate-distortion setting of the SCS by Schieler and Cuff (T-IT 2014) to the successive refinement setting. Somewhat surprisingly, the normalized maximal leakage regions under the JEP and expected distortion constraints are identical under certain conditions, even though JEP appears to be the stronger reliability constraint.
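
    For context, the maximal leakage measure of Issa, Wagner and Kamath referenced above has, for finite alphabets, the closed form sketched below (a standard statement recalled here for convenience; the normalization and the successive refinement generalization are as described in the abstract).

```latex
% Maximal leakage from the source X to the eavesdropper's observation Y;
% the maximum runs over x with P_X(x) > 0.
\mathcal{L}(X \to Y) = \log \sum_{y} \max_{x \,:\, P_X(x) > 0} P_{Y\mid X}(y \mid x)
```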