
    One-shot lossy quantum data compression

    We provide a framework for one-shot quantum rate distortion coding, in which the goal is to determine the minimum number of qubits required to compress quantum information as a function of the probability that the distortion incurred upon decompression exceeds some specified level. We obtain a one-shot characterization of the minimum qubit compression size for an entanglement-assisted quantum rate-distortion code in terms of the smooth max-information, a quantity previously employed in the one-shot quantum reverse Shannon theorem. Next, we show how this characterization converges to the known expression for the entanglement-assisted quantum rate distortion function for asymptotically many copies of a memoryless quantum information source. Finally, we give a tight, finite blocklength characterization for the entanglement-assisted minimum qubit compression size of a memoryless isotropic qubit source subject to an average symbol-wise distortion constraint. Comment: 36 pages
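    For orientation, the smooth max-information named above is, in the one-shot quantum reverse Shannon literature, conventionally built from the max-relative entropy; the LaTeX sketch below records those standard definitions. The smoothing set and metric shown here are assumptions and may differ from the conventions used in the paper.

```latex
% Standard definitions (conventions vary; the smoothing ball may be taken
% in purified distance or trace distance depending on the paper).
\begin{align*}
  D_{\max}(\rho \,\|\, \sigma)
    &= \log \min \{\lambda \ge 0 : \rho \le \lambda \sigma\}, \\
  I_{\max}(A;B)_{\rho}
    &= \min_{\sigma_B} D_{\max}\!\left(\rho_{AB} \,\middle\|\, \rho_A \otimes \sigma_B\right), \\
  I_{\max}^{\varepsilon}(A;B)_{\rho}
    &= \min_{\tilde{\rho}_{AB} \,\approx_{\varepsilon}\, \rho_{AB}} I_{\max}(A;B)_{\tilde{\rho}}.
\end{align*}
```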

    The Likelihood Encoder for Lossy Compression

    A likelihood encoder is studied in the context of lossy source compression. The analysis of the likelihood encoder is based on the soft-covering lemma. It is demonstrated that the use of a likelihood encoder together with the soft-covering lemma yields simple achievability proofs for classical source coding problems. The cases of the point-to-point rate-distortion function, the rate-distortion function with side information at the decoder (i.e., the Wyner-Ziv problem), and the multi-terminal source coding inner bound (i.e., the Berger-Tung problem) are examined in this paper. Furthermore, a non-asymptotic analysis is used for the point-to-point case to examine the upper bound on the excess distortion provided by this method. The likelihood encoder is also related to a recent alternative technique using properties of random binning.
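    As a rough illustration of the likelihood-encoder idea, the following Python sketch compresses a binary sequence by drawing a codebook index with probability proportional to the test-channel likelihood of the observed sequence. The codebook size, the Bernoulli test channel, and all names (likelihood_encoder_demo, test_flip, etc.) are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def likelihood_encoder_demo(x, rate, p_u=0.5, test_flip=0.2, rng=None):
    """Toy likelihood encoder for a binary source with Hamming distortion.

    x         : observed source sequence (0/1 numpy array of length n)
    rate      : code rate in bits per symbol (codebook size is 2**(n*rate))
    p_u       : marginal used to draw the random codebook U^n(m) ~ Bern(p_u)
    test_flip : test channel P(X|U) flips the codeword symbol with this probability
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    num_codewords = int(2 ** (n * rate))

    # Random codebook: each row is a candidate reconstruction sequence u^n(m).
    codebook = rng.binomial(1, p_u, size=(num_codewords, n))

    # Log-likelihood of x under the test channel for every codeword:
    # P(x^n | u^n(m)) = (1 - test_flip)^(# agreements) * test_flip^(# disagreements)
    disagreements = (codebook != x).sum(axis=1)
    log_lik = (n - disagreements) * np.log(1 - test_flip) + disagreements * np.log(test_flip)

    # Likelihood encoder: choose index m with probability proportional to P(x^n | u^n(m)).
    probs = np.exp(log_lik - log_lik.max())
    probs /= probs.sum()
    m = rng.choice(num_codewords, p=probs)

    reconstruction = codebook[m]
    distortion = (reconstruction != x).mean()   # per-symbol Hamming distortion
    return m, reconstruction, distortion

# Example: compress a length-20 Bernoulli(1/2) sequence at 0.5 bits per symbol.
rng = np.random.default_rng(0)
x = rng.binomial(1, 0.5, size=20)
index, x_hat, d = likelihood_encoder_demo(x, rate=0.5, rng=rng)
print(index, d)
```

    At rates above the rate-distortion function, the soft-covering lemma is what guarantees that this stochastic index choice induces approximately the intended joint distribution between source and reconstruction, which is, roughly, how the achievability proofs mentioned above proceed.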

    Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

    This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory where we depart from the requirement that the error probability decays asymptotically in the blocklength. Instead, the error probabilities for various problems are bounded above by a non-vanishing constant, and the spotlight is shone on achievable coding rates as functions of the growing blocklengths. This represents the study of asymptotic estimates with non-vanishing error probabilities. In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing, where the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with a growing number of independent observations is characterized. In Part II, we use this basic hypothesis-testing result to develop second- and sometimes even third-order asymptotic expansions for point-to-point communication. In Part III, we consider network information theory problems for which the second-order asymptotics are known. These problems include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. Finally, we discuss avenues for further research. Comment: Further comments welcome
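    To make the second-order expansions concrete, the Python sketch below evaluates the normal approximation (1/n) log M*(n, ε) ≈ C − sqrt(V/n) Q⁻¹(ε) for a binary symmetric channel, the canonical example of such an expansion. The channel and parameter values are illustrative assumptions, and higher-order terms are dropped.

```python
import numpy as np
from scipy.stats import norm

def bsc_normal_approximation(n, eps, p):
    """Second-order (normal) approximation to the maximal rate of a BSC(p).

    Returns the approximate maximal rate (bits per channel use) achievable at
    blocklength n with error probability eps:
        (1/n) log M*(n, eps) ~ C - sqrt(V/n) * Q^{-1}(eps)
    where C is the capacity and V is the channel dispersion.
    """
    h = lambda q: -q * np.log2(q) - (1 - q) * np.log2(1 - q)   # binary entropy (bits)
    C = 1.0 - h(p)                                             # BSC capacity
    V = p * (1 - p) * (np.log2((1 - p) / p)) ** 2              # BSC dispersion (bits^2)
    return C - np.sqrt(V / n) * norm.isf(eps)                  # Q^{-1}(eps) = norm.isf(eps)

# Example: BSC with crossover probability 0.11, blocklength 1000, error probability 1e-3.
print(bsc_normal_approximation(n=1000, eps=1e-3, p=0.11))
```

    For the BSC, a (1/2)(log n)/n third-order term is also known; refinements of that kind are what the "even third-order" expansions mentioned above capture.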

    Consciousness, cognition, and the hierarchy of context: extending the global neuronal workspace model

    We adapt an information theory analysis of interacting cognitive biological and social modules to the problem of the global neuronal workspace, the new standard neuroscience paradigm for consciousness. Tunable punctuation emerges in a natural way, suggesting the possibility of fitting appropriate phase transition power law expressions, and, away from transition, generalized Onsager relation expressions, to observational data on conscious reaction. The development can be extended in a straightforward manner to include psychosocial stress, culture, or other cognitive modules that constitute a structured, embedding hierarchy of contextual constraints acting at a slower rate than neuronal function itself. This produces a 'biopsychosociocultural' model of individual consciousness that, while otherwise quite close to the standard treatment, meets compelling philosophical and other objections to brain-only descriptions.