
    First- and Second-Order Hypothesis Testing for Mixed Memoryless Sources with General Mixture

    The first- and second-order optimum achievable exponents in the simple hypothesis testing problem are investigated. The optimum achievable exponent for the type II error probability, under the constraint that the type I error probability is asymptotically at most epsilon, is called the epsilon-optimum exponent. In this paper, we first give the second-order epsilon-optimum exponent in the case where the null hypothesis is a mixed memoryless source and the alternative hypothesis is a stationary memoryless source. We then generalize this setting to the case where the alternative hypothesis is also a mixed memoryless source, and determine the first-order epsilon-optimum exponent in this setting. In addition, we discuss an extension of our results to more general settings, such as hypothesis testing with mixed general sources, as well as the relationship to the general compound hypothesis testing problem.
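
    For orientation, the quantity being optimized can be written as follows (a generic sketch; the paper's own definitions may differ in detail). Writing P^n and Q^n for the null and alternative hypotheses and A_n for the acceptance region, the epsilon-optimum exponent is

        \[
          B_e(\varepsilon \mid P \,\|\, Q) = \sup\Big\{ E \;:\; \exists\, \{A_n\} \text{ s.t. } \limsup_{n\to\infty} P^n(A_n^{\mathrm{c}}) \le \varepsilon \ \text{and}\ \liminf_{n\to\infty} -\tfrac{1}{n}\log Q^n(A_n) \ge E \Big\}.
        \]

    When both hypotheses are stationary memoryless, Stein's lemma with its strong converse gives B_e(epsilon) = D(P||Q) for every epsilon in (0,1); the mixed-source setting studied here is precisely where this first-order picture becomes more delicate.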

    Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

    This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory in which we depart from the requirement that the error probability decays asymptotically in the blocklength. Instead, the error probabilities for the various problems are bounded above by a non-vanishing constant, and the spotlight is shone on the achievable coding rates as functions of the growing blocklength. This represents the study of asymptotic estimates with non-vanishing error probabilities. In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing, where the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with a growing number of independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and, in some cases, even third-order asymptotic expansions for point-to-point communication. Finally, in Part III, we consider network information theory problems for which the second-order asymptotics are known; these include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. We conclude by discussing avenues for further research.
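
    For orientation, Strassen's expansion discussed in Part I is commonly stated as follows (a sketch in standard notation; the error term is only indicative). With beta_epsilon denoting the minimum type-II error over tests whose type-I error is at most epsilon,

        \[
          -\log \beta_{\varepsilon}(P^{n} \,\|\, Q^{n}) = n D(P\|Q) + \sqrt{n V(P\|Q)}\;\Phi^{-1}(\varepsilon) + O(\log n),
        \]

    where D is the relative entropy, V is the relative entropy variance, and \Phi^{-1} is the inverse of the standard Gaussian distribution function. The second- and third-order expansions of Part II have the same Gaussian shape, with channel or source dispersions playing the role of V.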

    One-shot lossy quantum data compression

    We provide a framework for one-shot quantum rate distortion coding, in which the goal is to determine the minimum number of qubits required to compress quantum information as a function of the probability that the distortion incurred upon decompression exceeds some specified level. We obtain a one-shot characterization of the minimum qubit compression size for an entanglement-assisted quantum rate-distortion code in terms of the smooth max-information, a quantity previously employed in the one-shot quantum reverse Shannon theorem. Next, we show how this characterization converges to the known expression for the entanglement-assisted quantum rate distortion function for asymptotically many copies of a memoryless quantum information source. Finally, we give a tight, finite blocklength characterization for the entanglement-assisted minimum qubit compression size of a memoryless isotropic qubit source subject to an average symbol-wise distortion constraint.
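
    For context, the asymptotic benchmark that the one-shot characterization converges to is the entanglement-assisted quantum rate-distortion function; schematically (the notation here is ours, not a quotation of the paper),

        \[
          R^{\mathrm{ea}}(D) = \tfrac{1}{2} \min_{\mathcal{N} \,:\, \Delta(\mathcal{N}) \le D} I(R;B)_{\omega},
        \]

    where the minimization is over quantum channels \mathcal{N} acting on the source whose distortion \Delta(\mathcal{N}) is at most D, and I(R;B) is the quantum mutual information between the reference system and the channel output. The factor of 1/2 is the usual entanglement-assisted conversion between qubits and bits of mutual information.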

    Second-Order Asymptotics of Visible Mixed Quantum Source Coding via Universal Codes

    The simplest example of a quantum information source with memory is a mixed source, which emits signals entirely from one of two memoryless quantum sources with given a priori probabilities. Considering a mixed source consisting of a general one-parameter family of memoryless sources, we derive the second-order asymptotic rate for fixed-length visible source coding. Furthermore, we specialize our main result to a mixed source consisting of two memoryless sources. Our results provide the first example of second-order asymptotics for a quantum information-processing task employing a resource with memory. For the case of a classical mixed source (over a finite alphabet), our results reduce to those obtained by Nomura and Han [IEEE Trans. Inf. Theory, 59(1):1-16, 2013]. To prove the achievability part of our main result, we introduce universal quantum source codes achieving second-order asymptotic rates; these are obtained by an extension of Hayashi's construction [IEEE Trans. Inf. Theory, 54(10):4619-4637, 2008] of their classical counterparts.
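
    As a classical point of reference (this is not the paper's mixed-source formula), the second-order expansion for fixed-length compression of a single memoryless source P with error probability at most epsilon reads

        \[
          \log M_n^{*}(\varepsilon) = n H(P) + \sqrt{n V(P)}\;\Phi^{-1}(\varepsilon) + o(\sqrt{n}),
        \]

    where H(P) is the Shannon entropy and V(P) the varentropy of P. For a mixed source, the Gaussian term is replaced by an expression that couples the component sources through their a priori weights, which is what the Nomura-Han result and its quantum generalization here capture.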

    Smooth Rényi Entropies and the Quantum Information Spectrum

    Many of the traditional results in information theory, such as the channel coding theorem or the source coding theorem, are restricted to scenarios where the underlying resources are independent and identically distributed (i.i.d.) over a large number of uses. To overcome this limitation, two different techniques, the information spectrum method and the smooth entropy framework, have been developed independently. They are based on new entropy measures, called spectral entropy rates and smooth entropies, respectively, that generalize Shannon entropy (in the classical case) and von Neumann entropy (in the more general quantum case). Here, we show that the two techniques are closely related. More precisely, the spectral entropy rate can be seen as the asymptotic limit of the smooth entropy. Our results apply to the quantum setting and thus include the classical setting as a special case.
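
    To make the correspondence concrete in the classical case (a schematic statement; see the paper for the quantum version and the precise conditions): for a sequence of random variables X = (X^n), the inf-spectral entropy rate is recovered from the smooth min-entropy as

        \[
          \underline{H}(\mathbf{X}) = \lim_{\varepsilon \to 0} \liminf_{n\to\infty} \tfrac{1}{n} H_{\min}^{\varepsilon}(X^{n}),
        \]

    where H_min^epsilon(X^n) is the largest value of -log max_x p(x) over distributions epsilon-close to that of X^n, and an analogous identity links the sup-spectral entropy rate to the smooth max-entropy.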

    Properties of Noncommutative Rényi and Augustin Information

    The scaled Rényi information plays a significant role in evaluating the performance of information-processing tasks by virtue of its connection to error exponent analysis. In quantum information theory, there are three generalizations of the classical Rényi divergence: the Petz, sandwiched, and log-Euclidean versions, all of which possess meaningful operational interpretations. However, these scaled noncommutative Rényi informations are much less explored than their classical counterpart, and the lack of crucial properties hinders applications of these quantities to refined performance analysis. The goal of this paper is thus to analyze fundamental properties of the scaled Rényi information from a noncommutative measure-theoretic perspective. First, we prove uniform equicontinuity for all three quantum versions of the Rényi information, which yields the joint continuity of these quantities in the orders and priors. Second, we establish concavity in the region s ∈ (-1, 0) for both the Petz and sandwiched versions. This settles open questions raised by Holevo [IEEE Trans. Inf. Theory, 46(6):2256-2261, 2000; https://ieeexplore.ieee.org/document/868501/] and by Mosonyi and Ogawa [Commun. Math. Phys., 355(1):373-426, 2017; https://doi.org/10.1007/s00220-017-2928-4/]. As applications, we show that the strong converse exponent in classical-quantum channel coding satisfies a minimax identity. The established concavity is further employed to prove an entropic duality between classical data compression with quantum side information and classical-quantum channel coding, and a Fenchel duality in joint source-channel coding with quantum side information, in forthcoming papers.
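
    For concreteness, the Petz and sandwiched Rényi divergences referred to above are, for density operators rho, sigma and order alpha in (0,1) ∪ (1,∞),

        \[
          D_{\alpha}(\rho\|\sigma) = \frac{1}{\alpha-1}\,\log \mathrm{Tr}\big[\rho^{\alpha}\sigma^{1-\alpha}\big],
          \qquad
          \widetilde{D}_{\alpha}(\rho\|\sigma) = \frac{1}{\alpha-1}\,\log \mathrm{Tr}\Big[\big(\sigma^{\frac{1-\alpha}{2\alpha}}\,\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\big)^{\alpha}\Big],
        \]

    both of which reduce to the classical Rényi divergence when rho and sigma commute. The scaled Rényi informations studied in the paper are built from such divergences by optimizing over priors, with the order reparametrized by s.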

    Distributed Channel Synthesis

    Two familiar notions of correlation are rediscovered as the extreme operating points for distributed synthesis of a discrete memoryless channel, in which a stochastic channel output is generated based on a compressed description of the channel input. Wyner's common information is the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal trade-off between the amount of common randomness used and the required rate of description. We also present a number of related derivations, including the effect of limited local randomness, rate requirements for secrecy, applications to game theory, and new insights into common information duality. Our proof makes use of a soft covering lemma, known in the literature for its role in quantifying the resolvability of a channel. The direct proof (achievability) constructs a feasible joint distribution over all parts of the system using a soft covering, from which the behavior of the encoder and decoder is inferred, with no explicit reference to joint typicality or binning. Of auxiliary interest, this work also generalizes and strengthens this soft covering tool.
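
    In outline, the trade-off has the following single-letter form (reproduced from memory of the published theorem, so treat it as a sketch): a description rate R with common randomness rate R_0 suffices to synthesize the channel p(y|x) on input distribution p(x) if and only if there is an auxiliary variable U forming the Markov chain X - U - Y, consistent with p(x)p(y|x), such that

        \[
          R \ge I(X;U), \qquad R + R_0 \ge I(X,Y;U).
        \]

    The two extremes mentioned above drop out directly: with R_0 = 0 the binding constraint is Wyner's common information C(X;Y) = min I(X,Y;U), and as R_0 grows the description rate falls to Shannon's mutual information I(X;Y), achieved by taking U = Y.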

    Strong Converse and Second-Order Asymptotics of Channel Resolvability

    We study the problem of channel resolvability for fixed i.i.d. input distributions and discrete memoryless channels (DMCs), and derive the strong converse theorem for DMCs that are not necessarily full rank. We also derive the optimal second-order rate under a certain condition. Furthermore, under the condition that a DMC has a unique capacity-achieving input distribution, we derive the optimal second-order rate of channel resolvability for the worst input distribution.
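
    For reference, the quantity studied can be phrased as follows (our notation, a sketch): given an input distribution P and channel W, the epsilon-resolvability rate is the smallest R such that some codebook C_n of at most e^{nR} codewords, driven by a uniformly random index, brings the induced output distribution close to the target output:

        \[
          S(\varepsilon \mid P, W) = \inf\Big\{ R \;:\; \exists\, \mathcal{C}_n,\ |\mathcal{C}_n| \le e^{nR},\ \limsup_{n\to\infty} d\big(P_{Y^n}^{\mathcal{C}_n},\,(PW)^{n}\big) \le \varepsilon \Big\},
        \]

    with d a suitable distance such as the variational distance. The strong converse asserts that this threshold stays at the mutual information I(P, W) even when epsilon is allowed to be close to one, and the second-order results refine the expansion of log |C_n| around n I(P, W).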