
    Entropic bounds on coding for noisy quantum channels

    In analogy with its classical counterpart, a noisy quantum channel is characterized by a loss, a quantity that depends on the channel input and on the quantum operation performed by the channel. The loss reflects the transmission quality: if the loss is zero, quantum information can be transmitted perfectly at a rate measured by the quantum source entropy. By using block coding based on sequences of n entangled symbols, the average loss (defined as the overall loss of the joint n-symbol channel divided by n, as n tends to infinity) can be made lower than the loss for a single use of the channel. In this context, we examine several upper bounds on the rate at which quantum information can be transmitted reliably via a noisy channel, that is, with an asymptotically vanishing average loss even though the one-symbol loss of the channel is non-zero. These bounds on the channel capacity rely on the entropic Singleton bound on quantum error-correcting codes [Phys. Rev. A 56, 1721 (1997)]. Finally, we analyze the Singleton bounds when the noisy quantum channel is supplemented with a classical auxiliary channel.

    Comment: 20 pages RevTeX, 10 Postscript figures. Expanded Section II, added 1 figure, changed title. To appear in Phys. Rev. A (May 98).
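As context for the entropic Singleton bound invoked above, the standard quantum Singleton bound (a well-known fact stated here only for orientation, not a claim from this paper) says that an [[n, k, d]] quantum error-correcting code, encoding k logical qubits into n physical qubits with distance d, must satisfy

```latex
k \;\le\; n - 2(d - 1).
```

    For example, correcting one arbitrary single-qubit error requires d = 3, so n = 5 physical qubits can encode at most k = 1 logical qubit, which the five-qubit code attains.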

    A smooth entropy approach to quantum hypothesis testing and the classical capacity of quantum channels

    We use the smooth entropy approach to treat the problems of binary quantum hypothesis testing and the transmission of classical information through a quantum channel. We provide lower and upper bounds on the optimal type II error of quantum hypothesis testing in terms of the smooth max-relative entropy of the two states representing the two hypotheses. Using a relative entropy version of the Quantum Asymptotic Equipartition Property (QAEP), we then recover the strong converse rate of the i.i.d. hypothesis testing problem in the asymptotic limit. On the other hand, combining Stein's lemma with our bounds, we obtain a stronger (ε-independent) version of the relative entropy QAEP. Similarly, we provide bounds on the one-shot ε-error classical capacity of a quantum channel in terms of a smooth max-relative entropy variant of its Holevo capacity. Using these bounds and the ε-independent version of the relative entropy QAEP, we recover both the Holevo-Schumacher-Westmoreland theorem on the optimal direct rate of a memoryless quantum channel with product-state encoding and its strong converse counterpart.

    Comment: v4: Title changed, improved bounds, both direct and strong converse rates are covered, a new Discussion section added. 20 pages.
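The smooth max-relative entropy above is an optimization of the (non-smooth) max-relative entropy D_max(ρ‖σ) = log min{λ : ρ ≤ λσ} over states close to the first argument. A minimal numerical sketch of the non-smooth quantity for full-rank σ (function and variable names are ours, not the paper's):

```python
import numpy as np

def d_max(rho, sigma):
    """Max-relative entropy D_max(rho || sigma) = log2 min{lam : rho <= lam*sigma}.

    For full-rank sigma this equals log2 of the largest eigenvalue of
    sigma^{-1/2} rho sigma^{-1/2}.
    """
    w, v = np.linalg.eigh(sigma)
    sigma_inv_sqrt = v @ np.diag(w ** -0.5) @ v.conj().T
    m = sigma_inv_sqrt @ rho @ sigma_inv_sqrt
    return float(np.log2(np.linalg.eigvalsh(m).max()))

# Illustrative states: a diagonal qubit state vs. the maximally mixed state.
rho = np.array([[0.75, 0.0], [0.0, 0.25]])
sigma = np.eye(2) / 2.0
# Here sigma^{-1/2} rho sigma^{-1/2} = 2*rho, so D_max = log2(1.5).
```

Smoothing then replaces ρ by the minimizer over an ε-ball around it, which is what makes the one-shot bounds above ε-dependent.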

    Rényi Bounds on Information Combining

    Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when the variables are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. Arguably the most elementary kind of information combining is the addition of two binary random variables, i.e. a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we generalize the concept to Rényi entropies. We give optimal bounds on the conditional Rényi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed-upon definition of the conditional Rényi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of Rényi entropies under polar codes.

    Comment: 14 pages, accepted for presentation at ISIT 202
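A small numerical sketch of the classical special case discussed above (the function names and example distributions are illustrative, not from the paper): combining two independent binary variables with a XOR drives the output toward uniform, so every Rényi entropy can only increase.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) in bits; alpha = 1 recovers Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

def xor_combine(p1, p2):
    """Distribution of X1 XOR X2 for independent binary X1 ~ p1, X2 ~ p2."""
    q = p1[0] * p2[0] + p1[1] * p2[1]  # P(X1 ^ X2 = 0)
    return np.array([q, 1.0 - q])

p1 = np.array([0.9, 0.1])
p2 = np.array([0.8, 0.2])
p_xor = xor_combine(p1, p2)  # (0.74, 0.26): less biased than either input
# The XOR output majorizes neither input, and Rényi entropies are
# Schur-concave, so H_alpha(X1 ^ X2) >= max(H_alpha(X1), H_alpha(X2)).
for alpha in (0.5, 1.0, 2.0):
    h = renyi_entropy(p_xor, alpha)
    assert h >= max(renyi_entropy(p1, alpha), renyi_entropy(p2, alpha))
```

The conditional versions treated in the paper are subtler precisely because, as noted above, there are several inequivalent definitions of conditional Rényi entropy.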

    Entanglement, quantum randomness, and complexity beyond scrambling

    Scrambling is a process by which the state of a quantum system is effectively randomized due to the global entanglement that "hides" initially localized quantum information. In this work, we lay the mathematical foundations for studying randomness complexities beyond scrambling via entanglement properties. We do so by analyzing the generalized (in particular Rényi) entanglement entropies of designs, i.e. ensembles of unitary channels or pure states that mimic the uniformly random distribution (given by the Haar measure) up to certain moments. A main collective conclusion is that the Rényi entanglement entropies averaged over designs of the same order are almost maximal. This links the orders of entropy and design, and therefore suggests Rényi entanglement entropies as diagnostics of the randomness complexity of the corresponding designs. Such complexities form a hierarchy between information scrambling and Haar randomness. As a strong separation result, we prove the existence of (state) 2-designs such that the Rényi entanglement entropies of higher orders can be bounded away from the maximum. However, we also show that the min entanglement entropy is maximized by designs of order only logarithmic in the dimension of the system. In other words, designs of logarithmic order already achieve the complexity of Haar in terms of entanglement, which we also call max-scrambling. This result leads to a generalization of the fast scrambling conjecture: max-scrambling can be achieved by physical dynamics in time roughly linear in the number of degrees of freedom.

    Comment: 72 pages, 4 figures. Rewritten version with new title. v3: published version.
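For reference (this is the standard definition, not specific to this paper), the Rényi entanglement entropy of order α of a pure state with reduced density matrix ρ_A on subsystem A is

```latex
S_\alpha(\rho_A) \;=\; \frac{1}{1-\alpha}\,\log \operatorname{Tr}\!\left[\rho_A^{\alpha}\right],
\qquad \alpha \in (0,1)\cup(1,\infty),
```

    with the von Neumann entropy recovered as α → 1 and the min entanglement entropy as α → ∞; "almost maximal" above then means close to the logarithm of the dimension of the smaller subsystem.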

    Information theoretic treatment of tripartite systems and quantum channels

    A Holevo measure is used to discuss how much information about a given POVM on system a is present in another system b, and how this influences the presence or absence of information about a different POVM on a in a third system c. The main goal is to extend information theorems for mutually unbiased bases or general bases to arbitrary POVMs, and especially to generalize "all-or-nothing" theorems about information located in tripartite systems to the case of partial information, in the form of quantitative inequalities. Some of the inequalities can be viewed as entropic uncertainty relations that apply in the presence of quantum side information, as in recent work by Berta et al. [Nature Physics 6, 659 (2010)]. All of the results also apply to quantum channels: e.g., if a channel E accurately transmits certain POVMs, the complementary channel F will necessarily be noisy for certain other POVMs. While the inequalities are valid for mixed states of tripartite systems, restricting to pure states leads to the basis invariance of the difference between the information about a contained in b and that contained in c.

    Comment: 21 pages. An earlier version of this paper attempted to prove our main uncertainty relation, Theorem 5, using the achievability of the Holevo quantity in a coding task, an approach that ultimately failed because it did not account for locking of classical correlations, e.g. see [DiVincenzo et al., PRL 92, 067902 (2004)]. In the latest version, we use a very different approach to prove Theorem 5.
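For a classical ensemble {p_i, ρ_i} the Holevo measure mentioned above reduces to the standard Holevo quantity χ = S(Σ p_i ρ_i) − Σ p_i S(ρ_i). A self-contained numerical sketch (the example ensemble is ours, chosen for illustration only):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log2 rho] in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log2(w)))

def holevo(probs, states):
    """Holevo quantity chi = S(sum_i p_i rho_i) - sum_i p_i S(rho_i)."""
    avg = sum(p * r for p, r in zip(probs, states))
    return von_neumann_entropy(avg) - sum(
        p * von_neumann_entropy(r) for p, r in zip(probs, states)
    )

# Equal mixture of the non-orthogonal pure qubit states |0> and |+>:
ket0 = np.array([[1.0], [0.0]])
ketp = np.array([[1.0], [1.0]]) / np.sqrt(2)
states = [ket0 @ ket0.T, ketp @ ketp.T]
chi = holevo([0.5, 0.5], states)
# Pure states contribute zero entropy, so chi = S(avg); since the states
# are non-orthogonal, chi is strictly between 0 and 1 bit.
```

Non-orthogonality is what keeps χ below one bit here, the same mechanism that limits how much POVM information a subsystem can carry in the tripartite inequalities above.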