
    Bounds on Information Combining With Quantum Side Information

    "Bounds on information combining" are entropic inequalities that determine how the information (entropy) of a set of random variables can change when these are combined in certain prescribed ways. Such bounds play an important role in classical information theory, particularly in coding and Shannon theory; entropy power inequalities are special instances of them. The arguably most elementary kind of information combining is the addition of two binary random variables (a CNOT gate), and the resulting quantities play an important role in Belief propagation and Polar coding. We investigate this problem in the setting where quantum side information is available, which has been recognized as a hard setting for entropy power inequalities. Our main technical result is a non-trivial, and close to optimal, lower bound on the combined entropy, which can be seen as an almost optimal "quantum Mrs. Gerber's Lemma". Our proof uses three main ingredients: (1) a new bound on the concavity of von Neumann entropy, which is tight in the regime of low pairwise state fidelities; (2) the quantitative improvement of strong subadditivity due to Fawzi-Renner, in which we manage to handle the minimization over recovery maps; (3) recent duality results on classical-quantum-channels due to Renes et al. We furthermore present conjectures on the optimal lower and upper bounds under quantum side information, supported by interesting analytical observations and strong numerical evidence. We finally apply our bounds to Polar coding for binary-input classical-quantum channels, and show the following three results: (A) Even non-stationary channels polarize under the polar transform. (B) The blocklength required to approach the symmetric capacity scales at most sub-exponentially in the gap to capacity. (C) Under the aforementioned lower bound conjecture, a blocklength polynomial in the gap suffices.Comment: 23 pages, 6 figures; v2: small correction

    R\'enyi Bounds on Information Combining

    Bounds on information combining are entropic inequalities that determine how the information, or entropy, of a set of random variables can change when they are combined in certain prescribed ways. Such bounds play an important role in information theory, particularly in coding and Shannon theory. Arguably the most elementary kind of information combining is the addition of two binary random variables, i.e., a CNOT gate, and the resulting quantities are fundamental when investigating belief propagation and polar coding. In this work we generalize the concept to R\'enyi entropies. We give optimal bounds on the conditional R\'enyi entropy after combination, based on a certain convexity or concavity property, and discuss when this property indeed holds. Since there is no generally agreed-upon definition of the conditional R\'enyi entropy, we consider four different versions from the literature. Finally, we discuss the application of these bounds to the polarization of R\'enyi entropies under polar codes.
    Comment: 14 pages, accepted for presentation at ISIT 202
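
    For reference, the R\'enyi entropy of order \alpha (for \alpha \in (0,1) \cup (1,\infty)) is

        \[ H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_x p(x)^\alpha, \]

    recovering the Shannon entropy in the limit \alpha \to 1. The conditional versions differ in how the side information is incorporated; one common variant (Arimoto's) is

        \[ H_\alpha^{\mathrm{A}}(X|Y) = \frac{\alpha}{1-\alpha} \log \sum_y \Big( \sum_x p(x,y)^\alpha \Big)^{1/\alpha}. \]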

    Entropic bounds on coding for noisy quantum channels

    In analogy with its classical counterpart, a noisy quantum channel is characterized by a loss, a quantity that depends on the channel input and the quantum operation performed by the channel. The loss reflects the transmission quality: if the loss is zero, quantum information can be perfectly transmitted at a rate measured by the quantum source entropy. By using block coding based on sequences of n entangled symbols, the average loss (defined as the overall loss of the joint n-symbol channel divided by n, as n tends to infinity) can be made lower than the loss for a single use of the channel. In this context, we examine several upper bounds on the rate at which quantum information can be transmitted reliably via a noisy channel, that is, with an asymptotically vanishing average loss while the one-symbol loss of the channel is non-zero. These bounds on the channel capacity rely on the entropic Singleton bound on quantum error-correcting codes [Phys. Rev. A 56, 1721 (1997)]. Finally, we analyze the Singleton bounds when the noisy quantum channel is supplemented with a classical auxiliary channel.
    Comment: 20 pages, RevTeX, 10 Postscript figures. Expanded Section II, added 1 figure, changed title. To appear in Phys. Rev. A (May 98)
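
    For context, the quantum Singleton bound underlying these results states that any [[n,k,d]] quantum error-correcting code, encoding k logical qubits into n physical qubits with minimum distance d, satisfies

        \[ k \;\leq\; n - 2(d-1). \]

    For example, correcting one arbitrary single-qubit error requires d = 3 and hence n \geq k + 4, which the five-qubit code [[5,1,3]] attains.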

    Entanglement-assisted communication of classical and quantum information

    We consider the problem of transmitting classical and quantum information reliably over an entanglement-assisted quantum channel. Our main result is a capacity theorem that gives a three-dimensional achievable rate region. Points in the region are rate triples, consisting of the classical communication rate, the quantum communication rate, and the entanglement consumption rate of a particular coding scheme. The crucial protocol in achieving the boundary points of the capacity region is a protocol that we name the classically-enhanced father protocol. The classically-enhanced father protocol is more general than other protocols in the family tree of quantum Shannon theoretic protocols, in the sense that several previously known quantum protocols are now child protocols of it. The classically-enhanced father protocol also shows an improvement over a time-sharing strategy for the case of a qubit dephasing channel; this result justifies the need for simultaneous coding of classical and quantum information over an entanglement-assisted quantum channel. Our capacity theorem is of a multi-letter nature (requiring a limit over many uses of the channel), but it reduces to a single-letter characterization for at least three channels: the completely depolarizing channel, the quantum erasure channel, and the qubit dephasing channel.
    Comment: 23 pages, 5 figures, 1 table; simplification of the capacity region: it now has the simple interpretation as the unit resource capacity region translated along the classically-enhanced father trade-off curve
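
    In the resource-inequality notation standard in this literature, the father protocol that the classically-enhanced version builds on can be written, roughly, as

        \[ \langle \mathcal{N} \rangle \;+\; \tfrac{1}{2} I(A;E)\,[qq] \;\geq\; \tfrac{1}{2} I(A;B)\,[q \to q], \]

    i.e., the channel together with \tfrac{1}{2} I(A;E) ebits of entanglement yields \tfrac{1}{2} I(A;B) qubits of quantum communication, with the entropic quantities evaluated on a state induced by the channel input. The classically-enhanced father protocol adds a classical communication rate to this trade-off.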

    Capacities and Capacity-Achieving Decoders for Various Fingerprinting Games

    Combining an information-theoretic approach to fingerprinting with a more constructive, statistical approach, we derive new results on the fingerprinting capacities for various informed settings, as well as new log-likelihood decoders with provable code lengths that asymptotically match these capacities. The simple decoder built against the interleaving attack is further shown to achieve the simple capacity for unknown attacks, and is argued to be an improved version of the recently proposed decoder of Oosterwijk et al. With this new universal decoder, cut-offs on the bias distribution function can finally be dismissed. Besides the application of these results to fingerprinting, a direct consequence of our results for group testing is that (i) a simple decoder asymptotically requires a factor 1.44 more tests to find defectives than a joint decoder, and (ii) the simple decoder presented in this paper provably achieves this bound.
    Comment: 13 pages, 2 figures
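
    The factor 1.44 is 1/\ln 2: assuming the standard sparse-regime asymptotics, a joint group-testing decoder operates at one bit per test while the simple decoder operates at \ln 2 bits per test, so the ratio of required tests is their quotient. A quick numerical check of the constant:

        import math

        # Simple vs. joint group-testing decoders: the ratio of required tests
        # is (1 bit/test) / (ln 2 bits/test) = 1/ln 2.
        print(1 / math.log(2))  # 1.4426..., the "factor 1.44" above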

    Asymptotics of Fingerprinting and Group Testing: Capacity-Achieving Log-Likelihood Decoders

    We study the large-coalition asymptotics of fingerprinting and group testing, and derive explicit decoders that provably achieve capacity for many of the considered models. We do this both for simple decoders (fast but suboptimal) and for joint decoders (slow but optimal), and both for informed and uninformed settings. For fingerprinting, we show that if the pirate strategy is known, the Neyman-Pearson-based log-likelihood decoders provably achieve capacity, for any such strategy. The decoder built against the interleaving attack is further shown to be a universal decoder, able to deal with arbitrary attacks and achieving the uninformed capacity. This universal decoder is shown to be closely related to the Lagrange-optimized decoder of Oosterwijk et al. and the empirical mutual information decoder of Moulin. Joint decoders are also proposed, and we conjecture that these also achieve the corresponding joint capacities. For group testing, the simple decoder for the classical model is shown to be more efficient than that of Chan et al., and it provably achieves the simple group testing capacity. For generalizations of this model, such as noisy group testing, the resulting simple decoders also achieve the corresponding simple capacities.
    Comment: 14 pages, 2 figures
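
    As a minimal sketch of the kind of score such simple decoders compute (generic shapes only; the model distributions p_y_given_x and p_y are hypothetical placeholders standing in for the assumed pirate strategy, not the paper's exact decoder):

        import math

        def user_score(x, y, p_y_given_x, p_y):
            # Neyman-Pearson log-likelihood ratio, summed over code positions:
            # how much more likely the pirate output y is if the user holding
            # codeword x took part in the coalition.
            return sum(math.log(p_y_given_x[(yj, xj)] / p_y[yj])
                       for xj, yj in zip(x, y))

        def accuse(codewords, y, p_y_given_x, p_y, threshold):
            # Simple (per-user) decoding: accuse every user whose aggregate
            # score clears a threshold chosen for the target error exponents.
            return [u for u, x in enumerate(codewords)
                    if user_score(x, y, p_y_given_x, p_y) > threshold]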

    Re-proving Channel Polarization Theorems: An Extremality and Robustness Analysis

    The general subject considered in this thesis is a recently discovered coding technique, polar coding, which is used to construct a class of error-correcting codes with unique properties. In his ground-breaking work, Ar{\i}kan proved that this class of codes, called polar codes, achieves the symmetric capacity (the mutual information evaluated at the uniform input distribution) of any stationary binary discrete memoryless channel, with low-complexity encoders and decoders requiring on the order of O(N\log N) operations in the block-length N. This discovery settled the long-standing open problem, left by Shannon, of finding low-complexity codes that achieve the channel capacity. Yet, while polar coding settled one open problem in information theory, it opened plenty of challenging ones that need to be addressed. A significant part of this thesis is dedicated to advancing the knowledge about this technique in two directions. The first provides a better understanding of polar coding by generalizing some of the existing results and discussing their implications; the second studies the robustness of the theory over communication models introducing various forms of uncertainty or variations into the probabilistic model of the channel.
    Comment: Preview of my PhD Thesis, EPFL, Lausanne, 2014. For the full version, see http://people.epfl.ch/mine.alsan/publication
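
    As an illustration of the O(N\log N) structure (a sketch, not code from the thesis), the bit-reversed polar transform x = u B_N F^{\otimes n} over GF(2) admits a simple recursion:

        def polar_transform(u):
            # Arikan's transform with bit-reversal. The recursion satisfies
            # T(N) = 2 T(N/2) + O(N), hence O(N log N) XOR operations in the
            # block-length N = len(u), N a power of two.
            n = len(u)
            if n == 1:
                return u[:]
            combined = [u[2*i] ^ u[2*i + 1] for i in range(n // 2)]  # pairwise XOR
            passed = [u[2*i + 1] for i in range(n // 2)]             # odd-index bits
            return polar_transform(combined) + polar_transform(passed)

        print(polar_transform([1, 1, 0, 0]))  # -> [0, 0, 1, 0]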