Properties of Noncommutative Renyi and Augustin Information
The scaled R\'enyi information plays a significant role in evaluating the
performance of information processing tasks by virtue of its connection to the
error exponent analysis. In quantum information theory, there are three
generalizations of the classical R\'enyi divergence---the Petz, sandwiched, and
log-Euclidean versions---each of which possesses a meaningful operational
interpretation. However, these scaled noncommutative R\'enyi informations are far
less explored than their classical counterparts, and the lack of crucial
properties hinders the application of these quantities to refined performance
analysis. The goal of this paper is thus to analyze fundamental properties of
scaled R\'enyi information from a noncommutative measure-theoretic perspective.
Firstly, we prove the uniform equicontinuity of all three quantum versions of the
R\'enyi information, which yields the joint continuity of these quantities
in the orders and priors. Secondly, we establish the concavity of both the Petz
and sandwiched versions in a certain region of the order. This resolves the
open questions raised by Holevo
[\href{https://ieeexplore.ieee.org/document/868501/}{\textit{IEEE
Trans.~Inf.~Theory}, \textbf{46}(6):2256--2261, 2000}] and by Mosonyi and Ogawa
[\href{https://doi.org/10.1007/s00220-017-2928-4/}{\textit{Commun.~Math.~Phys},
\textbf{355}(1):373--426, 2017}]. As applications, we show that the strong
converse exponent in classical-quantum channel coding satisfies a minimax
identity. The established concavity is further employed to prove an entropic
duality between classical data compression with quantum side information and
classical-quantum channel coding, and a Fenchel duality in joint source-channel
coding with quantum side information, in forthcoming papers.
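Below is a minimal numerical sketch (not taken from the paper) of the three noncommutative R\'enyi divergences the abstract refers to, using their standard closed forms; the density matrices rho and sigma are random full-rank states chosen purely for illustration.

import numpy as np
from scipy.linalg import fractional_matrix_power, logm, expm

def petz_renyi(rho, sigma, a):
    # Petz:  D_a(rho||sigma) = log Tr[rho^a sigma^(1-a)] / (a - 1)
    val = np.trace(fractional_matrix_power(rho, a)
                   @ fractional_matrix_power(sigma, 1 - a))
    return np.log(val.real) / (a - 1)

def sandwiched_renyi(rho, sigma, a):
    # Sandwiched:  D~_a = log Tr[(sigma^((1-a)/2a) rho sigma^((1-a)/2a))^a] / (a - 1)
    s = fractional_matrix_power(sigma, (1 - a) / (2 * a))
    val = np.trace(fractional_matrix_power(s @ rho @ s, a))
    return np.log(val.real) / (a - 1)

def log_euclidean_renyi(rho, sigma, a):
    # Log-Euclidean:  D^b_a = log Tr[exp(a log rho + (1-a) log sigma)] / (a - 1)
    val = np.trace(expm(a * logm(rho) + (1 - a) * logm(sigma)))
    return np.log(val.real) / (a - 1)

def random_state(d, rng):
    g = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

rng = np.random.default_rng(0)
rho, sigma = random_state(3, rng), random_state(3, rng)
for a in (0.5, 0.9, 1.5, 2.0):
    print(a, petz_renyi(rho, sigma, a),
          sandwiched_renyi(rho, sigma, a),
          log_euclidean_renyi(rho, sigma, a))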
Divergence radii and the strong converse exponent of classical-quantum channel coding with constant compositions
There are different inequivalent ways to define the R\'enyi capacity of a
channel for a fixed input distribution. In a 1995 paper, Csisz\'ar showed
that for classical discrete memoryless channels there is a distinguished such
quantity that has an operational interpretation as a generalized cutoff rate
for constant composition channel coding. We show that the analogous notion of
R\'enyi capacity, defined in terms of the sandwiched quantum R\'enyi
divergences, has the same operational interpretation in the strong converse
problem of classical-quantum channel coding. Our main result expresses the
constant composition strong converse exponent of a memoryless classical-quantum
channel, for a given composition and rate, in terms of the composition-weighted
sandwiched R\'enyi divergence radius of the image of the channel. Comment: 46
pages. V7: Added the strong converse exponent with cost constraint.
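As a hedged illustration of the classical quantity underlying the Csisz\'ar result cited above, the sketch below numerically evaluates a composition-weighted R\'enyi divergence radius, chi_a(W, P) = min_Q sum_x P(x) D_a(W(.|x) || Q), for a small classical channel; the channel W and prior P are made-up numbers, and the quantum statement in the paper uses the sandwiched R\'enyi divergence in place of the classical one.

import numpy as np
from scipy.optimize import minimize

def renyi_divergence(p, q, a):
    # Classical Rényi divergence D_a(p||q) = log(sum_i p_i^a q_i^(1-a)) / (a - 1).
    return np.log(np.sum(p**a * q**(1.0 - a))) / (a - 1.0)

def weighted_divergence_radius(W, P, a):
    # W: |X| x |Y| row-stochastic channel matrix, P: input distribution on X.
    ny = W.shape[1]

    def objective(z):
        q = np.exp(z - z.max())
        q /= q.sum()                      # softmax keeps Q on the simplex
        return sum(P[x] * renyi_divergence(W[x], q, a) for x in range(len(P)))

    res = minimize(objective, np.zeros(ny), method="Nelder-Mead")
    return res.fun

W = np.array([[0.80, 0.10, 0.10],
              [0.10, 0.80, 0.10],
              [0.25, 0.25, 0.50]])
P = np.array([0.4, 0.4, 0.2])
for a in (0.5, 1.5, 2.0):
    print(a, weighted_divergence_radius(W, P, a))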
R\'enyi generalizations of quantum information measures
Quantum information measures such as the entropy and the mutual information
find applications in physics, e.g., as correlation measures. Generalizing such
measures based on the R\'enyi entropies is expected to enhance their scope in
applications. We prescribe R\'enyi generalizations for any quantum information
measure which consists of a linear combination of von Neumann entropies with
coefficients chosen from the set {-1,0,1}. As examples, we describe R\'enyi
generalizations of the conditional quantum mutual information, some quantum
multipartite information measures, and the topological entanglement entropy.
Among these, we discuss the various properties of the R\'enyi conditional
quantum mutual information and sketch some potential applications. We
conjecture that the proposed R\'enyi conditional quantum mutual informations
are monotone increasing in the R\'enyi parameter, and we have proofs of this
conjecture for some special cases. Comment: 9 pages, related to and extends the results from arXiv:1403.610
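To make the construction concrete, the sketch below (not the paper's code) computes one such {-1,0,1}-weighted combination of von Neumann entropies, the conditional quantum mutual information I(A;B|C) = S(AC) + S(BC) - S(ABC) - S(C), for a random tripartite state; the paper's R\'enyi generalizations replace these von Neumann entropies in a more structured way.

import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))       # in bits

def partial_trace(rho, dims, keep):
    # Reduce a density matrix on subsystems of dimensions `dims` to the
    # subsystems listed in `keep` (indices into dims, in increasing order).
    n = len(dims)
    rho = rho.reshape(tuple(dims) + tuple(dims))
    for i in sorted(set(range(n)) - set(keep), reverse=True):
        rho = np.trace(rho, axis1=i, axis2=i + rho.ndim // 2)
    d = int(np.prod([dims[i] for i in keep]))
    return rho.reshape(d, d)

def conditional_mutual_information(rho_abc, dims):
    # I(A;B|C) = S(AC) + S(BC) - S(ABC) - S(C).
    s = lambda keep: von_neumann_entropy(partial_trace(rho_abc, dims, keep))
    return s([0, 2]) + s([1, 2]) - s([0, 1, 2]) - s([2])

# Random mixed state on three qubits (illustrative input only).
rng = np.random.default_rng(1)
d = 2 * 2 * 2
g = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
rho_abc = g @ g.conj().T
rho_abc /= np.trace(rho_abc).real
print("I(A;B|C) =", conditional_mutual_information(rho_abc, (2, 2, 2)))
# Strong subadditivity guarantees the printed value is nonnegative.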
Entropy of a quantum channel
The von Neumann entropy of a quantum state is a central concept in physics
and information theory, having a number of compelling physical interpretations.
There is a certain perspective that the most fundamental notion in quantum
mechanics is that of a quantum channel, as quantum states, unitary evolutions,
measurements, and discarding of quantum systems can each be regarded as certain
kinds of quantum channels. Thus, an important goal is to define a consistent
and meaningful notion of the entropy of a quantum channel. Motivated by the
fact that the entropy of a state can be formulated as the difference between
the number of physical qubits and the "relative entropy distance" between
the state and the maximally mixed state, here we define the entropy of a channel
as the difference between the number of physical qubits of the channel
output and the "relative entropy distance" between the channel and the
completely depolarizing channel. We prove that this definition satisfies all of
the axioms, recently put forward in [Gour, IEEE Trans. Inf. Theory 65, 5880
(2019)], required for a channel entropy function. The task of quantum channel
merging, in which the goal is for the receiver to merge his share of the
channel with the environment's share, gives a compelling operational
interpretation of the entropy of a channel. We define Rényi and min-entropies
of a channel and prove that they satisfy the axioms required for a channel
entropy function. Among other results, we also prove that a smoothed version of
the min-entropy of a channel satisfies the asymptotic equipartition property. Comment: v2: 29 pages, 1 figure
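A toy sketch of the definition quoted above, assuming the "relative entropy distance" between channels is the usual input-optimized channel relative entropy: evaluating that divergence at a single input (the maximally entangled state, i.e. on the Choi state) gives an upper bound on the channel entropy of the form H(B|R). The dephasing channel and its parameter below are hypothetical examples.

import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))       # measured in qubits

def choi_state(kraus_ops, d_in, d_out):
    # Choi state (id_R (x) N)(|Phi><Phi|) with |Phi> maximally entangled on R:A.
    phi = np.zeros(d_in * d_in, dtype=complex)
    phi[::d_in + 1] = 1.0 / np.sqrt(d_in)
    Phi = np.outer(phi, phi.conj())
    out = np.zeros((d_in * d_out, d_in * d_out), dtype=complex)
    for K in kraus_ops:
        KK = np.kron(np.eye(d_in), K)            # the channel acts on the A half
        out += KK @ Phi @ KK.conj().T
    return out

def channel_entropy_upper_bound(kraus_ops, d_in, d_out):
    # For this input, D(rho_RB || rho_R (x) pi_B) = log|B| - H(B|R), so the
    # single-input evaluation of the definition reduces to H(B|R) = H(RB) - H(R).
    rho_RB = choi_state(kraus_ops, d_in, d_out)
    return von_neumann_entropy(rho_RB) - np.log2(d_in)

# Hypothetical example: a qubit dephasing channel with parameter p = 0.3.
p = 0.3
K0 = np.sqrt(1 - p) * np.eye(2)
K1 = np.sqrt(p) * np.diag([1.0, -1.0])
print("single-input upper bound on the channel entropy:",
      channel_entropy_upper_bound([K0, K1], d_in=2, d_out=2))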
Divergence Measures
Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that the readers will find interest in this Special Issue, which will stimulate further research in the study of the mathematical foundations and applications of divergence measures.
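For readers less familiar with the two families mentioned, here is a small self-contained illustration (not taken from the Special Issue) of the classical Rényi divergence and of an f-divergence for discrete distributions; the distributions p and q are arbitrary examples.

import numpy as np

def renyi_divergence(p, q, a):
    # D_a(p||q) = log( sum_i p_i^a q_i^(1-a) ) / (a - 1),  for a != 1.
    return np.log(np.sum(p**a * q**(1.0 - a))) / (a - 1.0)

def f_divergence(p, q, f):
    # D_f(p||q) = sum_i q_i f(p_i / q_i)  for a convex f with f(1) = 0.
    return np.sum(q * f(p / q))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

kl = f_divergence(p, q, lambda t: t * np.log(t))   # f(t) = t log t gives KL
for a in (0.5, 0.99, 2.0):
    print("D_%.2f(p||q) = %.4f" % (a, renyi_divergence(p, q, a)))
print("KL(p||q)     = %.4f  (the a -> 1 limit of the Rényi divergence)" % kl)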
Operational interpretation of Rényi information measures via composite hypothesis testing against product and Markov distributions
We revisit the problem of asymmetric binary hypothesis testing against a composite alternative hypothesis. We introduce a general framework to treat such problems when the alternative hypothesis adheres to certain axioms. In this case, we find the threshold rate, the optimal error and strong converse exponents (at large deviations from the threshold), and the second-order asymptotics (at small deviations from the threshold). We apply our results to find operational interpretations of various Rényi information measures. When the alternative hypothesis comprises bipartite product distributions, we find that the optimal error and strong converse exponents are determined by variations of the Rényi mutual information. When the alternative hypothesis consists of tripartite distributions satisfying the Markov property, we find that the optimal exponents are determined by variations of the Rényi conditional mutual information. In either case, the relevant notion of Rényi mutual information depends on the precise choice of the alternative hypothesis. As such, this paper also strengthens the view that different definitions of the Rényi mutual information, conditional entropy, and conditional mutual information are adequate depending on the context in which the measures are used.
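As a concrete instance of the "different definitions of Rényi mutual information" the abstract mentions, the sketch below (illustrative only, not the paper's code) evaluates Sibson's Rényi mutual information for a classical joint distribution via its closed form; the joint distribution P_xy is a made-up example.

import numpy as np

def sibson_mutual_information(P_xy, a):
    # Sibson's version: I_a(X;Y) = min_Q D_a(P_XY || P_X x Q_Y), which has the
    # closed form  a/(a-1) * log sum_y ( sum_x P_X(x) P(y|x)^a )^(1/a).
    P_x = P_xy.sum(axis=1, keepdims=True)          # marginal on X, shape (|X|, 1)
    P_y_given_x = P_xy / P_x                        # rows: P(y|x)
    inner = (P_x * P_y_given_x**a).sum(axis=0)      # f(y) = sum_x P_X(x) P(y|x)^a
    return a / (a - 1) * np.log(np.sum(inner ** (1 / a)))

# Hypothetical joint distribution of (X, Y).
P_xy = np.array([[0.30, 0.10],
                 [0.05, 0.25],
                 [0.10, 0.20]])
for a in (0.5, 1.001, 2.0, 10.0):
    print(a, sibson_mutual_information(P_xy, a))
# As a -> 1 this approaches the ordinary mutual information I(X;Y).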