The Entropy Power Inequality with quantum conditioning
The conditional entropy power inequality is a fundamental inequality in
information theory, stating that the conditional entropy of the sum of two
conditionally independent vector-valued random variables, each with an assigned
conditional entropy, is minimized when the random variables are Gaussian. We prove
the conditional entropy power inequality in the scenario where the conditioning
system is quantum. The proof is based on the heat semigroup and on a
generalization of the Stam inequality in the presence of quantum conditioning.
The entropy power inequality with quantum conditioning will be a key tool of
quantum information, with applications in distributed source coding protocols
with the assistance of quantum entanglement.
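As a point of reference for the inequality being generalized, here is a minimal numerical sketch (my own illustration, not code from the paper) of the classical, unconditioned entropy power inequality N(X+Y) ≥ N(X) + N(Y) in the one-dimensional Gaussian case, where equality holds:

```python
import math

def gaussian_entropy(v):
    """Differential entropy of a 1-D Gaussian with variance v: 0.5*ln(2*pi*e*v)."""
    return 0.5 * math.log(2 * math.pi * math.e * v)

def entropy_power(h):
    """Entropy power N = exp(2h) / (2*pi*e); for a Gaussian this equals the variance."""
    return math.exp(2 * h) / (2 * math.pi * math.e)

vx, vy = 1.5, 2.5                        # variances of independent Gaussians X, Y
hx, hy = gaussian_entropy(vx), gaussian_entropy(vy)
h_sum = gaussian_entropy(vx + vy)        # X + Y is Gaussian with variance vx + vy

# EPI: N(X+Y) >= N(X) + N(Y); Gaussian inputs saturate it with equality
lhs = entropy_power(h_sum)
rhs = entropy_power(hx) + entropy_power(hy)
print(lhs, rhs)  # both equal vx + vy = 4.0 (up to floating point)
```

Non-Gaussian inputs with the same entropies would make the left-hand side strictly larger, which is the content of the inequality.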
Logical Entropy: Introduction to Classical and Quantum Logical Information Theory
Logical information theory is the quantitative version of the logic of partitions, just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences, and distinguishability, and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional, and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this paper is to give the direct generalization to quantum logical information theory that similarly focuses on the pairs of eigenstates distinguished by an observable, i.e., qudits of an observable. The fundamental theorem for quantum logical entropy and measurement establishes a direct quantitative connection between the increase in quantum logical entropy due to a projective measurement and the eigenstates that are distinguished by the measurement. Both the classical and quantum versions of logical entropy have simple interpretations as “two-draw” probabilities for distinctions. The conclusion is that quantum logical entropy is the simple and natural notion of information for quantum information theory, focusing on the distinguishing of quantum states.
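The "two-draw" interpretation is concrete enough to compute: classical logical entropy h(p) = 1 - Σ p_i² is the probability that two independent draws fall in different blocks, and its quantum analogue is 1 - tr(ρ²). A minimal sketch (my own illustration, not code from the paper) showing the increase under a projective measurement:

```python
import numpy as np

def logical_entropy(p):
    """Classical logical entropy: probability that two independent draws differ."""
    p = np.asarray(p, dtype=float)
    return 1.0 - float(np.sum(p ** 2))

def quantum_logical_entropy(rho):
    """Quantum logical entropy: 1 - tr(rho^2)."""
    return 1.0 - float(np.trace(rho @ rho).real)

p = [0.5, 0.25, 0.25]
print(logical_entropy(p))  # 1 - (0.25 + 0.0625 + 0.0625) = 0.625

# Pure state |+><+|: zero quantum logical entropy before measurement
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus)
assert abs(quantum_logical_entropy(rho)) < 1e-12

# A projective measurement in the computational basis distinguishes the
# eigenstates |0>, |1>; the post-measurement state is the dephased rho,
# and the quantum logical entropy increases accordingly.
rho_meas = np.diag(np.diag(rho))
print(quantum_logical_entropy(rho_meas))  # 0.5
```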
On dynamical measures of quantum information
In this work, we use the theory of quantum states over time to define an
entropy S(ρ, E) associated with quantum processes (ρ, E), where ρ is a state
and E is a quantum channel responsible for the dynamical evolution of ρ. The
entropy S(ρ, E) is a generalization of the von Neumann entropy in the sense
that S(ρ, id) = S(ρ) (where id denotes the identity channel), and is a
dynamical analogue of the quantum joint entropy for bipartite states. Such an
entropy is then used to define dynamical formulations of the quantum
conditional entropy and quantum mutual information, and we show that such
information measures satisfy many desirable properties, such as a quantum
entropic Bayes' rule. We also use our entropy function to quantify the
information loss/gain associated with the dynamical evolution of quantum
systems, which enables us to formulate a precise notion of information
conservation for quantum processes.
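The paper's states-over-time construction is not reproduced here, but the von Neumann entropy that the process entropy reduces to under the identity channel is a useful baseline to have in hand. A minimal sketch (my own illustration, natural logarithm):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho log rho), computed from eigenvalues (in nats)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]     # 0 * log(0) = 0 by convention
    return float(-np.sum(evals * np.log(evals)))

# Maximally mixed qubit: S = ln 2
rho = np.eye(2) / 2
print(von_neumann_entropy(rho))  # ~0.6931
```

Any dynamical entropy generalizing this quantity should return exactly this value when the channel does nothing to the state.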
A Fully Quantum Asymptotic Equipartition Property
The classical asymptotic equipartition property is the statement that, in the
limit of a large number of identical repetitions of a random experiment, the
output sequence is virtually certain to come from the typical set, each member
of which is almost equally likely. In this paper, we prove a fully quantum
generalization of this property, where both the output of the experiment and
side information are quantum. We give an explicit bound on the convergence,
which is independent of the dimensionality of the side information. This
naturally leads to a family of Rényi-like quantum conditional entropies, for
which the von Neumann entropy emerges as a special case.
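The classical statement is easy to verify empirically: for i.i.d. draws, the per-symbol log-probability -1/n · log₂ Pr(x^n) concentrates around the entropy H, so almost every sequence is ε-typical for large n. A Monte Carlo sketch of the classical (non-quantum) case, my own illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 0.3, 2000, 500
H = -p * np.log2(p) - (1 - p) * np.log2(1 - p)   # binary entropy in bits

# Sample Bernoulli(p) sequences; compute each sequence's empirical rate
# -1/n * log2 Pr(sequence), which should concentrate near H
x = rng.random((trials, n)) < p
ones = x.sum(axis=1)
log_prob = ones * np.log2(p) + (n - ones) * np.log2(1 - p)
rate = -log_prob / n

eps = 0.05
typical_fraction = float(np.mean(np.abs(rate - H) < eps))
print(typical_fraction)  # close to 1: nearly all sequences are typical
```

The quantum result proved in the paper plays the same role when both the output and the side information are quantum, with a convergence bound independent of the side information's dimension.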
Optimized Quantum F-Divergences
The quantum relative entropy is a measure of the distinguishability of two quantum states, and it is a unifying concept in quantum information theory: many information measures such as entropy, conditional entropy, mutual information, and entanglement measures can be realized from it. As such, there has been broad interest in generalizing the notion to further understand its most basic properties, one of which is the data processing inequality. The quantum f-divergence of Petz is one generalization of the quantum relative entropy, and it also leads to other relative entropies, such as the Petz-Rényi relative entropies. In this contribution, I introduce the optimized quantum f-divergence as a related generalization of quantum relative entropy. I prove that it satisfies the data processing inequality, and the method of proof relies upon the operator Jensen inequality, similar to Petz's original approach. Interestingly, the sandwiched Rényi relative entropies are particular examples of the optimized f-divergence. Thus, one benefit of this approach is that there is now a single, unified approach for establishing the data processing inequality for both the Petz-Rényi and sandwiched Rényi relative entropies, for the full range of parameters for which it is known to hold. Full version of this paper is accessible at arXiv:1710.1025
Nonadditive measure and quantum entanglement in a class of mixed states of N^n-system
Through the generalization of Khinchin's classical axiomatic foundation, a
basis is developed for nonadditive information theory. The classical
nonadditive conditional entropy indexed by the positive parameter q is
introduced and then translated into quantum information. This quantity is
nonnegative for classically correlated states but can take negative values for
entangled mixed states. This property is used to study quantum entanglement in
the parametrized Werner-Popescu-like state of an N^n-system, that is, an
n-partite N-level system. It is shown how the strongest limitation on the
validity of local realism (i.e., separability of the state) can be obtained in
a novel manner.
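The entanglement criterion described here is computable from eigenvalues alone. A minimal sketch for the simplest case, the 2-qubit Werner state, using the Abe-Rajagopal form of the conditional q-entropy, S_q(A|B) = (tr ρ_B^q - tr ρ_AB^q) / ((q-1) tr ρ_B^q); the exact conditional form and parametrization used in the paper may differ, so treat this as my own illustration:

```python
import numpy as np

def tsallis_entropy(evals, q):
    """Nonadditive (Tsallis) entropy S_q = (1 - sum p_i^q) / (q - 1)."""
    evals = np.asarray(evals, dtype=float)
    return float((1.0 - np.sum(evals ** q)) / (q - 1.0))

def conditional_q_entropy(evals_ab, evals_b, q):
    """Abe-Rajagopal conditional q-entropy (hypothetical helper name):
    S_q(A|B) = (tr rho_B^q - tr rho_AB^q) / ((q - 1) tr rho_B^q)."""
    tr_ab = float(np.sum(np.asarray(evals_ab) ** q))
    tr_b = float(np.sum(np.asarray(evals_b) ** q))
    return (tr_b - tr_ab) / ((q - 1.0) * tr_b)

def werner_eigs(x):
    """Eigenvalues of the 2-qubit Werner state x|psi-><psi-| + (1-x) I/4."""
    return [(1 + 3 * x) / 4] + [(1 - x) / 4] * 3

q = 2.0
evals_b = [0.5, 0.5]   # either reduced state is maximally mixed
s_entangled = conditional_q_entropy(werner_eigs(0.9), evals_b, q)
s_separable = conditional_q_entropy(werner_eigs(0.2), evals_b, q)
print(s_entangled, s_separable)  # negative vs. nonnegative
```

For classically correlated states the conditional q-entropy stays nonnegative, so a negative value (as for the strongly mixed-in singlet above) witnesses entanglement; taking q large tightens the detection threshold.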