Postquantum Brègman relative entropies and nonlinear resource theories
We introduce the family of postquantum Brègman relative entropies, based
on nonlinear embeddings into reflexive Banach spaces (with examples given by
reflexive noncommutative Orlicz spaces over semi-finite W*-algebras,
nonassociative L_1 spaces over semi-finite JBW-algebras, and noncommutative
L_1 spaces over arbitrary W*-algebras). This allows us to define a class of
geometric categories for nonlinear postquantum inference theory (providing an
extension of Chencov's approach to the foundations of statistical inference), with
constrained maximisations of Brègman relative entropies as morphisms and
nonlinear images of closed convex sets as objects. A further generalisation to a
framework for nonlinear convex operational theories is developed using a larger
class of morphisms, determined by Brègman nonexpansive operations (which
provide a well-behaved family of Mielnik's nonlinear transmitters). As an
application, we derive a range of nonlinear postquantum resource theories
determined in terms of this class of operations.
Comment: v2: several corrections and improvements, including an extension to
the postquantum (generally) and JBW-algebraic (specifically) cases, a section
on nonlinear resource theories, and a more informative paper title
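The construction above generalises the classical Brègman divergence generated by a convex function F. As a minimal classical sketch (not the postquantum construction of the paper; the function names here are illustrative), taking the negative Shannon entropy as the generator recovers the Kullback-Leibler divergence:

```python
import numpy as np

def bregman_divergence(F, grad_F, p, q):
    """Classical Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# Negative Shannon entropy as the generating convex function:
# its Bregman divergence on the probability simplex is the KL divergence.
neg_entropy = lambda x: np.sum(x * np.log(x))
grad_neg_entropy = lambda x: np.log(x) + 1.0

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
kl = np.sum(p * np.log(p / q))  # direct KL computation for comparison
d = bregman_divergence(neg_entropy, grad_neg_entropy, p, q)
```

On the simplex the linear correction terms cancel, so `d` coincides with `kl`; the paper's relative entropies arise from the analogous construction over nonlinear embeddings into Banach spaces.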
Divergence Measures
Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that the readers will find interest in this Special Issue, which will stimulate further research in the study of the mathematical foundations and applications of divergence measures.
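Both generalizations named above have simple closed forms for discrete distributions. A minimal sketch (function names are illustrative): f(t) = t log t recovers the relative entropy, which is also the alpha -> 1 limit of the Rényi divergence.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha for discrete distributions (alpha != 1)."""
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def f_divergence(p, q, f):
    """f-divergence D_f(p || q) = sum_i q_i * f(p_i / q_i)."""
    return np.sum(q * f(p / q))

p = np.array([0.4, 0.6])
q = np.array([0.5, 0.5])

# f(t) = t*log(t) gives the relative entropy (KL divergence);
# the Renyi divergence approaches the same value as alpha -> 1.
kl = f_divergence(p, q, lambda t: t * np.log(t))
r_near_1 = renyi_divergence(p, q, 1.0 + 1e-6)
```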
A Review and Taxonomy of Methods for Quantifying Dataset Similarity
In statistics and machine learning, measuring the similarity between two or
more datasets is important for several purposes. The performance of a
predictive model on novel datasets, referred to as generalizability, critically
depends on how similar the dataset used for fitting the model is to the novel
datasets. Exploiting or transferring insights between similar datasets is a key
aspect of meta-learning and transfer learning. In two-sample testing, one
checks whether the underlying (multivariate) distributions of two datasets
coincide.
Numerous approaches for quantifying dataset similarity have been
proposed in the literature. A structured overview is a crucial first step for
comparing them. We examine more than 100 methods and provide a
taxonomy, classifying them into ten classes, including (i) comparisons of
cumulative distribution functions, density functions, or characteristic
functions, (ii) methods based on multivariate ranks, (iii) discrepancy measures
for distributions, (iv) graph-based methods, (v) methods based on inter-point
distances, (vi) kernel-based methods, (vii) methods based on binary
classification, (viii) distance and similarity measures for datasets, (ix)
comparisons based on summary statistics, and (x) different testing approaches.
Here, we present an extensive review of these methods. We introduce the main
underlying ideas, formal definitions, and important properties.
Comment: 90 pages, submitted to Statistics Surveys
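As a minimal illustration of class (vi), kernel-based methods, here is a sketch of the biased squared maximum mean discrepancy (MMD) estimate between two samples; the names and the fixed Gaussian bandwidth are illustrative choices, not the paper's:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between sample arrays of shape (n, d) and (m, d)."""
    d2 = np.sum(x**2, axis=1)[:, None] + np.sum(y**2, axis=1)[None, :] - 2 * x @ y.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2_biased(x, y, sigma=1.0):
    """Biased estimate of the squared MMD between the distributions of x and y."""
    return (gaussian_kernel(x, x, sigma).mean()
            + gaussian_kernel(y, y, sigma).mean()
            - 2 * gaussian_kernel(x, y, sigma).mean())

rng = np.random.default_rng(0)
# Two samples from the same distribution vs. a mean-shifted one.
same = mmd2_biased(rng.normal(0, 1, (200, 2)), rng.normal(0, 1, (200, 2)))
shift = mmd2_biased(rng.normal(0, 1, (200, 2)), rng.normal(3, 1, (200, 2)))
```

The shifted pair yields a markedly larger statistic, which is what a two-sample test in this class would threshold.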
Group transference techniques for the estimation of the decoherence times and capacities of quantum Markov semigroups
Capacities of quantum channels and decoherence times both quantify the extent
to which quantum information can withstand degradation by interactions with its
environment. However, calculating capacities directly is known to be
intractable in general. Much recent work has focused on upper bounding certain
capacities in terms of more tractable quantities such as specific norms from
operator theory. In the meantime, there has also been substantial recent
progress on estimating decoherence times with techniques from analysis and
geometry, even though many hard questions remain open. In this article, we
introduce a class of continuous-time quantum channels that we call
transferred channels, which are built through representation theory from a
classical Markov kernel defined on a compact group. We study two subclasses of
such kernels: Hörmander systems on compact Lie groups and Markov chains on
finite groups. Examples of transferred channels include the depolarizing
channel, the dephasing channel, and collective decoherence channels acting on
qubits. Some of the estimates presented are new, such as those for channels
that randomly swap subsystems. We then extend tools developed in earlier work
by Gao, Junge and LaRacuente to transfer estimates of the classical Markov
kernel to the transferred channels and study in this way different
non-commutative functional inequalities. The main contribution of this article
is the application of this transference principle to the estimation of various
capacities as well as estimation of entanglement breaking times, defined as the
first time for which the channel becomes entanglement breaking. Moreover, our
estimates hold for non-ergodic channels such as the collective decoherence
channels, an important scenario that has been overlooked so far because of a
lack of techniques.
Comment: 35 pages, 2 figures. Close to published version
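The depolarizing channel named above as an example of a transferred channel has a simple single-step form; this sketch uses that standard discrete-time description rather than the continuous-time semigroups studied in the paper:

```python
import numpy as np

def depolarizing_channel(rho, p):
    """Depolarizing channel: rho -> (1 - p) * rho + p * I/d, for a d-dim density matrix."""
    dim = rho.shape[0]
    return (1.0 - p) * rho + p * np.eye(dim) / dim

# A pure qubit state |0><0| loses purity under depolarization,
# illustrating the degradation of quantum information the paper quantifies.
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])
out = depolarizing_channel(rho0, 0.5)
purity = np.trace(out @ out).real  # Tr(rho^2): 1 for pure states, 1/d for maximally mixed
```

Here the output is (0.75, 0.25) on the diagonal, so the purity drops from 1 to 0.625; iterating the channel drives the state toward the maximally mixed fixed point.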
Magic from a Convolutional Approach
We introduce a convolutional framework to study stabilizer states and
channels based on qudits. This includes the key concepts of a "magic gap," a
mean state (MS), a minimal stabilizer-projection state (MSPS), and a new
convolution. We find that the MS is the closest MSPS to the given state with
respect to relative entropy, and the MS is extremal with respect to von Neumann
entropy. This demonstrates a "maximal entropy principle for DV systems," and
also indicates that the process of taking MS is a nontrivial,
resource-destroying map for magic. We obtain a series of inequalities for
quantum entropies and for Fisher information based on convolution, giving a
"second law of thermodynamics for quantum convolution." The convolution of two
stabilizer states is another stabilizer state. We establish a central limit theorem,
based on iterating the convolution of a zero-mean quantum state, and show this
converges to an MS. The rate of convergence is characterized by the magic gap,
which is defined in terms of the support of the characteristic function of the
state. Based on the Choi-Jamiolkowski isomorphism, we introduce the notions of
a mean channel, which is a stabilizer channel, and the convolution of quantum
channels. We obtain results for quantum channels similar to those for states,
and find that Clifford unitaries play a role in the convolution of channels
analogous to the role stabilizer states play in the convolution of states.
We illustrate these methods with a discussion of three examples: the
qudit DV beam splitter, the qudit DV amplifier, and the qubit CNOT gate. All
these results are compatible with the conjecture that stabilizers play the role
in DV quantum systems analogous to Gaussians in continuous-variable quantum
systems.
Comment: 46 pages. See also the companion work arXiv:2302.0784