On a generalization of the Jensen-Shannon divergence and the JS-symmetrization of distances relying on abstract means
The Jensen-Shannon divergence is a renowned bounded symmetrization of the
unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler
divergence to the average mixture distribution. However, the Jensen-Shannon
divergence between Gaussian distributions is not available in closed-form. To
bypass this problem, we present a generalization of the Jensen-Shannon (JS)
divergence using abstract means which yields closed-form expressions when the
mean is chosen according to the parametric family of distributions. More
generally, we define the JS-symmetrizations of any distance using generalized
statistical mixtures derived from abstract means. In particular, we first show
that the geometric mean is well-suited for exponential families, and report two
closed-form formulas for (i) the geometric Jensen-Shannon divergence between
probability densities of the same exponential family, and (ii) the geometric
JS-symmetrization of the reverse Kullback-Leibler divergence. As a second
illustrating example, we show that the harmonic mean is well-suited for the
scale Cauchy distributions, and report a closed-form formula for the harmonic
Jensen-Shannon divergence between scale Cauchy distributions. We also define
generalized Jensen-Shannon divergences between matrices (e.g., quantum
Jensen-Shannon divergences) and consider clustering with respect to these novel
Jensen-Shannon divergences.
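For orientation, the classical definitions behind this abstract are recalled below; the notation is a sketch and the paper's own weighted-mixture notation may differ.
\[ \mathrm{KL}(p:q) = \int p(x) \log\frac{p(x)}{q(x)}\, \mathrm{d}\mu(x), \qquad \mathrm{JS}(p,q) = \tfrac{1}{2}\,\mathrm{KL}\!\Big(p : \tfrac{p+q}{2}\Big) + \tfrac{1}{2}\,\mathrm{KL}\!\Big(q : \tfrac{p+q}{2}\Big). \]
The abstract-mean generalization replaces the arithmetic mixture (p+q)/2 by a normalized M-mixture; for instance, the geometric mixture proportional to \sqrt{p(x) q(x)} remains inside a given exponential family, which is what makes the geometric Jensen-Shannon divergence available in closed form.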
Beyond scalar quasi-arithmetic means: Quasi-arithmetic averages and quasi-arithmetic mixtures in information geometry
We generalize quasi-arithmetic means beyond scalars by considering the
gradient map of a Legendre type real-valued function. The gradient map of a
Legendre type function is proven strictly comonotone with a global inverse. It
thus yields a generalization of strictly monotone and differentiable
functions generating scalar quasi-arithmetic means. Furthermore, the Legendre
transformation gives rise to pairs of dual quasi-arithmetic averages via the
convex duality. We study the invariance and equivariance properties under
affine transformations of quasi-arithmetic averages via the lens of dually flat
spaces of information geometry. We show how these quasi-arithmetic averages are
used to express points on dual geodesics and sided barycenters in the dual
affine coordinate systems. We then consider quasi-arithmetic mixtures and
describe several parametric and non-parametric statistical models which are
closed under the quasi-arithmetic mixture operation.
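As a reminder of the scalar notion being generalized (a sketch; the paper's notation may differ), a quasi-arithmetic mean induced by a strictly monotone and continuous function f is
\[ M_f(x_1,\dots,x_n; w_1,\dots,w_n) = f^{-1}\!\Big(\sum_{i=1}^n w_i\, f(x_i)\Big), \]
and the gradient map \nabla F of a Legendre-type function F plays the role of f beyond scalars, e.g. the two-point average (\nabla F)^{-1}\big(\tfrac{1}{2}\nabla F(\theta_1) + \tfrac{1}{2}\nabla F(\theta_2)\big).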
The α-divergences associated with a pair of strictly comparable quasi-arithmetic means
We generalize the family of α-divergences using a pair of strictly
comparable weighted means. In particular, we obtain a generalization of the
Kullback-Leibler divergence in the limit case α → 1 and a generalization of
the reverse Kullback-Leibler divergence in the limit case α → 0. We state
the condition for a pair of quasi-arithmetic means to be strictly
comparable, and report the formula for the quasi-arithmetic α-divergences
and their subfamily of bipower homogeneous α-divergences, which belong to
Csiszár's f-divergences. Finally, we show that these generalized
quasi-arithmetic α-divergences and their limit-case divergences can be
decomposed as the sum of generalized cross-entropies minus entropies, and
rewritten as conformal Bregman divergences using monotone embeddings.
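For reference, the classical α-divergence that this family generalizes can be written, in the α ∈ (0,1) parametrization consistent with the limits stated above (a sketch, not the paper's generalized formula), as
\[ D_\alpha(p:q) = \frac{1}{\alpha(1-\alpha)}\Big(1 - \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}\mu(x)\Big), \]
with \lim_{\alpha\to 1} D_\alpha(p:q) = \mathrm{KL}(p:q) and \lim_{\alpha\to 0} D_\alpha(p:q) = \mathrm{KL}(q:p).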
Divergence Measures
Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. We hope that readers will find interest in this Special Issue and that it will stimulate further research into the mathematical foundations and applications of divergence measures.
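For readers new to the topic, the two generalizations of the relative entropy highlighted above are, in standard notation (a sketch, not taken from the individual contributions):
\[ \mathrm{KL}(P\|Q) = \int p \log\frac{p}{q}\, \mathrm{d}\mu, \qquad D_\alpha(P\|Q) = \frac{1}{\alpha-1} \log \int p^{\alpha} q^{1-\alpha}\, \mathrm{d}\mu, \qquad D_f(P\|Q) = \int q\, f\!\Big(\frac{p}{q}\Big)\, \mathrm{d}\mu, \]
where D_\alpha is the Rényi divergence of order α ≠ 1 and D_f is the f-divergence generated by a convex f with f(1) = 0; both recover the relative entropy (as α → 1, and for f(t) = t \log t, respectively).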
New Directions for Contact Integrators
Contact integrators are a family of geometric numerical schemes which
guarantee the conservation of the contact structure. In this work we review the
construction of both the variational and Hamiltonian versions of these methods.
We illustrate some of the advantages of geometric integration in the
dissipative setting by focusing on models inspired by recent studies in
celestial mechanics and cosmology. (To appear as Chapter 24 in GSI 2021, Springer LNCS 1282.)
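As background for what these schemes discretize (a sketch in one standard convention; the chapter's signs and notation may differ), the contact Hamiltonian flow of H(q, p, s) with contact form \eta = \mathrm{d}s - p\, \mathrm{d}q reads
\[ \dot q = \frac{\partial H}{\partial p}, \qquad \dot p = -\frac{\partial H}{\partial q} - p\, \frac{\partial H}{\partial s}, \qquad \dot s = p\, \frac{\partial H}{\partial p} - H, \]
so that, for example, H = \tfrac{1}{2} p^2 + V(q) + \gamma s reproduces mechanics with linear damping, \ddot q = -V'(q) - \gamma \dot q, the prototypical dissipative setting alluded to above.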
Generalizing Skew Jensen Divergences and Bregman Divergences With Comparative Convexity
Comparative convexity is a generalization of ordinary convexity based on abstract means instead of arithmetic means. We introduce the generalized skew Jensen divergences and their corresponding Bregman divergences with respect to comparative convexity. To illustrate those novel families of divergences, we consider the convexity induced by quasi-arithmetic means, and report explicit formulas for the corresponding Bregman divergences. In particular, we show that those new Bregman divergences are equivalent to conformal ordinary Bregman divergences on monotone embeddings, and state further related results.
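For concreteness (the notation is a sketch and may not match the paper exactly), the ordinary skew Jensen divergence of a strictly convex generator F and its comparative-convexity variant built from a pair of abstract means (M, N) read
\[ J_{F,\alpha}(p:q) = \alpha F(p) + (1-\alpha) F(q) - F\big(\alpha p + (1-\alpha) q\big), \qquad J^{M,N}_{F,\alpha}(p:q) = N_\alpha\big(F(p), F(q)\big) - F\big(M_\alpha(p,q)\big), \]
both non-negative when F is convex (respectively (M,N)-convex, i.e. F(M_\alpha(p,q)) \le N_\alpha(F(p),F(q))); the ordinary Bregman divergence B_F(p:q) = F(p) - F(q) - \langle p - q, \nabla F(q)\rangle is recovered from scaled limits of J_{F,\alpha} as α → 0 or α → 1.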