Geometric mean of probability measures and geodesics of Fisher information metric
The space of all probability measures having positive density function on a
connected compact smooth manifold $M$, denoted by $\mathcal{P}(M)$, carries
the Fisher information metric $G$. We define the geometric mean of
probability measures, by the aid of which we investigate the information
geometry of $\mathcal{P}(M)$ equipped with $G$. We show that a geodesic
segment joining arbitrary probability measures $\mu$ and $\nu$ is expressed
using the normalized geometric mean of its endpoints. As an application, we
show that any two points of $\mathcal{P}(M)$ can be joined by a unique
geodesic. Moreover, we prove that the function $\rho$ defined by
$\rho(\mu, \nu) = 2 \arccos \int_M \sqrt{d\mu \, d\nu}$, for
$\mu, \nu \in \mathcal{P}(M)$, gives the Riemannian distance function on
$(\mathcal{P}(M), G)$. It is shown that all geodesics are minimal.
Comment: 33 pages, 1 figure
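A concrete way to see these statements (a background sketch via the standard
square-root embedding; the paper's own parametrization through the normalized
geometric mean may differ) is that $p \mapsto 2\sqrt{p}$ maps positive
densities isometrically into the radius-2 sphere in $L^2(M)$, so Fisher
geodesics are great-circle arcs:

$$ \sqrt{p_t} = \frac{\sin((1-t)\theta)\,\sqrt{p_0} + \sin(t\theta)\,\sqrt{p_1}}{\sin\theta},
\qquad \cos\theta = \int_M \sqrt{p_0\, p_1}\; dv. $$

Since $\cos\theta > 0$ for positive densities, the endpoints are never
antipodal, so the connecting arc is unique, and its length $2\theta$ recovers
the distance $\rho(\mu, \nu)$ above.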
Information Geometry and Evolutionary Game Theory
The Shahshahani geometry of evolutionary game theory is realized as the
information geometry of the simplex, deriving from the Fisher information
metric of the manifold of categorical probability distributions. Some essential
concepts in evolutionary game theory are realized information-theoretically.
Results are extended to the Lotka-Volterra equation and to multiple population
systems.
Comment: Added references
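For context, the standard identity the abstract builds on (a background
sketch; the paper's statements are more general) is that the Shahshahani
metric on the interior of the simplex, $g_{ij}(x) = \delta_{ij} / x_i$,
coincides with the Fisher information metric of the categorical distributions
parametrized by $x$, and the replicator dynamic

$$ \dot{x}_i = x_i \Bigl( f_i(x) - \sum_j x_j f_j(x) \Bigr) $$

is, whenever the fitness admits a potential $V$ with $f_i = \partial V /
\partial x_i$, the gradient flow of $V$ with respect to $g$.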
A Smoothed Dual Approach for Variational Wasserstein Problems
Variational problems that involve Wasserstein distances have been recently
proposed to summarize and learn from probability measures. Despite being
conceptually simple, such problems are computationally challenging because they
involve minimizing over quantities (Wasserstein distances) that are themselves
hard to compute. We show that the dual formulation of Wasserstein variational
problems introduced recently by Carlier et al. (2014) can be regularized using
an entropic smoothing, which leads to smooth, differentiable, convex
optimization problems that are simpler to implement and numerically more
stable. We illustrate the versatility of this approach by applying it to the
computation of Wasserstein barycenters and gradient flows of spatial
regularization functionals.
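To illustrate the entropic-smoothing idea concretely, here is a minimal
sketch of entropy-regularized transport in the primal (Sinkhorn iterations),
not the authors' smoothed dual solver; the function name, grid, and cost
choices are illustrative assumptions.

    import numpy as np

    def sinkhorn(a, b, C, eps=0.1, n_iters=500):
        """Entropy-regularized OT between histograms a and b with cost C.

        Returns the plan P minimizing <P, C> - eps * H(P) subject to the
        marginal constraints P @ 1 = a and P.T @ 1 = b.
        """
        K = np.exp(-C / eps)       # Gibbs kernel: smoothing removes LP kinks
        u = np.ones_like(a)
        for _ in range(n_iters):
            v = b / (K.T @ u)      # rescale columns to match marginal b
            u = a / (K @ v)        # rescale rows to match marginal a
        return u[:, None] * K * v[None, :]

    # Usage: two Gaussian-like histograms on a 1-D grid, squared-distance cost.
    x = np.linspace(0, 1, 50)
    C = (x[:, None] - x[None, :]) ** 2
    a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
    b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
    P = sinkhorn(a, b, C)
    print("regularized transport cost:", (P * C).sum())

Because the regularized objective is smooth and convex in the dual scalings,
the same smoothing makes barycenter and gradient-flow problems amenable to
standard first-order solvers.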
The Burbea-Rao and Bhattacharyya centroids
We study the centroid with respect to the class of information-theoretic
Burbea-Rao divergences that generalize the celebrated Jensen-Shannon divergence
by measuring the non-negative Jensen difference induced by a strictly convex
and differentiable function. Although those Burbea-Rao divergences are
symmetric by construction, they are not metrics, since they fail to satisfy
the triangle inequality. We first explain how a particular symmetrization of
Bregman divergences, called Jensen-Bregman distances, yields exactly those
Burbea-Rao divergences. We then define skew Burbea-Rao divergences and show
that, in limit cases, computing a skew Burbea-Rao divergence amounts to
computing a Bregman divergence. We prove that Burbea-Rao centroids are unique
and can be approximated arbitrarily finely by a generic iterative
concave-convex optimization algorithm with guaranteed convergence. In the
second part of the paper, we consider the Bhattacharyya distance, which is
commonly used to measure the degree of overlap between probability
distributions. We show that the Bhattacharyya distance between members of the
same statistical exponential family amounts to calculating a Burbea-Rao
divergence in disguise.
Thus we get an efficient algorithm for computing the Bhattacharyya centroid
of a set of parametric distributions belonging to the same exponential
family, improving over former specialized methods in the literature that were
limited to univariate or "diagonal" multivariate Gaussians. To illustrate the
performance of our Bhattacharyya/Burbea-Rao centroid algorithm, we present
experimental results for $k$-means and hierarchical clustering of Gaussian
mixture models.
Comment: 13 pages
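To make the centroid computation concrete, here is a minimal sketch of the
concave-convex fixed-point update for the Jensen-Shannon case
$F(x) = \sum_j x_j \log x_j$ (the function name and the simplex
renormalization are simplifying assumptions; the paper treats the constrained
problem exactly). With this $F$, $\nabla F(x) = \log x + 1$ and
$(\nabla F)^{-1}(y) = \exp(y - 1)$, so the generic update
$c \leftarrow (\nabla F)^{-1}\bigl(\tfrac{1}{n} \sum_i \nabla F((c + x_i)/2)\bigr)$
reduces to an elementwise geometric mean of the midpoints $(c + x_i)/2$.

    import numpy as np

    def burbea_rao_centroid(X, n_iters=100):
        """Approximate Jensen-Shannon (Burbea-Rao) centroid of the rows of X."""
        c = X.mean(axis=0)          # initialize at the arithmetic mean
        for _ in range(n_iters):
            # CCCP step: geometric mean of the midpoints (c + x_i) / 2
            c = np.exp(np.log((c + X) / 2).mean(axis=0))
            c /= c.sum()            # simple projection back onto the simplex
        return c

    # Usage: centroid of three categorical distributions.
    X = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.2, 0.3, 0.5]])
    print(burbea_rao_centroid(X))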