Gromov-Wasserstein Averaging of Kernel and Distance Matrices
This paper presents a new technique for computing the barycenter of a set of distance or kernel matrices. These matrices, which define the interrelationships between points sampled from individual domains, are not required to have the same size or to be in row-by-row correspondence. We compare these matrices using the softassign criterion, which measures the minimum distortion induced by a probabilistic map from the rows of one similarity matrix to the rows of another; this criterion amounts to a regularized version of the Gromov-Wasserstein (GW) distance between metric-measure spaces. The barycenter is then defined as a Fréchet mean of the input matrices with respect to this criterion, minimizing a weighted sum of softassign values. We provide a fast iterative algorithm for the resulting nonconvex optimization problem, built upon state-of-the-art tools for regularized optimal transportation. We demonstrate its application to the computation of shape barycenters and to the prediction of energy levels from molecular configurations in quantum chemistry.
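The criterion above scores a probabilistic map by the distortion it induces between two similarity matrices. A minimal NumPy sketch of the squared-loss GW distortion of a given coupling (the toy matrices and couplings below are illustrative assumptions, not the paper's barycenter algorithm):

```python
import numpy as np

def gw_distortion(C1, C2, T):
    """Squared-loss Gromov-Wasserstein distortion of a coupling T.

    C1: (n, n) distance/kernel matrix on the first space.
    C2: (m, m) distance/kernel matrix on the second space.
    T:  (n, m) probabilistic map (coupling).
    Returns sum_{i,j,k,l} (C1[i,k] - C2[j,l])^2 * T[i,j] * T[k,l].
    """
    p = T.sum(axis=1)  # row marginal
    q = T.sum(axis=0)  # column marginal
    # Expand the square into three terms; only the cross term couples the spaces.
    term1 = (C1 ** 2 @ p) @ p
    term2 = (C2 ** 2 @ q) @ q
    cross = np.einsum('ik,jl,ij,kl->', C1, C2, T, T)
    return term1 + term2 - 2.0 * cross

# Two identical 3-point metric spaces (unit distances off the diagonal).
C1 = np.array([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
C2 = C1.copy()

T_id = np.eye(3) / 3          # the identity map: zero distortion
T_ind = np.full((3, 3), 1/9)  # the independent coupling: positive distortion
print(gw_distortion(C1, C2, T_id))   # 0.0
print(round(gw_distortion(C1, C2, T_ind), 4))
```

Note that the matrices need not have the same size: `gw_distortion` only requires `T` to be shaped `(n, m)`, which is what lets the criterion compare domains without row-by-row correspondence.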
Learning Generative Models across Incomparable Spaces
Generative Adversarial Networks have shown remarkable success in learning a
distribution that faithfully recovers a reference distribution in its entirety.
However, in some cases, we may want to only learn some aspects (e.g., cluster
or manifold structure), while modifying others (e.g., style, orientation or
dimension). In this work, we propose an approach to learn generative models
across such incomparable spaces, and demonstrate how to steer the learned
distribution towards target properties. A key component of our model is the
Gromov-Wasserstein distance, a notion of discrepancy that compares
distributions relationally rather than absolutely. While this framework
subsumes current generative models in identically reproducing distributions,
its inherent flexibility allows application to tasks in manifold learning,
relational learning and cross-domain learning. Comment: International Conference on Machine Learning (ICML).
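The "relational rather than absolute" comparison can be illustrated with plain NumPy: pairwise distance matrices are invariant to the ambient embedding, which is what allows a GW-based discrepancy to compare spaces of different dimension. (The point cloud and random rotation below are illustrative assumptions.)

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))  # a point cloud in 2-D

# Embed the same points in 3-D and apply a random orthogonal map:
# absolute coordinates change, intra-space relations do not.
R = np.linalg.qr(rng.normal(size=(3, 3)))[0]  # random orthogonal matrix
Y = np.hstack([X, np.zeros((5, 1))]) @ R

D_X = np.linalg.norm(X[:, None] - X[None], axis=-1)  # pairwise distances in 2-D
D_Y = np.linalg.norm(Y[:, None] - Y[None], axis=-1)  # pairwise distances in 3-D
print(np.allclose(D_X, D_Y))  # True: relational structure is preserved
```

A discrepancy computed from `D_X` and `D_Y` alone is therefore blind to orientation and ambient dimension, which is exactly the flexibility the abstract exploits for steering style while preserving cluster or manifold structure.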
Aligning Time Series on Incomparable Spaces
Dynamic time warping (DTW) is a useful method for aligning, comparing and
combining time series, but it requires them to live in comparable spaces. In
this work, we consider a setting in which time series live on different spaces
without a sensible ground metric, causing DTW to become ill-defined. To
alleviate this, we propose Gromov dynamic time warping (GDTW), a distance
between time series on potentially incomparable spaces that avoids the
comparability requirement by instead considering intra-relational geometry. We
demonstrate its effectiveness at aligning, combining and comparing time series
living on incomparable spaces. We further propose a smoothed version of GDTW as
a differentiable loss and assess its properties in a variety of settings,
including barycentric averaging, generative modeling and imitation learning.
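For context, the dynamic-programming backbone that GDTW shares with classical DTW can be sketched as follows. This is standard DTW on a precomputed cross-space cost matrix, not the paper's GDTW recursion, which avoids such a cost by scoring alignments on intra-series distances instead:

```python
import numpy as np

def dtw(cost):
    """Classical DTW alignment cost on a precomputed (n, m) ground-cost matrix.

    Requires a sensible cross-space cost; GDTW removes this requirement
    by comparing intra-relational geometry, but reuses the same
    dynamic-programming structure shown here.
    """
    n, m = cost.shape
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Extend the cheapest of the three admissible predecessor paths.
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1]
            )
    return acc[n, m]

x = np.array([0., 1., 2., 3.])
y = np.array([0., 0., 1., 2., 3.])        # x with a repeated first sample
cost = np.abs(x[:, None] - y[None, :])    # valid here: both series share a space
print(dtw(cost))  # 0.0: the warping absorbs the repeated start
```

When `x` and `y` live on different spaces, no such `cost` matrix is well-defined, which is precisely the failure mode the abstract describes and GDTW addresses.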