Comparing metric measure spaces (i.e. a metric space endowed with
a probability distribution) is at the heart of many machine learning problems.
The most popular distance between such metric measure spaces is
the Gromov-Wasserstein (GW) distance, which is the solution of a quadratic
assignment problem.
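For two mm-spaces $(X, d_X, \mu)$ and $(Y, d_Y, \nu)$, this quadratic assignment
problem can be written (in standard notation, not fixed by this abstract) as
$$ \mathrm{GW}(\mu,\nu)^2 \;=\; \min_{\pi \in \Pi(\mu,\nu)} \iint \big| d_X(x,x') - d_Y(y,y') \big|^2 \,\mathrm{d}\pi(x,y)\,\mathrm{d}\pi(x',y'), $$
where $\Pi(\mu,\nu)$ denotes the set of couplings whose marginals are $\mu$ and $\nu$.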
The GW distance is, however, limited to the comparison of metric measure spaces
endowed with probability distributions. To alleviate this issue, we introduce
two Unbalanced Gromov-Wasserstein formulations: a distance
and a more tractable upper-bounding relaxation. They both allow the comparison
of metric spaces equipped with arbitrary positive measures up to isometries.
The first formulation is a positive and definite divergence, based on a
relaxation of the mass-conservation constraint using a novel type of
quadratically-homogeneous divergence.
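Schematically, and with illustrative notation, the relaxation replaces the hard
marginal constraints of $\Pi(\mu,\nu)$ by penalties:
$$ \mathrm{UGW}(\mu,\nu) \;=\; \inf_{\pi \geq 0} \iint \big| d_X(x,x') - d_Y(y,y') \big|^2 \,\mathrm{d}\pi\,\mathrm{d}\pi \;+\; \rho\, D^{\otimes}(\pi_1 \,|\, \mu) \;+\; \rho\, D^{\otimes}(\pi_2 \,|\, \nu), $$
where $\pi_1, \pi_2$ are the marginals of $\pi$, $\rho > 0$ tunes the strength of
the mass-conservation penalty, and $D^{\otimes}$ is a quadratically-homogeneous
divergence (for instance a tensorized Kullback-Leibler term of the form
$\mathrm{KL}(\pi_1 \otimes \pi_1 \,|\, \mu \otimes \mu)$), chosen so that the
penalty scales like the quadratic transport cost.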
This divergence works hand in hand with the entropic regularization approach,
which is popular for solving large-scale optimal transport problems. We show
that the underlying non-convex optimization problem can be efficiently tackled
using a highly parallelizable and GPU-friendly iterative scheme.
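To give a rough idea of why such a scheme is GPU-friendly, here is a minimal
sketch under simplifying assumptions: it covers only the balanced entropic case
with a plain Sinkhorn inner loop, and all function names are illustrative, not
the paper's algorithm. Each outer iteration linearizes the quadratic cost around
the current plan and then solves an entropic OT problem, so everything reduces
to dense matrix products and elementwise operations.

import numpy as np

def sinkhorn(C, p, q, eps, n_iter=200):
    # Standard Sinkhorn iterations for entropic OT with cost matrix C.
    K = np.exp(-C / eps)                 # Gibbs kernel (log-domain advised in practice)
    u = np.ones_like(p)
    for _ in range(n_iter):
        v = q / (K.T @ u)                # scale columns to match marginal q
        u = p / (K @ v)                  # scale rows to match marginal p
    return u[:, None] * K * v[None, :]   # transport plan

def entropic_gw(Dx, Dy, p, q, eps=0.5, n_outer=20):
    # Alternating scheme: freeze the plan, linearize the quadratic GW cost,
    # then solve the resulting entropic OT problem; repeat.
    pi = np.outer(p, q)                  # independent coupling as initialization
    const = (Dx**2 @ p)[:, None] + (Dy**2 @ q)[None, :]
    for _ in range(n_outer):
        C = const - 2 * Dx @ pi @ Dy.T   # local (linearized) cost around pi
        pi = sinkhorn(C, p, q, eps)
    return pi

# Toy usage: compare two random point clouds via their distance matrices.
rng = np.random.default_rng(0)
x, y = rng.normal(size=(30, 2)), rng.normal(size=(40, 2))
Dx = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
Dy = np.linalg.norm(y[:, None] - y[None, :], axis=-1)
p, q = np.full(30, 1 / 30), np.full(40, 1 / 40)
plan = entropic_gw(Dx, Dy, p, q)

Replacing the NumPy arrays with GPU tensors (e.g. in PyTorch) gives the parallel
version; broadly speaking, the unbalanced setting modifies the inner updates
rather than this overall alternating structure.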
The second formulation is a distance between mm-spaces up to isometries, based
on a conic lifting. Lastly, we provide numerical experiments on synthetic
examples and on domain adaptation data with a Positive-Unlabeled learning task,
to highlight the salient features of the unbalanced divergence and its
potential applications in ML.