A Smoothed Dual Approach for Variational Wasserstein Problems
Variational problems that involve Wasserstein distances have been recently
proposed to summarize and learn from probability measures. Despite being
conceptually simple, such problems are computationally challenging because they
involve minimizing over quantities (Wasserstein distances) that are themselves
hard to compute. We show that the dual formulation of Wasserstein variational
problems introduced recently by Carlier et al. (2014) can be regularized using
an entropic smoothing, which leads to smooth, differentiable, convex
optimization problems that are simpler to implement and numerically more
stable. We illustrate the versatility of this approach by applying it to the
computation of Wasserstein barycenters and gradient flows of spatial
regularization functionals.
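To make the entropic-smoothing idea concrete, here is a minimal sketch of computing a smoothed Wasserstein barycenter of histograms on a shared grid. It uses iterative Bregman projections (the Sinkhorn-style scheme of Benamou et al.) rather than the exact dual solver of the abstract above; the function name, parameters, and defaults are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sinkhorn_barycenter(hists, C, eps=0.05, weights=None, n_iter=200):
    """Entropy-smoothed Wasserstein barycenter of histograms on a shared grid.

    hists: (k, n) array of probability vectors; C: (n, n) ground cost matrix.
    A sketch via iterative Bregman projections, not the paper's dual solver.
    """
    k, n = hists.shape
    if weights is None:
        weights = np.full(k, 1.0 / k)
    K = np.exp(-C / eps)                    # Gibbs kernel of the smoothed problem
    u = np.ones((k, n))
    b = np.ones(n) / n
    for _ in range(n_iter):
        v = hists / (K.T @ u.T).T           # scale to match each input marginal
        Kv = (K @ v.T).T                    # (k, n)
        b = np.exp(weights @ np.log(Kv))    # weighted geometric mean -> barycenter
        u = b / Kv
    return b
```

The entropic term makes every iterate a simple matrix-vector scaling, which is what yields the smooth, numerically stable optimization the abstract describes.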
Distributed optimization with quantization for computing Wasserstein barycenters
We study the decentralized computation of entropy-regularized semi-discrete Wasserstein barycenters over a network. Building upon recent primal-dual approaches, we propose a sampling gradient quantization scheme that allows efficient communication and computation of approximate barycenters when the factor distributions are stored distributedly on arbitrary networks. We establish the communication and algorithmic complexity of the proposed algorithm, with explicit dependence on the size of the support, the number of distributions, and the desired accuracy. Numerical results validate our algorithmic analysis.
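The communication savings in such distributed schemes come from sending compressed gradients. As a sketch of the general idea (the paper's specific quantizer and its parameters may differ), here is an unbiased stochastic uniform quantizer: each coordinate is randomly rounded to one of a small number of levels so that the expectation equals the original value.

```python
import numpy as np

def quantize(v, levels=16, rng=None):
    """Unbiased stochastic uniform quantization of a vector.

    A generic communication-compression sketch: values are snapped to a uniform
    grid of `levels` points between min(v) and max(v), rounding up with
    probability equal to the fractional part so that E[quantize(v)] = v.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    lo, hi = v.min(), v.max()
    if hi == lo:
        return v.copy()
    scale = (hi - lo) / (levels - 1)
    t = (v - lo) / scale                    # position on the grid, in [0, levels-1]
    low = np.floor(t)
    p = t - low                             # probability of rounding up -> unbiased
    q = low + (rng.random(v.shape) < p)
    return lo + q * scale
```

Transmitting the grid index (log2(levels) bits per coordinate) plus the two range scalars replaces a full-precision vector, at the cost of bounded, zero-mean noise in the gradient.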
Solving general elliptical mixture models through an approximate Wasserstein manifold
We address the estimation problem for general finite mixture models, with a
particular focus on the elliptical mixture models (EMMs). Compared to the
widely adopted Kullback-Leibler divergence, we show that the Wasserstein
distance provides a more desirable optimisation space. We thus provide a stable
solution to the EMMs that is both robust to initialisations and reaches a
superior optimum by adaptively optimising along a manifold of an approximate
Wasserstein distance. To this end, we first provide a unifying account of
computable and identifiable EMMs, which serves as a basis to rigorously address
the underpinning optimisation problem. Due to a probability constraint, solving
this problem is extremely cumbersome and unstable, especially under the
Wasserstein distance. To relieve this issue, we introduce an efficient
optimisation method on a statistical manifold defined under an approximate
Wasserstein distance, which allows for explicit metrics and computable
operations, thus significantly stabilising and improving the EMM estimation. We
further propose an adaptive method to accelerate the convergence. Experimental
results demonstrate the excellent performance of the proposed EMM solver.
Comment: This work has been accepted to AAAI 2020. Note that this version also corrects a small error in Equation (16) in the proof.
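One reason the Wasserstein distance offers a tractable optimisation space for elliptical families is that, for Gaussians (the prototypical elliptical distributions), the 2-Wasserstein distance has a closed form in the means and covariances. The following is a standard illustration of that formula, not code from the paper; the SPD square root is computed by eigendecomposition.

```python
import numpy as np

def _sqrtm_spd(A):
    """Matrix square root of a symmetric positive semi-definite matrix."""
    w, V = np.linalg.eigh(A)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def gaussian_w2(m1, S1, m2, S2):
    """Closed-form 2-Wasserstein distance between N(m1, S1) and N(m2, S2):

    W2^2 = |m1 - m2|^2 + tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2})
    """
    rS2 = _sqrtm_spd(S2)
    cross = _sqrtm_spd(rS2 @ S1 @ rS2)
    d2 = np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross)
    return np.sqrt(max(d2, 0.0))            # guard tiny negative round-off
```

In one dimension this reduces to sqrt((m1 - m2)^2 + (s1 - s2)^2), making explicit how the metric accounts for both location and scale, unlike the KL divergence.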
Graph Cuts with Arbitrary Size Constraints Through Optimal Transport
A common way of partitioning graphs is through minimum cuts. One drawback of
classical minimum cut methods is that they tend to produce small groups, which
is why more balanced variants such as normalized and ratio cuts have seen more
success. However, we believe that with these variants the balance constraints
can be too restrictive for some applications, such as clustering of imbalanced
datasets, while not being restrictive enough when searching for perfectly
balanced partitions. Here, we propose a new graph cut algorithm for
partitioning graphs under arbitrary size constraints. We formulate the graph
cut problem as a regularized Gromov-Wasserstein problem, and propose to solve
it with an accelerated proximal gradient descent algorithm that has global
convergence guarantees, yields sparse solutions, and only incurs an additional
cost factor compared to the classical spectral clustering algorithm, while
being observed to be more efficient in practice.
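A way to see how optimal transport encodes arbitrary size constraints: treat nodes as source masses and clusters as target masses equal to the prescribed sizes, so any transport plan is a soft assignment that respects the sizes exactly. The sketch below solves this relaxation with entropic regularization via Sinkhorn iterations; it is a simplified illustration of the constraint mechanism, not the paper's regularized Gromov-Wasserstein solver.

```python
import numpy as np

def sized_assignment(cost, sizes, eps=0.1, n_iter=500):
    """Soft cluster assignment with prescribed cluster sizes via Sinkhorn.

    cost: (n, k) node-to-cluster cost matrix; sizes: (k,) target cluster
    fractions summing to 1. Returns an (n, k) transport plan whose column sums
    equal `sizes`, i.e. a soft assignment meeting the size constraints.
    """
    n, k = cost.shape
    a = np.full(n, 1.0 / n)                 # each node carries equal mass
    b = np.asarray(sizes, dtype=float)      # cluster masses = size constraints
    K = np.exp(-cost / eps)
    u = np.ones(n)
    for _ in range(n_iter):
        v = b / (K.T @ u)                   # match cluster-size marginal
        u = a / (K @ v)                     # match per-node marginal
    return u[:, None] * K * v[None, :]      # transport plan (soft assignment)
```

Hard assignments can then be read off by rounding each row to its largest entry; the marginal constraints are what make any target balance, from perfectly even to highly skewed, expressible in the same formulation.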
- …