A Smoothed Dual Approach for Variational Wasserstein Problems
Variational problems that involve Wasserstein distances have been recently
proposed to summarize and learn from probability measures. Despite being
conceptually simple, such problems are computationally challenging because they
involve minimizing over quantities (Wasserstein distances) that are themselves
hard to compute. We show that the dual formulation of Wasserstein variational
problems introduced recently by Carlier et al. (2014) can be regularized using
an entropic smoothing, which leads to smooth, differentiable, convex
optimization problems that are simpler to implement and numerically more
stable. We illustrate the versatility of this approach by applying it to the
computation of Wasserstein barycenters and gradient flows of spatial
regularization functionals.
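The entropic smoothing this abstract builds on can be illustrated with the standard Sinkhorn iteration for entropy-regularized transport. This is a minimal NumPy sketch; the histograms, cost matrix, and parameter values are our own illustrative choices, not the paper's.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.5, n_iter=500):
    """Entropy-regularized optimal transport via Sinkhorn iterations.

    a, b : marginal histograms (nonnegative, summing to one)
    C    : cost matrix; eps : entropic regularization strength.
    """
    K = np.exp(-C / eps)            # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        v = b / (K.T @ u)           # diagonal scaling updates
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]   # regularized transport plan

# Toy example: two 3-bin histograms with an absolute-difference cost
a = np.array([0.5, 0.3, 0.2])
b = np.array([0.2, 0.3, 0.5])
C = np.abs(np.arange(3)[:, None] - np.arange(3)[None, :]).astype(float)
P = sinkhorn(a, b, C)
```

Because the regularized objective is smooth and convex in the dual scaling variables, each update is a closed-form pointwise division, and the plan's marginals converge to a and b.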
Scaling Algorithms for Unbalanced Transport Problems
This article introduces a new class of fast algorithms to approximate
variational problems involving unbalanced optimal transport. While classical
optimal transport considers only normalized probability distributions, it is
important for many applications to be able to compute some sort of relaxed
transportation between arbitrary positive measures. A generic class of such
"unbalanced" optimal transport problems has been recently proposed by several
authors. In this paper, we show how to extend the now-classical entropic
regularization scheme to these unbalanced problems. This gives rise to fast,
highly parallelizable algorithms that operate by performing only diagonal
scaling (i.e. pointwise multiplications) of the transportation couplings. They
are generalizations of the celebrated Sinkhorn algorithm. We show how these
methods can be used to solve unbalanced transport, unbalanced gradient flows,
and to compute unbalanced barycenters. We showcase applications to 2-D shape
modification, color transfer, and growth models.
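The diagonal-scaling generalization of Sinkhorn described above can be sketched for the common case where both marginal constraints are relaxed by a Kullback-Leibler penalty; each scaling update then simply acquires a damping exponent rho/(rho+eps). This is our own minimal NumPy illustration under that assumption, not the authors' implementation.

```python
import numpy as np

def unbalanced_sinkhorn(a, b, C, eps=0.1, rho=1.0, n_iter=500):
    """Scaling algorithm for KL-relaxed (unbalanced) optimal transport.

    Marginal constraints are replaced by KL penalties of weight rho,
    so a and b may be arbitrary positive measures. Each Sinkhorn-style
    update is damped by the exponent rho / (rho + eps).
    """
    K = np.exp(-C / eps)
    p = rho / (rho + eps)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = (a / (K @ v)) ** p    # diagonal scaling, damped by p
        v = (b / (K.T @ u)) ** p
    return u[:, None] * K * v[None, :]

# Inputs with unequal total mass -- not allowed in balanced transport
a = np.array([1.0, 0.5])
b = np.array([0.3, 0.9, 0.2])
C = (np.arange(2)[:, None] - np.arange(3)[None, :]) ** 2.0
P = unbalanced_sinkhorn(a, b, C)
```

As rho grows the penalties harden into constraints and the balanced Sinkhorn updates are recovered; the iteration still touches the coupling only through pointwise multiplications, which is what makes it highly parallelizable.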
Entropic regularization approach for mathematical programs with equilibrium constraints
A new smoothing approach based on entropic perturbation is proposed for solving mathematical programs with equilibrium constraints. Some of the desirable properties of the smoothing function are shown. The viability of the proposed approach is supported by a computational study on a set of well-known test problems.
Keywords: Entropic regularization; Smoothing approach; Mathematical programs with equilibrium constraints
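As a concrete instance of entropic perturbation (our own minimal sketch, not this paper's particular smoothing function), the nonsmooth complementarity condition min(x, y) = 0 that typifies equilibrium constraints can be replaced by a differentiable log-sum-exp approximation:

```python
import numpy as np

def smooth_min(x, y, p=100.0):
    """Entropic (log-sum-exp) smoothing of min(x, y).

    -(1/p) * log(exp(-p*x) + exp(-p*y)) tends to min(x, y) as p -> inf,
    is everywhere differentiable, and underestimates min(x, y) by at
    most log(2)/p. Written in a numerically stable form that never
    exponentiates a positive argument.
    """
    m = np.minimum(x, y)
    return m - np.log(np.exp(-p * (x - m)) + np.exp(-p * (y - m))) / p
```

In an MPEC, the complementarity condition 0 <= x, 0 <= y, x*y = 0 can be written as min(x, y) = 0 and then approximated by smooth_min(x, y, p) = 0, with p driven upward to tighten the approximation; this is the general shape of entropic smoothing schemes for such problems.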
A Numerical Method to solve Optimal Transport Problems with Coulomb Cost
In this paper, we present a numerical method, based on iterative Bregman
projections, to solve the optimal transport problem with Coulomb cost. This is
related to the strong interaction limit of Density Functional Theory. The first
idea is to introduce an entropic regularization of the Kantorovich formulation
of the Optimal Transport problem. The regularized problem then corresponds to
the projection of a vector on the intersection of the constraints with respect
to the Kullback-Leibler distance. Iterative Bregman projections on each
marginal constraint are explicit which enables us to approximate the optimal
transport plan. We validate the numerical method against analytical test cases.
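The iterative Bregman projection idea can be sketched in the simplest two-marginal setting, where the KL projection onto each marginal constraint is an explicit row or column rescaling. This is an illustrative NumPy sketch with our own data and parameters; the paper's actual setting is the multi-marginal Coulomb-cost problem.

```python
import numpy as np

def kl_project_rows(P, a):
    # Explicit KL projection onto {P : P @ 1 = a}: rescale each row.
    return P * (a / P.sum(axis=1))[:, None]

def kl_project_cols(P, b):
    # Explicit KL projection onto {P : P.T @ 1 = b}: rescale each column.
    return P * (b / P.sum(axis=0))[None, :]

a = np.array([0.4, 0.6])
b = np.array([0.3, 0.3, 0.4])
C = (np.arange(2)[:, None] - np.arange(3)[None, :]) ** 2.0
P = np.exp(-C / 0.5)              # entropic regularization: Gibbs kernel
for _ in range(500):              # iterate Bregman projections
    P = kl_project_rows(P, a)
    P = kl_project_cols(P, b)
# P now approximates the regularized optimal transport plan
```

Because each projection is explicit, the scheme needs no inner solver; alternating them drives the iterate toward the KL projection of the Gibbs kernel onto the intersection of the marginal constraints.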
Regularized Optimal Transport and the Rot Mover's Distance
This paper presents a unified framework for smooth convex regularization of
discrete optimal transport problems. In this context, the regularized optimal
transport turns out to be equivalent to a matrix nearness problem with respect
to Bregman divergences. Our framework thus naturally generalizes a previously
proposed regularization based on the Boltzmann-Shannon entropy related to the
Kullback-Leibler divergence, and solved with the Sinkhorn-Knopp algorithm. We
call the regularized optimal transport distance the rot mover's distance in
reference to the classical earth mover's distance. We develop two generic
schemes that we respectively call the alternate scaling algorithm and the
non-negative alternate scaling algorithm, to efficiently compute the
regularized optimal plans depending on whether the domain of the regularizer
lies within the non-negative orthant or not. These schemes are based on
Dykstra's algorithm with alternate Bregman projections, and further exploit the
Newton-Raphson method when applied to separable divergences. We enhance the
separable case with a sparse extension to deal with high data dimensions. We
also instantiate our proposed framework and discuss the inherent specificities
for well-known regularizers and statistical divergences in the machine learning
and information geometry communities. Finally, we demonstrate the merits of our
methods with experiments using synthetic data to illustrate the effect of
different regularizers and penalties on the solutions, as well as real-world
data for a pattern recognition application to audio scene classification.
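The Dykstra scheme underlying these algorithms can be sketched generically: unlike plain alternating projections, Dykstra's algorithm carries multiplicative correction terms, which is what makes it valid beyond affine constraint sets. The following is our own minimal NumPy illustration with KL projections onto two marginal constraints, not the paper's ASA/NASA implementations.

```python
import numpy as np

def dykstra_kl(K, projections, n_iter=600):
    """Dykstra's algorithm with alternating KL (Bregman) projections.

    Converges to the KL projection of K onto the intersection of the
    constraint sets; the corrections q[i] reduce to no-ops for affine
    sets but are essential for general (non-affine) ones.
    """
    P = K.copy()
    q = [np.ones_like(K) for _ in projections]
    for it in range(n_iter):
        i = it % len(projections)
        tmp = P * q[i]            # reapply the stored correction
        P = projections[i](tmp)   # KL projection onto constraint set i
        q[i] = tmp / P            # update the correction term
    return P

# Two marginal constraints, each with an explicit KL projection
a = np.array([0.4, 0.6])
b = np.array([0.3, 0.3, 0.4])
C = (np.arange(2)[:, None] - np.arange(3)[None, :]) ** 2.0
project_rows = lambda P: P * (a / P.sum(axis=1))[:, None]
project_cols = lambda P: P * (b / P.sum(axis=0))[None, :]
P = dykstra_kl(np.exp(-C / 0.5), [project_rows, project_cols])
```

For the Boltzmann-Shannon entropy this recovers Sinkhorn-Knopp; swapping in other Bregman divergences and projection maps gives the family of regularized plans the paper studies.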