3 research outputs found

    Fast Algorithms for Computational Optimal Transport and Wasserstein Barycenter

    We provide theoretical complexity analysis for new algorithms to compute the optimal transport (OT) distance between two discrete probability distributions, and demonstrate their favorable practical performance over state-of-the-art primal-dual algorithms as well as their capability to solve other large-scale problems, such as the Wasserstein barycenter problem for multiple probability distributions. First, we introduce the \emph{accelerated primal-dual randomized coordinate descent} (APDRCD) algorithm for computing the OT distance. We provide its complexity upper bound $\widetilde{\mathcal{O}}(n^{5/2}/\varepsilon)$, where $n$ stands for the number of atoms of these probability measures and $\varepsilon > 0$ is the desired accuracy. This complexity bound matches the best known complexities of primal-dual algorithms for the OT problem, including the adaptive primal-dual accelerated gradient descent (APDAGD) and the adaptive primal-dual accelerated mirror descent (APDAMD) algorithms. Then, we demonstrate the better performance of the APDRCD algorithm over the APDAGD and APDAMD algorithms through extensive experimental studies, and further improve its practical performance by proposing a greedy version of it, which we refer to as \emph{accelerated primal-dual greedy coordinate descent} (APDGCD). Finally, we generalize the APDRCD and APDGCD algorithms to distributed algorithms for computing the Wasserstein barycenter for multiple probability distributions. Comment: 18 pages, 35 figures
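
    As an illustration of the coordinate-descent viewpoint on the dual of entropic-regularized OT that underlies APDRCD, here is a minimal Python sketch of randomized block-coordinate ascent on that dual (a randomized Sinkhorn-style scheme). It deliberately omits the acceleration of APDRCD and the greedy selection rule of APDGCD; the function name, regularization value, and iteration count are illustrative assumptions, not taken from the paper.

    import numpy as np
    from scipy.special import logsumexp

    def randomized_dual_coordinate_ot(C, r, c, eta=0.05, n_iters=2000, seed=0):
        """Randomized block-coordinate ascent on the entropic OT dual
        (sketch only, not the accelerated APDRCD/APDGCD methods of the paper)."""
        # C: (n, m) cost matrix; r, c: probability vectors; eta: entropic regularization.
        rng = np.random.default_rng(seed)
        u = np.zeros(C.shape[0])  # dual potentials for the row marginal r
        v = np.zeros(C.shape[1])  # dual potentials for the column marginal c
        for _ in range(n_iters):
            if rng.random() < 0.5:
                # exact maximization of the dual over the block of row potentials
                u = eta * (np.log(r) - logsumexp((v[None, :] - C) / eta, axis=1))
            else:
                # exact maximization of the dual over the block of column potentials
                v = eta * (np.log(c) - logsumexp((u[:, None] - C) / eta, axis=0))
        P = np.exp((u[:, None] + v[None, :] - C) / eta)  # approximate transport plan
        return float(np.sum(P * C)), P

    In this sketch each step solves one dual block exactly; the accelerated variants in the paper instead combine randomized (or greedy) coordinate updates with momentum to reach the stated complexity bound.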

    On Unbalanced Optimal Transport: An Analysis of Sinkhorn Algorithm

    We provide a computational complexity analysis for the Sinkhorn algorithm that solves the entropic regularized Unbalanced Optimal Transport (UOT) problem between two measures of possibly different masses with at most $n$ components. We show that the complexity of the Sinkhorn algorithm for finding an $\varepsilon$-approximate solution to the UOT problem is of order $\widetilde{\mathcal{O}}(n^2/\varepsilon)$, which is near-linear time. To the best of our knowledge, this complexity is better than the complexity of the Sinkhorn algorithm for solving the Optimal Transport (OT) problem, which is of order $\widetilde{\mathcal{O}}(n^2/\varepsilon^2)$. Our proof technique is based on the geometric convergence of the Sinkhorn updates to the optimal dual solution of the entropic regularized UOT problem and on some properties of the primal solution. It also differs from the proof of the complexity of the Sinkhorn algorithm for approximating the OT problem, since the UOT solution does not have to meet the marginal constraints.
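
    For concreteness, the following is a minimal sketch of Sinkhorn-style scaling iterations for entropic UOT in which the hard marginal constraints are replaced by KL penalties of weight tau; the damping exponent tau/(tau + eta) is what distinguishes it from balanced Sinkhorn. The parameter values and function name are illustrative assumptions; the code is not taken from the paper, which analyzes the complexity of this kind of scheme rather than providing an implementation.

    import numpy as np

    def sinkhorn_uot(C, r, c, eta=0.05, tau=1.0, n_iters=500):
        """Sinkhorn-type scaling iterations for entropic unbalanced OT
        with KL marginal penalties of weight tau (illustrative sketch)."""
        K = np.exp(-C / eta)             # Gibbs kernel
        a = np.ones_like(r)              # row scaling
        b = np.ones_like(c)              # column scaling
        fi = tau / (tau + eta)           # damping induced by the KL penalties
        for _ in range(n_iters):
            a = (r / (K @ b)) ** fi
            b = (c / (K.T @ a)) ** fi
        P = a[:, None] * K * b[None, :]  # approximate UOT plan
        return P

    Because of the damping exponent, the row and column sums of P only approximate r and c, which is exactly the point made in the abstract: the UOT solution does not have to meet the marginal constraints.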

    A New Randomized Primal-Dual Algorithm for Convex Optimization with Optimal Last Iterate Rates

    We develop a novel unified randomized block-coordinate primal-dual algorithm to solve a class of nonsmooth constrained convex optimization problems, which covers different existing variants and model settings from the literature. We prove that our algorithm achieves optimal $\mathcal{O}(n/k)$ and $\mathcal{O}(n^2/k^2)$ convergence rates (up to a constant factor) in two cases: general convexity and strong convexity, respectively, where $k$ is the iteration counter and $n$ is the number of block-coordinates. Our convergence rates are obtained through three criteria: primal objective residual and primal feasibility violation, dual objective residual, and primal-dual expected gap. Moreover, our rates for the primal problem hold on the last iterate sequence. Our dual convergence guarantee additionally requires a Lipschitz continuity assumption. We specialize our algorithm to handle two important special cases, where our rates still apply. Finally, we verify our algorithm on two well-studied numerical examples and compare it with two existing methods. Our results show that the proposed method has encouraging performance on different experiments. Comment: 29 pages, 5 figures
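
    To make the setting concrete, here is a toy randomized block-coordinate primal-dual iteration for a nonsmooth linearly constrained problem, min_x lam*||x||_1 subject to A x = b, with the coordinates of x split into blocks. It is only a schematic sketch under assumed step sizes gamma and sigma; it has none of the acceleration, averaging, or optimal last-iterate guarantees of the paper's algorithm, and all names and parameters are illustrative.

    import numpy as np

    def randomized_block_primal_dual(A, b, blocks, lam=1.0, gamma=0.01,
                                     sigma=0.01, n_iters=5000, seed=0):
        """Toy randomized block-coordinate primal-dual scheme (sketch only):
        each step soft-thresholds one random primal block against a linearized
        augmented Lagrangian, then takes a dual ascent step on the residual."""
        rng = np.random.default_rng(seed)
        x = np.zeros(A.shape[1])
        y = np.zeros(A.shape[0])                   # dual variable for A x = b

        def soft(z, t):                            # soft-thresholding: prox of t*||.||_1
            return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

        for _ in range(n_iters):
            idx = blocks[rng.integers(len(blocks))]            # one random block
            grad = A[:, idx].T @ (y + sigma * (A @ x - b))     # linearized Lagrangian gradient
            x[idx] = soft(x[idx] - gamma * grad, gamma * lam)  # proximal block update
            y = y + sigma * (A @ x - b)            # dual ascent on the feasibility residual
        return x, y

    A block partition such as blocks = np.array_split(np.arange(A.shape[1]), 4) can be passed in; in the paper, the block structure, step sizes, and sampling scheme are part of the algorithm's design and analysis rather than fixed constants as above.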