
    Stochastic Wasserstein Barycenters

    We present a stochastic algorithm to compute the barycenter of a set of probability distributions under the Wasserstein metric from optimal transport. Unlike previous approaches, our method extends to continuous input distributions and allows the support of the barycenter to be adjusted in each iteration. We tackle the problem without regularization, allowing us to recover a sharp output whose support is contained within the support of the true barycenter. We give examples where our algorithm recovers a more meaningful barycenter than previous work. Our method is versatile and can be extended to applications such as generating super samples from a given distribution and recovering blue noise approximations.
    Comment: ICML 2018
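
    As a rough illustration of the underlying free-support barycenter problem, the sketch below implements the classic fixed-point iteration of Cuturi and Doucet (2014) with the POT library's exact solver; it is unregularized, like the paper above, but omits the paper's stochastic, support-adjusting machinery. Function and variable names are illustrative.

        import numpy as np
        import ot  # POT: Python Optimal Transport (pip install pot)

        def free_support_barycenter(supports, weights_list, X, n_iter=20):
            # supports: list of (n_k, d) input point clouds
            # weights_list: list of (n_k,) probability vectors
            # X: (m, d) initial barycenter support; uniform weights assumed
            m = X.shape[0]
            a = np.full(m, 1.0 / m)
            for _ in range(n_iter):
                X_new = np.zeros_like(X)
                for Y, b in zip(supports, weights_list):
                    M = ot.dist(X, Y)               # squared Euclidean cost matrix
                    P = ot.emd(a, b, M)             # exact, unregularized OT plan
                    X_new += (P @ Y) / a[:, None]   # barycentric projection of X onto Y
                X = X_new / len(supports)           # average the projections
            return X

    Each pass transports the current support onto every input cloud and averages the resulting projections, the fixed point that characterizes a discrete Wasserstein barycenter with fixed weights.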

    Sinkhorn Barycenters with Free Support via Frank-Wolfe Algorithm

    We present a novel algorithm to estimate the barycenter of arbitrary probability distributions with respect to the Sinkhorn divergence. Based on a Frank-Wolfe optimization strategy, our approach proceeds by populating the support of the barycenter incrementally, without requiring any pre-allocation. We consider discrete as well as continuous distributions, proving convergence rates of the proposed algorithm in both settings. Key elements of our analysis are a new result showing that the Sinkhorn divergence on compact domains has Lipschitz continuous gradient with respect to the Total Variation norm, and a characterization of the sample complexity of Sinkhorn potentials. Experiments validate the effectiveness of our method in practice.
    Comment: 46 pages, 8 figures
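
    For contrast with the free-support Frank-Wolfe strategy above, here is a minimal fixed-support sketch: the iterative Bregman projections of Benamou et al. (2015) for the entropic barycenter of histograms on a common grid. The names and the fixed iteration count are illustrative assumptions.

        import numpy as np

        def sinkhorn_barycenter(A, M, reg, weights, n_iter=500):
            # A: (n, k) matrix whose columns are the k input histograms
            # M: (n, n) ground cost matrix on the shared grid
            # reg: entropic regularization strength (too small underflows K)
            # weights: (k,) barycentric weights summing to 1
            K = np.exp(-M / reg)                 # Gibbs kernel
            n, k = A.shape
            V = np.ones((n, k))
            for _ in range(n_iter):
                U = A / (K @ V)                  # match the input marginals
                KtU = K.T @ U
                # geometric mean couples the k problems through one barycenter
                b = np.exp((weights * np.log(KtU)).sum(axis=1))
                V = b[:, None] / KtU             # match the shared barycenter marginal
            return b

    Unlike the incremental scheme in the paper, this variant requires the support to be pre-allocated and inherits the blur introduced by entropic regularization.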

    Bayesian Learning with Wasserstein Barycenters

    We introduce a novel paradigm for Bayesian learning based on optimal transport theory. Namely, we propose to use the Wasserstein barycenter of the posterior law on models as a predictive posterior, thus introducing an alternative to classical choices like the maximum a posteriori estimator and the Bayesian model average. We exhibit conditions granting the existence and statistical consistency of this estimator, discuss some of its basic and specific properties, and provide insight into its theoretical advantages. Finally, we introduce a novel numerical method that is ideally suited for the computation of our estimator, and we explicitly discuss its implementation for specific families of models. This method can be seen as a stochastic gradient descent algorithm in the Wasserstein space and is of independent interest and applicability for the computation of Wasserstein barycenters. We also provide an illustrative numerical example for experimental validation of the proposed method.
    Comment: This version is a significant expansion of the previous one. As a new contribution, we introduce a numerical method that corresponds to a stochastic gradient descent algorithm in Wasserstein space. Additionally, we expanded the study of statistical consistency and included a comprehensive numerical experiment for validation. 32 pages, 7 figures
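
    As a loose sketch of what stochastic gradient descent in the Wasserstein space can look like for discrete measures: at each step, draw one model from the posterior, estimate the Monge map from the current barycenter support to it, and step along that map with Robbins-Monro step sizes. The sample_model callback and all names are hypothetical; this is not the authors' exact procedure.

        import numpy as np
        import ot  # POT: Python Optimal Transport

        def sgd_wasserstein_barycenter(sample_model, X, n_steps=200, step0=1.0):
            # sample_model: hypothetical callback returning one posterior draw
            #               as a point cloud Y (n, d) with weights b (n,)
            # X: (m, d) initial barycenter support, uniform weights
            m = X.shape[0]
            a = np.full(m, 1.0 / m)
            for t in range(1, n_steps + 1):
                Y, b = sample_model()
                M = ot.dist(X, Y)               # squared Euclidean costs
                P = ot.emd(a, b, M)             # OT plan to the sampled model
                T = (P @ Y) / a[:, None]        # barycentric projection (Monge map estimate)
                lr = step0 / t                  # Robbins-Monro step size
                X = (1.0 - lr) * X + lr * T     # geodesic step toward the sample
            return X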
    • …