156 research outputs found
Graph Signal Representation with Wasserstein Barycenters
In many applications signals reside on the vertices of weighted graphs. Thus,
there is a need to learn low-dimensional representations for graph signals
that will allow for data analysis and interpretation. Existing unsupervised
dimensionality reduction methods for graph signals have focused on dictionary
learning. In these works, the graph is taken into account by imposing a
structure or a parametrization on the dictionary and the signals are
represented as linear combinations of the atoms in the dictionary. However, the
assumption that graph signals can be represented using linear combinations of
atoms is not always appropriate. In this paper we propose a novel
representation framework based on non-linear and geometry-aware combinations of
graph signals by leveraging the mathematical theory of Optimal Transport. We
represent graph signals as Wasserstein barycenters and demonstrate through our
experiments the potential of our proposed framework for low-dimensional graph
signal representation.
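To make the barycentric representation concrete, the entropic-regularized Wasserstein barycenter of histograms living on a graph's vertices can be computed with iterative Bregman projections. Below is a minimal numpy sketch (illustrative only, not the authors' implementation), with the ground cost taken as squared distance on a 10-vertex path graph:

```python
import numpy as np

def entropic_barycenter(A, M, reg=1.0, weights=None, n_iter=200):
    """Entropic Wasserstein barycenter of the histograms in the columns
    of A, for ground-cost matrix M, via iterative Bregman projections
    (Benamou et al.-style fixed point). A minimal sketch, not the
    authors' implementation."""
    n, k = A.shape
    if weights is None:
        weights = np.full(k, 1.0 / k)
    K = np.exp(-M / reg)                      # Gibbs kernel
    v = np.ones((n, k))
    for _ in range(n_iter):
        u = A / (K @ v)                       # match the input marginals
        b = np.exp((weights * np.log(v * (K.T @ u))).sum(axis=1))  # geometric mean
        v = b[:, None] / (K.T @ u)            # match the barycenter marginal
    return b

# toy example: two bumps on a 10-vertex path graph, squared-distance cost
x = np.arange(10.0)
M = (x[:, None] - x[None, :]) ** 2
a1 = np.exp(-(x - 2) ** 2); a1 /= a1.sum()
a2 = np.exp(-(x - 7) ** 2); a2 /= a2.sum()
b = entropic_barycenter(np.stack([a1, a2], axis=1), M)
print(b.argmax())  # the barycenter's mass sits between the two inputs
```

With uniform weights and squared cost, the barycenter interpolates the inputs' locations rather than averaging their values pointwise, which is the geometry-aware, non-linear behaviour the abstract refers to.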
Regularized Wasserstein Means for Aligning Distributional Data
We propose to align distributional data from the perspective of Wasserstein
means. We study the problem of regularizing Wasserstein means and propose
several regularization terms, each tailored to a different problem. Our formulation is based
on the variational transportation to distribute a sparse discrete measure into
the target domain. The resulting sparse representation captures the desired
properties of the domain well while reducing the mapping cost. We demonstrate
the scalability and robustness of our method with examples in domain
adaptation, point set registration, and skeleton layout.
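To make the "sparse discrete measure" idea concrete: without any regularization, approximating a target empirical distribution by an m-point measure under the W2 cost reduces to a Lloyd-style alternation (k-means). The numpy sketch below shows only that unregularized core, under that assumption; the paper's contribution is the regularization terms added on top of this kind of objective:

```python
import numpy as np

def sparse_wasserstein_mean(X, m=4, n_iter=50, rng=None):
    """Unregularized sketch: approximate the empirical distribution of X
    by a sparse m-point measure, alternating optimal (nearest-support)
    assignment of samples with barycentric support updates -- Lloyd's
    algorithm, which minimizes the W2 cost for a fixed support size."""
    rng = rng or np.random.default_rng()
    C = X[rng.choice(len(X), m, replace=False)]   # initial support points
    for _ in range(n_iter):
        d = ((X[:, None] - C[None]) ** 2).sum(-1)
        lab = d.argmin(1)                         # transport each sample
        for j in range(m):
            if (lab == j).any():
                C[j] = X[lab == j].mean(0)        # barycentric update
    return C

rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(c, 0.2, (50, 2))
                    for c in ([0, 0], [4, 0], [0, 4], [4, 4])])
C = sparse_wasserstein_mean(X, m=4, rng=rng)
print(C.shape)  # (4, 2)
```

Each of the m support points summarizes the samples transported to it, which is the sense in which the sparse representation "captures the property of the domain while reducing the mapping cost".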
Optimal Transport for Domain Adaptation
Domain adaptation from one data space (or domain) to another is one of the
most challenging tasks of modern data analytics. If the adaptation is done
correctly, models built on a specific data space become more robust when
confronted with data depicting the same semantic concepts (the classes), but
observed by another observation system with its own specificities. Among the
many strategies proposed to adapt one domain to another, finding a common
representation has shown excellent properties: by finding a common
representation for both domains, a single classifier can be effective in both,
using labelled samples from the source domain to predict the unlabelled
samples of the target domain. In this paper, we propose a regularized
unsupervised optimal transportation model to perform the alignment of the
representations in the source and target domains. We learn a transportation
plan matching both PDFs, which constrains labelled samples in the source domain
to remain close during transport. This way, we simultaneously exploit the few
labelled samples in the source domain and the unlabelled distributions observed in
both domains. Experiments on toy and challenging real-world visual adaptation
examples show the effectiveness of the method, which consistently outperforms
state-of-the-art approaches.
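The core computational step — an entropic transport plan between empirical source and target distributions, followed by a barycentric mapping of the source samples — can be sketched in plain numpy as follows (an illustrative toy, not the authors' regularized model, which additionally couples label information into the objective):

```python
import numpy as np

def sinkhorn(a, b, M, reg=1.0, n_iter=500):
    """Entropic OT plan between histograms a, b for cost matrix M."""
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        v = b / (K.T @ u)   # match the target marginal
        u = a / (K @ v)     # match the source marginal
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(0)
Xs = rng.normal([0.0, 0.0], 0.3, (40, 2))   # source samples
Xt = rng.normal([3.0, 3.0], 0.3, (40, 2))   # target = shifted source
a = np.full(40, 1 / 40)                     # uniform empirical weights
b = np.full(40, 1 / 40)
M = ((Xs[:, None, :] - Xt[None, :, :]) ** 2).sum(-1)   # squared distances
P = sinkhorn(a, b, M)
Xs_mapped = (P @ Xt) / P.sum(axis=1, keepdims=True)    # barycentric mapping
print(np.round(Xs_mapped.mean(0) - Xt.mean(0), 2))     # mapped source lands on the target
```

In the unsupervised setting one would then train a classifier on the mapped source samples with their source labels and apply it directly to the target samples.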
Stability of Entropic Wasserstein Barycenters and application to random geometric graphs
As interest in graph data has grown in recent years, the computation of
various geometric tools has become essential. In some areas, such as mesh
processing, these tools often rely on the computation of geodesics and shortest paths
in discretized manifolds. A recent example of such a tool is the computation of
Wasserstein barycenters (WB), a very general notion of barycenters derived from
the theory of Optimal Transport, and their entropic-regularized variant. In
this paper, we examine how WBs on discretized meshes relate to the geometry of
the underlying manifold. We first provide a generic stability result with
respect to the input cost matrices. We then apply this result to random
geometric graphs on manifolds, whose shortest paths converge to geodesics,
hence proving the consistency of WBs computed on discretized shapes.
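The ingredient tying WBs to the manifold geometry is the cost matrix of all-pairs shortest-path distances on the random geometric graph. Assuming scipy is available, a minimal sketch of building that input:

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
pts = rng.uniform(0, 1, (80, 2))     # vertices sampled on the (flat) unit square
D = cdist(pts, pts)                  # Euclidean distances
eps = 0.3                            # connection radius of the geometric graph
W = np.where(D <= eps, D, np.inf)    # edge weight = length; np.inf marks non-edges
G = shortest_path(W, method="D")     # all-pairs shortest-path distances (Dijkstra)
# As the sample density grows and eps shrinks suitably, these graph
# distances converge to the manifold's geodesics, and G (or G**2) is the
# cost matrix one would hand to an entropic-WB solver on the graph.
print(G.shape)
```

The paper's stability result with respect to this cost matrix is what lets the convergence of shortest paths to geodesics carry over to the barycenters themselves.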
Landmarks Augmentation with Manifold-Barycentric Oversampling
The training of Generative Adversarial Networks (GANs) requires a large
amount of data, stimulating the development of new augmentation methods to
alleviate the challenge. Oftentimes, these methods either fail to produce
enough new data or expand the dataset beyond the original manifold. In this
paper, we propose a new augmentation method that is guaranteed to keep the new
data within the original data manifold, thanks to optimal transport theory.
The proposed algorithm finds cliques in the nearest-neighbors graph and, at
each sampling iteration, randomly draws one clique to compute the Wasserstein
barycenter with random uniform weights. These barycenters then become the new
natural-looking elements that one could add to the dataset. We apply this
approach to the problem of landmarks detection and augment the available
annotation in both unpaired and semi-supervised scenarios. Additionally, the
idea is validated on cardiac data for the task of medical segmentation. Our
approach reduces overfitting and improves the quality metrics beyond the
original-data baseline and beyond the results obtained with popular modern
augmentation methods.

Comment: 11 pages, 4 figures, 3 tables. I.B. and N.B. contributed equally.
D.V.D. is the corresponding author.
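A hedged numpy sketch of the sampling loop described above (illustrative, not the authors' code, and restricted to 3-cliques for simplicity): because landmark sets come with point-to-point correspondence, the 2-Wasserstein barycenter of a clique reduces to a weighted Euclidean mean of the landmark arrays, so the procedure needs only a k-NN graph, its triangles, and random barycentric weights:

```python
import numpy as np

def augment_landmarks(X, k=6, n_new=100, rng=None):
    """Barycentric oversampling sketch: draw random triangles (3-cliques)
    from the symmetrized k-NN graph of the landmark sets and return a
    random-weight barycenter of each drawn triangle. For corresponding
    landmarks the W2 barycenter is the weighted mean of the arrays."""
    rng = rng or np.random.default_rng()
    n = len(X)
    flat = X.reshape(n, -1)
    D = ((flat[:, None] - flat[None, :]) ** 2).sum(-1)
    nn = np.argsort(D, axis=1)[:, 1:k + 1]        # k nearest neighbours
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n):
        adj[i, nn[i]] = True
    adj |= adj.T                                  # symmetrize the k-NN graph
    tris = [(i, j, l) for i in range(n)
            for j in range(i + 1, n) if adj[i, j]
            for l in range(j + 1, n) if adj[i, l] and adj[j, l]]
    assert tris, "no 3-cliques found; increase k"
    out = []
    for _ in range(n_new):
        tri = tris[rng.integers(len(tris))]       # draw one clique
        w = rng.dirichlet(np.ones(3))             # random barycentric weights
        out.append(np.tensordot(w, X[list(tri)], axes=1))
    return np.stack(out)

rng = np.random.default_rng(0)
base = rng.normal(size=(10, 2))                      # a template of 10 landmarks
X = base[None] + 0.3 * rng.normal(size=(30, 10, 2))  # 30 jittered landmark sets
new = augment_landmarks(X, rng=rng)
print(new.shape)                                     # 100 new sets, same layout
```

Because every new sample is a convex combination of nearby real samples, it stays inside the convex hull of its clique, which is the "stay on the data manifold" guarantee the abstract emphasizes.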