Geometric deep learning: going beyond Euclidean data
Many scientific fields study data with an underlying structure that is a
non-Euclidean space. Some examples include social networks in computational
social sciences, sensor networks in communications, functional networks in
brain imaging, regulatory networks in genetics, and meshed surfaces in computer
graphics. In many applications, such geometric data are large and complex (in
the case of social networks, on the scale of billions), and are natural targets
for machine learning techniques. In particular, we would like to use deep
neural networks, which have recently proven to be powerful tools for a broad
range of problems in computer vision, natural language processing, and audio
analysis. However, these tools have been most successful on data with an
underlying Euclidean or grid-like structure, and in cases where the invariances
of these structures are built into networks used to model them. Geometric deep
learning is an umbrella term for emerging techniques attempting to generalize
(structured) deep neural models to non-Euclidean domains such as graphs and
manifolds. The purpose of this paper is to provide an overview of different
examples of geometric deep learning problems and to present available
solutions, key difficulties, applications, and future research directions in
this nascent field.
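The central idea of generalizing convolutions to graphs can be illustrated with a single graph-convolution layer: aggregate each node's neighborhood, then apply a shared linear map. The following is a minimal sketch only, not any specific architecture from the survey; the symmetric normalization, self-loops, and ReLU are illustrative assumptions.

```python
import numpy as np

def graph_convolution(A, X, W):
    """One propagation step of a simple graph convolution: aggregate
    each node's (normalized) neighborhood, then apply a shared linear
    map -- the graph analogue of a convolutional filter."""
    n = A.shape[0]
    A_hat = A + np.eye(n)                     # add self-loops
    deg = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ X @ W, 0.0)    # ReLU nonlinearity

# Toy graph: 3 nodes in a path, 2 input features, 2 output features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.random.default_rng(0).normal(size=(3, 2))
W = np.random.default_rng(1).normal(size=(2, 2))
H = graph_convolution(A, X, W)
print(H.shape)  # (3, 2)
```

Because the same weight matrix W is applied at every node, the layer is equivariant to node reorderings, which is the invariance the Euclidean case gets for free from the grid.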
Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'14)
The implicit objective of the biennial "international Traveling Workshop on
Interactions between Sparse models and Technology" (iTWIST) is to foster
collaboration between international scientific teams by disseminating ideas
through both specific oral/poster presentations and free discussions. For its
second edition, the iTWIST workshop took place in the medieval and picturesque
town of Namur in Belgium, from Wednesday August 27th till Friday August 29th,
2014. The workshop was conveniently located in "The Arsenal" building within
walking distance of both hotels and the town center. iTWIST'14 gathered about
70 international participants and featured 9 invited talks, 10 oral
presentations, and 14 posters on the following themes, all related to the
theory, application and generalization of the "sparsity paradigm":
Sparsity-driven data sensing and processing; Union of low dimensional
subspaces; Beyond linear and convex inverse problems; Matrix/manifold/graph
sensing/processing; Blind inverse problems and dictionary learning; Sparsity
and computational neuroscience; Information theory, geometry and randomness;
Complexity/accuracy tradeoffs in numerical methods; Sparsity? What's next?;
Sparse machine learning and inference.
Comment: 69 pages, 24 extended abstracts, iTWIST'14 website:
http://sites.google.com/site/itwist1
A relaxed approach for curve matching with elastic metrics
In this paper we study a class of Riemannian metrics on the space of
unparametrized curves and develop a method to compute geodesics with given
boundary conditions. It extends previous works on this topic in several
important ways. The model and resulting matching algorithm integrate within one
common setting both the family of H^2-metrics with constant coefficients and
scale-invariant H^2-metrics on both open and closed immersed curves.
families include as particular cases the class of first-order elastic metrics.
An essential difference with prior approaches is the way that boundary
constraints are dealt with. By leveraging varifold-based similarity metrics we
propose a relaxed variational formulation for the matching problem that avoids
the necessity of optimizing over the reparametrization group. Furthermore, we
show that we can also quotient out finite-dimensional similarity groups such as
translation, rotation and scaling groups. The different properties and
advantages are illustrated through numerical examples in which we also provide
a comparison with related diffeomorphic methods used in shape registration.
Comment: 27 pages
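The relaxation rests on comparing curves as varifolds, which is insensitive to how the curves are parametrized. As a rough sketch of this idea (not the paper's actual kernel or algorithm; the Gaussian position kernel, squared-cosine direction term, and all parameter names are assumptions), each polyline edge contributes its midpoint, unit tangent, and length, and two curves are compared through a kernel inner product on these edge measures:

```python
import numpy as np

def varifold_inner(c1, c2, sigma=0.5):
    """Kernel inner product of two polylines viewed as discrete varifolds.
    Each edge contributes (midpoint, unit tangent, length); pairs of edges
    interact through a Gaussian kernel on midpoints times the squared
    cosine of the angle between tangents (orientation-invariant)."""
    def edge_data(c):
        e = c[1:] - c[:-1]
        lengths = np.linalg.norm(e, axis=1)
        tangents = e / lengths[:, None]
        mids = 0.5 * (c[1:] + c[:-1])
        return mids, tangents, lengths
    x, t, a = edge_data(c1)
    y, s, b = edge_data(c2)
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    k_pos = np.exp(-d2 / sigma**2)
    k_dir = (t @ s.T) ** 2
    return (a[:, None] * b[None, :] * k_pos * k_dir).sum()

def varifold_dist2(c1, c2, sigma=0.5):
    """Squared varifold distance, expanded via the kernel inner product."""
    return (varifold_inner(c1, c1, sigma)
            - 2.0 * varifold_inner(c1, c2, sigma)
            + varifold_inner(c2, c2, sigma))

# The same parabola sampled two different, non-uniform ways: the
# distance stays small despite the mismatched parametrizations.
t_a = np.linspace(0, 1, 20)
t_b = np.linspace(0, 1, 33) ** 1.3
curve_a = np.stack([t_a, t_a**2], axis=1)
curve_b = np.stack([t_b, t_b**2], axis=1)
print(varifold_dist2(curve_a, curve_b))
```

Because the distance depends only on the edge measures, matching against a target never requires optimizing over the reparametrization group, which is exactly the relaxation the abstract describes.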
Piecewise rigid curve deformation via a Finsler steepest descent
This paper introduces a novel steepest descent flow in Banach spaces. This
extends previous works on generalized gradient descent, notably the work of
Charpiat et al., to the setting of Finsler metrics. Such a generalized gradient
allows one to take into account a prior on deformations (e.g., piecewise rigid)
in order to favor some specific evolutions. We define a Finsler gradient
descent method to minimize a functional defined on a Banach space and we prove
a convergence theorem for such a method. In particular, we show that the use of
non-Hilbertian norms on Banach spaces is useful to study non-convex
optimization problems where the geometry of the space might play a crucial role
to avoid poor local minima. We show some applications to the curve matching
problem. In particular, we characterize piecewise rigid deformations on the
space of curves and we study several models to perform piecewise rigid
evolution of curves.
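The effect of changing the norm on the descent flow can be seen already in finite dimensions, far from the Banach-space setting of the paper. In the following toy sketch (illustrative only; the step schedule and problem are assumptions), steepest descent with respect to the l1 norm replaces the Euclidean gradient direction with a signed coordinate direction, because the minimizer of <g, v> over the l1 unit ball is a vertex of that ball:

```python
import numpy as np

def l1_steepest_descent(grad_f, x0, step=0.1, iters=200):
    """Steepest descent with respect to the l1 norm instead of the usual
    Euclidean (Hilbert) norm.  The direction minimizing <g, v> over
    ||v||_1 <= 1 is -sign(g_i) e_i for i = argmax |g_i|, so each step
    updates only the coordinate with the largest gradient magnitude
    (greedy coordinate descent): changing the norm changes the flow."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        g = grad_f(x)
        i = np.argmax(np.abs(g))
        x[i] -= step * np.sign(g[i])
        step *= 0.99   # shrink the step so the iterates settle
    return x

# Quadratic test problem f(x) = 0.5 * ||x - target||^2.
target = np.array([1.0, -2.0, 0.5])
grad = lambda x: x - target
x_star = l1_steepest_descent(grad, np.zeros(3))
print(x_star)
```

The iterates move along axis-parallel segments rather than straight toward the minimizer, a one-line analogue of how a Finsler metric can bias the evolution toward preferred deformations such as piecewise rigid ones.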
Perturbation of the Eigenvectors of the Graph Laplacian: Application to Image Denoising
The original contributions of this paper are twofold: a new understanding of
the influence of noise on the eigenvectors of the graph Laplacian of a set of
image patches, and an algorithm to estimate a denoised set of patches from a
noisy image. The algorithm relies on the following two observations: (1) the
low-index eigenvectors of the diffusion, or graph Laplacian, operators are very
robust to random perturbations of the weights and random changes in the
connections of the patch-graph; and (2) patches extracted from smooth regions
of the image are organized along smooth low-dimensional structures in the
patch-set, and therefore can be reconstructed with few eigenvectors.
Experiments demonstrate that our denoising algorithm outperforms the denoising
gold standards.
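The two observations above suggest a simple procedure: build the patch graph, take the low-index eigenvectors of its Laplacian, and project the noisy patches onto their span. The sketch below illustrates only this projection idea, not the paper's actual algorithm; the fully connected Gaussian-weight graph, the unnormalized Laplacian, and all parameters are assumptions.

```python
import numpy as np

def laplacian_denoise(patches, k_eig=5, sigma=1.0):
    """Project a set of patches onto the span of the low-index (smoothest)
    eigenvectors of the graph Laplacian of the patch graph.  Patches from
    smooth regions lie near low-dimensional structures spanned by few
    eigenvectors, so the projection suppresses the noise component."""
    # Fully connected patch graph with Gaussian weights.
    d2 = ((patches[:, None, :] - patches[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma**2))
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    L = D - W                          # unnormalized graph Laplacian
    eigvals, eigvecs = np.linalg.eigh(L)
    U = eigvecs[:, :k_eig]             # low-index eigenvectors
    return U @ (U.T @ patches)         # project patches onto their span

rng = np.random.default_rng(0)
clean = np.linspace(0, 1, 50)[:, None] * np.ones((1, 9))  # smooth "patches"
noisy = clean + 0.05 * rng.normal(size=clean.shape)
denoised = laplacian_denoise(noisy)
err_noisy = np.linalg.norm(noisy - clean)
err_denoised = np.linalg.norm(denoised - clean)
print(err_denoised < err_noisy)
```

Note that the eigenvectors here are computed from the *noisy* patch graph; the projection still helps, which is exactly the robustness property stated in observation (1).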
Kernel Spectral Curvature Clustering (KSCC)
Multi-manifold modeling is increasingly used in segmentation and data
representation tasks in computer vision and related fields. While the general
problem, modeling data by mixtures of manifolds, is very challenging, several
approaches exist for modeling data by mixtures of affine subspaces (which is
often referred to as hybrid linear modeling). We translate some important
instances of multi-manifold modeling to hybrid linear modeling in embedded
spaces, without explicitly performing the embedding but applying the kernel
trick. The resulting algorithm, Kernel Spectral Curvature Clustering, uses
kernels at two levels: both as an implicit embedding method to linearize
nonflat manifolds and as a principled method to convert a multiway affinity
problem into a spectral clustering one. We demonstrate the effectiveness of the
method by comparing it with other state-of-the-art methods on both synthetic
data and a real-world problem of segmenting multiple motions from two
perspective camera views.
Comment: accepted to 2009 ICCV Workshop on Dynamical Vision
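The two-level use of kernels can be illustrated on a toy problem. The sketch below is not the KSCC multiway-curvature construction itself; it shows only the underlying pattern (a Gaussian kernel affinity that implicitly linearizes two nonflat structures, followed by spectral clustering via the Fiedler vector of the normalized Laplacian), and the two-circles data, kernel width, and function names are assumptions.

```python
import numpy as np

def kernel_spectral_split(X, sigma=0.5):
    """Split points lying on two nonlinear structures: (1) a Gaussian
    kernel affinity makes each structure a tight cluster in the implicit
    embedding; (2) the sign of the Fiedler vector (second-smallest
    eigenvector of the normalized graph Laplacian) cuts the graph in two."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma**2))
    np.fill_diagonal(W, 0.0)
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    L_sym = np.eye(len(X)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    _, vecs = np.linalg.eigh(L_sym)
    return vecs[:, 1] > 0              # sign of the Fiedler vector

# Two concentric circles: linearly inseparable, but separable after
# the implicit kernel embedding.
t1 = np.linspace(0, 2 * np.pi, 40, endpoint=False)
t2 = np.linspace(0, 2 * np.pi, 80, endpoint=False)
X = np.concatenate([np.stack([np.cos(t1), np.sin(t1)], 1),
                    3.0 * np.stack([np.cos(t2), np.sin(t2)], 1)])
labels = kernel_spectral_split(X)
print((labels[:40] == labels[0]).all() and (labels[40:] == labels[40]).all()
      and labels[0] != labels[40])
```

No explicit embedding is ever formed: the kernel matrix W is all the algorithm touches, which is the "kernel trick" the abstract refers to.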