Geometric deep learning: going beyond Euclidean data
Many scientific fields study data with an underlying structure that is a
non-Euclidean space. Some examples include social networks in computational
social sciences, sensor networks in communications, functional networks in
brain imaging, regulatory networks in genetics, and meshed surfaces in computer
graphics. In many applications, such geometric data are large and complex (in
the case of social networks, on the scale of billions), and are natural targets
for machine learning techniques. In particular, we would like to use deep
neural networks, which have recently proven to be powerful tools for a broad
range of problems from computer vision, natural language processing, and audio
analysis. However, these tools have been most successful on data with an
underlying Euclidean or grid-like structure, and in cases where the invariances
of these structures are built into networks used to model them. Geometric deep
learning is an umbrella term for emerging techniques attempting to generalize
(structured) deep neural models to non-Euclidean domains such as graphs and
manifolds. The purpose of this paper is to overview different examples of
geometric deep learning problems and present available solutions, key
difficulties, applications, and future research directions in this nascent
field.
Efficient Deformable Shape Correspondence via Kernel Matching
We present a method to match three dimensional shapes under non-isometric
deformations, topology changes and partiality. We formulate the problem as
matching between a set of pair-wise and point-wise descriptors, imposing a
continuity prior on the mapping, and propose a projected descent optimization
procedure inspired by difference of convex functions (DC) programming.
Surprisingly, in spite of the highly non-convex nature of the resulting
quadratic assignment problem, our method converges to a semantically meaningful
and continuous mapping in most of our experiments, and scales well. We provide
preliminary theoretical analysis and several interpretations of the method.
Comment: Accepted for oral presentation at 3DV 2017, including supplementary material.
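The optimization procedure above is inspired by difference-of-convex (DC) programming: write the objective as a difference of two convex functions, linearize the subtracted one at the current iterate, and minimize the resulting convex surrogate. A toy one-dimensional sketch (illustrative only, not the paper's quadratic-assignment solver):

```python
import numpy as np

def dca_minimize(x0, iters=50):
    """Toy DCA for f(x) = x**4 - 2*x**2, split as g(x) = x**4 (convex)
    minus h(x) = 2*x**2 (convex). Each step linearizes h at x_k and
    minimizes the convex surrogate g(x) - h'(x_k)*x = x**4 - 4*x_k*x,
    whose stationarity condition 4*x**3 = 4*x_k gives x_{k+1} = cbrt(x_k)."""
    x = x0
    for _ in range(iters):
        x = np.cbrt(x)
    return x
```

Starting from any nonzero point, the iterates converge to one of the two global minimizers x = ±1 of the nonconvex f, illustrating how DC iterations can land on good stationary points despite nonconvexity.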
Unified Heat Kernel Regression for Diffusion, Kernel Smoothing and Wavelets on Manifolds and Its Application to Mandible Growth Modeling in CT Images
We present a novel kernel regression framework for smoothing scalar surface
data using the Laplace-Beltrami eigenfunctions. Starting with the heat kernel
constructed from the eigenfunctions, we formulate a new bivariate kernel
regression framework as a weighted eigenfunction expansion with the heat kernel
as the weights. The new kernel regression is mathematically equivalent to
isotropic heat diffusion, kernel smoothing and recently popular diffusion
wavelets. Unlike many previous partial differential equation based approaches
involving diffusion, our approach represents the solution of diffusion
analytically, reducing numerical inaccuracy and slow convergence. The numerical
implementation is validated on a unit sphere using spherical harmonics. As an
illustration, we have applied the method in characterizing the localized growth
pattern of mandible surfaces obtained in CT images from subjects between ages 0
and 20 years by regressing the length of displacement vectors with respect to
the template surface.
Comment: Accepted in Medical Image Analysis.
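In the discrete setting, the weighted eigenfunction expansion described above amounts to projecting the signal onto the Laplacian eigenbasis and rescaling each coefficient by the heat-kernel weight exp(-λt). A minimal sketch, using a cycle graph as a stand-in for a closed 1D manifold (the toy domain and function names are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def heat_kernel_smooth(L, f, t, k=None):
    """Smooth signal f by weighting its Laplacian eigenfunction
    expansion with heat-kernel coefficients exp(-lambda * t)."""
    lam, psi = np.linalg.eigh(L)       # eigenpairs of the Laplacian
    if k is not None:                  # optional spectral truncation
        lam, psi = lam[:k], psi[:, :k]
    coeffs = psi.T @ f                 # project f onto the eigenbasis
    return psi @ (np.exp(-t * lam) * coeffs)

# Toy domain: cycle graph as a discrete closed curve
n = 64
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, -1] = L[-1, 0] = -1               # periodic boundary

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.sin(x) + 0.3 * rng.standard_normal(n)

f_smooth = heat_kernel_smooth(L, f, t=1.0)
```

Because the solution is represented analytically in the eigenbasis, no time-stepping of the diffusion PDE is needed; low-frequency content (small λ) is nearly preserved while high-frequency noise is damped by exp(-λt).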
An interactive analysis of harmonic and diffusion equations on discrete 3D shapes
Recent results in geometry processing have shown that shape segmentation, comparison, and analysis can be successfully addressed through the spectral properties of the Laplace–Beltrami operator, which is involved in the harmonic equation, the Laplacian eigenproblem, the heat diffusion equation, and the definition of spectral distances, such as the bi-harmonic, commute-time, and diffusion distances. In this paper, we study the discretization and the main properties of the solutions to these equations on 3D surfaces and their applications to shape analysis. Among the main factors that influence their computation, as well as the corresponding distances, we focus our attention on the choice of different Laplacian matrices, initial boundary conditions, and input shapes. These degrees of freedom motivate our choice to address this study through an executable paper, which allows the user to perform a large set of experiments and select his or her own parameters. Finally, we represent these distances in a unified way and provide a simple procedure to generate new distances on 3D shapes.
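The spectral distances mentioned above all share one recipe: embed each point by its Laplacian eigenfunctions with eigenvalue-dependent weights and take Euclidean distances in that embedding. A hedged sketch of the diffusion distance, d_t(x, y)² = Σᵢ exp(-2tλᵢ)(ψᵢ(x) − ψᵢ(y))², on a toy path graph (the dense eigendecomposition and example graph are illustrative assumptions):

```python
import numpy as np

def diffusion_distance(L, t):
    """All-pairs diffusion distances from the Laplacian spectrum:
    d_t(x, y)^2 = sum_i exp(-2 t lambda_i) (psi_i(x) - psi_i(y))^2."""
    lam, psi = np.linalg.eigh(L)
    emb = psi * np.exp(-t * lam)        # heat-weighted spectral embedding
    diff = emb[:, None, :] - emb[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

# Toy example: path graph on 6 nodes
n = 6
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1                 # endpoint degrees
D = diffusion_distance(L, t=1.0)
```

Swapping the weight exp(-2tλ) for 1/λ² (on the nonzero spectrum) would give the bi-harmonic distance in the same unified template.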
Steklov Spectral Geometry for Extrinsic Shape Analysis
We propose using the Dirichlet-to-Neumann operator as an extrinsic
alternative to the Laplacian for spectral geometry processing and shape
analysis. Intrinsic approaches, usually based on the Laplace-Beltrami operator,
cannot capture the spatial embedding of a shape up to rigid motion, and many
previous extrinsic methods lack theoretical justification. Instead, we consider
the Steklov eigenvalue problem, computing the spectrum of the
Dirichlet-to-Neumann operator of a surface bounding a volume. A remarkable
property of this operator is that it completely encodes volumetric geometry. We
use the boundary element method (BEM) to discretize the operator, accelerated
by hierarchical numerical schemes and preconditioning; this pipeline allows us
to solve eigenvalue and linear problems on large-scale meshes despite the
density of the Dirichlet-to-Neumann discretization. We further demonstrate that
our operators naturally fit into existing frameworks for geometry processing,
making a shift from intrinsic to extrinsic geometry as simple as substituting
the Laplace-Beltrami operator with the Dirichlet-to-Neumann operator.
Comment: Additional experiments added.
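A standard discrete analogue of the Dirichlet-to-Neumann operator (a toy stand-in for the paper's BEM pipeline, not its implementation) is the Schur complement of a volumetric Laplacian onto its boundary nodes: given boundary values, it returns the normal flux of their harmonic extension. A minimal sketch:

```python
import numpy as np

def dtn_operator(L, boundary):
    """Discrete Dirichlet-to-Neumann map: the Schur complement of a
    (graph) Laplacian onto its boundary nodes. Applied to boundary
    values, it yields the flux of the interior harmonic extension."""
    n = L.shape[0]
    b = np.asarray(boundary)
    i = np.setdiff1d(np.arange(n), b)
    solve = np.linalg.solve(L[np.ix_(i, i)], L[np.ix_(i, b)])
    return L[np.ix_(b, b)] - L[np.ix_(b, i)] @ solve

# Path graph with its two endpoints as the "boundary"
n = 5
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1
S = dtn_operator(L, [0, n - 1])
```

The resulting operator is symmetric positive semidefinite with constants in its kernel, so its eigenvalues are discrete Steklov eigenvalues; eigensolvers can be pointed at S exactly where they would otherwise take the Laplacian.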
NetLSD: Hearing the Shape of a Graph
Comparison among graphs is ubiquitous in graph analytics. However, it is a
hard task in terms of the expressiveness of the employed similarity measure and
the efficiency of its computation. Ideally, graph comparison should be
invariant to the order of nodes and the sizes of compared graphs, adaptive to
the scale of graph patterns, and scalable. Unfortunately, these properties have
not been addressed together. Graph comparisons still rely on direct approaches,
graph kernels, or representation-based methods, which are all inefficient and
impractical for large graph collections.
In this paper, we propose the Network Laplacian Spectral Descriptor (NetLSD):
the first, to our knowledge, permutation- and size-invariant, scale-adaptive,
and efficiently computable graph representation method that allows for
straightforward comparisons of large graphs. NetLSD extracts a compact
signature that inherits the formal properties of the Laplacian spectrum,
specifically its heat or wave kernel; thus, it hears the shape of a graph. Our
evaluation on a variety of real-world graphs demonstrates that it outperforms
previous works in both expressiveness and efficiency.
Comment: KDD '18: The 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, August 19–23, 2018, London, United Kingdom.
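The heat-kernel variant of the signature described above is the heat trace h(t) = tr(exp(-tL)) = Σᵢ exp(-tλᵢ), sampled over a grid of diffusion times. A simplified sketch (raw heat trace on the normalized Laplacian, without NetLSD's size normalization or eigenvalue approximation for large graphs):

```python
import numpy as np

def heat_trace_signature(A, ts):
    """Heat trace h(t) = trace(exp(-t L)) over a time grid, from the
    eigenvalues of the symmetric normalized Laplacian of adjacency A."""
    d = A.sum(1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    lam = np.linalg.eigvalsh(L)
    return np.array([np.exp(-t * lam).sum() for t in ts])

# Two isomorphic triangles get identical signatures
A1 = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)
perm = [2, 0, 1]
A2 = A1[np.ix_(perm, perm)]
ts = np.logspace(-2, 2, 32)
s1 = heat_trace_signature(A1, ts)
s2 = heat_trace_signature(A2, ts)
```

Because the signature depends only on the spectrum, it is permutation-invariant by construction, and small t probes local structure (h(t) → number of nodes as t → 0) while large t reflects global connectivity.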