229 research outputs found
LOCA: LOcal Conformal Autoencoder for standardized data coordinates
We propose a deep-learning based method for obtaining standardized data
coordinates from scientific measurements. Data observations are modeled as
samples from an unknown, non-linear deformation of an underlying Riemannian
manifold, which is parametrized by a few normalized latent variables. By
leveraging a repeated measurement sampling strategy, we present a method for
learning an embedding that is isometric to the latent
variables of the manifold. These data coordinates, being invariant under smooth
changes of variables, enable matching between different instrumental
observations of the same phenomenon. Our embedding is obtained using a LOcal
Conformal Autoencoder (LOCA), an algorithm that constructs an embedding to
rectify deformations by using a local z-scoring procedure while preserving
relevant geometric information. We demonstrate the isometric embedding
properties of LOCA on various model settings and observe that it exhibits
promising interpolation and extrapolation capabilities. Finally, we apply LOCA
to single-site Wi-Fi localization data, and to estimating a curved surface
from a lower-dimensional projection.
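The core of LOCA's "local z-scoring" idea is that each burst of repeated measurements should, after embedding, look like isotropic noise of a known scale. A minimal sketch of such a whitening objective (the function name, loss form, and burst layout are illustrative assumptions, not the paper's exact training loss):

```python
import numpy as np

def local_whitening_loss(embeddings, sigma=0.1):
    """Illustrative local z-scoring objective in the spirit of LOCA.

    embeddings: array of shape (n_bursts, n_samples, d) -- the encoder's
    output for each repeated-measurement 'burst'. The loss pushes the
    empirical covariance of every burst toward sigma^2 * I, so the learned
    coordinates are locally whitened (conformal).
    """
    n_bursts, n_samples, d = embeddings.shape
    loss = 0.0
    for burst in embeddings:
        centered = burst - burst.mean(axis=0)
        cov = centered.T @ centered / (n_samples - 1)    # (d, d) covariance
        loss += np.sum((cov / sigma**2 - np.eye(d))**2)  # Frobenius penalty
    return loss / n_bursts

# Bursts already sampled with isotropic noise of scale sigma incur near-zero
# loss; bursts with the wrong scale are penalized heavily.
rng = np.random.default_rng(0)
good = rng.normal(scale=0.1, size=(5, 1000, 2))
bad = rng.normal(scale=0.5, size=(5, 1000, 2))
assert local_whitening_loss(good, sigma=0.1) < local_whitening_loss(bad, sigma=0.1)
```

An autoencoder trained with such a term alongside a reconstruction loss is driven toward an embedding that undoes the unknown deformation locally, burst by burst.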
Hyperbolic Interaction Model For Hierarchical Multi-Label Classification
Unlike traditional classification tasks, which assume mutually exclusive
labels, hierarchical multi-label classification (HMLC) aims to assign
multiple labels to every instance, with the labels organized under
hierarchical relations. Besides the labels, since linguistic ontologies are
intrinsic hierarchies, the conceptual relations between words can also form
hierarchical structures. Thus it can be a challenge to learn mappings from word
hierarchies to label hierarchies. We propose to model the word and label
hierarchies by embedding them jointly in the hyperbolic space. The main reason
is that the tree-likeness of the hyperbolic space matches the complexity of
symbolic data with hierarchical structures. A new Hyperbolic Interaction Model
(HyperIM) is designed to learn the label-aware document representations and
make predictions for HMLC. Extensive experiments are conducted on three
benchmark datasets. The results demonstrate that the new model realistically
captures complex data structures and improves performance on HMLC compared
with state-of-the-art methods. To facilitate future research, our code is
publicly available.
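The "tree-likeness" of hyperbolic space comes from its distance function: distances blow up near the boundary of the Poincaré ball, leaving exponential room for children of a hierarchy to fan out. A sketch of the standard Poincaré-ball distance commonly used for such hierarchy embeddings (whether HyperIM uses exactly this model is an assumption here):

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Distance in the Poincare ball model of hyperbolic space.

    u, v: points with ||u||, ||v|| < 1. The eps guards against division
    by zero for points numerically on the boundary.
    """
    sq = np.sum((u - v) ** 2)
    denom = (1 - np.sum(u**2)) * (1 - np.sum(v**2))
    return np.arccosh(1 + 2 * sq / (denom + eps))

origin = np.zeros(2)
midpoint = np.array([0.5, 0.0])
near_boundary = np.array([0.999, 0.0])
# Hyperbolic distance grows much faster near the boundary than Euclidean
# distance would suggest -- this is what makes the space 'tree-like'.
assert poincare_distance(origin, near_boundary) > 2 * poincare_distance(origin, midpoint)
```

Embedding both the word hierarchy and the label hierarchy in one such ball lets word-label interactions be scored by these distances.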
Calculating Sparse and Dense Correspondences for Near-Isometric Shapes
Comparing and analysing digital models are basic techniques of geometric shape processing. These techniques have a variety of applications, such as extracting the domain knowledge contained in the growing number of digital models to simplify shape modelling. Another example is the analysis of real-world objects, which itself has many applications, such as medical examinations, medical and agricultural research, and infrastructure maintenance. As methods to digitalize physical objects mature, any advance in the analysis of digital shapes leads to progress in the analysis of real-world objects.

Global shape properties, like volume and surface area, are simple to compare but contain only very limited information. Much more information is contained in local shape differences, such as where and how a plant grew. Unfortunately, the computation of local shape differences is hard, as it requires knowledge of corresponding point pairs, i.e. points on both shapes that correspond to each other. This article thesis (cumulative dissertation) discusses several recent publications on the computation of corresponding points:
- Geodesic distances between points, i.e. distances along the surface, are fundamental for several shape processing tasks as well as several shape matching techniques. Chapter 3 introduces and analyses fast and accurate bounds on geodesic distances.
- When building a shape space on a set of shapes, misaligned correspondences lead to points moving along the surfaces and, ultimately, to a larger shape space. Chapter 4 shows that this also works the other way around: good correspondences are obtained by optimizing them to generate a compact shape space.
- Representing correspondences with a "functional map" has a variety of advantages. Chapter 5 shows that representing the correspondence map as an alignment of Green's functions of the Laplace operator has similar advantages, but is much less dependent on the number of eigenvectors used for the computations.
- Quadratic assignment problems were recently shown to reliably yield sparse correspondences. Chapter 6 compares state-of-the-art convex relaxations from graphics and vision with methods from discrete optimization on typical quadratic assignment problems emerging in shape matching.
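A simple, well-known upper bound on geodesic distance, useful as a baseline for the kind of bounds Chapter 3 studies, is the shortest path along mesh edges via Dijkstra's algorithm (the thesis' actual bounds are sharper; this is only an illustrative sketch with a hypothetical edge-list representation):

```python
import heapq

def dijkstra_upper_bound(n_vertices, edges, source):
    """Shortest path along mesh edges. Edge paths can only be at least as
    long as the true surface geodesic, so this yields an upper bound on
    geodesic distance from `source` to every vertex.

    edges: dict {v: [(neighbor, edge_length), ...]}.
    """
    dist = {v: float("inf") for v in range(n_vertices)}
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist[v]:
            continue  # stale heap entry
        for w, length in edges.get(v, []):
            nd = d + length
            if nd < dist[w]:
                dist[w] = nd
                heapq.heappush(heap, (nd, w))
    return dist

# Tiny square mesh of unit edges: the edge-path bound from vertex 0 to the
# opposite corner 2 is 2.0, even though a flat surface would allow ~1.41.
edges = {0: [(1, 1.0), (3, 1.0)], 1: [(0, 1.0), (2, 1.0)],
         2: [(1, 1.0), (3, 1.0)], 3: [(0, 1.0), (2, 1.0)]}
d = dijkstra_upper_bound(4, edges, source=0)
assert d[2] == 2.0
```

The gap between such edge-path bounds and the true geodesic is exactly what faster, more accurate bounds aim to close.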
Learning shape correspondence with anisotropic convolutional neural networks
Establishing correspondence between shapes is a fundamental problem in
geometry processing, arising in a wide variety of applications. The problem is
especially difficult in the setting of non-isometric deformations, as well as
in the presence of topological noise and missing parts, mainly due to the
limited capability to model such deformations axiomatically. Several recent
works showed that invariance to complex shape transformations can be learned
from examples. In this paper, we introduce an intrinsic convolutional neural
network architecture based on anisotropic diffusion kernels, which we term
Anisotropic Convolutional Neural Network (ACNN). In our construction, we
generalize convolutions to non-Euclidean domains by constructing a set of
oriented anisotropic diffusion kernels, creating in this way a local intrinsic
polar representation of the data (`patch'), which is then correlated with a
filter. Several cascades of such filters, linear, and non-linear operators are
stacked to form a deep neural network whose parameters are learned by
minimizing a task-specific cost. We use ACNNs to effectively learn intrinsic
dense correspondences between deformable shapes in very challenging settings,
achieving state-of-the-art results on some of the most difficult recent
correspondence benchmarks.
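The construction above, oriented anisotropic kernels correlated with a learned filter, can be mimicked in flat local coordinates. The sketch below is a toy planar analogue (the real ACNN builds these kernels intrinsically from the anisotropic Laplace-Beltrami operator on the surface; all function names and shapes here are illustrative assumptions):

```python
import numpy as np

def anisotropic_kernels(n_orientations=4, aniso=3.0, size=7, sigma=1.0):
    """Bank of oriented anisotropic Gaussian kernels on a local 2-D patch
    grid: each kernel is a Gaussian stretched along one direction and
    rotated to a different orientation."""
    ax = np.arange(size) - size // 2
    xs, ys = np.meshgrid(ax, ax)
    kernels = []
    for k in range(n_orientations):
        theta = np.pi * k / n_orientations
        # Rotate coordinates, then stretch one axis by the anisotropy factor.
        xr = xs * np.cos(theta) + ys * np.sin(theta)
        yr = -xs * np.sin(theta) + ys * np.cos(theta)
        g = np.exp(-(xr**2 / aniso + yr**2 * aniso) / (2 * sigma**2))
        kernels.append(g / g.sum())  # normalize each kernel to sum to 1
    return np.stack(kernels)        # (n_orientations, size, size)

def acnn_response(patch, kernels, weights):
    """One ACNN-style unit: correlate a local patch with each oriented
    kernel, then mix the per-orientation responses with learned weights."""
    responses = np.tensordot(kernels, patch, axes=([1, 2], [0, 1]))
    return responses @ weights

kernels = anisotropic_kernels()
patch = np.random.default_rng(1).normal(size=(7, 7))
out = acnn_response(patch, kernels, weights=np.ones(4) / 4)
assert np.isfinite(out)
```

Stacking many such units with nonlinearities, and replacing the flat grid with intrinsic surface patches, gives the deep network whose parameters are fit to the correspondence cost.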
Stable Teichmüller quasigeodesics and ending laminations
We characterize which cobounded quasigeodesics in the Teichmüller space T of
a closed surface are at bounded distance from a geodesic. More generally, given
a cobounded Lipschitz path gamma in T, we show that gamma is a quasigeodesic
with finite Hausdorff distance from some geodesic if and only if the canonical
hyperbolic plane bundle over gamma is a hyperbolic metric space. As an
application, for complete hyperbolic 3-manifolds N with finitely generated,
freely indecomposable fundamental group and with bounded geometry, we give a
new construction of model geometries for the geometrically infinite ends of N,
a key step in Minsky's proof of Thurston's ending lamination conjecture for
such manifolds.
Comment: Published in Geometry and Topology at
http://www.maths.warwick.ac.uk/gt/GTVol7/paper2.abs.htm
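For reference, the quasigeodesic and Hausdorff-distance conditions invoked above are the standard ones (a minimal restatement, not quoted from the paper): a path $\gamma : I \to \mathcal{T}$ is a $(K, C)$-quasigeodesic if for all $s, t \in I$,

$$\frac{1}{K}\,|s - t| - C \;\le\; d_{\mathcal{T}}\bigl(\gamma(s), \gamma(t)\bigr) \;\le\; K\,|s - t| + C,$$

and two subsets $A, B \subset \mathcal{T}$ are at finite Hausdorff distance if each lies in a bounded neighborhood of the other.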
Kernel learning over the manifold of symmetric positive definite matrices for dimensionality reduction in a BCI application
In this paper, we propose a kernel for nonlinear dimensionality reduction over the manifold of Symmetric Positive Definite (SPD) matrices in a Motor Imagery (MI)-based Brain-Computer Interface (BCI) application. The proposed kernel, which is based on Riemannian geometry, tries to preserve the topology of data points in the feature space. Topology preservation is the main challenge in nonlinear dimensionality reduction (NLDR). Our main idea is to decrease the non-Euclidean characteristics of the manifold by modifying the volume elements. We apply a conformal transform to a data-dependent isometric mapping, reducing the negative eigen-fraction, to learn a data-dependent kernel over the Riemannian manifold. Multiple experiments were carried out using the proposed kernel for dimensionality reduction of SPD matrices that describe the EEG signals of dataset IIa from BCI competition IV.
The experiments show that this kernel adapts to the input data and leads to promising results in comparison with the most popular manifold learning methods and the Common Spatial Pattern (CSP) technique, a reference algorithm in BCI competitions. The proposed kernel is robust, particularly in cases where the data points have a complex, nonlinearly separable distribution.
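The Riemannian geometry underlying such pipelines is usually the affine-invariant metric on SPD matrices. A sketch of that distance (this is the standard geometry such kernels build on; the paper's specific kernel is not reproduced here):

```python
import numpy as np

def spd_logm(S):
    """Matrix logarithm of a symmetric positive definite matrix via
    eigendecomposition (all eigenvalues are > 0, so log is well-defined)."""
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices:
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F."""
    w, V = np.linalg.eigh(A)
    A_inv_sqrt = (V * (1.0 / np.sqrt(w))) @ V.T
    M = A_inv_sqrt @ B @ A_inv_sqrt  # symmetric positive definite
    return np.linalg.norm(spd_logm(M), "fro")

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 3.0]])
C = np.array([[1.5, -0.3], [-0.3, 2.0]])  # any invertible matrix
assert np.isclose(airm_distance(A, A), 0.0)
# Affine invariance: congruence by C leaves the distance unchanged,
# which is why EEG covariance comparisons are robust to linear mixing.
assert np.isclose(airm_distance(C @ A @ C.T, C @ B @ C.T), airm_distance(A, B))
```

The invariance under congruence is what makes this metric attractive for EEG covariance matrices: a fixed linear mixing of the sensors does not change the distances between trials.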