3D Normal Coordinate Systems for Cortical Areas
A surface-based diffeomorphic algorithm to generate 3D coordinate grids in
the cortical ribbon is described. In the grid, normal coordinate lines are
generated by the diffeomorphic evolution from the grey/white (inner) surface to
the grey/CSF (outer) surface. Specifically, the cortical ribbon is described by
two triangulated surfaces with open boundaries. Conceptually, the inner surface
sits on top of the white matter structure and the outer on top of the gray
matter. It is assumed that the cortical ribbon consists of cortical columns
which are orthogonal to the white matter surface. This might be viewed as a
consequence of the development of the columns in the embryo. It is also assumed
that the columns are orthogonal to the outer surface so that the resultant
vector field is orthogonal to the evolving surface. The distance traveled along
these normal lines as the inner surface evolves diffeomorphically towards the
outer one can then be construed as a measure of thickness. Applications are
described for the auditory cortices of human adults and of cats with normal
hearing or hearing loss. The approach offers great potential for cortical
morphometry.
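The thickness measure described above (march along a cortical column's normal line from the inner surface until the outer surface is crossed, accumulating arc length) can be illustrated with a deliberately simple toy geometry; the flat inner surface and sinusoidal outer surface below are illustrative assumptions and do not reproduce the paper's diffeomorphic algorithm:

```python
import math

def outer(x):
    # hypothetical grey/CSF (outer) surface, as height above the inner surface
    return 1.0 + 0.2 * math.sin(x)

def thickness_along_normal(x0, h=1e-3):
    """March from the flat inner surface along its normal line until the
    outer surface is crossed; the accumulated arc length is the thickness."""
    y, dist = 0.0, 0.0
    while y < outer(x0):
        y += h      # one small step along the normal line
        dist += h   # accumulate arc length travelled
    return dist

print(round(thickness_along_normal(0.0), 2))          # 1.0
print(round(thickness_along_normal(math.pi / 2), 2))  # 1.2
```

In the paper's setting the normal lines are curved because they stay orthogonal to the evolving surface; here the inner surface is flat, so each normal line is a straight vertical ray.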
Manifolds.jl: An Extensible Julia Framework for Data Analysis on Manifolds
For data given on a nonlinear space, like angles, symmetric positive definite
matrices, the sphere, or hyperbolic space, there is often enough structure to
form a Riemannian manifold. We present the Julia package Manifolds.jl,
providing a fast and easy-to-use library of Riemannian manifolds and Lie
groups. We introduce a common interface, available in ManifoldsBase.jl, with
which new manifolds, applications, and algorithms can be implemented. We
demonstrate the utility of Manifolds.jl using Bézier splines, an optimization
task on manifolds, and a principal component analysis on nonlinear data. In a
benchmark, Manifolds.jl outperforms existing packages in Matlab or Python by
several orders of magnitude and is about twice as fast as a comparable package
implemented in C++.
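Manifolds.jl itself is written in Julia and its API is not reproduced here; as a rough, library-free illustration of the core primitives such packages expose, here is a Python sketch of the exponential and logarithm maps on the unit sphere, one of the simplest Riemannian manifolds:

```python
import math

def dot(u, v): return sum(a * b for a, b in zip(u, v))
def norm(u): return math.sqrt(dot(u, u))

def sphere_exp(p, v):
    """Exponential map: follow the geodesic from p in the direction of the
    tangent vector v for arc length |v|."""
    t = norm(v)
    if t < 1e-12:
        return list(p)
    return [math.cos(t) * a + math.sin(t) * b / t for a, b in zip(p, v)]

def sphere_log(p, q):
    """Logarithm map: tangent vector at p pointing to q, with |log| equal
    to the geodesic distance d(p, q)."""
    c = max(-1.0, min(1.0, dot(p, q)))
    theta = math.acos(c)
    if theta < 1e-12:
        return [0.0] * len(p)
    w = [b - c * a for a, b in zip(p, q)]   # component of q orthogonal to p
    n = norm(w)
    return [theta * wi / n for wi in w]

p, q = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
v = sphere_log(p, q)
print(round(norm(v), 4))   # 1.5708, a quarter great circle
```

Applying `sphere_exp(p, v)` recovers `q` up to rounding; a library like Manifolds.jl provides exactly this exp/log pair, plus retractions, metrics, and group operations, behind a common interface for many manifolds.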
Mini-Workshop: Computational Optimization on Manifolds (online meeting)
The goal of the mini-workshop was to study the geometry, algorithms and applications of unconstrained and constrained optimization problems posed on Riemannian manifolds.
Focus topics included the geometry of particular manifolds, the formulation and analysis of a number of application problems, as well as novel algorithms and their implementation.
A Survey of Geometric Optimization for Deep Learning: From Euclidean Space to Riemannian Manifold
Although Deep Learning (DL) has achieved success in complex Artificial
Intelligence (AI) tasks, it suffers from various notorious problems (e.g.,
feature redundancy and vanishing or exploding gradients), since updating
parameters in Euclidean space cannot fully exploit the geometric structure of
the solution space. As a promising alternative, Riemannian-based DL uses
geometric optimization to update parameters on Riemannian manifolds and can
leverage the underlying geometric information. Accordingly, this article
presents a comprehensive survey of applying geometric optimization in DL.
First, it introduces the basic procedure of geometric optimization, including
various geometric optimizers and some concepts of Riemannian manifolds. It then
investigates the application of geometric optimization in different DL networks
across various AI tasks, e.g., convolutional neural networks, recurrent neural
networks, transfer learning, and optimal transport. Additionally, typical
public toolboxes that implement optimization on manifolds are discussed.
Finally, the article compares the performance of different deep geometric
optimization methods in image recognition scenarios.
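The basic procedure of geometric optimization (compute the Euclidean gradient, project it onto the tangent space, then retract back onto the manifold) can be sketched on a toy problem; the objective, target point, and step size below are illustrative choices, not taken from the survey:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

target = normalize([1.0, 2.0, 2.0])   # hypothetical optimum on the unit sphere

def riemannian_gradient_descent(x, lr=0.2, steps=200):
    """Minimize f(x) = -<x, target> over the unit sphere."""
    for _ in range(steps):
        egrad = [-t for t in target]                       # Euclidean gradient
        inner = sum(g * c for g, c in zip(egrad, x))
        rgrad = [g - inner * c for g, c in zip(egrad, x)]  # tangent projection
        x = normalize([c - lr * g for c, g in zip(x, rgrad)])  # retraction
    return x

x_opt = riemannian_gradient_descent(normalize([1.0, 0.0, 0.0]))
print([round(c, 3) for c in x_opt])   # [0.333, 0.667, 0.667], i.e. `target`
```

The same project-then-retract pattern underlies the Riemannian SGD and Adam variants the survey covers; only the manifold's projection and retraction formulas change.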
Numerical and Statistical Methods for Trajectory Analysis in a Riemannian Geometry Framework
This PhD thesis proposes new Riemannian geometry tools for the analysis of longitudinal observations of subjects with neuro-degenerative diseases. First, we propose a numerical scheme to compute parallel transport along geodesics and prove its convergence; the scheme is efficient as long as the co-metric can be computed efficiently. We then tackle the issue of Riemannian manifold learning, providing minimal theoretical sanity checks to illustrate that the procedure of Riemannian metric estimation can be relevant. Finally, we propose to learn a Riemannian manifold so as to model subjects' progressions as geodesics on that manifold, which allows fast inference, extrapolation, and classification of the subjects.
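The thesis's own scheme is not reproduced here, but the general idea of approximating parallel transport numerically can be sketched with Schild's ladder, a classical construction that chains small geodesic parallelograms; the sphere's closed-form exp/log maps below stand in for the general case:

```python
import math

def dot(u, v): return sum(a * b for a, b in zip(u, v))
def norm(u): return math.sqrt(dot(u, u))

def exp_map(p, v):
    t = norm(v)
    if t < 1e-12:
        return list(p)
    return [math.cos(t) * a + math.sin(t) * b / t for a, b in zip(p, v)]

def log_map(p, q):
    c = max(-1.0, min(1.0, dot(p, q)))
    theta = math.acos(c)
    if theta < 1e-12:
        return [0.0] * len(p)
    w = [b - c * a for a, b in zip(p, q)]
    n = norm(w)
    return [theta * wi / n for wi in w]

def schilds_ladder(p, q, v, n_rungs=100):
    """Transport tangent vector v from p to q along the connecting geodesic
    by chaining small geodesic parallelograms (Schild's ladder)."""
    u = log_map(p, q)
    w = list(v)
    for i in range(n_rungs):
        a = exp_map(p, [(i / n_rungs) * c for c in u])
        b = exp_map(p, [((i + 1) / n_rungs) * c for c in u])
        x1 = exp_map(a, w)                                   # top of the rung
        m = exp_map(x1, [0.5 * c for c in log_map(x1, b)])   # midpoint
        x2 = exp_map(a, [2.0 * c for c in log_map(a, m)])    # reflected corner
        w = log_map(b, x2)                    # transported vector, now at b
    return w

p, q = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
w = schilds_ladder(p, q, [0.0, 0.0, 0.1])
print([round(c, 3) for c in w])
```

Transporting the polar direction along the equator should leave it essentially unchanged (it is parallel along that geodesic), so the output stays close to `[0, 0, 0.1]`; refining `n_rungs` trades cost for accuracy, which is the kind of trade-off a convergence proof for such a scheme makes precise.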
Introduction to Riemannian Geometry and Geometric Statistics: from basic theory to implementation with Geomstats
As data is a predominant resource in applications, Riemannian geometry is a natural framework to model and unify complex nonlinear sources of data. However, the development of computational tools from the basic theory of Riemannian geometry is laborious. The work presented here forms one of the main contributions to the open-source project geomstats, a Python package providing efficient implementations of the concepts of Riemannian geometry and geometric statistics, both for mathematicians and for applied scientists, for whom most of the difficulties are hidden under high-level functions. The goal of this monograph is twofold. First, we aim to give a self-contained exposition of the basic concepts of Riemannian geometry, providing illustrations and examples at each step and adopting a computational point of view. The second goal is to demonstrate how these concepts are implemented in Geomstats, explaining the choices that were made and the conventions chosen. The general concepts are exposed and specific examples are detailed along the text. The culmination of this implementation is to be able to perform statistics and machine learning on manifolds with as few lines of code as in the widespread machine learning tool scikit-learn. We exemplify this with an introduction to geometric statistics.
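As a library-free sketch of what a high-level routine such as a Fréchet-mean estimator does under the hood (geomstats provides such estimators, but the code below deliberately does not use its API), here is the standard fixed-point iteration on the unit sphere:

```python
import math

def dot(u, v): return sum(a * b for a, b in zip(u, v))
def norm(u): return math.sqrt(dot(u, u))

def exp_map(p, v):
    t = norm(v)
    if t < 1e-12:
        return list(p)
    return [math.cos(t) * a + math.sin(t) * b / t for a, b in zip(p, v)]

def log_map(p, q):
    c = max(-1.0, min(1.0, dot(p, q)))
    theta = math.acos(c)
    if theta < 1e-12:
        return [0.0] * len(p)
    w = [b - c * a for a, b in zip(p, q)]
    n = norm(w)
    return [theta * wi / n for wi in w]

def frechet_mean(points, n_iter=50):
    """Fixed-point iteration for the Riemannian barycentre: average the
    log-maps of the data at the current estimate, then shoot back with exp."""
    x = list(points[0])
    for _ in range(n_iter):
        avg = [sum(log_map(x, p)[k] for p in points) / len(points)
               for k in range(len(x))]
        x = exp_map(x, avg)
    return x

# three points placed symmetrically around the north pole
s, c = math.sin(0.3), math.cos(0.3)
pts = [[s, 0.0, c],
       [-s / 2,  s * math.sqrt(3) / 2, c],
       [-s / 2, -s * math.sqrt(3) / 2, c]]
m = frechet_mean(pts)
print(round(m[2], 3))   # 1.0: the estimate converges to the pole
```

A high-level library wraps exactly this loop (with better step-size control and convergence checks) behind a scikit-learn-style `fit` interface, which is what makes "statistics on manifolds in a few lines" possible.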
Computational Anatomy in Theano
To model deformation of anatomical shapes, non-linear statistics are required
to take into account the non-linear structure of the data space. Computer
implementations of non-linear statistics and differential geometry algorithms
often lead to long and complex code sequences. The aim of this paper is to show
how the Theano framework can be used for simple and concise implementations of
complex differential geometry algorithms while handling complex,
high-dimensional data structures. The framework provides a symbolic language
that allows mathematical equations to be translated directly into Theano code,
and it can perform both fast CPU and GPU computations on high-dimensional data.
We show how different concepts from non-linear statistics and differential
geometry can be implemented in Theano, and give examples of the implemented
theory visualized on landmark representations of Corpus Callosum shapes.
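Theano itself is no longer maintained, but the idea of translating mathematical equations directly into code carries over to plain Python. As a minimal sketch, here are the geodesic equations of the round 2-sphere integrated with an explicit Euler step; the step size and test trajectory are illustrative choices, not from the paper:

```python
import math

def geodesic_sphere(theta, phi, dtheta, dphi, t_end=1.0, h=1e-3):
    """Integrate the sphere's geodesic equations (colatitude theta,
    longitude phi) with explicit Euler:
        theta'' = sin(theta) cos(theta) phi'^2
        phi''   = -2 cot(theta) theta' phi'
    The Christoffel symbols of the round metric appear verbatim."""
    steps = int(t_end / h)
    for _ in range(steps):
        ddtheta = math.sin(theta) * math.cos(theta) * dphi ** 2
        ddphi = -2.0 * (math.cos(theta) / math.sin(theta)) * dtheta * dphi
        theta, phi = theta + h * dtheta, phi + h * dphi
        dtheta, dphi = dtheta + h * ddtheta, dphi + h * ddphi
    return theta, phi

# start on the equator moving east: the geodesic is the equator itself
theta, phi = geodesic_sphere(math.pi / 2, 0.0, 0.0, 1.0)
print(round(theta, 6), round(phi, 6))   # 1.570796 1.0
```

A symbolic framework removes even this hand-derivation: given the metric, the Christoffel symbols and the integrator can be generated automatically and differentiated through, which is what made Theano attractive for the high-dimensional landmark manifolds in the paper.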