
    A discrete framework to find the optimal matching between manifold-valued curves

    The aim of this paper is to find an optimal matching between manifold-valued curves, and thereby adequately compare their shapes, seen as equivalence classes with respect to the action of reparameterization. Using a canonical decomposition of a path in a principal bundle, we introduce a simple algorithm that finds an optimal matching between two curves by computing the geodesic of the infinite-dimensional manifold of curves that is at all times horizontal to the fibers of the shape bundle. We focus on the elastic metric studied in the so-called square root velocity framework. The quotient structure of the shape bundle is examined, in particular horizontality with respect to the fibers. These results are given more generally for any elastic metric. We then introduce a comprehensive discrete framework which correctly approximates the smooth setting when the base manifold has constant sectional curvature. It is itself a Riemannian structure on the product manifold of "discrete curves" given by a finite number of points, and we show its convergence to the continuous model as the size of the discretization goes to infinity. Illustrations of optimal matching between discrete curves are given in the hyperbolic plane, the plane and the sphere, for synthetic and real data, and a comparison with dynamic programming is established
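    The dynamic-programming baseline mentioned at the end can be sketched for discrete curves. The following is a minimal DTW-style alignment between two point sequences in Euclidean space; the function name and the direct point-to-point cost are illustrative assumptions, not the paper's exact formulation, which works on the quotient shape space:

```python
import numpy as np

def dtw_matching(c1, c2):
    """Dynamic-programming alignment between two discrete curves.

    c1: (m, d) array, c2: (n, d) array of curve points.
    Returns the optimal cumulative cost and the matching as index pairs.
    """
    m, n = len(c1), len(c2)
    cost = np.full((m, n), np.inf)
    cost[0, 0] = np.linalg.norm(c1[0] - c2[0])
    for i in range(m):
        for j in range(n):
            if i == j == 0:
                continue
            d = np.linalg.norm(c1[i] - c2[j])
            prev = min(cost[i - 1, j] if i > 0 else np.inf,
                       cost[i, j - 1] if j > 0 else np.inf,
                       cost[i - 1, j - 1] if i > 0 and j > 0 else np.inf)
            cost[i, j] = d + prev
    # Backtrack from the end to recover the optimal index matching
    path = [(m - 1, n - 1)]
    i, j = m - 1, n - 1
    while (i, j) != (0, 0):
        candidates = []
        if i > 0:
            candidates.append((i - 1, j))
        if j > 0:
            candidates.append((i, j - 1))
        if i > 0 and j > 0:
            candidates.append((i - 1, j - 1))
        i, j = min(candidates, key=lambda t: cost[t])
        path.append((i, j))
    return cost[-1, -1], path[::-1]
```

    Unlike the horizontal-geodesic algorithm of the paper, this discrete search is restricted to matchings on the sampling grid, which is precisely why the comparison is of interest.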

    Computing distances and geodesics between manifold-valued curves in the SRV framework

    This paper focuses on the study of open curves in a Riemannian manifold M, and proposes a reparametrization-invariant metric on the space of such paths. We use the square root velocity function (SRVF) introduced by Srivastava et al. to define a Riemannian metric on the space of immersions M' = Imm([0,1], M) by pullback of a natural metric on the tangent bundle TM'. This induces a first-order Sobolev metric on M' and leads to a distance which takes into account the distance between the origins in M and the L2-distance between the SRV representations of the curves. The geodesic equations for this metric are given and exploited to define an exponential map on M'. The optimal deformation of one curve into another can then be constructed using geodesic shooting, which requires characterizing the Jacobi fields of M'. The particular case of curves lying in the hyperbolic half-plane is considered as an example, in the setting of radar signal processing
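    In the flat case M = R^d, the SRV representation and the resulting distance can be sketched as follows. This is a minimal illustration: the squared-sum combination of the origin term and the L2 term, as well as the function names, are assumptions rather than the paper's exact definitions in the manifold setting:

```python
import numpy as np

def srvf(curve, t):
    """Square root velocity function of a discrete curve in R^d.

    curve: (n, d) samples at parameters t (shape (n,)).
    Returns q = c' / sqrt(|c'|), evaluated by finite differences.
    """
    vel = np.gradient(curve, t, axis=0)
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    return vel / np.sqrt(np.maximum(speed, 1e-12))

def srv_distance(c1, c2, t):
    """Distance combining the start-point distance in M = R^d and the
    L2-distance between the SRV representations (squared-sum form assumed)."""
    q1, q2 = srvf(c1, t), srvf(c2, t)
    d_origin = np.linalg.norm(c1[0] - c2[0])
    d_srv_sq = np.trapz(np.sum((q1 - q2) ** 2, axis=1), t)
    return np.sqrt(d_origin ** 2 + d_srv_sq)
```

    Because the SRV term only sees velocities, translating a curve changes the distance only through the origin term, which the test below exploits.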

    The Fisher-Rao geometry of beta distributions applied to the study of canonical moments

    This paper studies the Fisher-Rao geometry on the parameter space of beta distributions. We derive the geodesic equations and the sectional curvature, and prove that the latter is negative. This guarantees uniqueness of the Riemannian centroid in that space. We use this Riemannian structure to study canonical moments, an intrinsic representation of the moments of a probability distribution. Drawing on the fact that a uniform distribution in the regular moment space corresponds to a product of beta distributions in the canonical moment space, we propose a mapping from the space of canonical moments to the product beta manifold, allowing us to use the Fisher-Rao geometry of beta distributions to compare and analyze canonical moments
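    The Fisher-Rao metric on the beta parameter space is given by the Fisher information matrix of the family, which is a standard computation: from log f(x; a, b) = (a−1) log x + (b−1) log(1−x) − log B(a, b), the second derivatives of the log-normalizer yield trigamma terms. A minimal sketch (function name is ours):

```python
import numpy as np
from scipy.special import polygamma

def beta_fisher_info(a, b):
    """Fisher information matrix of the beta(a, b) family.

    With psi1 the trigamma function (polygamma of order 1):
        I = [[psi1(a) - psi1(a+b),  -psi1(a+b)         ],
             [-psi1(a+b),           psi1(b) - psi1(a+b)]]
    """
    t = polygamma(1, a + b)
    return np.array([[polygamma(1, a) - t, -t],
                     [-t, polygamma(1, b) - t]])
```

    Positive definiteness of this matrix is what makes it a Riemannian metric on the parameter space.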

    Quantization and clustering on Riemannian manifolds with an application to air traffic analysis

    The goal of quantization is to find the best approximation of a probability distribution by a discrete measure with finite support. When dealing with empirical distributions, this boils down to finding the best summary of the data by a smaller number of points, and automatically yields a k-means-type clustering. In this paper, we introduce Competitive Learning Riemannian Quantization (CLRQ), an online quantization algorithm that applies when the data does not belong to a vector space, but rather to a Riemannian manifold. It can be seen as a density approximation procedure as well as a clustering method. Compared to many clustering algorithms, it requires few distance computations, which is particularly advantageous in the manifold setting. We prove its convergence and show simulated examples on the sphere and the hyperbolic plane. We also provide an application to real data by using CLRQ to create summaries of covariance matrices estimated from air traffic images. These summaries are representative of the air traffic complexity and yield clusterings of the airspaces into zones that are homogeneous with respect to that criterion. They can then be compared using discrete optimal transport and be further used as inputs of a machine learning algorithm or as indexes in a traffic database
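    The competitive-learning idea can be sketched on the unit sphere, where the exponential and logarithm maps are available in closed form: each incoming sample moves the closest center along the geodesic toward it, with a decreasing gain. This is a toy illustration with an assumed 1/(k+1) gain schedule; the paper's algorithm and its convergence conditions are more general:

```python
import numpy as np

def sphere_exp(p, v):
    """Exponential map on the unit sphere at p applied to tangent vector v."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return p
    return np.cos(n) * p + np.sin(n) * v / n

def sphere_log(p, q):
    """Logarithm map: tangent vector at p pointing to q, with |v| = d(p, q)."""
    w = q - np.dot(p, q) * p
    nw = np.linalg.norm(w)
    if nw < 1e-12:
        return np.zeros_like(p)
    return np.arccos(np.clip(np.dot(p, q), -1.0, 1.0)) * w / nw

def clrq(samples, centers, step=lambda k: 1.0 / (k + 1)):
    """Competitive-learning quantization sketch: for each sample, move only
    the closest center along the geodesic toward it (winner-take-all)."""
    centers = np.array(centers, dtype=float)
    for k, x in enumerate(samples):
        dists = [np.arccos(np.clip(np.dot(c, x), -1.0, 1.0)) for c in centers]
        i = int(np.argmin(dists))
        centers[i] = sphere_exp(centers[i], step(k) * sphere_log(centers[i], x))
    return centers
```

    Only one geodesic distance per center is computed per sample, which reflects the low distance-computation cost highlighted above.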

    Conjugate points along Kolmogorov flows on the torus

    The geodesics in the group of volume-preserving diffeomorphisms (volumorphisms) of a manifold M, for a Riemannian metric defined by the kinetic energy, can be used to model the movement of ideal fluids in that manifold. The existence of conjugate points along such geodesics reveals that these cease to be infinitesimally length-minimizing between their endpoints. In this work, we focus on the case of the torus M = T^2 and on geodesics corresponding to steady solutions of the Euler equation generated by the stream functions ψ = −cos(mx) cos(ny) for positive integers m and n, called Kolmogorov flows. We show the existence of conjugate points along these geodesics for all (m, n), with the sole exception of m = n = 1. We also discuss the unusual features of this special case and conjecture that there are no conjugate points in this case
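    The velocity field of such a flow follows directly from the stream function via u = (ψ_y, −ψ_x), which is automatically divergence-free and tangent to the level sets of ψ. A small numerical illustration (function names are ours):

```python
import numpy as np

def kolmogorov_velocity(m, n):
    """Velocity field u = (psi_y, -psi_x) of the stream function
    psi = -cos(m x) cos(n y) on the flat torus T^2."""
    def u(x, y):
        return np.array([n * np.cos(m * x) * np.sin(n * y),
                         -m * np.sin(m * x) * np.cos(n * y)])
    return u

def kolmogorov_stream(m, n):
    """Stream function psi = -cos(m x) cos(n y)."""
    return lambda x, y: -np.cos(m * x) * np.cos(n * y)
```

    Since the vorticity Δψ = (m² + n²) ψ (up to sign conventions) is a function of ψ alone, the velocity field is a steady Euler solution, which is the defining property of these Kolmogorov flows.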

    The L^p-Fisher-Rao metric and Amari-Čencov α-connections

    We introduce a family of Finsler metrics, called the L^p-Fisher-Rao metrics F_p, for p ∈ (1, ∞), which generalizes the classical Fisher-Rao metric F_2, both on the space of densities Dens_+(M) and on the space of probability densities Prob(M). We then study their relations to the Amari-Čencov α-connections ∇^(α) from information geometry: on Dens_+(M), the geodesic equations of F_p and ∇^(α) coincide for p = 2/(1−α). Both are pullbacks of canonical constructions on L^p(M), in which geodesics are simply straight lines. In particular, this gives a new variational interpretation of α-geodesics as energy-minimizing curves. On Prob(M), the F_p and ∇^(α) geodesics can still be thought of as pullbacks of natural operations on the unit sphere in L^p(M), but in this case they no longer coincide unless p = 2. Using this transformation, we solve the geodesic equation of the α-connection by showing that the geodesics are pullbacks of projections of straight lines onto the unit sphere, and that they always cease to exist in finite time when they leave the positive part of the sphere. This unveils the geometric structure of solutions to the generalized Proudman-Johnson equations, and generalizes them to higher dimensions. In addition, we calculate the associated tensors of F_p, and study their relation to ∇^(α)
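    For the classical case p = 2 on a finite sample space, the pullback construction can be made concrete: componentwise square roots of probability vectors lie on the unit sphere in L^2, and Fisher-Rao geodesics are pullbacks of great-circle arcs. A sketch (the function name is illustrative, and the path is written up to the usual constant rescaling of the metric):

```python
import numpy as np

def fisher_rao_geodesic_p2(mu, nu, t):
    """Fisher-Rao (p = 2) geodesic between discrete probability vectors:
    push forward by the square root to the unit sphere, follow the
    great circle (spherical linear interpolation), pull back by squaring."""
    a, b = np.sqrt(mu), np.sqrt(nu)
    theta = np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))
    if theta < 1e-12:
        return np.array(mu, dtype=float)
    c = (np.sin((1 - t) * theta) * a + np.sin(t * theta) * b) / np.sin(theta)
    return c ** 2
```

    Because the interpolant stays on the unit sphere, the pulled-back path consists of probability vectors for every t, mirroring the sphere picture described above; for p ≠ 2 the F_p and ∇^(α) geodesics on Prob(M) part ways.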

    Parametric information geometry with the package Geomstats

    We introduce the information geometry module of the Python package Geomstats. The module first implements Fisher-Rao Riemannian manifolds of widely used parametric families of probability distributions, such as the normal, gamma, beta, and Dirichlet distributions, among others. The module further gives the Fisher-Rao Riemannian geometry of any parametric family of distributions of interest, given a parameterized probability density function as input. The implemented Riemannian geometry tools allow users to compare, average, and interpolate between distributions inside a given family. Importantly, such capabilities open the door to statistics and machine learning on probability distributions. We present the object-oriented implementation of the module along with illustrative examples and show how it can be used to perform learning on manifolds of parametric probability distributions
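    Conceptually, deriving the Fisher-Rao geometry from a user-supplied density amounts to evaluating the Fisher information of the family. The following one-parameter numerical sketch illustrates the idea only; it is not the Geomstats API, and the function name and finite-difference scheme are our assumptions:

```python
import numpy as np
from scipy.integrate import quad

def fisher_info_1d(logpdf, theta, support=(-np.inf, np.inf), h=1e-5):
    """Numerically estimate the Fisher information of a one-parameter family,
    I(theta) = E[(d/dtheta log f(X; theta))^2], given log f(x; theta).

    The parameter derivative of the log-density (the score) is approximated
    by a central finite difference, and the expectation by quadrature.
    """
    def score_sq_density(x):
        score = (logpdf(x, theta + h) - logpdf(x, theta - h)) / (2 * h)
        return score ** 2 * np.exp(logpdf(x, theta))
    val, _ = quad(score_sq_density, *support)
    return val
```

    For the normal family with unit variance and the mean as parameter, the score is x − θ and the information is exactly 1, which makes a convenient sanity check.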