
    On Approximating the Riemannian 1-Center

    In this paper, we generalize the simple Euclidean 1-center approximation algorithm of Badoiu and Clarkson (2003) to Riemannian geometries and study the corresponding convergence rate. We then show how to instantiate this generic algorithm in two particular cases: (1) hyperbolic geometry, and (2) the Riemannian manifold of symmetric positive definite matrices.
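
    Since the abstract does not spell out the update rule, here is a minimal sketch of the Badoiu-Clarkson 1-center iteration in its Euclidean form; the Riemannian generalization studied in the paper replaces the straight-line step below by a step along the geodesic from the current center towards the farthest point (conceptually, center <- exp_center(log_center(farthest) / (t + 2))). Function and variable names are illustrative, not taken from the paper.

    import numpy as np

    def euclidean_1center(points, num_iters=1000):
        """Approximate the minimum enclosing ball center of `points` (n x d array)."""
        center = points[0].copy()                 # arbitrary initialization
        for t in range(num_iters):
            dists = np.linalg.norm(points - center, axis=1)
            farthest = points[np.argmax(dists)]   # farthest point from the current center
            # move a 1/(t+2) fraction of the way towards the farthest point
            center += (farthest - center) / (t + 2)
        return center

    # usage: rng = np.random.default_rng(0); pts = rng.normal(size=(200, 3))
    # c = euclidean_1center(pts); radius = np.linalg.norm(pts - c, axis=1).max()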

    Algorithms for distance problems in planar complexes of global nonpositive curvature

    CAT(0) metric spaces and hyperbolic spaces play an important role in combinatorial and geometric group theory. In this paper, we present efficient algorithms for distance problems in CAT(0) planar complexes. First, we present an algorithm for answering single-point distance queries in a CAT(0) planar complex. Namely, we show that for a CAT(0) planar complex K with n vertices, one can construct in O(n^2 log n) time a data structure D of size O(n^2) so that, given a point x in K, the shortest path gamma(x, y) between x and the query point y can be computed in linear time. Our second algorithm computes the convex hull of a finite set of points in a CAT(0) planar complex. This algorithm is based on Toussaint's algorithm for computing the convex hull of a finite set of points in a simple polygon, and it constructs the convex hull of a set of k points in O(n^2 log n + nk log k) time, using a data structure of size O(n^2 + k).

    Fisher-Rao distance and pullback SPD cone distances between multivariate normal distributions

    Data sets of multivariate normal distributions abound in many scientific areas like diffusion tensor imaging, structure tensor computer vision, radar signal processing, and machine learning, just to name a few. In order to process those normal data sets for downstream tasks like filtering, classification or clustering, one needs to define proper notions of dissimilarities between normals and paths joining them. The Fisher-Rao distance, defined as the Riemannian geodesic distance induced by the Fisher information metric, is such a principled metric distance; however, it is not known in closed form except for a few particular cases. In this work, we first report a fast and robust method to approximate arbitrarily finely the Fisher-Rao distance between multivariate normal distributions. Second, we introduce a class of distances based on diffeomorphic embeddings of the normal manifold into a submanifold of the higher-dimensional symmetric positive-definite cone corresponding to the manifold of centered normal distributions. We show that the projective Hilbert distance on the cone yields a metric on the embedded normal submanifold, and we pull back that cone distance with its associated straight-line Hilbert cone geodesics to obtain a distance and smooth paths between normal distributions. Compared to the Fisher-Rao distance approximation, the pullback Hilbert cone distance is computationally light, since it only requires computing the extreme minimal and maximal eigenvalues of matrices. Finally, we show how to use those distances in clustering tasks.
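
    The abstract notes that the pullback Hilbert cone distance only needs extreme eigenvalues. As a hedged illustration, the sketch below computes the standard Hilbert projective distance between two symmetric positive-definite matrices from their extreme generalized eigenvalues; the paper's distance additionally pulls this back through a specific embedding of normal distributions into the cone, which is not reproduced here, and the function name is illustrative.

    import numpy as np
    from scipy.linalg import eigvalsh

    def hilbert_spd_distance(A, B):
        """Hilbert projective distance log(lambda_max / lambda_min) on the SPD cone."""
        # generalized eigenvalues w solving B v = w A v, i.e. the spectrum of A^{-1} B
        w = eigvalsh(B, A)
        return float(np.log(w.max() / w.min()))

    # Being projective, the distance vanishes along rays: hilbert_spd_distance(A, 3 * A) == 0.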

    Geometric matrix midranges

    We define geometric matrix midranges for positive definite Hermitian matrices and study the midrange problem from a number of perspectives. Special attention is given to the midrange of two positive definite matrices before considering the extension of the problem to N > 2 matrices. We compare matrix midrange statistics with the scalar and vector midrange problem and note the special significance of the matrix problem from a computational standpoint. We also study various aspects of geometric matrix midrange statistics from the viewpoint of linear algebra, differential geometry and convex optimization.
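
    For orientation only, the sketch below computes baseline quantities related to the comparison made in the abstract: the scalar (arithmetic) midrange, a multiplicative counterpart for positive scalars, and the affine-invariant geometric mean of two SPD matrices (the geodesic midpoint on the SPD manifold). The paper's geometric matrix midrange itself is a different object, defined in the paper and not reproduced here; all names are illustrative.

    import numpy as np

    def midrange(x):
        """Arithmetic midrange of scalar data."""
        return 0.5 * (np.min(x) + np.max(x))

    def geometric_scalar_midrange(x):
        """Multiplicative counterpart of the midrange for positive scalars."""
        return float(np.sqrt(np.min(x) * np.max(x)))

    def spd_sqrt(A):
        """Symmetric square root of an SPD matrix via its eigendecomposition."""
        w, V = np.linalg.eigh(A)
        return (V * np.sqrt(w)) @ V.T

    def spd_geometric_mean(A, B):
        """Geodesic midpoint A # B = A^{1/2} (A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}."""
        As = spd_sqrt(A)
        Asi = np.linalg.inv(As)
        return As @ spd_sqrt(Asi @ B @ Asi) @ As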

    Mumford-Shah and Potts Regularization for Manifold-Valued Data with Applications to DTI and Q-Ball Imaging

    Mumford-Shah and Potts functionals are powerful variational models for regularization which are widely used in signal and image processing; typical applications are edge-preserving denoising and segmentation. Being both non-smooth and non-convex, they are computationally challenging even for scalar data. For manifold-valued data, the problem becomes even more involved since typical features of vector spaces are not available. In this paper, we propose algorithms for Mumford-Shah and for Potts regularization of manifold-valued signals and images. For the univariate problems, we derive solvers based on dynamic programming combined with (convex) optimization techniques for manifold-valued data. For the class of Cartan-Hadamard manifolds (which includes the data space in diffusion tensor imaging), we show that our algorithms compute global minimizers for any starting point. For the multivariate Mumford-Shah and Potts problems (for image regularization), we propose a splitting into suitable subproblems which we can solve exactly using the techniques developed for the corresponding univariate problems. Our method does not require any a priori restrictions on the edge set, and we do not have to discretize the data space. We apply our method to diffusion tensor imaging (DTI) as well as Q-ball imaging. Using the DTI model, we obtain a segmentation of the corpus callosum.
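
    As a hedged sketch of the dynamic-programming backbone mentioned above, the code below solves the univariate Potts problem for real-valued data (penalty gamma per jump, squared deviations from segment means). The paper's solvers extend this scheme to manifold-valued signals by replacing segment means and squared errors with intrinsic means and squared geodesic distances; that extension, and the Mumford-Shah variant, are not reproduced here.

    import numpy as np

    def potts_segmentation(y, gamma):
        """Return the optimal segment right endpoints (1-based) for scalar data y."""
        n = len(y)
        c1 = np.concatenate(([0.0], np.cumsum(y)))       # prefix sums of y
        c2 = np.concatenate(([0.0], np.cumsum(y * y)))   # prefix sums of y^2

        def seg_err(l, r):
            # squared L2 error of the best constant on y_l..y_r (1-based, inclusive)
            s1, s2, m = c1[r] - c1[l - 1], c2[r] - c2[l - 1], r - l + 1
            return s2 - s1 * s1 / m

        B = np.full(n + 1, np.inf)   # B[r] = optimal cost for the prefix y_1..y_r
        B[0] = -gamma                # so that the first segment carries no jump penalty
        prev = np.zeros(n + 1, dtype=int)
        for r in range(1, n + 1):
            for l in range(1, r + 1):
                cost = B[l - 1] + gamma + seg_err(l, r)
                if cost < B[r]:
                    B[r], prev[r] = cost, l - 1
        bounds, r = [], n            # backtrack the segment boundaries
        while r > 0:
            bounds.append(r)
            r = prev[r]
        return bounds[::-1]

    # usage: a noisy piecewise-constant signal with jumps at 30 and 60 should yield [30, 60, 90]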