6 research outputs found

    Shape classification based on interpoint distance distributions

    According to Kendall (1989), in shape theory the idea is to filter out effects resulting from translations, changes of scale and rotations, and to declare that shape is “what is left”. While this statement applies in principle to classical shape theory based on landmarks, the basic idea remains valid when other approaches are used. For example, we might consider, for every shape, a suitable associated function which, to a large extent, could be used to characterize the shape. This finally leads to identifying shapes with the elements of a quotient space of sets, in such a way that all the sets in the same equivalence class share the same identifying function. In this paper, we explore the use of the interpoint distance distribution (i.e. the distribution of the distance between two independent uniform points) for this purpose. This idea has been proposed previously by other authors [e.g., Osada et al. (2002), Bonetti and Pagano (2005)]. We aim at providing some additional mathematical support for the use of interpoint distances in this context. In particular, we show the Lipschitz continuity of the transformation taking every shape to its corresponding interpoint distance distribution. We also obtain a partial identifiability result showing that, under some geometrical restrictions, shapes with different planar areas must have different interpoint distance distributions. Finally, we address practical aspects, including a real-data example on shape classification in marine biology. This work has been partially supported by Spanish Grants MTM2013-44045-P (Berrendero and Cuevas) and MTM2013-41383-P (Pateiro-López).
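    The following is a purely illustrative sketch, not code from the paper: the interpoint distance distribution of a planar set can be approximated by Monte Carlo, sampling independent uniform points from the set by rejection and recording their pairwise distances. The helper names (uniform_in_shape, interpoint_distances) and the two test shapes are invented for this example; the unit disk and the 2 x 0.5 ellipse have equal area but clearly different interpoint distance distributions.

    # Illustrative sketch only (not from the paper): Monte Carlo estimate of the
    # interpoint distance distribution, i.e. the law of |X - Y| for X, Y
    # independent and uniform on a planar set.
    import numpy as np

    rng = np.random.default_rng(0)

    def uniform_in_shape(n, inside, bbox):
        """Rejection-sample n points uniformly from the region where inside(pts) is True."""
        xmin, xmax, ymin, ymax = bbox
        pts = []
        while len(pts) < n:
            cand = np.column_stack([rng.uniform(xmin, xmax, n),
                                    rng.uniform(ymin, ymax, n)])
            pts.extend(cand[inside(cand)])
        return np.array(pts[:n])

    def interpoint_distances(n, inside, bbox):
        """Distances between n independent pairs of uniform points."""
        x = uniform_in_shape(n, inside, bbox)
        y = uniform_in_shape(n, inside, bbox)
        return np.linalg.norm(x - y, axis=1)

    # Two shapes of equal area (pi): the unit disk and a 2 x 0.5 ellipse.
    disk = lambda p: p[:, 0] ** 2 + p[:, 1] ** 2 <= 1.0
    ellipse = lambda p: (p[:, 0] / 2.0) ** 2 + (p[:, 1] / 0.5) ** 2 <= 1.0

    d_disk = np.sort(interpoint_distances(20000, disk, (-1, 1, -1, 1)))
    d_ell = np.sort(interpoint_distances(20000, ellipse, (-2, 2, -0.5, 0.5)))

    # Empirical 1-Wasserstein distance between the two distance distributions;
    # for shapes differing only by a rigid motion this would be close to zero.
    print("W1 between distance distributions:", np.mean(np.abs(d_disk - d_ell)))

    Translations and rotations of a shape do not change this distribution, which is what makes it a candidate identifying function in the sense described above.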

    Gromov-Monge quasi-metrics and distance distributions

    Applications in data science, shape analysis and object classification frequently require maps between metric spaces which preserve geometry as faithfully as possible. In this paper, we combine the Monge formulation of optimal transport with the Gromov-Hausdorff distance construction to define a measure of the minimum amount of geometric distortion required to map one metric measure space onto another. We show that the resulting quantity, called Gromov-Monge distance, defines an extended quasi-metric on the space of isomorphism classes of metric measure spaces and that it can be promoted to a true metric on certain subclasses of mm-spaces. We also give precise comparisons between Gromov-Monge distance and several other metrics which have appeared previously, such as the Gromov-Wasserstein metric and the continuous Procrustes metric of Lipman, Al-Aifari and Daubechies. Finally, we derive polynomial-time computable lower bounds for Gromov-Monge distance. These lower bounds are expressed in terms of distance distributions, which are classical invariants of metric measure spaces summarizing the volume growth of metric balls. In the second half of the paper, which may be of independent interest, we study the discriminative power of these lower bounds for simple subclasses of metric measure spaces. We first consider the case of planar curves, where we give a counterexample to the Curve Histogram Conjecture of Brinkman and Olver. Our results on plane curves are then generalized to higher dimensional manifolds, where we prove some sphere characterization theorems for the distance distribution invariant. Finally, we consider several inverse problems on recovering a metric graph from a collection of localized versions of distance distributions. Results are derived by establishing connections with concepts from the fields of computational geometry and topological data analysis. Comment: Version 2: added many new results and improved exposition.
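    To make the distance distribution invariant concrete, here is a small sketch of my own (assumptions: finite metric measure spaces, a global distance distribution taken to be the law of d(X, X') for X, X' drawn i.i.d. from the reference measure, and invented helper names). Comparing two such one-dimensional laws with a 1-Wasserstein distance gives a polynomial-time quantity of the general kind used for lower bounds, up to normalization conventions.

    # Illustrative sketch only: the distance distribution of a finite metric
    # measure space and a cheap comparison of two such distributions.
    import numpy as np

    def distance_distribution(D, mu):
        """Values and weights of the law of d(X, X'), with X, X' ~ mu i.i.d.

        D  : (n, n) matrix of pairwise distances
        mu : (n,) probability vector
        """
        vals = D.ravel()
        weights = np.outer(mu, mu).ravel()
        order = np.argsort(vals)
        return vals[order], weights[order]

    def w1_between_laws(vals1, w1, vals2, w2, grid_size=2000):
        """1-Wasserstein distance between two discrete laws on the real line,
        via their quantile functions evaluated on a common grid."""
        grid = (np.arange(grid_size) + 0.5) / grid_size
        i1 = np.minimum(np.searchsorted(np.cumsum(w1), grid), len(vals1) - 1)
        i2 = np.minimum(np.searchsorted(np.cumsum(w2), grid), len(vals2) - 1)
        return np.mean(np.abs(vals1[i1] - vals2[i2]))

    # Example: uniform measures on a circle and on a segment of the same diameter.
    n = 100
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    circle = np.column_stack([np.cos(theta), np.sin(theta)])
    segment = np.column_stack([np.linspace(-1, 1, n), np.zeros(n)])
    mu = np.full(n, 1.0 / n)

    D1 = np.linalg.norm(circle[:, None] - circle[None, :], axis=-1)
    D2 = np.linalg.norm(segment[:, None] - segment[None, :], axis=-1)

    print("distance-distribution discrepancy:",
          w1_between_laws(*distance_distribution(D1, mu), *distance_distribution(D2, mu)))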

    Gromov-Wasserstein Distance based Object Matching: Asymptotic Inference

    In this paper, we aim to provide a statistical theory for object matching based on the Gromov-Wasserstein distance. To this end, we model general objects as metric measure spaces. Based on this, we propose a simple and efficiently computable asymptotic statistical test for pose invariant object discrimination. This is based on an empirical version of a β-trimmed lower bound of the Gromov-Wasserstein distance. We derive, for β ∈ [0, 1/2), distributional limits of this test statistic. To this end, we introduce a novel U-type process indexed in β and show its weak convergence. Finally, the theory developed is investigated in Monte Carlo simulations and applied to structural protein comparisons. Comment: For a version with the complete supplement see [v2].
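    The sketch below shows, in rough terms, how a β-trimmed comparison of empirical distance distributions can be computed; it is an illustration in the spirit of the statistic described above, not the authors' exact test, and the function names, trimming level and Gaussian point clouds are invented for the example.

    # Illustrative sketch only: beta-trimmed L2 distance between the empirical
    # quantile functions of the pairwise-distance samples of two point clouds.
    import numpy as np

    def pairwise_distances(X):
        """Sorted upper-triangular pairwise Euclidean distances of X (n, d)."""
        D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
        iu = np.triu_indices(len(X), k=1)
        return np.sort(D[iu])

    def trimmed_distance_statistic(X, Y, beta=0.1, grid_size=1000):
        """Plug-in statistic: integrate (q_X(t) - q_Y(t))^2 over t in [beta, 1 - beta]."""
        dX, dY = pairwise_distances(X), pairwise_distances(Y)
        t = np.linspace(beta, 1.0 - beta, grid_size)
        qX, qY = np.quantile(dX, t), np.quantile(dY, t)
        return np.sqrt(np.mean((qX - qY) ** 2) * (1.0 - 2.0 * beta))

    rng = np.random.default_rng(1)
    A = rng.normal(size=(200, 3))                       # reference point cloud
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))        # random orthogonal map
    B = A @ Q.T + np.array([5.0, -2.0, 1.0])            # same object, new pose
    C = rng.normal(size=(200, 3)) * np.array([1.0, 1.0, 3.0])  # different object

    print("same object, different pose:", trimmed_distance_statistic(A, B))
    print("different objects:          ", trimmed_distance_statistic(A, C))

    Since rotations, reflections and translations leave all pairwise distances unchanged, the statistic is essentially zero for the repositioned copy, which is what makes distance-distribution-based lower bounds natural tools for pose invariant discrimination.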