935 research outputs found

    Optimally fast incremental Manhattan plane embedding and planar tight span construction

    We describe a data structure, a rectangular complex, that can be used to represent hyperconvex metric spaces that have the same topology (although not necessarily the same distance function) as subsets of the plane. We show how to use this data structure to construct the tight span of a metric space given as an n × n distance matrix, when the tight span is homeomorphic to a subset of the plane, in time O(n^2), and to add a single point to a planar tight span in time O(n). As an application of this construction, we show how to test whether a given finite metric space embeds isometrically into the Manhattan plane in time O(n^2), and to add a single point to the space and re-test whether it has such an embedding in time O(n). Comment: 39 pages, 15 figures
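    To make the notion of "embedding isometrically into the Manhattan plane" concrete (this is only an illustrative sketch, not the paper's O(n^2) algorithm), the check below verifies whether a given set of planar coordinates realizes a distance matrix under the L1 metric; the function name and tolerance are assumptions for the example.

```python
import numpy as np

def is_manhattan_realization(coords, dist, tol=1e-9):
    """Check whether planar coordinates realize a distance matrix under the L1 metric.

    coords : (n, 2) array of candidate points in the Manhattan plane
    dist   : (n, n) symmetric distance matrix
    """
    # Pairwise L1 (Manhattan) distances between the candidate points.
    diff = coords[:, None, :] - coords[None, :, :]
    l1 = np.abs(diff).sum(axis=-1)
    return np.allclose(l1, dist, atol=tol)

# Example: the corners of an axis-aligned unit square realize their own L1 distances.
coords = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
dist = np.abs(coords[:, None, :] - coords[None, :, :]).sum(axis=-1)
print(is_manhattan_realization(coords, dist))  # True
```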

    DIMAL: Deep Isometric Manifold Learning Using Sparse Geodesic Sampling

    This paper explores a fully unsupervised deep learning approach for computing distance-preserving maps that generate low-dimensional embeddings for a certain class of manifolds. We use the Siamese configuration to train a neural network to solve the problem of least squares multidimensional scaling for generating maps that approximately preserve geodesic distances. By training with only a few landmarks, we show a significantly improved local and nonlocal generalization of the isometric mapping as compared to analogous non-parametric counterparts. Importantly, the combination of a deep-learning framework with a multidimensional scaling objective enables a numerical analysis of network architectures to aid in understanding their representation power. This provides a geometric perspective on the generalizability of deep learning. Comment: 10 pages, 11 figures
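    As a rough illustration of the Siamese least-squares MDS objective described above (a hedged sketch, not the authors' DIMAL architecture), the same encoder embeds both points of a landmark pair, and training penalizes the squared gap between the embedded distance and the target geodesic distance. The network size, optimizer, and placeholder data are assumptions.

```python
import torch
import torch.nn as nn

# Shared encoder used in Siamese fashion: identical weights embed both points of a pair.
encoder = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)

def mds_stress(x_i, x_j, geo_dist):
    """Least-squares MDS stress: squared gap between embedded and geodesic distances."""
    emb_dist = torch.norm(encoder(x_i) - encoder(x_j), dim=-1)
    return ((emb_dist - geo_dist) ** 2).mean()

# Placeholder landmark pairs: points in R^3 and a stand-in for precomputed geodesic distances.
x_i, x_j = torch.randn(128, 3), torch.randn(128, 3)
geo_dist = torch.norm(x_i - x_j, dim=-1)

for _ in range(100):
    optimizer.zero_grad()
    loss = mds_stress(x_i, x_j, geo_dist)
    loss.backward()
    optimizer.step()
```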

    Diffusion Variational Autoencoders

    A standard Variational Autoencoder, with a Euclidean latent space, is structurally incapable of capturing topological properties of certain datasets. To remove topological obstructions, we introduce Diffusion Variational Autoencoders with arbitrary manifolds as a latent space. A Diffusion Variational Autoencoder uses transition kernels of Brownian motion on the manifold. In particular, it uses properties of the Brownian motion to implement the reparametrization trick and fast approximations to the KL divergence. We show that the Diffusion Variational Autoencoder is capable of capturing topological properties of synthetic datasets. Additionally, we train on MNIST with spheres, tori, projective spaces, SO(3), and a torus embedded in R^3 as latent spaces. Although a natural dataset like MNIST does not have latent variables with a clear-cut topological structure, training it on a manifold can still highlight topological and geometrical properties. Comment: 10 pages, 8 figures. Added an appendix with a derivation of the asymptotic expansion of the KL divergence for the heat kernel on arbitrary Riemannian manifolds, and an appendix with new experiments on binarized MNIST. Added a previously missing factor in the asymptotic expansion of the heat kernel and corrected a coefficient in the asymptotic expansion of the KL divergence; further minor edits.
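    One way to picture sampling from a Brownian-motion transition kernel on a latent manifold, as described above, is a small random walk: take Gaussian steps and project back onto the manifold, with the noise drawn independently of the encoder output so gradients can pass through (the reparametrization idea). The sketch below does this for the unit sphere; it is an illustrative assumption, not the paper's exact implementation with transition kernels and asymptotic KL approximations.

```python
import numpy as np

def sphere_brownian_step(z, t, n_steps=10, rng=None):
    """Approximate a Brownian-motion sample on the unit sphere started at z after time t.

    Each sub-step adds a small Gaussian perturbation and renormalizes back onto the
    sphere; as n_steps grows, this random walk approaches Brownian motion on the sphere.
    """
    rng = np.random.default_rng() if rng is None else rng
    dt = t / n_steps
    for _ in range(n_steps):
        z = z + np.sqrt(dt) * rng.standard_normal(z.shape)    # Euclidean diffusion step
        z = z / np.linalg.norm(z, axis=-1, keepdims=True)      # project back onto the sphere
    return z

# Example: diffuse a point on S^2 for a short time; the sample stays on the sphere.
z0 = np.array([0.0, 0.0, 1.0])
z_t = sphere_brownian_step(z0, t=0.1)
print(np.linalg.norm(z_t))  # approximately 1.0
```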

    On the moduli space of positive Ricci curvature metrics on homotopy spheres

    We show that the moduli space of Ricci-positive metrics on certain homotopy spheres has infinitely many connected components. Comment: 28 pages, 11 figures. The text has been substantially rewritten to improve the exposition.