On the optimality of shape and data representation in the spectral domain
A proof of the optimality of the eigenfunctions of the Laplace-Beltrami
operator (LBO) in representing smooth functions on surfaces is provided and
adapted to the field of applied shape and data analysis. It is based on the
Courant-Fischer min-max principle adapted to our case. The theorem we present
supports the new trend in geometry processing of treating geometric structures
by using their projection onto the leading eigenfunctions of the decomposition
of the LBO. This result can be used to construct numerically
efficient algorithms that process shapes in the spectral domain. We review a couple of
applications as possible practical usage cases of the proposed optimality
criteria. We refer to a scale invariant metric, which is also invariant to
bending of the manifold. This novel pseudo-metric allows constructing an LBO by
which a scale invariant eigenspace on the surface is defined. We demonstrate
the efficiency of an intermediate metric, defined as an interpolation between
the scale invariant and the regular one, in representing geometric structures
while capturing both coarse and fine details. Next, we review a numerical
acceleration technique for classical scaling, a member of a family of
flattening methods known as multidimensional scaling (MDS). There, the
optimality is exploited to efficiently approximate all geodesic distances
between pairs of points on a given surface, and thereby match and compare
almost-isometric surfaces. Finally, we revisit the classical principal
component analysis (PCA) definition by coupling its variational form with a
Dirichlet energy on the data manifold. By pairing the PCA with the LBO we can
handle cases that go beyond the scope defined by the observation set that is
handled by regular PCA.
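As a concrete illustration of representing smooth functions in a leading Laplacian eigenspace, the sketch below projects a smooth function on a cycle graph onto the leading eigenvectors of its combinatorial Laplacian. The cycle graph is a hypothetical stand-in for a discretized surface, not code from the paper:

```python
import numpy as np

# A cycle graph serves as a toy discretization of a closed curve; its
# combinatorial Laplacian plays the role of the Laplace-Beltrami operator.
n = 64
W = np.zeros((n, n))                     # adjacency matrix of the cycle
for i in range(n):
    W[i, (i + 1) % n] = W[(i + 1) % n, i] = 1.0
L = np.diag(W.sum(axis=1)) - W           # combinatorial Laplacian

evals, evecs = np.linalg.eigh(L)         # eigenpairs, ascending eigenvalues

t = np.linspace(0, 2 * np.pi, n, endpoint=False)
f = np.sin(t) + 0.3 * np.sin(3 * t)      # a smooth function on the vertices

k = 8                                    # keep only the k leading eigenvectors
coeffs = evecs[:, :k].T @ f              # spectral coefficients
f_hat = evecs[:, :k] @ coeffs            # truncated spectral reconstruction

rel_err = np.linalg.norm(f - f_hat) / np.linalg.norm(f)
print(rel_err)                           # small: a smooth f is captured by few modes
```

Because the function is smooth (low-frequency), a handful of leading eigenvectors suffices, which is the numerical efficiency the abstract alludes to.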
Manhattan orbifolds
We investigate a class of metrics for 2-manifolds in which, except for a
discrete set of singular points, the metric is locally isometric to an L_1 (or
equivalently L_infinity) metric, and show that with certain additional
conditions such metrics are injective. We use this construction to find the
tight span of squaregraphs and related graphs, and we find an injective metric
that approximates the distances in the hyperbolic plane analogously to the way
the rectilinear metrics approximate the Euclidean distance.
Comment: 17 pages, 15 figures. Some definitions and proofs have been revised since the previous version, and a new example has been added.
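The stated equivalence of the L_1 and L_infinity metrics in the plane can be checked numerically; the map T below is the standard 45-degree change of basis, an illustrative sketch rather than anything from the paper:

```python
import numpy as np

# In 2-D, L1 and L_infinity are the same metric up to a linear change of
# coordinates: for T(x, y) = (x + y, x - y) we have
# max(|x + y|, |x - y|) = |x| + |y|.

def l1(p, q):
    return np.abs(p - q).sum()

def linf(p, q):
    return np.abs(p - q).max()

T = np.array([[1.0, 1.0], [1.0, -1.0]])

rng = np.random.default_rng(0)
for _ in range(100):
    p, q = rng.normal(size=2), rng.normal(size=2)
    assert np.isclose(l1(p, q), linf(T @ p, T @ q))
print("L1 distances match L_infinity distances after the change of basis")
```

This is why the abstract can treat the two metrics interchangeably away from the singular points.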
LOCA: LOcal Conformal Autoencoder for standardized data coordinates
We propose a deep-learning based method for obtaining standardized data
coordinates from scientific measurements. Data observations are modeled as
samples from an unknown, non-linear deformation of an underlying Riemannian
manifold, which is parametrized by a few normalized latent variables. By
leveraging a repeated measurement sampling strategy, we present a method for
learning an embedding that is isometric to the latent
variables of the manifold. These data coordinates, being invariant under smooth
changes of variables, enable matching between different instrumental
observations of the same phenomenon. Our embedding is obtained using a LOcal
Conformal Autoencoder (LOCA), an algorithm that constructs an embedding to
rectify deformations by using a local z-scoring procedure while preserving
relevant geometric information. We demonstrate the isometric embedding
properties of LOCA on various model settings and observe that it exhibits
promising interpolation and extrapolation capabilities. Finally, we apply LOCA
to single-site Wi-Fi localization data, and to curved-surface
estimation based on a lower-dimensional projection.
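A minimal sketch of the repeated-measurement model described above (the deformation and all names here are hypothetical illustrations, not from the paper): isotropic bursts in latent coordinates become anisotropic under an unknown nonlinear deformation, and LOCA's local z-scoring criterion seeks an embedding that restores isotropic per-burst covariance:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.01                              # burst size in latent coordinates

def deform(z):                            # hypothetical nonlinear "instrument"
    x, y = z[..., 0], z[..., 1]
    return np.stack([x + 0.5 * y**2, y + 0.3 * np.sin(x)], axis=-1)

centers = rng.uniform(-1, 1, size=(5, 2))              # burst centers
bursts = centers[:, None, :] + sigma * rng.normal(size=(5, 200, 2))
observed = deform(bursts)                              # what the sensor sees

# The per-burst covariance of the observations is no longer sigma^2 * I:
cov0 = np.cov(observed[0].T)
aniso = np.linalg.eigvalsh(cov0).max() / np.linalg.eigvalsh(cov0).min()
print(aniso)   # ratio > 1: the deformation distorted the isotropic burst

# LOCA's local z-scoring target: find an embedding e such that, for every
# burst b, np.cov(e(observed[b]).T) is again approximately sigma^2 * np.eye(2).
```

Enforcing the isotropy target across all bursts is what pins the embedding down to a conformal (and, with the right scale, isometric) calibration.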
Space from Hilbert Space: Recovering Geometry from Bulk Entanglement
We examine how to construct a spatial manifold and its geometry from the
entanglement structure of an abstract quantum state in Hilbert space. Given a
decomposition of Hilbert space into a tensor product of factors,
we consider a class of "redundancy-constrained states" that
generalize the area-law behavior for entanglement entropy usually found in
condensed-matter systems with gapped local Hamiltonians. Using mutual
information to define a distance measure on the graph, we employ classical
multidimensional scaling to extract the best-fit spatial dimensionality of the
emergent geometry. We then show that entanglement perturbations on such
emergent geometries naturally give rise to local modifications of spatial
curvature which obey a (spatial) analog of Einstein's equation. The Hilbert
space corresponding to a region of flat space is finite-dimensional and scales
as the volume, though the entropy (and the maximum change thereof) scales like
the area of the boundary. A version of the ER=EPR conjecture is recovered, in
that perturbations that entangle distant parts of the emergent geometry
generate a configuration that may be considered as a highly quantum wormhole.
Comment: 37 pages, 5 figures. Updated notation, references, and acknowledgements.
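The classical-MDS step described above, extracting a best-fit spatial dimensionality from a distance matrix, can be sketched as follows; a flat 2-D grid stands in for the mutual-information distances (an illustrative assumption, not the paper's data):

```python
import numpy as np

# Points on a flat 2-D grid, recorded only through their distance matrix,
# as if the distances came from mutual information between Hilbert-space factors.
xs, ys = np.meshgrid(np.arange(6), np.arange(6))
P = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)

# Classical multidimensional scaling: double-center the squared distances.
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix

evals = np.sort(np.linalg.eigvalsh(B))[::-1]
# Significant positive eigenvalues count the best-fit spatial dimension.
dim = int((evals > 1e-8 * evals[0]).sum())
print(dim)                               # 2: the emergent geometry is flat 2-D
```

When the distances are only approximately Euclidean, the spectrum decays rather than truncates, and the dimension is read off from where the decay sets in.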
Local Kernels and the Geometric Structure of Data
We introduce a theory of local kernels, which generalize the kernels used in
the standard diffusion maps construction of nonparametric modeling. We prove
that evaluating a local kernel on a data set gives a discrete representation of
the generator of a continuous Markov process, which converges in the limit of
large data. We explicitly connect the drift and diffusion coefficients of the
process to the moments of the kernel. Moreover, when the kernel is symmetric,
the generator is the Laplace-Beltrami operator with respect to a geometry which
is influenced by the embedding geometry and the properties of the kernel. In
particular, this allows us to generate any Riemannian geometry by an
appropriate choice of local kernel. In this way, we continue a program of
Belkin, Niyogi, Coifman and others to reinterpret the current diverse
collection of kernel-based data analysis methods and place them in a geometric
framework. We show how to use this framework to design local kernels invariant
to various features of data. These data-driven local kernels can be used to
construct conformally invariant embeddings and reconstruct global
diffeomorphisms.
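A minimal sketch of the symmetric local-kernel case: with the standard Gaussian kernel of diffusion maps, the generator built from data on a manifold approximates (a multiple of) a Laplace-Beltrami operator, so on the circle Fourier modes should be near-eigenvectors. The circle and all parameter values are hypothetical choices for illustration:

```python
import numpy as np

n, eps = 400, 0.01
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
X = np.stack([np.cos(t), np.sin(t)], axis=1)    # uniform samples on S^1

D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-D2 / eps)                           # symmetric local (Gaussian) kernel
P = K / K.sum(axis=1, keepdims=True)            # Markov normalization
Lgen = (P - np.eye(n)) / eps                    # discrete generator of the process

g = np.cos(t)                                   # candidate Laplace-Beltrami eigenfunction
h = Lgen @ g
lam = (g @ h) / (g @ g)                         # Rayleigh quotient
resid = np.linalg.norm(h - lam * g) / np.linalg.norm(g)
print(lam, resid)   # lam < 0 and resid ~ 0: cos(t) is an eigenvector of the generator
```

A non-symmetric local kernel would add a drift term to the generator, which is how the theory recovers general Markov processes rather than only Laplacians.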