The Morse theory of Čech and Delaunay complexes
Given a finite set of points in Euclidean space and a radius parameter, we
study the Čech, Delaunay–Čech, Delaunay (or Alpha), and Wrap complexes
in the light of generalized discrete Morse theory. Establishing the Čech
and Delaunay complexes as sublevel sets of generalized discrete Morse
functions, we prove that the four complexes are simple-homotopy equivalent by a
sequence of simplicial collapses, which are explicitly described by a single
discrete gradient field.
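To make the notion of simplicial collapse concrete, here is a minimal sketch (not the paper's construction, which uses a discrete gradient field) that greedily performs elementary collapses, removing a free face together with its unique proper coface, on a small complex, here a solid triangle:

```python
from itertools import combinations

def collapse(complex_):
    """Greedily perform elementary collapses on a simplicial complex,
    given as a set of frozensets of vertices (closed under faces).
    A simplex sigma is free if it has exactly one proper coface tau;
    removing the pair (sigma, tau) preserves the homotopy type."""
    cx = set(complex_)
    changed = True
    while changed:
        changed = False
        for sigma in sorted(cx, key=len):
            cofaces = [tau for tau in cx if sigma < tau]
            if len(cofaces) == 1:
                cx -= {sigma, cofaces[0]}   # elementary collapse
                changed = True
                break
    return cx

# A solid triangle: all nonempty subsets of {0, 1, 2}.
triangle = {frozenset(s) for k in range(1, 4)
            for s in combinations(range(3), k)}
print(collapse(triangle))  # collapses to a single vertex
```

Running this on the solid triangle removes an edge together with the 2-cell, then peels off the remaining free edges, leaving one vertex, a tiny instance of a simple-homotopy equivalence realized by collapses.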
Topology of random simplicial complexes: a survey
This expository article is based on a lecture from the Stanford Symposium on
Algebraic Topology: Applications and New Directions, held in honor of Gunnar
Carlsson, Ralph Cohen, and Ib Madsen.
On the choice of weight functions for linear representations of persistence diagrams
Persistence diagrams are efficient descriptors of the topology of a point
cloud. As they do not naturally belong to a Hilbert space, standard statistical
methods cannot be directly applied to them. Instead, feature maps (or
representations) are commonly used for the analysis. A large class of feature
maps, which we call linear, depends on some weight functions, the choice of
which is a critical issue. An important criterion to choose a weight function
is to ensure stability of the feature maps with respect to Wasserstein
distances on diagrams. We improve known results on the stability of such maps,
and extend it to general weight functions. We also address the choice of the
weight function in an asymptotic setting: assume that the data form an
i.i.d. sample from a density. For the
Čech and Rips filtrations, we characterize the weight functions for which
the corresponding feature maps converge as the sample size approaches infinity, and by
doing so, we prove laws of large numbers for the total persistences of such
diagrams. These two approaches (stability and convergence) lead to the same
simple heuristic for tuning weight functions: if the data lie near a
low-dimensional manifold, then a sensible choice of weight function is a
power of the persistence, with exponent governed by the dimension of the manifold.
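As a toy illustration of such a linear representation (the specific weight is our choice for the example, not prescribed by the paper), the following computes a weighted total persistence, the sum over diagram points of w(p) with w(p) = pers(p)^a:

```python
def weighted_total_persistence(diagram, a=2.0):
    """Simplest linear representation of a persistence diagram:
    sum over points of the weight w(p) = (death - birth)^a.
    `diagram` is a list of finite (birth, death) pairs."""
    return sum((d - b) ** a for b, d in diagram)

# Toy diagram: two prominent features and one low-persistence point.
D = [(0.0, 1.0), (0.2, 0.9), (0.5, 0.55)]
print(weighted_total_persistence(D, a=2.0))  # ≈ 1.4925
```

With a larger exponent a, the low-persistence point (0.5, 0.55) contributes even less, which is exactly the down-weighting of near-diagonal points that the stability results motivate.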
The Density of Expected Persistence Diagrams and its Kernel Based Estimation
Persistence diagrams play a fundamental role in Topological Data Analysis, where they are used as topological descriptors of filtrations built on top of data. They consist of discrete multisets of points in the plane R^2 that can equivalently be seen as discrete measures in R^2. When the data come as a random point cloud, these discrete measures become random measures whose expectation is studied in this paper. First, we show that for a wide class of filtrations, including the Čech and Rips-Vietoris filtrations, the expected persistence diagram, which is a deterministic measure on R^2, has a density with respect to the Lebesgue measure. Second, building on the previous result, we show that the persistence surface recently introduced in [Adams et al., 2017] can be seen as a kernel estimator of this density. We propose a cross-validation scheme for selecting an optimal bandwidth, which is proven to be a consistent procedure to estimate the density.
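A persistence-surface-style estimator can be sketched as a weighted sum of Gaussian kernels centered at the diagram points; the weight and bandwidth choices below are illustrative assumptions, not the exact estimator analyzed in the paper:

```python
import math

def persistence_surface(diagram, z, bandwidth=0.1):
    """Evaluate a kernel sum at z = (x, y) in the birth-death plane:
    sum over diagram points p of w(p) * K_h(z - p), with an isotropic
    Gaussian kernel K_h and weight w(p) = persistence of p."""
    zx, zy = z
    total = 0.0
    for b, d in diagram:
        w = d - b                              # weight: persistence
        sq = (zx - b) ** 2 + (zy - d) ** 2     # squared distance to p
        total += w * math.exp(-sq / (2 * bandwidth ** 2)) \
                   / (2 * math.pi * bandwidth ** 2)
    return total

D = [(0.0, 1.0), (0.2, 0.9)]
print(persistence_surface(D, z=(0.0, 1.0), bandwidth=0.1))
```

Averaging such surfaces over many diagrams sampled from the same random point cloud gives a kernel estimate of the density of the expected persistence diagram, and the bandwidth plays the usual bias-variance role that the paper's cross-validation scheme addresses.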
Simplicial Data Analysis: theory, practice, and algorithms
Simplicial complexes store in discrete form key information on a topological space, and have been used in mathematics to introduce combinatorial and discrete tools in geometry and topology. They represent a topological space as a collection of ‘simple elements’ (such as vertices, edges, triangles, tetrahedra, and more general simplices) that are glued to each other in a structured manner. In the last 40 years, they have been a basic tool in computer visualization for storing and classifying different shapes of 3D images; then in the early 2000s these techniques were successfully applied to more general data, not necessarily sampled from a metric space.
The use of techniques borrowed from algebraic topology has been very successful in analysing data from various fields: genomics, sensor analysis, brain connectomics, fMRI data, and trade networks, and new fields of application are being tested every day. Regrettably, topological data analysis has been used mainly as a qualitative method, the problem being the lack of proper tools to perform effective statistical analysis.
The first models for random simplicial complexes, building on well-established techniques from random graph theory, have been introduced in recent years, but none of them can be used effectively in a quantitative analysis of data. We introduce a model that can successfully serve as a null model for simplicial complexes, as it fixes the size distribution of facets.
Another challenge is to identify a simplicial complex which correctly encodes the topological space from which the initial data set is sampled. The most common solution is to build nested simplicial complexes and study the evolution of their features. A recent study uncovered that the problem can reside in making wrong assumptions about the space of data. We propose a categorical reasoning which clarifies the cause of these misconceptions. We introduce a new category for weighted graphs and study its relation to other common categories when the weights are chosen in a poset.
The construction of the appropriate simplicial complex is not the only obstacle one faces when applying topological methods to real data. Available algorithms for extracting homological features have memory and time complexity that scales exponentially in the number of simplices, making these techniques unsuitable for the analysis of ‘big data’. We propose a quantum algorithm which is able to track in logarithmic time the evolution of a quantum version of well-known homological features along a filtration of simplicial complexes.
Strange Random Topology of the Circle
We characterise the high-dimensional topology that arises from a random Čech
complex constructed on the circle. The expected Euler characteristic curve is
computed, where we observe limiting spikes. The spikes correspond to expected
Betti numbers growing arbitrarily large over shrinking intervals of filtration
radii. Using the fact that the homotopy type of the random Čech complex is
either an odd-dimensional sphere or a bouquet of even-dimensional spheres, we
give probabilistic bounds on the homotopy types. By departing from the
conventional practice of scaling down filtration radii as the sample size grows
large, our findings indicate that the full breadth of filtration radii leads to
interesting systematic behaviour that cannot be regarded as "topological
noise".
Topological and geometric inference of data
The overarching problem under consideration is to determine the structure
of the subspace on which a distribution is supported, given
only a finite noisy sample thereof. The special case in
which the subspace is an embedded manifold is given particular
attention owing to its conceptual elegance, and asymptotic bounds are
obtained on the admissible level of noise such that the
manifold can be recovered up to homotopy equivalence.
Attention is then turned to how to accomplish this in practice.
Following ideas from topological data analysis, simplicial complexes are used
as discrete analogues of spaces suitable for computation. By utilising
the prior assumption that the data lie on a manifold, topologically
inspired techniques are proposed for refining the simplicial complex
to better approximate this manifold. This is applied to the
problem of nonlinear dimensionality reduction and found to improve accuracy
of reconstructing several synthetic and real-world datasets.
The second chapter focuses on extending this work to the
case where the ambient space is non-Euclidean. The interfaces between
topological data analysis, functional data analysis, and shape analysis
are thoroughly explored. Lipschitz bounds are proved which relate several
metrics on the space of positive semidefinite matrices; they are then
interpreted in the context of topological data analysis. This is
applied to diffusion tensor imaging and phonology.
The final chapter explores the case where the points are
non-uniformly distributed over the embedded subspace. In particular, a method
is proposed to overcome the shortcomings of witness complex construction
when there are large deviations in the density. The theory
of multidimensional persistence is leveraged to provide a succinct setting
in which the structure of the data can be interpreted
as a generalised stratified space.
IST Austria Thesis
The main objects considered in the present work are simplicial and CW-complexes with vertices forming a random point cloud. In particular, we consider a Poisson point process in R^n and study Delaunay and Voronoi complexes of the first and higher orders and weighted Delaunay complexes obtained as sections of Delaunay complexes, as well as the Čech complex. Further, we examine the Delaunay complex of a Poisson point process on the sphere S^n, as well as of a uniform point cloud, which is equivalent to the convex hull, providing a connection to the theory of random polytopes. Each of the complexes in question can be endowed with a radius function, which maps its cells to the radii of appropriately chosen circumspheres, called the radius of the cell. Applying and developing discrete Morse theory for these functions, joining it together with probabilistic and sometimes analytic machinery, and developing several integral geometric tools, we aim to obtain the distributions of circumradii of typical cells. For all considered complexes, we are able to generalize and obtain up to constants the distribution of radii of typical intervals of all types. In low dimensions the constants can be computed explicitly, thus providing explicit expressions for the expected numbers of cells. In particular, this allows us to find the expected density of simplices of every dimension for a Poisson point process in R^4, whereas the result for R^3 was already known in the 1970s.
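For a planar Delaunay triangle, the radius function described above evaluates to the classical circumradius R = abc / (4K), where a, b, c are the side lengths and K the area; a minimal sketch of this one ingredient:

```python
import math

def circumradius(p, q, s):
    """Circumradius of the triangle pqs in the plane, via R = abc / (4K),
    with a, b, c the side lengths and K the triangle's area."""
    a = math.dist(q, s)
    b = math.dist(p, s)
    c = math.dist(p, q)
    # Twice the area, from the cross product of two edge vectors.
    area2 = abs((q[0]-p[0])*(s[1]-p[1]) - (q[1]-p[1])*(s[0]-p[0]))
    return a * b * c / (2 * area2)

# Right triangle with legs 3 and 4: circumradius is half the hypotenuse.
print(circumradius((0, 0), (3, 0), (0, 4)))  # → 2.5
```

Applied to every cell of a Delaunay mosaic built on a random point cloud, this is the radius function whose sublevel sets and typical-cell distributions the thesis studies.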
Refinement of Interval Approximations for Fully Commutative Quivers
A fundamental challenge in multiparameter persistent homology is the absence
of a complete and discrete invariant. To address this issue, we propose an
enhanced framework that realizes a holistic understanding of a fully
commutative quiver's representation via synthesizing interpretations obtained
from intervals. Additionally, it provides a mechanism to tune the balance
between approximation resolution and computational complexity. This framework
is evaluated on commutative ladders of both finite-type and infinite-type. For
the former, we discover an efficient method for the indecomposable
decomposition leveraging solely one-parameter persistent homology. For the
latter, we introduce a new invariant that reveals persistence in the second
parameter by connecting two standard persistence diagrams using interval
approximations. We subsequently present several models for constructing
commutative ladder filtrations, offering fresh insights into random filtrations
and demonstrating our toolkit's effectiveness in analyzing the topology of
materials.
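For context, the one-parameter persistent homology that the first method leverages can be computed with the standard column-reduction algorithm on the boundary matrix; a minimal mod-2 sketch (our illustration, on a toy filtration of a triangle):

```python
def persistence_pairs(filtration):
    """Standard persistence algorithm via mod-2 column reduction.
    `filtration` lists simplices (tuples of vertex ids) in order of
    appearance; every face must appear before its cofaces. Returns
    (pairs, essential): birth/death index pairs plus the indices of
    classes that never die."""
    index = {tuple(sorted(s)): i for i, s in enumerate(filtration)}
    cols = []
    for s in filtration:
        s = tuple(sorted(s))
        # Boundary of s: indices of its codimension-1 faces.
        cols.append(set() if len(s) == 1 else
                    {index[s[:k] + s[k+1:]] for k in range(len(s))})
    low_of = {}   # lowest nonzero index -> column that owns it
    pairs = []
    for j, col in enumerate(cols):
        while col and max(col) in low_of:
            col ^= cols[low_of[max(col)]]   # mod-2 column addition
        if col:
            low_of[max(col)] = j
            pairs.append((max(col), j))
    dead = {i for pair in pairs for i in pair}
    essential = [j for j in range(len(cols)) if j not in dead]
    return pairs, essential

# Filtration of a triangle boundary, then the filling 2-cell.
filt = [(0,), (1,), (2,), (0, 1), (1, 2), (0, 2), (0, 1, 2)]
print(persistence_pairs(filt))  # → ([(1, 3), (2, 4), (5, 6)], [0])
```

Here the two finite H0 bars record vertices merging into one component, the essential class at index 0 is the surviving component, and the bar (5, 6) is the loop born with the last edge and killed by the 2-cell; the paper's interval approximations connect two such one-parameter computations across a second parameter.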
Abel Symposia
Discrete Morse theory has recently led to new developments in the theory of random geometric complexes. This article surveys the methods and results obtained with this new approach, and discusses some of its shortcomings. It uses simulations to illustrate the results and to form conjectures, obtaining numerical estimates for combinatorial, topological, and geometric properties of weighted and unweighted Delaunay mosaics, their dual Voronoi tessellations, and the Alpha and Wrap complexes contained in the mosaics.