18 research outputs found
Polynomial-Sized Topological Approximations Using The Permutahedron
Classical methods to model topological properties of point clouds, such as
the Vietoris-Rips complex, suffer from the combinatorial explosion of complex
sizes. We propose a novel technique to approximate a multi-scale filtration of
the Rips complex with improved bounds for size: precisely, for n points in
R^d, we obtain a O(d)-approximation with at most n2^{O(d log k)} simplices of dimension k or lower. In conjunction with dimension
reduction techniques, our approach yields a O(polylog(n))-approximation of size n^{O(1)} for Rips filtrations on arbitrary metric
spaces. This result stems from high-dimensional lattice geometry and exploits
properties of the permutahedral lattice, a well-studied structure in discrete
geometry.
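The permutahedral lattice's Voronoi cell, the permutahedron, is easy to experiment with directly. As a small illustrative sketch (not taken from the paper), the order-n permutahedron is the convex hull of all permutations of (1, …, n); the snippet below enumerates its vertices and checks two standard facts: there are n! of them, and they all lie on a common hyperplane.

```python
# Toy illustration: the order-n permutahedron is the convex hull of all
# permutations of (1, 2, ..., n). We enumerate its vertices and check that
# there are n! of them and that they all lie on the hyperplane
# x_1 + ... + x_n = n(n+1)/2.
from itertools import permutations

def permutahedron_vertices(n):
    """Return the vertices of the order-n permutahedron in R^n."""
    return list(permutations(range(1, n + 1)))

verts = permutahedron_vertices(4)
print(len(verts))               # 4! = 24 vertices
print({sum(v) for v in verts})  # every vertex sums to 4*5/2 = 10
```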
Building on the same geometric concept, we also present a lower bound result
on the size of an approximate filtration: we construct a point set for which
every (1+ε)-approximation of the Čech filtration has to contain n^{Ω(log log n)}
features, provided that ε < 1/log^{1+c} n for some constant c ∈ (0,1).
Comment: 24 pages, 1 figure
No-Dimensional Tverberg Theorems and Algorithms
Tverberg's theorem is a classic result in discrete geometry. It states that
for any integer k ≥ 2 and any finite d-dimensional point set P of at least (d+1)(k−1)+1 points, we can partition P
into k subsets whose convex hulls have a non-empty intersection. The
computational problem of finding such a partition lies in the complexity class
CLS = PPAD ∩ PLS, but no hardness results are known.
Tverberg's theorem also has a colorful variant: the points in P have colors,
and under certain conditions, P can be partitioned into colorful sets, i.e.,
sets in which each color appears exactly once, such that the convex hulls of the
sets intersect.
Recently, Adiprasito, Bárány, and Mustafa [SODA 2019] proved a no-dimensional
version of Tverberg's theorem, in which the convex hulls of the sets in the
partition may intersect in an approximate fashion, relaxing the requirement on
the cardinality of P. The argument is constructive, but it does not result in
a polynomial-time algorithm.
We present an alternative proof for a no-dimensional Tverberg theorem that
leads to an efficient algorithm to find the partition. More specifically, we
show a deterministic algorithm that finds, for any set P of n points in R^d and any
k ∈ {2, …, n}, in O(nd⌈log k⌉) time a partition of P into k subsets such that
there is a ball of radius O(√(k/n)·diam(P)) intersecting the convex hull
of each subset. A similar result also holds for the colorful version.
To obtain our result, we generalize Sarkaria's tensor product construction
[Israel Journal Math., 1992], which reduces the Tverberg problem to the Colorful
Carathéodory problem. By carefully choosing the vectors used in the tensor
products, we implement the reduction in an efficient manner.
Comment: A shorter version will appear at SoCG 2020
No-Dimensional Tverberg Theorems and Algorithms
Tverberg’s theorem states that for any k ≥ 2 and any set P ⊂ R^d of at least (d+1)(k−1)+1 points in d dimensions, we can partition P into k subsets whose convex hulls have a non-empty intersection. The associated search problem of finding the partition lies in the complexity class CLS = PPAD ∩ PLS, but no hardness results are known. In the colorful Tverberg theorem, the points in P have colors, and under certain conditions, P can be partitioned into colorful sets, in which each color appears exactly once and whose convex hulls intersect. To date, the complexity of the associated search problem is unresolved. Recently, Adiprasito, Bárány, and Mustafa (SODA 2019) gave a no-dimensional Tverberg theorem, in which the convex hulls may intersect in an approximate fashion. This relaxes the requirement on the cardinality of P. The argument is constructive, but does not result in a polynomial-time algorithm. We present a deterministic algorithm that finds, for any n-point set P ⊂ R^d and any k ∈ {2, …, n}, in O(nd⌈log k⌉) time a k-partition of P such that there is a ball of radius O(√(k/n)·diam(P)) that intersects the convex hull of each set. Given that this problem is not known to be solvable exactly in polynomial time, our result provides a remarkably efficient and simple new notion of approximation. Our main contribution is to generalize Sarkaria’s method (Israel Journal Math., 1992) to reduce the Tverberg problem to the colorful Carathéodory problem (in the simplified tensor product interpretation of Bárány and Onn) and to apply it algorithmically. It turns out that this not only leads to an alternative algorithmic proof of a no-dimensional Tverberg theorem, but it also generalizes to other settings, such as the colorful variant of the problem.
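The radius bound O(√(k/n)·diam(P)) can be made tangible with a quick randomized experiment (this is not the paper's deterministic algorithm, just an illustration of why the bound is plausible): each subset's centroid lies in its convex hull, and the centroid of a random subset of size n/k concentrates around the global centroid.

```python
# Randomized sanity check (NOT the paper's deterministic algorithm): split
# n random points into k equal parts. Each part's centroid lies in its
# convex hull, and the centroid of a random subset of size n/k typically
# stays within O(sqrt(k/n) * diam(P)) of the global centroid, so a ball of
# that radius around the global centroid meets every hull.
import math
import random

random.seed(0)
n, d, k = 400, 5, 8
pts = [[random.random() for _ in range(d)] for _ in range(n)]

def centroid(ps):
    return [sum(p[i] for p in ps) / len(ps) for i in range(d)]

diam = max(math.dist(p, q) for p in pts for q in pts)  # O(n^2), fine here

random.shuffle(pts)
parts = [pts[i::k] for i in range(k)]            # balanced k-partition
g = centroid(pts)
max_dev = max(math.dist(centroid(part), g) for part in parts)
bound = math.sqrt(k / n) * diam

print(f"max centroid deviation {max_dev:.3f} vs bound {bound:.3f}")
```

For this seed the maximal deviation falls comfortably below the bound; the actual algorithm achieves this guarantee deterministically for every input.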
Approximation algorithms for Vietoris-Rips and Čech filtrations
Persistent Homology is a tool to analyze and visualize the shape of data from a topological viewpoint. It computes persistence, which summarizes the evolution of topological and geometric information about metric spaces over multiple scales of distances. While computing persistence is quite efficient for low-dimensional topological features, it becomes overwhelmingly expensive for medium to high-dimensional features. In this thesis, we attack this computational problem from several different angles. We present efficient techniques to approximate the persistence of metric spaces. Three of our methods are tailored towards general point clouds in Euclidean spaces. We make use of high dimensional lattice geometry to reduce the cost of the approximations. In particular, we discover several properties of the Permutahedral lattice, whose Voronoi cell is well-known for its combinatorial properties. The last method is suitable for point clouds with low intrinsic dimension, where we exploit the structural properties of the point set to tame the complexity. In some cases, we achieve a reduction in size complexity by trading off the quality of the approximation. Two of our methods work particularly well in conjunction with dimension-reduction techniques: we arrive at the first approximation schemes whose complexities are only polynomial in the size of the point cloud, and independent of the ambient dimension. On the other hand, we provide a lower bound result: we construct a point cloud that requires super-polynomial complexity for a high-quality approximation of the persistence. Together with our approximation schemes, we show that polynomial complexity is achievable for rough approximations, but impossible for sufficiently fine approximations. For some metric spaces, the intrinsic dimension is low in small neighborhoods of the input points, but much higher for large scales of distances. We develop a concept of local intrinsic dimension to capture this property. 
We also present several applications of this concept, including an approximation method for persistence. This thesis is written in English.
Coxeter triangulations have good quality
Coxeter triangulations are triangulations of Euclidean space based on a single simplex. By this we mean that, given an individual simplex, we can recover the entire triangulation of Euclidean space by inductively reflecting in the faces of the simplex. In this paper we establish that the quality of the simplices in all Coxeter triangulations is O(1/√d) of the quality of a regular simplex. We further investigate the Delaunay property for these triangulations. Moreover, we consider an extension of the Delaunay property, namely protection, which is a measure of non-degeneracy of a Delaunay triangulation. In particular, one family of Coxeter triangulations achieves protection O(1/d²). We conjecture that both bounds are optimal for triangulations in Euclidean space.
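The defining reflection property is easy to check in the plane, where the equilateral triangle generates a Coxeter triangulation. A minimal 2-D sketch (a toy example, not taken from the paper): reflecting a vertex across the opposite edge produces a congruent triangle sharing that edge.

```python
# Toy 2-D illustration: the equilateral triangle is the simplex of a
# Coxeter triangulation of the plane. Reflecting it in any edge yields a
# congruent triangle sharing that edge, so repeated reflections tile the
# plane.
import math

def reflect(p, a, b):
    """Reflect point p across the line through a and b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    fx, fy = ax + t * dx, ay + t * dy      # foot of the perpendicular
    return (2 * fx - px, 2 * fy - py)

A, B, C = (0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)
C2 = reflect(C, A, B)                      # mirror image across edge AB

def edge_lengths(p, q, r):
    return sorted(math.dist(u, v) for (u, v) in [(p, q), (q, r), (r, p)])

# The reflected triangle A, B, C2 is congruent to A, B, C (unit edges):
print(edge_lengths(A, B, C2))   # ~[1.0, 1.0, 1.0]
```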
Improved Approximate Rips Filtrations with Shifted Integer Lattices
Rips complexes are important structures for analyzing topological features of metric spaces. Unfortunately, generating these complexes constitutes an expensive task because of a combinatorial explosion in the complex size. For n points in R^d, we present a scheme to construct a 4.24-approximation of the multi-scale filtration of the Rips complex in the L-infinity metric, which extends to a O(d^{0.25})-approximation of the Rips filtration for the Euclidean case. The k-skeleton of the resulting approximation has a total size of n2^{O(d log k)}. The scheme is based on the integer lattice and on the barycentric subdivision of the d-cube.
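The basic lattice idea behind such schemes can be sketched in a few lines (a simplified illustration, not the paper's barycentric-subdivision construction): snapping points to a scaled integer lattice perturbs L-infinity distances by at most an additive term while collapsing nearby points, which shrinks the complex.

```python
# Minimal sketch of lattice-based sparsification (not the paper's exact
# scheme): snap each point to the nearest point of the scaled integer
# lattice s*Z^d. Each point moves by at most s/2 in the L-infinity metric,
# so pairwise L-infinity distances change by at most an additive s, and
# coincident snapped points merge, reducing the complex size.
import random

random.seed(1)
s, d, n = 0.5, 3, 200
pts = [tuple(random.random() for _ in range(d)) for _ in range(n)]

def snap(p, s):
    return tuple(s * round(x / s) for x in p)

def linf(p, q):
    return max(abs(a - b) for a, b in zip(p, q))

snapped = {snap(p, s) for p in pts}                 # duplicates merge
max_move = max(linf(p, snap(p, s)) for p in pts)

print(len(snapped), "of", n, "points remain")
print("max per-point move:", max_move)              # at most s/2 = 0.25
```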
Computational Complexity of the α-Ham-Sandwich Problem
The classic Ham-Sandwich theorem states that for any d measurable sets in R^d, there is a hyperplane that bisects them simultaneously. An extension by Bárány, Hubard, and Jerónimo [DCG 2008] states that if the sets are convex and well-separated, then for any given fractions α_1, …, α_d ∈ [0, 1], there is a unique oriented hyperplane that cuts off a respective fraction α_1, …, α_d from each set. Steiger and Zhao [DCG 2010] proved a discrete analogue of this theorem, which we call the α-Ham-Sandwich theorem. They gave an algorithm to find the hyperplane in time O(n (log n)^{d-3}), where n is the total number of input points. The computational complexity of this search problem in high dimensions is open, quite unlike the complexity of the Ham-Sandwich problem, which is now known to be PPA-complete (Filos-Ratsikas and Goldberg [STOC 2019]).
Recently, Fearnley, Gordon, Mehta, and Savani [ICALP 2019] introduced a new sub-class of CLS (Continuous Local Search) called Unique End-of-Potential Line (UEOPL). This class captures problems in CLS that have unique solutions. We show that for the α-Ham-Sandwich theorem, the search problem of finding the dividing hyperplane lies in UEOPL. This gives the first non-trivial containment of the problem in a complexity class and places it in the company of classic search problems such as finding the fixed point of a contraction map, the unique sink orientation problem, and the P-matrix linear complementarity problem.
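For intuition, the one-dimensional case of the discrete setting is trivial (a toy illustration only; the paper's results concern the higher-dimensional problem, where the complexity is subtle): a hyperplane in R^1 is a point, and the unique α-cut is an order statistic.

```python
# Toy d = 1 illustration of the discrete alpha-cut: a "hyperplane" on the
# line is a single point, and the unique cut with exactly ceil(alpha * n)
# input points on or to its left is the ceil(alpha * n)-th smallest point
# (unique whenever the points are distinct). The hardness discussed in the
# paper only appears in higher dimensions.
import math

def alpha_cut(points, alpha):
    """Return the ceil(alpha*n)-th smallest point (1-D alpha-cut)."""
    k = math.ceil(alpha * len(points))
    return sorted(points)[k - 1]

print(alpha_cut([0.9, 0.1, 0.4, 0.7, 0.3], 0.4))   # 2nd of 5 -> 0.3
```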
Sparse Nerves in Practice
Topological data analysis combines machine learning with methods from
algebraic topology. Persistent homology, a method to characterize topological
features occurring in data at multiple scales, is of particular interest. A
major obstacle to the widespread use of persistent homology is its
computational complexity. In order to be able to calculate persistent homology
of large datasets, a number of approximations can be applied in order to reduce
its complexity. We propose algorithms for calculation of approximate sparse
nerves for classes of Dowker dissimilarities, including all finite Dowker
dissimilarities and Dowker dissimilarities whose homology is Čech persistent
homology. All other sparsification methods and software packages that we are
aware of calculate persistent homology with either an additive or a
multiplicative interleaving. In dowker_homology, we allow for any
non-decreasing interleaving function. We analyze the computational
complexity of the algorithms and present some benchmarks. For Euclidean data in
dimensions larger than three, the sizes of simplicial complexes we create are
in general smaller than the ones created by SimBa. Especially when calculating
persistent homology in higher homology dimensions, the differences can become
substantial.
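As a minimal illustration of what such computations do at the simplest level (this is not the dowker_homology algorithm, which handles general interleavings and higher dimensions): 0-dimensional persistent homology of a finite dissimilarity matrix reduces to Kruskal-style union-find, where each merge of two components records a death time.

```python
# Minimal sketch (NOT the dowker_homology algorithm): 0-dimensional
# persistent homology of a finite dissimilarity matrix. All components are
# born at scale 0; processing edges by increasing dissimilarity, each union
# of two components kills one class, so the merge scales are the finite
# death times (one class lives forever).
def h0_deaths(dist):
    """Return the sorted finite death times of 0-dim classes."""
    n = len(dist)
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    edges = sorted((dist[i][j], i, j) for i in range(n) for j in range(i + 1, n))
    deaths = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(w)                # a component dies at scale w
    return deaths

# Four points on a line: components merge at scales 1, 2, 3.
pts = [0.0, 1.0, 3.0, 6.0]
dist = [[abs(a - b) for b in pts] for a in pts]
print(h0_deaths(dist))   # [1.0, 2.0, 3.0]
```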