Chunk Reduction for Multi-Parameter Persistent Homology
The extension of persistent homology to multi-parameter setups is an
algorithmic challenge. Since most computation tasks scale badly with the size
of the input complex, an important pre-processing step consists of simplifying
the input while maintaining the homological information. We present an
algorithm that drastically reduces the size of an input. Our approach is an
extension of the chunk algorithm for persistent homology (Bauer et al.,
Topological Methods in Data Analysis and Visualization III, 2014). We show that
our construction produces the smallest multi-filtered chain complex among all
the complexes quasi-isomorphic to the input, improving on the guarantees of
previous work in the context of discrete Morse theory. Our algorithm also
offers an immediate parallelization scheme in shared memory. Already its
sequential version compares favorably with existing simplification schemes, as
we show by experimental evaluation.
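To make concrete what such preprocessing accelerates, here is a minimal sketch (all names hypothetical) of the classical one-parameter column reduction of a Z/2 boundary matrix, the kind of downstream computation that benefits from a smaller input complex. This is generic persistence reduction, not the paper's multi-parameter chunk algorithm.

```python
# Minimal sketch (hypothetical names) of classical Z/2 column reduction of a
# boundary matrix -- the one-parameter computation whose cost motivates
# shrinking the complex first. NOT the paper's multi-parameter algorithm.

def reduce_boundary_matrix(columns):
    """Reduce a Z/2 boundary matrix given as a list of sets of row indices."""
    low_to_col = {}                      # lowest row index -> owning column
    reduced = [set(col) for col in columns]
    for j in range(len(reduced)):
        col = reduced[j]
        while col:
            low = max(col)               # pivot: lowest (highest-index) row
            owner = low_to_col.get(low)
            if owner is None:
                low_to_col[low] = j      # column j now owns this pivot
                break
            col ^= reduced[owner]        # add owning column (mod 2), in place
    return reduced

# Toy filtered triangle: columns 0-2 are vertices (empty boundary),
# columns 3-5 are the edges (0,1), (1,2), (0,2).
cols = [set(), set(), set(), {0, 1}, {1, 2}, {0, 2}]
reduced = reduce_boundary_matrix(cols)
# Column 5 reduces to the empty set: the last edge closes a 1-cycle.
```

Since this reduction is sensitive to the number of columns, handing the solver a smaller quasi-isomorphic complex, as the chunk approach does, pays off directly.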
Delaunay Bifiltrations of Functions on Point Clouds
The Delaunay filtration Del(X) of a point cloud X ⊂ R^d is a central tool of computational topology. Its use is justified by the topological equivalence of Del(X) and the offset (i.e., union-of-balls) filtration of X. Given a function γ: X → R, we introduce a Delaunay bifiltration Del_γ(X) that satisfies an analogous topological equivalence, ensuring that Del_γ(X) topologically encodes the offset filtrations of all sublevel sets of γ, as well as the topological relations between them. Del_γ(X) is of size O(|X|^⌈(d+1)/2⌉), which for d odd matches the worst-case size of Del(X). Adapting the Bowyer-Watson algorithm for computing Delaunay triangulations, we give a simple, practical algorithm to compute Del_γ(X) in time O(|X|^(⌈(d+1)/2⌉+1)). Our implementation, based on CGAL, computes Del_γ(X) with modest overhead compared to computing Del(X), and handles tens of thousands of points in R^3 within seconds.
Comment: 28 pages, 7 figures, 8 tables. To appear in the proceedings of SODA 2024.
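To illustrate the object being encoded, here is a naive sketch that recomputes a plain Delaunay triangulation for each sublevel set of a toy function using SciPy. This shows only what the bifiltration captures at a few thresholds; it is not the paper's algorithm or its CGAL implementation, and the function `sublevel_delaunay`, the sample points, and the toy γ are all hypothetical.

```python
# Naive illustration: one Delaunay triangulation per sublevel set of a
# function gamma on the points. The paper's bifiltration encodes all of
# these (and the maps between them) in a single coherent complex; this
# per-threshold recomputation is far more wasteful and purely didactic.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
X = rng.random((50, 2))        # toy point cloud in R^2
gamma = X[:, 0]                # toy function on the points

def sublevel_delaunay(X, gamma, t):
    """Delaunay triangulation of the sublevel set {x : gamma(x) <= t}."""
    pts = X[gamma <= t]
    if len(pts) < 3:           # need at least d+1 affinely independent points
        return None
    return Delaunay(pts)

# Coarse "bifiltration by hand": one complex per threshold.
tris = {t: sublevel_delaunay(X, gamma, t) for t in (0.25, 0.5, 1.0)}
```

At t = 1.0 the sublevel set is all of X, so the last complex is just the ordinary Delaunay triangulation of the full cloud.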