The Topology ToolKit
This system paper presents the Topology ToolKit (TTK), a software platform
designed for topological data analysis in scientific visualization. TTK
provides a unified, generic, efficient, and robust implementation of key
algorithms for the topological analysis of scalar data, including: critical
points, integral lines, persistence diagrams, persistence curves, merge trees,
contour trees, Morse-Smale complexes, fiber surfaces, continuous scatterplots,
Jacobi sets, Reeb spaces, and more. TTK is easily accessible to end users due
to a tight integration with ParaView. It is also easily accessible to
developers through a variety of bindings (Python, VTK/C++) for fast prototyping,
or through direct, dependency-free C++, to ease integration into pre-existing
complex systems. While developing TTK, we faced several algorithmic and
software engineering challenges, which we document in this paper. In
particular, we present an algorithm for the construction of a discrete gradient
that complies with the critical points extracted in the piecewise-linear setting.
This algorithm guarantees combinatorial consistency across the topological
abstractions supported by TTK and, importantly, enables a unified implementation
of topological data simplification for multi-scale exploration and analysis. We
also present a cached triangulation data structure that supports time-efficient,
generic traversals, self-adjusts its memory usage on demand for input simplicial
meshes, and implicitly emulates a triangulation for regular grids with no memory
overhead. Finally, we describe an original software architecture that guarantees
memory-efficient, direct access to TTK's features, while still offering
researchers powerful and easy bindings and extensions. TTK is open source (BSD
license); its code, online documentation, and video tutorials are available on
TTK's website.
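As a concrete illustration of one descriptor family listed above, the following is a minimal, self-contained sketch of 0-dimensional persistence (the pairing underlying persistence diagrams and merge trees) for a 1-D piecewise-linear field, computed with a union-find sweep. This illustrates the concept only; it is not TTK's implementation or API.

```python
def persistence_0d(values):
    """0-dimensional persistence pairs of a 1-D scalar field.

    Sweep vertices by increasing value; a component is born at a local
    minimum and dies when it merges into a component with a lower
    minimum (the "elder rule").  Assumes `values` is a list of numbers.
    """
    parent, minimum = {}, {}   # union-find parent, and each root's minimum

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    pairs = []
    order = sorted(range(len(values)), key=values.__getitem__)
    for i in order:
        parent[i] = i
        minimum[i] = i
        for j in (i - 1, i + 1):
            if j not in parent:
                continue                    # neighbour not yet swept
            ri, rj = find(i), find(j)
            if ri == rj:
                continue
            a, b = minimum[ri], minimum[rj]
            old, young = (ri, rj) if values[a] <= values[b] else (rj, ri)
            if minimum[young] != i:         # skip trivial zero-persistence pairs
                pairs.append((values[minimum[young]], values[i]))
            parent[young] = old             # the older minimum survives
    pairs.append((values[order[0]], float("inf")))  # global minimum never dies
    return pairs
```

For `[0, 3, 1, 4, 2, 5]`, the two non-global minima (values 1 and 2) die at the merge saddles with values 3 and 4, respectively.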
Slice, Simplify and Stitch: Topology-Preserving Simplification Scheme for Massive Voxel Data
We focus on efficient computations of topological descriptors for voxel data. This type of data includes 2D greyscale images and 3D medical scans, as well as higher-dimensional scalar fields arising from physical simulations. In recent years we have seen an increase in applications of topological methods for such data. However, computational issues remain an obstacle.
We therefore propose a streaming scheme which simplifies large 3-dimensional voxel data while provably retaining its persistent homology. We combine this scheme with an efficient boundary matrix reduction implementation, obtaining an end-to-end tool for persistent homology of large data. Computational experiments show its state-of-the-art performance. In particular, we are now able to robustly handle complex datasets with several billion voxels on a regular laptop.
A software implementation called Cubicle is available as open-source: https://bitbucket.org/hubwag/cubicle
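Tools of this kind operate on filtered cubical complexes built from the voxel grid. As a hedged sketch (the standard construction in which each cell takes the maximum of its pixels, not Cubicle's internal data structures), the following builds the filtered cubical complex of a small 2-D image:

```python
def cubical_complex(img):
    """Filtered cubical complex of a 2-D image.

    Cells are keyed by the tuple of pixel coordinates they span and map
    to (dimension, filtration value, list of boundary cell keys).  Each
    cell's value is the maximum of the pixels it spans.
    """
    rows, cols = len(img), len(img[0])
    cells = {}
    for r in range(rows):
        for c in range(cols):
            cells[((r, c),)] = (0, img[r][c], [])           # vertex
            if c + 1 < cols:                                 # horizontal edge
                cells[((r, c), (r, c + 1))] = (
                    1, max(img[r][c], img[r][c + 1]),
                    [((r, c),), ((r, c + 1),)])
            if r + 1 < rows:                                 # vertical edge
                cells[((r, c), (r + 1, c))] = (
                    1, max(img[r][c], img[r + 1][c]),
                    [((r, c),), ((r + 1, c),)])
            if r + 1 < rows and c + 1 < cols:                # square
                quad = ((r, c), (r, c + 1), (r + 1, c), (r + 1, c + 1))
                val = max(img[p[0]][p[1]] for p in quad)
                bnd = [((r, c), (r, c + 1)),
                       ((r + 1, c), (r + 1, c + 1)),
                       ((r, c), (r + 1, c)),
                       ((r, c + 1), (r + 1, c + 1))]
                cells[quad] = (2, val, bnd)
    return cells
```

Sorting the cells by (value, dimension) yields exactly the filtered boundary columns that a boundary matrix reduction consumes.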
Streaming Algorithm for Euler Characteristic Curves of Multidimensional Images
We present an efficient algorithm to compute Euler characteristic curves of
grayscale images of arbitrary dimension. In various applications the Euler
characteristic curve is used as a descriptor of an image.
Our algorithm is the first streaming algorithm for Euler characteristic
curves. Streaming removes the need to store the entire image
in RAM. Experiments show that our implementation handles terabyte scale images
on commodity hardware. Due to lock-free parallelism, it scales well with the
number of processor cores. Our software---CHUNKYEuler---is available as open
source on Bitbucket.
Additionally, we put the concept of the Euler characteristic curve in the
wider context of computational topology. In particular, we explain the
connection with persistence diagrams.
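A minimal non-streaming sketch of the quantity being computed (not the CHUNKYEuler algorithm itself): for each threshold, the Euler characteristic of the sublevel set of a 2-D image is #vertices - #edges + #squares over the cubical cells whose value (the maximum of their pixels) does not exceed the threshold.

```python
def euler_characteristic_curve(img, thresholds):
    """Euler characteristic of each sublevel set of a 2-D grayscale image."""
    rows, cols = len(img), len(img[0])
    contributions = []   # (value at which the cell appears, +1 or -1)
    for r in range(rows):
        for c in range(cols):
            contributions.append((img[r][c], +1))                  # vertex
            if c + 1 < cols:                                       # edge
                contributions.append((max(img[r][c], img[r][c + 1]), -1))
            if r + 1 < rows:                                       # edge
                contributions.append((max(img[r][c], img[r + 1][c]), -1))
            if r + 1 < rows and c + 1 < cols:                      # square
                m = max(img[r][c], img[r][c + 1],
                        img[r + 1][c], img[r + 1][c + 1])
                contributions.append((m, +1))
    return [sum(s for v, s in contributions if v <= t) for t in thresholds]
```

For a 3x3 image that is 0 everywhere except a high center pixel, the sublevel set at threshold 0 is a loop (Euler characteristic 0), which fills in to a disk (Euler characteristic 1) once the center is included. The streaming version described in the abstract accumulates the same per-cell contributions while visiting the image slice by slice, never holding it whole in RAM.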
Skeletonization and Partitioning of Digital Images Using Discrete Morse Theory
We show how discrete Morse theory provides a rigorous and unifying foundation for defining skeletons and partitions of grayscale digital images. We model a grayscale image as a cubical complex with a real-valued function defined on its vertices (the voxel values). This function is extended to a discrete gradient vector field using the algorithm presented in Robins, Wood, Sheppard TPAMI 33:1646 (2011). In the current paper we define basins (the building blocks of a partition) and segments of the skeleton using the stable and unstable sets associated with critical cells. The natural connection between Morse theory and homology allows us to prove the topological validity of these constructions; for example, that the skeleton is homotopic to the initial object. We simplify the basins and skeletons via Morse-theoretic cancellation of critical cells in the discrete gradient vector field, using a strategy informed by persistent homology. Simple working Python code implementing our algorithms for efficient vector field traversal is included. Example data are taken from micro-CT images of porous materials, an application area where accurate topological models of pore connectivity are vital for fluid-flow modelling.
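The lower-star construction at the heart of such discrete gradient algorithms can be illustrated on the simplest possible complex. The sketch below is a 1-D stand-in (assuming distinct values), not the Robins-Wood-Sheppard algorithm for 3-D cubical complexes: each vertex is paired with the steepest edge in its lower star, and unpaired cells are critical.

```python
def lower_star_gradient(values):
    """Discrete gradient on the 1-D cubical complex of a scalar series.

    The lower star of vertex v contains the incident edges whose other
    endpoint has a smaller value.  Pairing v with the steepest such edge
    leaves exactly the local minima (vertices) and local maxima (edges)
    unpaired, i.e. critical.  Assumes distinct values.
    """
    n = len(values)
    pairs = []            # (vertex, edge) gradient arrows
    critical = []         # unpaired cells
    paired_edges = set()
    for v in range(n):
        lower = [(values[u], (min(u, v), max(u, v)))
                 for u in (v - 1, v + 1)
                 if 0 <= u < n and values[u] < values[v]]
        if not lower:
            critical.append(('vertex', v))       # local minimum
        else:
            lower.sort()                          # steepest descent first
            pairs.append((v, lower[0][1]))
            paired_edges.add(lower[0][1])
    for e in ((i, i + 1) for i in range(n - 1)):
        if e not in paired_edges:
            critical.append(('edge', e))          # local maximum
    return pairs, critical
```

For `[0, 2, 1, 3]` the critical cells are the two minima (vertices 0 and 2) and one maximum (the edge (1, 2)), matching the smooth intuition; cancelling a minimum-maximum pair of low persistence is the simplification step the paper describes.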
Tuning the Performance of a Computational Persistent Homology Package
In recent years, persistent homology has become an attractive method for data analysis. It captures topological features, such as connected components, holes, and voids from point cloud data and summarizes the way in which these features appear and disappear in a filtration sequence. In this project, we focus on improving the performance of Eirene, a computational package for persistent homology. Eirene is a 5000-line open-source software library implemented in the dynamic programming language Julia. We use the Julia profiling tools to identify performance bottlenecks and develop novel methods to manage them, including the parallelization of some time-consuming functions on multicore/manycore hardware. Empirical results show that performance can be greatly improved.
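The profiling workflow described above translates directly to other language ecosystems. As a hedged Python analogue of the Julia tooling the authors used (the function below is a purely illustrative stand-in, not Eirene code), hot spots can be located with the standard cProfile/pstats modules:

```python
import cProfile
import io
import pstats

def slow_pairing(n):
    """Deliberately quadratic stand-in for a persistence bottleneck."""
    total = 0
    for i in range(n):
        for j in range(n):
            total += (i * j) % 7
    return total

# Profile the candidate bottleneck.
profiler = cProfile.Profile()
profiler.enable()
slow_pairing(300)
profiler.disable()

# Rank functions by cumulative time and keep the top entries.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
```

Functions dominating the cumulative-time ranking are the candidates for algorithmic improvement or parallelization, which is the strategy the project applies to Eirene's time-consuming routines.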
Clear and Compress: Computing Persistent Homology in Chunks
We present a parallelizable algorithm for computing the persistent homology
of a filtered chain complex. Our approach differs from the commonly used
reduction algorithm by first computing persistence pairs within local chunks,
then simplifying the unpaired columns, and finally applying standard reduction
on the simplified matrix. The approach generalizes a technique by Günther et
al., which uses discrete Morse theory to compute persistence; we derive the
same worst-case complexity bound in a more general context. The algorithm
employs several practical optimization techniques which are of independent
interest. Our sequential implementation of the algorithm is competitive with
state-of-the-art methods, and we improve the performance through parallelized
computation. (This result was presented at TopoInVis 2013:
http://www.sci.utah.edu/topoinvis13.html)
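For context, the following is a minimal sketch of the standard reduction algorithm that the chunk approach builds on (plain column reduction over Z/2, not the authors' chunked or parallel implementation):

```python
def reduce_boundary_matrix(columns):
    """Standard persistence reduction over Z/2.

    `columns[j]` is the set of row indices in the boundary of cell j,
    with cells ordered by filtration.  While a column shares its lowest
    nonzero row (its pivot) with an earlier reduced column, add that
    column to it; surviving pivots give the persistence pairs.
    """
    columns = [set(col) for col in columns]
    low_to_col = {}          # pivot row -> index of the column owning it
    pairs = []               # (birth cell, death cell)
    for j, col in enumerate(columns):
        while col and max(col) in low_to_col:
            col ^= columns[low_to_col[max(col)]]   # mod-2 column addition
        if col:
            low_to_col[max(col)] = j
            pairs.append((max(col), j))
    return pairs
```

On a filtered triangle (three vertices, then edges 01, 12, 02, then the 2-cell), the reduction pairs two edges with vertices and the triangle with the edge closing the loop; the chunk algorithm obtains the same pairs by reducing local blocks first and clearing/compressing the remainder.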
Parallel Mapper
The construction of Mapper has emerged in the last decade as a powerful and
effective topological data analysis tool that approximates and generalizes
other topological summaries, such as the Reeb graph, the contour tree, and
split and join trees. In this paper, we study the parallel analysis of the
construction of Mapper. We give a provably correct parallel algorithm to
execute Mapper on multiple processors, and we report performance experiments
that compare our approach to a reference sequential Mapper implementation and
demonstrate the efficiency of our method.
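A minimal sequential Mapper sketch (1-D point clouds and hypothetical parameter names for brevity; not the reference implementation benchmarked in the paper): cover the filter range with overlapping intervals, cluster each preimage, and connect clusters that share points.

```python
from collections import defaultdict
from itertools import combinations

def mapper_graph(points, filter_vals, n_intervals=4, overlap=0.3, eps=1.0):
    """Minimal Mapper on 1-D points.

    Each node is the set of point indices in one cluster of one cover
    element; an edge joins nodes whose clusters share a point (the nerve
    of the cover).  Clustering is single-linkage at distance `eps`.
    """
    lo, hi = min(filter_vals), max(filter_vals)
    length = (hi - lo) / n_intervals
    nodes = []
    for k in range(n_intervals):
        a = lo + k * length - overlap * length        # widened interval
        b = lo + (k + 1) * length + overlap * length
        idx = [i for i, f in enumerate(filter_vals) if a <= f <= b]

        # Union-find on the eps-neighbour graph of the preimage.
        parent = {i: i for i in idx}

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for i, j in combinations(idx, 2):
            if abs(points[i] - points[j]) <= eps:
                parent[find(i)] = find(j)
        clusters = defaultdict(set)
        for i in idx:
            clusters[find(i)].add(i)
        nodes.extend(frozenset(c) for c in clusters.values())

    edges = {(i, j) for i, j in combinations(range(len(nodes)), 2)
             if nodes[i] & nodes[j]}
    return nodes, edges
```

Because each cover element's preimage is clustered independently, the per-interval loop is exactly the part a parallel Mapper can distribute across processors, with only the final nerve computation requiring the combined results.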