17 research outputs found
Optimal topological simplification of discrete functions on surfaces
We solve the problem of minimizing the number of critical points among all
functions on a surface within a prescribed distance {\delta} from a given input
function. The result is achieved by establishing a connection between discrete
Morse theory and persistent homology. Our method completely removes homological
noise with persistence less than 2{\delta}, constructively proving the
tightness of a lower bound on the number of critical points given by the
stability theorem of persistent homology in dimension two for any input
function. We also show that an optimal solution can be computed in linear time
after persistence pairs have been computed.
Comment: 27 pages, 8 figures
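The paper works on surfaces; as a one-dimensional toy (not the authors' algorithm), the following sketch shows the objects involved: 0-dimensional sublevel-set persistence pairs computed by union-find with the elder rule. Features whose persistence is below 2δ are exactly the "homological noise" that an optimal simplification can remove. All names here are illustrative.

```python
def sublevel_persistence_1d(values):
    """0-dimensional sublevel-set persistence pairs of a 1D function.
    Union-find with the elder rule: when two components merge, the one
    born at the higher minimum dies. Zero-persistence pairs are skipped;
    the global minimum's class is essential and never dies."""
    n = len(values)
    parent = list(range(n))
    minimum = list(range(n))      # root -> index of the component's minimum
    alive = [False] * n

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    pairs = []
    for i in sorted(range(n), key=lambda k: values[k]):
        alive[i] = True
        for j in (i - 1, i + 1):
            if 0 <= j < n and alive[j]:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                if values[minimum[ri]] > values[minimum[rj]]:
                    ri, rj = rj, ri   # rj now holds the younger minimum
                birth, death = values[minimum[rj]], values[i]
                if birth < death:
                    pairs.append((birth, death))
                parent[rj] = ri
    return pairs
```

Given a tolerance delta, `[p for p in pairs if p[1] - p[0] < 2 * delta]` lists the removable noise the stability bound refers to.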
Clear and Compress: Computing Persistent Homology in Chunks
We present a parallelizable algorithm for computing the persistent homology
of a filtered chain complex. Our approach differs from the commonly used
reduction algorithm by first computing persistence pairs within local chunks,
then simplifying the unpaired columns, and finally applying standard reduction
on the simplified matrix. The approach generalizes a technique by G\"unther et
al., which uses discrete Morse Theory to compute persistence; we derive the
same worst-case complexity bound in a more general context. The algorithm
employs several practical optimization techniques which are of independent
interest. Our sequential implementation of the algorithm is competitive with
state-of-the-art methods, and we improve the performance through parallelized
computation.
Comment: This result was presented at TopoInVis 2013
(http://www.sci.utah.edu/topoinvis13.html)
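The chunk algorithm refines the "commonly used reduction algorithm" mentioned above. A minimal sketch of that sequential baseline over Z/2, with columns stored as sets of row indices (a common trick; function and variable names are hypothetical, and this is not the paper's chunked implementation):

```python
def reduce_boundary_matrix(columns):
    """Standard persistence reduction over Z/2.
    columns[j] is the set of row indices with a 1 in column j, simplices
    ordered by filtration. Columns are reduced in place; returns the
    (birth, death) index pairs read off from the lowest ones."""
    low_to_col = {}   # lowest row index -> column that owns it
    pairs = []
    for j, col in enumerate(columns):
        col = set(col)
        # add earlier (already reduced) columns until the lowest 1 is unique
        while col and max(col) in low_to_col:
            col ^= columns[low_to_col[max(col)]]   # symmetric difference = mod-2 add
        if col:
            low = max(col)
            low_to_col[low] = j
            pairs.append((low, j))
        columns[j] = col
    return pairs
```

On the boundary matrix of a hollow triangle (3 vertices, 3 edges, 1 face) this pairs two vertices with edges and the surviving 1-cycle with the face, leaving one essential component unpaired.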
Combinatorial Gradient Fields for 2D Images with Empirically Convergent Separatrices
This paper proposes an efficient probabilistic method that computes
combinatorial gradient fields for two dimensional image data. In contrast to
existing algorithms, this approach yields a geometric Morse-Smale complex that
converges almost surely to its continuous counterpart when the image resolution
is increased. This approach is motivated using basic ideas from probability
theory and builds upon an algorithm from discrete Morse theory with a strong
mathematical foundation. While a formal proof is only hinted at, we do provide
a thorough numerical evaluation of our method and compare it to established
algorithms.
Comment: 17 pages, 7 figures
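For intuition about what a combinatorial gradient field on image data is, here is a deliberately simplified deterministic sketch (not the paper's probabilistic, convergence-guaranteed method): each pixel is paired with the edge toward its steepest strictly lower 4-neighbor, and pixels with no lower neighbor remain critical (local minima). Distinct pixel values are assumed.

```python
def steepest_descent_pairs(img):
    """Toy combinatorial gradient on a 2D image (list of rows).
    Pairs each pixel with its steepest strictly lower 4-neighbor;
    unpaired pixels are the critical cells (local minima). Since values
    strictly decrease along pairs, the resulting V-paths are acyclic."""
    h, w = len(img), len(img[0])
    pairs, minima = [], []
    for y in range(h):
        for x in range(w):
            nbrs = [(y + dy, x + dx)
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                    if 0 <= y + dy < h and 0 <= x + dx < w]
            lower = [(ny, nx) for ny, nx in nbrs if img[ny][nx] < img[y][x]]
            if lower:
                pairs.append(((y, x), min(lower, key=lambda p: img[p[0]][p[1]])))
            else:
                minima.append((y, x))
    return pairs, minima
```

The separatrix geometry of such greedy fields is exactly what the paper improves on: greedy pairings need not converge to the continuous Morse-Smale complex, whereas the probabilistic construction does.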
Effective homology of k-D digital objects (partially) calculated in parallel
In [18], a membrane parallel theoretical framework for computing (co)homology information of the foreground or background of binary digital images is developed. Starting from this work, we progress here in two senses: (a) providing advanced topological information, such as (co)homology torsion, and efficiently answering any decision or classification problem about whether a sum of k-xels is a (co)cycle or a (co)boundary; (b) optimizing the previous framework to be implemented using GPGPU computing. Discrete Morse theory, Effective Homology Theory and parallel computing techniques are suitably combined to obtain a homological encoding, called an algebraic minimal model, of a Region-Of-Interest (seen as a cubical complex) of a presegmented k-D digital image.
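As a small sequential illustration of the kind of homological information involved (no membrane parallelism, no torsion; k = 2 only), the sketch below computes the Betti numbers of a binary 2D image seen as a cubical complex: b0 by union-find on 4-connected foreground pixels, and b1 from the Euler characteristic chi = V - E + F via b1 = b0 - chi. The function name is hypothetical.

```python
def betti_2d(img):
    """Betti numbers (b0, b1) of the cubical complex of a binary 2D image.
    Each foreground pixel contributes a unit square with its 4 edges and
    4 corner vertices; shared faces are deduplicated by the sets below."""
    h, w = len(img), len(img[0])
    cells = [(y, x) for y in range(h) for x in range(w) if img[y][x]]
    verts, edges = set(), set()
    for (y, x) in cells:
        verts |= {(y, x), (y, x + 1), (y + 1, x), (y + 1, x + 1)}
        edges |= {((y, x), (y, x + 1)), ((y + 1, x), (y + 1, x + 1)),
                  ((y, x), (y + 1, x)), ((y, x + 1), (y + 1, x + 1))}
    chi = len(verts) - len(edges) + len(cells)

    # b0 via union-find on 4-connected foreground pixels
    parent = {c: c for c in cells}

    def find(c):
        while parent[c] != c:
            parent[c] = parent[parent[c]]
            c = parent[c]
        return c

    for (y, x) in cells:
        for n in ((y + 1, x), (y, x + 1)):
            if n in parent:
                parent[find(n)] = find((y, x))
    b0 = len({find(c) for c in cells})
    return b0, b0 - chi   # b2 = 0 for a 2D complex
```

A 3x3 ring of pixels around an empty center yields one component and one hole; a solid block yields one component and no holes.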
Topological Regularization via Persistence-Sensitive Optimization
Optimization, a key tool in machine learning and statistics, relies on
regularization to reduce overfitting. Traditional regularization methods
control a norm of the solution to ensure its smoothness. Recently, topological
methods have emerged as a way to provide a more precise and expressive control
over the solution, relying on persistent homology to quantify and reduce its
roughness. All such existing techniques back-propagate gradients through the
persistence diagram, which is a summary of the topological features of a
function. Their downside is that they provide information only at the critical
points of the function. We propose a method that instead builds on
persistence-sensitive simplification and translates the required changes to the
persistence diagram into changes on large subsets of the domain, including both
critical and regular points. This approach enables a faster and more precise
topological regularization, the benefits of which we illustrate with
experimental evidence.
Comment: The first two authors contributed equally to this work
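The key contrast above is between moving only critical values and editing large subsets of the domain. A 1D toy of persistence-sensitive simplification (not the paper's optimization scheme) makes this concrete: when a merge would kill a component with persistence below 2*delta, the entire basin of the cancelled minimum is raised to the merge level, touching regular points as well as the critical one.

```python
def simplify_1d(values, delta):
    """Toy 1D persistence-sensitive simplification. Components are grown
    by sublevel-set union-find; when a component dies with persistence
    below 2*delta, its whole basin (all member indices, not just the
    minimum) is raised to the death value, removing the feature."""
    n = len(values)
    parent = list(range(n))
    members = {i: [i] for i in range(n)}
    minval = {i: values[i] for i in range(n)}
    alive = [False] * n
    out = list(values)

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in sorted(range(n), key=lambda k: values[k]):
        alive[i] = True
        for j in (i - 1, i + 1):
            if 0 <= j < n and alive[j]:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                if minval[ri] > minval[rj]:
                    ri, rj = rj, ri          # rj (younger minimum) dies
                death = values[i]
                if death - minval[rj] < 2 * delta:
                    for k in members[rj]:    # raise the whole basin
                        out[k] = max(out[k], death)
                parent[rj] = ri
                members[ri].extend(members[rj])
    return out
```

On `[5, 1, 4, 0, 6, 2, 7]` with delta = 2, the shallow minimum of value 1 (persistence 3 < 4) is flattened together with its basin, while the deeper features survive.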
Topology-Aware Surface Reconstruction for Point Clouds
We present an approach to inform the reconstruction of a surface from a point
scan through topological priors. The reconstruction is based on basis functions
which are optimized to provide a good fit to the point scan while satisfying
predefined topological constraints. We optimize the parameters of a model to
obtain a likelihood function over the reconstruction domain. The topological
constraints are captured by persistence diagrams, which are incorporated into
the optimization algorithm to promote the correct topology. The result is a novel
topology-aware technique which can: 1.) weed out topological noise from point
scans, and 2.) capture certain nuanced properties of the underlying shape which
could otherwise be lost while performing surface reconstruction. We showcase
results reconstructing shapes with multiple potential topologies, compare to
other classical surface reconstruction techniques, and show the completion of
real scan data.
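To illustrate the flavor of a topological prior on a reconstructed likelihood field, here is a crude stand-in (the paper uses persistence diagrams, not a hard count, and this function name is hypothetical): penalize a scalar field whose superlevel set at a threshold has the wrong number of connected components, e.g. a surface scan expected to be a single connected shape.

```python
def component_penalty(field, threshold, target_components):
    """Toy topological prior: count 4-connected components of the
    superlevel set {field >= threshold} via union-find and return the
    deviation from the expected component count."""
    h, w = len(field), len(field[0])
    cells = {(y, x) for y in range(h) for x in range(w)
             if field[y][x] >= threshold}
    parent = {c: c for c in cells}

    def find(c):
        while parent[c] != c:
            parent[c] = parent[parent[c]]
            c = parent[c]
        return c

    for (y, x) in cells:
        for n in ((y + 1, x), (y, x + 1)):
            if n in cells:
                parent[find(n)] = find((y, x))
    n_comp = len({find(c) for c in cells})
    return abs(n_comp - target_components)
```

A hard count like this is not differentiable; replacing it with persistence of the components, as the paper does, yields a quantity the basis-function parameters can be optimized against.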