Efficient Approximation of the Matching Distance for 2-Parameter Persistence
In topological data analysis, the matching distance is a computationally tractable metric on multi-filtered simplicial complexes. We design efficient algorithms for approximating the matching distance of two bi-filtered complexes to any desired precision ε > 0. Our approach is based on a quad-tree refinement strategy introduced by Biasotti et al., but we recast their approach entirely in geometric terms. This point of view leads to several novel observations resulting in a practically faster algorithm. We demonstrate this speed-up by experimental comparison and provide our code in a public repository, which offers the first efficient, publicly available implementation of the matching distance.
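The matching distance is a supremum, over a two-parameter family of lines of positive slope, of a weighted bottleneck distance between the diagrams obtained by restricting the two bi-filtrations to each line. Below is a minimal sketch, in Python, of the generic quad-tree refinement idea behind such an ε-approximation; the names approximate_sup, slice_distance (the weighted bottleneck distance along the line with parameters (a, b)), and variation_bound (an upper bound on how much that value can vary over a parameter box) are hypothetical placeholders, not the paper's implementation.

```python
# A minimal sketch of quad-tree refinement for approximating a supremum of the
# form sup_{(a, b) in box} f(a, b) to precision eps, where f(a, b) is the
# weighted bottleneck distance between the two diagrams restricted to the line
# with parameters (a, b).  `slice_distance` and `variation_bound` are assumed
# callbacks supplied by the caller (illustrative, not the paper's code).
import heapq

def approximate_sup(slice_distance, variation_bound, box, eps):
    """Approximate the supremum of slice_distance over box = (a0, a1, b0, b1)."""
    def center(bx):
        a0, a1, b0, b1 = bx
        return (0.5 * (a0 + a1), 0.5 * (b0 + b1))

    def upper_bound(bx):
        # value at the box center plus how much f can vary over the box
        return slice_distance(*center(bx)) + variation_bound(bx)

    best = slice_distance(*center(box))            # current lower bound
    heap = [(-upper_bound(box), box)]              # max-heap via negated keys
    while heap:
        neg_ub, bx = heapq.heappop(heap)
        if -neg_ub <= best + eps:
            break                                  # remaining boxes cannot improve by > eps
        a0, a1, b0, b1 = bx
        am, bm = 0.5 * (a0 + a1), 0.5 * (b0 + b1)
        for child in ((a0, am, b0, bm), (am, a1, b0, bm),
                      (a0, am, bm, b1), (am, a1, bm, b1)):
            best = max(best, slice_distance(*center(child)))
            heapq.heappush(heap, (-upper_bound(child), child))
    return best
```

Boxes are refined only while their upper bound can still exceed the current best value by more than ε, which is what makes such a refinement practical compared with a uniform subdivision of the parameter space.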
Topological Optimization with Big Steps
Using persistent homology to guide optimization has emerged as a novel
application of topological data analysis. Existing methods treat persistence
calculation as a black box and backpropagate gradients only onto the simplices
involved in particular pairs. We show how the cycles and chains used in the
persistence calculation can be used to prescribe gradients to larger subsets of
the domain. In particular, we show that in a special case, which serves as a
building block for general losses, the problem can be solved exactly in linear
time. This relies on another contribution of this paper, which eliminates the
need to examine a factorial number of permutations of simplices with the same
value. We present empirical experiments that show the practical benefits of our
algorithm: the number of steps required for the optimization is reduced by an
order of magnitude.
Comment: 10 pages, 10 figures
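For concreteness, here is a minimal sketch, for a function sampled on a 1-D grid, of the conventional per-pair scheme the abstract contrasts with: 0-dimensional sublevel-set persistence pairs are computed with a union-find sweep, and the gradient of the total persistence is placed only on the two samples of each pair. This illustrates the baseline, not the paper's big-step algorithm; all names are illustrative.

```python
# Baseline "per-pair" topological gradient for a 1-D sampled function.
import numpy as np

def persistence_pairs_1d(f):
    """Return (birth_index, death_index) pairs of 0-dim sublevel-set persistence."""
    f = np.asarray(f, dtype=float)
    n = len(f)
    order = np.argsort(f, kind="stable")
    parent = {}                            # union-find over activated samples

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    pairs = []
    for i in order:                        # sweep samples in increasing value
        parent[i] = i
        for j in (i - 1, i + 1):           # try to merge with activated neighbours
            if 0 <= j < n and j in parent:
                ri, rj = find(i), find(j)
                if ri != rj:
                    # the component with the higher (younger) birth dies here
                    young, old = (ri, rj) if f[ri] > f[rj] else (rj, ri)
                    pairs.append((young, i))
                    parent[young] = old
    return pairs

def per_pair_gradient(f):
    """Gradient of the total persistence, nonzero only at paired samples."""
    f = np.asarray(f, dtype=float)
    grad = np.zeros_like(f)
    for birth, death in persistence_pairs_1d(f):
        grad[death] += 1.0                 # d(f[death] - f[birth]) / d f[death]
        grad[birth] -= 1.0                 # d(f[death] - f[birth]) / d f[birth]
    return grad
```

A step f -= lr * per_pair_gradient(f) lowers each death value and raises each birth value but leaves every other sample untouched, which is the limitation that prescribing gradients to larger subsets of the domain is meant to overcome.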
Geometry Helps to Compare Persistence Diagrams
Exploiting geometric structure to improve the asymptotic complexity of
discrete assignment problems is a well-studied subject. In contrast, the
practical advantages of using geometry for such problems have not been
explored. We implement geometric variants of the Hopcroft--Karp algorithm for
bottleneck matching (based on previous work by Efrat et al.) and of the auction
algorithm by Bertsekas for Wasserstein distance computation. Both
implementations use k-d trees to replace a linear scan with a geometric
proximity query. Our interest in this problem stems from the desire to compute
distances between persistence diagrams, a problem that comes up frequently in
topological data analysis. We show that our geometric matching algorithms lead
to a substantial performance gain, both in running time and in memory
consumption, over their purely combinatorial counterparts. Moreover, our
implementation significantly outperforms the only other implementation
available for comparing persistence diagrams.
Comment: 20 pages, 10 figures; extended version of paper published in ALENEX 2016
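As a toy illustration of how a geometric proximity query replaces a linear scan when pairing points of two persistence diagrams, the sketch below greedily assigns each point of one diagram to an unused nearest neighbour in the other using a k-d tree. This greedy heuristic is not the Hopcroft--Karp or auction algorithm from the paper, it ignores matchings to the diagonal, and it assumes the diagrams are given as (n, 2) NumPy arrays of (birth, death) points.

```python
# Greedy nearest-neighbour assignment with a k-d tree (illustrative heuristic).
import numpy as np
from scipy.spatial import cKDTree

def greedy_match(diagram_a, diagram_b):
    """Match each point of diagram_a to a distinct point of diagram_b."""
    assert len(diagram_a) <= len(diagram_b)
    tree = cKDTree(diagram_b)                     # k-d tree over the second diagram
    used = np.zeros(len(diagram_b), dtype=bool)
    matching = []
    for i, p in enumerate(diagram_a):
        k = 1
        while True:
            # geometric proximity query instead of a linear scan over diagram_b
            _, idxs = tree.query(p, k=k)
            for j in np.atleast_1d(idxs):
                if not used[j]:
                    used[j] = True
                    matching.append((i, int(j)))
                    break
            else:
                k = min(2 * k, len(diagram_b))    # widen the query and retry
                continue
            break
    return matching
```

Each assignment then costs roughly a logarithmic-time tree query (occasionally widened) instead of a scan over all candidate points, which is the kind of saving the geometric variants described above exploit inside the exact matching algorithms.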
Topological Regularization via Persistence-Sensitive Optimization
Optimization, a key tool in machine learning and statistics, relies on
regularization to reduce overfitting. Traditional regularization methods
control a norm of the solution to ensure its smoothness. Recently, topological
methods have emerged as a way to provide a more precise and expressive control
over the solution, relying on persistent homology to quantify and reduce its
roughness. All such existing techniques back-propagate gradients through the
persistence diagram, which is a summary of the topological features of a
function. Their downside is that they provide information only at the critical
points of the function. We propose a method that instead builds on
persistence-sensitive simplification and translates the required changes to the
persistence diagram into changes on large subsets of the domain, including both
critical and regular points. This approach enables a faster and more precise
topological regularization, the benefits of which we illustrate with
experimental evidence.
Comment: The first two authors contributed equally to this work.
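To make the contrast concrete for a function sampled on a 1-D grid: given a low-persistence pair consisting of a minimum born at index b and dying at the merge value f[d], a per-pair method would nudge only f[b] and f[d], whereas a simplification-style update floods the whole basin of the minimum up to the merge level, changing regular samples as well as critical ones. The sketch below is an illustration under these assumptions, not the paper's method; the pair (b, d) is assumed to come from a persistence computation such as the persistence_pairs_1d helper sketched earlier.

```python
# Flood the basin of a dying minimum up to its merge level (illustrative).
import numpy as np

def flood_basin(f, b, d):
    """Raise the basin of the minimum at index b up to the merge level f[d]."""
    f = np.asarray(f, dtype=float).copy()
    level = f[d]
    if d > b:                          # saddle lies to the right of the minimum
        lo = b
        while lo > 0 and f[lo - 1] < level:
            lo -= 1                    # extend left while samples stay below the merge level
        f[lo:d + 1] = np.maximum(f[lo:d + 1], level)
    else:                              # saddle lies to the left of the minimum
        hi = b
        while hi + 1 < len(f) and f[hi + 1] < level:
            hi += 1                    # extend right while samples stay below the merge level
        f[d:hi + 1] = np.maximum(f[d:hi + 1], level)
    return f
```

The difference flood_basin(f, b, d) - f then serves as a descent direction that removes the feature while distributing the change over many samples, both critical and regular.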