ASAP - A Sub-sampling Approach for Preserving Topological Structures
Topological data analysis tools enjoy increasing popularity in a wide range of applications. However, due to computational complexity, processing large numbers of samples of higher dimensionality quickly becomes infeasible. We propose a novel sub-sampling strategy inspired by Coulomb's law to decrease the number of data points in d-dimensional point clouds while preserving their homology. The method not only reduces the memory and computation time needed for the construction of different types of simplicial complexes but also preserves the size of the voids in d dimensions, which is crucial, e.g., for astronomical applications. We demonstrate and compare the strategy in several synthetic scenarios and an astronomical particle simulation of a jellyfish galaxy for the detection of superbubbles (supernova signatures).
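The idea of a Coulomb-inspired thinning can be sketched as follows: treat points as like charges and greedily drop the point with the highest Coulomb-like potential (sum of inverse distances to all others), so dense regions are thinned first and sparse features such as void boundaries survive longest. This is a minimal illustrative sketch under that assumption, not the authors' exact algorithm; the function name and greedy scheme are hypothetical.

```python
import numpy as np

def coulomb_subsample(points, n_keep):
    """Greedy sketch: repeatedly remove the point with the highest
    Coulomb-like potential (sum of 1/r over the remaining points),
    thinning dense regions first. Hypothetical illustration only."""
    pts = np.asarray(points, dtype=float)
    keep = list(range(len(pts)))
    while len(keep) > n_keep:
        sub = pts[keep]
        # pairwise distances within the current subset
        d = np.linalg.norm(sub[:, None, :] - sub[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)  # a point exerts no force on itself
        potential = (1.0 / d).sum(axis=1)
        keep.pop(int(np.argmax(potential)))  # drop the "densest" point
    return pts[keep]
```

The O(n^2) distance recomputation per removal is for clarity; a practical version would update potentials incrementally or use spatial indexing.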
ASAP – A sub-sampling approach for preserving topological structures modeled with geodesic topographic mapping
Topological data analysis tools enjoy increasing popularity in a wide range of applications, such as computer graphics, image analysis, machine learning, and astronomy, for extracting information. However, due to computational complexity, processing large numbers of samples of higher dimensionality quickly becomes infeasible. This contribution is twofold: we present an efficient novel sub-sampling strategy inspired by Coulomb's law to decrease the number of data points in d-dimensional point clouds while preserving their homology. The method not only reduces the memory and computation time needed for the construction of different types of simplicial complexes but also preserves the size of the voids in d dimensions, which is crucial, e.g., for astronomical applications. Furthermore, we propose a technique to construct a probabilistic description of the border of significant cycles and cavities inside the point cloud. We demonstrate and empirically compare the strategy in several synthetic scenarios and an astronomical particle simulation of a dwarf galaxy for the detection of superbubbles (supernova signatures). (c) 2021 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
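One simple way to turn sampled border points into a probabilistic description of where a cycle or cavity boundary lies is a kernel density estimate over those points. The sketch below assumes a Gaussian KDE with a fixed bandwidth; both are illustrative choices, not the paper's construction.

```python
import numpy as np

def border_density(border_points, query, bandwidth=0.2):
    """Illustrative sketch: a Gaussian kernel density estimate over
    points sampled near the border of a cycle/cavity, yielding a
    probabilistic description of the border's location. The KDE form
    and bandwidth are assumptions, not the authors' method."""
    pts = np.asarray(border_points, dtype=float)
    q = np.asarray(query, dtype=float)
    # squared distances from every query location to every border point
    d2 = ((q[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    k = np.exp(-0.5 * d2 / bandwidth**2)
    # normalizing constant of a d-dimensional isotropic Gaussian kernel
    norm = len(pts) * (2 * np.pi * bandwidth**2) ** (pts.shape[1] / 2)
    return k.sum(axis=1) / norm
```

Evaluated on points sampled around a unit circle, such an estimate is high on the circle itself and near zero at its center, which is the behaviour one wants from a "border" density.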
Swarm Intelligence-based Extraction and Manifold Crawling Along the Large-Scale Structure
The distribution of galaxies and clusters of galaxies on the mega-parsec scale of the Universe follows an intricate pattern now famously known as the Large-Scale Structure or the Cosmic Web. To study the environments of this network, several techniques have been developed that are able to describe its properties and the properties of groups of galaxies as a function of their environment. In this work, we analyze the previously introduced framework 1-Dimensional Recovery, Extraction, and Analysis of Manifolds (1-DREAM) on N-body cosmological simulation data of the Cosmic Web. The 1-DREAM toolbox consists of five machine learning methods whose aim is the extraction and modelling of one-dimensional structures in astronomical big-data settings. We show that 1-DREAM can be used to extract structures of different density ranges within the Cosmic Web and to create probabilistic models of them. For demonstration, we construct a probabilistic model of an extracted filament and move through the structure to measure properties such as local density and velocity. We also compare our toolbox with a collection of methodologies which trace the Cosmic Web. We show that 1-DREAM is able to split the network into its various environments with results comparable to the state-of-the-art methodologies. A detailed comparison is then made with the public code DisPerSE, in which we find that 1-DREAM is robust against changes in sample size, making it suitable for analyzing sparse observational data and for finding faint and diffuse manifolds in low-density regions.
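The "crawling" step — moving along a modelled filament while measuring properties — can be sketched as a walk over an ordered sequence of curve nodes, recording at each node the count of simulation particles within a fixed radius as a simple local-density proxy. The node/radius interface below is a hypothetical illustration, not the 1-DREAM API.

```python
import numpy as np

def crawl_filament(nodes, particles, radius):
    """Sketch of crawling along an extracted filament: for each node of
    the 1-D model, count nearby particles as a local-density proxy.
    Interface and density measure are illustrative assumptions."""
    nodes = np.asarray(nodes, dtype=float)
    particles = np.asarray(particles, dtype=float)
    densities = []
    for node in nodes:
        # distance from every particle to this point on the filament
        dist = np.linalg.norm(particles - node, axis=1)
        densities.append(int((dist < radius).sum()))
    return densities
```

The same loop could accumulate any other per-node quantity, e.g. a mean particle velocity within the radius, mirroring the density and velocity profiles described above.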