Coarse-graining Approaches in Univariate Multiscale Sample and Dispersion Entropy
The evaluation of complexity in univariate signals has attracted considerable attention in recent years. This is often done within the framework of Multiscale Entropy, which entails two basic steps: coarse-graining, to consider multiple temporal scales, and evaluation of irregularity at each of those scales with entropy estimators. Recent developments in the field have proposed modifications to this approach to facilitate the analysis of short time series. However, the role of downsampling in the classical coarse-graining process, and its relationship to alternative filtering techniques, has not yet been systematically explored. Here, we assess the impact of coarse-graining on multiscale entropy estimations based on both Sample Entropy and Dispersion Entropy. We compare the classical moving-average approach with low-pass Butterworth filtering (both with and without downsampling) and with empirical mode decomposition in Intrinsic Multiscale Entropy, on selected synthetic data and two real physiological datasets. The results show that downsampling decreases the entropy values when the sampling frequency is low and increases them when it is high. Our results suggest that, when dealing with long signals and relatively low levels of noise, the refined composite method makes little difference to the quality of the entropy estimation, at the expense of considerable additional computational cost. We also find that downsampling within the coarse-graining procedure may not be required to quantify the complexity of signals, especially short ones. Overall, we expect these results to contribute to the ongoing discussion on the development of stable, fast and noise-robust multiscale entropy techniques suited to either short or long recordings.
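The two steps described above can be sketched concretely. The following is a minimal NumPy sketch (hypothetical helper names, and a simplified sample entropy estimator rather than a reference implementation): classical coarse-graining averages non-overlapping windows, and the tolerance is fixed at a fraction of the original series' standard deviation, which is the common convention.

```python
import numpy as np

def coarse_grain(x, scale):
    """Classical coarse-graining: average non-overlapping windows of length `scale`."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m, tol):
    """Simplified sample entropy: -log of the ratio of template matches of
    length m+1 to matches of length m, within Chebyshev tolerance `tol`."""
    def matches(k):
        templ = np.lib.stride_tricks.sliding_window_view(x, k)
        d = np.abs(templ[:, None, :] - templ[None, :, :]).max(axis=-1)
        return (d <= tol).sum() - len(templ)  # exclude self-matches
    a, b = matches(m + 1), matches(m)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5), m=2, r=0.2):
    """MSE curve: sample entropy of the coarse-grained series at each scale,
    with the tolerance fixed at r times the SD of the original series."""
    tol = r * np.std(x)
    return [sample_entropy(coarse_grain(x, s), m, tol) for s in scales]
```

Note that `coarse_grain` both low-pass filters (the moving average) and downsamples (one value per window); the filtering alternatives discussed in the abstract keep the first operation but drop the second.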
Weak Lensing Mass Reconstruction using Wavelets
This paper presents a new method for the reconstruction of weak lensing mass
maps. It uses the multiscale entropy concept, which is based on wavelets, and
the False Discovery Rate which allows us to derive robust detection levels in
wavelet space. We show that this new restoration approach outperforms several
standard techniques currently used for weak shear mass reconstruction. This
method can also be used to separate E and B modes in the shear field, and thus
test for the presence of residual systematic effects. We concentrate on large
blind cosmic shear surveys, and illustrate our results using simulated shear
maps derived from N-Body Lambda-CDM simulations with added noise corresponding
to both ground-based and space-based observations.
Comment: Accepted manuscript with all figures can be downloaded at http://jstarck.free.fr/aa_wlens05.pdf and software can be downloaded at http://jstarck.free.fr/mrlens.htm
Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes
Exploiting the theory of state space models, we derive the exact expressions
of the information transfer, as well as redundant and synergistic transfer, for
coupled Gaussian processes observed at multiple temporal scales. All of the
terms, constituting the frameworks known as interaction information
decomposition and partial information decomposition, can thus be analytically
obtained for different time scales from the parameters of the VAR model that
fits the processes. We report the application of the proposed methodology
firstly to benchmark Gaussian systems, showing that this class of systems may
generate patterns of information decomposition characterized by mainly
redundant or synergistic information transfer persisting across multiple time
scales or even by the alternating prevalence of redundant and synergistic
source interaction depending on the time scale. Then, we apply our method to an
important topic in neuroscience, i.e., the detection of causal interactions in
human epilepsy networks, for which we show the relevance of partial information
decomposition to the detection of multiscale information transfer spreading
from the seizure onset zone.
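For jointly Gaussian variables, the information quantities above have closed forms in terms of the covariance matrix, which is what makes exact computation possible. The sketch below is illustrative only: it uses the minimum-mutual-information choice of redundancy (one common PID convention, not necessarily the one used in the paper) and a static covariance rather than the multiscale state-space formulation the paper derives; all function names are hypothetical.

```python
import numpy as np

def gaussian_mi(cov, ix, iy):
    """Mutual information (nats) between two blocks of a jointly Gaussian
    vector, computed exactly from its covariance matrix."""
    det = lambda idx: np.linalg.det(cov[np.ix_(idx, idx)])
    return 0.5 * np.log(det(ix) * det(iy) / det(ix + iy))

def mmi_pid(cov, t, s1, s2):
    """Partial information decomposition of I(t; s1, s2) using the
    minimum-mutual-information redundancy (one common choice among several)."""
    i1, i2 = gaussian_mi(cov, t, s1), gaussian_mi(cov, t, s2)
    joint = gaussian_mi(cov, t, s1 + s2)
    red = min(i1, i2)
    return {"redundancy": red,
            "unique1": i1 - red,
            "unique2": i2 - red,
            "synergy": joint - i1 - i2 + red}
```

For example, a target built as the sum of two independent unit-variance sources plus unit noise (covariance `[[3,1,1],[1,1,0],[1,0,1]]`) yields zero unique information and strictly positive synergy, since the sources jointly determine the target far better than either does alone.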
Multiscale Granger causality
In the study of complex physical and biological systems represented by
multivariate stochastic processes, an issue of great relevance is the
description of the system dynamics spanning multiple temporal scales. While
methods to assess the dynamic complexity of individual processes at different
time scales are well-established, multiscale analysis of directed interactions
has never been formalized theoretically, and empirical evaluations are
complicated by practical issues such as filtering and downsampling. Here we
extend the very popular measure of Granger causality (GC), a prominent tool for
assessing directed lagged interactions between joint processes, to quantify
information transfer across multiple time scales. We show that the multiscale
processing of a vector autoregressive (AR) process introduces a moving average
(MA) component, and describe how to represent the resulting ARMA process using
state space (SS) models and to combine the SS model parameters for computing
exact GC values at arbitrarily large time scales. We exploit the theoretical
formulation to identify peculiar features of multiscale GC in basic AR
processes, and demonstrate with numerical simulations the much larger
estimation accuracy of the SS approach compared with pure AR modeling of
filtered and downsampled data. The improved computational reliability is
exploited to disclose meaningful multiscale patterns of information transfer
between global temperature and carbon dioxide concentration time series, both
in paleoclimate and in recent years.
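The core idea of Granger causality at a single time scale can be sketched in a few lines: compare how well a process is predicted from its own past versus from its own past plus the past of a candidate driver. This is a plain least-squares estimate for intuition (hypothetical function name), not the exact state-space computation the paper derives for arbitrary scales.

```python
import numpy as np

def granger_causality(x, y, p=2):
    """Log-ratio Granger causality from y to x: the residual variance of an
    AR(p) model of x on its own past, divided by that of a model that also
    includes p lags of y. Positive values indicate predictive influence."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    own = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    full = np.column_stack([own] + [y[p - k - 1:n - k - 1] for k in range(p)])
    target = x[p:]
    def resid_var(design):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        return np.var(target - design @ beta)
    return np.log(resid_var(own) / resid_var(full))
```

On a simulated pair where y drives x but not vice versa, the estimate is clearly positive in the driving direction and near zero in the other; the paper's contribution is that filtering and downsampling this kind of data before fitting biases such estimates, whereas the state-space route gives exact multiscale values.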
Filter-based multiscale entropy analysis of complex physiological time series
The multiscale entropy (MSE) has been widely and successfully used to analyze the complexity of physiological time series. In this thesis, we re-interpret the averaging step of MSE as filtering the time series with a piecewise-constant filter. From this viewpoint, we introduce "filter-based multiscale entropy" (FME), which applies a family of filters to a time series to generate its multiple frequency components and then computes the "blockwise" entropy of the resulting components. By choosing filters adapted to the features of a given time series, FME better captures its multiscale information and provides more flexibility for studying its complexity. Motivated by heart rate turbulence theory, which suggests that the human heartbeat interval time series (HHITS) can be described by piecewise linear patterns, we propose the piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of this time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. We then propose wavelet packet transform entropy (WPTE) analysis and apply it to HHITS using lower and higher piecewise linear filters. Numerical results show that WPTE with piecewise linear filters achieves the highest classification rates in discriminating different cardiac systems among the multiscale entropy analyses considered. Finally, we discuss the application of FME to discrete time series, introducing an 'eliminating' algorithm to examine and compare the complexity of coding and noncoding DNA sequences.
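The filtering viewpoint above means any low-pass filter can stand in for the moving average of classical coarse-graining. A minimal sketch using SciPy's Butterworth design (hypothetical function name; the filter order and the cutoff convention of 1/scale of the Nyquist frequency are illustrative assumptions):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def butterworth_coarse_grain(x, scale, order=6):
    """Low-pass Butterworth filtering as an alternative to moving-average
    coarse-graining: cutoff at 1/scale of the Nyquist frequency, zero-phase
    via filtfilt, and no downsampling, so the output keeps the input length."""
    if scale == 1:
        return np.asarray(x, dtype=float)
    b, a = butter(order, 1.0 / scale)  # Wn is relative to Nyquist by default
    return filtfilt(b, a, x)
```

Because the series is not decimated, more samples remain available to the entropy estimator at large scales, which is one motivation for filter-based variants when recordings are short.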
Hyperspectral colon tissue cell classification
A novel algorithm to discriminate between normal and malignant human colon tissue cells is presented. Microscopic-level images of human colon tissue cells were acquired using hyperspectral imaging technology at contiguous wavelength intervals of visible light. While hyperspectral imagery provides a wealth of information, its large size normally implies high computational processing complexity. Several methods exist to avoid the so-called curse of dimensionality and hence reduce the computational complexity. In this study, we experimented with Principal Component Analysis (PCA) and two modifications of Independent Component Analysis (ICA). In the first stage of the algorithm, the extracted components are used to separate four constituent parts of the colon tissue: nuclei, cytoplasm, lamina propria, and lumen. The segmentation is performed in an unsupervised fashion using the nearest-centroid clustering algorithm. In the second stage of the classification algorithm, the segmented image is used to exploit the spatial relationships between the labeled constituent parts. Experimental results using supervised Support Vector Machine (SVM) classification based on multiscale morphological features show that normal and malignant tissue cells are discriminated with a reasonable degree of accuracy.
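The first-stage pipeline (dimensionality reduction followed by unsupervised nearest-centroid clustering) can be sketched as follows. The helper names are hypothetical and the synthetic two-class "spectra" in the test are illustrative assumptions, not the paper's data or its exact PCA/ICA variants.

```python
import numpy as np

def pca_reduce(pixels, n_components):
    """Project each pixel's spectrum (one row) onto the top principal components."""
    centered = pixels - pixels.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # rows of vt = PCs
    return centered @ vt[:n_components].T

def nearest_centroid_cluster(points, k, iters=20, seed=0):
    """Unsupervised nearest-centroid clustering with k-means-style updates:
    assign each point to its nearest centroid, then recompute centroids."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels
```

In the abstract's setting, `pixels` would hold one spectrum per image pixel and `k` would be 4, one cluster per tissue constituent (nuclei, cytoplasm, lamina propria, lumen).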