Quantifying signals with power-law correlations: A comparative study of detrended fluctuation analysis and detrended moving average techniques
Detrended fluctuation analysis (DFA) and detrended moving average (DMA) are
two scaling analysis methods designed to quantify correlations in noisy
non-stationary signals. We systematically study the performance of different
variants of the DMA method when applied to artificially generated long-range
power-law correlated signals with an {\it a-priori} known scaling exponent
and compare them with the DFA method. We find that the scaling
results obtained from different variants of the DMA method strongly depend on
the type of the moving average filter. Further, we investigate the optimal
scaling regime where the DFA and DMA methods accurately quantify the scaling
exponent, and how this regime depends on the correlations in the
signal. Finally, we develop a three-dimensional representation to determine how
the stability of the scaling curves obtained from the DFA and DMA methods
depends on the scale of analysis, the order of detrending, and the order of the
moving average we use, as well as on the type of correlations in the signal.
Comment: 15 pages, 16 figures
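A minimal sketch of the DFA procedure compared in this abstract, using NumPy (the function name, the choice of scales, and the white-noise test signal are illustrative choices, not taken from the paper):

```python
import numpy as np

def dfa(signal, scales, order=1):
    """Detrended fluctuation analysis (first-order detrending by default).

    Returns the fluctuation function F(n) for each box size n in `scales`.
    """
    # Step 1: build the integrated profile (cumulative sum of deviations).
    profile = np.cumsum(signal - np.mean(signal))
    fluctuations = []
    for n in scales:
        # Step 2: split the profile into non-overlapping boxes of length n.
        n_boxes = len(profile) // n
        boxes = profile[: n_boxes * n].reshape(n_boxes, n)
        x = np.arange(n)
        # Step 3: subtract a polynomial fit of the given order in each box.
        residuals = []
        for box in boxes:
            coeffs = np.polyfit(x, box, order)
            residuals.append(box - np.polyval(coeffs, x))
        # Step 4: the RMS of the residuals is the fluctuation at scale n.
        fluctuations.append(np.sqrt(np.mean(np.concatenate(residuals) ** 2)))
    return np.asarray(fluctuations)

# The scaling exponent alpha is the slope of log F(n) vs log n.
rng = np.random.default_rng(0)
white_noise = rng.standard_normal(4096)
scales = np.array([8, 16, 32, 64, 128, 256])
F = dfa(white_noise, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
# Uncorrelated noise should give alpha close to 0.5.
```

The DMA variants studied in the paper replace the per-box polynomial fit in step 3 with a moving-average filter of the profile; as the abstract notes, the resulting exponent estimate depends on which filter is used.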
Effect of nonstationarities on detrended fluctuation analysis
Detrended fluctuation analysis (DFA) is a scaling analysis method used to
quantify long-range power-law correlations in signals. Many physical and
biological signals are ``noisy'', heterogeneous and exhibit different types of
nonstationarities, which can affect the correlation properties of these
signals. We systematically study the effects of three types of
nonstationarities often encountered in real data. Specifically, we consider
nonstationary sequences formed in three ways: (i) stitching together segments
of data obtained from discontinuous experimental recordings, or removing some
noisy and unreliable parts from continuous recordings and stitching together
the remaining parts -- a ``cutting'' procedure commonly used in preparing data
prior to signal analysis; (ii) adding to a signal with known correlations a
tunable concentration of random outliers or spikes with different amplitude,
and (iii) generating a signal comprised of segments with different properties
-- e.g. different standard deviations or different correlation exponents. We
compare the difference between the scaling results obtained for stationary
correlated signals and correlated signals with these three types of
nonstationarities.
Comment: 17 pages, 10 figures, corrected some typos, added one reference
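A toy construction of two of the three nonstationarities described above (the segment boundaries, spike concentration, and amplitudes are arbitrary illustrations, not the paper's parameters):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2048
signal = rng.standard_normal(n)  # stand-in for a correlated test signal

# (ii) add a tunable concentration p of random spikes of amplitude A.
p, A = 0.05, 5.0
spike_mask = rng.random(n) < p
spiked = signal + A * spike_mask * rng.choice([-1.0, 1.0], size=n)

# (i) a "cutting" procedure: drop a flagged segment and stitch the rest.
keep = np.ones(n, dtype=bool)
keep[500:700] = False            # pretend this stretch was unreliable
stitched = signal[keep]
```

Applying DFA to `spiked` or `stitched` and comparing the exponent against that of the original `signal` is the kind of controlled comparison the abstract describes.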
Effect of Trends on Detrended Fluctuation Analysis
Detrended fluctuation analysis (DFA) is a scaling analysis method used to
estimate long-range power-law correlation exponents in noisy signals. Many
noisy signals in real systems display trends, so that the scaling results
obtained from the DFA method become difficult to analyze. We systematically
study the effects of three types of trends -- linear, periodic, and power-law
trends, and offer examples where these trends are likely to occur in real data.
We compare the difference between the scaling results for artificially
generated correlated noise and correlated noise with a trend, and study how
trends lead to the appearance of crossovers in the scaling behavior. We find
that crossovers result from the competition between the scaling of the noise
and the ``apparent'' scaling of the trend. We study how the characteristics of
these crossovers depend on (i) the slope of the linear trend; (ii) the
amplitude and period of the periodic trend; (iii) the amplitude and power of
the power-law trend and (iv) the length as well as the correlation properties
of the noise. Surprisingly, we find that the crossovers in the scaling of noisy
signals with trends also follow scaling laws -- i.e. long-range power-law
dependence of the position of the crossover on the parameters of the trends. We
show that the DFA result of noise with a trend can be exactly determined by the
superposition of the separate results of the DFA on the noise and on the trend,
assuming that the noise and the trend are not correlated. If this superposition
rule is not followed, this is an indication that the noise and the superimposed
trend are not independent, so that removing the trend could lead to changes in
the correlation properties of the noise.
Comment: 20 pages, 16 figures
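The superposition rule stated in the abstract can be written compactly: if the noise η and the trend are uncorrelated, their DFA fluctuation functions add in quadrature,

```latex
F^{2}_{\eta+\mathrm{trend}}(n) \;=\; F^{2}_{\eta}(n) + F^{2}_{\mathrm{trend}}(n)
```

so on a log-log plot the combined F(n) follows whichever component dominates at each scale n, and the crossover sits near the scale where the two terms are equal.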
The Prognostic Value of Non-Linear Analysis of Heart Rate Variability in Patients with Congestive Heart Failure—A Pilot Study of Multiscale Entropy
AIMS: The influences of nonstationarity and nonlinearity on heart rate time series can be mathematically qualified or quantified by multiscale entropy (MSE). The aim of this study is to investigate the prognostic value of parameters derived from MSE in patients with systolic heart failure. METHODS AND RESULTS: Patients with systolic heart failure were enrolled in this study. One month after their clinical condition had stabilized, a 24-hour Holter electrocardiogram was recorded. MSE as well as other standard parameters of heart rate variability (HRV) and detrended fluctuation analysis (DFA) were assessed. A total of 40 heart failure patients with a mean age of 56±16 years were enrolled and followed up for 684±441 days. Twenty-five patients were receiving β-blocker treatment. During the follow-up period, 6 patients died or received urgent heart transplantation. The short-term exponent of DFA and the slope of MSE between scales 1 and 5 differed significantly between patients with and without β-blockers (p = 0.014 and p = 0.028). Among all the parameters, only the area under the MSE curve for scales 6 to 20 (Area(6-20)) discriminated between the survival (n = 34) and mortality (n = 6) groups. An Area(6-20) cutoff of 21.2 served as a significant predictor of mortality or heart transplantation (p = 0.0014). CONCLUSION: The area under the MSE curve for scales 6 to 20 is not affected by β-blocker use and may provide independent risk stratification for the prognosis of CHF patients.
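A compact sketch of the MSE computation used in this abstract: coarse-graining plus sample entropy, with the tolerance fixed from the original series (the standard MSE convention). The function names and the defaults m = 2, r = 0.15 are common choices, not necessarily those of this study:

```python
import numpy as np

def sample_entropy(x, m, tol):
    """SampEn: -ln(A/B), where B counts pairs of length-m templates within
    tolerance tol (Chebyshev distance) and A counts length-(m+1) matches."""
    x = np.asarray(x, dtype=float)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        total = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += int(np.sum(dist <= tol))
        return total

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale, m=2, r=0.15):
    """Coarse-grain at scales 1..max_scale; the tolerance is fixed from the
    ORIGINAL series, so the entropy of uncorrelated noise falls with scale."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    values = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[: n * tau].reshape(n, tau).mean(axis=1)
        values.append(sample_entropy(coarse, m, tol))
    return np.asarray(values)
```

The abstract's Area(6-20) would correspond to integrating this curve over scales 6 through 20 (e.g. by the trapezoidal rule), and the slope between scales 1 and 5 to a straight-line fit over the first five points.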
Unexpected Course of Nonlinear Cardiac Interbeat Interval Dynamics during Childhood and Adolescence
The fluctuations of the cardiac interbeat series contain rich information because they reflect variations of other functions on different time scales (e.g., respiration or blood pressure control). Nonlinear measures such as complexity and fractal scaling properties derived from 24 h heart rate dynamics of healthy subjects vary from childhood to old age. In this study, the age-related variations during childhood and adolescence were addressed. In particular, the cardiac interbeat interval series was quantified with respect to complexity and fractal scaling properties. The R-R interval series of 409 healthy children and adolescents (age range: 1 to 22 years, 220 females) was analyzed with respect to complexity (Approximate Entropy, ApEn) and fractal scaling properties on three time scales: long-term (slope β of the power spectrum, log power vs. log frequency, in the frequency range 10^-4 to 10^-2 Hz), intermediate-term (DFA, detrended fluctuation analysis, α2) and short-term (DFA α1). Unexpectedly, at ages 7 to 13 years, β and ApEn were higher than at ages below 7 and above 13 years (β: −1.06 vs. −1.21; ApEn: 0.88 vs. 0.74). Hence, the heart rate dynamics were closest to a 1/f power law and most complex between 7 and 13 years. However, DFA α1 and α2 increased with progressing age, similarly to measures reflecting linear properties. In conclusion, the course of long-term fractal scaling properties and complexity of heart rate dynamics during childhood and adolescence indicates that these measures reflect complex changes possibly linked to hormonal changes during pre-puberty and puberty.
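The long-term measure above, the slope β of log power versus log frequency, can be estimated by a straight-line fit to a periodogram. A rough sketch, where the band limit and the test signals are illustrative (the study fits a physical band of 10^-4 to 10^-2 Hz, not the normalized band used here):

```python
import numpy as np

def spectral_slope(x, fmax=0.1):
    """Estimate beta, the slope of log power vs. log frequency, from a
    straight-line fit to the periodogram below fmax (cycles/sample)."""
    x = np.asarray(x, dtype=float)
    freqs = np.fft.rfftfreq(len(x))
    power = np.abs(np.fft.rfft(x - x.mean())) ** 2
    band = (freqs > 0) & (freqs <= fmax)
    return np.polyfit(np.log(freqs[band]), np.log(power[band]), 1)[0]

rng = np.random.default_rng(3)
beta_white = spectral_slope(rng.standard_normal(8192))            # near 0
beta_walk = spectral_slope(np.cumsum(rng.standard_normal(8192)))  # near -2
```

Restricting the fit to low frequencies matters: for signals like a random walk, the periodogram flattens near the Nyquist frequency, which would bias an unrestricted fit toward shallower slopes.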
Complex systems and the technology of variability analysis
Characteristic patterns of variation over time, namely rhythms, represent a defining feature of complex systems, one that is synonymous with life. Despite the intrinsic dynamic, interdependent and nonlinear relationships of their parts, complex biological systems exhibit robust systemic stability. Applied to critical care, it is the systemic properties of the host response to a physiological insult that manifest as health or illness and determine outcome in our patients. Variability analysis provides a novel technology with which to evaluate the overall properties of a complex system. This review highlights the means by which we scientifically measure variation, including analyses of overall variation (time domain analysis, frequency distribution, spectral power), frequency contribution (spectral analysis), scale invariant (fractal) behaviour (detrended fluctuation and power law analysis) and regularity (approximate and multiscale entropy). Each technique is presented with a definition, interpretation, clinical application, advantages, limitations and a summary of its calculation. The ubiquitous association between altered variability and illness is highlighted, followed by an analysis of how variability analysis may significantly improve prognostication of severity of illness and guide therapeutic intervention in critically ill patients.
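As a concrete example of the time-domain measures listed under overall variation, two standard HRV statistics computed from RR intervals (the interval values below are invented for illustration):

```python
import numpy as np

def sdnn(rr):
    """SDNN: standard deviation of normal-to-normal RR intervals (ms),
    an overall-variability measure."""
    return float(np.std(rr, ddof=1))

def rmssd(rr):
    """RMSSD: root mean square of successive RR-interval differences (ms),
    a short-term (beat-to-beat) variability measure."""
    return float(np.sqrt(np.mean(np.diff(rr) ** 2)))

rr_ms = np.array([800.0, 810.0, 790.0, 805.0, 795.0])  # toy RR series
```

The fractal and entropy measures surveyed in the review (DFA, power-law analysis, approximate and multiscale entropy) operate on the same RR series but characterize its scaling and regularity rather than its dispersion.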