What can be inferred from surrogate data testing?
Surrogate data testing for linearity is frequently applied to confirm the
results of nonlinear time series analysis. We argue that this is, in general,
not possible.
Comment: 1 page
Detecting periodicity in experimental data using linear modeling techniques
Fourier spectral estimates and, to a lesser extent, the autocorrelation
function are the primary tools to detect periodicities in experimental data in
the physical and biological sciences. We propose a new method which is more
reliable than traditional techniques, and is able to make clear identification
of periodic behavior when traditional techniques do not. This technique is
based on an information theoretic reduction of linear (autoregressive) models
so that only the essential features of an autoregressive model are retained.
We call these reduced autoregressive models (RARM). The essential
features of reduced autoregressive models include any periodicity present in
the data. We provide theoretical and numerical evidence from both experimental
and artificial data to demonstrate that this technique will reliably detect
periodicities if and only if they are present in the data. There are strong
information-theoretic arguments to support the statement that RARM detects
periodicities if they are present. Surrogate data techniques are used to
ensure the converse. Furthermore, our calculations demonstrate that RARM is
more robust, more accurate, and more sensitive than traditional spectral
techniques.
Comment: 10 pages (revtex) and 6 figures. To appear in Phys. Rev. E. Modified style
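The reduction described above can be sketched in a few lines. This is a minimal illustration only, assuming a least-squares AR fit and BIC as the information-theoretic criterion; the paper's actual criterion and search procedure may differ, and the names `ar_bic` and `reduce_ar` are invented for this sketch.

```python
import numpy as np

def ar_bic(x, lags):
    """Fit x[t] ~ sum_k a_k * x[t - lag_k] by least squares; return BIC."""
    p = max(lags) if lags else 0
    y = x[p:]
    n = len(y)
    if lags:
        X = np.column_stack([x[p - L:len(x) - L] for L in lags])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ coef
    else:
        resid = y
    rss = float(resid @ resid)
    return n * np.log(rss / n) + len(lags) * np.log(n)

def reduce_ar(x, max_lag=20):
    """Greedy backward elimination: drop lags while BIC keeps improving."""
    lags = list(range(1, max_lag + 1))
    best = ar_bic(x, lags)
    improved = True
    while improved and lags:
        improved = False
        for L in list(lags):
            trial = [m for m in lags if m != L]
            b = ar_bic(x, trial)
            if b < best:
                best, lags, improved = b, trial, True
    return lags  # the surviving lags encode any periodicity

# Artificial test series: a period-12 oscillation plus observational noise.
rng = np.random.default_rng(0)
t = np.arange(600)
x = np.sin(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(600)
lags = reduce_ar(x)
print(lags)
```

A periodic signal survives the reduction as a small set of retained lags, whereas for white noise the model collapses toward the empty (zero-lag) model.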
Stochastic to deterministic crossover of fractal dimension for a Langevin equation
Using algorithms of Higuchi and of Grassberger and Procaccia, we study
numerically how fractal dimensions cross over from finite-dimensional Brownian
noise at short time scales to finite values of deterministic chaos at longer
time scales for data generated from a Langevin equation that has a strange
attractor in the limit of zero noise. Our results suggest that the crossover
occurs at such short time scales that there is little chance of
finite-dimensional Brownian noise being incorrectly identified as deterministic
chaos.
Comment: 12 pages including 3 figures, RevTex and epsf. To appear in Phys. Rev. E, April, 199
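Of the two estimators named above, Higuchi's curve-length method is the simpler to sketch. The following is a standard textbook implementation, not the authors' code: it rescales the mean absolute increment of coarse-grained subsequences and reads the fractal dimension off the log-log slope. It returns exactly 1 for a straight line and a value near 2 for white noise.

```python
import numpy as np

def higuchi_fd(x, kmax=10):
    """Higuchi's estimate of the fractal dimension of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    log_inv_k, log_L = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, N, k)       # subsequence x[m], x[m+k], ...
            if len(idx) < 2:
                continue
            d = np.abs(np.diff(x[idx])).sum()
            # Higuchi's normalization for unequal subsequence lengths
            norm = (N - 1) / ((len(idx) - 1) * k)
            lengths.append(d * norm / k)
        log_inv_k.append(np.log(1.0 / k))
        log_L.append(np.log(np.mean(lengths)))
    slope, _ = np.polyfit(log_inv_k, log_L, 1)  # L(k) ~ k^(-D)
    return slope

rng = np.random.default_rng(1)
white = rng.standard_normal(5000)
fd_line = higuchi_fd(np.arange(1000.0))
fd_white = higuchi_fd(white)
print(fd_line, fd_white)
```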
Test your surrogate data before you test for nonlinearity
The schemes for the generation of surrogate data to test the null
hypothesis of a linear stochastic process undergoing a nonlinear static
transform are investigated as to their consistency in representing the null
hypothesis. In particular, we pinpoint some important caveats of the prominent
algorithm of amplitude adjusted Fourier transform surrogates (AAFT) and
compare it to the iterated AAFT (IAAFT), which is more consistent in
representing the null hypothesis. It turns out that in many applications with
real data the inferences of nonlinearity after marginal rejection of the null
hypothesis were premature and have to be re-investigated taking into account
the inaccuracies in the AAFT algorithm, mainly concerning the mismatch of the
linear correlations. In order to deal with such inaccuracies we propose the
use of linear together with nonlinear polynomials as discriminating
statistics. The application of this setup to some well-known real data sets
cautions against the use of the AAFT algorithm.
Comment: 14 pages, 15 figures, submitted to Physical Review
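The IAAFT scheme favored above alternates between enforcing the power spectrum of the data and enforcing its amplitude distribution. A minimal sketch follows; the fixed iteration count is a simplification of this sketch (published versions typically iterate until the rank ordering stops changing):

```python
import numpy as np

def iaaft(x, n_iter=100, seed=0):
    """Iterated amplitude-adjusted Fourier transform surrogate of x."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    amp = np.abs(np.fft.rfft(x))   # target power spectrum of the data
    sorted_x = np.sort(x)          # target amplitude distribution
    s = rng.permutation(x)         # random starting shuffle
    for _ in range(n_iter):
        # step 1: impose the target spectrum, keeping the current phases
        phase = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(amp * np.exp(1j * phase), n=len(x))
        # step 2: impose the target distribution by rank ordering
        ranks = np.argsort(np.argsort(s))
        s = sorted_x[ranks]
    return s

rng = np.random.default_rng(2)
x = np.cumsum(rng.standard_normal(256))  # a strongly correlated test series
s = iaaft(x)
```

Because the final step is the rank ordering, the surrogate reproduces the amplitude distribution of the data exactly, while the power spectrum is matched only approximately; that residual spectral mismatch is precisely the kind of inaccuracy the abstract warns about for AAFT.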
Emerging Infections and Pregnancy
Immunologic changes of pregnancy may increase susceptibility to certain intracellular pathogens.
Heuristic estimates of weighted binomial statistics for use in detecting rare point source transients
The ALEXIS (Array of Low Energy X-ray Imaging Sensors) satellite scans nearly half the sky every fifty seconds, and downlinks time-tagged photon data twice a day. The standard science quicklook processing produces over a dozen sky maps at each downlink, and these maps are automatically searched for potential transient point sources. We are interested only in highly significant point source detections, and based on earlier Monte Carlo studies, only consider p < 10^-7, which is about 5.2 sigma. Our algorithms are therefore required to operate on the far tail of the distribution, where many traditional approximations break down. Although an exact solution is available for the case of unweighted counts, the problem is more difficult in the case of weighted counts. We have found that a heuristic modification of a formula derived by Li and Ma (1983) provides reasonably accurate estimates of p-values for point source detections, even for very low p-value detections.
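The unweighted formula the heuristic builds on is Eq. (17) of Li and Ma (1983); the weighted modification described above is not reproduced here. A minimal sketch of the unweighted form, assuming on-source counts n_on, off-source counts n_off, and on/off exposure ratio alpha (all strictly positive):

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983), Eq. (17): significance of an on-source excess.

    Assumes n_on > 0 and n_off > 0 (the logarithms diverge otherwise).
    """
    n_tot = n_on + n_off
    term_on = n_on * math.log((1 + alpha) / alpha * (n_on / n_tot))
    term_off = n_off * math.log((1 + alpha) * (n_off / n_tot))
    return math.sqrt(2.0 * (term_on + term_off))

# Equal exposure (alpha = 1): equal counts give zero significance,
# while a factor-of-two on-source excess is already several sigma.
s_null = li_ma_significance(100, 100, 1.0)
s_excess = li_ma_significance(200, 100, 1.0)
print(s_null, s_excess)
```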
Neural Network Model for Apparent Deterministic Chaos in Spontaneously Bursting Hippocampal Slices
A neural network model that exhibits stochastic population bursting is
studied by simulation. First return maps of inter-burst intervals exhibit
recurrent unstable periodic orbit (UPO)-like trajectories similar to those
found in experiments on hippocampal slices. Applications of various control
methods and surrogate analysis for UPO-detection also yield results similar to
those of experiments. Our results question the interpretation of the
experimental data as evidence for deterministic chaos and suggest caution in
the use of UPO-based methods for detecting determinism in time-series data.
Comment: 4 pages, 5 .eps figures (included), requires psfrag.sty (included)
Using Topological Statistics to Detect Determinism in Time Series
Statistical differentiability of the measure along the reconstructed
trajectory is a good candidate to quantify determinism in time series. The
procedure is based upon a formula that explicitly shows the sensitivity of the
measure to stochasticity. Numerical results for partially surrogated time
series and series derived from several stochastic models, illustrate the
usefulness of the method proposed here. The method is shown to work also for
high-dimensional systems and experimental time series.
Comment: 23 RevTeX pages, 14 eps figures. To appear in Physical Review
Quantitative analysis by renormalized entropy of invasive electroencephalograph recordings in focal epilepsy
Invasive electroencephalograph (EEG) recordings of ten patients suffering
from focal epilepsy were analyzed using the method of renormalized entropy.
Introduced as a complexity measure for the different regimes of a dynamical
system, the feature was tested here for its spatio-temporal behavior in
epileptic seizures. In all patients a decrease of renormalized entropy within
the ictal phase of the seizure was found. Furthermore, the strength of this
decrease is monotonically related to the distance of the recording location
from the focus. The results suggest that the method of renormalized entropy is
a useful procedure for clinical applications such as seizure detection and
localization of epileptic foci.
Comment: 10 pages, 5 figures
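Renormalized entropy as defined in the paper involves renormalizing a reference spectrum, a procedure not reproduced here. As a loosely related illustration only, a plain spectral (Shannon) entropy already separates a narrow-band, seizure-like rhythm from broadband background activity; this stand-in is the sketch's own simplification, not the authors' measure.

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy of the normalized power spectrum of x (in nats)."""
    psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    p = psd / psd.sum()      # treat the spectrum as a probability density
    p = p[p > 0]             # skip empty bins to avoid log(0)
    return -np.sum(p * np.log(p))

t = np.arange(2048)
rhythm = np.sin(2 * np.pi * t / 64)                       # narrow-band signal
background = np.random.default_rng(3).standard_normal(2048)  # broadband noise
print(spectral_entropy(rhythm), spectral_entropy(background))
```

A concentrated spectrum gives low entropy; the entropy drop within the ictal phase reported above follows the same intuition, with the paper's renormalization additionally controlling for the reference state.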