    A maximum likelihood based technique for validating detrended fluctuation analysis (ML-DFA)

    Detrended Fluctuation Analysis (DFA) is widely used to assess the presence of long-range temporal correlations in time series. Signals with long-range temporal correlations are typically defined as having a power-law decay in their autocorrelation function. The output of DFA is an exponent, the slope obtained by linear regression of a log-log fluctuation plot against window size. However, if this fluctuation plot is not linear, then the underlying signal is not self-similar and the exponent has no meaning. There is currently no method for assessing the linearity of a DFA fluctuation plot. Here we present such a technique, called ML-DFA. We scale the DFA fluctuation plot to construct a likelihood function for a set of alternative models including polynomial, root, exponential, logarithmic and spline functions. We use this likelihood function to determine the maximum likelihood and thus to calculate values of the Akaike and Bayesian information criteria, which identify the best-fit model when the number of parameters involved is taken into account and over-fitting is penalised. This ensures that, of the models that fit well, the least complicated is selected as the best fit. We apply ML-DFA to synthetic data from FARIMA processes and sine curves with DFA fluctuation plots whose form has been analytically determined, and to experimentally collected neurophysiological data. ML-DFA assesses whether the hypothesis of a linear fluctuation plot should be rejected, and thus whether the exponent can be considered meaningful. We argue that ML-DFA is essential for obtaining trustworthy results from DFA. (22 pages, 7 figures.)
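The pipeline the abstract describes (integrate the signal, detrend in windows, regress log F against log n, then compare fits by information criterion) can be sketched as follows. This is plain DFA plus a polynomial AIC comparison, not the authors' full ML-DFA likelihood construction; the white-noise test signal and window sizes are illustrative choices.

```python
import numpy as np

def dfa(signal, windows):
    """Return the DFA fluctuation F(n) for each window size n."""
    profile = np.cumsum(signal - np.mean(signal))   # integrated (profile) series
    F = []
    for n in windows:
        n_seg = len(profile) // n
        t = np.arange(n)
        sq = []
        for i in range(n_seg):
            seg = profile[i*n:(i + 1)*n]
            coef = np.polyfit(t, seg, 1)            # local linear detrend
            sq.append(np.mean((seg - np.polyval(coef, t))**2))
        F.append(np.sqrt(np.mean(sq)))
    return np.array(F)

def aic(log_n, log_F, degree):
    """AIC of a degree-`degree` polynomial fit to the log-log fluctuation plot."""
    coef = np.polyfit(log_n, log_F, degree)
    resid = log_F - np.polyval(coef, log_n)
    k = degree + 1                                  # number of fitted parameters
    m = len(log_n)
    return m*np.log(np.mean(resid**2)) + 2*k        # over-fitting penalised by 2k

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)                   # uncorrelated noise: alpha ~ 0.5
windows = np.unique(np.logspace(1, 2.8, 15).astype(int))
F = dfa(white, windows)
log_n, log_F = np.log(windows), np.log(F)
alpha = np.polyfit(log_n, log_F, 1)[0]              # the DFA exponent
aic_linear, aic_quadratic = aic(log_n, log_F, 1), aic(log_n, log_F, 2)
```

If the quadratic (or any richer) model wins the information-criterion comparison despite its penalty, the linear-plot hypothesis is in doubt and `alpha` should not be trusted.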

    Multifractal analysis of discretized X-ray CT images for the characterization of soil macropore structures

    A correct statistical model of soil pore structure can be critical for understanding flow and transport processes in soils, creating synthetic soil pore spaces for hypothesis and model testing, and evaluating the similarity of pore spaces of different soils. Advanced visualization techniques such as X-ray computed tomography (CT) offer new opportunities for exploring the heterogeneity of soil properties at horizon or aggregate scales. Simple fractal models such as fractional Brownian motion, which have been proposed to capture the complex behavior of soil spatial variation at the field scale, rarely simulate the irregularity patterns displayed by spatial series of soil properties. The objective of this work was to use CT data to test the hypothesis that soil pore structure at the horizon scale may be represented by multifractal models. X-ray CT scans of twelve water-saturated, 20-cm-long soil columns with diameters of 7.5 cm were analyzed. A reconstruction algorithm was applied to convert the X-ray CT data into a stack of 1480 grayscale digital images with a voxel resolution of 110 microns and a cross-sectional size of 690 × 690 pixels. The images were binarized, and the spatial series of the percentage of void space vs. depth was analyzed to evaluate the applicability of the multifractal model. The series of depth-dependent macroporosity values exhibited a well-defined multifractal structure that was revealed by singularity and Rényi spectra. The long-range dependencies in these series were parameterized by the Hurst exponent. Values of the Hurst exponent close to one were observed, indicating strong persistence in the variations of porosity with depth. Multifractal modeling of soil macropore structure can be an efficient method for parameterizing and simulating the vertical spatial heterogeneity of soil pore space.
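The Rényi-spectrum estimation mentioned above works by coarse-graining a measure over boxes of decreasing size and regressing the log partition sum against log box size. A minimal sketch on a deterministic binomial multiplicative cascade (an illustrative stand-in for the porosity-vs-depth series, not the CT data) is:

```python
import numpy as np

def binomial_cascade(p, depth):
    """Deterministic binomial multiplicative measure on 2**depth cells."""
    mu = np.array([1.0])
    for _ in range(depth):
        new = np.empty(2*mu.size)
        new[0::2] = p*mu            # left child receives fraction p
        new[1::2] = (1.0 - p)*mu    # right child receives the rest
        mu = new
    return mu

def renyi_dimension(mu, q, levels):
    """Estimate the Renyi dimension D_q from dyadic partition sums."""
    log_eps, log_Z = [], []
    for j in levels:
        boxes = mu.reshape(2**j, -1).sum(axis=1)   # coarse-grain to 2**j boxes
        log_Z.append(np.log2(np.sum(boxes**q)))
        log_eps.append(-j)                          # log2 of box size 2**-j
    tau = np.polyfit(log_eps, log_Z, 1)[0]          # Z(eps) ~ eps**tau
    return tau/(q - 1.0)

p = 0.7
mu = binomial_cascade(p, depth=12)
levels = range(2, 9)
D0 = renyi_dimension(mu, 0, levels)   # capacity dimension (exactly 1 here)
D2 = renyi_dimension(mu, 2, levels)   # correlation dimension
```

A strictly decreasing D_q (here D2 < D0) is the signature of multifractality; for a monofractal all D_q coincide.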

    Synthetic Turbulence, Fractal Interpolation and Large-Eddy Simulation

    Fractal interpolation has been proposed in the literature as an efficient way to construct closure models for the numerical solution of coarse-grained Navier-Stokes equations. It is based on synthetically generating a scale-invariant subgrid-scale field and analytically evaluating its effects on large resolved scales. In this paper, we propose an extension of previous work by developing a multiaffine fractal interpolation scheme and demonstrate that it preserves not only the fractal dimension but also the higher-order structure functions and the non-Gaussian probability density function of the velocity increments. Extensive a priori analyses of atmospheric boundary layer measurements further reveal that this multiaffine closure model has the potential for satisfactory performance in large-eddy simulations. The pertinence of this newly proposed methodology in the case of passive scalars is also discussed.
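The core idea, generating sub-grid detail that passes through the resolved samples while re-injecting roughness at every scale, can be illustrated with a simple midpoint-insertion scheme. This is a hedged toy sketch, not the authors' multiaffine scheme: the stretching factor `d` and the random sign flips are illustrative choices (|d| controls the roughness of the result).

```python
import numpy as np

def fractal_interpolate(coarse, levels, d, rng):
    """Refine a signal by repeated midpoint insertion; the stretching
    factor d (|d| < 1) adds scale-invariant detail at every level."""
    u = np.asarray(coarse, dtype=float)
    for _ in range(levels):
        incr = u[1:] - u[:-1]
        signs = rng.choice([-1.0, 1.0], size=incr.size)  # random signs: multiaffine flavour
        mid = 0.5*(u[:-1] + u[1:]) + signs*d*incr        # displaced midpoints
        refined = np.empty(2*u.size - 1)
        refined[0::2] = u        # original samples are kept: it is an interpolation
        refined[1::2] = mid
        u = refined
    return u

rng = np.random.default_rng(42)
coarse = np.sin(np.linspace(0.0, 2*np.pi, 9))  # toy coarse-grained "velocity" field
fine = fractal_interpolate(coarse, levels=3, d=0.4, rng=rng)
```

Because the resolved samples are preserved exactly, the scheme only adds sub-grid fluctuations, which is what a closure model of this type requires.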

    Chaotic Time Series Analysis in Economics: Balance and Perspectives

    To show that a mathematical model exhibits chaotic behaviour does not prove that chaos is also present in the corresponding data. To convincingly show that a system behaves chaotically, chaos has to be identified directly from the data. From an empirical point of view, it is difficult to distinguish between fluctuations provoked by random shocks and endogenous fluctuations determined by the nonlinear nature of the relation between economic aggregates. For this purpose, chaos tests are developed to investigate the basic features of chaotic phenomena: nonlinearity, a fractal attractor, and sensitivity to initial conditions. The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in the data. More specifically, our attention will be devoted to reviewing the results reached by the application of these techniques to economic and financial time series and to understanding why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.
    Keywords: economic dynamics, nonlinearity, tests for chaos, chaos.
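Of the three features listed, sensitivity to initial conditions is the one with the crispest numerical test: a positive largest Lyapunov exponent. A minimal sketch on the logistic map (a textbook example, not an economic series; for r = 4 the exponent is analytically ln 2):

```python
import numpy as np

def lyapunov_logistic(r, x0, n_transient=1000, n_iter=100_000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)|."""
    x = x0
    for _ in range(n_transient):                 # discard the transient
        x = r*x*(1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        total += np.log(abs(r*(1.0 - 2.0*x)))    # |f'(x)| = |r(1 - 2x)|
        x = r*x*(1.0 - x)
    return total/n_iter

lam = lyapunov_logistic(4.0, 0.3)   # fully chaotic regime
```

On real economic data the derivative is unknown, so practical tests reconstruct the attractor by delay embedding and track the divergence of nearby trajectories instead; the principle, an average exponential divergence rate, is the same.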

    Light scattering from cold rolled aluminum surfaces

    We present experimental light scattering measurements from aluminum surfaces obtained by cold rolling. We show that our results are consistent with a scale-invariant description of the roughness of these surfaces. The roughness parameters that we obtain from the light scattering experiment are consistent with those obtained from Atomic Force Microscopy measurements.

    Model estimation of cerebral hemodynamics between blood flow and volume changes: a data-based modeling approach

    It is well known that there is a dynamic relationship between cerebral blood flow (CBF) and cerebral blood volume (CBV). With increasing applications of functional MRI, where the blood oxygen-level-dependent signals are recorded, the understanding and accurate modeling of the hemodynamic relationship between CBF and CBV becomes increasingly important. This study presents an empirical, data-based modeling framework for model identification from CBF and CBV experimental data. It is shown that the relationship between the changes in CBF and CBV can be described using a parsimonious autoregressive with exogenous input model structure. It is observed that neither the ordinary least-squares (LS) method nor the classical total least-squares (TLS) method can produce accurate estimates from the original noisy CBF and CBV data. A regularized total least-squares (RTLS) method is thus introduced and extended to solve such an errors-in-variables problem. Quantitative results show that the RTLS method works very well on the noisy CBF and CBV data. Finally, a combination of RTLS with a filtering method leads to a parsimonious but very effective model that can characterize the relationship between the changes in CBF and CBV.
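The errors-in-variables point can be illustrated with the classical TLS baseline that RTLS builds on: when the regressor is itself noisy, ordinary LS is biased toward zero, while TLS (via the SVD of the stacked data matrix) remains consistent. This sketch uses synthetic data with a known gain; it shows plain TLS only, not the paper's regularized extension or its ARX structure.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x_true = np.linspace(0.0, 10.0, n)
y_true = 1.5*x_true                         # true gain between the two signals
x_obs = x_true + rng.normal(0.0, 0.5, n)    # noise in the regressor too
y_obs = y_true + rng.normal(0.0, 0.5, n)

xc = x_obs - x_obs.mean()
yc = y_obs - y_obs.mean()

# Ordinary LS ignores the noise in x: the estimate is attenuated toward zero.
slope_ols = (xc @ yc) / (xc @ xc)

# Classical TLS: the best line's normal vector is the right singular vector
# of the centred data matrix associated with the smallest singular value.
_, _, Vt = np.linalg.svd(np.column_stack([xc, yc]), full_matrices=False)
a, b = Vt[-1]
slope_tls = -a/b
```

Regularization becomes necessary when the data matrix is ill-conditioned or short, which is the situation the RTLS method in the paper addresses; a common route is adding a Tikhonov-style penalty to the TLS problem.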