95 research outputs found

    Time series classification based on fractal properties

    The article considers the classification of fractal time series with meta-algorithms based on decision trees. Binomial multiplicative stochastic cascades are used as input time series. A comparative analysis of classification approaches based on different features is carried out. The results indicate the advantage of the machine learning methods over traditional estimation of the degree of self-similarity. Comment: 4 pages, 2 figures, 3 equations, 1 table
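    A minimal sketch of the kind of experiment described here, under assumed settings: two classes of binomial multiplicative stochastic cascades differing only in the weight parameter (p = 0.30 vs. 0.40, 10 cascade levels), generic multiscale fluctuation features rather than the paper's exact feature set, and scikit-learn's gradient boosting as one example of a decision-tree meta-algorithm.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def binomial_cascade(p, levels, rng):
    """Binomial multiplicative stochastic cascade of length 2**levels."""
    mass = np.array([1.0])
    for _ in range(levels):
        w = rng.choice([p, 1.0 - p], size=mass.size)        # random left-child weight
        mass = np.column_stack([mass * w, mass * (1 - w)]).ravel()
    return mass

def fluctuation_features(x, scales=(4, 8, 16, 32, 64)):
    """log of the mean fluctuation range of the profile at several scales."""
    profile = np.cumsum(x - x.mean())
    feats = []
    for s in scales:
        n = (len(profile) // s) * s
        segs = profile[:n].reshape(-1, s)
        feats.append(np.log(np.mean(np.ptp(segs, axis=1)) + 1e-12))
    return np.array(feats)

def hurst_slope(feats, scales=(4, 8, 16, 32, 64)):
    """Single self-similarity estimate: slope of log F(s) against log s."""
    return np.polyfit(np.log(scales), feats, 1)[0]

# two classes of cascades differing only in the (assumed) weight parameter p
X_multi, X_single, y = [], [], []
for label, p in enumerate((0.30, 0.40)):
    for _ in range(200):
        f = fluctuation_features(binomial_cascade(p, 10, rng))
        X_multi.append(f)
        X_single.append([hurst_slope(f)])
        y.append(label)
y = np.array(y)

clf = GradientBoostingClassifier(random_state=0)
print("multi-scale features:      ", cross_val_score(clf, np.array(X_multi), y, cv=5).mean())
print("single self-similarity fit:", cross_val_score(clf, np.array(X_single), y, cv=5).mean())
```

    The comparison mirrors the design of the study only loosely: a richer feature vector fed to a tree ensemble versus a single degree-of-self-similarity estimate.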

    Mandelbrot's stochastic time series models

    I survey and illustrate the main time series models that Mandelbrot introduced into time series analysis in the 1960s and 1970s. I focus particularly on the members of the additive fractional stable family including Lévy flights and fractional Brownian motion (fBm), noting some of the less well‐known aspects of this family, such as the cases when the self‐similarity exponent H and the Hurst exponent J differ. I briefly discuss the role of multiplicative models in modeling the physics of cascades. I then recount the still little‐known story of Mandelbrot's work on fractional renewal models in the late 1960s, explaining how these differ from their more familiar fBm counterpart and form a "missing link" between fBm and the problem of random change points. I conclude by highlighting the frontier problem of damped fractional models.
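    For illustration, a short sketch simulating two members of the additive fractional stable family named above: fBm via Cholesky factorisation of the exact fractional-Gaussian-noise covariance, and a Lévy flight from alpha-stable increments. The parameter choices (H = 0.7, alpha = 1.5, path lengths) are arbitrary assumptions.

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(1)

def fbm(n, H, rng):
    """Fractional Brownian motion path via Cholesky factorisation of the exact
    fractional-Gaussian-noise covariance (fine for moderate n)."""
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k + 1) ** (2 * H) - 2 * np.abs(k) ** (2 * H) + np.abs(k - 1) ** (2 * H))
    cov = gamma[np.abs(k[:, None] - k[None, :])]
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))      # small jitter for numerical stability
    return np.cumsum(L @ rng.standard_normal(n))          # summed fGn increments give fBm

# Lévy flight: cumulative sum of symmetric alpha-stable increments (assumed alpha = 1.5)
flight = np.cumsum(levy_stable.rvs(1.5, 0.0, size=2048, random_state=2))

# persistent fBm path with assumed H = 0.7
path = fbm(2048, H=0.7, rng=rng)
```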

    Investigating detrended fluctuation analysis with structural breaks

    Detrended Fluctuation Analysis (DFA) has been used in several fields of science to study the statistical properties of trend-stationary and nonstationary time series. Its application to financial data has produced important results concerning long-range correlations and long memory. However, these results may be contaminated if the researcher attributes to nonstationary trends what is in fact the effect of stationary trends with endogenous structural breaks. Our paper proposes a modified DFA model in which the boxes used to determine local trends are replaced by windows delimited by endogenous structural breaks. We also allow local trends to be fitted by quadratic functions and use squared residuals in place of per-box standard deviations to study the magnitude of the power-law exponent. The results show that our modified DFA model performs better than the fixed-length alternatives originally proposed and is therefore well suited to financial data. Consistent with previous findings, our results show positive long-range correlations in all indices, with higher values for emerging markets.
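    For orientation, a plain fixed-box DFA sketch with quadratic local detrending and squared residuals; the modification described above would replace the equal-length boxes with windows delimited by endogenous structural breaks, which is not implemented here. The scale grid and the simulated input series are placeholder assumptions.

```python
import numpy as np

def dfa(x, scales, order=2):
    """Baseline DFA with fixed, equal-length boxes and polynomial detrending
    of the given order (order=2 corresponds to local quadratic trends)."""
    profile = np.cumsum(x - np.mean(x))
    F = []
    for s in scales:
        n_boxes = len(profile) // s
        sq_resid = []
        for i in range(n_boxes):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)   # local quadratic trend
            sq_resid.append(np.mean((seg - trend) ** 2))        # squared residuals per box
        F.append(np.sqrt(np.mean(sq_resid)))
    # power-law exponent: slope of log F(s) against log s
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return np.array(F), alpha

rng = np.random.default_rng(0)
returns = rng.standard_normal(4096)                    # placeholder for index returns
scales = np.array([16, 32, 64, 128, 256, 512])
F, alpha = dfa(returns, scales)
print(f"estimated scaling exponent alpha = {alpha:.2f}")   # about 0.5 for uncorrelated data
```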

    Advanced methods of detrended fluctuation analysis and their applications in computational cardiology

    Fractals are ubiquitous in nature. A defining characteristic of fractality is self-similarity; the phenomenon looks similar when observed at multiple scales, which implies the existence of a power-law scaling relation. Detrended fluctuation analysis (DFA) is a popular tool for studying these fractal scaling relations. Power laws become linear relationships on logarithmic scales, and conventionally these scaling exponents are determined by simple linear regression in approximately linear regions of doubly logarithmic plots. However, in practice the scaling is hardly ever exact, and its behavior may vary at different scales. This thesis extends the fluctuation analysis by introducing robust tools for determining these scaling exponents. A method based on the Kalman smoother is utilized for extracting a whole spectrum of exponents as a function of the scale. The method is parameter-free and resistant to statistical noise, which distinguishes it from prior efforts to determine such local scaling exponents. Additionally, an optimization scheme is presented to obtain data-adaptive segmentation of approximately linear regimes. Based on an integer linear programming model, the procedure may readily be customized for various purposes. This versatility is demonstrated by applying the method to a group of data to find a common segmentation that is particularly well suited for machine learning applications. First, the methods are employed in exploring the details of the scaling by analyzing simulated data with known scaling properties. These findings provide insight into the interpretation of earlier results. Second, the methods are applied to the study of heart rate variability. The beating of the heart follows fractal-like patterns, and deviations in these complex variations may be indicative of cardiac diseases. In this context DFA is traditionally performed by extracting two scaling exponents, for short- and long-range correlations, respectively. This has been criticized as an oversimplification, which is corroborated by the results of this thesis. The heart rate exhibits richer fractal-like variability, which becomes apparent in the full scaling spectra. The additional information provided by these methods facilitates improved classification of cardiac conditions.
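    A rough illustration of the idea of a scale-dependent exponent spectrum: a standard DFA fluctuation function on a dense grid of scales, with a local exponent estimated by a sliding-window regression. The thesis itself uses a parameter-free Kalman smoother and an integer-programming segmentation, neither of which is reproduced here; the synthetic stand-in for RR intervals and the window length are assumptions.

```python
import numpy as np

def fluctuation_function(x, scales, order=1):
    """Standard DFA fluctuation function F(s) with linear detrending."""
    profile = np.cumsum(x - np.mean(x))
    F = []
    for s in scales:
        n = (len(profile) // s) * s
        segs = profile[:n].reshape(-1, s)
        t = np.arange(s)
        resid = [seg - np.polyval(np.polyfit(t, seg, order), t) for seg in segs]
        F.append(np.sqrt(np.mean(np.square(resid))))
    return np.array(F)

def local_scaling_exponents(log_s, log_F, window=5):
    """alpha(s) as the slope of log F vs log s in a short sliding window;
    a crude stand-in for the Kalman-smoother spectrum described in the thesis."""
    half = window // 2
    alpha = np.full(len(log_s), np.nan)
    for i in range(half, len(log_s) - half):
        sl = slice(i - half, i + half + 1)
        alpha[i] = np.polyfit(log_s[sl], log_F[sl], 1)[0]
    return alpha

rng = np.random.default_rng(3)
rr_like = np.cumsum(rng.standard_normal(8192)) * 0.01 + 0.8   # toy stand-in for RR intervals
scales = np.unique(np.round(np.geomspace(8, 1024, 30)).astype(int))
F = fluctuation_function(rr_like, scales)
alpha_spectrum = local_scaling_exponents(np.log(scales), np.log(F))
```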

    Estimators of Fractal Dimension: Assessing the Roughness of Time Series and Spatial Data

    The fractal or Hausdorff dimension is a measure of roughness (or smoothness) for time series and spatial data. The graph of a smooth, differentiable surface indexed in R^d has topological and fractal dimension d. If the surface is nondifferentiable and rough, the fractal dimension takes values between the topological dimension, d, and d+1. We review and assess estimators of fractal dimension by their large sample behavior under infill asymptotics, in extensive finite sample simulation studies, and in a data example on arctic sea-ice profiles. For time series or line transect data, box-count, Hall--Wood, semi-periodogram, discrete cosine transform and wavelet estimators are studied along with variation estimators with power indices 2 (variogram) and 1 (madogram), all implemented in the R package fractaldim. Considering both efficiency and robustness, we recommend the use of the madogram estimator, which can be interpreted as a statistically more efficient version of the Hall--Wood estimator. For two-dimensional lattice data, we propose robust transect estimators that use the median of variation estimates along rows and columns. Generally, the link between power variations of index p > 0 for stochastic processes and the Hausdorff dimension of their sample paths appears to be particularly robust and inclusive when p = 1. Comment: Published at http://dx.doi.org/10.1214/11-STS370 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
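    A simplified sketch of the variation estimators discussed above (p = 1 gives the madogram, p = 2 the variogram), using only the two smallest lags; the R package fractaldim implements these estimators with refinements not reproduced in this sketch.

```python
import numpy as np

def variation_estimator(x, p=1.0, lags=(1, 2)):
    """Fractal-dimension estimate for a regularly sampled series from power
    variations of index p (p = 1: madogram, p = 2: variogram), in simplified form:
    fit log V_p(h) against log h at the smallest lags and set D = 2 - slope / p."""
    n = len(x)
    log_v, log_h = [], []
    for l in lags:
        v = 0.5 * np.mean(np.abs(x[l:] - x[:-l]) ** p)   # empirical power variation
        log_v.append(np.log(v))
        log_h.append(np.log(l / n))                      # lag on the unit interval
    slope = np.polyfit(log_h, log_v, 1)[0]
    return 2.0 - slope / p

# example: Brownian motion has fractal dimension D = 1.5
rng = np.random.default_rng(4)
bm = np.cumsum(rng.standard_normal(10000))
print("madogram estimate: ", variation_estimator(bm, p=1.0))
print("variogram estimate:", variation_estimator(bm, p=2.0))
```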

    Financial markets as a complex system: A short time scale perspective

    In this paper we discuss macroscopic and microscopic properties of financial markets. By quantitatively analyzing a database of 13 financial time series recorded minute by minute, we identify some macroscopic statistical properties of the corresponding markets, with a special emphasis on temporal correlations. These analyses are performed using both linear and nonlinear tools. Multivariate correlations are also tested for, which leads to the identification of a global coupling mechanism between the stock markets considered. The application of a new formalism, called transfer entropy, allows us to measure the information flow between some of the financial time series. We then discuss some key aspects of recent attempts to model financial markets from a microscopic point of view. One model, based on simulation of the order book, is described in more detail, and the results of its practical implementation are presented. We finally address some general aspects of forecasting and modeling, in particular the role of stochastic and nonlinear deterministic processes. Keywords: time series analysis, econophysics, simulated markets, temporal correlations, high-frequency data
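    To illustrate the transfer-entropy quantity mentioned above, a toy plug-in estimate after coarse-graining both series into a few quantile bins, with history length 1. The bin count, history length, and synthetic driver-response series are assumptions; this is not the estimator used in the paper.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=3):
    """Plug-in estimate of TE(Y -> X) in bits, after quantile binning (history length 1)."""
    def symbolize(z):
        edges = np.quantile(z, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(z, edges)
    xs, ys = symbolize(x), symbolize(y)
    triples = Counter(zip(xs[1:], xs[:-1], ys[:-1]))     # (x_{t+1}, x_t, y_t)
    pairs_xx = Counter(zip(xs[1:], xs[:-1]))             # (x_{t+1}, x_t)
    pairs_xy = Counter(zip(xs[:-1], ys[:-1]))            # (x_t, y_t)
    singles = Counter(xs[:-1])                           # x_t
    n = len(xs) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]               # p(x_{t+1} | x_t, y_t)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]      # p(x_{t+1} | x_t)
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te

# toy example: y drives x with a one-step lag
rng = np.random.default_rng(5)
y = rng.standard_normal(20000)
x = 0.8 * np.roll(y, 1) + 0.2 * rng.standard_normal(20000)
print("TE(y -> x):", transfer_entropy(x, y))             # clearly positive
print("TE(x -> y):", transfer_entropy(y, x))             # close to zero
```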

    Long-term memories of developed and emerging markets: Using the scaling analysis to characterize their stage of development

    Scaling properties capture, in a single simple analysis, many of the volatility characteristics of financial markets, which is why we use them to probe the differing degrees of market development. We empirically study the scaling properties of daily foreign exchange rates, stock market indices and fixed-income instruments using the generalized Hurst approach. We show that the scaling exponents are associated with characteristics of the specific markets and can be used to differentiate markets by their stage of development. The robustness of the results is tested both by Monte Carlo studies and by computing the scaling in the frequency domain. Keywords: Scaling exponents; Time series analysis; Multi-fractals
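    A compact sketch of the generalized Hurst approach via q-th order structure functions, K_q(tau) ~ tau^{q H(q)}; the q values, lag range, and simulated log-price input are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def generalized_hurst(x, q_values=(1, 2, 3), taus=range(1, 20)):
    """Generalized Hurst exponents H(q) from q-th order structure functions:
    K_q(tau) = <|X(t+tau) - X(t)|^q> ~ tau^{q H(q)}."""
    taus = np.asarray(list(taus))
    H = {}
    for q in q_values:
        logK = [np.log(np.mean(np.abs(x[t:] - x[:-t]) ** q)) for t in taus]
        H[q] = np.polyfit(np.log(taus), logK, 1)[0] / q
    return H

# example: a log-price-like random walk should give H(q) close to 0.5 for all q
rng = np.random.default_rng(6)
log_price = np.cumsum(rng.standard_normal(5000) * 0.01)
print(generalized_hurst(log_price))
```

    Dependence of H(q) on q would indicate multifractality; a uniscaling series keeps H(q) roughly constant.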

    Generalized Volterra-Wiener and surrogate data methods for complex time series analysis

    Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2006. Includes bibliographical references (leaves 133-150). This thesis describes the current state of the art in nonlinear time series analysis, bringing together approaches from a broad range of disciplines including nonlinear dynamical systems, nonlinear modeling theory, time-series hypothesis testing, information theory, and self-similarity. We stress mathematical and qualitative relationships between key algorithms in the respective disciplines, in addition to describing new robust approaches to solving classically intractable problems. Part I presents a comprehensive review of various classical approaches to time series analysis from both deterministic and stochastic points of view. We focus on using these classical methods for quantification of complexity, and we propose a unified approach to complexity quantification that encapsulates several previous approaches. Part II presents robust modern tools for time series analysis, including surrogate data and Volterra-Wiener modeling. We describe new algorithms that bring the two approaches together and provide both a sensitive test for nonlinear dynamics and a noise-robust metric of chaos intensity. By Akhil Shashidhar. M.Eng.
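    As an illustration of the surrogate-data idea referenced above, a minimal phase-randomized (Fourier transform) surrogate test using a third-order time-reversal asymmetry statistic; the toy threshold-AR series, the choice of statistic, and the number of surrogates are assumptions for demonstration only, not the thesis's algorithms.

```python
import numpy as np

def ft_surrogate(x, rng):
    """Phase-randomized (FT) surrogate: preserves the power spectrum,
    destroys any nonlinear temporal structure."""
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=len(X))
    surr = np.abs(X) * np.exp(1j * phases)
    surr[0] = X[0]                        # keep the mean exactly
    if len(x) % 2 == 0:
        surr[-1] = X[-1]                  # keep the (real) Nyquist component
    return np.fft.irfft(surr, n=len(x))

def reversal_asymmetry(x, lag=1):
    """Simple nonlinearity statistic: third-order time-reversal asymmetry."""
    d = x[lag:] - x[:-lag]
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

rng = np.random.default_rng(7)
eps = rng.standard_normal(4000)
x = np.zeros(4000)
for t in range(1, 4000):                  # threshold AR(1): a time-irreversible toy series
    x[t] = (0.9 if x[t - 1] < 0 else -0.3) * x[t - 1] + eps[t]

stat = reversal_asymmetry(x)
null = [reversal_asymmetry(ft_surrogate(x, rng)) for _ in range(99)]
p = (1 + np.sum(np.abs(null) >= abs(stat))) / 100       # rank-based two-sided p-value
print(f"observed statistic {stat:.3f}, surrogate p-value ~ {p:.2f}")
```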