35 research outputs found

    Approximated maximum likelihood estimation in multifractal random walks

    We present an approximated maximum likelihood method for the multifractal random walk processes of [E. Bacry et al., Phys. Rev. E 64, 026103 (2001)]. The likelihood is computed using a Laplace approximation and a truncation of the dependency structure for the latent volatility. The procedure is implemented as a package in the R language. Its performance is tested on synthetic data and compared to an inference approach based on the generalized method of moments. The method is applied to estimate parameters for various financial stock indices. (Comment: 8 pages, 3 figures, 2 tables)
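
    The process being estimated here is the log-normal multifractal random walk of Bacry et al., built from Gaussian increments modulated by exp(omega), where omega has a logarithmically decaying covariance. A minimal Python simulation sketch of that construction follows; the parameter names and values are illustrative, and this is not the paper's R package or its Laplace-approximated likelihood.

    import numpy as np

    def simulate_mrw(n, lam2=0.02, T=1000.0, sigma2=1.0, seed=0):
        """Simulate one path of a log-normal multifractal random walk.

        Increments are eps[k] * exp(omega[k]), with eps i.i.d. Gaussian and
        omega a Gaussian vector whose covariance decays logarithmically up to
        an integral scale T (the construction of Bacry, Delour & Muzy, 2001).
        """
        rng = np.random.default_rng(seed)
        k = np.arange(n)
        lags = np.abs(k[:, None] - k[None, :])
        # Logarithmic covariance, cut off at the integral scale T
        cov = lam2 * np.log(np.maximum(T / (lags + 1.0), 1.0))
        # Mean chosen so that E[exp(2 * omega)] = 1 (the usual normalisation)
        mean = -lam2 * np.log(T) * np.ones(n)
        omega = rng.multivariate_normal(mean, cov)
        eps = rng.normal(0.0, np.sqrt(sigma2), size=n)
        return np.cumsum(eps * np.exp(omega))

    path = simulate_mrw(1024)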

    On the multiresolution structure of Internet traffic traces

    Internet traffic on a network link can be modeled as a stochastic process. After detecting and quantifying the properties of this process with statistical tools, a series of mathematical models is developed, culminating in one that generates "traffic" exhibiting, as a key feature, the same difference in behavior across time scales observed in real traffic, and that is moreover indistinguishable from real traffic by other statistical tests as well. Tools inspired by the models are then used to determine and calibrate the type of activity taking place at each time scale. Surprisingly, the above procedure does not require any detailed information about either the network dynamics or the decomposition of the total traffic into its constituent user connections, but only the compliance of these connections with very weak conditions. (Comment: 57 pages, color figures; figures are of low quality due to space considerations)
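
    The key feature described above is different behavior across time scales. A standard first diagnostic for such scaling in a traffic trace is the aggregated-variance (variance-time) plot; the Python sketch below is a generic illustration of that diagnostic, not the models developed in the paper, and the Poisson input is a stand-in for real per-bin packet or byte counts.

    import numpy as np

    def aggregated_variance(counts, scales):
        """Variance of the block-averaged series at each aggregation level m.

        For self-similar traffic the log-log slope of variance versus m is
        roughly 2H - 2, giving a crude estimate of the Hurst parameter H.
        """
        out = []
        for m in scales:
            n_blocks = len(counts) // m
            blocks = counts[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
            out.append(blocks.var())
        return np.array(out)

    # Synthetic short-range-dependent input; real per-bin packet or byte
    # counts from a trace would be used instead.
    counts = np.random.default_rng(1).poisson(100, size=2 ** 16)
    scales = 2 ** np.arange(1, 10)
    variances = aggregated_variance(counts, scales)
    slope = np.polyfit(np.log(scales), np.log(variances), 1)[0]
    hurst_estimate = 1.0 + slope / 2.0   # close to 0.5 for this Poisson input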

    Multifractal analysis of memory usage patterns

    The discovery of fractal phenomena in computer-related areas such as network traffic flow leads to the hypothesis that many computer resources display fractal characteristics. The goal of this study is to apply fractal analysis to computer memory usage patterns. We devise methods for calculating the Hölder exponent of a time series and for calculating the fractal dimension of a plot of a time series. These methods are then applied to memory-related data collected from a Unix server. We find that our methods for calculating the Hölder exponent of a time series yield results that are independently confirmed by calculation of the fractal dimension of the time series, and that computer memory use does indeed display multifractal behavior. In addition, it is hypothesized that this multifractal behavior may be useful in making certain predictions about the future behavior of an operating system.
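
    One of the two quantities computed in this study is the fractal dimension of the plot of a time series. A crude box-counting sketch of that kind of calculation is given below, assuming the graph is rescaled to the unit square and occupied grid boxes are counted; the study's own algorithms may differ in detail.

    import numpy as np

    def box_counting_dimension(series, box_sizes):
        """Crude box-counting dimension of the graph of a 1-D series.

        The graph is rescaled to the unit square; for each box size we count
        the grid boxes containing at least one sampled point, and the
        dimension is the slope of log(count) against log(1 / size).
        """
        x = np.linspace(0.0, 1.0, len(series))
        y = series - series.min()
        y = y / (y.max() + 1e-12)
        counts = []
        for s in box_sizes:
            ix = np.floor(x / s).astype(int)
            iy = np.floor(y / s).astype(int)
            counts.append(len(set(zip(ix, iy))))
        slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
        return slope

    # A Brownian-motion-like series should give a dimension near 1.5
    series = np.cumsum(np.random.default_rng(2).normal(size=4096))
    dim = box_counting_dimension(series, [1/8, 1/16, 1/32, 1/64, 1/128])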

    On partitioning multivariate self-affine time series

    Given a multivariate time series, possibly of high dimension, with an unknown and time-varying joint distribution, it is of interest to be able to completely partition the time series into disjoint, contiguous subseries, each of which has different distributional or pattern attributes from the preceding and succeeding subseries. An additional feature of many time series is that they display self-affinity, so that subseries at one time scale are similar to subseries at another after application of an affine transformation. Such qualities are observed in time series from many disciplines, including biology, medicine, economics, finance, and computer science. This paper defines the relevant multiobjective combinatorial optimization problem, under limited assumptions, as a biobjective one, and presents a specialized evolutionary algorithm that finds optimal self-affine time series partitionings with a minimum of choice parameters. The algorithm finds partitionings not only for all possible numbers of partitions given data constraints, but also for self-affinities between these partitionings and some fine-grained partitioning. The resulting set of Pareto-efficient solution sets provides a rich representation of the self-affine properties of a multivariate time series at different locations and time scales.
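
    The partitioning problem above is posed as a biobjective combinatorial optimization solved by a specialized evolutionary algorithm. As a small generic building block of any such method (not the paper's algorithm), the sketch below filters candidate partitionings, scored on two hypothetical objectives, down to the Pareto-efficient ones.

    import numpy as np

    def pareto_front(scores):
        """Indices of non-dominated points for two objectives to be minimised.

        A point is kept if no other point is at least as good in both
        objectives and strictly better in at least one.
        """
        scores = np.asarray(scores, dtype=float)
        keep = []
        for i, s in enumerate(scores):
            dominated = np.any(
                np.all(scores <= s, axis=1) & np.any(scores < s, axis=1)
            )
            if not dominated:
                keep.append(i)
        return keep

    # Each row: (within-segment dissimilarity, self-affinity mismatch) for one
    # candidate partitioning; the objective names and numbers are made up.
    candidates = [(0.9, 0.2), (0.5, 0.5), (0.4, 0.9), (0.95, 0.25), (0.3, 1.1)]
    front = pareto_front(candidates)   # (0.95, 0.25) is dominated by (0.9, 0.2)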

    PREDICTING INTERNET TRAFFIC BURSTS USING EXTREME VALUE THEORY

    Computer networks play an important role in today's organizations and in people's lives. These interconnected devices share a common medium, and they tend to compete for it. Quality of Service (QoS) comes into play to define what level of service users get, so accurately defining QoS metrics is important. Bursts and serious deteriorations are omnipresent in Internet traffic and are considered an important aspect of it. This thesis examines bursts and serious deteriorations in Internet traffic and applies Extreme Value Theory (EVT) to their prediction and modelling. EVT itself is a field of statistics that has long been applied in areas like hydrology and finance, and has only recently been introduced to telecommunications. Model fitting is based on real traces from the Bellcore laboratory, along with simulated traces based on fractional Gaussian noise and linear fractional alpha-stable motion. QoS traces from the University of Napoli are also used in the prediction stage. Three EVT methods are successfully applied to the burst prediction problem: the Block Maxima (BM) method, the Peaks Over Threshold (POT) method, and the R-Largest Order Statistics (RLOS) method. A clear methodology is developed for the burst prediction problem, and new QoS metrics are suggested based on the Return Level and Return Period. Robust QoS metrics can thus be defined, and in turn a superior QoS obtained that would support mission-critical applications.
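
    Of the three EVT methods listed, the Block Maxima approach is the most compact to illustrate: fit a generalized extreme value distribution to per-block maxima and read off a return level for a chosen return period. The Python sketch below uses scipy for the fit; the block size, return period, and synthetic trace are placeholders rather than the thesis's settings.

    import numpy as np
    from scipy.stats import genextreme

    def block_maxima_return_level(series, block_size, return_period):
        """Fit a GEV distribution to block maxima and compute a return level.

        The return level is the value exceeded on average once every
        `return_period` blocks, an EVT-style burst metric of the kind the
        thesis builds its QoS measures on.
        """
        n_blocks = len(series) // block_size
        maxima = series[: n_blocks * block_size].reshape(n_blocks, block_size).max(axis=1)
        shape, loc, scale = genextreme.fit(maxima)
        return genextreme.isf(1.0 / return_period, shape, loc=loc, scale=scale)

    # Synthetic stand-in for byte counts per interval; real Bellcore-style
    # traces would be used instead.
    traffic = np.random.default_rng(3).lognormal(mean=8.0, sigma=1.0, size=50_000)
    level = block_maxima_return_level(traffic, block_size=500, return_period=100)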

    Statistical methods for scale-invariant and multifractal stochastic processes.

    This thesis focuses on stochastic modeling and statistical methods in finance and in climate science. Two financial markets, short-term interest rates and electricity prices, are analyzed. We find that the evidence of mean reversion in short-term interest rates is weak, while the "log-returns" of electricity prices have significant anti-correlations. More importantly, empirical analyses confirm the multifractal nature of these financial markets, and we propose multifractal models that incorporate the specific conditional mean reversion and level dependence. A second topic in the thesis is the analysis of regional (5° × 5° and 2° × 2° latitude-longitude) globally gridded surface temperature series for the period 1900-2014, with respect to a linear trend and long-range dependence. We find statistically significant trends in most regions. However, we also demonstrate that the existence of a second scaling regime on decadal time scales will have an impact on trend detection. The last main result is an approximative maximum likelihood (ML) method for the log-normal multifractal random walk. It is shown that the ML method has applications beyond parameter estimation and can, for instance, be used to compute various risk measures in financial markets.
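
    One of the empirical findings quoted above is significant anti-correlation in electricity-price "log-returns". A minimal sketch of the kind of check involved, a sample autocorrelation of log-returns at short lags, is given below, with synthetic prices standing in for market data; it is not the thesis's own analysis.

    import numpy as np

    def sample_autocorrelation(x, max_lag):
        """Sample autocorrelation of a series at lags 1..max_lag.

        Significantly negative values at short lags are the kind of
        anti-correlation described for electricity-price log-returns.
        """
        x = np.asarray(x, dtype=float)
        x = x - x.mean()
        denom = np.dot(x, x)
        return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

    # prices -> log-returns -> short-lag autocorrelations (synthetic data)
    prices = np.exp(np.cumsum(np.random.default_rng(4).normal(0.0, 0.01, size=5000)))
    log_returns = np.diff(np.log(prices))
    acf = sample_autocorrelation(log_returns, max_lag=10)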

    Self-similar traffic and network dynamics

    Copyright © 2002 IEEE. One of the most significant findings of traffic measurement studies over the last decade has been the observed self-similarity in packet network traffic. Subsequent research has focused on the origins of this self-similarity and on the network engineering significance of this phenomenon. This paper reviews what is currently known about network traffic self-similarity and its significance. We then consider a matter of current research, namely, the manner in which network dynamics (specifically, the dynamics of the transmission control protocol (TCP), the predominant transport protocol used in today's Internet) can affect the observed self-similarity. To this end, we first discuss some of the pitfalls associated with applying traditional performance evaluation techniques to highly interacting, large-scale networks such as the Internet. We then present one promising approach based on chaotic maps to capture and model the dynamics of TCP-type feedback control in such networks. Not only can appropriately chosen chaotic map models capture a range of realistic source characteristics, but by coupling these to network state equations, one can study the effects of network dynamics on the observed scaling behavior. We consider several aspects of TCP feedback and illustrate by examples that while TCP-type feedback can modify the self-similar scaling behavior of network traffic, it neither generates nor eliminates it. (Ashok Erramilli, Matthew Roughan, Darryl Veitch and Walter Willinger)
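
    The chaotic-map approach mentioned above builds traffic sources from low-dimensional maps whose intermittent dynamics produce heavy-tailed on/off periods. The sketch below implements a simple intermittency map of that general type as an on/off source indicator; the coupling to network state equations and TCP feedback discussed in the paper is not reproduced, and the parameter values are illustrative.

    import numpy as np

    def intermittency_map_onoff(n_steps, d=0.7, m=1.8, x0=0.3):
        """On/off source indicator generated by a simple intermittency map.

        x <- x + ((1 - d) / d**m) * x**m   if x <= d   (source off)
        x <- (x - d) / (1 - d)             otherwise   (source on)

        For 1 < m < 2 the sojourn times near x = 0 are heavy-tailed, which is
        what lets aggregates of such sources exhibit self-similar scaling.
        """
        x = x0
        active = np.empty(n_steps, dtype=int)
        for i in range(n_steps):
            if x <= d:
                active[i] = 0
                x = x + ((1.0 - d) / d ** m) * x ** m
            else:
                active[i] = 1
                x = (x - d) / (1.0 - d)
        return active

    # Aggregate a few independent sources into a traffic-like count series
    traffic = sum(intermittency_map_onoff(20_000, x0=x0) for x0 in (0.11, 0.37, 0.52, 0.83, 0.94))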

    Internet traffic volumes characterization and forecasting

    Internet usage increases every year, and the need to estimate the growth of the generated traffic has become a major topic. Forecasting actual figures in advance is essential for bandwidth allocation, network design and investment planning. In this thesis, novel mathematical equations are presented to model and predict long-term Internet traffic in terms of total aggregate volume, both globally and more locally. Historical traffic data from consecutive years reveal hidden numerical patterns as the values progress year over year, and this trend can be well represented with appropriate mathematical relations. The proposed formulae have excellent fitting properties over long-history measurements and can indicate forthcoming traffic for the next years with an exceptionally low prediction error. In cases where the pending traffic data have already become available, the suggested equations provide more successful results than the respective projections from leading worldwide research. The studies also imply that future traffic depends strongly on past activity and on the growth of Internet users, provided that a large and representative sample of pertinent data exists from wide geographical areas. To the best of my knowledge, this work is the first to introduce effective prediction methods that rely exclusively on the static attributes and the progression properties of historical values.
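
    The thesis's own equations are not reproduced here; as a generic illustration of the idea of fitting a growth curve to yearly aggregate volumes and extrapolating it, the sketch below fits a logistic model with scipy to made-up figures. The function form, parameters, and numbers are all assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    def growth_model(t, a, b, c):
        """Generic logistic growth curve; the thesis's own equations differ."""
        return a / (1.0 + np.exp(-b * (t - c)))

    # Hypothetical yearly aggregate traffic volumes (illustrative numbers only)
    years = np.arange(2005, 2018)
    volumes = np.array([2, 4, 7, 12, 20, 31, 44, 60, 78, 96, 113, 128, 140], dtype=float)

    t = years - years[0]
    params, _ = curve_fit(growth_model, t, volumes, p0=(200.0, 0.5, 8.0))
    forecast_2020 = growth_model(2020 - years[0], *params)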

    Critical Market Crashes

    This review is a partial synthesis of the book "Why Stock Markets Crash" (Princeton University Press, January 2003), which presents a general theory of financial crashes and of stock market instabilities that the author and his co-workers have developed over the past seven years. The study of the frequency distribution of drawdowns, or runs of successive losses, shows that large financial crashes are "outliers": they form a class of their own, as can be seen from their statistical signatures. If large financial crashes are "outliers", they are special and thus require a special explanation, a specific model, a theory of their own. In addition, their special properties may perhaps be used for their prediction. The main mechanisms leading to positive feedbacks, i.e., self-reinforcement, such as imitative behavior and herding between investors, are reviewed, with many references provided to the relevant literature outside the confines of physics. Positive feedbacks provide the fuel for the development of speculative bubbles, preparing the instability for a major crash. We present several detailed mathematical models of speculative bubbles and crashes. The most important message is the discovery of robust and universal signatures of the approach to crashes. These precursory patterns have been documented for essentially all crashes on developed as well as emergent stock markets, on currency markets, on company stocks, and so on. The concept of an "anti-bubble" is also summarized, with two forward predictions, one for the Japanese stock market starting in 1999 and one for the USA stock market, still running. We conclude by presenting our view of the organization of financial markets. (Comment: LaTeX, 89 pages and 38 figures; in press in Physics Reports)
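
    The central empirical object in this review is the frequency distribution of drawdowns, i.e., cumulative losses over runs of successive down moves. A minimal Python sketch of extracting drawdowns from a price series follows; the outlier test itself and any crash-precursor fitting are left aside, and the synthetic prices are a stand-in for real index data.

    import numpy as np

    def drawdowns(prices):
        """Cumulative log-losses over maximal runs of consecutive down moves.

        Each drawdown is the total loss from a local peak to the end of an
        uninterrupted run of negative returns, the quantity whose frequency
        distribution separates 'outlier' crashes from ordinary fluctuations.
        """
        returns = np.diff(np.log(prices))
        runs, current = [], 0.0
        for r in returns:
            if r < 0:
                current += r
            elif current < 0:
                runs.append(-current)   # magnitude of the completed run
                current = 0.0
        if current < 0:
            runs.append(-current)
        return np.array(runs)

    # Rank-ordered drawdowns; a crash "outlier" would sit far above the bulk
    # of this distribution.
    prices = np.exp(np.cumsum(np.random.default_rng(5).normal(0.0, 0.01, size=10_000)))
    dd = np.sort(drawdowns(prices))[::-1]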