138 research outputs found
Spectral Time Series Analysis of Ocean Wave Buoy Measurements
Waves in the ocean can be as dangerous as they are impressive. In order to study the behaviour of such waves, buoys are commonly deployed to collect recordings of the ocean surface over time. This results in large quantities of high-frequency multivariate time series data. The statistical analysis of such data is of great importance in a variety of engineering and scientific contexts, from the design of coastal flood defences to offshore structures. We develop methodology for analysing such buoy data, investigating two key questions. Firstly, how should we perform parameter inference for models of the frequency domain behaviour of the surface, given recorded buoy data? Secondly, how can we detect statistically significant non-linearities present in these time series? For parameter inference, we find that pseudo-likelihood approaches greatly outperform state-of-the-art methodologies. As a result, not only can we obtain more reliable parameter estimates, but we can also perform inference for more complicated models, allowing for a more intricate description of the waves. Due to the improved performance of such estimates, we are able to see the evolution of these parameters throughout storm events, using recorded buoy data from both California and the North Sea. For detecting non-linearities, we develop a robust testing procedure by evaluating the bispectrum of the observed time series against the bispectrum of bootstrap simulated Gaussian processes with similar characteristics. We explore the performance of this technique in simulation studies, and apply the approach to buoy data from California
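The bootstrap test sketched in this abstract can be illustrated in miniature: compare a bispectrum statistic of the data against the same statistic computed on phase-randomised (spectrum-preserving, Gaussian) surrogates. Everything below (segment length, frequency pairs, the toy quadratic nonlinearity) is an illustrative assumption, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def bispectrum_stat(x, seg_len=64, pairs=((2, 3), (2, 4), (3, 4), (3, 5), (4, 5))):
    """Triple product X[k1]*X[k2]*conj(X[k1+k2]) averaged over non-overlapping
    segments and a few low-frequency pairs; its magnitude estimates |B(k1,k2)|."""
    segs = x[: len(x) // seg_len * seg_len].reshape(-1, seg_len)
    X = np.fft.fft(segs - segs.mean(axis=1, keepdims=True), axis=1)
    acc = 0j
    for k1, k2 in pairs:
        acc += (X[:, k1] * X[:, k2] * np.conj(X[:, k1 + k2])).mean()
    return abs(acc) / len(pairs)

def gaussian_surrogate(x, rng):
    """Phase-randomised surrogate: same power spectrum, random Fourier phases,
    i.e. a realisation of a linear Gaussian process."""
    X = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = 0.0  # keep the mean term real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

# Toy quadratically nonlinear "sea surface": Gaussian driver plus its square.
n = 8192
g = rng.standard_normal(n)
x = g + 0.8 * g**2

t_data = bispectrum_stat(x)
t_surr = [bispectrum_stat(gaussian_surrogate(x, rng)) for _ in range(99)]
p_value = (1 + sum(t >= t_data for t in t_surr)) / 100  # rank-based Monte Carlo p
```

A small p-value rejects the linear Gaussian null; for real buoy records the statistic would be evaluated across the full bifrequency plane rather than a handful of pairs.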
Nonlinear time-series analysis revisited
In 1980 and 1981, two pioneering papers laid the foundation for what became
known as nonlinear time-series analysis: the analysis of observed
data---typically univariate---via dynamical systems theory. Based on the
concept of state-space reconstruction, this set of methods allows us to compute
characteristic quantities such as Lyapunov exponents and fractal dimensions, to
predict the future course of the time series, and even to reconstruct the
equations of motion in some cases. In practice, however, there are a number of
issues that restrict the power of this approach: whether the signal accurately
and thoroughly samples the dynamics, for instance, and whether it contains
noise. Moreover, the numerical algorithms that we use to instantiate these
ideas are not perfect; they involve approximations, scale parameters, and
finite-precision arithmetic, among other things. Even so, nonlinear time-series
analysis has been used to great advantage on thousands of real and synthetic
data sets from a wide variety of systems ranging from roulette wheels to lasers
to the human heart. Even in cases where the data do not meet the mathematical
or algorithmic requirements to assure full topological conjugacy, the results
of nonlinear time-series analysis can be helpful in understanding,
characterizing, and predicting dynamical systems
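The state-space reconstruction at the heart of these methods is a delay-coordinate embedding, which takes only a few lines; the logistic-map observable below is a stand-in for measured data:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens delay-coordinate reconstruction:
    row i is (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Scalar observable from the logistic map at the fully chaotic parameter r = 4.
x = np.empty(1000)
x[0] = 0.3
for t in range(999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

# In two dimensions the reconstructed points lie on the parabola x' = 4x(1-x),
# recovering the deterministic rule from a scalar time series.
emb = delay_embed(x, dim=2, tau=1)
```

Quantities such as Lyapunov exponents and fractal dimensions are then estimated from neighbourhoods of points in `emb`; the choices of `dim` and `tau` are exactly the kind of scale parameters the abstract warns about.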
Nonlinear data analysis of the CMB
The cosmological principle of homogeneity and statistical isotropy of space is a fundamental assumption of modern cosmology. On this basis, the existence of an inflationary phase in the young universe is postulated, which in turn predicts primordial Gaussian-distributed fluctuations that manifest themselves in the cosmic microwave background as temperature and polarisation anisotropies. The basic idea of my work was the further development of a model-independent analysis method that tests the Gaussian hypothesis for the density fluctuations, where the Gaussianity of an ensemble is defined by the random distribution of the Fourier phases in phase space.
The method is based on a nonlinear data analysis using surrogate maps that mimic the linear properties of a data set. Within the surrogate framework, I used two different image analysis techniques, namely Minkowski functionals and scaling indices, both sensitive to higher-order correlations, to examine maps of the cosmic microwave background radiation from the WMAP and Planck experiments for scale-dependent phase correlations. A particular focus was on studies of hemispherical asymmetries and of the influence of the Galactic plane on the results. From the analysis of phase correlations in phase space, I developed new methods for investigating correlations between higher-order statistics in real space and the information contained in phase space.
Both image analysis techniques detected phase correlations of comparable strength on the largest scales of the cosmic microwave background. The influence of the Galactic plane on these results proved negligibly small in cut-sky analyses and when comparing different foreground subtraction methods within the two experiments. Hemispherical anomalies on the largest scales of the background radiation were repeatedly confirmed. The parametrisation of non-Gaussianity by the fNL parameter proved insufficient when comparing fNL simulations with experimental data. Analyses of the data using Bianchi models showed hints of a non-trivial topology of the universe. The results of my work point to a violation of the standard single-field slow-roll model of inflation and contradict the predictions of isotropic cosmologies. More generally, my studies open new avenues towards a better understanding of non-Gaussian signatures in complex spatial structures, in particular through the analysis of Fourier-phase correlations and their influence on higher-order statistics in real space. In the near future, the polarisation data of the Planck experiment may shed further light on the anomalies of the cosmic microwave background radiation. Describing the polarised microwave background within a phase analysis would be an important complement to classical studies
The Random Walk of High Frequency Trading
This paper builds a model of high-frequency equity returns by separately
modeling the dynamics of trade-time returns and trade arrivals. Our main
contributions are threefold. First, we characterize the distributional behavior
of high-frequency asset returns both in ordinary clock time and in trade time.
We show that when controlling for pre-scheduled market news events, trade-time
returns of the highly liquid near-month E-mini S&P 500 futures contract are
well characterized by a Gaussian distribution at very fine time scales. Second,
we develop a structured and parsimonious model of clock-time returns by
subordinating a trade-time Gaussian distribution with a trade arrival process
that is associated with a modified Markov-Switching Multifractal Duration
(MSMD) model. This model provides an excellent characterization of
high-frequency inter-trade durations. Over-dispersion in this distribution of
inter-trade durations leads to leptokurtosis and volatility clustering in
clock-time returns, even when trade-time returns are Gaussian. Finally, we use
our model to extrapolate the empirical relationship between trade rate and
volatility in an effort to understand conditions of market failure. Our model
suggests that the 1,200 km physical separation of financial markets in Chicago
and New York/New Jersey provides a natural ceiling on systemic volatility and
may contribute to market stability during periods of extremely heavy trading
Essays on the nonlinear and nonstochastic nature of stock market data
The nature and structure of stock-market price dynamics is an area of ongoing and rigorous scientific debate. For almost three decades, most emphasis has been given to upholding the concepts of Market Efficiency and rational investment behaviour. Such an approach has favoured the development of numerous linear and nonlinear models, mainly of stochastic foundations. Advances in mathematics have shown that nonlinear deterministic processes, i.e. "chaos", can produce sequences that appear random to linear statistical techniques. Until recently, investment finance has been a science based on linearity and stochasticity. Hence it is important that studies of Market Efficiency include investigations of chaotic determinism and power laws. As far as chaos is concerned, research results are rather mixed or inconclusive, and fraught with controversy. This inconclusiveness is attributed to two things: the nature of stock market time series, which are highly volatile and contaminated with a substantial amount of noise of largely unknown structure, and the lack of appropriate robust statistical testing procedures. In order to overcome such difficulties, this thesis shows empirically, and for the first time, how one can combine novel techniques from the recent chaotic and signal analysis literature under a univariate time series analysis framework. Three basic methodologies are investigated: recurrence analysis, surrogate data and wavelet transforms. Recurrence analysis is used to reveal qualitative and quantitative evidence of nonlinearity and nonstochasticity for a number of stock markets. It is then demonstrated how surrogate data, under a statistical hypothesis testing framework, can be simulated to provide similar evidence. Finally, it is shown how wavelet transforms can be applied in order to reveal various salient features of the market data and provide a platform for nonparametric regression and denoising.
The results indicate that, without invoking any parametric model-based assumptions, one can readily deduce that there is more to the data than linearity and stochastic randomness. Moreover, substantial evidence of recurrent patterns and aperiodicities is discovered which can be attributed to chaotic dynamics. These results are therefore very consistent with existing research indicating some types of nonlinear dependence in financial data. In conclusion, the value of this thesis lies in its contribution to the overall evidence on Market Efficiency and chaotic determinism in financial markets. The main implication here is that the theory of equilibrium pricing in financial markets may need reconsideration in order to accommodate the structures revealed
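The recurrence-plot machinery such a thesis relies on can be sketched in miniature, using a chaotic logistic-map series against a shuffled copy as a stochastic benchmark (toy data, not actual market series). Determinism shows up as recurrent points continuing along diagonal lines:

```python
import numpy as np

rng = np.random.default_rng(3)

def recurrence_matrix(x, eps):
    """Thresholded recurrence plot: R[i, j] = 1 when |x_i - x_j| < eps."""
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def determinism(R):
    """Crude determinism measure from recurrence quantification: among
    off-diagonal recurrent pairs (i, j), the fraction whose diagonal
    successor (i+1, j+1) is also recurrent."""
    off = R.copy()
    np.fill_diagonal(off, 0)
    pairs = off[:-1, :-1]
    follows = pairs & off[1:, 1:]
    return follows.sum() / pairs.sum()

# Chaotic logistic-map series versus an i.i.d. shuffle of the same values
# (the shuffle keeps the value distribution but destroys the dynamics).
x = np.empty(500)
x[0] = 0.37
for t in range(499):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
x_shuf = rng.permutation(x)

det_chaos = determinism(recurrence_matrix(x, eps=0.05))
det_shuf = determinism(recurrence_matrix(x_shuf, eps=0.05))
```

Because close states of a deterministic map stay close for a while, `det_chaos` is much larger than `det_shuf`, even though both series have identical value distributions and hence identical recurrence rates.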
Evaluation of iterative Kalman smoother schemes for multi-decadal past climate analysis with comprehensive Earth system models
Paleoclimate reconstruction based on assimilation of proxy observations requires specification of the control variables and their background statistics. As opposed to numerical weather prediction (NWP), which is mostly an initial condition problem, the main source of error growth in deterministic Earth system models (ESMs) regarding the model low-frequency response comes from errors in other inputs: parameters for the small-scale physics, as well as forcing and boundary conditions. Also, comprehensive ESMs are non-linear, and only a few ensemble members can be run on current high-performance computers. Under these conditions we evaluate two assimilation schemes, which (a) rely on iterations to deal with non-linearity and (b) are based on low-dimensional control vectors to reduce the computational need. The practical implementation would assume that the ESM has been previously globally tuned with current observations and that for a given situation there is previous knowledge of the most sensitive inputs (given corresponding uncertainties), which should be selected as control variables. The low dimension of the control vector allows for using full-rank covariances and resorting to finite-difference sensitivities (FDSs). The schemes are then an FDS implementation of the iterative Kalman smoother (FDS-IKS, a Gauss-Newton scheme) and a so-called FDS-multistep Kalman smoother (FDS-MKS, based on repeated assimilation of the observations). We describe the schemes and evaluate the analysis step for a data assimilation window in two numerical experiments: (a) a simple 1-D energy balance model (Ebm1D, which has an adjoint code) with present-day surface air temperature from the NCEP/NCAR reanalysis data as a target and (b) a multi-decadal synthetic case with the Community Earth System Model (CESM v1.2, with no adjoint). In the Ebm1D experiment, the FDS-IKS converges to the same parameters and cost function values as a 4D-Var scheme.
For a similar number of iterations to the FDS-IKS, the FDS-MKS results in slightly higher cost function values, which are still substantially lower than those of an ensemble transform Kalman filter (ETKF). In the CESM experiment, we include an ETKF with Gaussian anamorphosis (ETKF-GA) implementation as a potential non-linear assimilation alternative. For three iterations, both FDS schemes obtain cost function values that are close to each other and, at about half the computational cost, lower than those of the ETKF and the ETKF-GA (which have similar cost function values). Overall, the FDS-IKS seems more adequate for the problem, with the FDS-MKS potentially more useful to damp increments in early iterations of the FDS-IKS
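The analysis step of a Gauss-Newton iterative Kalman smoother with finite-difference sensitivities can be sketched on a toy two-parameter problem. The exponential "model", the covariances, and all numbers below are illustrative assumptions, not the paper's ESM setup:

```python
import numpy as np

def fds_jacobian(h, theta, delta=1e-6):
    """Finite-difference sensitivities of the observation operator h."""
    base = h(theta)
    J = np.empty((len(base), len(theta)))
    for j in range(len(theta)):
        pert = theta.copy()
        pert[j] += delta
        J[:, j] = (h(pert) - base) / delta
    return J

def fds_iks(h, y, theta_b, B, R, n_iter=8):
    """Gauss-Newton iterative Kalman smoother with FD sensitivities:
    minimises 0.5*(t-tb)' B^-1 (t-tb) + 0.5*(y-h(t))' R^-1 (y-h(t))."""
    Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
    theta = theta_b.copy()
    for _ in range(n_iter):
        H = fds_jacobian(h, theta)                     # relinearise each iteration
        A = H.T @ Rinv @ H + Binv
        b = H.T @ Rinv @ (y - h(theta) + H @ (theta - theta_b))
        theta = theta_b + np.linalg.solve(A, b)
    return theta

# Toy nonlinear "model": exponential relaxation with two control parameters.
t = np.linspace(0, 5, 40)
h = lambda th: th[0] * np.exp(-th[1] * t)
true = np.array([2.0, 0.7])
rng = np.random.default_rng(4)
y = h(true) + 0.01 * rng.standard_normal(len(t))       # synthetic observations

theta_b = np.array([1.5, 0.9])                         # background (prior) guess
B = np.eye(2)                                          # loose background covariance
R = 0.01**2 * np.eye(len(t))                           # observation error covariance
theta_a = fds_iks(h, y, theta_b, B, R)                 # analysis
```

The low-dimensional control vector is what makes the full-rank `B` and the column-by-column finite-difference Jacobian affordable, which is exactly the regime the paper targets.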
Practical implementation of nonlinear time series methods: The TISEAN package
Nonlinear time series analysis is becoming a more and more reliable tool for
the study of complicated dynamics from measurements. The concept of
low-dimensional chaos has proven to be fruitful in the understanding of many
complex phenomena despite the fact that very few natural systems have actually
been found to be low dimensional deterministic in the sense of the theory. In
order to evaluate the long term usefulness of the nonlinear time series
approach as inspired by chaos theory, it will be important that the
corresponding methods become more widely accessible. This paper, while not a
proper review on nonlinear time series analysis, tries to make a contribution
to this process by describing the actual implementation of the algorithms, and
their proper usage. Most of the methods require the choice of certain
parameters for each specific time series application. We will try to give
guidance in this respect. The scope and selection of topics in this article, as
well as the implementational choices that have been made, correspond to the
contents of the software package TISEAN which is publicly available from
http://www.mpipks-dresden.mpg.de/~tisean . In fact, this paper can be seen as
an extended manual for the TISEAN programs. It fills the gap between the
technical documentation and the existing literature, providing the necessary
entry points for a more thorough study of the theoretical background.
Comment: 27 pages, 21 figures, downloadable software at
http://www.mpipks-dresden.mpg.de/~tisea
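One parameter choice the paper gives guidance on is the embedding delay, conventionally taken at the first minimum of the time-delayed mutual information (what TISEAN's `mutual` routine computes). A minimal Python analogue, with an illustrative histogram estimator and sine test signal:

```python
import numpy as np

def mutual_information(x, lag, bins=16):
    """Histogram estimate of the mutual information (in nats) between
    x_t and x_{t+lag}."""
    a, b = x[:-lag], x[lag:]
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                                   # skip empty bins in the sum
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

# Test signal: a sine with period ~31 samples, so the first minimum of the
# mutual information should sit near a quarter period (lag ~ 8).
x = np.sin(0.2 * np.arange(2000))
mi = [mutual_information(x, lag) for lag in range(1, 30)]
first_min = next(k + 1 for k in range(1, len(mi) - 1)
                 if mi[k] < mi[k - 1] and mi[k] <= mi[k + 1])
```

The delay found this way then feeds into the delay-coordinate embedding; TISEAN implements the same workflow as compiled command-line tools rather than library calls.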
Generalized Volterra-Wiener and surrogate data methods for complex time series analysis
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2006. Includes bibliographical references (leaves 133-150).
This thesis describes the current state of the art in nonlinear time series analysis, bringing together approaches from a broad range of disciplines, including nonlinear dynamical systems, nonlinear modeling theory, time-series hypothesis testing, information theory, and self-similarity. We stress mathematical and qualitative relationships between key algorithms in the respective disciplines, in addition to describing new robust approaches to solving classically intractable problems. Part I presents a comprehensive review of various classical approaches to time series analysis from both deterministic and stochastic points of view. We focus on using these classical methods for quantification of complexity, in addition to proposing a unified approach to complexity quantification that encapsulates several previous approaches. Part II presents robust modern tools for time series analysis, including surrogate data and Volterra-Wiener modeling. We describe new algorithms combining the two approaches that provide both a sensitive test for nonlinear dynamics and a noise-robust metric for chaos intensity.
by Akhil Shashidhar. M.Eng.
Symbolization-based analysis of engineering time series
Data symbolization, derived from the study of symbolic dynamics, involves discretization of measurement data to aid in observing and characterizing temporal patterns. In this study, symbolization-based methods are developed for analysis of time series from experimental engineering systems to test hypotheses concerning stationarity, temporal reversibility, and synchronization. Stationarity is examined in the context of process control and dynamical state matching; temporal reversibility, in the context of model discrimination and selection of control schemes (linear versus nonlinear); and synchronization, in the context of modes of interactions between system components. Statistical significance is estimated using the method of surrogate data with Monte Carlo probabilities
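The kind of symbolization-based reversibility test described above can be sketched as follows. The increment-sign symbols, word length, sawtooth test signal, and shuffle surrogates are illustrative choices, not the study's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(5)

def inc_symbols(x):
    """Symbolize by the sign of the increments: 1 = up-tick, 0 = down-tick."""
    return (np.diff(x) > 0).astype(int)

def word_counts(s, word_len=3):
    """Histogram of overlapping binary symbol words of length `word_len`."""
    codes = sum(s[k : len(s) - word_len + 1 + k] << k for k in range(word_len))
    return np.bincount(codes, minlength=2**word_len)

def reversibility_stat(x):
    """L1 distance between the word histograms of the forward and the
    time-reversed series; near zero for a temporally reversible process."""
    return int(np.abs(word_counts(inc_symbols(x))
                      - word_counts(inc_symbols(x[::-1]))).sum())

# Noisy sawtooth (slow rise, sharp fall): a strongly time-irreversible signal.
t = np.arange(3000)
x = (t % 20) / 20.0 + 0.02 * rng.standard_normal(len(t))

t_data = reversibility_stat(x)
# Shuffle surrogates are exchangeable, hence reversible: the null hypothesis.
t_surr = [reversibility_stat(rng.permutation(x)) for _ in range(99)]
p_value = (1 + sum(ts >= t_data for ts in t_surr)) / 100
```

The Monte Carlo p-value mirrors the abstract's surrogate-data significance estimate; for reversibility testing in practice the surrogates would typically preserve linear correlations as well, not just the value distribution.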
- …