129 research outputs found

    Long memory estimation for complex-valued time series

    Long memory has been observed in time series across a multitude of fields, and the accurate estimation of such dependence, e.g. via the Hurst exponent, is crucial for the modelling and prediction of many dynamic systems of interest. Many physical processes (such as wind data) are more naturally expressed as complex-valued time series that represent magnitude and phase information (wind speed and direction). With data collection ubiquitously unreliable, irregular sampling or missingness is also commonplace and can cause bias in a range of analysis tasks, including Hurst estimation. This article proposes a new Hurst exponent estimation technique for complex-valued persistent data sampled with potential irregularity. Our approach is justified by establishing attractive theoretical properties of a new complex-valued wavelet lifting transform, also introduced in this paper. We demonstrate the accuracy of the proposed estimation method through simulations across a range of sampling scenarios and complex- and real-valued persistent processes. For wind data, our method highlights that including the intrinsic correlations between the real and imaginary data, inherent in our complex-valued approach, can produce different persistence estimates than real-valued analysis. Such analysis could then support alternative modelling or policy decisions compared with conclusions based on real-valued estimation.
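    As a concrete illustration of wavelet-based Hurst estimation, the sketch below implements the classical log-variance regression on detail coefficients for a regularly sampled, real-valued series. It is a minimal stand-in, not the paper's complex-valued lifting transform, and assumes `numpy` and `pywt` are available.

```python
import numpy as np
import pywt

def hurst_wavelet(x, wavelet="db2", level=None):
    """Hurst exponent from the slope of log2 detail-coefficient variance
    across dyadic scales: for fractional Gaussian noise,
    Var(d_j) ~ 2^(j * (2H - 1)), with j = 1 the finest scale."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    details = coeffs[1:][::-1]                 # order from finest scale to coarsest
    js, logvars = [], []
    for j, d in enumerate(details, start=1):
        if len(d) > 4:                         # skip levels with too few coefficients
            js.append(j)
            logvars.append(np.log2(np.var(d)))
    slope = np.polyfit(js, logvars, 1)[0]
    return (slope + 1.0) / 2.0                 # invert slope = 2H - 1
```

    For white noise (`np.random.randn(4096)`) the estimate should hover near H = 0.5; persistent series give H > 0.5.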

    Applications of compressed sensing in computational physics

    Conventional sampling theory is dictated by Shannon's celebrated sampling theorem: for a signal to be reconstructed from samples, it must be sampled at no less than twice the maximum frequency present in the signal. This principle is key to all modern signal acquisition, from consumer electronics to medical imaging devices. Recently, a new theory of signal acquisition has emerged in the form of Compressed Sensing, which allows complete conservation of the information in a signal using far fewer samples than Shannon's theorem dictates. This is achieved by noting that signals carrying information are usually structured, allowing them to be represented with very few coefficients in the proper basis, a property called sparsity. In this thesis, we survey the existing theory of compressed sensing, with details on performance guarantees in terms of the Restricted Isometry Property. We then survey state-of-the-art applications of the theory, including improved MRI using Total Variation sparsity and restoration of seismic data using curvelet and wave atom sparsity. We apply Compressed Sensing to the problem of estimating statistical properties of a signal with CS methods, attempting to measure the Hurst exponent of rough surfaces from partial measurements. We suggest an improvement on previous results in seismic data restoration by applying a learned dictionary of signal patches for restoration.
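    A minimal sketch of CS recovery, assuming the basis-pursuit-denoising formulation with a plain identity sparsity basis (far simpler than the curvelet/wave-atom dictionaries in the thesis): iterative soft thresholding (ISTA) in pure `numpy`.

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=500):
    """Iterative soft thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1,
    the basis-pursuit-denoising form of compressed-sensing recovery."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L          # gradient step on the quadratic term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

# Toy demo: recover an 8-sparse length-256 signal from 80 random projections.
rng = np.random.default_rng(0)
n, m, k = 256, 80, 8
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sensing matrix
x_hat = ista(A, A @ x_true)
```

    With m well above the sparsity level, the recovered `x_hat` matches the support and amplitudes of `x_true` up to the shrinkage bias of the L1 penalty.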

    The Structure of Climate Variability Across Scales

    One of the most intriguing facets of the climate system is that it exhibits variability across all temporal and spatial scales; pronounced examples are temperature and precipitation. The structure of this variability, however, is not arbitrary: over certain spatial and temporal ranges it can be described by scaling relationships in the form of power laws in probability density distributions and autocorrelation functions. These scaling relationships are quantified by scaling exponents, which measure how the variability changes across scales and how the intensity changes with frequency of occurrence. Scaling determines the relative magnitudes and persistence of natural climate fluctuations. Here, we review various scaling mechanisms and their relevance for the climate system. We show observational evidence of scaling and discuss the application of scaling properties and methods in trend detection, climate sensitivity analyses, and climate prediction.
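    One common way to quantify such a scaling exponent is a log-log fit to the periodogram, assuming S(f) ∝ f^(−β). The sketch below is a bare-bones version in `numpy`; in practice, log-binning or wavelet estimators are preferred to reduce the bias of the raw periodogram.

```python
import numpy as np

def spectral_exponent(x, dt=1.0):
    """Scaling exponent beta from a least-squares fit to the log-log
    periodogram, assuming S(f) ~ f^(-beta); the DC bin is excluded."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    f = np.fft.rfftfreq(len(x), d=dt)[1:]      # positive frequencies only
    S = np.abs(np.fft.rfft(x))[1:] ** 2        # raw periodogram
    beta = -np.polyfit(np.log(f), np.log(S), 1)[0]
    return beta
```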

    Volatility and correlation: Modeling and forecasting using Support Vector Machines

    Several realized volatility and correlation estimators have been introduced. The estimators, which are defined on high-frequency data, converge to the true quantities faster than their counterparts, even under market microstructure noise. A strategy for multivariate volatility estimation has also been introduced: by combining Support Vector Machines with wavelet-based multiresolution analysis, it achieves higher estimation performance than single estimation.
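    The basic high-frequency building blocks referred to here, realized variance and realized correlation, reduce to sums of squared and cross-multiplied intraday log returns. A minimal `numpy` sketch follows; it applies no microstructure-noise correction, which is precisely what the refined estimators in the abstract address.

```python
import numpy as np

def realized_variance(prices):
    """Realized variance for one day: sum of squared intraday log returns."""
    r = np.diff(np.log(prices))
    return np.sum(r ** 2)

def realized_correlation(p1, p2):
    """Realized correlation from synchronously sampled intraday prices."""
    r1, r2 = np.diff(np.log(p1)), np.diff(np.log(p2))
    rcov = np.sum(r1 * r2)
    return rcov / np.sqrt(np.sum(r1 ** 2) * np.sum(r2 ** 2))
```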

    The Impact of Covid-19 on Oil Market Returns: Has Market Efficiency Been Violated?

    This study examines the effect of the COVID-19 pandemic on the efficiency of oil markets from 2nd February 2020 to 4th August 2021. By relying on dynamic conditional correlation GARCH and wavelet coherence techniques, we are able to provide correlations between the variables across time and frequency domains. Our empirical findings point to significant yet weak correlations between oil returns and COVID-19 recovery/death rates for the period extending from early February to early May, even though we observe strong correlations between WTI prices and COVID-19 health statistics in mid-April. Moreover, during this identified period, the length of frequency cycles within the correlations decreases from 16 days to 8 days. Altogether, these findings imply that oil markets were inefficient between February and early May and have since turned efficient for the remaining duration of the pandemic.
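    Wavelet coherence and DCC-GARCH both require dedicated estimation routines; as a crude, hedged stand-in for the time-varying correlations they produce, a rolling-window correlation in `pandas` conveys the idea of tracking how co-movement evolves over the sample.

```python
import pandas as pd

def rolling_corr(returns_a, returns_b, window=30):
    """Time-varying correlation via a rolling window; a crude stand-in
    for DCC-GARCH conditional correlations across the pandemic sample."""
    a, b = pd.Series(returns_a), pd.Series(returns_b)
    return a.rolling(window).corr(b)
```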

    Wavelet Based Feature Extraction and Dimension Reduction for the Classification of Human Cardiac Electrogram Depolarization Waveforms

    An essential task for a pacemaker or implantable defibrillator is the accurate identification of rhythm categories so that the correct electrotherapy can be administered. Because some rhythms cause a rapid, dangerous drop in cardiac output, it is necessary to categorize depolarization waveforms on a beat-to-beat basis to accomplish rhythm classification as rapidly as possible. In this thesis, a depolarization waveform classifier based on the Lifting Line Wavelet Transform is described. It overcomes problems in existing rate-based event classifiers; namely, (1) they are insensitive to the conduction path of the heart rhythm and (2) they are not robust to pseudo-events. The performance of the Lifting Line Wavelet Transform based classifier is illustrated with representative examples. Although rate-based methods of event categorization have served well in implanted devices, these methods suffer in sensitivity and specificity when atrial and ventricular rates are similar. Human experts differentiate rhythms by morphological features of strip chart electrocardiograms. The wavelet transform is a simple approximation of this human expert analysis function because it correlates distinct morphological features at multiple scales. The accuracy of implanted rhythm determination can then be improved by using human-appreciable time domain features enhanced by time scale decomposition of depolarization waveforms. The purpose of the present work was to determine the feasibility of implementing such a system on a limited-resolution platform. Seventy-eight patient recordings were split into equal segments of reference, confirmation, and evaluation sets. Each recording had a sampling rate of 512 Hz and contained a significant change in rhythm. The wavelet feature generator implemented in Matlab performs anti-alias pre-filtering, quantization, and threshold-based event detection to produce indications of events to submit to wavelet transformation. The receiver operating characteristic curve was used to rank the discriminating power of each feature, accomplishing dimension reduction. Accuracy was used to confirm the feature choice. Evaluation accuracy was greater than or equal to 95% over the IEGM recordings.
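    A hedged sketch of the pipeline's two key steps, wavelet feature extraction and ROC-based feature ranking, using an ordinary Haar DWT (`pywt`) in place of the Lifting Line Wavelet Transform and `scikit-learn` for the AUC; the array names and shapes are illustrative only.

```python
import numpy as np
import pywt
from sklearn.metrics import roc_auc_score

def wavelet_features(beat, wavelet="haar", level=4):
    """Energy of the detail coefficients at each DWT level: a compact
    morphological descriptor of one depolarization waveform."""
    coeffs = pywt.wavedec(beat, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs[1:]])  # skip approximation band

def rank_features(X, y):
    """Order feature columns by ROC AUC against the binary rhythm label;
    distance from 0.5 (no discrimination) ranks discriminating power."""
    aucs = [abs(roc_auc_score(y, X[:, j]) - 0.5) for j in range(X.shape[1])]
    return np.argsort(aucs)[::-1]
```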

    Quantitative methods in high-frequency financial econometrics: modeling univariate and multivariate time series


    Bayesian inference for indirectly observed stochastic processes, applications to epidemic modelling

    Stochastic processes are mathematical objects that offer a probabilistic representation of how some quantities evolve in time. In this thesis we focus on estimating the trajectory and parameters of dynamical systems in cases where only indirect observations of the driving stochastic process are available. We first explored means to use weekly recorded numbers of cases of Influenza to capture how the frequency and nature of contacts made with infected individuals evolved in time. The latter was modelled with diffusions and can be used to quantify the impact of varying drivers of epidemics such as holidays, climate, or prevention interventions. Following this idea, we estimated how the frequency of condom use evolved during the intervention of the Gates Foundation against HIV in India. In this setting, the available estimates of the proportion of individuals infected with HIV were not only indirect but also very scarce observations, leading to specific difficulties. Finally, we developed a methodology for fractional Brownian motions (fBM), here a fractional stochastic volatility model, indirectly observed through market prices. The intractability of the likelihood function, requiring augmentation of the parameter space with the diffusion path, is ubiquitous in this thesis. We aimed for inference methods robust to refinements in time discretisations, made necessary to enforce accuracy of Euler schemes. The particle Marginal Metropolis Hastings (PMMH) algorithm exhibits this mesh-free property. We propose the use of fast approximate filters as a pre-exploration tool to estimate the shape of the target density, for a quicker and more robust adaptation phase of the asymptotically exact algorithm. The fBM problem could not be treated with the PMMH, which required an alternative methodology based on reparameterisation and advanced Hamiltonian Monte Carlo techniques on the diffusion pathspace, that would also be applicable in the Markovian setting.
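    The PMMH ingredient mentioned here is an unbiased particle-filter estimate of the marginal likelihood. The sketch below implements a bootstrap filter for a toy stochastic-volatility model (not the thesis's fBM-driven model), assuming only `numpy`.

```python
import numpy as np

def bootstrap_loglik(y, phi, sigma, n_particles=500, rng=None):
    """Bootstrap particle filter for a toy stochastic-volatility model
        h_t = phi * h_{t-1} + sigma * eta_t,   y_t ~ N(0, exp(h_t)).
    Returns an unbiased estimate of the log marginal likelihood, the
    quantity PMMH plugs into its Metropolis-Hastings acceptance ratio."""
    rng = rng or np.random.default_rng()
    h = rng.normal(0.0, sigma / np.sqrt(1.0 - phi ** 2), n_particles)  # stationary init
    loglik = 0.0
    for yt in y:
        h = phi * h + sigma * rng.standard_normal(n_particles)         # propagate
        logw = -0.5 * (np.log(2 * np.pi) + h + yt ** 2 * np.exp(-h))   # N(0, e^h) logpdf
        m = logw.max()
        w = np.exp(logw - m)                                           # stabilised weights
        loglik += m + np.log(w.mean())                                 # log average weight
        h = h[rng.choice(n_particles, n_particles, p=w / w.sum())]     # multinomial resample
    return loglik
```

    A PMMH chain would propose new (phi, sigma), call `bootstrap_loglik`, and accept or reject on the ratio of the estimated likelihoods times the prior ratio.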

    Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain

    The present paper explores the technical efficiency of four hotels from the Teixeira Duarte Group - a renowned Portuguese hotel chain. An efficiency ranking is established for these four hotel units located in Portugal using Stochastic Frontier Analysis. This methodology makes it possible to discriminate between measurement error and systematic inefficiencies in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions concerning efficiency improvement are made for each hotel studied.
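    For illustration, a cross-sectional normal/half-normal stochastic frontier (the classic Aigner-Lovell-Schmidt specification) can be fit by maximum likelihood with `scipy`; this is a generic sketch, not the specification used in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def sfa_half_normal(y, X):
    """MLE for the production frontier y = X @ b + v - u, with noise
    v ~ N(0, s_v^2) and inefficiency u >= 0 half-normal with scale s_u."""
    n, k = X.shape

    def neg_loglik(theta):
        b, s_v, s_u = theta[:k], np.exp(theta[k]), np.exp(theta[k + 1])
        sigma = np.hypot(s_v, s_u)          # sqrt(s_v^2 + s_u^2)
        lam = s_u / s_v                     # signal-to-noise ratio
        eps = y - X @ b                     # composed error v - u
        ll = (np.log(2) - np.log(sigma)
              + norm.logpdf(eps / sigma)
              + norm.logcdf(-eps * lam / sigma))
        return -np.sum(ll)

    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting values
    theta0 = np.concatenate([beta0, [0.0, 0.0]])  # log-scales start at 0
    res = minimize(neg_loglik, theta0, method="BFGS")
    return res.x[:k], np.exp(res.x[k]), np.exp(res.x[k + 1])
```

    The estimated lambda = s_u / s_v is what separates systematic inefficiency from measurement error, which is the discrimination the abstract refers to.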