    The Variance Profile

    The variance profile is defined as the power mean of the spectral density function of a stationary stochastic process. It is a continuous and non-decreasing function of the power parameter, p, which returns the minimum of the spectrum (p → −∞), the interpolation error variance (harmonic mean, p = −1), the prediction error variance (geometric mean, p = 0), the unconditional variance (arithmetic mean, p = 1) and the maximum of the spectrum (p → ∞). The variance profile provides a useful characterisation of a stochastic process; we focus in particular on the class of fractionally integrated processes. Moreover, it enables a direct and immediate derivation of the Szegő-Kolmogorov formula and the interpolation error variance formula. The paper proposes a non-parametric estimator of the variance profile based on the power mean of the smoothed sample spectrum, and proves its consistency and its asymptotic normality. From the empirical standpoint, we propose and illustrate the use of the variance profile for estimating the long memory parameter in climatological and financial time series and for assessing structural change.
    Keywords: Predictability; Interpolation; Non-parametric spectral estimation; Long memory.
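    A minimal numerical sketch of the estimator described above: the power mean of a smoothed periodogram. The moving-average bandwidth and the flat smoothing kernel are illustrative choices, not the paper's; the power-mean inequality guarantees the resulting profile is non-decreasing in p.

```python
import numpy as np

def variance_profile(x, p_values, bandwidth=5):
    """Power mean of a smoothed sample spectrum at each power p.

    The moving-average smoother is a stand-in for the paper's
    nonparametric spectral estimator."""
    x = np.asarray(x, float)
    n = len(x)
    x = x - x.mean()
    # Periodogram at the positive Fourier frequencies.
    pgram = np.abs(np.fft.rfft(x)[1:]) ** 2 / n
    # Smooth the periodogram with a simple moving average.
    kernel = np.ones(2 * bandwidth + 1) / (2 * bandwidth + 1)
    f_hat = np.maximum(np.convolve(pgram, kernel, mode="same"), 1e-12)
    profile = []
    for p in p_values:
        if p == 0:
            # Geometric mean: the limit of the power mean as p -> 0.
            profile.append(np.exp(np.mean(np.log(f_hat))))
        else:
            profile.append(np.mean(f_hat ** p) ** (1.0 / p))
    return np.array(profile)
```

    Evaluating at p = −1, 0, 1 then estimates the interpolation error variance, the prediction error variance and the unconditional variance respectively.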

    Automatic positive semidefinite HAC covariance matrix and GMM estimation

    This paper proposes a new class of heteroskedasticity and autocorrelation consistent (HAC) covariance matrix estimators. The standard HAC estimation method reweights estimators of the autocovariances. Here we initially smooth the data observations themselves using kernel-function-based weights. The resultant HAC covariance matrix estimator is the normalized outer product of the smoothed random vectors and is therefore automatically positive semidefinite. A corresponding efficient GMM criterion may also be defined as a quadratic form in the smoothed moment indicators, whose normalized minimand provides a test statistic for the overidentifying moment conditions.
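    The construction can be sketched as follows: smooth each moment series over time, then take the normalized outer product, which is positive semidefinite by construction. The flat (truncated uniform) kernel and its normalisation here are illustrative choices; the paper's estimator admits a general kernel.

```python
import numpy as np

def smoothed_hac(g, bandwidth):
    """HAC covariance estimate from kernel-smoothed moment indicators.

    g: (n, m) array, rows = observations, cols = moment indicators.
    A flat kernel of half-width `bandwidth` is used for illustration."""
    g = np.asarray(g, float)
    n, m = g.shape
    weights = np.ones(2 * bandwidth + 1)
    # Smooth each moment series over time with the kernel weights.
    g_s = np.empty_like(g)
    for j in range(m):
        g_s[:, j] = np.convolve(g[:, j], weights, mode="same")
    # Outer product of the smoothed vectors: PSD by construction;
    # sum(weights**2) normalises the kernel.
    return (g_s.T @ g_s) / (n * np.sum(weights ** 2))
```

    Because the estimate is a Gram matrix of smoothed vectors, no eigenvalue adjustment is ever needed, unlike some reweighted-autocovariance HAC estimators.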

    A Statistical Study of Wavelet Coherence for Stationary and Nonstationary Processes

    The coherence function measures the correlation between a pair of random processes in the frequency domain. It is a well studied and understood concept, and the distributional properties of conventional coherence estimators for stationary processes have been derived and applied in a number of physical settings. In recent years the wavelet coherence measure has been used to analyse correlations between a pair of processes in the time-scale domain, typically in hypothesis testing scenarios, but it has proven resistant to analytic study, with researchers resorting to simulations for its statistical properties. As part of the null hypothesis being tested, such simulations invariably assume joint stationarity of the series. In this thesis two methods of calculating wavelet coherence have been developed and distributional properties of the wavelet coherence estimators have been fully derived. With the first method, in an analogous framework to multitapering, wavelet coherence is estimated using multiple orthogonal Morse wavelets. The second coherence estimator proposed uses time-domain smoothing and a single Morlet wavelet. Since both sets of wavelets are complex-valued, we consider the case of wavelet coherence calculated from discrete-time complex-valued and stationary time series. Under Gaussianity, the Goodman distribution is shown, for large samples, to be appropriate for wavelet coherence. The true wavelet coherence value is identified in terms of its frequency domain equivalent and degrees of freedom can be readily derived. The theoretical results are verified via simulations.
    The notion of a spectral function is considered for the nonstationary case. Particular focus is given to Priestley's evolutionary process and a Wold-Cramér nonstationary representation where time-varying spectral functions can be clearly defined. Methods of estimating these spectra are discussed, including the continuous wavelet transform, which when performed with a Morlet wavelet and temporal smoothing is shown to bear close resemblance to Priestley's own estimation procedure. The concept of coherence for bivariate evolutionary nonstationary processes is discussed in detail. In such situations it can be shown that the coherence function, as in the stationary case, is invariant in time. It is shown that for spectra that vary slowly in time the derived statistics of the temporally smoothed wavelet coherence estimator are appropriate. Further to this, the similarities with Priestley's spectral estimator are exploited to derive distributional properties of the corresponding Priestley coherence estimator. A well-known class of the evolutionary and Wold-Cramér nonstationary processes are the modulated stationary processes. Using these it is shown that bivariate processes can be constructed that exhibit coherence variation with time, frequency, and time-and-frequency. The temporally smoothed Morlet wavelet coherence estimator is applied to these processes. It is shown that accurate coherence estimates can be achieved for each type of coherence, and that the distributional properties derived under stationarity are applicable.
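    The second estimator's idea, at a single scale, can be sketched as follows: wavelet-transform both series with a Morlet wavelet, smooth the auto- and cross-wavelet quantities over time, and form the magnitude-squared coherence. The wavelet length, the centre frequency omega0 = 6 and the boxcar smoother are illustrative choices, not the thesis's exact estimator.

```python
import numpy as np

def morlet(scale, omega0=6.0):
    """Complex Morlet wavelet sampled on an integer grid (sketch)."""
    length = int(10 * scale) | 1          # odd length, ~10 scales wide
    u = (np.arange(length) - length // 2) / scale
    return np.exp(1j * omega0 * u) * np.exp(-0.5 * u ** 2) / np.sqrt(scale)

def wavelet_coherence(x, y, scale, smooth=10):
    """Temporally smoothed Morlet wavelet coherence at one scale."""
    w = morlet(scale)
    wx = np.convolve(x, w, mode="same")    # wavelet transforms
    wy = np.convolve(y, w, mode="same")
    k = np.ones(2 * smooth + 1) / (2 * smooth + 1)   # boxcar smoother
    sxx = np.convolve(np.abs(wx) ** 2, k, mode="same")
    syy = np.convolve(np.abs(wy) ** 2, k, mode="same")
    sxy = np.convolve(wx * np.conj(wy), k, mode="same")
    # Magnitude-squared coherence; Cauchy-Schwarz keeps it in [0, 1].
    return np.abs(sxy) ** 2 / (sxx * syy + 1e-12)
```

    Without the temporal smoothing the estimate is identically one, which is why a smoothing (or multitaper/multiwavelet) step is essential to any coherence estimator.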

    Comparing estimation techniques for temporal scaling in palaeoclimate time series

    Characterizing the variability across timescales is important for understanding the underlying dynamics of the Earth system. It remains challenging to do so from palaeoclimate archives, since they are more often than not irregularly sampled, and traditional methods for producing timescale-dependent estimates of variability, such as the classical periodogram and the multitaper spectrum, generally require regular time sampling. We have compared those traditional methods, applied after interpolation, with interpolation-free methods, namely the Lomb–Scargle periodogram and the first-order Haar structure function. The ability of those methods to produce timescale-dependent estimates of variability when applied to irregular data was evaluated in a comparative framework, using surrogate palaeo-proxy data generated with realistic sampling. The metric we chose for the comparison is the scaling exponent, i.e. the linear slope in log-transformed coordinates, since it summarizes the behaviour of the variability across timescales. We found that, for scaling estimates in irregular time series, the interpolation-free methods are to be preferred over the methods requiring interpolation, as they allow for the utilization of the information from shorter timescales, which are particularly affected by the irregularity. In addition, our results suggest that the Haar structure function is the safer choice of interpolation-free method, since the Lomb–Scargle periodogram is unreliable when the underlying process generating the time series is not stationary. Given that we cannot know a priori what kind of scaling behaviour is contained in a palaeoclimate time series, and that it is also possible that this changes as a function of timescale, it is a desirable characteristic for the method to handle both stationary and non-stationary cases alike.
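    A sketch of the first-order Haar structure function on an irregular time axis, and of the scaling exponent as the log-log slope. The half-overlapping window placement is an illustrative choice; white noise should yield a slope near −1/2.

```python
import numpy as np

def haar_fluctuation(t, x, tau):
    """First-order Haar structure function S1(tau): the mean absolute
    difference between the averages of the two halves of a window of
    width tau, over half-overlapping windows. Works on irregular t."""
    t = np.asarray(t, float)
    x = np.asarray(x, float)
    fluct = []
    start = t[0]
    while start + tau <= t[-1]:
        left = (t >= start) & (t < start + tau / 2)
        right = (t >= start + tau / 2) & (t < start + tau)
        if left.any() and right.any():
            fluct.append(abs(x[right].mean() - x[left].mean()))
        start += tau / 2
    return np.mean(fluct) if fluct else np.nan

def scaling_exponent(t, x, taus):
    """Slope of log S1(tau) versus log tau, i.e. the scaling exponent."""
    s1 = np.array([haar_fluctuation(t, x, tau) for tau in taus])
    good = np.isfinite(s1) & (s1 > 0)
    slope, _ = np.polyfit(np.log(np.asarray(taus)[good]), np.log(s1[good]), 1)
    return slope
```

    Note that no interpolation step appears anywhere: irregular sample times enter only through the window masks, which is what makes the method robust to realistic proxy sampling.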

    Time series exponential models: theory and methods

    The exponential model of Bloomfield (1973) is becoming increasingly important due to its recent applications to long memory time series. However, this model has received little consideration in the context of short memory time series. Furthermore, little attempt has been made to use the EXP model to analyze observed time series data. This dissertation research is largely focused on developing new methods to improve the utility and robustness of the EXP model. Specifically, a new nonparametric method of parameter estimation is developed using wavelets. The advantage of this method is that, for many spectra, the resulting parameter estimates are less susceptible to biases associated with methods of parameter estimation based directly on the raw periodogram. Additionally, several methods are developed for the validation of spectral models. These methods test the hypothesis that the estimated model provides a whitening transformation of the spectrum; this is equivalent to the time domain notion of producing a model whose residuals behave like the residuals of white noise. The results of simulation and real data analysis are presented to illustrate these methods.
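    For reference, the classical route to fitting Bloomfield's EXP(p) model, which parametrizes the log spectrum as a finite cosine series, is a log-periodogram regression; the wavelet-based method the dissertation develops would replace the raw-periodogram step. The sketch below is the classical approach only, with the Euler-Mascheroni correction for the mean of the log chi-squared noise.

```python
import numpy as np

def fit_exp_model(x, p):
    """Fit Bloomfield's EXP(p) model, log f(lam) ~ sum_j theta_j cos(j*lam),
    by least-squares regression of the bias-corrected log periodogram."""
    x = np.asarray(x, float)
    n = len(x)
    x = x - x.mean()
    freqs = 2 * np.pi * np.arange(1, n // 2) / n
    pgram = np.abs(np.fft.rfft(x)[1:n // 2]) ** 2 / n
    # E[log(chi2_2 / 2)] = -gamma, so add Euler's gamma to debias.
    y = np.log(pgram) + np.euler_gamma
    X = np.column_stack([np.cos(j * freqs) for j in range(p + 1)])
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta
```

    For Gaussian white noise all coefficients should be near zero, since the log spectrum is flat; departures in the fitted theta indicate spectral structure the EXP(p) model is capturing.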