52 research outputs found

    LS2W: Implementing the Locally Stationary 2D Wavelet Process Approach in R

    Locally stationary process representations have recently been proposed and applied to both time series and image analysis applications. This article describes an implementation of the locally stationary two-dimensional wavelet process approach in R. This package permits construction of estimates of spatially localized spectra and localized autocovariance which can be used to characterize structure within images.
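    The sketch below is not the LS2W package itself, but a minimal Python illustration (NumPy only, and hypothetical function names) of the underlying idea: non-decimated 2D Haar detail coefficients of an image are computed and squared to give a raw, spatially localized measure of energy in each orientation. The single decomposition level and the toy image are illustrative choices, and the bias correction across scales applied by LS2W-type estimators is omitted.

```python
import numpy as np

def haar_level1_periodogram(image):
    """Raw level-1 local wavelet periodogram of a 2D image using
    non-decimated Haar filters (illustrative only; the bias correction
    across scales used by LS2W-type estimators is omitted)."""
    lo = np.array([1.0, 1.0]) / np.sqrt(2.0)   # Haar smoothing filter
    hi = np.array([1.0, -1.0]) / np.sqrt(2.0)  # Haar detail filter

    def filt_axis(img, filt, axis):
        # periodic (circular) convolution of the image along one axis
        out = np.zeros_like(img, dtype=float)
        for k, c in enumerate(filt):
            out += c * np.roll(img, -k, axis=axis)
        return out

    # separable 2D detail coefficients in three orientations
    # (orientation naming conventions vary between references)
    horiz = filt_axis(filt_axis(image, lo, axis=1), hi, axis=0)
    vert = filt_axis(filt_axis(image, lo, axis=0), hi, axis=1)
    diag = filt_axis(filt_axis(image, hi, axis=1), hi, axis=0)

    # squaring the coefficients gives a spatially localised energy estimate
    return {"horizontal": horiz ** 2, "vertical": vert ** 2, "diagonal": diag ** 2}

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    img = rng.normal(size=(64, 64))
    img[:, 32:] *= 3.0   # right half of the image has higher variance
    spec = haar_level1_periodogram(img)
    print(spec["diagonal"][:, :32].mean(), spec["diagonal"][:, 32:].mean())
```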

    Detection of changes in the characteristics of oceanographic time-series using changepoint analysis.

    Changepoint analysis is used to detect changes in variability within GOMOS hindcast time-series of significant wave heights for storm peak events across the Gulf of Mexico for the period 1900–2005. To detect a change in variance, a two-step procedure is used: (1) model assumptions are validated for each geographic location, and (2) a penalized likelihood changepoint algorithm is applied. Results suggest that the most important changes in time-series variance occur in 1916 and 1933, at small clusters of boundary locations at which the variance generally decreases. No post-war changepoints are detected. The changepoint procedure can be readily applied to other environmental time-series.
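    As a rough illustration of a penalized-likelihood test for a change in variance (not the GOMOS analysis itself), the Python sketch below scans a demeaned series for a single changepoint using a Gaussian likelihood-ratio statistic compared against a BIC-style penalty. The function name, the default penalty value and the simulated data are assumptions made for the example; the study applies a multiple-changepoint algorithm after validating model assumptions at each location.

```python
import numpy as np

def single_variance_changepoint(x, penalty=None, min_seg=5):
    """Scan for one change in variance in a (demeaned) series using a
    Gaussian likelihood-ratio statistic; returns (tau, statistic), or None
    if the best statistic does not exceed the penalty.  Illustrative only:
    the study uses a penalised-likelihood algorithm for multiple changes."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    if penalty is None:
        penalty = 2.0 * np.log(n)          # a BIC-style default (assumption)

    def neg2loglik(seg):
        # -2 log-likelihood of a zero-mean Gaussian segment, up to constants
        s2 = max(np.mean(seg ** 2), 1e-12)
        return len(seg) * np.log(s2)

    full = neg2loglik(x)
    best_tau, best_stat = None, -np.inf
    for tau in range(min_seg, n - min_seg):
        stat = full - (neg2loglik(x[:tau]) + neg2loglik(x[tau:]))
        if stat > best_stat:
            best_tau, best_stat = tau, stat

    return (best_tau, best_stat) if best_stat > penalty else None

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0, 2.0, 200), rng.normal(0, 1.0, 200)])
    print(single_variance_changepoint(x))   # variance reduction near t = 200
```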

    Computationally efficient changepoint detection for a range of penalties

    In the multiple changepoint setting, various search methods have been proposed which involve optimising either a constrained or penalised cost function over possible numbers and locations of changepoints using dynamic programming. Such methods are typically computationally intensive. Recent work in the penalised optimisation setting has focussed on developing a pruning-based approach which gives an improved computational cost that, under certain conditions, is linear in the number of data points. Such an approach naturally requires the specification of a penalty to avoid under/over-fitting. Work has been undertaken to identify the appropriate penalty choice for data generating processes with known distributional form, but in many applications the model assumed for the data is not correct and these penalty choices are not always appropriate. Consequently, it is desirable to have an approach that enables us to compare segmentations for different choices of penalty. To this end, we present a method to obtain optimal changepoint segmentations of data sequences for all penalty values across a continuous range. This permits an evaluation of the various segmentations to identify a suitably parsimonious penalty choice. The computational complexity of this approach can be linear in the number of data points and linear in the difference between the number of changepoints in the optimal segmentations for the smallest and largest penalty values. This can be orders of magnitude faster than alternative approaches that find optimal segmentations for a range of the number of changepoints.
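    The sketch below illustrates, under simplifying assumptions, the idea of recovering the optimal segmentations for a whole range of penalty values: an exact penalised-cost dynamic program (optimal partitioning with a Gaussian change-in-mean cost, without the pruning discussed above) is solved at the endpoints of a penalty interval, and the interval is bisected at the penalty where two segmentations have equal penalised cost. Function names and the quadratic-time solver are illustrative choices, not the authors' implementation.

```python
import numpy as np

def optimal_partitioning(x, beta):
    """Exact penalised-cost segmentation by dynamic programming (optimal
    partitioning) with a Gaussian change-in-mean cost.  Returns the
    changepoint locations and the unpenalised cost of that segmentation.
    Quadratic in n: pruning is deliberately left out to keep the sketch short."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s1 = np.concatenate(([0.0], np.cumsum(x)))
    s2 = np.concatenate(([0.0], np.cumsum(x ** 2)))

    def seg_cost(a, b):  # residual sum of squares of segment x[a:b]
        return (s2[b] - s2[a]) - (s1[b] - s1[a]) ** 2 / (b - a)

    F = np.zeros(n + 1)
    last = np.zeros(n + 1, dtype=int)
    for t in range(1, n + 1):
        cands = [F[a] + seg_cost(a, t) + beta for a in range(t)]
        last[t] = int(np.argmin(cands))
        F[t] = cands[last[t]]

    cps, t = [], n
    while t > 0:
        if last[t] > 0:
            cps.append(last[t])
        t = last[t]
    cps = sorted(cps)
    cost = sum(seg_cost(a, b) for a, b in zip([0] + cps, cps + [n]))
    return cps, cost

def segmentations_over_penalty_range(x, beta_min, beta_max):
    """Recover distinct optimal segmentations for penalties in
    [beta_min, beta_max] by repeatedly solving the penalised problem and
    bisecting at the penalty where two segmentations have equal penalised cost."""
    cache = {}

    def solve(beta):
        if beta not in cache:
            cache[beta] = optimal_partitioning(x, beta)
        return cache[beta]

    def explore(b_lo, b_hi):
        (cp_lo, c_lo), (cp_hi, c_hi) = solve(b_lo), solve(b_hi)
        if len(cp_lo) <= len(cp_hi) + 1:
            return                      # no intermediate segmentations
        b_mid = (c_hi - c_lo) / (len(cp_lo) - len(cp_hi))
        if not (b_lo < b_mid < b_hi):
            return
        if len(solve(b_mid)[0]) != len(cp_hi):
            explore(b_lo, b_mid)
            explore(b_mid, b_hi)

    explore(beta_min, beta_max)
    return {len(cp): cp for cp, _ in cache.values()}

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    x = np.concatenate([rng.normal(m, 1.0, 100) for m in (0, 3, 0, 3)])
    for k, cps in sorted(segmentations_over_penalty_range(x, 2.0, 400.0).items()):
        print(k, "changepoints at", cps)
```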

    A Log-Linear Non-Parametric Online Changepoint Detection Algorithm based on Functional Pruning

    Online changepoint detection aims to detect anomalies and changes in real time in high-frequency data streams, sometimes with limited available computational resources. This is an important task that is rooted in many real-world applications, including, but not limited to, cybersecurity, medicine and astrophysics. While fast and efficient online algorithms have been recently introduced, these rely on parametric assumptions which are often violated in practical applications. Motivated by data streams from the telecommunications sector, we build a flexible nonparametric approach to detect a change in the distribution of a sequence. Our procedure, NP-FOCuS, builds a sequential likelihood ratio test for a change at a set of points of the empirical cumulative distribution function of our data. This is achieved by keeping track of the number of observations above or below those points. Thanks to functional pruning ideas, NP-FOCuS has a computational cost that is log-linear in the number of observations and is suitable for high-frequency data streams. In terms of detection power, NP-FOCuS is seen to outperform current nonparametric online changepoint techniques in a variety of settings. We demonstrate the utility of the procedure on both simulated and real data.
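    To make the "track observations above or below a set of points" idea concrete, the sketch below runs a plain two-sided Bernoulli CUSUM on the indicators 1{X_t <= q} at a few quantile grid points estimated from a training period. This is not NP-FOCuS: the functional-pruning maximisation of the likelihood-ratio statistic over all candidate change times is omitted, and the class name, grid size, assumed post-change shift `delta` and threshold are all made up for the example.

```python
import numpy as np

class EcdfCusumDetector:
    """Illustrative online detector: monitor the indicators 1{X_t <= q} at a
    few fixed grid points q taken from a training sample, and run a two-sided
    Bernoulli CUSUM on each binary stream.  This captures the idea of counting
    observations above/below chosen points, but it is a plain CUSUM with an
    assumed post-change shift, not the NP-FOCuS likelihood-ratio test."""

    def __init__(self, training, n_grid=5, delta=0.2, threshold=8.0):
        training = np.asarray(training, dtype=float)
        self.grid = np.quantile(training, np.linspace(0.1, 0.9, n_grid))
        # pre-change probabilities P(X <= q), clipped away from 0 and 1
        self.p0 = np.clip(np.mean(training[:, None] <= self.grid, axis=0), 0.05, 0.95)
        self.delta, self.threshold = delta, threshold
        self.stats = np.zeros((2, n_grid))           # upward / downward CUSUMs

    @staticmethod
    def _llr(y, p0, p1):
        # log-likelihood ratio of a Bernoulli observation y under p1 vs p0
        return y * np.log(p1 / p0) + (1.0 - y) * np.log((1.0 - p1) / (1.0 - p0))

    def update(self, x):
        y = (x <= self.grid).astype(float)
        p_up = np.clip(self.p0 + self.delta, 0.01, 0.99)
        p_dn = np.clip(self.p0 - self.delta, 0.01, 0.99)
        self.stats[0] = np.maximum(0.0, self.stats[0] + self._llr(y, self.p0, p_up))
        self.stats[1] = np.maximum(0.0, self.stats[1] + self._llr(y, self.p0, p_dn))
        return self.stats.max() > self.threshold     # True => change flagged

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    detector = EcdfCusumDetector(rng.normal(size=500))
    stream = np.concatenate([rng.normal(size=300), rng.normal(loc=1.5, size=300)])
    for t, x in enumerate(stream):
        if detector.update(x):
            print("change flagged at t =", t)        # true change is at t = 300
            break
```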

    High-dimensional time series segmentation via factor-adjusted vector autoregressive modelling

    Vector autoregressive (VAR) models are popularly adopted for modelling high-dimensional time series, and their piecewise extensions allow for structural changes in the data. In VAR modelling, the number of parameters grows quadratically with the dimensionality, which necessitates the sparsity assumption in high dimensions. However, it is debatable whether such an assumption is adequate for handling datasets exhibiting strong serial and cross-sectional correlations. We propose a piecewise stationary time series model that simultaneously allows for strong correlations as well as structural changes, where pervasive serial and cross-sectional correlations are accounted for by a time-varying factor structure, and any remaining idiosyncratic dependence between the variables is handled by a piecewise stationary VAR model. We propose an accompanying two-stage data segmentation methodology which fully addresses the challenges arising from the latency of the component processes. Its consistency in estimating both the total number and the locations of the change points in the latent components is established under conditions considerably more general than those in the existing literature. We demonstrate the competitive performance of the proposed methodology on simulated datasets and an application to US blue chip stocks data.
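    A toy two-stage sketch of the general strategy (not the paper's methodology or its theory): common co-movement is removed by projecting out the leading principal components, and the remaining idiosyncratic panel is scanned for a single change in the VAR(1) transition structure by comparing residual sums of squares. The number of factors, the one-change scan and all function names are assumptions made for this illustration.

```python
import numpy as np

def factor_adjust(X, r):
    """Remove a rank-r common component from the T x p panel X by projecting
    out the leading r principal components of the sample covariance.  A crude
    stand-in for the time-varying factor structure described above."""
    Xc = X - X.mean(axis=0)
    _, eigvec = np.linalg.eigh(Xc.T @ Xc / len(Xc))
    loadings = eigvec[:, -r:]                    # top-r principal directions
    return Xc - Xc @ loadings @ loadings.T       # idiosyncratic part

def var1_change_scan(Z, min_seg=30):
    """Score each candidate split by how much fitting separate VAR(1) models
    on the two sides reduces the residual sum of squares relative to a single
    VAR(1) fit; a one-change scan, not the paper's full segmentation method."""
    def var1_rss(Y):
        A, *_ = np.linalg.lstsq(Y[:-1], Y[1:], rcond=None)   # Y_t ~ Y_{t-1} A
        return np.sum((Y[1:] - Y[:-1] @ A) ** 2)

    full = var1_rss(Z)
    scores = {t: full - (var1_rss(Z[:t]) + var1_rss(Z[t:]))
              for t in range(min_seg, len(Z) - min_seg)}
    return max(scores, key=scores.get), scores

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    T, p, r = 400, 10, 2
    common = rng.normal(size=(T, r)) @ rng.normal(size=(r, p))  # pervasive co-movement
    A1, A2 = 0.5 * np.eye(p), -0.5 * np.eye(p)                  # VAR(1) regime switch at T/2
    idio = np.zeros((T, p))
    for t in range(1, T):
        A = A1 if t < T // 2 else A2
        idio[t] = idio[t - 1] @ A + rng.normal(scale=0.5, size=p)
    Z = factor_adjust(common + idio, r)
    tau, _ = var1_change_scan(Z)
    print("estimated change location:", tau)   # expect a value near T // 2
```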
    • …