
    Non-Stationary Process Monitoring for Change-Point Detection With Known Accuracy: Application to Wheels Coating Inspection

    This paper addresses the problem of monitoring a non-stationary process online to detect abrupt changes in the process mean value. Two main challenges are addressed. First, the monitored process is non-stationary, i.e., it naturally changes over time, and it is necessary to distinguish those "regular" process changes from abrupt changes resulting from potential failures. Second, the method is intended for industrial processes, where the performance of the detection method must be accurately controlled. A novel sequential method, based on two fixed-length windows, is proposed to detect abrupt changes with guaranteed accuracy while dealing with a non-stationary process. The first window is used to estimate the non-stationary process parameters, whereas the second window is used to execute the detection. A study of the performance of the proposed method provides analytical expressions for the statistical properties of the test. These make it possible to bound the false alarm probability for a given number of observations while maximizing the detection power as a function of a given detection delay. The proposed method is then applied to wheel-coating monitoring using an imaging system. Numerical results on a large set of wheel images show the efficiency of the proposed approach and the sharpness of the theoretical study.
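
    As a rough illustration of the two-window idea, here is a minimal Python sketch assuming i.i.d. Gaussian noise around a locally polynomial trend; the function name, window lengths, and the polynomial trend model are stand-ins, not the paper's parametric model or exact test statistic:

```python
import numpy as np

def two_window_detect(x, w1=50, w2=20, deg=1, thresh=3.0):
    """Slide two adjacent fixed-length windows over the series: window 1
    estimates the slowly varying ("regular") trend, window 2 is tested
    for an abrupt deviation from that trend's extrapolation."""
    alarms = []
    for t in range(w1 + w2, len(x) + 1):
        train = x[t - w1 - w2 : t - w2]          # window 1: estimation
        test = x[t - w2 : t]                     # window 2: detection
        u = np.arange(w1)
        coef = np.polyfit(u, train, deg)         # local polynomial trend
        resid = train - np.polyval(coef, u)
        sigma = resid.std(ddof=deg + 1)          # noise scale estimate
        pred = np.polyval(coef, np.arange(w1, w1 + w2))
        # standardized mean deviation of window 2 from the extrapolation
        # (extrapolation uncertainty ignored for simplicity)
        stat = (test - pred).mean() * np.sqrt(w2) / sigma
        if abs(stat) > thresh:
            alarms.append(t - 1)                 # index where alarm fires
    return alarms
```

    Under the Gaussian noise assumption, the threshold maps directly to a per-test false-alarm probability, which is the kind of accuracy control the paper makes rigorous.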

    Modeling extreme values of processes observed at irregular time steps: Application to significant wave height

    This work is motivated by the analysis of the extremal behavior of buoy and satellite data describing wave conditions in the North Atlantic Ocean. The available data sets consist of time series of significant wave height (Hs) with irregular time sampling. In such a situation, the usual statistical methods for analyzing extreme values cannot be used directly. The method proposed in this paper is an extension of the peaks-over-threshold (POT) method, where the distribution of a process above a high threshold is approximated by a max-stable process whose parameters are estimated by maximizing a composite likelihood function. The efficiency of the proposed method is assessed on an extensive set of simulated data. It is shown, in particular, that the method is able to describe the extremal behavior of several common time series models with regular or irregular time sampling. The method is then used to analyze Hs data in the North Atlantic Ocean. The results indicate that it is possible to derive realistic estimates of the extremal properties of Hs from satellite data, despite its complex space-time sampling. Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org) at http://dx.doi.org/10.1214/13-AOAS711.
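
    For background, the classical POT building block that the paper generalizes can be sketched as follows; the max-stable approximation and the composite-likelihood estimation of the paper itself are not shown, and the function names and the 95% threshold choice are illustrative:

```python
import numpy as np
from scipy.stats import genpareto

def pot_fit(hs, q=0.95):
    """Classical POT step: fit a generalized Pareto distribution (GPD)
    to the excesses of significant wave height over a high threshold."""
    u = np.quantile(hs, q)                           # high threshold
    excess = hs[hs > u] - u
    shape, _, scale = genpareto.fit(excess, floc=0)  # location pinned at 0
    return u, shape, scale

def return_level(u, shape, scale, lam, years=50):
    """Level exceeded on average once per `years` years, given `lam`
    threshold exceedances per year (standard GPD return-level formula)."""
    m = years * lam
    if shape == 0:
        return u + scale * np.log(m)
    return u + scale / shape * (m**shape - 1)
```

    This i.i.d.-exceedance recipe is exactly what breaks down under irregular time sampling, which is the gap the paper's max-stable extension addresses.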

    Quick inference for log Gaussian Cox processes with non-stationary underlying random fields

    For point patterns observed in natura, spatial heterogeneity is more the rule than the exception. In numerous applications, this can be handled mathematically by the flexible class of log Gaussian Cox processes (LGCPs); in brief, an LGCP is a Cox process driven by an underlying log Gaussian random field (log GRF). This allows the representation of point aggregation, point vacuum, and intermediate situations, with more or less rapid transitions between these states depending on the properties of the GRF. Very often, the covariance function of the GRF is assumed to be stationary. In this article, we give two examples where the sizes (that is, the numbers of points) and the spatial extents of point clusters are allowed to vary in space. To tackle such features, we propose parametric and semiparametric models of non-stationary LGCPs where the non-stationarity enters both the mean function and the covariance function of the GRF. Thus, in contrast to most other work on inhomogeneous LGCPs, second-order intensity-reweighted stationarity is not satisfied, and the usual two-step procedure for parameter estimation based on, e.g., composite likelihood does not easily apply. Instead, we propose a fast three-step procedure based on composite likelihood. We apply our modelling and estimation framework to analyse datasets dealing with fish aggregation in a reservoir and with the dispersal of biological particles.
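
    For intuition about the model class rather than the paper's three-step estimation procedure, here is a minimal sketch that simulates an LGCP on a discretised unit square; the exponential covariance, the linear trend in the mean, and the grid size are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_lgcp(n=32, ell=0.1, sigma2=1.0):
    """Discretise the unit square into n*n cells, draw a log Gaussian
    random field with a spatially varying mean, and generate Poisson
    counts with intensity exp(field) in each cell."""
    xs = (np.arange(n) + 0.5) / n
    X, Y = np.meshgrid(xs, xs)
    pts = np.column_stack([X.ravel(), Y.ravel()])
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    C = sigma2 * np.exp(-d / ell)        # stationary exponential covariance
    mu = 3.0 + 2.0 * pts[:, 0]           # non-stationary mean: denser eastwards
    L = np.linalg.cholesky(C + 1e-8 * np.eye(n * n))
    Z = mu + L @ rng.standard_normal(n * n)   # the log GRF
    counts = rng.poisson(np.exp(Z) / n**2)    # cell area is 1/n^2
    return pts, counts
```

    A non-stationary covariance, as in the paper, would additionally let ell or sigma2 vary over the square, changing the spatial extent of clusters from place to place.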

    Pulsar timing analysis in the presence of correlated noise

    Pulsar timing observations are usually analysed with least-squares fitting procedures under the assumption that the timing residuals are uncorrelated (statistically "white"). Pulsar observers are well aware that this assumption often breaks down and causes severe errors in estimating the parameters of the timing model and their uncertainties. Ad hoc methods for minimizing these errors have been developed, but we show that they are far from optimal. If the covariance matrix of the residuals is known, compensation for temporal correlation can be done optimally using a linear transformation that whitens both the residuals and the timing model. We adopt a transformation based on the Cholesky decomposition of the covariance matrix, although the transformation is not unique. We show how to estimate the covariance matrix with sufficient accuracy to optimize the pulsar timing analysis. We also show how to apply this procedure to estimate the spectrum of any time series with a steep red power-law spectrum, including those with irregular sampling and variable error bars, which are otherwise very difficult to analyse. Accepted by MNRAS.
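
    The whitening step itself is standard generalized least squares; here is a minimal sketch, assuming the residual covariance matrix C has already been estimated (the function name is hypothetical):

```python
import numpy as np

def whitened_timing_fit(resid, M, C):
    """Whiten the residuals and the timing-model design matrix M with
    the Cholesky factor of the residual covariance C, then do ordinary
    least squares on the whitened quantities."""
    L = np.linalg.cholesky(C)            # C = L L^T
    rw = np.linalg.solve(L, resid)       # whitened residuals, covariance = I
    Mw = np.linalg.solve(L, M)           # whitened timing model
    beta, *_ = np.linalg.lstsq(Mw, rw, rcond=None)
    cov_beta = np.linalg.inv(Mw.T @ Mw)  # parameter covariance estimate
    return beta, cov_beta
```

    With a correct C, the whitened residuals are uncorrelated, so the ordinary least-squares machinery (and its uncertainty estimates) becomes valid again; the hard part the paper addresses is estimating C from red, irregularly sampled data.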

    Higher-Order Improvements of the Sieve Bootstrap for Fractionally Integrated Processes

    This paper investigates the accuracy of bootstrap-based inference in the case of long-memory fractionally integrated processes. The re-sampling method is based on the semi-parametric sieve approach, whereby the dynamics in the process used to produce the bootstrap draws are captured by an autoregressive approximation. Application of the sieve method to data pre-filtered by a semi-parametric estimate of the long-memory parameter is also explored. Higher-order improvements yielded by both forms of re-sampling are demonstrated using Edgeworth expansions for a broad class of statistics that includes first- and second-order moments, the discrete Fourier transform, and regression coefficients. The methods are then applied to the problem of estimating the sampling distributions of the sample mean and of selected sample autocorrelation coefficients in experimental settings. In the case of the sample mean, the pre-filtered version of the bootstrap is shown to avoid the distinct underestimation of the sampling variance of the mean that the raw sieve method exhibits in finite samples, notwithstanding the higher-order accuracy of the latter. Pre-filtering also produces gains in the accuracy with which the sampling distributions of the sample autocorrelations are reproduced, most notably in the part of the parameter space in which asymptotic normality does not obtain. Most importantly, the sieve bootstrap is shown to reproduce the (empirically infeasible) Edgeworth expansion of the sampling distribution of the autocorrelation coefficients in the part of the parameter space in which the expansion is valid.
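
    A bare-bones sketch of the raw sieve step (without the semi-parametric pre-filtering), assuming a least-squares AR fit and a crude rule-of-thumb order; the function name and defaults are illustrative:

```python
import numpy as np

def sieve_bootstrap(x, p=None, n_boot=999, rng=None):
    """Sieve bootstrap: fit an AR(p) approximation to the series, then
    resample its centred residuals to rebuild bootstrap replicates."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    if p is None:
        p = int(np.floor(10 * np.log10(n)))   # crude order rule of thumb
    xc = x - x.mean()
    # regress xc[t] on xc[t-1], ..., xc[t-p] by least squares
    Xlag = np.column_stack([xc[p - k - 1 : n - k - 1] for k in range(p)])
    phi, *_ = np.linalg.lstsq(Xlag, xc[p:], rcond=None)
    resid = xc[p:] - Xlag @ phi
    resid = resid - resid.mean()              # centre the residuals
    draws = np.empty((n_boot, n))
    for b in range(n_boot):
        e = rng.choice(resid, size=n + p)     # i.i.d. resampling
        y = np.zeros(n + p)                   # p-step burn-in from zeros
        for t in range(p, n + p):
            y[t] = phi @ y[t - p : t][::-1] + e[t]
        draws[b] = y[p:] + x.mean()
    return draws
```

    The pre-filtered variant studied in the paper would first fractionally difference x using a semi-parametric estimate of the long-memory parameter, apply this sieve step to the filtered series, and re-integrate the bootstrap draws.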