A modified approach for obtaining sieve bootstrap prediction intervals for time series

Abstract

The traditional Box-Jenkins approach to obtaining prediction intervals for stationary time series assumes that the underlying distribution of the innovations is Gaussian. It is well known that deviations from this assumption can lead to prediction intervals with poor coverage. Nonparametric bootstrap-based procedures for obtaining prediction intervals overcome this handicap, but many early versions of such intervals for autoregressive moving average (ARMA) processes assume that the autoregressive and moving average orders, p and q respectively, are known. The sieve bootstrap, first introduced by Bühlmann in 1997, sidesteps this assumption for invertible time series by approximating the ARMA process with a finite autoregressive model whose order is estimated using a model selection procedure such as the AICC. Existing sieve bootstrap methods, however, generally produce liberal prediction intervals due to several factors, including the use of residuals that underestimate the actual variance of the innovations and the failure of the methods to capture variation due to sampling error in some parameter estimates. In this dissertation, a modified sieve bootstrap approach that corrects these deficiencies is implemented to obtain prediction intervals for both univariate and multivariate time series. Monte Carlo simulation results show that the modifications provide prediction intervals that achieve nominal or near-nominal coverage probabilities. Asymptotic results for the univariate series also establish the validity of the modified approach. --Abstract, page iii
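To make the procedure the abstract refers to concrete, the following is a minimal Python sketch of the *basic* sieve bootstrap prediction interval: select an AR order by AICC, fit the autoregression, resample the centred residuals, and take empirical quantiles of the simulated future values. The function and parameter names (fit_ar, select_order, sieve_bootstrap_pi, max_p, B) are illustrative assumptions, and the sketch deliberately omits the dissertation's corrections for residual variance underestimation and parameter-estimation variability.

```python
import numpy as np

def fit_ar(x, p):
    """OLS fit of a mean-centred AR(p) model; returns coefficients and residuals."""
    n = len(x)
    X = np.column_stack([x[p - k - 1 : n - k - 1] for k in range(p)])  # lags 1..p
    y = x[p:]
    phi, *_ = np.linalg.lstsq(X, y, rcond=None)
    return phi, y - X @ phi

def select_order(x, max_p):
    """Choose the AR order by the corrected AIC (AICC)."""
    n, best_p, best_aicc = len(x), 1, np.inf
    for p in range(1, max_p + 1):
        _, resid = fit_ar(x, p)
        sigma2 = np.mean(resid ** 2)
        aicc = n * np.log(sigma2) + 2 * (p + 1) * n / (n - p - 2)
        if aicc < best_aicc:
            best_p, best_aicc = p, aicc
    return best_p

def sieve_bootstrap_pi(x, h=1, B=1000, max_p=10, alpha=0.05, seed=0):
    """Basic sieve bootstrap (1 - alpha) prediction interval for x[n + h]."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    xc = x - mu
    p = select_order(xc, max_p)
    phi, resid = fit_ar(xc, p)
    resid = resid - resid.mean()              # centre the residuals before resampling
    rng = np.random.default_rng(seed)
    preds = np.empty(B)
    for b in range(B):
        path = list(xc[-p:])                  # condition on the last p observations
        for _ in range(h):
            eps = rng.choice(resid)           # i.i.d. resample of fitted residuals
            path.append(np.dot(phi, path[-1 : -p - 1 : -1]) + eps)
        preds[b] = path[-1] + mu
    lo, hi = np.quantile(preds, [alpha / 2, 1 - alpha / 2])
    return lo, hi
```

Because this version resamples the ordinary fitted residuals and keeps the AR coefficients fixed across bootstrap replicates, its intervals tend to be too narrow; the modified approach described in the abstract addresses exactly those two sources of undercoverage.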
