530 research outputs found

    Bias Reduction of Long Memory Parameter Estimators via the Pre-filtered Sieve Bootstrap

    Full text link
    This paper investigates the use of bootstrap-based bias correction of semi-parametric estimators of the long memory parameter in fractionally integrated processes. The re-sampling method involves the application of the sieve bootstrap to data pre-filtered by a preliminary semi-parametric estimate of the long memory parameter. Theoretical justification for using the bootstrap techniques to bias adjust the log-periodogram and semi-parametric local Whittle estimators of the memory parameter is provided. Simulation evidence comparing the performance of the bootstrap bias correction with analytical bias correction techniques is also presented. The bootstrap method is shown to produce notable bias reductions, in particular when applied to an estimator for which analytical adjustments have already been used. The empirical coverage of confidence intervals based on the bias-adjusted estimators is very close to the nominal level for a reasonably large sample size, more so than for the comparable analytically adjusted estimators. The precision of inferences (as measured by interval length) is also greater when the bootstrap, rather than an analytical adjustment, is used to correct the bias.
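    The pre-filtering idea described above can be made concrete with a short sketch. Assuming a Gaussian fractionally integrated series x, the Python code below estimates d by a log-periodogram (GPH) regression, pre-filters the data by (1-L)^d_hat, fits an autoregressive sieve to the filtered series, and uses sieve-bootstrap replicates to bias-correct d_hat. The helper names (gph_estimate, frac_diff, prefiltered_sieve_bias_correction) and all tuning choices (bandwidth m, sieve order p, number of replicates B) are illustrative assumptions, not the authors' code.

```python
# Sketch of pre-filtered sieve bootstrap bias correction of a GPH estimate of d.
# Helper names and tuning constants are assumptions for illustration only.
import numpy as np

def gph_estimate(x, m):
    """Log-periodogram (GPH) estimate of d using the first m Fourier frequencies."""
    n = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    X = -2 * np.log(2 * np.sin(lam / 2))        # GPH regressor
    X = X - X.mean()
    return X @ (np.log(I) - np.log(I).mean()) / (X @ X)

def frac_diff(x, d):
    """Apply (1 - L)^d via the truncated binomial expansion."""
    n = len(x)
    w = np.ones(n)
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return np.array([w[:t + 1] @ x[t::-1] for t in range(n)])

def prefiltered_sieve_bias_correction(x, m, p=4, B=199, rng=None):
    rng = np.random.default_rng(rng)
    d_hat = gph_estimate(x, m)
    u = frac_diff(x - x.mean(), d_hat)          # pre-filtered, roughly short memory
    Y = u[p:]                                   # least-squares AR(p) sieve
    Z = np.column_stack([u[p - j:-j] for j in range(1, p + 1)])
    phi = np.linalg.lstsq(Z, Y, rcond=None)[0]
    resid = Y - Z @ phi
    resid = resid - resid.mean()
    d_stars = []
    for _ in range(B):
        e = rng.choice(resid, size=len(u) + p, replace=True)
        u_star = np.zeros(len(u) + p)           # zero start-up values
        for t in range(p, len(u_star)):
            u_star[t] = phi @ u_star[t - p:t][::-1] + e[t]
        x_star = frac_diff(u_star[p:], -d_hat)  # re-integrate with (1 - L)^(-d_hat)
        d_stars.append(gph_estimate(x_star, m))
    return d_hat - (np.mean(d_stars) - d_hat)   # bias-corrected estimate
```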

    Higher-Order Improvements of the Sieve Bootstrap for Fractionally Integrated Processes

    Full text link
    This paper investigates the accuracy of bootstrap-based inference in the case of long memory fractionally integrated processes. The re-sampling method is based on the semi-parametric sieve approach, whereby the dynamics in the process used to produce the bootstrap draws are captured by an autoregressive approximation. Application of the sieve method to data pre-filtered by a semi-parametric estimate of the long memory parameter is also explored. Higher-order improvements yielded by both forms of re-sampling are demonstrated using Edgeworth expansions for a broad class of statistics that includes first- and second-order moments, the discrete Fourier transform and regression coefficients. The methods are then applied to the problem of estimating the sampling distributions of the sample mean and of selected sample autocorrelation coefficients, in experimental settings. In the case of the sample mean, the pre-filtered version of the bootstrap is shown to avoid the distinct underestimation of the sampling variance of the mean which the raw sieve method exhibits in finite samples, the higher-order accuracy of the latter notwithstanding. Pre-filtering also produces gains in the accuracy with which the sampling distributions of the sample autocorrelations are reproduced, most notably in the part of the parameter space in which asymptotic normality does not obtain. Most importantly, the sieve bootstrap is shown to reproduce the (empirically infeasible) Edgeworth expansion of the sampling distribution of the autocorrelation coefficients in the part of the parameter space in which the expansion is valid.
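    As a companion to the bias-correction sketch above, the raw (unfiltered) sieve bootstrap can be sketched as follows: an AR(p) approximation is fitted directly to the observed series, residuals are resampled to generate bootstrap series, and the sampling distributions of the sample mean and the lag-1 sample autocorrelation are read off the replicates. The order-selection rule, the burn-in length and the function names are illustrative assumptions only.

```python
# Sketch of the raw AR-sieve bootstrap for the sample mean and lag-1 autocorrelation.
# Order selection, burn-in and helper names are assumptions for illustration.
import numpy as np

def fit_ar(u, p):
    """Least-squares AR(p) fit; returns coefficients and residuals."""
    Y = u[p:]
    Z = np.column_stack([u[p - j:len(u) - j] for j in range(1, p + 1)])
    phi = np.linalg.lstsq(Z, Y, rcond=None)[0]
    return phi, Y - Z @ phi

def sieve_bootstrap_stats(x, p=None, B=499, rng=None):
    rng = np.random.default_rng(rng)
    u = x - x.mean()
    n = len(u)
    if p is None:                               # crude AIC-style order selection
        p = min(range(1, int(10 * np.log10(n))),
                key=lambda q: n * np.log(np.var(fit_ar(u, q)[1])) + 2 * q)
    phi, resid = fit_ar(u, p)
    resid = resid - resid.mean()
    means, acf1 = [], []
    for _ in range(B):
        e = rng.choice(resid, size=n + 100, replace=True)   # 100 burn-in draws
        u_star = np.zeros(n + 100)
        for t in range(p, n + 100):
            u_star[t] = phi @ u_star[t - p:t][::-1] + e[t]
        y = u_star[100:] + x.mean()
        means.append(y.mean())
        yc = y - y.mean()
        acf1.append((yc[1:] @ yc[:-1]) / (yc @ yc))
    # bootstrap standard error of the mean and percentile interval for acf(1)
    return np.std(means), np.percentile(acf1, [2.5, 97.5])
```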

    Expansions for Approximate Maximum Likelihood Estimators of the Fractional Difference Parameter

    Get PDF
    This paper derives second-order expansions for the distributions of the Whittle and profile plug-in maximum likelihood estimators of the fractional difference parameter in the ARFIMA(0,d,0) model with unknown mean and variance. Both estimators are shown to be second-order pivotal. This extends earlier findings of Lieberman and Phillips (2001), who derived expansions for the Gaussian maximum likelihood estimator under the assumption that the mean and variance are known. One implication of the results is that the parametric bootstrap upper one-sided confidence interval provides an o(n^{-1} ln n) improvement over the delta method. For statistics that are not second-order pivotal, the improvement is generally only of the order o(n^{-1/2} ln n).
    Keywords: Bootstrap; Edgeworth expansion; Fractional differencing; Pivotal statistic
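    To make the parametric bootstrap confidence interval mentioned above concrete, the sketch below computes a concentrated Whittle estimate of d for an ARFIMA(0,d,0) model, simulates parametric bootstrap series at the estimated d, and forms a basic-bootstrap upper confidence bound. The simulation device (truncated fractional integration of Gaussian noise), the helper names and the tuning constants are assumptions for illustration, not the paper's construction.

```python
# Sketch of a parametric bootstrap upper one-sided confidence bound for d
# in ARFIMA(0,d,0); helper names and tuning constants are assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

def whittle_d(x, bound=0.49):
    """Concentrated Whittle estimate of d for ARFIMA(0,d,0)."""
    n = len(x)
    j = np.arange(1, n // 2)
    lam = 2 * np.pi * j / n
    I = np.abs(np.fft.fft(x - x.mean())[j]) ** 2 / (2 * np.pi * n)
    g = lambda d: (2 * np.sin(lam / 2)) ** (-2 * d)          # spectral shape
    obj = lambda d: np.log(np.mean(I / g(d))) + np.mean(np.log(g(d)))
    return minimize_scalar(obj, bounds=(-bound, bound), method="bounded").x

def arfima_sim(n, d, sigma=1.0, burn=300, rng=None):
    """Simulate ARFIMA(0,d,0) by (truncated) fractional integration of noise."""
    rng = np.random.default_rng(rng)
    e = sigma * rng.standard_normal(n + burn)
    psi = np.ones(n + burn)                     # MA(inf) weights of (1 - L)^(-d)
    for k in range(1, n + burn):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    x = np.array([psi[:t + 1] @ e[t::-1] for t in range(n + burn)])
    return x[burn:]

def upper_confidence_bound(x, alpha=0.05, B=499, rng=None):
    rng = np.random.default_rng(rng)
    d_hat = whittle_d(x)
    d_star = np.array([whittle_d(arfima_sim(len(x), d_hat, rng=rng))
                       for _ in range(B)])
    # basic bootstrap: d <= d_hat - (alpha-quantile of d* - d_hat)
    return d_hat - np.quantile(d_star - d_hat, alpha)
```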

    Bootstrap approaches for estimation and confidence intervals of long memory processes.

    Get PDF
    In this work we investigate an alternative bootstrap approach, based on a result of Ramsey (1974) and on the Durbin-Levinson algorithm, to obtain surrogate series from linear Gaussian processes with long range dependence. We compare this bootstrap method with other existing procedures in a wide Monte Carlo experiment by estimating, parametrically and semi-parametrically, the memory parameter d. We consider Gaussian and non-Gaussian processes to assess the robustness of the method to deviations from normality. The approach is also useful for constructing confidence intervals for the memory parameter d, improving the coverage level of the interval.
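    A minimal sketch of the Durbin-Levinson surrogate idea is given below: the autocovariances of an ARFIMA(0,d,0) process are computed from the standard recursion, and a Gaussian series with exactly that autocovariance structure is generated one observation at a time from the Durbin-Levinson prediction coefficients and innovation variances. Function names and the ARFIMA(0,d,0) choice of target model are illustrative assumptions.

```python
# Sketch: Gaussian surrogate series with ARFIMA(0,d,0) autocovariances,
# generated via the Durbin-Levinson recursion.  Names are illustrative.
import numpy as np
from scipy.special import gamma as Gamma

def arfima_acov(n, d, sigma2=1.0):
    """Autocovariances gamma(0), ..., gamma(n-1) of ARFIMA(0,d,0), |d| < 0.5."""
    g = np.empty(n)
    g[0] = sigma2 * Gamma(1 - 2 * d) / Gamma(1 - d) ** 2
    for k in range(1, n):
        g[k] = g[k - 1] * (k - 1 + d) / (k - d)
    return g

def durbin_levinson_surrogate(gamma, rng=None):
    """Draw one Gaussian series whose autocovariance function is gamma."""
    rng = np.random.default_rng(rng)
    n = len(gamma)
    x = np.empty(n)
    phi = np.zeros(n)                    # current prediction coefficients
    v = gamma[0]                         # one-step prediction error variance
    x[0] = np.sqrt(v) * rng.standard_normal()
    for t in range(1, n):
        # Durbin-Levinson update of the partial autocorrelation and coefficients
        kappa = (gamma[t] - phi[:t - 1] @ gamma[t - 1:0:-1]) / v
        phi_new = phi.copy()
        phi_new[t - 1] = kappa
        phi_new[:t - 1] = phi[:t - 1] - kappa * phi[:t - 1][::-1]
        phi, v = phi_new, v * (1 - kappa ** 2)
        # best linear predictor of x[t] plus a fresh Gaussian innovation
        x[t] = phi[:t] @ x[t - 1::-1] + np.sqrt(v) * rng.standard_normal()
    return x

# example: one surrogate of length 500 with d = 0.3
x_surrogate = durbin_levinson_surrogate(arfima_acov(500, 0.3), rng=0)
```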

    A new bootstrap approach for Gaussian long memory time series.

    Get PDF
    In this work we introduce a new bootstrap approach based on a result of Ramsey (1974) and on the Durbin-Levinson algorithm to obtain surrogate series from linear Gaussian processes with long range dependence. First we investigate properties of this type of bootstrap, then we apply the method to semi-parametric estimators of the long memory parameter. We find that the performance of our bootstrap procedure is superior, in terms of MSE, to other established approaches.
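    One of the semi-parametric estimators to which such a bootstrap is typically applied is the local Whittle estimator of d; a short, self-contained sketch is given below. The function name and the bandwidth rule in the usage comment are assumptions for illustration.

```python
# Sketch of a local Whittle (Robinson-type) estimate of d; the name and the
# bandwidth rule in the usage comment are assumptions for illustration.
import numpy as np
from scipy.optimize import minimize_scalar

def local_whittle(x, m):
    """Local Whittle estimate of d from the first m Fourier frequencies."""
    n = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    def R(d):                                   # concentrated objective
        return np.log(np.mean(I * lam ** (2 * d))) - 2 * d * np.mean(np.log(lam))
    return minimize_scalar(R, bounds=(-0.49, 0.49), method="bounded").x

# usage on an observed series x, with an illustrative bandwidth rule:
# d_hat = local_whittle(x, m=int(len(x) ** 0.65))
```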

    Selection of the number of frequencies using bootstrap techniques in log-periodogram regression

    Get PDF
    The choice of the bandwidth in the local log-periodogram regression is of crucial importance for estimation of the memory parameter of a long memory time series. Different choices may give rise to completely different estimates, which may lead to contradictory conclusions, for example about the stationarity of the series. We propose here a data-driven bandwidth selection strategy that is based on minimizing a bootstrap approximation of the mean squared error, and compare its performance with other existing techniques for optimal bandwidth selection in a mean squared error sense, revealing better performance over a wider class of models. The empirical applicability of the proposed strategy is shown with two examples: the Nile river annual minimum levels, widely analysed in a long memory context, and the input gas rate series of Box and Jenkins.
    Keywords: bootstrap, long memory, log-periodogram regression, bandwidth selection
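    The selection rule described above can be sketched generically. In the code below the resampling device is left abstract: resample(x, rng) stands for any scheme that returns a bootstrap series mimicking the dependence in x (for instance a pre-filtered sieve draw as sketched earlier), d_pilot is a pilot estimate of d, and the bandwidth minimizing the bootstrap mean squared error about d_pilot is returned. This is a generic illustration, not the authors' exact algorithm.

```python
# Sketch of bandwidth selection by minimizing a bootstrap MSE approximation.
# The resampling device and the pilot estimate are user-supplied placeholders.
import numpy as np

def gph(x, m):
    """Log-periodogram (GPH) estimate of d using m Fourier frequencies."""
    n = len(x)
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    X = -2 * np.log(2 * np.sin(lam / 2))
    X = X - X.mean()
    return X @ (np.log(I) - np.log(I).mean()) / (X @ X)

def select_bandwidth(x, candidates, resample, d_pilot, B=199, rng=None):
    """Return the candidate m with the smallest bootstrap MSE about d_pilot.

    resample(x, rng) must return one bootstrap series that mimics the
    dependence structure of x (e.g. a pre-filtered sieve bootstrap draw)."""
    rng = np.random.default_rng(rng)
    boots = [resample(x, rng) for _ in range(B)]
    mse = [np.mean([(gph(xb, m) - d_pilot) ** 2 for xb in boots])
           for m in candidates]
    return candidates[int(np.argmin(mse))]
```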

    A Parametric Bootstrap Test for Cycles

    Get PDF
    The paper proposes a simple test for the hypothesis of strong cycles and, as a by-product, a test for weak dependence for linear processes. We show that the limit distribution of the test is the maximum of a (semi-)Gaussian process G(t), t ∈ [0, 1]. Because the covariance structure of G(t) is a complicated, model-dependent function of t, obtaining the critical values of max_{t ∈ [0,1]} G(t) may be difficult, if possible at all. For this reason we propose a bootstrap scheme in the frequency domain to circumvent the problem of obtaining (asymptotically) valid critical values. The proposed bootstrap can be regarded as an alternative to existing bootstrap methods in the time domain, such as the residual-based bootstrap. Finally, we illustrate the performance of the bootstrap test by a small Monte Carlo experiment and an empirical example.
    Keywords: cyclical data, strong and weak dependence, spectral density functions, Whittle estimator, bootstrap algorithms
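    A generic frequency-domain bootstrap of the kind alluded to above can be sketched as follows. Under the null, standardized periodogram ordinates behave approximately like independent unit exponentials, so bootstrap periodograms can be drawn as a fitted null spectral density times exponential weights, and the critical value of any periodogram-based statistic read off the bootstrap distribution. The test statistic stat and the fitted spectral density f_hat are left abstract; this is an illustrative device, not the paper's exact algorithm.

```python
# Sketch of a generic frequency-domain bootstrap critical value for a
# periodogram-based test statistic; stat and f_hat are user-supplied.
import numpy as np

def periodogram(x):
    n = len(x)
    lam = 2 * np.pi * np.arange(1, n // 2) / n
    I = np.abs(np.fft.fft(x - x.mean())[1:n // 2]) ** 2 / (2 * np.pi * n)
    return lam, I

def freq_domain_bootstrap_cv(x, stat, f_hat, alpha=0.05, B=999, rng=None):
    """Bootstrap (1 - alpha) critical value for stat(lam, I).

    stat(lam, I) computes the statistic from the periodogram, and f_hat(lam)
    is the spectral density fitted under the null (e.g. by Whittle estimation)."""
    rng = np.random.default_rng(rng)
    lam, _ = periodogram(x)
    f0 = f_hat(lam)
    draws = [stat(lam, f0 * rng.exponential(size=len(lam))) for _ in range(B)]
    return np.quantile(draws, 1 - alpha)

# usage: reject the null when stat(*periodogram(x)) exceeds
# freq_domain_bootstrap_cv(x, stat, f_hat)
```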