
    Higher-order Improvements of the Parametric Bootstrap for Markov Processes

    This paper provides bounds on the errors in coverage probabilities of maximum likelihood-based, percentile-t, parametric bootstrap confidence intervals for Markov time series processes. These bounds show that the parametric bootstrap for Markov time series provides higher-order improvements (over confidence intervals based on first-order asymptotics) that are comparable to those obtained by the parametric and nonparametric bootstrap for iid data and are better than those obtained by the block bootstrap for time series. Additional results are given for Wald-based confidence regions. The paper also shows that k-step parametric bootstrap confidence intervals achieve the same higher-order improvements as the standard parametric bootstrap for Markov processes. The k-step bootstrap confidence intervals are computationally attractive: they circumvent the need to solve a nonlinear optimization problem for each simulated bootstrap sample, which is necessary to implement the standard parametric bootstrap when the maximum likelihood estimator is defined by a nonlinear optimization problem.
    Keywords: Asymptotics, Edgeworth expansion, Gauss-Newton, k-step bootstrap, maximum likelihood estimator, Newton-Raphson, parametric bootstrap, t statistic
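
    A minimal sketch of a percentile-t parametric bootstrap interval for a simple Markov process may help fix ideas. The Gaussian AR(1) model, the function names, and the full re-estimation on each bootstrap sample (rather than the paper's k-step shortcut) are illustrative assumptions, not the paper's exact procedure.

        import numpy as np

        def fit_ar1(y):
            # Conditional MLE of (rho, sigma) for a Gaussian AR(1): y_t = rho*y_{t-1} + e_t.
            y0, y1 = y[:-1], y[1:]
            rho = y0 @ y1 / (y0 @ y0)
            resid = y1 - rho * y0
            sigma = np.sqrt(resid @ resid / len(resid))
            se = sigma / np.sqrt(y0 @ y0)               # asymptotic standard error of rho
            return rho, sigma, se

        def percentile_t_ci(y, level=0.95, B=999, rng=np.random.default_rng(0)):
            rho_hat, sigma_hat, se_hat = fit_ar1(y)
            n = len(y)
            t_stats = np.empty(B)
            for b in range(B):
                e = rng.normal(0.0, sigma_hat, n)       # parametric draws from the fitted model
                y_star = np.empty(n)
                y_star[0] = y[0]
                for t in range(1, n):
                    y_star[t] = rho_hat * y_star[t - 1] + e[t]
                rho_b, _, se_b = fit_ar1(y_star)
                t_stats[b] = (rho_b - rho_hat) / se_b   # bootstrap t statistic
            lo, hi = np.quantile(t_stats, [(1 - level) / 2, (1 + level) / 2])
            # Percentile-t interval: invert the bootstrap distribution of the t statistic.
            return rho_hat - hi * se_hat, rho_hat - lo * se_hat

        # Example: simulate an AR(1) series and compute the interval.
        rng = np.random.default_rng(1)
        y_obs = np.zeros(200)
        for t in range(1, 200):
            y_obs[t] = 0.6 * y_obs[t - 1] + rng.normal()
        print(percentile_t_ci(y_obs))

    In models where the MLE requires numerical optimization, the k-step variant discussed in the abstract replaces the full re-estimation inside the bootstrap loop with a few Newton-Raphson or Gauss-Newton steps started at the original estimate.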

    Higher-Order Improvements of a Computationally Attractive k-Step Bootstrap for Extremum Estimators

    This paper establishes the higher-order equivalence of the k-step bootstrap, introduced recently by Davidson and MacKinnon (1999a), and the standard bootstrap. The k-step bootstrap is a computationally very attractive alternative to the standard bootstrap for statistics based on nonlinear extremum estimators, such as generalized method of moments and maximum likelihood estimators. The paper also extends results of Hall and Horowitz (1996) to provide new results regarding the higher-order improvements of the standard bootstrap and the k-step bootstrap for extremum estimators (compared to procedures based on first-order asymptotics). The results of the paper apply to Newton-Raphson (NR), default NR, line-search NR, and Gauss-Newton k-step bootstrap procedures. The results apply to the nonparametric iid bootstrap, non-overlapping and overlapping block bootstraps, and restricted and unrestricted parametric bootstraps. The results cover symmetric and equal-tailed two-sided t tests and confidence intervals, one-sided t tests and confidence intervals, Wald tests and confidence regions, and J tests of over-identifying restrictions.
    Keywords: Asymptotics, block bootstrap, Edgeworth expansion, extremum estimator, Gauss-Newton, generalized method of moments estimator, k-step bootstrap, maximum likelihood estimator, Newton-Raphson, parametric bootstrap, t statistic, test of over-identifying restrictions
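
    The k-step idea itself can be sketched briefly, using a logistic-regression MLE as a stand-in extremum estimator (the model, function names, and default k below are illustrative assumptions, not the paper's setup): on each bootstrap sample, the estimator is updated by k Newton-Raphson steps started from the original estimate instead of being fully re-maximized.

        import numpy as np

        def logit_score_hess(beta, X, y):
            # Score vector and Hessian of the logistic log-likelihood.
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            score = X.T @ (y - p)
            hess = -(X * (p * (1 - p))[:, None]).T @ X
            return score, hess

        def k_step_estimate(beta_start, X, y, k=2):
            # k Newton-Raphson updates only, instead of a full nonlinear optimization.
            beta = beta_start.copy()
            for _ in range(k):
                score, hess = logit_score_hess(beta, X, y)
                beta = beta - np.linalg.solve(hess, score)
            return beta

        def k_step_bootstrap(beta_hat, X, y, B=499, k=2, rng=np.random.default_rng(0)):
            # Nonparametric iid bootstrap of the k-step estimator, started at beta_hat.
            n = len(y)
            draws = np.empty((B, len(beta_hat)))
            for b in range(B):
                idx = rng.integers(0, n, n)
                draws[b] = k_step_estimate(beta_hat, X[idx], y[idx], k)
            return draws

    Here beta_hat would be the fully optimized estimate on the original sample; the bootstrap draws are then used to form t tests, confidence intervals, or Wald statistics in the usual way.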

    Smoothed Empirical Likelihood Methods for Quantile Regression Models

    This paper considers an empirical likelihood method to estimate the parameters of quantile regression (QR) models and to construct confidence regions that are accurate in finite samples. To achieve the higher-order refinements, we smooth the estimating equations for the empirical likelihood. We show that the smoothed empirical likelihood (SEL) estimator is first-order asymptotically equivalent to the standard QR estimator and establish that confidence regions based on the smoothed empirical likelihood ratio have coverage errors of order n^{-1} and may be Bartlett-corrected to produce regions with an error of order n^{-2}, where n denotes the sample size. We further extend these results to censored quantile regression models. Our results extend the previous results of Chen and Hall (1993) to regression contexts. Monte Carlo experiments suggest that the smoothed empirical likelihood confidence regions may be more accurate in small samples than the confidence regions that can be constructed from the smoothed bootstrap method recently suggested by Horowitz (1998).
    Keywords: Bartlett correction, Bootstrap, Edgeworth expansion, Empirical likelihood, Quantile regression model, Censored quantile regression model
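
    The smoothing step can be illustrated in a few lines. The Gaussian-kernel CDF, the bandwidth h, and the function names below are assumptions for the sketch, not the paper's exact choices; the point is that the indicator in the QR estimating equations is replaced by a smooth function of the residual.

        import numpy as np
        from scipy.stats import norm

        def qr_moments(beta, X, y, tau):
            # Unsmoothed QR estimating functions: x_i * (tau - 1{y_i <= x_i' beta}).
            return X * (tau - (y <= X @ beta))[:, None]

        def smoothed_qr_moments(beta, X, y, tau, h):
            # Smoothed version: the indicator is replaced by a kernel CDF evaluated at
            # the scaled residual, making the moment functions differentiable in beta.
            G = norm.cdf((X @ beta - y) / h)
            return X * (tau - G)[:, None]

    The empirical likelihood is then formed from the smoothed moment functions, which is what permits the Edgeworth-expansion arguments behind the n^{-1} coverage error and the Bartlett correction.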

    Smoothed and Iterated Bootstrap Confidence Regions for Parameter Vectors

    The construction of confidence regions for parameter vectors is a difficult problem in the nonparametric setting, particularly when the sample size is not large. The bootstrap has shown promise in solving this problem, but empirical evidence often indicates that some bootstrap methods have difficulty in maintaining the correct coverage probability, while other methods may be unstable, often resulting in very large confidence regions. One way to improve the performance of a bootstrap confidence region is to restrict the shape of the region in such a way that the error term of an expansion is of as small an order as possible. To some extent, this can be achieved by using the bootstrap to construct an ellipsoidal confidence region. This paper studies the effect of using the smoothed and iterated bootstrap methods to construct an ellipsoidal confidence region for a parameter vector. The smoothed estimate is based on a multivariate kernel density estimator. The paper establishes a bandwidth matrix for the smoothed bootstrap procedure that reduces the asymptotic coverage error of the bootstrap percentile method ellipsoidal confidence region. We also provide an analytical adjustment to the nominal level to reduce the computational cost of the iterated bootstrap method. Simulations demonstrate that the methods can be successfully applied in practice.
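
    A minimal sketch of a smoothed-bootstrap ellipsoidal region for a mean vector is given below. The simple scalar bandwidth scaling of the sample covariance, the function name, and the Mahalanobis-type calibration statistic are assumptions for illustration, not the bandwidth matrix or procedure derived in the paper.

        import numpy as np

        def smoothed_bootstrap_ellipsoid(X, level=0.95, B=999, h=0.5,
                                         rng=np.random.default_rng(0)):
            # X: (n, d) data matrix with d >= 2.
            n, d = X.shape
            mean, cov = X.mean(0), np.cov(X, rowvar=False)
            H = (h ** 2) * cov                          # simple bandwidth matrix choice
            stats = np.empty(B)
            for b in range(B):
                idx = rng.integers(0, n, n)
                # Smoothed resample: bootstrap draws plus multivariate Gaussian kernel noise.
                Xb = X[idx] + rng.multivariate_normal(np.zeros(d), H, n)
                mb, cb = Xb.mean(0), np.cov(Xb, rowvar=False)
                diff = mb - mean
                stats[b] = n * diff @ np.linalg.solve(cb, diff)
            c = np.quantile(stats, level)
            # Region: { mu : n (mean - mu)' cov^{-1} (mean - mu) <= c }, an ellipsoid.
            return mean, cov, c

    The iterated bootstrap discussed in the abstract would instead recalibrate the nominal level itself; the analytical adjustment the paper proposes avoids the nested resampling that such iteration normally requires.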

    Rearranging Edgeworth-Cornish-Fisher Expansions

    This paper applies a regularization procedure called increasing rearrangement to monotonize Edgeworth and Cornish-Fisher expansions and other related approximations of distribution and quantile functions of sample statistics. Besides restoring the logical monotonicity required of distribution and quantile functions, the procedure often delivers strikingly better approximations to the distribution and quantile functions of the sample mean than the original Edgeworth-Cornish-Fisher expansions.
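
    On a grid, the increasing rearrangement amounts to sorting the values of the approximation. The one-term Edgeworth expansion below is a standard textbook form used only to illustrate the operator; the function names and parameter values are assumptions, not taken from the paper.

        import numpy as np
        from scipy.stats import norm

        def edgeworth_cdf(x, n, skew):
            # One-term Edgeworth approximation to the CDF of the standardized sample mean;
            # it can be non-monotone (and exit [0, 1]) in the tails for small n or large skew.
            return norm.cdf(x) - norm.pdf(x) * skew * (x ** 2 - 1) / (6 * np.sqrt(n))

        def rearrange(values):
            # Increasing rearrangement on an equispaced grid: sort the evaluated values.
            return np.sort(values)

        grid = np.linspace(-4, 4, 401)
        raw = edgeworth_cdf(grid, n=10, skew=2.0)
        monotone = rearrange(raw)                       # same values, now nondecreasing in x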

    Objective Bayes and Conditional Frequentist Inference

    Objective Bayesian methods have garnered considerable interest and support among statisticians, particularly over the past two decades. It is often overlooked, however, that in some cases the appropriate frequentist inference to match is a conditional one. We present several methods for extending probability matching prior (PMP) methods to conditional settings. A method based on saddlepoint approximations is found to be the most tractable, and we demonstrate its use in the most common exact ancillary statistic models. As part of this analysis, we give a proof of an exactness property of a particular PMP in location-scale models. We use the proposed matching methods to investigate the relationships between conditional and unconditional PMPs. A key component of our analysis is a numerical study of the performance of probability matching priors from both a conditional and an unconditional perspective in exact ancillary models. In concluding remarks we propose several routes for future research.
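
    As a small illustration of what probability matching means (a standard textbook example, not the paper's conditional setting): for a normal location model with known variance, the flat prior is a matching prior, and one-sided posterior credible bounds have exact frequentist coverage, which a quick Monte Carlo check confirms.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        theta, sigma, n, alpha = 1.0, 2.0, 10, 0.05
        reps, cover = 20_000, 0
        for _ in range(reps):
            x = rng.normal(theta, sigma, n)
            xbar = x.mean()
            # Posterior under the flat prior: theta | x ~ N(xbar, sigma^2 / n).
            upper = xbar + norm.ppf(1 - alpha) * sigma / np.sqrt(n)
            cover += theta <= upper
        print(cover / reps)                             # close to the nominal 0.95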

    Adjusted empirical likelihood with high-order precision

    Empirical likelihood is a popular nonparametric or semi-parametric statistical method with many attractive statistical properties. Yet when the sample size is small, or the dimension of the accompanying estimating function is high, application of the empirical likelihood method can be hindered by the low precision of the chi-square approximation and by the nonexistence of solutions to the estimating equations. In this paper, we show that the adjusted empirical likelihood is effective at addressing both problems. With a specific level of adjustment, the adjusted empirical likelihood achieves the high-order precision of the Bartlett correction, in addition to the advantage of a guaranteed solution to the estimating equations. Simulation results indicate that the confidence regions constructed by the adjusted empirical likelihood have coverage probabilities comparable to, or substantially more accurate than, those of the original empirical likelihood enhanced by the Bartlett correction.
    Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org). DOI: http://dx.doi.org/10.1214/09-AOS750
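
    A minimal sketch of the adjustment for a vector mean follows, assuming Owen-style estimating functions g_i = x_i - mu. The default adjustment level a_n used here is a common choice from the adjusted-empirical-likelihood literature, not the specific Bartlett-matching level derived in the paper, and the function names are illustrative.

        import numpy as np

        def el_log_ratio(g):
            # Empirical likelihood ratio statistic for E[g] = 0, via the dual problem:
            # find lambda solving sum_i g_i / (1 + lambda' g_i) = 0 by damped Newton steps.
            n, d = g.shape
            lam = np.zeros(d)
            for _ in range(50):
                w = 1.0 + g @ lam
                grad = (g / w[:, None]).sum(0)
                if np.max(np.abs(grad)) < 1e-10:
                    break
                hess = -(g / (w ** 2)[:, None]).T @ g
                step = np.linalg.solve(hess, grad)
                new_lam = lam - step
                while np.any(1.0 + g @ new_lam <= 0):   # keep weights in the valid region
                    step /= 2.0
                    new_lam = lam - step
                lam = new_lam
            return 2.0 * np.sum(np.log1p(g @ lam))

        def adjusted_el_log_ratio(x, mu, a_n=None):
            # x: (n, d) data matrix; mu: hypothesized mean vector of length d.
            g = x - mu
            n = len(g)
            if a_n is None:
                a_n = max(1.0, np.log(n) / 2.0)         # common default adjustment level
            g_extra = -(a_n / n) * g.sum(0)             # pseudo-observation guaranteeing a solution
            return el_log_ratio(np.vstack([g, g_extra]))

    The added pseudo-observation pulls the origin inside the convex hull of the estimating functions, so the dual problem always has a solution; tuning a_n is what delivers the Bartlett-level precision described in the abstract.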