    Composite Likelihood Inference by Nonparametric Saddlepoint Tests

    The class of composite likelihood functions provides a flexible and powerful toolkit for carrying out approximate inference in complex statistical models when the full likelihood is either impossible to specify or infeasible to compute. However, the strength of the composite likelihood approach is dimmed when testing hypotheses about a multidimensional parameter, because the finite sample behavior of likelihood ratio, Wald, and score-type test statistics is tied to the Godambe information matrix. Consequently, inaccurate estimates of the Godambe information translate into inaccurate p-values. In this paper it is shown how accurate inference can be obtained by using a fully nonparametric saddlepoint test statistic derived from the composite score functions. The proposed statistic is asymptotically chi-square distributed up to a relative error of second order and does not depend on the Godambe information. The validity of the method is demonstrated through simulation studies.
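    As a toy illustration of the composite likelihood idea (our own construction, not the paper's setting), the sketch below estimates the correlation parameter of an equicorrelated multivariate normal by maximizing a pairwise likelihood, i.e. a sum of bivariate log-densities in place of the full joint density:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import multivariate_normal

# Toy setting: d-dimensional equicorrelated normal data; the full likelihood
# is replaced by the sum of all bivariate (pairwise) log-likelihoods.
rng = np.random.default_rng(0)
d, n, rho_true = 5, 200, 0.4
cov = rho_true * np.ones((d, d)) + (1 - rho_true) * np.eye(d)
X = rng.multivariate_normal(np.zeros(d), cov, size=n)

def pairwise_loglik(rho):
    cov2 = np.array([[1.0, rho], [rho, 1.0]])
    return sum(multivariate_normal.logpdf(X[:, [j, k]], cov=cov2).sum()
               for j in range(d) for k in range(j + 1, d))

# Maximize the composite likelihood over a valid correlation range.
res = minimize_scalar(lambda r: -pairwise_loglik(r),
                      bounds=(-0.2, 0.95), method="bounded")
rho_hat = res.x
```

    The composite score functions of an objective of this kind are the raw material from which a saddlepoint test statistic such as the one proposed here is built.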

    A robust approach for skewed and heavy-tailed outcomes in the analysis of health care expenditures

    In this paper robust statistical procedures are presented for the analysis of skewed and heavy-tailed outcomes as they typically occur in health care data. The new estimators and test statistics are extensions of classical maximum likelihood techniques for generalized linear models. In contrast to their classical counterparts, the new robust techniques show lower variability and excellent efficiency properties in the presence of small deviations from the assumed model, i.e. when the underlying distribution of the data lies in a neighborhood of the model. A simulation study, an analysis on real data, and a sensitivity analysis confirm the good theoretical statistical properties of the new techniques.
    Keywords: deviations from the model; GLM modeling; health econometrics; heavy tails; robust estimation; robust inference
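    A minimal sketch of one bounded-influence idea in this spirit (our own simplification, not the paper's estimator): a Poisson regression whose score equations downweight observations with large Pearson residuals via Huber-type weights, so a handful of gross outliers cannot dominate the fit:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
y = rng.poisson(np.exp(0.5 + 0.8 * x)).astype(float)
y[:5] = 80.0                      # a few gross outliers in the response

def robust_poisson_fit(x, y, c=1.345, n_iter=50):
    """Huber-weighted score equations for Poisson regression (illustrative)."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.zeros(2)
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        r = (y - mu) / np.sqrt(mu)                             # Pearson residuals
        w = np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))  # Huber weights
        score = X.T @ (w * (y - mu))                           # weighted score
        info = X.T @ ((w * mu)[:, None] * X)                   # weighted information
        beta = beta + np.linalg.solve(info, score)             # Newton-type step
    return beta

beta_rob = robust_poisson_fit(x, y)   # estimates of (intercept, slope)
```

    The classical maximum likelihood fit corresponds to w = 1 for every observation; the tuning constant c governs the efficiency/robustness trade-off.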

    Robust Small Sample Accurate Inference in Moment Condition Models

    Procedures based on the Generalized Method of Moments (GMM) (Hansen, 1982) are basic tools in modern econometrics. In most cases, the theory available for making inference with these procedures is based on first order asymptotic theory. It is well known that the (first order) asymptotic distribution does not provide accurate p-values and confidence intervals in moderate to small samples. Moreover, in the presence of small deviations from the assumed model, p-values and confidence intervals based on classical GMM procedures can be drastically affected (nonrobustness). Several alternative techniques have been proposed in the literature to improve the accuracy of GMM procedures. These alternatives address either the first order accuracy of the approximations (information and entropy econometrics (IEE)) or the nonrobustness (robust GMM estimators and tests). In this paper, we propose a new alternative procedure which combines robustness properties and accuracy in small samples. Specifically, we combine IEE techniques as developed in Imbens, Spady and Johnson (1998) to obtain finite sample accuracy with robust methods obtained by bounding the original orthogonality function as proposed in Ronchetti and Trojani (2001). This leads to new robust estimators and tests in moment condition models with excellent finite sample accuracy. Finally, we illustrate the accuracy of the new statistic by means of simulations for three models with overidentifying moment conditions.
    Keywords: exponential tilting, generalized method of moments, information and entropy econometrics, Monte Carlo, robust tests, saddlepoint techniques
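    A minimal sketch of the bounding step alone (our own toy example, estimating a mean, not the paper's procedure): the moment condition g(x, θ) = x − θ is replaced by a truncated version, which caps the influence any single observation can exert on the estimate:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(2)
x = rng.normal(loc=1.0, size=300)
x[:10] += 15.0                    # contaminated observations

def bounded_moment(theta, c=1.5):
    # Huberized orthogonality function: clip g = x - theta at +/- c.
    return np.clip(x - theta, -c, c).mean()

theta_classical = x.mean()                          # unbounded moment: dragged upward
theta_robust = brentq(bounded_moment, -5.0, 10.0)   # root of the bounded moment
```

    In the actual robust GMM construction the truncated orthogonality function is also recentered to preserve Fisher consistency; the clipping alone already bounds the influence function.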

    Optimal Conditionally Unbiased Bounded-Influence Inference in Dynamic Location and Scale Models

    This paper studies the local robustness of estimators and tests for the conditional location and scale parameters in a strictly stationary time series model. We first derive optimal bounded-influence estimators for such settings under a conditionally Gaussian reference model. Based on these results, optimal bounded-influence versions of the classical likelihood-based tests for parametric hypotheses are obtained. We propose a feasible and efficient algorithm for the computation of our robust estimators, which makes use of analytical Laplace approximations to estimate the auxiliary recentering vectors ensuring Fisher consistency in robust estimation. This strongly reduces the necessary computation time by avoiding the simulation of multidimensional integrals, a task that typically has to be addressed in the robust estimation of nonlinear time series models. In Monte Carlo simulations of an AR(1)-ARCH(1) process we show that our robust procedures maintain a very high efficiency under ideal model conditions and at the same time perform very satisfactorily under several forms of departure from conditional normality. By contrast, classical Pseudo Maximum Likelihood inference procedures are found to be highly inefficient under such local model misspecifications. These patterns are confirmed by an application to robust testing for ARCH.
    Keywords: time series models, M-estimators, influence function, robust estimation and testing
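    As a rough illustration of the simulation design (our own simplified code; the paper's optimal bounded-influence estimator is considerably more refined), the sketch below simulates an AR(1)-ARCH(1) process with heavy-tailed shocks and compares least squares with a simple Huber-weighted estimate of the AR coefficient:

```python
import numpy as np

rng = np.random.default_rng(3)
T, phi, a0, a1 = 2000, 0.5, 0.5, 0.3
y = np.zeros(T)
eps_prev = 0.0
for t in range(1, T):
    sigma2 = a0 + a1 * eps_prev ** 2            # ARCH(1) conditional variance
    # Student-t(4) shocks rescaled to unit variance: a departure from normality.
    eps = np.sqrt(sigma2) * rng.standard_t(df=4) / np.sqrt(2.0)
    y[t] = phi * y[t - 1] + eps
    eps_prev = eps

x, z = y[:-1], y[1:]
phi_ls = (x @ z) / (x @ x)                      # classical least squares

def huber_ar1(x, z, c=1.345, iters=30):
    phi_h = phi_ls
    for _ in range(iters):
        r = z - phi_h * x
        s = np.median(np.abs(r)) / 0.6745 + 1e-12           # robust residual scale
        w = np.minimum(1.0, c * s / np.maximum(np.abs(r), 1e-12))
        phi_h = ((w * x) @ z) / ((w * x) @ x)               # reweighted LS step
    return phi_h

phi_rob = huber_ar1(x, z)
```

    Downweighting large residuals is only a crude stand-in for the conditionally Gaussian bounded-influence construction described above, but it conveys how the shocks' heavy tails are kept from dominating the fit.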

    Robust inference with GMM estimators

    The local robustness properties of Generalized Method of Moments (GMM) estimators and of a broad class of GMM based tests are investigated in a unified framework. GMM statistics are shown to have bounded influence if and only if the function defining the orthogonality restrictions imposed on the underlying model is bounded. Since in many applications this function is unbounded, it is useful to have procedures that modify the starting orthogonality conditions in order to obtain a robust version of a GMM estimator or test. We show how this can be obtained when a reference model for the data distribution can be assumed. We develop a flexible algorithm for constructing a robust GMM (RGMM) estimator leading to stable GMM test statistics. The amount of robustness can be controlled by an appropriate tuning constant. We relate by an explicit formula the choice of this constant to the maximal admissible bias on the level and/or the power of a GMM test and the amount of contamination that one can reasonably assume given some information on the data. Finally, we illustrate the RGMM methodology with simulations of an application to RGMM testing for conditional heteroscedasticity in a simple linear autoregressive model. In this example we find a significant instability of the size and the power of a classical GMM testing procedure under a non-normal conditional error distribution. By contrast, the RGMM testing procedures can control the size and the power of the test under nonstandard conditions while maintaining a satisfactory power under an approximately normal conditional error distribution.
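    The if-and-only-if boundedness point can be seen in a small numeric experiment (our own toy, a location M-estimate, not the RGMM algorithm itself): as one observation wanders off, the estimate based on the unbounded moment function follows it, while the truncated version with tuning constant c stops responding:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(4)
x = rng.normal(size=100)

def truncated_estimate(data, c=1.5):
    # Root of the bounded orthogonality condition: mean of clip(x - theta) is zero.
    return brentq(lambda t: np.clip(data - t, -c, c).mean(), -50.0, 50.0)

results = []
for outlier in (5.0, 50.0, 500.0):
    data = np.append(x, outlier)
    results.append((data.mean(), truncated_estimate(data)))
# First coordinate (unbounded moment) grows with the outlier; second stays put.
```

    Raising c moves the truncated estimate toward the classical one, which is the efficiency/robustness trade-off the paper's explicit formula for the tuning constant quantifies.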

    Accurate and robust tests for indirect inference

    In this paper we propose accurate parameter and over-identification tests for indirect inference. Under the null hypothesis the new tests are asymptotically χ2-distributed with a relative error of order n^(-1). They exhibit better finite sample accuracy than classical tests for indirect inference, which have the same asymptotic distribution but an absolute error of order n^(-1/2). Robust versions of the tests are also provided. We illustrate their accuracy in nonlinear regression, Poisson regression with overdispersion, and a diffusion model.
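    A toy indirect-inference loop (our own construction, unrelated to the paper's specific examples) conveys the mechanics: pick an auxiliary statistic that is easy to compute, then choose the structural parameter so that the statistic on simulated paths matches its value on the data:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(5)
theta_true, T = 0.6, 1000
e = rng.normal(size=T + 1)
y = e[1:] + theta_true * e[:-1]        # MA(1) data; theta is the structural parameter

def acf1(z):
    z = z - z.mean()
    return (z[:-1] @ z[1:]) / (z @ z)  # auxiliary statistic: lag-1 autocorrelation

def simulated_acf1(theta, n_sim=20):
    # Fixed seeds (common random numbers) keep the objective smooth in theta.
    vals = []
    for s in range(n_sim):
        es = np.random.default_rng(100 + s).normal(size=T + 1)
        vals.append(acf1(es[1:] + theta * es[:-1]))
    return float(np.mean(vals))

target = acf1(y)
res = minimize_scalar(lambda th: (simulated_acf1(th) - target) ** 2,
                      bounds=(0.0, 0.99), method="bounded")
theta_hat = res.x
```

    The tests proposed in the paper assess, with higher-order accuracy, whether such a match is statistically adequate (over-identification) and whether restrictions on the structural parameter hold.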
