103 research outputs found

    Confidence Regions for Averaging Estimators

    Joint inference based on Stein-type averaging estimators in the linear regression model

    While averaging unrestricted and restricted estimators is known to reduce estimation risk, it is an open question whether this reduction can in turn improve inference. To analyze this question, we construct joint confidence regions centered at James–Stein averaging estimators in both homoskedastic and heteroskedastic linear regression models. These regions are asymptotically valid when the number of restrictions grows, possibly proportionally with the sample size. When used for hypothesis testing, we show that suitable restrictions enhance power over the standard F-test. We study the practical implementation through simulations and an application to consumption-based asset pricing.
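
    As a rough illustration of the kind of estimator this abstract refers to, the sketch below averages an unrestricted OLS fit with one that imposes zero restrictions on a subset of coefficients, using a Stein-type weight based on the F-statistic for those restrictions. The function name, the default tuning constant tau, and the weight formula are illustrative assumptions; the paper's weights and the confidence regions built around the averaging estimator are not reproduced here.

```python
import numpy as np

def stein_average(y, X, restricted_idx, tau=None):
    """Average the unrestricted OLS estimator with a restricted one
    (coefficients in `restricted_idx` set to zero), using a Stein-type
    weight based on the F-statistic for the restrictions.
    Illustrative sketch only, not the paper's construction."""
    n, p = X.shape
    k = len(restricted_idx)                          # number of restrictions
    free_idx = [j for j in range(p) if j not in restricted_idx]

    beta_u = np.linalg.lstsq(X, y, rcond=None)[0]    # unrestricted OLS

    beta_r = np.zeros(p)                             # restricted OLS: zeros imposed
    beta_r[free_idx] = np.linalg.lstsq(X[:, free_idx], y, rcond=None)[0]

    # F-statistic for the k zero restrictions
    rss_u = np.sum((y - X @ beta_u) ** 2)
    rss_r = np.sum((y - X @ beta_r) ** 2)
    F = ((rss_r - rss_u) / k) / (rss_u / (n - p))

    # Stein-type weight: shrink toward the restricted estimator when F is small
    if tau is None:
        tau = (k - 2) / k if k > 2 else 0.0          # hypothetical default choice
    w = max(0.0, 1.0 - tau / F)
    return w * beta_u + (1.0 - w) * beta_r
```

    The weight approaches one when the data strongly reject the restrictions (large F), so the averaging estimator then coincides with the unrestricted OLS fit.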

    Inference on LATEs with covariates

    In theory, two-stage least squares (TSLS) identifies a weighted average of covariate-specific local average treatment effects (LATEs) from a saturated specification, without making parametric assumptions on how the available covariates enter the model. In practice, TSLS is severely biased when saturation leads to a number of control dummies of the same order of magnitude as the sample size and to the use of many, arguably weak, instruments. This paper derives asymptotically valid tests and confidence intervals for an estimand that identifies the weighted average of LATEs targeted by saturated TSLS, even when the number of control dummies and instrument interactions is large. The proposed inference procedure is robust against four key features of saturated economic data: treatment effect heterogeneity, covariates with rich support, weak identification strength, and conditional heteroskedasticity.
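
    For concreteness, a saturated TSLS specification interacts the instrument with dummies for each covariate cell and controls for the same dummies. The sketch below is a naive version of that estimator under hypothetical names (saturated_tsls, groups); it is this estimator whose bias grows with the number of cells, and the paper's robust inference procedure, not shown here, targets that regime.

```python
import numpy as np

def saturated_tsls(y, d, z, groups):
    """Saturated TSLS sketch: treatment d instrumented by z interacted with
    covariate-cell dummies, controlling for the same cell dummies.
    `groups` assigns each observation to a covariate cell; names and
    construction are illustrative, not the paper's code."""
    cells = np.unique(groups)
    W = (groups[:, None] == cells[None, :]).astype(float)    # cell dummies
    Zfull = np.column_stack([W, W * z[:, None]])              # controls + interacted instruments
    Xfull = np.column_stack([W, d])                            # controls + endogenous treatment

    # Two-stage least squares via projection onto the instrument space
    P = Zfull @ np.linalg.pinv(Zfull)                          # n x n projection matrix
    Xhat = P @ Xfull
    beta = np.linalg.lstsq(Xhat, y, rcond=None)[0]
    return beta[-1]                                            # coefficient on the treatment
```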

    Forecasting using Random Subspace Methods

    Random subspace methods are a new approach to obtaining accurate forecasts in high-dimensional regression settings. Forecasts are constructed by averaging over forecasts from many submodels, generated by random selection or random Gaussian weighting of predictors. This paper derives upper bounds on the asymptotic mean squared forecast error of these strategies, which show that the methods are particularly suitable for macroeconomic forecasting. An empirical application to the FRED-MD data confirms the theoretical findings and shows random subspace methods to outperform competing methods on key macroeconomic indicators.
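
    A minimal sketch of the two randomization schemes described above, under assumed names and defaults (random_subspace_forecast, n_models, k): each submodel projects the predictors onto a random low-dimensional subspace, fits OLS, and the final forecast is the equal-weight average over submodels.

```python
import numpy as np

def random_subspace_forecast(y, X, x_new, n_models=500, k=10, gaussian=False, seed=0):
    """Average OLS forecasts from many low-dimensional submodels formed by
    randomly selecting k predictors or by random Gaussian weighting of all
    predictors. Parameter names and defaults are illustrative."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    forecasts = []
    for _ in range(n_models):
        if gaussian:
            R = rng.standard_normal((p, k))        # random Gaussian weighting
        else:
            cols = rng.choice(p, size=k, replace=False)
            R = np.zeros((p, k))
            R[cols, np.arange(k)] = 1.0            # random selection of predictors
        Xr = X @ R                                  # low-dimensional submodel
        beta = np.linalg.lstsq(Xr, y, rcond=None)[0]
        forecasts.append((x_new @ R) @ beta)
    return np.mean(forecasts)                       # equal-weight average of submodel forecasts
```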

    Identification- and many instrument-robust inference via invariant moment conditions

    Identification-robust hypothesis tests are commonly based on the continuous updating objective function or its score. When the number of moment conditions grows proportionally with the sample size, the large-dimensional weighting matrix prohibits the use of conventional asymptotic approximations, and the behavior of these tests remains unknown. We show that the structure of the weighting matrix opens up an alternative route to asymptotic results when, under the null hypothesis, the distribution of the moment conditions is reflection invariant. In a heteroskedastic linear instrumental variables model, we then establish asymptotic normality of conventional test statistics under many-instrument sequences. A key result is that the additional terms that appear in the variance are negative. Revisiting a study on the elasticity of substitution between immigrant and native workers, where the number of instruments is over a quarter of the sample size, we find that the many-instrument-robust approximation leads to substantially narrower confidence intervals.
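
    For reference, a bare-bones version of the continuous updating objective mentioned in the first sentence, written for a linear IV model with instruments Z: the weighting matrix is re-estimated at every candidate coefficient vector, which is the large-dimensional object the many-instrument analysis has to handle. The names and the centered variance estimator are assumptions of this sketch, not the paper's test statistic.

```python
import numpy as np

def cue_objective(beta, y, X, Z):
    """Continuous updating objective for a linear IV model with moment
    conditions g_i(beta) = z_i * (y_i - x_i' beta). Illustrative sketch,
    not the many-instrument-robust statistic derived in the paper."""
    n = len(y)
    u = y - X @ beta                        # residuals at the hypothesized beta
    g = Z * u[:, None]                      # n x k matrix of moment contributions
    gbar = g.mean(axis=0)                   # sample moment vector
    Omega = np.cov(g, rowvar=False)         # moment variance, re-estimated at each beta
    return n * gbar @ np.linalg.solve(Omega, gbar)
```
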
    • …