
    Nonparametric and semiparametric estimation with discrete regressors

    This paper presents and discusses procedures for estimating regression curves when regressors are discrete and applies them to semiparametric inference problems. We show that pointwise root-n consistency and global consistency of regression curve estimates are achieved without employing any smoothing, even for discrete regressors with unbounded support. These results still hold when smoothers are used, under much weaker conditions than those required with continuous regressors. Such estimates are useful in semiparametric inference problems. We discuss in detail the partially linear regression model and shape-invariant modelling. We also provide some guidance on estimation in semiparametric models where continuous and discrete regressors are present. The paper also includes a Monte Carlo study.
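    A minimal Python sketch of the no-smoothing idea (the function name and the simulated usage are illustrative assumptions, not the paper's code): with a discrete regressor, the regression curve can be estimated by the frequency method, i.e. by averaging Y within each cell of X; each cell mean is root-n consistent whenever the cell has positive probability.

        import numpy as np

        def cell_mean_regression(x, y):
            """Frequency (cell-mean) estimator of E[Y | X = c] for discrete X:
            average y over the observations falling in each cell of x.
            No bandwidth or smoothing is involved."""
            return {c: y[x == c].mean() for c in np.unique(x)}

        # Illustrative usage on simulated data (an assumption of this sketch).
        rng = np.random.default_rng(0)
        x = rng.integers(0, 5, size=1000)       # discrete regressor
        y = np.sin(x) + rng.normal(size=1000)   # outcome
        m_hat = cell_mean_regression(x, y)      # estimated regression curve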

    Pivotal estimation via square-root Lasso in nonparametric regression

    We propose a self-tuning $\sqrt{\mathrm{Lasso}}$ method that simultaneously resolves three important practical problems in high-dimensional regression analysis, namely it handles the unknown scale, heteroscedasticity and (drastic) non-Gaussianity of the noise. In addition, our analysis allows for badly behaved designs, for example, perfectly collinear regressors, and generates sharp bounds even in extreme cases, such as the infinite variance case and the noiseless case, in contrast to Lasso. We establish various nonasymptotic bounds for $\sqrt{\mathrm{Lasso}}$, including prediction norm rate and sparsity. Our analysis is based on new impact factors that are tailored for bounding prediction norm. In order to cover heteroscedastic non-Gaussian noise, we rely on moderate deviation theory for self-normalized sums to achieve Gaussian-like results under weak conditions. Moreover, we derive bounds on the performance of ordinary least squares (OLS) applied to the model selected by $\sqrt{\mathrm{Lasso}}$, accounting for possible misspecification of the selected model. Under mild conditions, the rate of convergence of OLS post $\sqrt{\mathrm{Lasso}}$ is as good as $\sqrt{\mathrm{Lasso}}$'s rate. As an application, we consider the use of $\sqrt{\mathrm{Lasso}}$ and OLS post $\sqrt{\mathrm{Lasso}}$ as estimators of nuisance parameters in a generic semiparametric problem (nonlinear moment condition or $Z$-problem), resulting in a construction of $\sqrt{n}$-consistent and asymptotically normal estimators of the main parameters. Comment: Published at http://dx.doi.org/10.1214/14-AOS1204 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
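    For concreteness, a minimal sketch of the square-root Lasso criterion, $\min_\beta \sqrt{n^{-1}\sum_i (y_i - x_i'\beta)^2} + (\lambda/n)\|\beta\|_1$, written as a convex program; using cvxpy as the solver is an assumption of this illustration, not something the paper prescribes.

        import numpy as np
        import cvxpy as cp

        def sqrt_lasso(X, y, lam):
            """Square-root Lasso sketch: minimize the root mean squared residual
            plus an l1 penalty.  The problem is convex (a second-order cone
            program), and the penalty level lam can be chosen from (n, p) alone,
            without an estimate of the noise scale -- the pivotal property."""
            n, p = X.shape
            beta = cp.Variable(p)
            rmse = cp.norm(y - X @ beta, 2) / np.sqrt(n)
            cp.Problem(cp.Minimize(rmse + (lam / n) * cp.norm1(beta))).solve()
            return beta.value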

    Inference on semiparametric models with discrete regressors

    We study statistical properties of coefficient estimates of the partially linear regression model when some or all regressors in the unknown part of the model are discrete. The method does not require smoothing in the discrete variables. Unlike the case of continuous regressors, when all regressors are discrete, independence between regressors and regression errors is not required. We also give some guidance on how to implement the estimator when there are both continuous and discrete regressors in the unknown part of the model. The weights employed in this paper seem straightforwardly applicable to other semiparametric problems.
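    A minimal sketch under one common reading of the abstract: a Robinson-type two-step fit of y = X @ beta + g(w) + e in which kernel smoothing is replaced by exact cell means over the discrete regressor w; the function name and the within-cell demeaning details are assumptions of this illustration, not the paper's code.

        import numpy as np

        def plr_discrete(y, X, w):
            """Partially linear model y = X @ beta + g(w) + e with discrete w.
            Demean y and each column of X within the cells of w (a no-smoothing
            analogue of Robinson's kernel-based partialling out), then run OLS
            of the demeaned outcome on the demeaned regressors."""
            y_t, X_t = y.astype(float).copy(), X.astype(float).copy()
            for c in np.unique(w):
                m = (w == c)
                y_t[m] -= y_t[m].mean()
                X_t[m] -= X_t[m].mean(axis=0)
            beta, *_ = np.linalg.lstsq(X_t, y_t, rcond=None)
            return beta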

    Decentralization Estimators for Instrumental Variable Quantile Regression Models

    The instrumental variable quantile regression (IVQR) model (Chernozhukov and Hansen, 2005) is a popular tool for estimating causal quantile effects with endogenous covariates. However, estimation is complicated by the non-smoothness and non-convexity of the IVQR GMM objective function. This paper shows that the IVQR estimation problem can be decomposed into a set of conventional quantile regression sub-problems, which are convex and can be solved efficiently. This reformulation leads to new identification results and to fast, easy-to-implement, and tuning-free estimators that do not require the availability of high-level "black box" optimization routines.
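    To illustrate how IVQR estimation can be reduced to conventional quantile regression sub-problems in the simplest case (one endogenous regressor with a scalar coefficient), here is a sketch of the related grid-search idea behind Chernozhukov and Hansen's inverse quantile regression, not the decentralization algorithm itself; the use of statsmodels and the grid range are assumptions of the sketch.

        import numpy as np
        import statsmodels.api as sm

        def ivqr_grid(y, d, x, z, tau=0.5, alpha_grid=None):
            """For each candidate coefficient alpha on the endogenous regressor d,
            run a conventional quantile regression of y - d * alpha on (x, z) and
            keep the alpha that drives the instrument's coefficient closest to
            zero -- each sub-problem is an ordinary, convex quantile regression."""
            if alpha_grid is None:
                alpha_grid = np.linspace(-2.0, 2.0, 201)  # assumed search range
            exog = sm.add_constant(np.column_stack([x, z]))
            best_alpha, best_stat = None, np.inf
            for alpha in alpha_grid:
                fit = sm.QuantReg(y - d * alpha, exog).fit(q=tau)
                stat = abs(np.asarray(fit.params)[-1])  # coefficient on z
                if stat < best_stat:
                    best_alpha, best_stat = alpha, stat
            return best_alpha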