
    Estimator selection: a new method with applications to kernel density estimation

    Estimator selection has become a crucial issue in nonparametric estimation. Two widely used methods are penalized empirical risk minimization (such as penalized log-likelihood estimation) and pairwise comparison (such as Lepski's method). Our aim in this paper is twofold. First, we explain some general ideas about the calibration issue of estimator selection methods. We review some known results, putting the emphasis on the concept of minimal penalty, which is helpful for designing data-driven selection criteria. Secondly, we present a new method for bandwidth selection within the framework of kernel density estimation which is in some sense intermediate between the two main methods mentioned above. We provide theoretical results which lead to a fully data-driven selection strategy.
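    To make the minimal-penalty idea concrete, here is a minimal Python sketch, not the paper's procedure: a penalized quadratic contrast for a Gaussian kernel density estimator, scanned over a penalty constant `lam`. The penalty shape `lam * K(0)/(n*h)` and all constants are assumptions for illustration; below a critical value of `lam` the criterion favors the smallest bandwidth, and the selected bandwidth jumps once `lam` crosses it.

```python
import numpy as np

def gauss_kde(x_eval, data, h):
    """Gaussian kernel density estimate evaluated at x_eval."""
    u = (np.asarray(x_eval)[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def penalized_crit(data, h, lam, grid):
    """Penalized quadratic contrast: ||f_h||^2 - (2/n) sum_i f_h(X_i) + lam*K(0)/(n*h)."""
    n, dx = len(data), grid[1] - grid[0]
    f_on_grid = gauss_kde(grid, data, h)
    norm2 = np.sum(f_on_grid**2) * dx        # ||f_h||_2^2, grid approximation
    fit = gauss_kde(data, data, h).mean()    # (1/n) sum_i f_h(X_i), diagonal included
    pen = lam / (np.sqrt(2 * np.pi) * n * h) # illustrative penalty lam * K(0)/(n*h)
    return norm2 - 2 * fit + pen

# Scan the penalty constant: the selected bandwidth collapses to the smallest
# h when lam is below the critical (minimal) value and jumps above it.
rng = np.random.default_rng(0)
data = rng.normal(size=500)
grid = np.linspace(-5, 5, 1000)
bandwidths = np.geomspace(0.02, 1.0, 30)
for lam in [0.0, 0.5, 1.0, 2.0]:
    crit = [penalized_crit(data, h, lam, grid) for h in bandwidths]
    print(lam, bandwidths[int(np.argmin(crit))])
```

    The degenerate small-bandwidth behavior comes from the diagonal terms in the empirical contrast, which is exactly what a well-calibrated penalty has to compensate for.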

    Minimal penalty for Goldenshluger-Lepski method

    This paper is concerned with adaptive nonparametric estimation using the Goldenshluger-Lepski selection method. This estimator selection method is based on pairwise comparisons between estimators with respect to some loss function. The method also involves a penalty term that typically needs to be large enough for the method to work (in the sense that one can prove an oracle-type inequality for the selected estimator). In the case of density estimation with kernel estimators and a quadratic loss, we show that the procedure fails if the penalty term is chosen smaller than some critical value: the minimal penalty. More precisely, we show that the quadratic risk of the selected estimator explodes when the penalty is below this critical value, while it stays under control when the penalty is above it. This kind of phase-transition phenomenon for penalty calibration has already been observed and proved for penalized model selection methods in various contexts, but appears here for the first time for the Goldenshluger-Lepski pairwise comparison method. Simulations illustrate the theoretical results and give some hints on how to use the theory to calibrate the method in practice.
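    The pairwise-comparison structure can be sketched in Python under simplifying assumptions: Gaussian kernels (so the auxiliary estimator K_h * f_{h'} is itself a KDE with bandwidth sqrt(h^2 + h'^2)), quadratic loss approximated on a grid, and an illustrative penalty c*||K||_2^2/(n*h). The exact penalty shape and constants in the paper differ; this is only a sketch of the mechanism.

```python
import numpy as np

def gauss_kde(x_eval, data, h):
    """Gaussian kernel density estimate evaluated at x_eval."""
    u = (np.asarray(x_eval)[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def gl_select(data, bandwidths, grid, c=1.0):
    """Sketch of Goldenshluger-Lepski bandwidth selection, L2 loss, Gaussian kernel.

    For Gaussian kernels, K_h * f_{h'} is a KDE with bandwidth sqrt(h^2 + h'^2).
    The penalty c * ||K||_2^2 / (n*h) and the constants are illustrative assumptions.
    """
    n, dx = len(data), grid[1] - grid[0]
    normK2 = 1.0 / (2.0 * np.sqrt(np.pi))               # ||K||_2^2 for Gaussian K
    pen = {h: c * normK2 / (n * h) for h in bandwidths}
    f = {h: gauss_kde(grid, data, h) for h in bandwidths}
    best_h, best_val = None, np.inf
    for h in bandwidths:
        # A(h): worst penalized pairwise comparison against auxiliary estimators.
        A = 0.0
        for hp in bandwidths:
            f_aux = gauss_kde(grid, data, np.hypot(h, hp))   # K_h * f_{h'}
            dist2 = np.sum((f_aux - f[hp])**2) * dx
            A = max(A, max(dist2 - pen[hp], 0.0))
        if A + 2.0 * pen[h] < best_val:
            best_h, best_val = h, A + 2.0 * pen[h]
    return best_h
```

    In the spirit of the paper's result, rerunning this with `c` shrunk below some critical value makes the selection collapse toward the smallest bandwidth, while a large enough `c` keeps the selected estimator stable.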

    Bandwidth selection in kernel density estimation: Oracle inequalities and adaptive minimax optimality

    We address the problem of density estimation with $\mathbb{L}_s$-loss by selection of kernel estimators. We develop a selection procedure and derive corresponding $\mathbb{L}_s$-risk oracle inequalities. It is shown that the proposed selection rule leads to an estimator that is minimax adaptive over a scale of the anisotropic Nikol'skii classes. The main technical tools used in our derivations are uniform bounds on the $\mathbb{L}_s$-norms of empirical processes developed recently by Goldenshluger and Lepski [Ann. Probab. (2011), to appear]. Published in the Annals of Statistics (http://dx.doi.org/10.1214/11-AOS883) by the Institute of Mathematical Statistics (http://www.imstat.org).
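    To make the "oracle" in such inequalities concrete: in a simulation where the true density f is known, the oracle is the estimator in the family with the smallest true L_s risk, and an oracle inequality bounds the risk of the data-driven choice by roughly a constant times this benchmark. A grid-based sketch, with an assumed Gaussian example and illustrative numbers:

```python
import numpy as np

def gauss_kde(x_eval, data, h):
    """Gaussian kernel density estimate evaluated at x_eval."""
    u = (np.asarray(x_eval)[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def ls_risk(f_hat, f_true, grid, s):
    """Grid approximation of the L_s distance ||f_hat - f||_s."""
    dx = grid[1] - grid[0]
    return (np.sum(np.abs(f_hat - f_true)**s) * dx) ** (1.0 / s)

# The oracle bandwidth minimizes the true L_s risk; it is available only in
# simulations where f is known, which is what makes it a benchmark rather
# than an estimator.
rng = np.random.default_rng(1)
data = rng.normal(size=400)
grid = np.linspace(-5, 5, 2000)
f_true = np.exp(-0.5 * grid**2) / np.sqrt(2 * np.pi)
s = 4  # any L_s loss, not just the quadratic case s = 2
risks = {h: ls_risk(gauss_kde(grid, data, h), f_true, grid, s)
         for h in np.geomspace(0.05, 1.0, 25)}
oracle_h = min(risks, key=risks.get)
print(oracle_h, risks[oracle_h])
```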

    Spline-backfitted kernel smoothing of nonlinear additive autoregression model

    Application of nonparametric and semiparametric regression techniques to high-dimensional time series data has been hampered by the lack of effective tools to address the "curse of dimensionality." Under rather weak conditions, we propose spline-backfitted kernel estimators of the component functions for nonlinear additive time series data that are both computationally expedient, so they are usable for analyzing very high-dimensional time series, and theoretically reliable, so inference can be made on the component functions with confidence. Simulation experiments provide strong evidence corroborating the asymptotic theory. Published in the Annals of Statistics (http://dx.doi.org/10.1214/009053607000000488) by the Institute of Mathematical Statistics (http://www.imstat.org).
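    The two-stage idea can be sketched in a few lines of Python: a crude spline pilot fit of all additive components (here a cubic truncated-power basis with knots shared across covariates, an assumption made purely for simplicity), followed by univariate Nadaraya-Watson kernel smoothing of the pseudo-responses for the component of interest. This is a minimal sketch, not the authors' estimator; function names and parameters are hypothetical.

```python
import numpy as np

def trunc_power_basis(x, knots):
    """Cubic truncated-power spline basis (no intercept column)."""
    cols = [x, x**2, x**3] + [np.clip(x - k, 0, None)**3 for k in knots]
    return np.column_stack(cols)

def sbk_component(X, y, alpha, x_eval, knots, h):
    """Spline-backfitted kernel sketch for component alpha of an additive model.

    Stage 1: joint least-squares spline pilot fit of all components.
    Stage 2: Nadaraya-Watson kernel smoothing of the pseudo-responses.
    """
    n, d = X.shape
    # Stage 1: design = global intercept + spline basis of every covariate.
    blocks = [np.ones((n, 1))] + [trunc_power_basis(X[:, j], knots) for j in range(d)]
    B = np.concatenate(blocks, axis=1)
    beta, *_ = np.linalg.lstsq(B, y, rcond=None)
    p = 3 + len(knots)  # basis size per covariate
    comp = lambda j, x: trunc_power_basis(x, knots) @ beta[1 + j*p : 1 + (j+1)*p]
    # Pseudo-responses: strip the pilot fit of every component except alpha.
    y_tilde = y - beta[0] - sum(comp(j, X[:, j]) for j in range(d) if j != alpha)
    # Stage 2: univariate Nadaraya-Watson smoother on (X[:, alpha], y_tilde).
    u = (x_eval[:, None] - X[None, :, alpha]) / h
    w = np.exp(-0.5 * u**2)
    return (w @ y_tilde) / w.sum(axis=1)

# Example: y = sin(x0) + x1^2 + noise; recover component 0 (identified only
# up to an additive constant, as usual for additive models).
rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, size=(500, 2))
y = np.sin(X[:, 0]) + X[:, 1]**2 + 0.3 * rng.normal(size=500)
m0 = sbk_component(X, y, alpha=0, x_eval=np.linspace(-2, 2, 100),
                   knots=np.linspace(-1.5, 1.5, 5), h=0.3)
```

    The computational appeal is visible even in the sketch: the expensive joint fit is a single linear least-squares problem, and the kernel stage is univariate, so only the component of interest pays a smoothing cost.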