
    Optimal variance estimation without estimating the mean function

    We study the least squares estimator in the residual variance estimation context. We show that the mean squared differences of paired observations are asymptotically normally distributed. We further establish that, by regressing the mean squared differences of these paired observations on the squared distances between paired covariates via a simple least squares procedure, the resulting variance estimator is not only asymptotically normal and root-n consistent, but also reaches the optimal bound in terms of estimation variance. We also demonstrate the advantage of the least squares estimator in comparison with existing methods in terms of second order asymptotic properties. Comment: Published at http://dx.doi.org/10.3150/12-BEJ432 in the Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
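
    As a rough illustration of this kind of difference-based construction, the sketch below simulates a one-dimensional nonparametric model y_i = m(x_i) + eps_i, forms half squared differences of neighbouring responses, and regresses them on squared covariate distances so that the fitted intercept serves as the variance estimate. The neighbour pairing, the helper name variance_from_paired_differences and the simulated example are illustrative assumptions, not the paper's exact estimator.

```python
# Minimal sketch of a difference-based variance estimator, assuming the model
# y_i = m(x_i) + eps_i with Var(eps_i) = sigma^2 and a smooth mean function m.
import numpy as np

def variance_from_paired_differences(x, y):
    """Regress half squared response differences of neighbouring pairs on
    squared covariate distances; the fitted intercept estimates sigma^2."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    d = 0.5 * (y[1:] - y[:-1]) ** 2      # E[d] ~ sigma^2 + 0.5 * (m(x_{i+1}) - m(x_i))^2
    s = (x[1:] - x[:-1]) ** 2            # squared distances between paired covariates
    design = np.column_stack([np.ones_like(s), s])
    intercept, _slope = np.linalg.lstsq(design, d, rcond=None)[0]
    return intercept

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 500)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=500)   # true sigma^2 = 0.09
print(variance_from_paired_differences(x, y))                 # roughly 0.09
```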

    Model of Robust Regression with Parametric and Nonparametric Methods

    In the present work, we compare the performance of the classical parametric estimation method, ordinary least squares, with classical nonparametric estimation methods, several robust estimation methods, and two suggested methods under conditions in which varying degrees and directions of outliers are present in the observed data. The study addresses the problem via computer simulation. To cover the effects of the various outlier situations on the simple linear regression model, samples are classified into four cases (no outliers, outliers in the X-direction, outliers in the Y-direction, and outliers in the XY-direction), with outlier percentages of 10%, 20% and 30%. The performance of the estimators is evaluated with respect to mean squared error and relative mean squared error.
    Keywords: Simple Linear Regression Model; Ordinary Least Squares Method; Nonparametric Regression; Robust Regression; Least Absolute Deviations Regression; M-Estimation Regression; Trimmed Least Squares Regression
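
    A minimal Monte Carlo sketch of this kind of comparison is given below: it contaminates the response (Y-direction outliers) at a fixed percentage and contrasts ordinary least squares with a Huber M-estimator on the mean squared error of the slope. The sample size, contamination scheme and the choice of the Huber M-estimator (via statsmodels) are assumptions for illustration, not the study's exact design.

```python
# Sketch: compare OLS and an M-estimator under Y-direction outliers.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, reps = 50, 500
true_intercept, true_slope = 2.0, 1.5
outlier_rate = 0.20                       # 20% Y-direction outliers

mse_ols = mse_huber = 0.0
for _ in range(reps):
    x = rng.uniform(0.0, 10.0, n)
    y = true_intercept + true_slope * x + rng.normal(size=n)
    bad = rng.random(n) < outlier_rate
    y[bad] += rng.normal(loc=20.0, scale=5.0, size=bad.sum())   # shift contaminated responses
    X = sm.add_constant(x)
    mse_ols += (sm.OLS(y, X).fit().params[1] - true_slope) ** 2
    mse_huber += (sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit().params[1] - true_slope) ** 2

print("OLS slope MSE:    ", mse_ols / reps)
print("Huber M slope MSE:", mse_huber / reps)
```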

    Estimation of parameters using the bisquare weighted robust ridge regression (BRLTS) estimator in the presence of multicollinearity and outliers

    This study presents an improvement to the robust ridge regression estimator. We propose two methods, bisquare ridge least trimmed squares (BRLTS) and bisquare ridge least absolute value (BRLAV), based on ridge least trimmed squares (RLTS) and ridge least absolute value (RLAV) respectively. We compare these methods with existing estimators, namely ordinary least squares (OLS) and bisquare ridge regression (BRID), using three criteria to assess the estimated parameter coefficients: bias, root mean square error (RMSE) and standard error (SE). The results of BRLTS and BRLAV are compared with the existing methods using real data and a simulation study. The empirical evidence shows that BRLTS performs best among the estimators considered, followed by BRLAV, with the smallest RMSE values across the different disturbance distributions and degrees of multicollinearity.
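
    The sketch below illustrates the general idea of a bisquare-weighted ridge fit: Tukey bisquare weights computed by iteratively reweighted least squares downweight outlying residuals, while a ridge penalty stabilises the fit under multicollinearity. The tuning constant, MAD-based scale and plain ridge starting value are illustrative choices, not the BRLTS or BRLAV algorithms themselves.

```python
# Sketch of bisquare-weighted ridge regression via iteratively reweighted
# least squares; an illustration of the general idea, not BRLTS/BRLAV.
import numpy as np

def bisquare_ridge(X, y, k=1.0, c=4.685, n_iter=50):
    n, p = X.shape
    beta = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)        # plain ridge start
    for _ in range(n_iter):
        r = y - X @ beta
        scale = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12  # MAD residual scale
        u = r / (c * scale)
        w = np.where(np.abs(u) < 1, (1 - u ** 2) ** 2, 0.0)          # Tukey bisquare weights
        XtW = X.T * w                                                 # X' diag(w)
        beta_new = np.linalg.solve(XtW @ X + k * np.eye(p), XtW @ y)
        if np.allclose(beta_new, beta, atol=1e-8):
            break
        beta = beta_new
    return beta

rng = np.random.default_rng(2)
z = rng.normal(size=(100, 1))
X = z + 0.05 * rng.normal(size=(100, 3))        # three highly collinear predictors
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=0.5, size=100)
y[:10] += 15.0                                  # inject Y-direction outliers
print(bisquare_ridge(X, y, k=0.5))
```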