Sparse Ridge Fusion For Linear Regression

Abstract

In linear regression, the traditional technique handles the case where the number of observations n exceeds the number of predictor variables p (n > p). When n < p, the classical method fails to estimate the coefficients. This thesis provides a solution to this problem for the case of correlated predictors. A new regularization and variable selection method is proposed under the name of Sparse Ridge Fusion (SRF). In the case of highly correlated predictors, simulated examples and a real data set show that the SRF consistently outperforms the lasso, elastic net, and S-Lasso, and the results show that the SRF can select more predictor variables than the sample size n, whereas the lasso selects at most n variables.
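
The abstract's last point rests on a known property of the lasso: in the n < p setting, any lasso solution has at most n nonzero coefficients, while penalties with a ridge component (such as the elastic net) are not bound by this limit. The sketch below, a simulation written for illustration rather than the thesis's actual experiments (the SRF implementation is not given in the abstract), demonstrates that behavior with scikit-learn; the sample sizes, correlation structure, and penalty parameters are assumptions chosen only to exhibit the effect.

    # Illustrative only: shows the lasso's at-most-n selection limit under
    # n < p with highly correlated predictors, and that the elastic net
    # (which adds a ridge term) can exceed it. Not the thesis's SRF method.
    import numpy as np
    from sklearn.linear_model import Lasso, ElasticNet

    rng = np.random.default_rng(0)
    n, p = 50, 200                    # fewer observations than predictors (n < p)

    # Highly correlated design: every column is a shared factor plus small noise.
    z = rng.standard_normal((n, 1))
    X = z + 0.1 * rng.standard_normal((n, p))

    beta = np.zeros(p)
    beta[:80] = 1.0                   # 80 truly active predictors, more than n
    y = X @ beta + rng.standard_normal(n)

    lasso = Lasso(alpha=0.05, max_iter=10_000).fit(X, y)
    enet = ElasticNet(alpha=0.05, l1_ratio=0.5, max_iter=10_000).fit(X, y)

    # For data in general position, a lasso solution has at most n nonzeros;
    # the elastic net has no such cap.
    print("lasso selects:", np.count_nonzero(lasso.coef_), "of", p)
    print("elastic net selects:", np.count_nonzero(enet.coef_), "of", p)

This is the same gap the SRF is claimed to close: like the elastic net, a method combining sparsity with a ridge-type penalty can keep more than n correlated predictors in the model.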