34 research outputs found

    Penalized flexible Bayesian quantile regression

    Copyright © 2012 SciRes. This article has been made available through the Brunel Open Access Publishing Fund. The selection of predictors plays a crucial role in building a multiple regression model. Indeed, the choice of a suitable subset of predictors can help to improve prediction accuracy and interpretation. In this paper, we propose a flexible Bayesian Lasso and adaptive Lasso quantile regression by introducing a hierarchical model framework to enable exact inference and shrinkage of unimportant coefficients to zero. The error distribution is assumed to be an infinite mixture of Gaussian densities. We have theoretically investigated and numerically compared our proposed methods with flexible Bayesian quantile regression (FBQR), Lasso quantile regression (LQR) and quantile regression (QR) methods. Simulation and real data studies are conducted under different settings to assess the performance of the proposed methods. The proposed methods perform well in comparison to the other methods in terms of median mean squared error and the mean and variance of the absolute correlation criteria. We believe that the proposed methods are practically useful.
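The Lasso quantile regression that this abstract builds on can be summarized by its objective: the quantile (pinball) check loss plus an ℓ1 penalty. A minimal sketch of that objective follows; it is an illustration of the penalized loss only, not the paper's hierarchical Bayesian model, and the function names are ours.

```python
import numpy as np

def check_loss(residuals, tau):
    """Quantile (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    return np.sum(residuals * (tau - (residuals < 0)))

def lasso_qr_objective(beta, X, y, tau, lam):
    """Lasso-penalized quantile regression objective:
    check loss at quantile tau plus lam * ||beta||_1."""
    return check_loss(y - X @ beta, tau) + lam * np.sum(np.abs(beta))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([1.0, 0.0, -2.0])
y = X @ beta_true + rng.normal(size=50)

# For a moderate penalty, the objective at the data-generating
# coefficients is lower than at an all-zero fit.
print(lasso_qr_objective(beta_true, X, y, tau=0.5, lam=0.1)
      < lasso_qr_objective(np.zeros(3), X, y, tau=0.5, lam=0.1))
```

The Bayesian versions of this estimator arise from treating the check loss as a (negative) log-likelihood and the ℓ1 penalty as a Laplace prior on the coefficients.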

    Bayesian Tobit quantile regression using g-prior distribution with ridge parameter

    A Bayesian approach is proposed for coefficient estimation in the Tobit quantile regression model. The proposed approach is based on placing a g-prior distribution, which depends on the quantile level, on the regression coefficients. The prior is generalized by introducing a ridge parameter to address important challenges that may arise with censored data, such as multicollinearity and overfitting. A stochastic search variable selection approach is then proposed for the Tobit quantile regression model based on the g-prior. An expression for the hyperparameter g is proposed to calibrate the modified g-prior with a ridge parameter to the corresponding g-prior. Some possible extensions of the proposed approach are discussed, including continuous and binary responses in quantile regression. The methods are illustrated using several simulation studies and a microarray study. The simulation studies and the microarray study indicate that the proposed approach performs well.
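The role of the ridge parameter can be illustrated with a small sketch. Assuming the modified prior covariance takes the classical g-prior form with a ridge term added, g·σ²·(XᵀX + λI)⁻¹ (a hypothetical form for illustration; the paper's exact calibration of g is not reproduced here), the λI term keeps the prior well defined even under exact multicollinearity:

```python
import numpy as np

def ridge_g_prior_cov(X, g, lam, sigma2=1.0):
    """Prior covariance g * sigma2 * (X'X + lam*I)^{-1}.
    The ridge term lam*I keeps the matrix invertible even when
    predictors are exactly collinear (hypothetical illustrative form)."""
    p = X.shape[1]
    return g * sigma2 * np.linalg.inv(X.T @ X + lam * np.eye(p))

# Second and third columns are exactly collinear, so X'X alone is singular.
X = np.column_stack([np.ones(5), np.arange(5.0), 2 * np.arange(5.0)])
cov = ridge_g_prior_cov(X, g=10.0, lam=1.0)
print(cov.shape)
```

Without the ridge term, `np.linalg.inv` would fail on this design matrix; with it, the prior covariance is symmetric positive definite.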

    Quantile regression with group lasso for classification

    Applications of regression models for binary response are very common, and models specific to these problems are widely used. Quantile regression for binary response data has recently attracted attention, and regularized quantile regression methods have been proposed for high-dimensional problems. When the predictors have a natural group structure, such as in the case of categorical predictors converted into dummy variables, a group lasso penalty is used in regularized methods. In this paper, we present a Bayesian Gibbs sampling procedure to estimate the parameters of a quantile regression model under a group lasso penalty for classification problems with a binary response. Simulated and real data show good performance of the proposed method in comparison to mean-based approaches and to quantile-based approaches which do not exploit the group structure of the predictors.
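The group lasso penalty the abstract refers to shrinks whole blocks of coefficients together, so all dummies of one categorical predictor enter or leave the model as a unit. A minimal sketch of the penalty (standard form, with the usual √p_g group-size weights; the function name is ours):

```python
import numpy as np

def group_lasso_penalty(beta, groups, lam=1.0):
    """Group lasso penalty: lam * sum_g sqrt(p_g) * ||beta_g||_2.
    `groups` maps each group to the indices of its coefficients,
    e.g. the dummy columns of one categorical predictor."""
    return lam * sum(np.sqrt(len(idx)) * np.linalg.norm(beta[idx])
                     for idx in groups)

# Three dummy columns from one categorical predictor, plus a lone column.
beta = np.array([0.0, 0.0, 0.0, 2.0])
groups = [[0, 1, 2], [3]]
print(group_lasso_penalty(beta, groups))  # zeroed group contributes nothing
```

Because the ℓ2 norm of a group is not differentiable only at the origin of the whole block, the penalty zeroes out entire groups rather than individual dummies.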

    An efficient algorithm for structured sparse quantile regression

    An efficient algorithm is derived for solving the quantile regression problem combined with a group-sparsity-promoting penalty. The group sparsity of the regression parameters is achieved by using an ℓ_{1,∞}-norm penalty (or constraint) on the regression parameters. The algorithm is efficient in the sense that it obtains the regression parameters for a wide range of penalty parameters, thus enabling easy application of a model selection criterion afterwards. A Matlab implementation of the proposed algorithm is provided and some applications of the methods are studied.
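The mixed-norm penalty used here can be stated in a few lines: for a coefficient matrix with one row per group, the ℓ_{1,∞} norm sums the largest absolute entry of each row, so a row (group) is either active or entirely zero. A minimal sketch (the function name is ours):

```python
import numpy as np

def l1_inf_norm(B):
    """ell_{1,infty} norm of a coefficient matrix: sum over rows of the
    maximum absolute entry, which drives whole rows (groups) to zero."""
    return np.sum(np.max(np.abs(B), axis=1))

B = np.array([[1.0, -3.0],
              [0.0,  0.0],
              [2.0,  1.0]])
print(l1_inf_norm(B))  # 3 + 0 + 2 = 5.0
```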