
    robustHD: An R package for robust regression with high-dimensional data

    Forecasting stock market out-of-sample with regularised regression training techniques

    Forecasting the stock market out-of-sample is a major concern for researchers in finance and emerging markets. This research focuses on the application of regularised Regression Training (RT) techniques to forecast the monthly equity premium out-of-sample, recursively, with an expanding-window method. A broad set of regularised RT models of varying complexity was employed: Ridge, Forward-Backward (FOBA) Ridge, Least Absolute Shrinkage and Selection Operator (LASSO), Relaxed LASSO, Elastic Net, and Least Angle Regression were trained and used to forecast the equity premium out-of-sample. The empirical investigation demonstrates significant evidence of equity premium predictability, both statistically and economically, relative to the historical-average benchmark, delivering significant utility gains. Overall, Ridge gives the best statistical performance, while LASSO is the most economically meaningful. The results provide useful economic information on mean-variance portfolio investment for investors who time the market to earn future gains at minimal risk. The forecasting models thus appear to benefit an investor who optimally reallocates a monthly portfolio between equities and risk-free Treasury bills using equity premium forecasts.
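    A minimal sketch of the recursive expanding-window scheme this abstract describes, evaluated against the historical-average benchmark via the standard out-of-sample R-squared. The synthetic data, window length, and the scikit-learn Ridge estimator are illustrative assumptions; the paper's FOBA Ridge and Relaxed LASSO variants are not part of scikit-learn.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
T, p = 240, 10                                 # 20 years of monthly data, 10 predictors
X = rng.standard_normal((T, p))
y = 0.3 * X[:, 0] + rng.standard_normal(T)     # stand-in for the equity premium

start = 120                                    # initial training window (10 years)
model = Ridge(alpha=1.0)
forecasts, actuals, bench = [], [], []
for t in range(start, T):
    model.fit(X[:t], y[:t])                    # expanding window: all data up to t
    forecasts.append(model.predict(X[t:t + 1])[0])
    actuals.append(y[t])
    bench.append(y[:t].mean())                 # historical-average benchmark forecast

f, a, b = map(np.array, (forecasts, actuals, bench))
r2_oos = 1 - np.sum((a - f) ** 2) / np.sum((a - b) ** 2)
print(f"Out-of-sample R^2 vs historical average: {r2_oos:.3f}")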

    A New Bayesian Huberised Regularisation and Beyond

    Robust regression has recently attracted a great deal of attention in the literature, particularly for accounting for asymmetry and for high-dimensional analysis. However, the majority of research on these topics follows frequentist approaches, which are not capable of full probabilistic uncertainty quantification. This paper first proposes a new Huberised-type asymmetric loss function and its corresponding probability distribution, which is shown to have a scale-mixture-of-normals representation. We then introduce a new Bayesian Huberised regularisation for robust regression. A by-product of the research is a new Bayesian Huberised regularised quantile regression. We further present the theoretical posterior properties of these models. The robustness and effectiveness of the proposed models are demonstrated in simulation studies and real data analysis.
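    The abstract does not spell out the new asymmetric loss, so as background only, here is a minimal sketch of the standard Huber loss that such "Huberised" methods build on: quadratic near zero and linear in the tails, which is what caps the influence of outliers. The threshold value is the usual illustrative tuning constant, not a parameter taken from the paper.

```python
import numpy as np

def huber_loss(residuals, delta=1.345):
    # Quadratic for |r| <= delta, linear beyond: outliers get bounded influence.
    r = np.abs(residuals)
    return np.where(r <= delta,
                    0.5 * residuals ** 2,        # quadratic core
                    delta * (r - 0.5 * delta))   # linear tails

print(huber_loss(np.array([-3.0, 0.5, 2.0])))
```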

    Robust groupwise least angle regression

    Many regression problems exhibit a natural grouping among predictor variables. Examples are groups of dummy variables representing categorical variables, or present and lagged values of time series data. Since model selection in such cases typically aims at selecting groups of variables rather than individual covariates, an extension of the popular least angle regression (LARS) procedure to groupwise variable selection is considered. Data sets occurring in applied statistics frequently contain outliers that do not follow the model or the majority of the data. Therefore, a modification of the groupwise LARS algorithm is introduced that reduces the influence of outlying data points. Simulation studies and a real data example demonstrate the excellent performance of groupwise LARS and, when outliers are present, of its robustification.
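    Groupwise LARS itself is not available in scikit-learn; as an illustrative baseline only, this sketch runs plain LARS on a design matrix whose columns come in a natural group (dummy variables encoding one categorical predictor) and shows the motivating problem: selection of individual covariates can pick out some dummies of a group while dropping the rest. The data and coefficient budget are synthetic assumptions.

```python
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(1)
cat = rng.integers(0, 4, size=200)             # categorical predictor with 4 levels
X_cat = np.eye(4)[cat][:, 1:]                  # dummy encoding, first level dropped
X_num = rng.standard_normal((200, 3))          # unrelated numeric predictors
X = np.hstack([X_cat, X_num])
y = X_cat @ np.array([1.0, -1.0, 0.5]) + rng.standard_normal(200)

lars = Lars(n_nonzero_coefs=2).fit(X, y)
print(lars.coef_)   # with only 2 active coefficients, part of the dummy group is missed
```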
