    Simple Tests for Reduced Rank in Multivariate Regression

    The present work proposes tests for reduced rank in multivariate regression coefficient matrices, under rather general conditions. A heuristic approach is to first estimate the regressions via standard methods, then compare the coefficient matrix rows (or columns) to assess their redundancy. A formal version of this approach uses the distance between an unrestricted coefficient matrix estimate and an estimate restricted by reduced rank. Two distance minimization problems emerge, based on equivalent formulations of the null hypothesis. For each method we derive estimators and tests, and their asymptotic distributions. We examine test performance in simulation, and give some numerical examples.
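
    As a rough illustration of the distance idea, the following sketch (Python with NumPy; the function name rank_r_distance and the simulated data are ours, not the paper's) fits OLS, truncates the coefficient estimate to rank r via the SVD, and reports the squared Frobenius distance between the unrestricted and restricted estimates. The paper's formal tests use weighted distances and derive asymptotic null distributions; this unweighted version is illustrative only.

    import numpy as np

    def rank_r_distance(X, Y, r):
        """Squared Frobenius distance between the OLS coefficient estimate
        and its best rank-r approximation (via truncated SVD)."""
        C_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)   # unrestricted estimate
        U, s, Vt = np.linalg.svd(C_hat, full_matrices=False)
        C_r = (U[:, :r] * s[:r]) @ Vt[:r, :]            # rank-r restricted estimate
        return np.sum((C_hat - C_r) ** 2)               # = sum of discarded s[r:]**2

    # The distance drops to (near) zero once r reaches the true rank.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 5))
    B = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))  # true rank 2
    Y = X @ B + 0.1 * rng.standard_normal((500, 4))
    print([round(rank_r_distance(X, Y, r), 4) for r in range(1, 5)])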

    Forecasting Large Datasets with Reduced Rank Multivariate Models

    The paper addresses the issue of forecasting a large set of variables using multivariate models. In particular, we propose three alternative reduced rank forecasting models and compare their predictive performance with the most promising existing alternatives, namely factor models, large-scale Bayesian VARs, and multivariate boosting. Specifically, we focus on classical reduced rank regression, a two-step procedure that applies, in turn, shrinkage and reduced rank restrictions, and the reduced rank Bayesian VAR of Geweke (1996). We find that using shrinkage and rank reduction in combination rather than separately substantially improves the accuracy of forecasts, both when the whole set of variables is to be forecast and for key variables such as industrial production growth, inflation, and the federal funds rate.

    Keywords: Bayesian VARs, factor models, forecasting, reduced rank
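
    As a point of reference for the first of the three models, here is a minimal sketch of classical reduced rank regression with identity weighting (the helper name reduced_rank_regression is ours): the OLS coefficient matrix is projected onto its leading r directions using the right singular vectors of the fitted values. The shrinkage step and the Bayesian VAR are omitted.

    import numpy as np

    def reduced_rank_regression(X, Y, r):
        """Rank-r coefficient matrix for the regression of Y on X."""
        C_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
        _, _, Vt = np.linalg.svd(X @ C_ols, full_matrices=False)
        P = Vt[:r].T @ Vt[:r]      # projection onto the leading r response directions
        return C_ols @ P           # best rank-r fit under identity weighting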

    Reduced rank ridge regression and its kernel extensions

    In multivariate linear regression, it is often assumed that the response matrix is intrinsically of lower rank. This could be because of the correlation structure among the predictor variables or because the coefficient matrix is itself of low rank. To accommodate both, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with the reduced rank constraint on the coefficient matrix to arrive at a computationally straightforward algorithm. Numerical studies indicate that the proposed method consistently outperforms relevant competitors. A novel extension of the proposed method to the reproducing kernel Hilbert space (RKHS) setting is also developed. © 2011 Wiley Periodicals, Inc. Statistical Analysis and Data Mining 4: 612–622, 2011. Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/88011/1/10138_ftp.pd
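
    A minimal sketch of the ridge-plus-rank combination, using the standard data-augmentation view of ridge regression: appending sqrt(lam)*I rows to X (and zero rows to Y) turns the ridge problem into ordinary least squares, after which the rank constraint is imposed exactly as in reduced rank regression. This mirrors the idea described above but is a sketch under that assumption, not a verified reimplementation of the paper's algorithm.

    import numpy as np

    def reduced_rank_ridge(X, Y, lam, r):
        """Rank-r coefficient matrix with ridge penalty lam on the predictors."""
        n, p = X.shape
        q = Y.shape[1]
        X_aug = np.vstack([X, np.sqrt(lam) * np.eye(p)])   # ridge as augmented OLS
        Y_aug = np.vstack([Y, np.zeros((p, q))])
        B_ridge, *_ = np.linalg.lstsq(X_aug, Y_aug, rcond=None)
        _, _, Vt = np.linalg.svd(X_aug @ B_ridge, full_matrices=False)
        return B_ridge @ Vt[:r].T @ Vt[:r]                 # impose the rank constraint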

    Topics on Reduced Rank Methods for Multivariate Regression.

    Multivariate regression is a generalization of univariate regression to the case where we are interested in predicting q (> 1) responses that depend on the same set of features or predictors. Problems of this type are encountered commonly in many quantitative fields, such as bioinformatics, chemometrics, economics and engineering. The main goal is to build more accurate and interpretable models that can exploit the dependence structure among the responses and achieve appropriate dimension reduction. Reduced rank regression has been an important tool to this end due to its simplicity, computational efficiency and superior predictive performance. In the first part of this thesis we propose a reduced rank ridge regression method, which is robust against collinearity among the predictors. It also allows us to extend the solution to the reproducing kernel Hilbert space (RKHS) setting. The second part studies the effective degrees of freedom for a general class of reduced rank estimators in the framework of Stein's unbiased risk estimation (SURE). A finite-sample exact unbiased estimator is derived that admits a closed-form solution. This can be used to calculate popular model selection criteria such as BIC, Mallows' Cp or GCV, which provide a principled way of selecting the optimal rank. The proposed estimator is significantly different from the number of free parameters in the model, which is often taken as a heuristic estimate of the degrees of freedom of an estimation procedure. The final part deals with a non-parametric extension of reduced rank regression in a high-dimensional setting where many of the predictors may be non-informative. We propose a two-step penalized regression approach based on spline approximations that encourages both variable selection and rank reduction. We prove rank selection consistency and also provide error bounds for the proposed method.

    PhD thesis, Statistics, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/99837/1/ashinm_1.pd
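
    To make the model-selection point concrete, the sketch below selects the rank by BIC using the naive free-parameter count r(p + q - r) as the degrees of freedom; the thesis's SURE-based estimator is a replacement for exactly this heuristic, so the df term here is the baseline being improved upon. The BIC form assumed is the usual one for Gaussian errors with a common variance; other conventions exist.

    import numpy as np

    def select_rank_bic(X, Y, max_rank):
        """Pick the rank minimizing a BIC with the heuristic parameter count."""
        n, p = X.shape
        q = Y.shape[1]
        C_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
        _, _, Vt = np.linalg.svd(X @ C_ols, full_matrices=False)
        bics = []
        for r in range(1, max_rank + 1):
            C_r = C_ols @ Vt[:r].T @ Vt[:r]      # rank-r reduced rank fit
            rss = np.sum((Y - X @ C_r) ** 2)
            df = r * (p + q - r)                 # naive count of free parameters
            bics.append(n * q * np.log(rss / (n * q)) + df * np.log(n * q))
        return int(np.argmin(bics)) + 1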

    Forecasting Large Datasets with Bayesian Reduced Rank Multivariate Models

    The paper addresses the issue of forecasting a large set of variables using multivariate models. In particular, we propose three alternative reduced rank forecasting models and compare their predictive performance for US time series with the most promising existing alternatives, namely factor models, large-scale Bayesian VARs, and multivariate boosting. Specifically, we focus on classical reduced rank regression, a two-step procedure that applies, in turn, shrinkage and reduced rank restrictions, and the reduced rank Bayesian VAR of Geweke (1996). We find that using shrinkage and rank reduction in combination rather than separately substantially improves the accuracy of forecasts, both when the whole set of variables is to be forecast and for key variables such as industrial production growth, inflation, and the federal funds rate. The robustness of this finding is confirmed by a Monte Carlo experiment based on bootstrapped data. We also provide a consistency result for the reduced rank regression valid when the dimension of the system tends to infinity, which opens the way to using large-scale reduced rank models for empirical analysis.

    Keywords: Bayesian VARs, factor models, forecasting, reduced rank