
Reduced rank ridge regression and its kernel extensions

By Ashin Mukherjee and Ji Zhu


In multivariate linear regression, it is often assumed that the response matrix is intrinsically of lower rank. This could be because of the correlation structure among the prediction variables or because the coefficient matrix is of lower rank. To accommodate both, we propose a reduced rank ridge regression for multivariate linear regression. Specifically, we combine the ridge penalty with the reduced rank constraint on the coefficient matrix to obtain a computationally straightforward algorithm. Numerical studies indicate that the proposed method consistently outperforms relevant competitors. A novel extension of the proposed method to the reproducing kernel Hilbert space (RKHS) set‐up is also developed. © 2011 Wiley Periodicals, Inc. Statistical Analysis and Data Mining 4: 612–622, 2011
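The abstract describes combining a ridge penalty with a rank constraint on the coefficient matrix. A minimal illustrative sketch of this general idea (not necessarily the authors' exact algorithm) is to first compute the ridge estimate, then project it so the fitted values lie in a low-dimensional subspace via a truncated SVD; the function and variable names below are hypothetical:

```python
import numpy as np

def reduced_rank_ridge(X, Y, lam, rank):
    """Sketch: ridge-penalized multivariate regression followed by a
    rank-`rank` projection of the fitted values (illustrative only)."""
    n, p = X.shape
    # Ridge solution: B_ridge = (X'X + lam * I)^{-1} X'Y
    B_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)
    # SVD of the ridge fitted values; keep the top `rank` right singular vectors
    _, _, Vt = np.linalg.svd(X @ B_ridge, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]   # q x q projection onto the top-rank subspace
    return B_ridge @ P            # rank-constrained coefficient estimate

# Toy usage with a rank-1 true coefficient matrix
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
B_true = rng.normal(size=(5, 1)) @ rng.normal(size=(1, 3))
Y = X @ B_true + 0.1 * rng.normal(size=(50, 3))
B_hat = reduced_rank_ridge(X, Y, lam=1.0, rank=1)
```

The returned estimate has rank at most `rank`, while the ridge penalty `lam` shrinks the coefficients to stabilize the fit when predictors are correlated.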

Publisher: Wiley Subscription Services, Inc., A Wiley Company
Year: 2011
DOI identifier: 10.1002/sam.10138
OAI identifier: oai:deepblue.lib.umich.edu:2027.42/88011
