
Optimal Estimation and Prediction for Dense Signals in High-Dimensional Linear Models

Abstract

Estimation and prediction problems for dense signals are often framed in terms of minimax problems over highly symmetric parameter spaces. In this paper, we study minimax problems over ℓ2-balls for high-dimensional linear models with Gaussian predictors. We obtain sharp asymptotics for the minimax risk that are applicable in any asymptotic setting where the number of predictors diverges, and prove that ridge regression is asymptotically minimax. Adaptive asymptotically minimax ridge estimators are also identified. Orthogonal invariance is heavily exploited throughout the paper and, beyond serving as a technical tool, provides additional insight into the problems considered here. Most of our results follow from an apparently novel analysis of an equivalent non-Gaussian sequence model with orthogonally invariant errors. As with many dense estimation and prediction problems, the minimax risk studied here has rate d/n, where d is the number of predictors and n is the number of observations; however, when d is roughly proportional to n, the minimax risk is influenced by the spectral distribution of the predictors and is notably different from the linear minimax risk for the Gaussian sequence model (Pinsker, 1980) that often appears in other dense estimation and prediction problems.
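
As a rough sketch, in notation the abstract does not itself fix, the setting can be written as a Gaussian linear model with an ℓ2-ball parameter space,

\[
  y = X\beta + \varepsilon, \qquad X \in \mathbb{R}^{n \times d}, \quad \varepsilon \sim N(0, \sigma^2 I_n), \qquad
  \Theta(r) = \{\beta \in \mathbb{R}^d : \|\beta\|_2 \le r\},
\]

with the ridge estimator and minimax risk

\[
  \hat{\beta}_\lambda = (X^\top X + \lambda I_d)^{-1} X^\top y, \qquad
  R^*(r) = \inf_{\hat{\beta}} \sup_{\beta \in \Theta(r)} \mathbb{E}\,\|\hat{\beta} - \beta\|_2^2 .
\]

In this notation, the abstract's claims are that R^*(r) is of order d/n as the number of predictors diverges, and that ridge regression with a suitable (and adaptively chosen) penalty λ attains it asymptotically.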
