Estimating Structured High-Dimensional Covariance and Precision Matrices: Optimal Rates and Adaptive Estimation
This is an expository paper that reviews recent developments on optimal estimation of structured high-dimensional covariance and precision matrices. Minimax rates of convergence for estimating several classes of structured covariance and precision matrices, including bandable, Toeplitz, sparse, and sparse spiked covariance matrices as well as sparse precision matrices, are given under the spectral norm loss. Data-driven adaptive procedures for estimating various classes of matrices are presented. Some key technical tools used in the theoretical analyses, including large deviation results and minimax lower bound arguments, are discussed. In addition, estimation under other losses and a few related problems such as Gaussian graphical models, sparse principal component analysis, factor models, and hypothesis testing on the covariance structure are considered. Some open problems on estimating high-dimensional covariance and precision matrices and their functionals are also discussed.
Adaptive covariance matrix estimation through block thresholding
Estimation of large covariance matrices has drawn considerable recent
attention, and the theoretical focus so far has mainly been on developing a
minimax theory over a fixed parameter space. In this paper, we consider
adaptive covariance matrix estimation where the goal is to construct a single
procedure which is minimax rate optimal simultaneously over each parameter
space in a large collection. A fully data-driven block thresholding estimator
is proposed. The estimator is constructed by carefully dividing the sample
covariance matrix into blocks and then simultaneously estimating the entries in
a block by thresholding. The estimator is shown to be optimally rate adaptive
over a wide range of bandable covariance matrices. A simulation study is
carried out and shows that the block thresholding estimator performs well
numerically. Some of the technical tools developed in this paper can also be of
independent interest.

Comment: Published at http://dx.doi.org/10.1214/12-AOS999 in the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org).
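The idea of dividing the sample covariance matrix into blocks and thresholding entire blocks can be sketched as follows. This is an illustrative simplification, not the paper's exact procedure: the uniform block size and the simple threshold heuristic below are assumptions, whereas the paper's estimator is fully data-driven with increasing block sizes.

```python
import numpy as np

def block_threshold_cov(X, block_size=2, threshold=None):
    """Sketch of a block-thresholding covariance estimator.

    Splits the sample covariance matrix into block_size x block_size
    blocks and zeroes out any off-diagonal block whose Frobenius norm
    falls below a threshold. The default threshold is a rough heuristic
    of order sqrt(log p / n), not the paper's adaptive choice.
    """
    n, p = X.shape
    S = np.cov(X, rowvar=False)
    if threshold is None:
        threshold = block_size * np.sqrt(np.log(p) / n)  # heuristic level
    Sigma = S.copy()
    for i in range(0, p, block_size):
        for j in range(0, p, block_size):
            if i == j:
                continue  # diagonal blocks are always kept
            blk = S[i:i + block_size, j:j + block_size]
            if np.linalg.norm(blk, 'fro') < threshold:
                Sigma[i:i + block_size, j:j + block_size] = 0.0
    return Sigma
```

Because blocks (i, j) and (j, i) have equal Frobenius norms, the output stays symmetric; for bandable covariances the surviving blocks concentrate near the diagonal.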
Projection inference for high-dimensional covariance matrices with structured shrinkage targets
Analyzing large samples of high-dimensional data under dependence is a
challenging statistical problem as long time series may have change points,
most importantly in the mean and the marginal covariances, for which one needs
valid tests. Inference for large covariance matrices is especially difficult
due to noise accumulation, resulting in singular estimates and poor power of
related tests. The singularity of the sample covariance matrix in high
dimensions can be overcome by considering a linear combination with a regular,
more structured target matrix. This approach is known as shrinkage, and the
target matrix is typically of diagonal form. In this paper, we consider
covariance shrinkage towards structured nonparametric estimators of the
bandable or Toeplitz type, respectively, aiming at improved estimation accuracy
and statistical power of tests even under nonstationarity. We derive feasible
Gaussian approximation results for bilinear projections of the shrinkage
estimators which are valid under nonstationarity and dependence. These
approximations especially enable us to formulate a statistical test for
structural breaks in the marginal covariance structure of high-dimensional time
series without restrictions on the dimension, and which is robust against
nonstationarity of nuisance parameters. We show via simulations that shrinkage
helps to increase the power of the proposed tests. Moreover, we suggest a
data-driven choice of the shrinkage weights, and assess its performance by
means of a Monte Carlo study. The results indicate that the proposed shrinkage
estimator is superior for non-Toeplitz covariance structures close to
fractional Gaussian noise.
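The shrinkage construction described above, a linear combination of the sample covariance with a structured target, can be sketched for the Toeplitz case. This is a minimal illustration: the Toeplitz target here simply averages the sample covariance along its diagonals, and the fixed weight is a placeholder for the data-driven choice the paper proposes.

```python
import numpy as np

def toeplitz_target(S):
    """Toeplitz target: average the matrix along each diagonal."""
    p = S.shape[0]
    T = np.empty_like(S)
    for k in range(p):
        avg = np.mean(np.diagonal(S, offset=k))
        idx = np.arange(p - k)
        T[idx, idx + k] = avg  # k-th superdiagonal
        T[idx + k, idx] = avg  # k-th subdiagonal (symmetry)
    return T

def shrink_cov(X, weight=0.5):
    """Linear shrinkage of the sample covariance toward a Toeplitz target.

    weight is a fixed placeholder; the paper develops a data-driven
    choice of the shrinkage weights.
    """
    S = np.cov(X, rowvar=False)
    T = toeplitz_target(S)
    return weight * T + (1.0 - weight) * S
```

With weight close to 1 the estimate inherits the regularity of the target and is nonsingular even when p exceeds n; with weight 0 it reduces to the sample covariance.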