Generalized Information Criteria for Structured Sparse Models
Regularized M-estimators are widely used due to their ability to recover a
low-dimensional model in high-dimensional scenarios. Some recent efforts on
this subject focused on creating a unified framework for establishing oracle
bounds, and deriving conditions for support recovery. Under this same
framework, we propose a new Generalized Information Criterion (GIC) that takes
into consideration the sparsity pattern one wishes to recover. We obtain
non-asymptotic model selection bounds and sufficient conditions for model
selection consistency of the GIC. Furthermore, we show that the GIC can also be
used for selecting the regularization parameter within a regularized
M-estimation framework, which allows practical use of the GIC for model
selection in high-dimensional scenarios. We provide examples of group LASSO in
the context of generalized linear regression and low-rank matrix regression.
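The abstract's idea of selecting the regularization parameter by minimizing an information criterion can be illustrated with a minimal sketch. The functions below are illustrative, not the paper's GIC: a plain coordinate-descent lasso is fit over a grid of penalties, and a BIC-type criterion (here with weight `a_n = log(n)` and the nonzero count as a degrees-of-freedom proxy) picks the penalty level.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for the lasso (illustrative sketch)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding feature j.
            r_j = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r_j
            # Soft-thresholding update for coordinate j.
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

def info_criterion(X, y, beta, a_n):
    """BIC-type criterion: n*log(RSS/n) + a_n * df (df = nonzero count)."""
    n = X.shape[0]
    rss = np.sum((y - X @ beta) ** 2)
    df = np.count_nonzero(beta)
    return n * np.log(rss / n) + a_n * df

# Toy data: sparse truth with two active coefficients.
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[0], beta_true[1] = 3.0, -2.0
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Select the regularization parameter by minimizing the criterion.
grid = np.logspace(-1, 2, 25)
fits = [lasso_cd(X, y, lam) for lam in grid]
scores = [info_criterion(X, y, b, a_n=np.log(n)) for b in fits]
best_beta = fits[int(np.argmin(scores))]
```

With a strong signal, the selected fit keeps the two truly active coefficients; the paper's contribution is to make this kind of selection rule provably consistent for structured sparsity patterns.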
Ridge Estimation of Inverse Covariance Matrices from High-Dimensional Data
We study ridge estimation of the precision matrix in the high-dimensional
setting where the number of variables is large relative to the sample size. We
first review two archetypal ridge estimators and note that their utilized
penalties do not coincide with common ridge penalties. Subsequently, starting
from a common ridge penalty, analytic expressions are derived for two
alternative ridge estimators of the precision matrix. The alternative
estimators are compared to the archetypes with regard to eigenvalue shrinkage
and risk. The alternatives are also compared to the graphical lasso within the
context of graphical modeling. The comparisons may give reason to prefer the
proposed alternative estimators.
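A sketch of the kind of closed-form ridge precision estimator described above, assuming the penalized log-likelihood objective -log det(Omega) + tr(S Omega) + (lam/2)*||Omega - T||_F^2 with a target matrix T (function name and default target are illustrative):

```python
import numpy as np

def ridge_precision(S, lam, T=None):
    """Closed-form ridge estimator of the precision matrix.

    Minimizes -log det(Omega) + tr(S*Omega) + (lam/2)*||Omega - T||_F^2,
    whose stationarity condition inv(Omega) = (S - lam*T) + lam*Omega
    is solved in closed form by
        Omega = inv( 0.5*E + sqrtm(lam*I + 0.25*E^2) ),  E = S - lam*T.
    """
    p = S.shape[0]
    E = S - lam * (np.zeros((p, p)) if T is None else T)
    # Symmetric matrix square root via eigendecomposition.
    vals, vecs = np.linalg.eigh(lam * np.eye(p) + 0.25 * E @ E)
    sqrt_term = (vecs * np.sqrt(vals)) @ vecs.T
    return np.linalg.inv(0.5 * E + sqrt_term)

# High-dimensional toy example: p > n, so the sample covariance S is
# singular, yet the ridge estimator is well defined and positive definite.
rng = np.random.default_rng(1)
n, p = 10, 20
X = rng.standard_normal((n, p))
S = np.cov(X, rowvar=False, bias=True)
Omega = ridge_precision(S, lam=1.0)
```

Unlike the sample precision matrix, which does not exist when p > n, this estimator always returns a symmetric positive-definite matrix, and the stationarity identity inv(Omega) = S - lam*T + lam*Omega can be checked numerically.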