Variable selection in high-dimensional additive models based on norms of projections
We consider the problem of variable selection in high-dimensional sparse
additive models. We focus on the case where the components belong to
nonparametric classes of functions. The proposed method is motivated by
geometric considerations in Hilbert spaces and consists of comparing the norms
of the projections of the data onto various additive subspaces. Under minimal
geometric assumptions, we prove concentration inequalities which lead to new
conditions under which consistent variable selection is possible. As an
application, we establish conditions under which a single component can be
estimated with the rate of convergence corresponding to the situation in which
the other components are known.
Comment: 27 pages
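A minimal sketch of the projection-norm idea described in the abstract (an illustration only: the polynomial basis, the projection_norms helper, and the selection threshold are assumptions made for this example, not the authors' construction):

    import numpy as np

    def projection_norms(X, y, n_basis=5):
        """For each covariate, project y onto a one-dimensional polynomial
        subspace in that covariate and record the squared norm of the
        projection. (Illustrative only: the paper compares norms of
        projections onto additive subspaces; the basis and the selection
        rule here are assumptions, not the authors' method.)"""
        n, d = X.shape
        norms = np.empty(d)
        for j in range(d):
            # Design matrix spanning the candidate subspace for covariate j.
            B = np.vander(X[:, j], N=n_basis, increasing=True)
            # Least-squares fit gives the orthogonal projection of y
            # onto the column span of B.
            coef, *_ = np.linalg.lstsq(B, y, rcond=None)
            norms[j] = np.sum((B @ coef) ** 2)
        return norms

    # Toy usage: only the first of ten components is active.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 10))
    y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)
    scores = projection_norms(X, y)
    selected = np.flatnonzero(scores > 0.5 * scores.max())  # hypothetical threshold
    print(selected)  # the active component's projection norm dominates

In this toy example only the first covariate enters the regression function, so its projection carries almost all of the signal's norm and the (hypothetical) threshold recovers it.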
Lower bounds for invariant statistical models with applications to principal component analysis
This paper develops nonasymptotic information inequalities for the estimation
of the eigenspaces of a covariance operator. These results generalize previous
lower bounds for the spiked covariance model, and they show that recent upper
bounds for models with decaying eigenvalues are sharp. The proof relies on
lower bound techniques based on group invariance arguments which can also deal
with a variety of other statistical models.
Comment: 42 pages, to appear in Annales de l'Institut Henri Poincaré Probabilités et Statistique
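For context, a schematic of the setting the abstract refers to (an illustration under assumptions, not a statement from the paper): in the rank-one spiked covariance model one observes i.i.d. centered Gaussian vectors with covariance

\[
\Sigma \;=\; \sigma^2 I_d + \theta\, v v^\top, \qquad \|v\|_2 = 1,
\]

and minimax lower bounds for estimating the leading eigenspace are commonly stated, up to constants, in the projection (sin-Theta) distance as

\[
\inf_{\hat v}\; \sup_{\|v\|_2 = 1}\; \mathbb{E}\,\bigl\|\hat v \hat v^\top - v v^\top\bigr\|_F \;\gtrsim\; \min\!\Bigl(1,\; \sqrt{\frac{\sigma^2(\sigma^2+\theta)\, d}{\theta^2\, n}}\Bigr),
\]

which is the kind of spiked-model benchmark the paper's results are said to generalize; the exact statements in the paper may differ.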