A variable screening procedure via correlation learning was proposed by Fan and
Lv (2008) to reduce dimensionality in sparse ultra-high-dimensional models.
Even when the true model is linear, the marginal regression can be highly
nonlinear. To address this issue, we further extend the correlation learning to
marginal nonparametric learning. Our nonparametric independence screening is
called NIS, a specific member of the sure independence screening class. Several
closely related variable screening procedures are also proposed.
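As a concrete illustration (not the paper's estimator), the following minimal Python sketch shows the marginal screening idea: each covariate is ranked by how well a simple basis expansion of it alone fits the response, and only the top-ranked covariates are kept. A low-degree polynomial basis stands in here for a more flexible nonparametric smoother such as a spline basis; the function name nis_screen, the cutoff n_keep, and all tuning values are illustrative.

```python
import numpy as np

def nis_screen(X, y, n_keep, degree=3):
    """Rank covariates by the fit of a marginal nonparametric regression of y
    on each column of X (a polynomial basis is used as a stand-in for a spline
    basis) and keep the n_keep highest-ranked columns.  A sketch only."""
    p = X.shape[1]
    y_c = y - y.mean()
    utilities = np.empty(p)
    for j in range(p):
        xj = X[:, j]
        # Basis expansion of the j-th covariate (centered powers 1..degree).
        B = np.column_stack([(xj - xj.mean()) ** k for k in range(1, degree + 1)])
        # Least-squares fit of the marginal component.
        coef, *_ = np.linalg.lstsq(B, y_c, rcond=None)
        fitted = B @ coef
        # Marginal utility: empirical L2 norm of the fitted component.
        utilities[j] = np.mean(fitted ** 2)
    # Indices of the covariates with the largest marginal utilities.
    return np.argsort(utilities)[::-1][:n_keep]

# Toy usage: sparse additive signal in the first two of 500 covariates.
rng = np.random.default_rng(0)
n, p = 200, 500
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.5, size=n)
selected = nis_screen(X, y, n_keep=20)
print(sorted(selected.tolist()))
```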
For nonparametric additive models, it is shown that, under some mild technical
conditions, the proposed independence screening methods enjoy a sure screening
property. The extent to which the dimensionality can be reduced by independence
screening is also explicitly quantified. As a methodological extension, an
iterative nonparametric independence screening (INIS) is proposed to enhance
the finite-sample performance for fitting sparse additive models. Simulation
results and a real data analysis demonstrate that the proposed procedure works
well for moderate sample sizes and large dimensions, and performs better than
competing methods.
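The iterative variant can be sketched in the same spirit: alternate between screening the remaining covariates against the current residuals and refitting on the selected set. This is only a rough illustration of the idea behind INIS, not its exact algorithm; marginal_utility, additive_fit, inis, and all tuning constants below are hypothetical, a polynomial basis again replaces splines, and an unpenalized least-squares fit replaces a penalized additive-model fit.

```python
import numpy as np

def marginal_utility(x, r, degree=3):
    # Empirical L2 norm of the least-squares fit of residual r on a basis of x.
    B = np.column_stack([(x - x.mean()) ** k for k in range(1, degree + 1)])
    coef, *_ = np.linalg.lstsq(B, r - r.mean(), rcond=None)
    return np.mean((B @ coef) ** 2)

def additive_fit(X_sel, y, degree=3):
    # Unpenalized least-squares fit of y on the pooled bases of the selected
    # covariates (a crude stand-in for a penalized additive-model fit).
    bases = [np.column_stack([(x - x.mean()) ** k for k in range(1, degree + 1)])
             for x in X_sel.T]
    B = np.column_stack(bases)
    coef, *_ = np.linalg.lstsq(B, y - y.mean(), rcond=None)
    return y.mean() + B @ coef

def inis(X, y, n_keep_per_step=10, max_iter=3):
    # Alternate marginal screening on residuals with refitting on the selected set.
    p = X.shape[1]
    selected = []
    residual = y.copy()
    for _ in range(max_iter):
        candidates = [j for j in range(p) if j not in selected]
        utils = np.array([marginal_utility(X[:, j], residual) for j in candidates])
        top = np.argsort(utils)[::-1][:n_keep_per_step]
        selected.extend(candidates[i] for i in top)
        residual = y - additive_fit(X[:, selected], y)
    return sorted(selected)

# Toy usage: three active covariates out of 400.
rng = np.random.default_rng(1)
n, p = 200, 400
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n)
print(inis(X, y)[:10])
```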