We consider the problem of surrogate sufficient dimension reduction, that is,
estimating the central subspace of a regression model, when the covariates are
contaminated by measurement error. When no measurement error is present, a
likelihood-based dimension reduction method, which maximizes the likelihood of
a Gaussian inverse regression model on the Grassmann manifold, is known to
outperform traditional inverse moment methods.
We propose two likelihood-based estimators for the central subspace in
measurement error settings, which make different adjustments to the observed
surrogates. Both estimators are computed based on maximizing objective
functions on the Grassmann manifold and are shown to consistently recover the
true central subspace. When the central subspace is assumed to depend on only a
few covariates, we further propose to augment the likelihood function with a
penalty term that induces sparsity on the Grassmann manifold to obtain sparse
estimators. The resulting objective function has a closed-form Riemannian gradient,
which facilitates efficient computation of the penalized estimator. We leverage
the state-of-the-art trust region algorithm on the Grassmann manifold to
compute the proposed estimators efficiently. Simulation studies and a data
application demonstrate that the proposed likelihood-based estimators outperform
inverse moment-based estimators in both estimation and variable selection
accuracy.
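To illustrate the kind of Grassmann-manifold optimization the abstract refers to, the sketch below maximizes a toy objective f(G) = tr(GᵀMG) over orthonormal bases G using the closed-form Riemannian gradient (the Euclidean gradient projected onto the tangent space) with a QR retraction. This is a minimal assumed example with simple gradient ascent, not the paper's estimator or its trust-region algorithm; the function name, objective, and step-size choices are all illustrative.

```python
import numpy as np

def grassmann_gradient_ascent(M, d, n_iter=200, step=0.1):
    """Maximize f(G) = tr(G.T @ M @ G) over the Grassmann manifold Gr(p, d).

    The Riemannian gradient is the Euclidean gradient 2*M@G projected onto
    the tangent space at G: rgrad = (I - G G.T) (2 M G). After each ascent
    step, QR orthonormalization retracts the iterate back onto the manifold.
    (Illustrative sketch only; the paper uses a trust-region algorithm.)
    """
    p = M.shape[0]
    rng = np.random.default_rng(0)
    G, _ = np.linalg.qr(rng.standard_normal((p, d)))  # random orthonormal start
    for _ in range(n_iter):
        euclid = 2.0 * M @ G
        rgrad = euclid - G @ (G.T @ euclid)   # project onto tangent space
        G, _ = np.linalg.qr(G + step * rgrad)  # retraction via QR
    return G

# Toy check: for a diagonal M the maximizer spans the top-d eigenvectors,
# so the projection matrix G @ G.T should recover the leading coordinates.
A = np.diag([5.0, 3.0, 1.0, 0.5])
G = grassmann_gradient_ascent(A, d=2)
```

Only the subspace spanned by G is identified (any rotation of the columns gives the same value), which is why convergence is checked through the projection matrix G Gᵀ rather than G itself.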