High-dimensional regression with noisy and missing data: Provable guarantees with nonconvexity
Although the standard formulations of prediction problems involve
fully-observed and noiseless data drawn in an i.i.d. manner, many applications
involve noisy and/or missing data, possibly involving dependence, as well. We
study these issues in the context of high-dimensional sparse linear regression,
and propose novel estimators for the cases of noisy, missing and/or dependent
data. Many standard approaches to noisy or missing data, such as those using
the EM algorithm, lead to optimization problems that are inherently nonconvex,
and it is difficult to establish theoretical guarantees on practical
algorithms. While our approach also involves optimizing nonconvex programs, we
are able to both analyze the statistical error associated with any global
optimum, and more surprisingly, to prove that a simple algorithm based on
projected gradient descent will converge in polynomial time to a small
neighborhood of the set of all global minimizers. On the statistical side, we
provide nonasymptotic bounds that hold with high probability for the cases of
noisy, missing and/or dependent data. On the computational side, we prove that
under the same types of conditions required for statistical consistency, the
projected gradient descent algorithm is guaranteed to converge at a geometric
rate to a near-global minimizer. We illustrate these theoretical predictions
with simulations, showing close agreement with the predicted scalings.

Comment: Published in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics. DOI: http://dx.doi.org/10.1214/12-AOS1018
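The approach the abstract describes — replacing the usual least-squares moments with corrected surrogates and running projected gradient descent over an l1-ball — can be sketched as follows. This is a minimal illustration, not the paper's pseudocode: it assumes entries are missing completely at random with a known observation probability `rho`, and the helper names (`project_l1`, `missing_data_estimates`) are invented for the sketch. The paper also covers additive noise and dependent data, which are not shown here.

```python
import numpy as np

def project_l1(v, radius):
    """Euclidean projection of v onto the l1-ball of the given radius
    (standard sort-and-threshold algorithm)."""
    if np.abs(v).sum() <= radius:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    ks = np.arange(1, len(u) + 1)
    k = np.nonzero(u * ks > css - radius)[0][-1]
    theta = (css[k] - radius) / (k + 1)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def missing_data_estimates(Z, y, rho):
    """Unbiased surrogates (Gamma, gamma) for (X^T X / n, X^T y / n) when
    each entry of X is observed independently with probability rho and
    unobserved entries of Z are filled with zeros (illustrative MCAR setup)."""
    n = Z.shape[0]
    Gamma = Z.T @ Z / (n * rho**2)
    # Diagonal entries are observed with probability rho, not rho^2.
    np.fill_diagonal(Gamma, np.diag(Z.T @ Z) / (n * rho))
    gamma = Z.T @ y / (n * rho)
    return Gamma, gamma

def projected_gradient(Gamma, gamma, radius, step=0.1, iters=1000):
    """Minimize the (possibly nonconvex) quadratic
    (1/2) b^T Gamma b - gamma^T b over the l1-ball: Gamma may be
    indefinite, so the l1 constraint keeps the iterates controlled."""
    beta = np.zeros(len(gamma))
    for _ in range(iters):
        beta = project_l1(beta - step * (Gamma @ beta - gamma), radius)
    return beta
```

In a small simulation with the l1 radius set using the true coefficients (a device also used in the paper's analysis), the iterates settle into a small neighborhood of the true parameter vector, consistent with the abstract's geometric-convergence claim.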