Faster Eigenvector Computation via Shift-and-Invert Preconditioning
We give faster algorithms and improved sample complexities for estimating the
top eigenvector of a matrix $\Sigma$ -- i.e. computing a unit vector $x$ such
that $x^\top \Sigma x \ge (1-\epsilon)\lambda_1(\Sigma)$:
Offline Eigenvector Estimation: Given an explicit $A \in \mathbb{R}^{n \times d}$
with $\Sigma = A^\top A$, we show how to compute an $\epsilon$ approximate top
eigenvector in time $\tilde{O}\left(\left[\mathrm{nnz}(A) + \frac{d \cdot \mathrm{sr}(A)}{\mathrm{gap}^2}\right] \cdot \log 1/\epsilon\right)$
and $\tilde{O}\left(\frac{\mathrm{nnz}(A)^{3/4}(d \cdot \mathrm{sr}(A))^{1/4}}{\sqrt{\mathrm{gap}}} \cdot \log 1/\epsilon\right)$.
Here $\mathrm{nnz}(A)$ is the number of nonzeros in $A$, $\mathrm{sr}(A)$ is the
stable rank, and $\mathrm{gap}$ is the relative eigengap. By separating the
$\mathrm{gap}$ dependence from the $\mathrm{nnz}(A)$ term, our first runtime
improves upon the classical power and Lanczos methods. It also improves prior
work using fast subspace embeddings [AC09, CW13] and stochastic optimization
[Sha15c], giving significantly better dependencies on $\mathrm{sr}(A)$ and
$\epsilon$. Our second running time improves these further when
$\mathrm{nnz}(A) \le \frac{d \cdot \mathrm{sr}(A)}{\mathrm{gap}^2}$.
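For contrast, the classical power method that these runtimes are compared
against can be sketched as follows (a minimal numpy illustration of the
baseline, not the paper's algorithm; the convergence tolerance and iteration
cap are arbitrary choices for the sketch):

```python
import numpy as np

def power_method(A, eps=1e-6, max_iters=1000, seed=0):
    """Estimate the top eigenvector of Sigma = A^T A by power iteration.

    Each iteration costs O(nnz(A)) via two sparse-friendly multiplies,
    but the number of iterations scales like O(log(d/eps) / gap) --
    this coupling of nnz(A) and 1/gap is what the paper's first
    runtime separates.
    """
    rng = np.random.default_rng(seed)
    d = A.shape[1]
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)
    for _ in range(max_iters):
        y = A.T @ (A @ x)            # one multiply by Sigma = A^T A
        y /= np.linalg.norm(y)
        if np.linalg.norm(y - x) < eps:   # Sigma is PSD, so no sign flips
            return y
        x = y
    return x
```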
Online Eigenvector Estimation: Given a distribution $D$ with covariance
matrix $\Sigma$ and a vector $x_0$ which is an $O(\mathrm{gap})$ approximate top
eigenvector for $\Sigma$, we show how to refine $x_0$ to an $\epsilon$ approximation
using $O\left(\frac{\mathrm{var}(D)}{\mathrm{gap} \cdot \epsilon}\right)$ samples from $D$. Here $\mathrm{var}(D)$ is a
natural notion of variance. Combining our algorithm with previous work to
initialize $x_0$, we obtain improved sample complexity and runtime results
under a variety of assumptions on $D$.
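To fix ideas about the streaming setting, a classical online baseline is
Oja's update, which processes one sample from $D$ per step. This sketch is
shown only to illustrate the sample-by-sample model; it is not the paper's
refinement procedure (which is SVRG-based), and the step size is a tuning
assumption:

```python
import numpy as np

def oja_update(x, a, eta):
    """One step of Oja's algorithm for online top-eigenvector estimation.

    x   : current unit-norm estimate
    a   : one sample drawn from D, so E[a a^T] = Sigma
    eta : step size (a tuning assumption, not specified by the paper)
    """
    x = x + eta * a * (a @ x)   # stochastic gradient step on x^T Sigma x
    return x / np.linalg.norm(x)
```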
We achieve our results using a general framework that we believe is of
independent interest. We give a robust analysis of the classic method of
shift-and-invert preconditioning to reduce eigenvector computation to
approximately solving a sequence of linear systems. We then apply fast
stochastic variance reduced gradient (SVRG) based system solvers to achieve our
claims.

Comment: Appearing in ICML 2016. Combination of work in arXiv:1509.05647 and
arXiv:1510.0889
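The core reduction can be sketched as shift-and-invert (inverse) iteration:
pick a shift $\lambda$ slightly above $\lambda_1(\Sigma)$, so that
$(\lambda I - \Sigma)^{-1}$ has a large relative eigengap, and run power
iteration on it, solving one linear system per step. A minimal numpy sketch,
using an exact direct solve where the paper uses SVRG-based approximate
solvers, and with an illustrative (not prescribed) choice of shift:

```python
import numpy as np

def shift_invert_top_eigvec(Sigma, lam, iters=20, seed=0):
    """Power iteration on B = (lam*I - Sigma)^{-1}.

    When lam exceeds lambda_1 by a small multiple of the eigengap,
    B's top eigenvalue dominates its second by a constant factor, so
    few iterations suffice; the work shifts into solving the linear
    system (lam*I - Sigma) y = x at each step, done here exactly but
    in the paper only approximately.
    """
    d = Sigma.shape[0]
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)
    M = lam * np.eye(d) - Sigma
    for _ in range(iters):
        y = np.linalg.solve(M, x)   # one linear-system solve per step
        x = y / np.linalg.norm(y)
    return x
```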