This paper explores variants of the subspace iteration algorithm for
computing approximate invariant subspaces. The standard subspace iteration
approach is revisited and new variants that exploit gradient-type techniques
combined with a Grassmann manifold viewpoint are developed. Both a gradient
method and a conjugate gradient technique are described.
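For orientation, a gradient ascent on the block Rayleigh quotient trace(QᵀAQ) over orthonormal frames Q, viewed on the Grassmann manifold, can be sketched as follows. This is only an illustration: the matrix, the fixed step size, and the QR-based retraction are assumptions for the sketch, not the paper's method (which in particular uses an exact line search rather than a fixed step).

```python
import numpy as np

def grassmann_gradient_step(A, Q, step=0.05):
    """One gradient ascent step for f(Q) = trace(Q^T A Q).
    On the Grassmann manifold the Riemannian gradient is the
    projection of A Q onto the orthogonal complement of span(Q)."""
    G = A @ Q - Q @ (Q.T @ A @ Q)       # (I - Q Q^T) A Q
    Qn, _ = np.linalg.qr(Q + step * G)  # retract back to orthonormal columns
    return Qn

# Hypothetical symmetric test matrix with a clear spectral gap
# between the 2nd and 3rd eigenvalues.
A = np.diag([10.0, 9.0, 1.0, 0.5, 0.1])
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((5, 2)))
for _ in range(500):
    Q = grassmann_gradient_step(A, Q)
# Columns of Q now approximately span the dominant 2-dimensional eigenspace.
```

The fixed step size is chosen ad hoc here; selecting it optimally at each iteration is exactly the role of the exact line search discussed below.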
Convergence of the gradient-based algorithm is analyzed and a few numerical
experiments are reported, indicating that the proposed algorithms are sometimes
superior to a standard Chebyshev-based subspace iteration in terms of the
number of matrix-vector products, while not requiring the estimation of
optimal parameters. An important ingredient in achieving this performance is
the accurate and efficient implementation of an exact line search. In addition,
new convergence proofs are presented for the non-accelerated gradient method,
including locally exponential convergence when the iteration is started in an
O(δ) neighbourhood of the dominant subspace with spectral gap δ.
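The baseline against which the new methods are compared, standard subspace iteration for a dominant invariant subspace, can be sketched as follows (a minimal NumPy illustration; the example matrix and iteration count are assumptions, and no Chebyshev acceleration is applied):

```python
import numpy as np

def subspace_iteration(A, p, iters=200):
    """Standard subspace iteration: repeatedly multiply an orthonormal
    block by A and re-orthonormalize, converging to the invariant
    subspace associated with the p dominant eigenvalues of A."""
    n = A.shape[0]
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
    for _ in range(iters):
        Z = A @ Q               # p matrix-vector products per sweep
        Q, _ = np.linalg.qr(Z)  # re-orthonormalize the block
    return Q

A = np.diag([10.0, 9.0, 1.0, 0.5, 0.1])  # hypothetical example matrix
Q = subspace_iteration(A, 2)
# Columns of Q approximately span the dominant 2-dimensional eigenspace.
```

Counting the matrix-vector products A @ Q accumulated in this loop is the cost measure used in the comparison above.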