Preconditioned gradient iterations are efficient and increasingly popular solvers for very large eigenvalue problems. However, sharp non-asymptotic convergence estimates are known only for the simplest preconditioned eigensolver, namely the preconditioned gradient iteration (or preconditioned inverse iteration) with fixed step size, and these estimates require an ideally scaled preconditioner. In this paper a new sharp convergence estimate is derived for
the preconditioned steepest descent iteration, which combines the preconditioned gradient iteration with the Rayleigh-Ritz procedure for optimal line-search acceleration of convergence. The new estimate always improves on that of the fixed-step-size iteration. The practical importance of the new estimate is that
arbitrarily scaled preconditioners can be used. The Rayleigh-Ritz procedure
implicitly computes the optimal scaling.
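
To make the iteration concrete, the following is a minimal NumPy sketch of one way to realize a preconditioned steepest descent step with Rayleigh-Ritz line search for the smallest eigenpair of a symmetric matrix. The function name psd_smallest_eigenpair, the callable preconditioner apply_prec, and the diagonal test problem are illustrative assumptions, not material taken from the paper.

    import numpy as np
    from scipy.linalg import eigh

    def psd_smallest_eigenpair(A, apply_prec, x0, tol=1e-10, maxit=500):
        # Sketch of preconditioned steepest descent: in each step the
        # preconditioner is applied to the eigenvalue residual, and the
        # Rayleigh-Ritz procedure on span{x, T r} replaces an explicit
        # line search, so the iterate is invariant to the scaling of T.
        x = np.array(x0, dtype=float)
        x /= np.linalg.norm(x)
        rho = x @ (A @ x)                      # Rayleigh quotient
        for _ in range(maxit):
            r = A @ x - rho * x                # eigenvalue residual
            if np.linalg.norm(r) < tol:
                break
            w = apply_prec(r)                  # preconditioned residual T r
            V = np.column_stack([x, w])        # two-dimensional trial space
            Ah, Mh = V.T @ (A @ V), V.T @ V    # projected 2x2 pencil
            vals, vecs = eigh(Ah, Mh)          # Rayleigh-Ritz step
            rho = vals[0]                      # smallest Ritz value
            x = V @ vecs[:, 0]                 # corresponding Ritz vector
            x /= np.linalg.norm(x)
        return rho, x

    # Illustrative use: diagonal test matrix with a Jacobi-type preconditioner.
    n = 1000
    A = np.diag(np.linspace(1.0, 100.0, n))
    lam, x = psd_smallest_eigenpair(A, lambda r: r / np.diag(A), np.random.rand(n))

Because the next iterate is the Ritz vector computed from the subspace spanned by x and the preconditioned residual, multiplying the preconditioner by any positive constant leaves the iterates unchanged, which is the scaling invariance emphasized above.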