Theoretical and Computable Optimal Subspace Expansions for Matrix Eigenvalue Problems
Consider the optimal subspace expansion problem for the matrix eigenvalue
problem $Ax=\lambda x$: {\em Which vector $w$ in the current subspace
$\mathcal{V}$, after being multiplied by $A$, provides an optimal subspace expansion
for approximating a desired eigenvector $x$ in the sense that $x$ has the
smallest angle with the expanded subspace $\mathcal{V}_w=\mathcal{V}+{\rm span}\{Aw\}$, i.e.,
$w_{opt}=\arg\min_{w\in\mathcal{V}}\angle(\mathcal{V}_w,x)$}? This problem
is important as many iterative methods construct nested subspaces that
successively expand $\mathcal{V}$ to $\mathcal{V}_w$. Ye ({\em Linear Algebra
Appl.}, 428 (2008), pp. 911--918) derives an expression of $w_{opt}$ for
$A$ general, but it could not be exploited to construct a computable (nearly)
optimally expanded subspace. He turns to deriving a maximization
characterization of $\cos\angle(\mathcal{V}_w,x)$ for a {\em given} $w\in\mathcal{V}$ when $A$ is Hermitian, but his proof and analysis cannot extend to
the non-Hermitian case. We generalize Ye's maximization characterization to the
general case and find its maximizer. Our main contributions consist of explicit
expressions of $w_{opt}$, $(I-P_{\mathcal{V}})Aw_{opt}$, and the optimally expanded subspace
$\mathcal{V}_{w_{opt}}$ for $A$ general, where $P_{\mathcal{V}}$ is the orthogonal
projector onto $\mathcal{V}$. These results can be fully exploited to obtain
computable optimally expanded subspaces
within the framework of the standard, harmonic, refined, and refined harmonic
Rayleigh--Ritz methods.

Comment: 20 pages, 3 figures
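The quantity at stake above is the angle between a desired eigenvector and a subspace before and after a one-vector expansion. A minimal numerical sketch of that quantity follows; the matrix, subspace, and candidate vector here are illustrative assumptions, not the paper's construction, and the Hermitian test matrix is chosen only so the eigenvector is real:

```python
import numpy as np

def cos_angle(U, x):
    # U: matrix with orthonormal columns spanning a subspace.
    # Returns the cosine of the angle between x and that subspace,
    # i.e. the norm of the orthogonal projection of the unit vector x.
    x = x / np.linalg.norm(x)
    return np.linalg.norm(U.conj().T @ x)

rng = np.random.default_rng(0)
n, k = 60, 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                                  # Hermitian example for simplicity
x = np.linalg.eigh(A)[1][:, -1]                    # a desired eigenvector
V, _ = np.linalg.qr(rng.standard_normal((n, k)))   # orthonormal basis of the current subspace

w = V @ rng.standard_normal(k)                     # an (arbitrary) candidate vector in the subspace
Vw, _ = np.linalg.qr(np.column_stack([V, A @ w]))  # basis of the expanded subspace

# Since the current subspace is contained in the expanded one,
# expansion can only decrease the angle with x.
print(cos_angle(V, x), "<=", cos_angle(Vw, x))
```

The optimal-expansion question is then which choice of the candidate vector makes the second cosine largest.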
Harmonic and Refined Harmonic Shift-Invert Residual Arnoldi and Jacobi--Davidson Methods for Interior Eigenvalue Problems
This paper concerns the harmonic shift-invert residual Arnoldi (HSIRA) and
Jacobi--Davidson (HJD) methods as well as their refined variants RHSIRA and
RHJD for the interior eigenvalue problem. Each method needs to solve an inner
linear system to expand the subspace successively. When the linear systems are
solved only approximately, we are led to the inexact methods. We prove that the
inexact HSIRA, RHSIRA, HJD and RHJD methods mimic their exact counterparts well
when the inner linear systems are solved with only low or modest accuracy. We
show that (i) the exact HSIRA and HJD expand subspaces better than the exact
SIRA and JD and (ii) the exact RHSIRA and RHJD expand subspaces better than the
exact HSIRA and HJD. Based on the theory, we design stopping criteria for inner
solves. To be practical, we present restarted HSIRA, HJD, RHSIRA and RHJD
algorithms. Numerical results demonstrate that these algorithms are much more
efficient than the restarted standard SIRA and JD algorithms and furthermore
the refined harmonic algorithms outperform the harmonic ones very
substantially.

Comment: 15 pages, 4 figures
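The harmonic methods above differ from their standard counterparts in how approximate eigenpairs are extracted from the search subspace. A minimal sketch of harmonic Rayleigh--Ritz extraction follows, assuming the common formulation with $W=(A-\tau I)V$ and the projected generalized eigenproblem $W^{*}Wg=\mu\,W^{*}Vg$; it is a schematic illustration, not code from the paper:

```python
import numpy as np

def harmonic_ritz(A, V, tau):
    # V: orthonormal basis of the search subspace; tau: target inside the spectrum.
    # Harmonic Rayleigh-Ritz: with W = (A - tau*I) V, solve the projected
    # generalized eigenproblem  W^* W g = mu W^* V g  and keep the pair with
    # smallest |mu| = |theta - tau|, i.e. the harmonic Ritz value nearest tau.
    W = A @ V - tau * V
    M = W.conj().T @ W
    N = W.conj().T @ V
    mu, G = np.linalg.eig(np.linalg.solve(N, M))   # assumes W^* V is nonsingular
    j = np.argmin(np.abs(mu))
    theta = tau + mu[j]                            # harmonic Ritz value
    y = V @ G[:, j]                                # harmonic Ritz vector
    return theta, y / np.linalg.norm(y)
```

If the wanted eigenvector lies exactly in the subspace, this extraction recovers the eigenpair exactly, which is the property that makes it preferable to standard Rayleigh--Ritz for interior eigenvalues.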
On Inner Iterations in the Shift-Invert Residual Arnoldi Method and the Jacobi--Davidson Method
Using a new analysis approach, we establish a general convergence theory of
the Shift-Invert Residual Arnoldi (SIRA) method for computing a simple
eigenvalue nearest to a given target and the associated eigenvector.
In SIRA, a subspace expansion vector at each step is obtained by solving a
certain inner linear system. We prove that the inexact SIRA method mimics the
exact SIRA well, that is, the former uses almost the same number of outer
iterations to achieve convergence as the latter does, provided that all the
inner linear systems are iteratively solved with {\em low} or {\em modest}
accuracy during outer iterations. Based on the theory, we design practical
stopping criteria for inner solves. Our analysis is on the one-step expansion
of the subspace, and the approach applies to the Jacobi--Davidson (JD) method
with a fixed target as well; a similar general convergence theory is obtained for it.
Numerical experiments confirm our theory and demonstrate that the inexact SIRA
and JD are similarly effective and are considerably superior to the inexact
shift-invert Arnoldi (SIA) method.

Comment: 20 pages, 8 figures
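One outer SIRA iteration as described above can be sketched as follows, assuming the usual formulation: Rayleigh--Ritz extraction on the current subspace, residual of the selected Ritz pair, and an inner system shifted by the target. This is a hedged illustration, not the paper's algorithm: a direct solve stands in for the inexact iterative inner solver whose stopping accuracy the paper analyzes.

```python
import numpy as np

def sira_step(A, V, sigma):
    # One outer iteration of (exact) SIRA for the eigenvalue nearest sigma.
    # 1) Rayleigh-Ritz extraction on span(V); keep the Ritz pair nearest sigma.
    H = V.conj().T @ (A @ V)
    thetas, S = np.linalg.eig(H)
    j = np.argmin(np.abs(thetas - sigma))
    theta, y = thetas[j], V @ S[:, j]
    y = y / np.linalg.norm(y)
    # 2) Residual of the Ritz pair.
    r = A @ y - theta * y
    # 3) Inner linear system (A - sigma I) u = r; in inexact SIRA this is an
    #    iterative solve stopped at only low or modest accuracy.
    u = np.linalg.solve(A - sigma * np.eye(A.shape[0]), r)
    # 4) Orthogonalize the expansion vector against V and append it.
    u = u - V @ (V.conj().T @ u)
    u = u / np.linalg.norm(u)
    return np.column_stack([V, u]), theta
```

Iterating this step on a test matrix drives the selected Ritz value toward the eigenvalue nearest the target; replacing step 3 by an iterative solver with the paper's stopping criteria gives the inexact variant.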