A black-box rational Arnoldi variant for Cauchy-Stieltjes matrix functions
Rational Arnoldi is a powerful method for approximating functions of large sparse matrices times a vector. The selection of asymptotically optimal parameters for this method is crucial for its fast convergence. We present and investigate a novel strategy for automated parameter selection when the function to be approximated is of Cauchy-Stieltjes (or Markov) type, such as the matrix square root or the logarithm. The performance of this approach is demonstrated by numerical examples involving symmetric and nonsymmetric matrices. These examples suggest that our black-box method performs at least as well as, and typically better than, the standard rational Arnoldi method with parameters manually optimized for a given matrix.
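As background for why such functions lead to shifted linear solves, the sketch below is a minimal illustration (not the paper's rational Arnoldi method) of the Cauchy-Stieltjes structure: the inverse square root has the integral representation z^{-1/2} = (2/pi) * ∫_0^∞ dt/(t^2 + z), so f(A)b can be approximated by a quadrature whose every node costs one shifted solve. All names and parameters here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: approximate f(A) b for the Cauchy-Stieltjes
# function f(z) = z^{-1/2} via its integral representation
#   z^{-1/2} = (2/pi) * int_0^inf 1/(t^2 + z) dt,
# discretized by a crude midpoint rule. Each quadrature node requires one
# shifted solve (A + t^2 I)^{-1} b, which is the structure rational Krylov
# methods exploit. This is NOT the black-box method of the abstract.

rng = np.random.default_rng(0)
n = 50
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(1.0, 10.0, n)) @ Q.T   # SPD test matrix
b = rng.standard_normal(n)

# naive midpoint quadrature on [0, T]; accuracy is limited by the tail cutoff
T, m = 200.0, 4000
t = (np.arange(m) + 0.5) * (T / m)
w = np.full(m, T / m)

approx = np.zeros(n)
for ti, wi in zip(t, w):
    approx += wi * np.linalg.solve(A + ti**2 * np.eye(n), b)
approx *= 2.0 / np.pi

# reference: A^{-1/2} b via an eigendecomposition of the small test matrix
evals, evecs = np.linalg.eigh(A)
exact = evecs @ ((evecs.T @ b) / np.sqrt(evals))

print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```

Rational Krylov methods replace this brute-force quadrature with a small search space built from a few well-chosen shifts, which is where the parameter selection discussed in the abstract enters.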
Some observations on weighted GMRES
We investigate the convergence of the weighted GMRES method for solving linear systems. Two different weighting variants are compared with unweighted GMRES for three model problems, giving a phenomenological explanation of cases where weighting improves convergence, and of a case where weighting has no effect on the convergence. We also present new alternative implementations of the weighted Arnoldi algorithm which may be favorable in terms of computational complexity, and examine stability issues connected with these implementations. Two implementations of weighted GMRES are compared for a large number of examples. We find that weighted GMRES may outperform unweighted GMRES for some problems, but more often this method is not competitive with other Krylov subspace methods like GMRES with deflated restarting or BiCGStab, in particular when a preconditioner is used.
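The core ingredient of weighted GMRES is an Arnoldi process run in a weighted inner product. The following sketch, under the assumption of a diagonal weight matrix D = diag(d), shows one plausible form of such a process (the function name and structure are illustrative, not the paper's implementation): the resulting basis is D-orthonormal and still satisfies the Arnoldi relation.

```python
import numpy as np

# Illustrative sketch of a weighted Arnoldi process: classical Arnoldi with
# modified Gram-Schmidt, but using the inner product <x, y>_D = y.T @ (d * x)
# for a positive weight vector d. The basis V is then D-orthonormal
# (V.T @ D @ V = I) rather than orthonormal.

def weighted_arnoldi(A, b, d, k):
    n = len(b)
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    V[:, 0] = b / np.sqrt(b @ (d * b))          # normalize in the D-norm
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):                   # D-orthogonalize against V
            H[i, j] = V[:, i] @ (d * w)
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.sqrt(w @ (d * w))
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

rng = np.random.default_rng(1)
n, k = 30, 8
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)
d = rng.uniform(0.5, 2.0, n)                     # weights, D = diag(d)

V, H = weighted_arnoldi(A, b, d, k)
# Arnoldi relation A V_k = V_{k+1} H holds, and V is D-orthonormal:
print(np.linalg.norm(A @ V[:, :k] - V @ H))
print(np.linalg.norm(V.T @ (d[:, None] * V) - np.eye(k + 1)))
```

The extra multiplications by d are the kind of cost the alternative implementations mentioned in the abstract aim to reduce.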
A flexible and adaptive Simpler GMRES with deflated restarting for shifted linear systems
In this paper, two efficient iterative algorithms based on the simpler GMRES method are proposed for solving shifted linear systems. To make full use of the shifted structure, the proposed algorithms, which employ a deflated restarting strategy and flexible preconditioning, can significantly reduce the number of matrix-vector products and the elapsed CPU time. Numerical experiments are reported to illustrate the performance and effectiveness of the proposed algorithms.
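The "shifted structure" these methods exploit is the shift invariance of Krylov subspaces: K_k(A, b) = K_k(A + sigma*I, b), so a single Arnoldi basis serves every shift, with only the small Hessenberg matrix changing. The sketch below verifies this relation numerically; it is a generic illustration under assumed test data, not the paper's algorithm.

```python
import numpy as np

# Illustrative check of the shift-invariance property behind shifted Krylov
# solvers: if A V_k = V_{k+1} Hbar (the Arnoldi relation), then
# (A + sigma I) V_k = V_{k+1} (Hbar + sigma E), where E stacks the k x k
# identity over a zero row. One basis, many shifts.

def arnoldi(A, b, k):
    n = len(b)
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

rng = np.random.default_rng(2)
n, k, sigma = 40, 10, 3.7
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

V, H = arnoldi(A, b, k)
E = np.vstack([np.eye(k), np.zeros((1, k))])
# Shifted Arnoldi relation: residual of (A + sigma I) V_k = V_{k+1} (H + sigma E)
print(np.linalg.norm((A + sigma * np.eye(n)) @ V[:, :k] - V @ (H + sigma * E)))
```

Because only the projected k x k problem changes per shift, the dominant cost (matrix-vector products with A) is amortized over all shifts, which is what the abstract's reduction in matrix-vector products rests on.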
Restarted Hessenberg method for solving shifted nonsymmetric linear systems
It is known that the restarted full orthogonalization method (FOM) outperforms the restarted generalized minimum residual (GMRES) method in several circumstances for solving shifted linear systems when the shifts are handled simultaneously, and many variants of both have been proposed to enhance their performance. We show that another restarted method, the restarted Hessenberg method [M. Heyouni, M\'ethode de Hessenberg G\'en\'eralis\'ee et Applications, Ph.D. Thesis, Universit\'e des Sciences et Technologies de Lille, France, 1996], based on the Hessenberg procedure, can be employed effectively and can accelerate the convergence rate with respect to the number of restarts. Theoretical analysis shows that the residuals of the restarted shifted Hessenberg method remain collinear with each other. Extensive numerical experiments, including recent popular applications to time-fractional differential equations, show that the proposed algorithm often requires considerably less CPU time to converge than the earlier established restarted shifted FOM, the weighted restarted shifted FOM, and other popular shifted iterative solvers based on short-term vector recurrences.
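The collinearity property referred to above can be seen already in standard FOM: its residual b - A x_k is a scalar multiple of the next Arnoldi basis vector v_{k+1}, which is why residuals for different shifts stay collinear and restarts can be shared across shifts. The sketch below checks this for plain FOM on random test data; it is a generic illustration, not the Hessenberg-based method of the abstract.

```python
import numpy as np

# Illustrative check of FOM residual collinearity: the FOM residual
# b - A x_k equals -h_{k+1,k} (e_k^T y) v_{k+1}, i.e. it is a multiple of
# the next Arnoldi vector. Shifted FOM variants rely on this to keep the
# residuals of all shifted systems collinear across restarts.

def arnoldi(A, b, k):
    n = len(b)
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

rng = np.random.default_rng(3)
n, k = 40, 10
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

V, H = arnoldi(A, b, k)
beta = np.linalg.norm(b)
y = np.linalg.solve(H[:k, :k], beta * np.eye(k)[:, 0])  # FOM: H_k y = beta e_1
x = V[:, :k] @ y
r = b - A @ x

# |cos(angle)| between r and v_{k+1} should be 1 up to roundoff
print(abs(r @ V[:, k]) / np.linalg.norm(r))
```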