Implementation of an adaptive algorithm for Richardson's method

Abstract

We discuss the implementation of an adaptive algorithm proposed by one of us. The algorithm is a hybrid of the GMRES method and Richardson's method. Richardson's method (RM) depends on a set of parameters that are computed by minimizing the L2 norm of a polynomial over the convex hull of eigenvalues. Execution of GMRES yields not only an approximate solution but also the approximate convex hull. RM is used to avoid storing and working with the large number of vectors that GMRES often requires, which also makes it advantageous for the solution of large problems. We consider several test problems and compare our algorithm primarily with the conjugate-gradient-squared algorithm, but also with GMRES and with CG applied to the normal equations. For many test problems our algorithm takes roughly 50 percent more work than the conjugate-gradient-squared algorithm, although if the matrix is either preconditioned or indefinite, our algorithm is more efficient. However, our algorithm currently imposes an undesirable burden on the user, who is expected to tune a variety of numerical parameters, such as the number of RM steps, in order to enhance performance; the values we suggest are only empirical.
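To make the hybrid idea concrete, the following minimal sketch (not the authors' code) estimates the spectrum with a short Arnoldi/GMRES run and then applies cheap Richardson sweeps. It assumes the Ritz values are real and positive and simply uses their reciprocals as Richardson parameters, rather than the polynomial minimization over the approximate convex hull described in the abstract; the function and variable names are illustrative only.

    # Minimal sketch of a hybrid GMRES/Richardson cycle (illustrative, not the authors' algorithm).
    # Assumption: the Ritz values of the short Arnoldi run are (nearly) real and positive,
    # so Richardson parameters can be taken as their reciprocals.
    import numpy as np

    def arnoldi(A, r0, m):
        """m steps of Arnoldi; returns the orthonormal basis Q and (m+1) x m Hessenberg H."""
        n = r0.size
        Q = np.zeros((n, m + 1))
        H = np.zeros((m + 1, m))
        Q[:, 0] = r0 / np.linalg.norm(r0)
        for j in range(m):
            w = A @ Q[:, j]
            for i in range(j + 1):          # modified Gram-Schmidt orthogonalization
                H[i, j] = Q[:, i] @ w
                w -= H[i, j] * Q[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            if H[j + 1, j] < 1e-14:         # lucky breakdown: Krylov space is invariant
                return Q[:, :j + 1], H[:j + 2, :j + 1]
            Q[:, j + 1] = w / H[j + 1, j]
        return Q, H

    def hybrid_gmres_richardson(A, b, x0, m=10, cycles=20, tol=1e-8):
        """Estimate the spectrum via Arnoldi, then run Richardson sweeps with fixed parameters."""
        x = x0.copy()
        r = b - A @ x
        _, H = arnoldi(A, r, m)
        ritz = np.linalg.eigvals(H[:-1, :])   # Ritz values approximate the spectrum of A
        alphas = 1.0 / np.real(ritz)          # Richardson parameters (assumed real here)
        for _ in range(cycles):
            for alpha in alphas:              # one Richardson step per parameter, used cyclically
                r = b - A @ x
                if np.linalg.norm(r) < tol * np.linalg.norm(b):
                    return x
                x = x + alpha * r
        return x

    # Usage on a small nonsymmetric test matrix with eigenvalues clustered near 4
    rng = np.random.default_rng(0)
    n = 200
    A = 4 * np.eye(n) + np.diag(np.ones(n - 1), 1) + 0.1 * rng.standard_normal((n, n)) / np.sqrt(n)
    b = rng.standard_normal(n)
    x = hybrid_gmres_richardson(A, b, np.zeros(n))
    print("relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))

The Richardson sweeps require only one matrix-vector product and one vector update per step, which is the storage advantage over full GMRES noted above; the quality of the parameters, however, depends entirely on how well the Ritz values capture the spectrum.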