
Diagonal preconditioned conjugate gradient algorithm for unconstrained optimization

Abstract

Nonlinear conjugate gradient (CG) methods have been widely used to solve unconstrained optimization problems. They are well suited to large-scale problems because of their low memory requirements and low computational cost. In this paper, a new diagonal preconditioned conjugate gradient (PRECG) algorithm is designed, motivated by the fact that a preconditioner can greatly enhance the performance of the CG method. Under mild conditions, the algorithm is shown to be globally convergent for strongly convex functions. Numerical results are presented to show that the new diagonal PRECG method works better than the standard CG method.
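To make the general idea concrete, the following is a minimal sketch of a nonlinear CG iteration combined with a diagonal preconditioner. It is an illustrative assumption-based example, not the PRECG algorithm proposed in the paper: the callables `f`, `grad`, and `diag_fn` (a user-supplied positive approximation of the Hessian diagonal), the Armijo backtracking line search, and the preconditioned Polak-Ribiere coefficient are all generic choices made here for illustration.

```python
import numpy as np

def diag_precond_cg(f, grad, x0, diag_fn, tol=1e-6, max_iter=500):
    """Generic nonlinear CG with a diagonal preconditioner (illustrative sketch).

    diag_fn(x) is assumed to return a positive vector d(x) approximating the
    Hessian diagonal, so the preconditioner is M(x) = diag(d(x))^{-1}.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = np.maximum(diag_fn(x), 1e-12)   # safeguard positivity of the diagonal
    z = g / d                           # preconditioned gradient M*g
    p = -z                              # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search along the descent direction p
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g.dot(p)
        while f(x + alpha * p) > fx + c * alpha * slope:
            alpha *= rho
        x_new = x + alpha * p
        g_new = grad(x_new)
        d_new = np.maximum(diag_fn(x_new), 1e-12)
        z_new = g_new / d_new
        # Preconditioned Polak-Ribiere coefficient, truncated at zero
        beta = max(0.0, g_new.dot(z_new - z) / g.dot(z))
        p = -z_new + beta * p
        if g_new.dot(p) >= 0:           # restart if p is not a descent direction
            p = -z_new
        x, g, z = x_new, g_new, z_new
    return x
```

As a simple usage check, one could minimize a strongly convex quadratic f(x) = 0.5*x'Ax - b'x with `grad(x) = A @ x - b` and `diag_fn(x) = np.diag(A)`, in which case the diagonal scaling plays the role of a Jacobi-type preconditioner.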
