    Convergence conditions, line search algorithms and trust region implementations for the Polak-Ribière conjugate gradient method

    We study globally convergent implementations of the Polak–Ribière (PR) conjugate gradient method for the unconstrained minimization of continuously differentiable functions. More specifically, we first state sufficient convergence conditions, which imply that limit points produced by the PR iteration are stationary points of the objective function, and we prove that these conditions are satisfied, in particular, when the objective function has some generalized convexity property and exact line searches are performed. In the general case, we show that the convergence conditions can be enforced by means of various inexact line search schemes where, in addition to the usual acceptance criteria, further conditions are imposed on the stepsize. Then we define a new trust region implementation, which is compatible with the behavior of the PR method in the quadratic case, and may perform different line searches depending on the norm of the search direction. In this framework, we also show that it is possible to define globally convergent modified PR iterations that permit exact line searches at every iteration. Finally, we report the results of numerical experiments on a set of large problems.
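
    For reference, a minimal sketch of the PR iteration with a simple Armijo backtracking line search is given below. It illustrates the underlying method only: the PR+ clipping of the coefficient beta_k = g_k.(g_k - g_{k-1}) / ||g_{k-1}||^2, the restart safeguard, and the Armijo rule are choices made for this sketch, not the acceptance criteria or trust region scheme studied in the paper; the function names are likewise hypothetical.

import numpy as np

def pr_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Minimize f from x0 with a Polak-Ribiere (PR+) conjugate gradient sketch."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0.0:                     # safeguard: restart if d is not a descent direction
            d = -g
        # Armijo backtracking line search along d (a stand-in for the paper's acceptance criteria).
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        for _ in range(50):
            if f(x + alpha * d) <= fx + c * alpha * slope:
                break
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere coefficient; clipping at zero gives the common PR+ safeguard
        # (an assumption of this sketch, not the paper's stepsize conditions).
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

if __name__ == "__main__":
    # Small usage example on the Rosenbrock test function; the iterate should approach [1., 1.].
    print(pr_conjugate_gradient(rosenbrock, rosenbrock_grad, [-1.2, 1.0]))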