
    A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization

    Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require matrix storage. In this paper, we propose a general form of three-term conjugate gradient methods that always generates a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed general method. Moreover, we present a specific three-term conjugate gradient method based on the multi-step quasi-Newton method. Finally, some numerical results for the proposed method are given.
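
    To illustrate the general idea, below is a minimal sketch of one well-known three-term direction of the form d_k = -g_k + beta_k d_{k-1} - theta_k y_{k-1} (a Zhang-Zhou-Li-style PRP variant), which satisfies g_k^T d_k = -||g_k||^2 by construction; the paper's specific multi-step quasi-Newton-based method may differ.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Three-term CG sketch: d_k = -g_k + beta_k*d_{k-1} - theta_k*y_{k-1}.
    This choice of beta and theta yields g_k^T d_k = -||g_k||^2 exactly,
    i.e. the sufficient descent property holds independent of line search."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking Armijo line search (an illustrative choice)
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = g @ g                   # > 0 since ||g|| >= tol here
        beta = (g_new @ y) / denom      # PRP-type parameter
        theta = (g_new @ d) / denom     # third-term coefficient
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# usage: minimize an ill-conditioned quadratic
A = np.diag([1.0, 10.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
print(three_term_cg(f, grad, np.ones(3)))
```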

    An Advanced Conjugate Gradient Training Algorithm Based on a Modified Secant Equation


    Diagonal preconditioned conjugate gradient algorithm for unconstrained optimization

    Nonlinear conjugate gradient (CG) methods have been widely used for solving unconstrained optimization problems. They are well-suited for large-scale optimization problems due to their low memory requirements and low computational cost. In this paper, a new diagonal preconditioned conjugate gradient (PRECG) algorithm is designed, motivated by the fact that a preconditioner can greatly enhance the performance of the CG method. Under mild conditions, it is shown that the algorithm is globally convergent for strongly convex functions. Numerical results are presented to show that the new diagonal PRECG method works better than the standard CG method.
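
    The sketch below shows the mechanics of diagonal preconditioning in nonlinear CG: the search direction uses D^{-1} g instead of g. The abstract does not specify the diagonal update rule, so a simple Barzilai-Borwein-type scalar scaling stands in as a placeholder here; the paper's actual diagonal update will differ.

```python
import numpy as np

def precond_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Diagonally preconditioned nonlinear CG sketch.
    D is a positive diagonal (stored as a vector) approximating the Hessian;
    here it is refreshed with a scalar BB-type scaling, a placeholder rule."""
    x = x0.astype(float)
    g = grad(x)
    D = np.ones_like(x)              # diagonal preconditioner, starts as identity
    d = -g / D
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        s = alpha * d
        x = x + s
        g_new = grad(x)
        y = g_new - g
        if s @ y > 1e-12:
            # scalar BB scaling broadcast to the diagonal (placeholder update)
            D = np.full_like(x, (y @ y) / (s @ y))
        # preconditioned Fletcher-Reeves parameter
        beta = (g_new @ (g_new / D)) / (g @ (g / D))
        d = -g_new / D + beta * d
        if g_new @ d >= 0:           # safeguard: restart if descent is lost
            d = -g_new / D
        g = g_new
    return x
```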

    Some Unconstrained Optimization Methods

    Although it is a very old topic, unconstrained optimization remains an active area of research for many scientists. Today, the results of unconstrained optimization are applied in different branches of science, as well as in practice more generally. Here, we present line search techniques. Further, in this chapter we consider some unconstrained optimization methods, together with some contemporary results in this area.
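
    As a concrete example of the line search techniques such a chapter surveys, here is a standard backtracking scheme enforcing the Armijo (sufficient decrease) condition f(x + a d) <= f(x) + c a g^T d; this is a textbook rule, shown only for illustration.

```python
import numpy as np

def backtracking_armijo(f, x, g, d, alpha0=1.0, c=1e-4, rho=0.5, max_backtracks=50):
    """Shrink the step alpha by rho until the Armijo condition holds.
    g is the gradient at x, d a descent direction (g @ d < 0)."""
    fx, slope = f(x), g @ d
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + c * alpha * slope:
            return alpha
        alpha *= rho
    return alpha

# usage on f(x) = x^T x along the steepest descent direction
f = lambda x: x @ x
x = np.array([1.0, 1.0])
g = 2 * x
print(backtracking_armijo(f, x, g, -g))
```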

    Unconstrained Optimization Methods: Conjugate Gradient Methods and Trust-Region Methods

    Here, we consider two important classes of unconstrained optimization methods: conjugate gradient methods and trust-region methods. These two classes of methods remain of lasting interest. First, we consider conjugate gradient methods and illustrate the practical behavior of several of them. Then, we study trust-region methods. For both classes, we analyze some recent results.
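
    To make the trust-region mechanics concrete, below is a basic loop using the Cauchy point (the minimizer of the quadratic model along -g within the trust region). This is the standard textbook scheme (cf. Nocedal and Wright), not any particular method from the chapter.

```python
import numpy as np

def trust_region_cauchy(f, grad, hess, x0, delta0=1.0, delta_max=10.0,
                        eta=0.1, tol=1e-6, max_iter=500):
    """Trust-region sketch: take the Cauchy step, compare actual vs.
    predicted decrease of the quadratic model, and adjust the radius."""
    x, delta = x0.astype(float), delta0
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        gBg = g @ B @ g
        # Cauchy step: p = -tau * (delta/||g||) * g
        tau = 1.0 if gBg <= 0 else min(gnorm**3 / (delta * gBg), 1.0)
        p = -tau * (delta / gnorm) * g
        pred = -(g @ p + 0.5 * p @ B @ p)     # predicted model decrease
        ared = f(x) - f(x + p)                # actual decrease
        rho = ared / pred if pred > 0 else 0.0
        if rho < 0.25:
            delta *= 0.25                     # shrink region on poor agreement
        elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
            delta = min(2 * delta, delta_max) # expand after a full, good step
        if rho > eta:
            x = x + p                         # accept the step
    return x
```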

    Effective Modified Hybrid Conjugate Gradient Method for Large-Scale Symmetric Nonlinear Equations

    In this paper, we propose a hybrid conjugate gradient method based on a convex combination of the FR and PRP conjugate gradient methods for solving large-scale symmetric nonlinear equations via Andrei's approach with a nonmonotone line search. A formula for obtaining the convex parameter, derived from the Newton direction and our proposed direction, is also given. Under appropriate conditions, global convergence is established. The reported numerical results show that the proposed method is very promising.
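
    The convex-combination idea is easy to state in code: beta_k = (1 - theta_k) beta_FR + theta_k beta_PRP with theta_k in [0, 1]. In the paper, theta_k is derived from the Newton direction; in the sketch below it is simply an input clipped to [0, 1] for illustration.

```python
import numpy as np

def hybrid_beta(g_new, g_old, theta):
    """Convex combination of the FR and PRP parameters.
    theta is treated as given here; the paper's formula for it,
    derived from the Newton direction, is not reproduced."""
    theta = min(max(theta, 0.0), 1.0)             # keep the combination convex
    denom = g_old @ g_old
    beta_fr = (g_new @ g_new) / denom             # Fletcher-Reeves
    beta_prp = (g_new @ (g_new - g_old)) / denom  # Polak-Ribiere-Polyak
    return (1.0 - theta) * beta_fr + theta * beta_prp

# usage: next search direction d = -g_new + beta * d_old
g_old = np.array([1.0, 2.0])
g_new = np.array([0.5, -1.0])
d_old = np.array([-1.0, -2.0])
beta = hybrid_beta(g_new, g_old, theta=0.5)
print(beta, -g_new + beta * d_old)
```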