
    A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization

    Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. In this paper, we propose a general form of three-term conjugate gradient methods that always generate a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed general method. Moreover, we present a specific three-term conjugate gradient method based on the multi-step quasi-Newton method. Finally, some numerical results for the proposed method are given.
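The paper's general form is not reproduced in the abstract, but a well-known three-term direction with the same sufficient descent property is the PRP-based direction d_k = -g_k + beta_k d_{k-1} - theta_k y_{k-1}, which satisfies d_k^T g_k = -||g_k||^2 by construction. A minimal sketch of such an iteration (an illustrative instance, not the paper's method; the Armijo backtracking line search and the test function are assumptions):

```python
import numpy as np

def three_term_cg(f, grad, x0, max_iter=1000, tol=1e-8):
    """Generic three-term CG sketch whose direction always satisfies
    d_k^T g_k = -||g_k||^2 (sufficient descent); illustrative only,
    not the specific method proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (an assumed choice)
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g @ g
        beta = (g_new @ y) / gg    # PRP parameter
        theta = (g_new @ d) / gg   # third-term coefficient
        # Three-term direction: the beta and theta contributions to
        # d^T g cancel, leaving d^T g = -||g||^2 exactly.
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x
```

The cancellation is why the descent property holds independently of the line search: g_k^T d_k = -||g_k||^2 + beta_k g_k^T d_{k-1} - theta_k g_k^T y_{k-1}, and the last two terms are equal and opposite by the choice of theta_k.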

    Global Convergence of a Nonlinear Conjugate Gradient Method

    A modified PRP nonlinear conjugate gradient method for solving unconstrained optimization problems is proposed. An important property of the proposed method is that the sufficient descent property is guaranteed independently of any line search. Using the Wolfe line search, the global convergence of the proposed method is established for nonconvex minimization. Numerical comparisons with the VPRP, CG-DESCENT, and DL+ methods show that the proposed method is effective and promising.
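The abstract does not state its modified PRP formula. For orientation, the classical PRP parameter and the nonnegative "PRP+" safeguard (a standard globalization device for nonconvex problems, not the paper's modification) can be sketched as:

```python
import numpy as np

def beta_prp(g_new, g):
    # Classical Polak-Ribiere-Polyak (PRP) parameter:
    # beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
    return (g_new @ (g_new - g)) / (g @ g)

def beta_prp_plus(g_new, g):
    # Nonnegative restriction ("PRP+"), a classical safeguard for
    # nonconvex minimization; the paper's variant is different and
    # additionally guarantees sufficient descent.
    return max(0.0, beta_prp(g_new, g))
```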

    Solving Unconstrained Optimization Problems by a New Conjugate Gradient Method with Sufficient Descent Property

    Some conjugate gradient methods have strong convergence properties but are numerically unstable, and conversely. Improving these methods to obtain both strong convergence and numerical stability is therefore an attractive goal. In this paper, a new hybrid conjugate gradient method is introduced, based on Fletcher's Conjugate Descent (CD) formula, which has strong convergence, and the Liu and Storey (LS) formula, which has good numerical performance. The new directions satisfy the sufficient descent property independently of the line search. Under some mild assumptions, the global convergence of the new hybrid method is proved. Numerical results on unconstrained CUTEst test problems show that the new algorithm is very robust and efficient.
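The hybrid formula itself is not given in the abstract, but the two classical parameters being combined are standard; a sketch of both (the hybridization scheme is paper-specific and not shown):

```python
import numpy as np

def beta_cd(g_new, g, d):
    # Fletcher's Conjugate Descent (CD) parameter:
    # beta = ||g_{k+1}||^2 / (-d_k^T g_k)
    return (g_new @ g_new) / (-(d @ g))

def beta_ls(g_new, g, d):
    # Liu-Storey (LS) parameter:
    # beta = g_{k+1}^T (g_{k+1} - g_k) / (-d_k^T g_k)
    return (g_new @ (g_new - g)) / (-(d @ g))
```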

    An extended Dai-Liao conjugate gradient method with global convergence for nonconvex functions

    Using an extension of some previously proposed modified secant equations in the Dai-Liao approach, a modified nonlinear conjugate gradient method is proposed. Notably, the method employs objective function values in addition to gradient information, and it satisfies the sufficient descent property for proper choices of its parameter. Global convergence of the method is established without any convexity assumption on the objective function. Results of numerical comparisons are reported; they demonstrate the efficiency of the proposed method in the sense of the Dolan-Moré performance profile.
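For context, the classical Dai-Liao parameter, the starting point the paper extends (its modified secant-equation version is not reproduced in the abstract), can be sketched as:

```python
import numpy as np

def beta_dai_liao(g_new, d, s, y, t=0.1):
    """Classical Dai-Liao conjugacy parameter (not the paper's extension):
    beta = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and t > 0 is the
    Dai-Liao parameter (t = 0.1 here is an illustrative choice)."""
    return (g_new @ y - t * (g_new @ s)) / (d @ y)
```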

    Two Modified Three-Term Type Conjugate Gradient Methods and Their Global Convergence for Unconstrained Optimization

    Two modified three-term type conjugate gradient algorithms which satisfy both the descent condition and the Dai-Liao type conjugacy condition are presented for unconstrained optimization. The first algorithm is a modification of the Hager-Zhang type algorithm, constructed so that the search direction is descent and satisfies the Dai-Liao type conjugacy condition. The second, a simple three-term type conjugate gradient method, generates sufficient descent directions at every iteration; moreover, this property is independent of the steplength line search. The algorithms can also be considered as modifications of the MBFGS method, but with a different z_k. Under some mild conditions, the given methods are globally convergent for general functions with the Wolfe line search. Numerical experiments show that the proposed methods are very robust and efficient.
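The sufficient descent condition shared by both algorithms, d_k^T g_k <= -c ||g_k||^2 for some c > 0, is easy to check numerically; a minimal helper (the constant c is illustrative, not taken from the paper):

```python
import numpy as np

def satisfies_sufficient_descent(d, g, c=1e-4):
    # Check the sufficient descent condition d^T g <= -c * ||g||^2,
    # the property both algorithms in the abstract are built to satisfy.
    return d @ g <= -c * (g @ g)
```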