357 research outputs found

    Global Convergence of a Nonlinear Conjugate Gradient Method

    A modified PRP nonlinear conjugate gradient method for solving unconstrained optimization problems is proposed. The important property of the proposed method is that the sufficient descent property is guaranteed independently of any line search. Using the Wolfe line search, the global convergence of the proposed method is established for nonconvex minimization. Numerical results show that the proposed method is effective and promising compared with the VPRP, CG-DESCENT, and DL+ methods.
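    The basic structure of a PRP-type conjugate gradient iteration can be sketched as follows. This is a hypothetical minimal illustration, not the paper's exact algorithm: it uses the common PRP+ safeguard beta = max(beta_PRP, 0) and simple Armijo backtracking in place of the Wolfe line search.

    ```python
    import numpy as np

    def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
        """Minimal sketch of a PRP-type nonlinear conjugate gradient method."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g  # start with the steepest descent direction
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            alpha, c1 = 1.0, 1e-4
            for _ in range(60):  # Armijo backtracking, capped
                if f(x + alpha * d) <= f(x) + c1 * alpha * (g @ d):
                    break
                alpha *= 0.5
            x_new = x + alpha * d
            g_new = grad(x_new)
            beta = max(g_new @ (g_new - g) / (g @ g), 0.0)  # PRP+ parameter
            d = -g_new + beta * d
            if g_new @ d >= 0:  # restart if the new direction is not descent
                d = -g_new
            x, g = x_new, g_new
        return x
    ```

    On a small convex quadratic this iteration recovers the exact minimizer; the descent-direction restart is one simple way to enforce the sufficient descent property with an inexact line search.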

    Global convergence of new conjugate gradient method with inexact line search

    In this paper, we propose a new nonlinear conjugate gradient method (FRA) that satisfies a sufficient descent condition and achieves global convergence under the inexact strong Wolfe-Powell line search. Our numerical experiments show the efficiency of the new method in solving a set of problems from the CUTEst package; the proposed formula gives excellent numerical results in CPU time, number of iterations, and number of gradient evaluations when compared to the WYL, DY, PRP, and FR methods.
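    The strong Wolfe-Powell conditions mentioned above combine a sufficient decrease test with a curvature bound on the directional derivative. A small checker sketch, with illustrative (not the paper's) constants c1 and c2:

    ```python
    import numpy as np

    def strong_wolfe_ok(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
        """Check the strong Wolfe-Powell conditions for step alpha along
        a descent direction d (0 < c1 < c2 < 1 are illustrative choices)."""
        g0_d = grad(x) @ d  # directional derivative at x, negative for descent
        sufficient_decrease = f(x + alpha * d) <= f(x) + c1 * alpha * g0_d
        curvature = abs(grad(x + alpha * d) @ d) <= c2 * abs(g0_d)
        return sufficient_decrease and curvature
    ```

    The curvature condition rules out steps that are too short (where the slope along d is still steeply negative), which the plain Armijo test alone would accept.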

    A New Conjugate Gradient Algorithm with Sufficient Descent Property for Unconstrained Optimization

    A new nonlinear conjugate gradient formula, which satisfies the sufficient descent condition, is proposed for solving unconstrained optimization problems. The global convergence of the algorithm is established under the weak Wolfe line search. Numerical experiments show that the new WWPNPRP+ algorithm is competitive with the SWPPRP+, SWPHS+, and WWPDYHS+ algorithms.

    Effective Modified Hybrid Conjugate Gradient Method for Large-Scale Symmetric Nonlinear Equations

    In this paper, we propose a hybrid conjugate gradient method, built as a convex combination of the FR and PRP conjugate gradient methods, for solving large-scale symmetric nonlinear equations via Andrei's approach with a nonmonotone line search. A formula for obtaining the convex combination parameter, using the Newton direction and our proposed direction, is also derived. Under appropriate conditions, global convergence is established. The reported numerical results show that the proposed method is very promising.

    A Dai-Liao hybrid Hestenes-Stiefel and Fletcher-Reeves method for unconstrained optimization

    Some problems have no analytical solution or are too difficult for scientists, engineers, and mathematicians to solve exactly, so numerical methods for obtaining approximate solutions are necessary. Gradient methods are most effective when the function to be minimized is continuously differentiable. This article therefore presents a new hybrid Conjugate Gradient (CG) method for solving unconstrained optimization problems. The method requires only first-order derivatives: it overcomes the steepest descent method's shortcoming of slow convergence, and it does not need to store or compute the second-order derivatives required by Newton's method. The CG update parameter is derived from the Dai-Liao conjugacy condition as a convex combination of the Hestenes-Stiefel and Fletcher-Reeves parameters, with an optimal modulating choice parameter that avoids matrix storage. Numerical computations adopt an inexact line search whose step size satisfies a descent property, showing that the algorithm is robust and efficient. The scheme converges globally under the Wolfe line search and is likely suitable for compressive sensing problems and M-tensor systems.
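    One way such a modulating parameter can be obtained, sketched below under stated assumptions (this helper is hypothetical, not the authors' exact formula): with beta = (1 - theta)*beta_HS + theta*beta_FR, choose theta so that the new direction d_{k+1} = -g_{k+1} + beta*d_k satisfies the Dai-Liao conjugacy condition d_{k+1}^T y_k = -t * g_{k+1}^T s_k.

    ```python
    import numpy as np

    def hybrid_theta(g_new, g_old, d, s, t=1.0):
        """Hypothetical modulating parameter theta for the convex
        combination beta = (1 - theta)*beta_HS + theta*beta_FR,
        solved from the Dai-Liao condition d_new^T y = -t * g_new^T s."""
        y = g_new - g_old
        beta_hs = (g_new @ y) / (d @ y)           # Hestenes-Stiefel
        beta_fr = (g_new @ g_new) / (g_old @ g_old)  # Fletcher-Reeves
        # Since beta_HS * (d^T y) = g_new^T y, the condition reduces to
        #   theta * (beta_FR - beta_HS) * (d^T y) = -t * g_new^T s
        theta = (-t * (g_new @ s)) / ((beta_fr - beta_hs) * (d @ y))
        return np.clip(theta, 0.0, 1.0)  # keep the combination convex
    ```

    When the unclipped theta lies in [0, 1], the resulting direction satisfies the Dai-Liao condition exactly; clipping trades exact conjugacy for a guaranteed convex combination.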

    The Global Convergence of a New Mixed Conjugate Gradient Method for Unconstrained Optimization

    We propose and generalize a new nonlinear conjugate gradient method for unconstrained optimization. Global convergence is proved under the Wolfe line search. Numerical experiments are reported that support the theoretical analysis and show the presented method outperforming the CG-DESCENT method.

    A New Hybrid Approach for Solving Large-scale Monotone Nonlinear Equations

    In this paper, a new hybrid conjugate gradient method for solving monotone nonlinear equations is introduced. The scheme combines the Fletcher-Reeves (FR) and Polak-Ribière-Polyak (PRP) conjugate gradient methods with the Solodov and Svaiter projection strategy. Under suitable assumptions, the global convergence of the scheme with a monotone line search is proved. Finally, a numerical experiment demonstrates the suitability of the proposed scheme for large-scale problems.
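    The Solodov-Svaiter projection strategy referred to above can be sketched in a few lines. This is a minimal hypothetical illustration, not the authors' exact scheme: given a trial point z = x + t*d accepted by a backtracking test, the current iterate is projected onto the hyperplane separating it from the solution set of the monotone system F(x) = 0.

    ```python
    import numpy as np

    def projection_step(F, x, d, sigma=1e-4, rho=0.5):
        """One hyperplane-projection step for monotone F(x) = 0,
        along an assumed suitable search direction d (sketch only)."""
        t = 1.0
        for _ in range(60):  # backtrack until -F(z)^T d >= sigma*t*||d||^2
            z = x + t * d
            if -(F(z) @ d) >= sigma * t * (d @ d):
                break
            t *= rho
        Fz = F(z)
        if Fz @ Fz == 0.0:  # z already solves F(z) = 0
            return z
        # Project x onto the hyperplane {u : F(z)^T (u - z) = 0}
        return x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz
    ```

    Monotonicity of F guarantees the solution set lies on the far side of this hyperplane, so each projection cannot move the iterate away from any solution, which is what drives the global convergence arguments for this family of methods.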

    A COMPARATIVE STUDY OF SOME MODIFICATIONS OF CG METHODS UNDER EXACT LINE SEARCH

    The Conjugate Gradient (CG) method is a technique for solving nonlinear unconstrained optimization problems. In this paper, we analyse the performance of two modifications and compare the results with the classical conjugate gradient methods. The proposed methods possess global convergence properties for general functions under exact line search. Numerical experiments show that the two modifications are more efficient on the test problems than the classical CG coefficients.