10,707 research outputs found

    A Spectral Dai-Yuan-Type Conjugate Gradient Method for Unconstrained Optimization

    A new spectral conjugate gradient method (SDYCG) is presented in this paper for solving unconstrained optimization problems. Our method provides a new expression for the spectral parameter, and this formula ensures that the sufficient descent condition holds. The search direction in SDYCG can be viewed as a combination of the spectral gradient and the Dai-Yuan conjugate gradient directions. The global convergence of SDYCG is also established. Numerical results show that SDYCG is capable of solving large-scale nonlinear unconstrained optimization problems.
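    The abstract does not state the spectral formula itself. As a rough illustration only, here is a minimal sketch of the underlying Dai-Yuan beta on a convex quadratic with an exact line search; the paper's spectral scaling is omitted, and the function name and test problem are ours, not the paper's.

```python
import numpy as np

def dai_yuan_cg_quadratic(A, b, x0, tol=1e-10, max_iter=100):
    """CG on f(x) = 0.5 x^T A x - b^T x using the Dai-Yuan beta.

    beta_k^DY = ||g_k||^2 / (d_{k-1}^T y_{k-1}),  y_{k-1} = g_k - g_{k-1}.
    With an exact line search on a quadratic this coincides with the
    classical linear CG update; the spectral scaling of SDYCG is omitted.
    """
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                      # gradient of the quadratic
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)    # exact line search step
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g
        beta = (g_new @ g_new) / (d @ y)   # Dai-Yuan parameter
        d = -g_new + beta * d
        g = g_new
    return x

# Small SPD test system: the minimizer solves A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = dai_yuan_cg_quadratic(A, b, np.zeros(2))
```

On a quadratic, the Dai-Yuan denominator d^T y equals ||g||^2 under an exact line search, which is why the sketch reduces to standard linear CG.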

    Diagonal preconditioned conjugate gradient algorithm for unconstrained optimization

    Nonlinear conjugate gradient (CG) methods have been widely used for solving unconstrained optimization problems. They are well suited to large-scale problems due to their low memory requirements and low computational cost. In this paper, a new diagonal preconditioned conjugate gradient (PRECG) algorithm is designed, motivated by the fact that a preconditioner can greatly enhance the performance of the CG method. Under mild conditions, the algorithm is shown to be globally convergent for strongly convex functions. Numerical results show that the new diagonal PRECG method performs better than the standard CG method.
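    The paper builds its diagonal pre-conditioner differently, but the general shape of a diagonally preconditioned CG loop can be sketched with the textbook Jacobi choice M = diag(A). The function name and test system below are illustrative only.

```python
import numpy as np

def jacobi_pcg(A, b, x0=None, tol=1e-10, max_iter=200):
    """Linear CG with a diagonal (Jacobi) preconditioner M = diag(A).

    This is the standard preconditioned CG loop with the simplest
    diagonal choice; the paper derives its diagonal scaling from
    other information.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    m_inv = 1.0 / np.diag(A)          # applying M^{-1} is one cheap vector scale
    r = b - A @ x
    z = m_inv * r                     # preconditioned residual
    d = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = rz / (d @ Ad)
        x += alpha * d
        r -= alpha * Ad
        z = m_inv * r
        rz_new = r @ z
        d = z + (rz_new / rz) * d
        rz = rz_new
    return x

# Diagonally dominant SPD system with solution (1, 1).
A = np.array([[10.0, 1.0], [1.0, 2.0]])
b = np.array([11.0, 3.0])
x = jacobi_pcg(A, b)
```

The appeal of a diagonal preconditioner is that applying M^{-1} costs only one elementwise product per iteration, preserving the low per-step cost that makes CG attractive at scale.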

    The algorithms of Broyden-CG for unconstrained optimization problems

    The conjugate gradient method plays an important role in solving large-scale problems, and the quasi-Newton method is known as one of the most efficient methods for solving unconstrained optimization problems. In this paper, a new hybrid of the conjugate gradient method and the quasi-Newton method for solving optimization problems is suggested. The Broyden family formula is used as the Hessian approximation in both the hybrid method and the quasi-Newton method. Our numerical analysis provides strong evidence that the Broyden-CG method is more efficient than the ordinary Broyden method. Furthermore, we prove that the new algorithm is globally convergent and satisfies the sufficient descent condition.
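    The Broyden family mentioned here interpolates between the BFGS and DFP Hessian updates. A hedged sketch of one such update (phi = 0 gives BFGS, phi = 1 gives DFP); the paper's exact hybridization with CG is not reproduced here, and the function name is ours.

```python
import numpy as np

def broyden_family_update(B, s, y, phi=0.0):
    """One Broyden-family update of the Hessian approximation B.

    s = x_{k+1} - x_k, y = g_{k+1} - g_k.  Every member of the family
    satisfies the secant equation B_new @ s == y.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    # BFGS part of the update
    bfgs = B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / sy
    # Correction vector; v @ s == 0, so the secant equation is preserved
    v = np.sqrt(sBs) * (y / sy - Bs / sBs)
    return bfgs + phi * np.outer(v, v)

B = np.eye(2)
s = np.array([1.0, 0.0])
y = np.array([2.0, 1.0])
B_new = broyden_family_update(B, s, y, phi=0.5)
```

Because the correction vector v is orthogonal to s, the secant condition holds for every value of phi, which is what makes the whole family usable as a Hessian approximation in such hybrids.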

    The Global Convergence of a New Mixed Conjugate Gradient Method for Unconstrained Optimization

    We propose and generalize a new nonlinear conjugate gradient method for unconstrained optimization. Global convergence is proved under the Wolfe line search. Numerical experiments are reported that support the theoretical analysis and show the presented methods outperforming the CG_DESCENT method.

    Combination of Penalty Function, Lagrange Multiplier and Conjugate Gradient Methods for the Solution of Constrained Optimization Problems

    In this paper, we combine the Lagrange Multiplier, Penalty Function, and Conjugate Gradient Methods (CPLCGM) to enable the Conjugate Gradient Method (CGM) to be employed for solving constrained optimization problems. In years past, the Lagrange Multiplier Method (LMM) has been used extensively to solve constrained optimization problems, as has the Penalty Function Method (PFM). However, the CGM has some special features that make it unique in solving unconstrained optimization problems, and these features would be advantageous for constrained optimization problems if properly adapted. This calls for the CPLCGM, which is aimed at handling constrained optimization problems with either equality or inequality constraints; in this paper, we focus on equality constraints. We expect that the new algorithm will circumvent the difficulties encountered when using only the LMM or the PFM to solve constrained optimization problems, and that its application will further improve the results of the Conjugate Gradient Method on this class of optimization problems. We applied the new algorithm to some constrained optimization problems and compared the results with the LMM and PFM.
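    As a rough illustration of the penalty side of such a combination, a quadratic penalty loop for one equality constraint can be sketched as follows. The paper uses the CGM for the inner minimization; this sketch substitutes plain gradient descent with a step size tuned to the example's curvature, and omits the multiplier update. All names and the test problem are ours.

```python
import numpy as np

def quadratic_penalty(f_grad, c, c_grad, x0, mu=10.0, rho=10.0,
                      inner_iters=200, outer_iters=5):
    """Quadratic penalty method for min f(x) s.t. c(x) = 0.

    Each outer step minimizes f(x) + (mu/2) * c(x)^2 (here with plain
    gradient descent; the paper's CPLCGM uses CG for this inner solve)
    and then inflates the penalty weight mu by rho.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        lr = 1.0 / (2.0 + 2.0 * mu)   # step size tied to this example's Hessian bound
        for _ in range(inner_iters):
            g = f_grad(x) + mu * c(x) * c_grad(x)   # gradient of penalized objective
            x = x - lr * g
        mu *= rho
    return x

# min x1^2 + x2^2  s.t.  x1 + x2 = 1; the constrained optimum is (0.5, 0.5).
x = quadratic_penalty(lambda v: 2 * v,
                      lambda v: v[0] + v[1] - 1.0,
                      lambda v: np.ones(2),
                      np.zeros(2))
```

The penalized minimizers approach the constrained optimum only as mu grows, which is the ill-conditioning that motivates combining the penalty idea with multiplier estimates as the abstract describes.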

    New Conjugate Gradient Method for Unconstrained Optimization with Logistic Mapping

    In this paper, we suggest a new conjugate gradient algorithm for unconstrained optimization based on logistic mapping; the descent condition and sufficient descent condition for our method are established. Numerical results show that our algorithm is more efficient at solving nonlinear unconstrained optimization problems than the Dai-Yuan (DY) method.

    New Proposed Conjugate Gradient Method for Nonlinear Unconstrained Optimization

    In this paper, we suggest a new conjugate gradient method for unconstrained optimization based on homotopy theory. Our suggested algorithm satisfies the conjugacy and descent conditions. Numerical results show that our new algorithm is better than the standard CG algorithm with respect to the number of iterations (NOI) and the number of function evaluations (NOF).

    A New Conjugate Gradient for Unconstrained Optimization Based on Step Size of Barzilai and Borwein

    In this paper, a new formula for the conjugate gradient parameter is suggested for the conjugate gradient method for solving unconstrained optimization problems, based on the step size of Barzilai and Borwein. Our new proposed CG method has the descent condition, sufficient descent condition, and global convergence properties. Numerical comparisons with a standard conjugate gradient algorithm show that this algorithm is very effective in terms of the number of iterations and the number of function evaluations.
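    The Barzilai-Borwein step size referenced here is usually stated for the plain gradient method. A minimal sketch in that original setting (the paper instead embeds the step size in a CG parameter; function name and test problem are ours):

```python
import numpy as np

def bb_gradient_descent(grad, x0, alpha0=1e-3, tol=1e-8, max_iter=500):
    """Gradient descent with the first Barzilai-Borwein (BB1) step size.

    alpha_k = (s^T s) / (s^T y),  s = x_k - x_{k-1},  y = g_k - g_{k-1}.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0                    # bootstrap step before any history exists
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-16 else alpha0   # BB1 step
        x, g = x_new, g_new
    return x

# Minimize f(x) = 0.5 x^T A x with A = diag(1, 10); the minimizer is 0.
A = np.diag([1.0, 10.0])
x = bb_gradient_descent(lambda v: A @ v, np.array([5.0, 5.0]))
```

The quotient s^T s / s^T y is a one-dimensional secant approximation of the inverse Hessian, which is what lets the step size adapt without line searches.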

    A new conjugate gradient method based on the modified secant equations

    Based on the secant equations proposed by Zhang, Deng and Chen, we propose a new nonlinear conjugate gradient method for unconstrained optimization problems. Global convergence of this method is established under proper conditions.

    Modified parameter of Dai Liao conjugacy condition of the conjugate gradient method

    The conjugate gradient (CG) method is widely used for solving nonlinear unconstrained optimization problems because it requires little memory to implement. In this paper, we propose a new parameter for the Dai-Liao conjugacy condition of the CG method with the restart property, which depends on the Lipschitz constant and is related to the Hestenes-Stiefel method. The proposed method satisfies the descent condition and global convergence properties for convex and non-convex functions. In the numerical experiments, we compare the new method with CG_DESCENT using more than 200 functions from the CUTEst library. The comparison results show that the new method outperforms CG_DESCENT in terms of CPU time, number of iterations, number of gradient evaluations, and number of function evaluations.
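    For context, the Dai-Liao parameter that the paper modifies can be sketched as below; the paper's contribution is the choice of t, which is simply fixed here, and the sample vectors are illustrative only.

```python
import numpy as np

def dai_liao_beta(g_new, d, s, y, t=0.1):
    """Dai-Liao CG parameter with a fixed t (the paper proposes a new t).

    beta^DL = g_{k+1}^T (y_k - t * s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    Setting t = 0 recovers the Hestenes-Stiefel parameter.
    """
    return (g_new @ (y - t * s)) / (d @ y)

# Illustrative vectors from one hypothetical CG step.
g_new = np.array([0.5, -1.0])
d = np.array([-1.0, 2.0])
s = 0.1 * d                   # s = alpha * d for step length alpha = 0.1
y = np.array([0.3, -0.4])
beta = dai_liao_beta(g_new, d, s, y)
```

The t = 0 special case is the "related to the Hestenes-Stiefel method" connection the abstract mentions: the extra -t * g^T s term is what enforces the Dai-Liao conjugacy condition for inexact line searches.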