31,729 research outputs found

    A COMPARATIVE STUDY OF SOME MODIFICATIONS OF CG METHODS UNDER EXACT LINE SEARCH

    The Conjugate Gradient (CG) method is a technique used in solving nonlinear unconstrained optimization problems. In this paper, we analysed the performance of two modifications and compared the results with the classical conjugate gradient methods. The proposed methods possess global convergence properties for general functions under exact line search. Numerical experiments show that the two modifications are more efficient than the classical CG coefficients on the test problems.
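    A minimal sketch of the classical nonlinear CG iteration with the Fletcher-Reeves and Polak-Ribiere-Polyak coefficients and an (approximately) exact line search is given below for context; it does not reproduce the two modified coefficients studied in the paper, and the Rosenbrock test function, tolerance and iteration limit are illustrative choices only.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def cg_minimize(f, grad, x0, beta_rule="FR", tol=1e-6, max_iter=500):
            # Classical nonlinear CG: x_{k+1} = x_k + alpha_k d_k,
            # with d_{k+1} = -g_{k+1} + beta_k d_k.
            x = np.asarray(x0, dtype=float)
            g = grad(x)
            d = -g
            for _ in range(max_iter):
                if np.linalg.norm(g) < tol:
                    break
                # "Exact" line search: minimize f along the current direction.
                alpha = minimize_scalar(lambda a: f(x + a * d)).x
                x_new = x + alpha * d
                g_new = grad(x_new)
                if beta_rule == "FR":   # Fletcher-Reeves coefficient
                    beta = (g_new @ g_new) / (g @ g)
                else:                   # Polak-Ribiere-Polyak coefficient
                    beta = g_new @ (g_new - g) / (g @ g)
                d = -g_new + beta * d
                x, g = x_new, g_new
            return x

        # Illustrative test problem: the two-dimensional Rosenbrock function.
        f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
        grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                                   200 * (x[1] - x[0]**2)])
        print(cg_minimize(f, grad, [-1.2, 1.0], beta_rule="PRP"))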

    Two Versions of the Spectral Nonlinear Conjugate Gradient Method

    The nonlinear conjugate gradient method is widely used to solve unconstrained optimization problems. In this paper, different versions of the spectral nonlinear conjugate gradient method are developed and their global convergence properties are proved. Numerical results indicate that the proposed methods are very efficient.
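    The abstract does not spell out the update rule; as a point of reference only, spectral nonlinear CG methods typically generate search directions of the generic form (not necessarily the exact formulas of the paper)

        d_{k+1} = -\theta_{k+1}\, g_{k+1} + \beta_k\, d_k,

    where \theta_{k+1} is a spectral scaling parameter, commonly taken from a Barzilai-Borwein-type quotient such as s_k^{\top} s_k / s_k^{\top} y_k with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, and \beta_k is a conjugacy coefficient.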

    Global convergence properties of conjugate gradient methods for optimization

    We study the convergence of nonlinear conjugate gradient methods without restarts and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher-Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak-Ribiere method. Numerical experiments are presented.
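    In this line of analysis, a "practical line search" usually means a step length \alpha_k satisfying the strong Wolfe conditions, reproduced here for reference with the usual constants 0 < c_1 < c_2 < 1 (the paper's precise assumptions should be checked against the original):

        f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^{\top} d_k,
        |\nabla f(x_k + \alpha_k d_k)^{\top} d_k| \le c_2 \, |\nabla f(x_k)^{\top} d_k|.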

    A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization

    Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. In this paper, we propose a general form of three-term conjugate gradient methods that always generates a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed general method. Moreover, we present a specific three-term conjugate gradient method based on the multi-step quasi-Newton method. Finally, some numerical results for the proposed method are given.
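    The abstract does not state the update; schematically, a three-term CG method augments the classical two-term direction with a correction term, for example (a generic form, not necessarily the one proposed in the paper)

        d_{k+1} = -g_{k+1} + \beta_k d_k + \gamma_k y_k, \qquad y_k = g_{k+1} - g_k,

    and the sufficient descent property means that the generated directions satisfy g_k^{\top} d_k \le -c \, \|g_k\|^2 for some constant c > 0 independent of k.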

    On Algorithms Based on Joint Estimation of Currents and Contrast in Microwave Tomography

    This paper deals with improvements to the contrast source inversion method, which is widely used in microwave tomography. First, the method is reviewed and weaknesses of both the criterion form and the optimization strategy are underlined. Then, two new algorithms are proposed. Both are based on the same criterion, similar to but more robust than the one used in contrast source inversion. The first technique keeps the main characteristics of the contrast source inversion optimization scheme but is based on a better exploitation of the conjugate gradient algorithm. The second technique is based on a preconditioned conjugate gradient algorithm and performs simultaneous updates of sets of unknowns that are normally processed sequentially. Both techniques are shown to be more efficient than the original contrast source inversion.
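    As background for the second technique, the sketch below shows the generic preconditioned conjugate gradient template for a symmetric positive definite linear system; the algorithm in the paper works on a nonlinear inversion criterion with simultaneous updates of the contrast and the contrast sources, which this minimal linear example does not attempt to reproduce (the 2x2 system and Jacobi preconditioner are arbitrary illustrations).

        import numpy as np

        def pcg(A, b, M_inv, x0=None, tol=1e-8, max_iter=200):
            # Preconditioned CG for A x = b, A symmetric positive definite;
            # M_inv applies the inverse of the preconditioner to a vector.
            x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
            r = b - A @ x
            z = M_inv(r)
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                if np.linalg.norm(r) < tol:
                    break
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x = x + alpha * p
                r = r - alpha * Ap
                z = M_inv(r)
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        # Illustrative use with a Jacobi (diagonal) preconditioner.
        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        jacobi = lambda r: r / np.diag(A)
        print(pcg(A, b, jacobi))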

    Continuation-conjugate gradient methods for the least squares solution of nonlinear boundary value problems

    We discuss in this paper a new combination of methods for solving nonlinear boundary value problems containing a parameter. Methods of the continuation type are combined with least-squares formulations, preconditioned conjugate gradient algorithms and finite element approximations. We can compute branches of solutions with limit points, bifurcation points, etc. Several numerical tests illustrate the possibilities of the methods discussed in the present paper; these include the Bratu problem in one and two dimensions, one-dimensional bifurcation and perturbed bifurcation problems, and the driven cavity problem for the Navier–Stokes equations.
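    To give a concrete flavour of the continuation idea on the simplest of the cited test cases, the sketch below traces the lower branch of the one-dimensional Bratu problem u'' + lambda*exp(u) = 0, u(0) = u(1) = 0, by natural continuation in lambda with a Newton corrector on a central finite difference grid; the mesh size, parameter grid and tolerances are arbitrary illustrative choices, and none of the least-squares or preconditioned conjugate gradient machinery of the paper is used here.

        import numpy as np

        def bratu_branch(n=50, lambdas=np.linspace(0.1, 3.0, 30)):
            # Natural continuation in lambda for u'' + lambda*exp(u) = 0 on (0, 1),
            # u(0) = u(1) = 0, discretized on n interior points.
            h = 1.0 / (n + 1)
            # Tridiagonal second-difference operator (Dirichlet boundary conditions).
            L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
                 + np.diag(np.ones(n - 1), -1)) / h**2
            u = np.zeros(n)                    # start from the trivial solution
            branch = []
            for lam in lambdas:
                for _ in range(50):            # Newton corrector at fixed lambda
                    F = L @ u + lam * np.exp(u)
                    if np.linalg.norm(F) < 1e-8:
                        break
                    J = L + lam * np.diag(np.exp(u))
                    u = u - np.linalg.solve(J, F)
                branch.append((lam, u.max()))  # record (lambda, max of u)
            return branch

        for lam, umax in bratu_branch()[::6]:
            print(f"lambda = {lam:.2f}, max u = {umax:.4f}")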