    New Quasi-Newton Equation And Method Via Higher Order Tensor Models

    This thesis introduces a general approach by proposing a new quasi-Newton (QN) equation via a fourth order tensor model. To approximate the curvature of the objective function, more of the available information from function values and gradients is employed. The efficiency of the usual QN methods is improved by accelerating the performance of the algorithms without increasing the storage demand. The proposed equation allows the modification of several algorithms involving QN equations for practical optimization so that they possess superior convergence properties. Using the new equation, the BFGS method is modified in two ways, employing the strategies proposed by Zhang and Xu (2001) and Wei et al. (2006) to generate positive definite updates. The superiority of these methods over the standard BFGS method and the modification proposed by Wei et al. (2006) is shown. Convergence analysis establishing the local and global convergence properties of these methods is presented, together with numerical results that show the advantage of the modified QN methods. Moreover, a new limited memory QN method for large scale unconstrained optimization is developed based on the modified BFGS update formula. A comparison with the method developed by Xiao et al. (2008) shows better numerical performance for the new method. The global and local convergence properties of the new method on uniformly convex problems are also analyzed. Finally, the compact limited memory BFGS method is modified to solve large scale unconstrained optimization problems. This method is derived from the proposed new QN update formula. It yields a more efficient algorithm than the standard limited memory BFGS with simple bounds (L-BFGS-B) method when solving unconstrained problems. Implementation on a set of test problems shows that the new method outperforms the standard algorithm.
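
    The abstract does not reproduce the new QN equation itself. For orientation only, here is a minimal sketch of the standard secant equation and the Wei et al. (2006) style modification the thesis builds on, with the usual (here assumed) notation s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k:

    \[
      B_{k+1} s_k = y_k \qquad \text{(standard QN/secant equation)}
    \]
    \[
      B_{k+1} s_k = \tilde{y}_k, \qquad
      \tilde{y}_k = y_k + \frac{\theta_k}{\|s_k\|^2}\, s_k, \qquad
      \theta_k = 2\,(f_k - f_{k+1}) + (g_k + g_{k+1})^{\top} s_k .
    \]

    The corrected difference vector folds function values into the curvature estimate; the thesis's fourth order tensor model generalizes this idea, but its exact equation is not stated in the abstract.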

    Modified parameter of Dai Liao conjugacy condition of the conjugate gradient method

    The conjugate gradient (CG) method is widely used for solving nonlinear unconstrained optimization problems because it requires little memory to implement. In this paper, we propose a new parameter for the Dai–Liao conjugacy condition of the CG method with the restart property; the parameter depends on the Lipschitz constant and is related to the Hestenes–Stiefel method. The proposed method satisfies the descent condition and has global convergence properties for convex and non-convex functions. In the numerical experiments, we compare the new method with CG_DESCENT on more than 200 functions from the CUTEst library. The comparison shows that the new method outperforms CG_DESCENT in terms of CPU time, number of iterations, number of gradient evaluations, and number of function evaluations.
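
    The paper's specific Lipschitz-dependent parameter is not given in the abstract. As a hedged reference point, the Dai–Liao conjugacy condition and the CG parameter it induces are commonly written as follows (t > 0 is the Dai–Liao parameter, s_k and y_k the step and gradient differences; notation assumed here):

    \[
      d_{k+1}^{\top} y_k = -t\, g_{k+1}^{\top} s_k, \qquad
      d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad
      \beta_k^{\mathrm{DL}} =
        \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}
        - t\, \frac{g_{k+1}^{\top} s_k}{d_k^{\top} y_k} .
    \]

    With t = 0 this reduces to the Hestenes–Stiefel parameter, which is the relationship the abstract alludes to.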

    Modifications of the Limited Memory BFGS Algorithm for Large-scale Nonlinear Optimization

    In this paper we present two new numerical methods for unconstrained large-scale optimization. These methods apply update formulae derived by considering different techniques for approximating the objective function. Theoretical analysis is given to show the advantages of using these update formulae. It is observed that the formulae can be employed within the framework of a limited memory strategy with only a modest increase in the linear algebra cost. Comparative results with the limited memory BFGS (L-BFGS) method are presented.
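
    The paper's update formulae are not reproduced in the abstract. For orientation, here is a minimal sketch of the standard L-BFGS two-loop recursion such formulae would plug into, assuming NumPy and hypothetical names (lbfgs_direction, s_list, y_list); modified methods of this kind typically change only the stored (s, y) pairs or the initial scaling gamma:

    import numpy as np

    def lbfgs_direction(g, s_list, y_list):
        # Standard L-BFGS two-loop recursion: returns d = -H_k g, where H_k is
        # the inverse Hessian approximation built from the m stored pairs
        # (s_i, y_i), ordered oldest to newest.
        if not s_list:                       # no curvature pairs yet
            return -g
        q = g.copy()
        rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
        alphas = []
        # First loop: newest pair to oldest.
        for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
            alpha = rho * s.dot(q)
            alphas.append(alpha)
            q -= alpha * y
        # Initial approximation H_0 = gamma * I with the usual scaling.
        s_new, y_new = s_list[-1], y_list[-1]
        gamma = s_new.dot(y_new) / y_new.dot(y_new)
        r = gamma * q
        # Second loop: oldest pair to newest.
        for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
            beta = rho * y.dot(r)
            r += (alpha - beta) * s
        return -r

    A modified update of the kind the paper describes would typically be obtained by replacing each y_i with a corrected difference vector before the pairs enter the recursion.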

    A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization

    Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. In this paper, we propose a general form of three-term conjugate gradient methods that always generate a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed general method. Moreover, we present a specific three-term conjugate gradient method based on the multi-step quasi-Newton method. Finally, numerical results for the proposed method are given.
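
    The general form itself is not stated in the abstract. As a hedged sketch, three-term CG directions and the sufficient descent property are typically written as follows (beta_k, gamma_k scalar parameters, z_k a method-dependent vector, c > 0 a constant; notation assumed here):

    \[
      d_0 = -g_0, \qquad
      d_{k+1} = -g_{k+1} + \beta_k d_k + \gamma_k z_k, \qquad
      g_k^{\top} d_k \le -c\, \|g_k\|^2 \ \ \text{for all } k .
    \]

    Sufficient descent guarantees that each d_k is a genuinely downhill direction regardless of the line search, which is the property the title emphasizes.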