1,137 research outputs found

    On affine scaling inexact dogleg methods for bound-constrained nonlinear systems

    Within the framework of affine scaling trust-region methods for bound-constrained problems, we discuss the use of an inexact dogleg method as a tool for simultaneously handling the trust region and the bound constraints while seeking an approximate minimizer of the model. Focusing on bound-constrained systems of nonlinear equations, an inexact affine scaling method for large-scale problems, employing the inexact dogleg procedure, is described. Global convergence results are established without any Lipschitz assumption on the Jacobian matrix, and locally fast convergence is shown under standard assumptions. The convergence analysis is performed without specifying the scaling matrix used to handle the bounds, and a rather general class of scaling matrices is allowed in actual algorithms. Numerical results showing the performance of the method are also given.
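    The dogleg idea this abstract builds on can be illustrated with a minimal, exact (not inexact or affine-scaled) step computation. This is our own sketch: the names and the positive-definite assumption on the model Hessian B are ours, not the paper's.

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Classical dogleg step for the model m(p) = g^T p + 0.5 p^T B p
    within the trust region ||p|| <= delta (B assumed positive definite)."""
    p_newton = -np.linalg.solve(B, g)            # full Newton step
    if np.linalg.norm(p_newton) <= delta:
        return p_newton                          # Newton step fits: take it
    p_cauchy = -(g @ g) / (g @ B @ g) * g        # steepest-descent minimizer
    if np.linalg.norm(p_cauchy) >= delta:
        return -delta / np.linalg.norm(g) * g    # truncated gradient step
    # Dogleg path: find tau with ||p_cauchy + tau (p_newton - p_cauchy)|| = delta
    d = p_newton - p_cauchy
    a = d @ d
    b = 2 * (p_cauchy @ d)
    c = p_cauchy @ p_cauchy - delta**2
    tau = (-b + np.sqrt(b**2 - 4 * a * c)) / (2 * a)
    return p_cauchy + tau * d
```

    Within the paper's setting the Newton step would be computed inexactly and the norm replaced by a scaled one, but the piecewise-linear path logic is the same.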

    A Subspace, Interior, and Conjugate Gradient Method for Large-scale Bound-constrained Minimization Problems

    A subspace adaptation of the Coleman-Li trust-region and interior method is proposed for solving large-scale bound-constrained minimization problems. This method can be implemented with either sparse Cholesky factorization or conjugate gradient computation. Under reasonable conditions the convergence properties of this subspace trust-region method are as strong as those of its full-space version. Computational performance on various large-scale test problems is reported, and the advantages of our approach are demonstrated. Our experience indicates that the proposed method represents an efficient way to solve large-scale bound-constrained minimization problems.
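    The Coleman-Li method mentioned above is built on an affine-scaling vector v(x) that picks, for each coordinate, the distance to the bound a descent step would move toward. The sketch below is our own minimal rendering of that standard construction, not code from the paper:

```python
import numpy as np

def coleman_li_v(x, g, lower, upper):
    """Coleman-Li vector v(x) for min f(x) subject to lower <= x <= upper;
    the scaling matrix is then D(x) = diag(|v(x)|)**(-1/2)."""
    v = np.ones_like(x)
    for i in range(len(x)):
        if g[i] < 0:
            # descent would increase x_i: distance to the upper bound
            v[i] = x[i] - upper[i] if np.isfinite(upper[i]) else -1.0
        else:
            # descent would decrease x_i: distance to the lower bound
            v[i] = x[i] - lower[i] if np.isfinite(lower[i]) else 1.0
    return np.abs(v)
```

    Coordinates near the bound they are being pushed toward get small scaling, which damps steps that would violate the constraints.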

    Large Scale Computational Problems in Numerical Optimization


    Local Convergence of the Affine-Scaling Interior-Point Algorithm for Nonlinear Programming

    This paper addresses the local convergence properties of the affine-scaling interior-point algorithm for nonlinear programming. The analysis of local convergence is developed in terms of parameters that control the interior-point scheme and the size of the residual of the linear system that provides the step direction. The analysis follows the classical theory for quasi-Newton methods and addresses q-linear, q-superlinear, and q-quadratic rates of convergence.

    Theory of functional connections applied to nonlinear programming under equality constraints

    This paper introduces an efficient approach to solving quadratic programming problems subject to equality constraints via the Theory of Functional Connections. This is done without using the traditional Lagrange multipliers approach, and the solution is provided in closed form. Two distinct constrained expressions (satisfying the equality constraints) are introduced. The unknown optimization variable is then the free vector g, introduced by the Theory of Functional Connections to derive constrained expressions. The solution to the general nonlinear programming problem is obtained by Newton's method in optimization, and each iteration involves a second-order Taylor approximation, starting from an initial vector x^(0) that is a solution of the equality constraint. For the quadratic programming problems, we not only introduce the new approach but also provide numerical accuracy and speed comparisons with respect to MATLAB's quadprog. To handle the nonlinear programming problem using the Theory of Functional Connections, a convergence analysis of the proposed approach is provided.
    Comment: 21 pages, 1 figure, 1 table, submitted to Journal of Computational and Applied Mathematics
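    For the equality-constrained quadratic case, the "free vector g" idea can be mimicked with a standard null-space elimination: parametrize every feasible point as a particular solution plus a free vector in the constraint null space, then solve the reduced problem in closed form. This is our own illustrative analogue, not the paper's constrained expressions:

```python
import numpy as np

def eq_constrained_qp(Q, c, A, b):
    """Minimize 0.5 x^T Q x + c^T x subject to A x = b, without Lagrange
    multipliers: write x = x_p + N g, where x_p is any particular solution
    and the columns of N span the null space of A, then solve the reduced
    unconstrained quadratic for the free vector g in closed form."""
    x_p, *_ = np.linalg.lstsq(A, b, rcond=None)   # particular solution
    _, s, Vt = np.linalg.svd(A)                   # null-space basis via SVD
    rank = int(np.sum(s > 1e-12))
    N = Vt[rank:].T
    # Reduced problem: min_g 0.5 g^T (N^T Q N) g + (N^T (Q x_p + c))^T g
    g = np.linalg.solve(N.T @ Q @ N, -N.T @ (Q @ x_p + c))
    return x_p + N @ g
```

    Every candidate x = x_p + N g satisfies A x = b by construction, so the equality constraints never need to be enforced explicitly, which is the same structural trick the constrained expressions exploit.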

    Second order gradient ascent pulse engineering

    We report some improvements to the gradient ascent pulse engineering (GRAPE) algorithm for optimal control of quantum systems. These include more accurate gradients, convergence acceleration using the BFGS quasi-Newton algorithm, as well as faster control derivative calculation algorithms. In all test systems, the wall clock time and the convergence rates show a considerable improvement over the approximate gradient ascent.
    Comment: Submitted for publication
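    Since GRAPE ascends a fidelity functional, the BFGS acceleration amounts to running a quasi-Newton minimizer on the negative fidelity. A compact, generic BFGS sketch (our own, not the GRAPE-specific code) might look like:

```python
import numpy as np

def bfgs_minimize(f, grad, x0, iters=100):
    """Quasi-Newton minimization with BFGS updates of the inverse
    Hessian approximation H and a simple backtracking line search."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    H = np.eye(n)
    g = grad(x)
    for _ in range(iters):
        p = -H @ g                                # quasi-Newton direction
        t = 1.0                                   # backtracking (Armijo)
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p) and t > 1e-12:
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                            # curvature condition
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
        if np.linalg.norm(g) < 1e-8:
            break
    return x
```

    The gain over plain gradient ascent comes from H accumulating curvature information from successive (s, y) pairs, which is exactly what accelerates convergence once the gradients are accurate enough.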