
    Globally convergent techniques in nonlinear Newton-Krylov

    Some convergence theory is presented for nonlinear Krylov subspace methods. The basic idea of these methods is to use variants of Newton's iteration in conjunction with a Krylov subspace method for solving the Jacobian linear systems. These methods are variants of inexact Newton methods in which the approximate Newton direction is taken from a subspace of small dimension. The main focus is to analyze these methods when they are combined with global strategies such as linesearch techniques and model trust region algorithms. Most of the convergence results are formulated for projection onto general subspaces rather than just Krylov subspaces.
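
    The abstract does not include the algorithm itself; the following is a minimal, illustrative sketch of an inexact Newton-Krylov iteration with a backtracking linesearch globalization, using matrix-free GMRES for the Jacobian system. The function names, tolerances, and finite-difference Jacobian-vector products are assumptions for illustration, not details from the paper; SciPy's scipy.optimize.newton_krylov provides a ready-made variant of the same idea.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def newton_krylov_linesearch(F, x0, tol=1e-8, max_iter=50):
    """Illustrative inexact Newton-Krylov with a backtracking linesearch.

    The approximate Newton direction is taken from a small Krylov subspace
    by running GMRES on the matrix-free Jacobian system J(x) d = -F(x).
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        fx = F(x)
        norm_fx = np.linalg.norm(fx)
        if norm_fx < tol:
            break
        eps = 1e-7
        # Matrix-free Jacobian-vector products via forward differences.
        J = LinearOperator((x.size, x.size),
                           matvec=lambda v: (F(x + eps * v) - fx) / eps)
        # Approximate Newton direction from a low-dimensional Krylov subspace.
        d, _ = gmres(J, -fx, restart=20, maxiter=20)
        # Backtracking linesearch: shrink the step until the residual norm
        # satisfies a sufficient-decrease condition.
        t = 1.0
        while np.linalg.norm(F(x + t * d)) > (1.0 - 1e-4 * t) * norm_fx and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x
```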

    On affine scaling inexact dogleg methods for bound-constrained nonlinear systems

    Within the framework of affine scaling trust-region methods for bound-constrained problems, we discuss the use of an inexact dogleg method as a tool for simultaneously handling the trust region and the bound constraints while seeking an approximate minimizer of the model. Focusing on bound-constrained systems of nonlinear equations, an inexact affine scaling method for large-scale problems, employing the inexact dogleg procedure, is described. Global convergence results are established without any Lipschitz assumption on the Jacobian matrix, and fast local convergence is shown under standard assumptions. The convergence analysis is performed without specifying the scaling matrix used to handle the bounds, and a rather general class of scaling matrices is allowed in actual algorithms. Numerical results showing the performance of the method are also given.
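
    For reference, here is a minimal sketch of the classical (exact) dogleg step that the inexact, affine-scaled variant builds on: the step blends the Cauchy point and the Newton point inside a trust region of radius delta. The paper's method additionally handles the bound constraints through a scaling matrix and permits an inexact Newton step; all names below are illustrative, not taken from the paper.

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Classical dogleg step for the quadratic model m(p) = g.p + 0.5 p.B.p
    inside a trust region of radius delta (illustrative sketch only)."""
    p_newton = np.linalg.solve(B, -g)          # full Newton step
    if np.linalg.norm(p_newton) <= delta:
        return p_newton
    p_cauchy = -(g @ g) / (g @ B @ g) * g      # model minimizer along -g
    if np.linalg.norm(p_cauchy) >= delta:
        return -delta * g / np.linalg.norm(g)  # truncated steepest descent
    # Walk from the Cauchy point toward the Newton point until the
    # trust-region boundary is hit: ||p_cauchy + t (p_newton - p_cauchy)|| = delta.
    d = p_newton - p_cauchy
    a, b, c = d @ d, 2.0 * (p_cauchy @ d), p_cauchy @ p_cauchy - delta**2
    t = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return p_cauchy + t * d
```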

    Preconditioned Continuation Model Predictive Control

    Model predictive control (MPC) anticipates future events to take appropriate control actions. Nonlinear MPC (NMPC) describes systems with nonlinear models and/or constraints. A Continuation/GMRES method for NMPC, suggested by T. Ohtsuka in 2004, uses the GMRES iterative algorithm to solve a forward-difference approximation Ax = b of the Continuation NMPC (CNMPC) equations on every time step. The coefficient matrix A of the linear system is often ill-conditioned, resulting in poor GMRES convergence, slowing down the on-line computation of the control by CNMPC, and reducing control quality. We adopt CNMPC for challenging minimum-time problems, and improve performance by introducing efficient preconditioning, utilizing parallel computing, and substituting MINRES for GMRES. Comment: 8 pages, 6 figures. To appear in Proceedings of the SIAM Conference on Control and Its Applications, July 8-10, 2015, Paris, France.
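
    A hedged sketch of the linear-algebra ingredient described above: swapping preconditioned MINRES for GMRES on a symmetric, ill-conditioned system Ax = b. The stand-in diagonal operator and the Jacobi preconditioner are placeholders for illustration, not the CNMPC forward-difference operator or the paper's preconditioner.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres, minres

# Stand-in symmetric, ill-conditioned system (condition number ~1e6);
# NOT the CNMPC operator from the paper.
n = 200
diag = np.logspace(0, 6, n)
A = LinearOperator((n, n), matvec=lambda v: diag * v)
b = np.ones(n)

# Simple Jacobi (diagonal) preconditioner; the paper's preconditioner is
# problem-specific and evaluated with parallel computing.
M = LinearOperator((n, n), matvec=lambda v: v / diag)

x_gmres, info_gmres = gmres(A, b, M=M, restart=30, maxiter=100)
x_minres, info_minres = minres(A, b, M=M, maxiter=100)  # MINRES needs a symmetric operator
```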

    A Parameterized multi-step Newton method for solving systems of nonlinear equations

    We construct a novel multi-step iterative method for solving systems of nonlinear equations by introducing a parameter θ that generalizes the multi-step Newton method while keeping its order of convergence and computational cost. By an appropriate selection of θ, the new method can both converge faster and have a larger radius of convergence. The new iterative method requires only one Jacobian inversion per iteration and can therefore be efficiently implemented using Krylov subspace methods. The new method can be used to solve nonlinear systems of partial differential equations, such as complex generalized Zakharov systems, by discretizing them in both the spatial and temporal independent variables using, for instance, the Chebyshev pseudo-spectral method. Quite extensive tests show that the new method can have significantly faster convergence and a significantly larger radius of convergence than the multi-step Newton method.
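
    A minimal sketch of the frozen-Jacobian multi-step structure the abstract refers to: one Jacobian factorization per outer iteration, reused across m inner corrections. The relaxation parameter theta below is a hypothetical stand-in for the paper's parameter θ, whose exact placement in the update the abstract does not specify.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def multi_step_newton(F, J, x0, m=3, theta=1.0, tol=1e-10, max_iter=50):
    """Frozen-Jacobian multi-step Newton sketch (illustrative only).

    The factorization of J(x) is computed once per outer iteration and
    reused for m inner corrections; `theta` is a hypothetical relaxation
    parameter, not the paper's exact formulation.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        if np.linalg.norm(F(x)) < tol:
            break
        lu_piv = lu_factor(J(x))      # single Jacobian factorization per outer step
        for _ in range(m):            # m sub-steps reuse the same factors
            x = x - theta * lu_solve(lu_piv, F(x))
    return x
```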

    Projected Newton Method for noise constrained Tikhonov regularization

    Tikhonov regularization is a popular approach to obtain a meaningful solution for ill-conditioned linear least squares problems. A relatively simple way of choosing a good regularization parameter is given by Morozov's discrepancy principle. However, most approaches require the solution of the Tikhonov problem for many different values of the regularization parameter, which is computationally demanding for large-scale problems. We propose a new and efficient algorithm which simultaneously solves the Tikhonov problem and finds the corresponding regularization parameter such that the discrepancy principle is satisfied. We achieve this by formulating the problem as a nonlinear system of equations and solving this system using a line search method. We obtain a good search direction by projecting the problem onto a low-dimensional Krylov subspace and computing the Newton direction for the projected problem. This projected Newton direction, which is significantly less computationally expensive to calculate than the true Newton direction, is then combined with a backtracking line search to obtain a globally convergent algorithm, which we refer to as the Projected Newton method. We prove convergence of the algorithm and illustrate the improved performance over current state-of-the-art solvers with some numerical experiments.
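
    A sketch of the kind of coupled nonlinear system behind this approach: solving the Tikhonov optimality condition and Morozov's discrepancy equation simultaneously for the solution x and the regularization parameter lam. The matrices, the regularization operator L, and the noise level sigma are assumptions for illustration; the paper works with a projection of such a system onto a low-dimensional Krylov subspace rather than the full system shown here.

```python
import numpy as np

def discrepancy_system(A, L, b, sigma):
    """Coupled nonlinear system F(x, lam) = 0 whose root is the Tikhonov
    solution x together with the regularization parameter lam at which
    Morozov's discrepancy principle ||A x - b|| = sigma holds (sketch)."""
    def F(x, lam):
        grad = A.T @ (A @ x - b) + lam * (L.T @ (L @ x))              # stationarity condition
        disc = 0.5 * (np.linalg.norm(A @ x - b) ** 2 - sigma ** 2)    # discrepancy equation
        return np.concatenate([grad, [disc]])
    return F
```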

    Nonlinear multigrid methods for second order differential operators with nonlinear diffusion coefficient

    Nonlinear multigrid methods such as the Full Approximation Scheme (FAS) and Newton-multigrid (Newton-MG) are well established as fast solvers for nonlinear PDEs of elliptic and parabolic type. In this paper we consider Newton-MG and FAS iterations applied to second order differential operators with nonlinear diffusion coefficient. Under mild assumptions arising in practical applications, an approximation (shown to be sharp) of the execution time of the algorithms is derived, which demonstrates that Newton-MG can be expected to be a faster iteration than a standard FAS iteration for a finite element discretisation. Results are provided for elliptic and parabolic problems, demonstrating a faster execution time as well as greater stability of the Newton-MG iteration. Results are explained using current theory for the convergence of multigrid methods, giving a qualitative insight into how the nonlinear multigrid methods can be expected to perform in practice.
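
    To make the FAS structure being compared against Newton-MG concrete, here is a compact sketch of one FAS two-grid cycle for a nonlinear problem N(u) = f. All operators (nonlinear operators on each grid, smoother, transfer operators, coarse solver) are user-supplied placeholders, not the finite element discretisations used in the paper.

```python
import numpy as np

def fas_two_grid(N_fine, N_coarse, u, f, restrict, prolong, smooth, coarse_solve):
    """One FAS (Full Approximation Scheme) two-grid cycle for N(u) = f
    (illustrative sketch; all callables are user-supplied)."""
    u = smooth(N_fine, u, f)                       # pre-smoothing on the fine grid
    r = f - N_fine(u)                              # fine-grid residual
    u_c = restrict(u)                              # restrict the current iterate
    f_c = N_coarse(u_c) + restrict(r)              # FAS coarse right-hand side (tau correction)
    u_c_new = coarse_solve(N_coarse, u_c, f_c)     # solve the coarse nonlinear problem
    u = u + prolong(u_c_new - u_c)                 # coarse-grid correction
    return smooth(N_fine, u, f)                    # post-smoothing
```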