244 research outputs found

    A class of nonsymmetric preconditioners for saddle point problems

    For the iterative solution of saddle point problems, a nonsymmetric preconditioner is studied which, with respect to the upper-left block of the system matrix, can be seen as a variant of SSOR. An idealized situation, where the SSOR is taken with respect to the skew-symmetric part plus the diagonal part of the upper-left block, is analyzed in detail. Since the action of the preconditioner involves the solution of a Schur complement system, an inexact form of the preconditioner can be of interest. This results in an inner-outer iterative process. Numerical experiments with the solution of linearized Navier-Stokes equations demonstrate the efficiency of the new preconditioner, especially when the upper-left block is far from symmetric.
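
    The abstract describes a preconditioner whose application requires solving a Schur complement system inside an outer Krylov iteration. As a point of reference only (not the paper's SSOR-based variant), the sketch below applies a generic block lower-triangular preconditioner to a small saddle point system; the matrices, sizes, and solver settings are illustrative assumptions.

```python
# Generic block lower-triangular preconditioner for a saddle point system
#   K = [[A, B^T], [B, 0]],   P = [[A, 0], [B, -S]],   S = B A^{-1} B^T.
# Applying P^{-1} requires a solve with A and a solve with the Schur
# complement S, mirroring the inner-outer structure described above.
# This is a standard textbook illustration, not the SSOR-based
# preconditioner from the paper.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(0)
n, m = 60, 20
A = n * np.eye(n) + rng.standard_normal((n, n))   # well-conditioned (1,1) block
B = rng.standard_normal((m, n))
K = np.block([[A, B.T], [B, np.zeros((m, m))]])
rhs = rng.standard_normal(n + m)

S = B @ np.linalg.solve(A, B.T)                   # (here exact) Schur complement

def apply_prec(r):
    """Solve P [u; p] = [r1; r2]:  A u = r1,  S p = B u - r2."""
    r1, r2 = r[:n], r[n:]
    u = np.linalg.solve(A, r1)
    p = np.linalg.solve(S, B @ u - r2)
    return np.concatenate([u, p])

M = LinearOperator((n + m, n + m), matvec=apply_prec)
x, info = gmres(K, rhs, M=M)
print("converged:", info == 0, "residual:", np.linalg.norm(K @ x - rhs))
```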

    An efficient iterative method based on two-stage splitting methods to solve weakly nonlinear systems

    In this paper, an iterative method for solving large, sparse systems of weakly nonlinear equations is presented. The method is based on the Hermitian/skew-Hermitian splitting (HSS) scheme. Under suitable assumptions, we establish a convergence theorem for this method. In addition, it is shown that any faster and less time-consuming two-stage splitting method that satisfies the convergence theorem can be used in place of the HSS inner iterations. Numerical results, including CPU times, show the robustness of the new method, which is simple, fast, and convenient and yields accurate solutions. The third and fourth authors have been partially supported by the Spanish Ministerio de Ciencia, Innovación y Universidades PGC2018-095896-B-C22 and Generalitat Valenciana PROMETEO/2016/089. Amiri, A.; Darvishi, M. T.; Cordero Barbero, A.; Torregrosa Sánchez, J. R. (2019). An efficient iterative method based on two-stage splitting methods to solve weakly nonlinear systems. Mathematics 7(9):1-17. https://doi.org/10.3390/math7090815
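
    For reference, the sketch below runs the classical Hermitian/skew-Hermitian splitting (HSS) iteration, which the paper's inner iterations build on, for a single linear system Ax = b. The test matrix, the shift alpha, and the stopping tolerance are illustrative assumptions, not values from the paper.

```python
# Classical HSS iteration for A x = b:
#   A = H + S,  H = (A + A*)/2 (Hermitian),  S = (A - A*)/2 (skew-Hermitian)
#   (alpha*I + H) x_{k+1/2} = (alpha*I - S) x_k       + b
#   (alpha*I + S) x_{k+1}   = (alpha*I - H) x_{k+1/2} + b
# The iteration converges for any alpha > 0 when H is positive definite.
import numpy as np

rng = np.random.default_rng(1)
n = 100
A = 10.0 * np.eye(n) + 0.5 * rng.standard_normal((n, n))   # positive-real test matrix
b = rng.standard_normal(n)

H = (A + A.conj().T) / 2
S = (A - A.conj().T) / 2
alpha = 5.0                      # illustrative shift, not optimized
I = np.eye(n)

x = np.zeros(n)
for k in range(500):
    x_half = np.linalg.solve(alpha * I + H, (alpha * I - S) @ x + b)
    x = np.linalg.solve(alpha * I + S, (alpha * I - H) @ x_half + b)
    if np.linalg.norm(A @ x - b) < 1e-10 * np.linalg.norm(b):
        break
print("HSS iterations:", k + 1, "residual:", np.linalg.norm(A @ x - b))
```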

    GMRES-Accelerated ADMM for Quadratic Objectives

    We consider the sequence acceleration problem for the alternating direction method of multipliers (ADMM) applied to a class of equality-constrained problems with strongly convex quadratic objectives, which frequently arise as the Newton subproblem of interior-point methods. Within this context, the ADMM update equations are linear, the iterates are confined within a Krylov subspace, and the Generalized Minimal RESidual (GMRES) algorithm is optimal in its ability to accelerate convergence. The basic ADMM method solves a $\kappa$-conditioned problem in $O(\sqrt{\kappa})$ iterations. We give theoretical justification and numerical evidence that the GMRES-accelerated variant consistently solves the same problem in $O(\kappa^{1/4})$ iterations, for an order-of-magnitude reduction in iterations, despite a worst-case bound of $O(\sqrt{\kappa})$ iterations. The method is shown to be competitive against standard preconditioned Krylov subspace methods for saddle-point problems. The method is embedded within SeDuMi, a popular open-source solver for conic optimization written in MATLAB, and used to solve many large-scale semidefinite programs with error that decreases like $O(1/k^{2})$, instead of $O(1/k)$, where $k$ is the iteration index. Comment: 31 pages, 7 figures. Accepted for publication in SIAM Journal on Optimization (SIOPT).
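
    The key observation is that when the iteration is an affine map x -> T(x) = Mx + c, its fixed point solves the linear system (I - M)x = c, which GMRES can attack directly using only black-box applications of T. The sketch below demonstrates that idea on a generic affine fixed-point map; T, M, and c are illustrative placeholders, not an actual ADMM update.

```python
# Accelerating an affine fixed-point iteration x_{k+1} = T(x_k) = M x_k + c
# by applying GMRES to the equivalent linear system (I - M) x = c.
# Only black-box access to T is needed: c = T(0) and (I - M) v = v - T(v) + c.
# This illustrates the idea behind GMRES acceleration of a linear iteration;
# it is not the ADMM update itself.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(2)
n = 200
M_mat = rng.standard_normal((n, n))
M_mat = (M_mat + M_mat.T) / 2                     # symmetric, for a clean comparison
M_mat *= 0.9 / np.abs(np.linalg.eigvals(M_mat)).max()   # spectral radius 0.9

def T(x):
    """Affine fixed-point map standing in for a linear ADMM-type update."""
    return M_mat @ x + c

c = rng.standard_normal(n)
c0 = T(np.zeros(n))                               # recover c = T(0) as a black box
op = LinearOperator((n, n), matvec=lambda v: v - T(v) + c0)

# Plain fixed-point iteration.
x_fp, fp_iters = np.zeros(n), 0
while np.linalg.norm(T(x_fp) - x_fp) > 1e-6 and fp_iters < 10_000:
    x_fp, fp_iters = T(x_fp), fp_iters + 1

# GMRES applied to (I - M) x = c.
res = []
x_gm, info = gmres(op, c0, restart=n,
                   callback=lambda r: res.append(r), callback_type="pr_norm")

print("fixed-point iterations:", fp_iters)
print("GMRES iterations:", len(res), "converged:", info == 0)
print("difference between the two solutions:", np.linalg.norm(x_gm - x_fp))
```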

    The Morse index of Chaperon's generating families

    This is an expository paper devoted to the Morse index of Chaperon's generating families of Hamiltonian diffeomorphisms. After reviewing the construction of such generating families, we present Bott's iteration theory in this setting: we study how the Morse index of a critical point corresponding to an iterated periodic orbit depends on the order of iteration of the orbit. We also investigate the precise dependence of the Morse index on the choice of the generating family associated with a given Hamiltonian diffeomorphism, which allows us to see the Morse index as a Maslov index for the linearized Hamiltonian flow in the symplectic group. We conclude the survey with a proof that the classical Morse index from Tonelli Lagrangian dynamics coincides with the Maslov index. Comment: 45 pages. Version 2: minor corrections to Example 3.

    A universal matrix-free split preconditioner for the fixed-point iterative solution of non-symmetric linear systems

    We present an efficient preconditioner for linear problems $Ax = y$. It guarantees monotonic convergence of the memory-efficient fixed-point iteration for all accretive systems of the form $A = L + V$, where $L$ is an approximation of $A$, and the system is scaled so that the discrepancy is bounded with $\lVert V \rVert < 1$. In contrast to common splitting preconditioners, our approach is not restricted to any particular splitting. Therefore, the approximate problem can be chosen so that an analytic solution is available to efficiently evaluate the preconditioner. We prove that the only preconditioner with this property has the form $(L+I)(I-V)^{-1}$. This unique form moreover permits the elimination of the forward problem from the preconditioned system, often halving the time required per iteration. We demonstrate and evaluate our approach for wave problems, diffusion problems, and pantograph delay differential equations. With the latter we show how the method extends to general, not necessarily accretive, linear systems. Comment: Rewritten version; includes an efficiency comparison with the shift preconditioner by Bai et al., which is shown to be a special case.
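
    As a baseline for the kind of method being preconditioned, the following sketch runs a plain splitting-based fixed-point iteration for $A = L + V$, with $L$ chosen so that solves with it are cheap (here simply the diagonal, a Jacobi-style choice). This is a generic illustration of the splitting idea, not the paper's $(L+I)(I-V)^{-1}$ preconditioner; the matrix and tolerances are assumptions made for the example.

```python
# Plain splitting-based fixed-point iteration for A x = y with A = L + V:
#     L x_{k+1} = y - V x_k,
# which converges whenever the iteration matrix L^{-1} V has spectral
# radius below one. Here L is the diagonal of A (a Jacobi-style choice,
# standing in for "an approximation of A with a cheap solve"); this is
# a generic baseline, not the preconditioner from the paper.
import numpy as np

rng = np.random.default_rng(3)
n = 80
A = np.diag(np.full(n, 4.0)) + 0.2 * rng.standard_normal((n, n))
y = rng.standard_normal(n)

L = np.diag(np.diag(A))          # cheap-to-invert approximation of A
V = A - L                        # discrepancy; small by construction
rho = np.abs(np.linalg.eigvals(np.linalg.solve(L, V))).max()
print("spectral radius of L^{-1} V:", rho)    # < 1 implies convergence

x = np.zeros(n)
for k in range(500):
    x = np.linalg.solve(L, y - V @ x)         # a diagonal solve in practice
    if np.linalg.norm(A @ x - y) < 1e-10 * np.linalg.norm(y):
        break
print("iterations:", k + 1, "residual:", np.linalg.norm(A @ x - y))
```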

    Preconditioners for Krylov subspace methods: An overview

    When simulating a mechanism from science or engineering, or an industrial process, one is frequently required to construct a mathematical model and then solve this model numerically. If accurate numerical solutions are necessary or desirable, this can involve solving large-scale systems of equations. One major class of solution methods is that of preconditioned iterative methods, involving preconditioners which are computationally cheap to apply while also capturing information contained in the linear system. In this article, we give a short survey of the field of preconditioning. We introduce a range of preconditioners for partial differential equations, followed by optimization problems, before discussing preconditioners constructed with less standard objectives in mind.
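
    As one concrete instance of "computationally cheap to apply while capturing information contained in the linear system", the sketch below compares GMRES iteration counts with and without an incomplete-LU preconditioner on a 2D Poisson-type test matrix; the matrix, drop tolerance, and solver settings are arbitrary illustrative choices.

```python
# ILU-preconditioned GMRES on a 2D Poisson-type test matrix, compared
# with unpreconditioned GMRES. The incomplete factorization is cheap to
# apply (two sparse triangular solves) yet captures enough of A to cut
# the iteration count substantially.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, gmres, spilu

m = 40                                                    # m x m grid
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(m, m))
A = (sp.kron(sp.eye(m), T) + sp.kron(T, sp.eye(m))).tocsc()   # 2D Laplacian
b = np.ones(A.shape[0])

def gmres_iters(M=None):
    """Return the number of inner GMRES iterations until convergence."""
    res = []
    gmres(A, b, M=M, restart=50, maxiter=200,
          callback=lambda r: res.append(r), callback_type="pr_norm")
    return len(res)

ilu = spilu(A, drop_tol=1e-4, fill_factor=10)             # incomplete LU factors
M = LinearOperator(A.shape, matvec=ilu.solve)

print("GMRES iterations without preconditioner:", gmres_iters())
print("GMRES iterations with ILU preconditioner:", gmres_iters(M=M))
```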

    Author index for volumes 101–200


    Numerical solution of saddle point problems
