
    Chebyshev semi-iteration in Preconditioning

    It is widely believed that Krylov subspace iterative methods are better than Chebyshev semi-iterative methods. When the solution of a linear system with a symmetric and positive definite coefficient matrix is required, the Conjugate Gradient method will compute the optimal approximate solution from the appropriate Krylov subspace, that is, it will implicitly compute the optimal polynomial. Hence a semi-iterative method, which requires eigenvalue bounds and computes an explicit polynomial, must, for just a little less computational work, give an inferior result. In this manuscript we identify a specific situation in the context of preconditioning in which the Chebyshev semi-iterative method is the method of choice, since it has properties that make it superior to the Conjugate Gradient method.
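
    A minimal Python sketch of the classical three-term Chebyshev recurrence (as in standard references such as Saad's "Iterative Methods for Sparse Linear Systems") makes the mechanics concrete; the eigenvalue bounds lam_min and lam_max and the dense matrix-vector products are assumptions of this illustration, not the manuscript's implementation:

    ```python
    import numpy as np

    def chebyshev_semi_iteration(A, b, lam_min, lam_max, maxiter=50):
        """Chebyshev semi-iteration for an SPD matrix A whose spectrum
        is assumed to lie in [lam_min, lam_max]."""
        theta = 0.5 * (lam_max + lam_min)   # centre of the bounding interval
        delta = 0.5 * (lam_max - lam_min)   # half-width of the interval
        sigma = theta / delta
        rho = 1.0 / sigma
        x = np.zeros_like(b, dtype=float)
        r = b - A @ x                       # initial residual
        d = r / theta
        for _ in range(maxiter):
            x = x + d
            r = r - A @ d
            rho_next = 1.0 / (2.0 * sigma - rho)
            d = rho_next * rho * d + (2.0 * rho_next / delta) * r
            rho = rho_next
        return x
    ```

    Note that, unlike Conjugate Gradients, the recurrence contains no inner products: the map from b to the k-th iterate is a fixed linear operator, which is exactly the property that matters when the method is used inside a preconditioner.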

    A class of nonsymmetric preconditioners for saddle point problems

    For the iterative solution of saddle point problems, a nonsymmetric preconditioner is studied which, with respect to the upper-left block of the system matrix, can be seen as a variant of SSOR. An idealized situation, where the SSOR is taken with respect to the skew-symmetric part plus the diagonal part of the upper-left block, is analyzed in detail. Since the action of the preconditioner involves the solution of a Schur complement system, an inexact form of the preconditioner can be of interest. This results in an inner-outer iterative process. Numerical experiments with the solution of the linearized Navier-Stokes equations demonstrate the efficiency of the new preconditioner, especially when the upper-left block is far from symmetric.
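
    For reference, a generic saddle point system has the block form below (notation ours, up to sign conventions; not necessarily the paper's); the Schur complement S is the system the preconditioner action must solve, exactly or, in the inexact variant, approximately.

    ```latex
    \begin{equation*}
      \begin{pmatrix} A & B^{T} \\ B & 0 \end{pmatrix}
      \begin{pmatrix} u \\ p \end{pmatrix}
      =
      \begin{pmatrix} f \\ g \end{pmatrix},
      \qquad
      S = B A^{-1} B^{T}.
    \end{equation*}
    ```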

    The Bramble-Pasciak preconditioner for saddle point problems

    The Bramble-Pasciak Conjugate Gradient method is a well-known tool for solving linear systems in saddle point form. A drawback of this method is that, in order to ensure the applicability of Conjugate Gradients, the preconditioner must be scaled, which typically involves the solution of an eigenvalue problem. Here, we introduce a modified preconditioner and inner product which, without scaling, enable the use of a MINRES variant and can be used for the simplified Lanczos process. Furthermore, the modified preconditioner and inner product can be combined with the original Bramble-Pasciak setup to give new preconditioners and inner products. We support the new methods with numerical experiments for Stokes problems.
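
    As background, in the classical Bramble-Pasciak setup (standard notation, ours) the saddle point matrix is preconditioned by a block-triangular factor built from an approximation A_0 of the upper-left block A, and the transformed system is self-adjoint and positive definite in the inner product induced by H below only if A - A_0 is positive definite; this is where the scaling requirement enters.

    ```latex
    \begin{equation*}
      \mathcal{P} = \begin{pmatrix} A_{0} & 0 \\ B & -I \end{pmatrix},
      \qquad
      \mathcal{H} = \begin{pmatrix} A - A_{0} & 0 \\ 0 & I \end{pmatrix},
    \end{equation*}
    ```

    so that the preconditioned saddle point matrix is self-adjoint in the H-inner product.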

    GMRES-Accelerated ADMM for Quadratic Objectives

    We consider the sequence acceleration problem for the alternating direction method of multipliers (ADMM) applied to a class of equality-constrained problems with strongly convex quadratic objectives, which frequently arise as the Newton subproblem of interior-point methods. Within this context, the ADMM update equations are linear, the iterates are confined within a Krylov subspace, and the Generalized Minimal RESidual (GMRES) algorithm is optimal in its ability to accelerate convergence. The basic ADMM method solves a $\kappa$-conditioned problem in $O(\sqrt{\kappa})$ iterations. We give theoretical justification and numerical evidence that the GMRES-accelerated variant consistently solves the same problem in $O(\kappa^{1/4})$ iterations for an order-of-magnitude reduction in iterations, despite a worst-case bound of $O(\sqrt{\kappa})$ iterations. The method is shown to be competitive against standard preconditioned Krylov subspace methods for saddle-point problems. The method is embedded within SeDuMi, a popular open-source solver for conic optimization written in MATLAB, and used to solve many large-scale semidefinite programs with error that decreases like $O(1/k^{2})$, instead of $O(1/k)$, where $k$ is the iteration index. Comment: 31 pages, 7 figures. Accepted for publication in SIAM Journal on Optimization (SIOPT).
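
    The central observation, that an affine fixed-point iteration can be handed to GMRES, fits in a few lines of Python; apply_M, the dimension n, and the use of SciPy are assumptions of this sketch, not the paper's SeDuMi implementation:

    ```python
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, gmres

    def gmres_accelerated_fixed_point(apply_M, c, n, tol=1e-10):
        """Accelerate the affine iteration x <- M x + c with GMRES.

        When the update is affine (quadratic objective, equality
        constraints), the fixed point satisfies (I - M) x = c, a linear
        system GMRES solves optimally over the same Krylov subspace that
        the plain iteration explores.
        """
        I_minus_M = LinearOperator((n, n), matvec=lambda v: v - apply_M(v))
        # `rtol` in recent SciPy; older releases name this argument `tol`.
        x, info = gmres(I_minus_M, c, rtol=tol)
        if info != 0:
            raise RuntimeError(f"GMRES did not converge (info={info})")
        return x
    ```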

    Preconditioning for Allen-Cahn variational inequalities with non-local constraints

    The solution of Allen-Cahn variational inequalities with mass constraints is of interest in many applications. This problem can be solved in both its scalar and vector-valued forms as a PDE-constrained optimization problem by means of a primal-dual active set method. At the heart of this method lies the solution of linear systems in saddle point form. In this paper we propose the use of Krylov subspace solvers and suitable preconditioners for the saddle point systems. Numerical results illustrate the competitiveness of this approach.
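
    A generic pattern for such solves (not the paper's specific preconditioner) is MINRES on the symmetric indefinite saddle point matrix with a symmetric positive definite block-diagonal preconditioner; in the sketch below the dense Schur complement and the exact solves stand in, purely for illustration, for the cheap approximations a practical code would use:

    ```python
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import LinearOperator, minres, spsolve

    def saddle_point_minres(A, B, f, g):
        """MINRES for [[A, B^T], [B, 0]] preconditioned by blkdiag(A, S).

        Assumes A is SPD and B has full row rank, so that the
        preconditioner is SPD as MINRES requires."""
        n, m = A.shape[0], B.shape[0]
        K = sp.bmat([[A, B.T], [B, None]], format="csr")
        rhs = np.concatenate([f, g])
        A_csc = sp.csc_matrix(A)
        S = B @ spsolve(A_csc, B.T.toarray())   # dense Schur complement: demo only
        def apply_Pinv(v):                      # action of blkdiag(A, S)^{-1}
            return np.concatenate([spsolve(A_csc, v[:n]),
                                   np.linalg.solve(S, v[n:])])
        P = LinearOperator((n + m, n + m), matvec=apply_Pinv)
        x, info = minres(K, rhs, M=P)
        return x[:n], x[n:], info
    ```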

    Fast Solvers for Cahn-Hilliard Inpainting

    We consider the efficient solution of the modified Cahn-Hilliard equation for binary image inpainting using convexity splitting, which allows an unconditionally gradient-stable time-discretization scheme. We consider both a double-well and a double obstacle potential. For the latter we obtain a nonlinear system, to which we apply a semi-smooth Newton method combined with a Moreau-Yosida regularization technique. At the heart of both methods lies the solution of large, sparse linear systems. We introduce and study block-triangular preconditioners using an efficient and easy-to-apply Schur complement approximation. Numerical results indicate that our preconditioners work very well for both problems and show that qualitatively better results can be obtained using the double obstacle potential.
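
    In schematic form (notation ours), convexity splitting writes the energy as a difference of two convex functionals and treats the first implicitly and the second explicitly, which is what yields the unconditional gradient stability:

    ```latex
    \begin{equation*}
      E(u) = E_{c}(u) - E_{e}(u), \qquad
      \frac{u^{k+1} - u^{k}}{\tau}
      = -\nabla E_{c}\bigl(u^{k+1}\bigr) + \nabla E_{e}\bigl(u^{k}\bigr),
    \end{equation*}
    ```

    with E_c and E_e convex and tau the time step; for Cahn-Hilliard the gradient is taken in the appropriate H^{-1}-type metric.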

    All-at-once solution of time-dependent PDE-constrained optimization problems

    Time-dependent partial differential equations (PDEs) play an important role in applied mathematics and many other areas of science. One-shot methods try to compute the solution to these problems in a single iteration that solves for all time-steps at the same time. In this paper, we look at one-shot approaches for the optimal control of time-dependent PDEs and focus on the fast solution of these problems. The use of Krylov subspace solvers together with an efficient preconditioner allows for minimal storage requirements. We solve only approximate time-evolutions for both the forward and the adjoint problem, and compute accurate solutions of a given control problem only at convergence of the overall Krylov subspace iteration. We show that our approach can give competitive results for a variety of problem formulations.
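
    The one-shot formulation typically couples all time steps into one large saddle point system; schematically (notation ours, following the usual presentation of such KKT systems), with state y, control u, adjoint p, regularization parameter beta, mass matrix M, space-time PDE operator K and control-coupling operator N:

    ```latex
    \begin{equation*}
      \begin{pmatrix}
        M & 0 & K^{T} \\
        0 & \beta M & -N^{T} \\
        K & -N & 0
      \end{pmatrix}
      \begin{pmatrix} y \\ u \\ p \end{pmatrix}
      =
      \begin{pmatrix} M y_{d} \\ 0 \\ d \end{pmatrix},
    \end{equation*}
    ```

    where y_d is the desired state; a preconditioned Krylov method applied to this coupled system is what allows the forward and adjoint time-evolutions to be treated only approximately until the outer iteration converges.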
