
    The Bramble-Pasciak preconditioner for saddle point problems

    The Bramble-Pasciak Conjugate Gradient method is a well-known tool for solving linear systems in saddle point form. A drawback of this method is that, to ensure the applicability of Conjugate Gradients, the preconditioner must be scaled, which typically involves the solution of an eigenvalue problem. Here, we introduce a modified preconditioner and inner product which, without scaling, enable the use of a MINRES variant and can be used with the simplified Lanczos process. Furthermore, the modified preconditioner and inner product can be combined with the original Bramble-Pasciak setup to give new preconditioners and inner products. We underpin the new methods with numerical experiments for Stokes problems.
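
    For orientation, a minimal sketch of the classical Bramble-Pasciak construction in generic block notation (the notation is assumed here, not taken from the abstract): for a saddle point matrix with symmetric positive definite (1,1) block A, one takes

        \mathcal{A} = \begin{bmatrix} A & B^T \\ B & 0 \end{bmatrix}, \qquad
        \mathcal{P} = \begin{bmatrix} A_0 & 0 \\ B & -I \end{bmatrix}, \qquad
        \mathcal{H} = \begin{bmatrix} A - A_0 & 0 \\ 0 & I \end{bmatrix},

    and the preconditioned matrix \mathcal{P}^{-1}\mathcal{A} is then self-adjoint and positive definite in the inner product induced by \mathcal{H}, provided A - A_0 is symmetric positive definite. It is this last requirement that forces the scaling of the preconditioner A_0 referred to in the abstract.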

    A Bramble-Pasciak-like method with applications in optimization

    Saddle-point systems arise in many application areas, in fact in any situation where an extremum principle with constraints arises. The Stokes problem describing slow viscous flow of an incompressible fluid is a classic example coming from partial differential equations, and in the area of Optimization such problems are ubiquitous. In this manuscript we show how new approaches for the solution of saddle-point systems arising in Optimization can be derived from the Bramble-Pasciak Conjugate Gradient approach widely used in PDEs and from more recent generalizations thereof. In particular, we derive a class of new solution methods based on the use of Preconditioned Conjugate Gradients in non-standard inner products and demonstrate how these can be understood through more standard machinery. We show connections to Constraint Preconditioning and give the results of numerical computations on a number of standard Optimization test examples.
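
    As a concrete illustration of how an extremum principle with constraints yields a saddle point system (a generic sketch, not the specific systems treated in the manuscript), consider the equality-constrained quadratic program

        \min_x \; \tfrac{1}{2} x^T A x - b^T x \quad \text{subject to} \quad B x = d,

    whose first-order optimality conditions are the saddle point (KKT) system

        \begin{bmatrix} A & B^T \\ B & 0 \end{bmatrix}
        \begin{bmatrix} x \\ \lambda \end{bmatrix} =
        \begin{bmatrix} b \\ d \end{bmatrix}.

    A constraint preconditioner, to which the abstract draws connections, keeps the constraint blocks exactly and replaces A by a cheaper approximation G:

        \mathcal{P}_C = \begin{bmatrix} G & B^T \\ B & 0 \end{bmatrix}.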

    Combination preconditioning of saddle point systems for positive definiteness

    Amongst recent contributions to preconditioning methods for saddle point systems, standard iterative methods in nonstandard inner products have been usefully employed. Krzyzanowski (Numer. Linear Algebra Appl. 2011; 18:123–140) identified a two-parameter family of preconditioners in this context and Stoll and Wathen (SIAM J. Matrix Anal. Appl. 2008; 30:582–608) introduced combination preconditioning, where two preconditioners, self-adjoint with respect to different inner products, can lead to further preconditioners and associated bilinear forms or inner products. Preconditioners that render the preconditioned saddle point matrix nonsymmetric but self-adjoint with respect to a nonstandard inner product always allow a MINRES-type method (W-PMINRES) to be applied in the relevant inner product. If the preconditioned matrix is also positive definite with respect to the inner product, a more efficient CG-like method (W-PCG) can be reliably used. We establish eigenvalue expressions for Krzyzanowski preconditioners and show that for a specific choice of parameters, although the Krzyzanowski preconditioned saddle point matrix is self-adjoint with respect to an inner product, it is never positive definite. We provide explicit expressions for the combination of certain preconditioners and prove the rather counterintuitive result that the combination of two specific preconditioners for which only W-PMINRES can be reliably used leads to a preconditioner for which, for certain parameter choices, W-PCG is reliably applicable. That is, combining two indefinite preconditioners can lead to a positive definite preconditioner. This combination preconditioner outperforms either of the two preconditioners from which it is formed for a number of test problems.
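
    One way to read the combination construction (a sketch in generic notation; the paper's precise formulation may differ): self-adjointness of \mathcal{P}_i^{-1}\mathcal{A} with respect to the inner product defined by a symmetric positive definite \mathcal{H}_i amounts to

        \mathcal{H}_i \mathcal{P}_i^{-1} \mathcal{A} = \mathcal{A} \mathcal{P}_i^{-T} \mathcal{H}_i, \qquad i = 1, 2,

    that is, W_i \mathcal{A} = \mathcal{A} W_i^T with W_i = \mathcal{H}_i \mathcal{P}_i^{-1}. This property is preserved under linear combinations W = \alpha W_1 + \beta W_2, and whenever such a W factors as W = \mathcal{H} \mathcal{P}^{-1} with \mathcal{H} symmetric positive definite one obtains a new preconditioner and inner product pair. This is the mechanism by which combining two preconditioners can change, and in the favourable cases described above improve, the definiteness of the preconditioned system.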

    A New Approximation of the Schur Complement in Preconditioners for PDE Constrained Optimization

    Saddle point systems arise widely in optimization problems with constraints. The utility of Schur complement approximation is now broadly appreciated in the context of solving such saddle point systems by iteration. In this short manuscript, we present a new Schur complement approximation for PDE constrained optimization, an important class of these problems. Block diagonal and block triangular preconditioners have previously been designed for use with MINRES and non-standard Conjugate Gradients, respectively, to solve such problems; with appropriate approximation blocks these can be optimal in the sense that the time required for solution scales linearly with the problem size, however small the mesh size used. In this paper, we extend this work by designing preconditioners for which this optimality property holds independently of both the mesh size and the Tikhonov regularization parameter \beta that is used. This also leads to an effective symmetric indefinite preconditioner that exhibits mesh and \beta-independence. We motivate the choice of these preconditioners based on observations about approximating the Schur complement obtained from the matrix system, derive eigenvalue bounds which verify the effectiveness of the approximation, and present numerical results which show that these new preconditioners work well in practice.
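
    For the commonly used distributed control model problem, a Schur complement approximation of this matching flavour is often written as follows (notation assumed here: M a mass matrix, K a stiffness matrix, \beta the regularization parameter; this sketch is for orientation and need not coincide exactly with the approximation proposed in the manuscript):

        S = K M^{-1} K^T + \tfrac{1}{\beta} M, \qquad
        \widehat{S} = \left(K + \tfrac{1}{\sqrt{\beta}} M\right) M^{-1} \left(K + \tfrac{1}{\sqrt{\beta}} M\right)^T.

    Expanding \widehat{S} shows that it differs from S only by the term \tfrac{1}{\sqrt{\beta}}(K + K^T), and for constructions of this type, when K + K^T is positive semidefinite, the eigenvalues of \widehat{S}^{-1} S can be bounded in [\tfrac{1}{2}, 1] independently of the mesh size and of \beta, which is the kind of robustness the abstract describes.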

    Fast iterative solvers for convection-diffusion control problems

    In this manuscript, we describe effective solvers for the optimal control of stabilized convection-diffusion problems. We employ the local projection stabilization, which we show to give the same matrix system whether the discretize-then-optimize or optimize-then-discretize approach is used for this problem. We then derive two effective preconditioners for this problem, the first to be used with MINRES and the second to be used with the Bramble-Pasciak Conjugate Gradient method. The key components of both preconditioners are an accurate mass matrix approximation, a good approximation of the Schur complement, and an appropriate multigrid process to enact this latter approximation. We present numerical results to demonstrate that these preconditioners result in convergence in a small number of iterations which is robust with respect to the mesh size h and the regularization parameter β, for a range of problems.
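
    To make the role of such preconditioners concrete, the following is a small, generic SciPy sketch of MINRES with a block diagonal preconditioner on a toy saddle point system. The matrices and the approximations A_hat and S_hat are illustrative stand-ins, not the discretized convection-diffusion control system or the preconditioner blocks of the manuscript.

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        # Toy symmetric saddle point system K = [[A, B^T], [B, 0]] (illustrative only).
        n, m = 200, 50
        A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")  # SPD 1D Laplacian
        B = sp.random(m, n, density=0.05, random_state=0, format="csr")
        K = sp.bmat([[A, B.T], [B, None]], format="csr")
        b = np.ones(n + m)

        # Block diagonal preconditioner P = diag(A_hat, S_hat): A_hat plays the role of the
        # (1,1)-block approximation and S_hat that of the Schur complement approximation;
        # both are applied here through sparse direct factorizations.
        A_solve = spla.factorized(A.tocsc())
        S_hat = (B @ sp.diags(1.0 / A.diagonal()) @ B.T + 1e-8 * sp.eye(m)).tocsc()
        S_solve = spla.factorized(S_hat)

        def apply_P(v):
            # Apply P^{-1} blockwise; MINRES requires the preconditioner to be positive definite.
            return np.concatenate([A_solve(v[:n]), S_solve(v[n:])])

        P = spla.LinearOperator((n + m, n + m), matvec=apply_P)
        x, info = spla.minres(K, b, M=P)
        print("converged" if info == 0 else f"minres returned info = {info}")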

    A Bramble-Pasciak conjugate gradient method for discrete Stokes equations with random viscosity

    We study the iterative solution of linear systems of equations arising from stochastic Galerkin finite element discretizations of saddle point problems. We focus on the Stokes model with random data parametrized by uniformly distributed random variables and discuss well-posedness of the variational formulations. We introduce a Bramble-Pasciak conjugate gradient method as a linear solver. It builds on a non-standard inner product associated with a block triangular preconditioner. The block triangular structure enables more sophisticated preconditioners than the block diagonal structure usually applied in MINRES methods. We show how the existence requirements of a conjugate gradient method can be met in our setting. We analyze the performance of the solvers depending on relevant physical and numerical parameters by means of eigenvalue estimates. For this purpose, we derive bounds for the eigenvalues of the relevant preconditioned sub-matrices. We illustrate our findings using the flow in a driven cavity as a numerical test case, where the viscosity is given by a truncated Karhunen-Loève expansion of a random field. In this example, a Bramble-Pasciak conjugate gradient method with block triangular preconditioner outperforms a MINRES method with block diagonal preconditioner in terms of iteration numbers.
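
    As a companion to the description of the non-standard inner product, the following self-contained sketch runs the classical Bramble-Pasciak CG iteration on a small dense toy saddle point system. It is not the paper's stochastic Galerkin Stokes solver; the matrices, the choice A0 = 1.5 I and the tolerances are illustrative assumptions only.

        import numpy as np

        # Toy saddle point system K = [[A, B^T], [B, 0]] with A symmetric positive definite.
        rng = np.random.default_rng(1)
        n, m = 120, 40
        A = (np.diag(4.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
             + np.diag(np.ones(n - 1), -1))
        B = rng.standard_normal((m, n)) / np.sqrt(n)
        K = np.block([[A, B.T], [B, np.zeros((m, m))]])
        b = np.ones(n + m)

        # Bramble-Pasciak ingredients: A0 approximates A and is scaled so that A - A0 is
        # positive definite; P is block lower triangular; H defines the non-standard inner
        # product in which P^{-1} K is self-adjoint and positive definite.
        A0 = 1.5 * np.eye(n)
        H = np.block([[A - A0, np.zeros((n, m))],
                      [np.zeros((m, n)), np.eye(m)]])

        def apply_Pinv(v):
            # Solve P y = v with P = [[A0, 0], [B, -I]] by forward block substitution.
            y1 = np.linalg.solve(A0, v[:n])
            y2 = B @ y1 - v[n:]
            return np.concatenate([y1, y2])

        def bpcg(b, tol=1e-8, maxiter=500):
            # Conjugate gradients applied to M = P^{-1} K, with all inner products taken
            # in the H-inner product (u, v) -> u^T H v.
            x = np.zeros_like(b)
            r = apply_Pinv(b - K @ x)
            p = r.copy()
            rHr = r @ (H @ r)
            for k in range(maxiter):
                Mp = apply_Pinv(K @ p)
                alpha = rHr / (p @ (H @ Mp))
                x += alpha * p
                r -= alpha * Mp
                rHr_new = r @ (H @ r)
                if np.sqrt(abs(rHr_new)) < tol:
                    return x, k + 1
                p = r + (rHr_new / rHr) * p
                rHr = rHr_new
            return x, maxiter

        x, its = bpcg(b)
        print(its, np.linalg.norm(K @ x - b))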

    Preconditioners for state constrained optimal control problems with Moreau-Yosida penalty function

    Optimal control problems with partial differential equations play an important role in many applications. The inclusion of bound constraints for the state poses a significant challenge for optimization methods. Our focus here is on the incorporation of the constraints via the Moreau-Yosida regularization technique. This method has been studied recently and has proven to be advantageous compared to other approaches. In this paper we develop preconditioners for the efficient solution of the Newton steps associated with the fast solution of the Moreau-Yosida regularized problem. Numerical results illustrate the competitiveness of this approach.
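
    For readers unfamiliar with the technique, the Moreau-Yosida regularization of a pointwise state constraint y \le \bar{y} typically replaces the constraint by a penalty term in the objective (generic notation; the manuscript's formulation may differ in details):

        \min_{y,u} \; \tfrac{1}{2}\|y - y_d\|_{L^2}^2 + \tfrac{\beta}{2}\|u\|_{L^2}^2
        + \tfrac{1}{2\varepsilon}\|\max(0,\, y - \bar{y})\|_{L^2}^2
        \quad \text{subject to a PDE linking } y \text{ and } u,

    where the max is taken pointwise and \varepsilon > 0 is the penalty parameter. The semi-smooth max term is what gives rise to the Newton steps whose linear systems the preconditioners above are designed to solve efficiently.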

    Preconditioning for active set and projected gradient methods as semi-smooth Newton methods for PDE-constrained optimization with control constraints

    Optimal control problems with partial differential equations play an important role in many applications. The inclusion of bound constraints for the control poses a significant additional challenge for optimization methods. In this paper we propose preconditioners for the saddle point problems that arise when a primal-dual active set method is used. We also show for this method that the same saddle point system can be derived when the method is considered as a semi-smooth Newton method. In addition, the projected gradient method can be employed to solve optimization problems with simple bounds, and we discuss the efficient solution of the linear systems in question. When an acceleration technique is employed for the projected gradient method, this again yields a semi-smooth Newton method that is equivalent to the primal-dual active set method. Numerical results illustrate the competitiveness of this approach.
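
    As a brief illustration of the active set mechanics referred to above, the following sketch implements the standard primal-dual active set update for bound constraints a \le u \le b (hypothetical variable names; not code from the paper):

        import numpy as np

        def active_sets(u, lam, a, b, c=1.0):
            # Predict the active sets from the current control u and multiplier lam;
            # c > 0 is the constant appearing in the complementarity reformulation.
            upper = lam + c * (u - b) > 0       # active at the upper bound
            lower = lam + c * (u - a) < 0       # active at the lower bound
            inactive = ~(upper | lower)
            return upper, lower, inactive

        # On the active sets the control is fixed to the bound; on the inactive set the
        # multiplier is set to zero and a reduced saddle point system is solved for the
        # remaining unknowns; these are the systems the proposed preconditioners target.
        rng = np.random.default_rng(0)
        u, lam = rng.standard_normal(8), rng.standard_normal(8)
        a, b = -0.5 * np.ones(8), 0.5 * np.ones(8)
        up, lo, inact = active_sets(u, lam, a, b)
        print(up.sum(), lo.sum(), inact.sum())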

    Preconditioners for state constrained optimal control problems with Moreau-Yosida penalty function

    Optimal control problems with partial differential equations as constraints play an important role in many applications. The inclusion of bound constraints for the state variable poses a significant challenge for optimization methods. Our focus here is on the incorporation of the constraints via the Moreau-Yosida regularization technique. This method has been studied recently and has proven to be advantageous compared to other approaches. In this paper we develop robust preconditioners for the efficient solution of the Newton steps associated with solving the Moreau-Yosida regularized problem. Numerical results illustrate the efficiency of our approach.

    Preconditioning for Allen-Cahn variational inequalities with non-local constraints

    The solution of Allen-Cahn variational inequalities with mass constraints is of interest in many applications. This problem can be solved both in its scalar and vector-valued form as a PDE-constrained optimization problem by means of a primal-dual active set method. At the heart of this method lies the solution of linear systems in saddle point form. In this paper we propose the use of Krylov-subspace solvers and suitable preconditioners for the saddle point systems. Numerical results illustrate the competitiveness of this approach.