
    GMRES-Accelerated ADMM for Quadratic Objectives

    We consider the sequence acceleration problem for the alternating direction method of multipliers (ADMM) applied to a class of equality-constrained problems with strongly convex quadratic objectives, which frequently arise as the Newton subproblem of interior-point methods. Within this context, the ADMM update equations are linear, the iterates are confined to a Krylov subspace, and the Generalized Minimal RESidual (GMRES) algorithm is optimal in its ability to accelerate convergence. The basic ADMM method solves a κ-conditioned problem in O(√κ) iterations. We give theoretical justification and numerical evidence that the GMRES-accelerated variant consistently solves the same problem in O(κ^{1/4}) iterations, an order-of-magnitude reduction, despite a worst-case bound of O(√κ) iterations. The method is shown to be competitive against standard preconditioned Krylov subspace methods for saddle-point problems. The method is embedded within SeDuMi, a popular open-source solver for conic optimization written in MATLAB, and used to solve many large-scale semidefinite programs with error that decreases like O(1/k²) instead of O(1/k), where k is the iteration index.
    Comment: 31 pages, 7 figures. Accepted for publication in SIAM Journal on Optimization (SIOPT).
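A small sketch of the core observation in this abstract: when the ADMM update for a quadratic objective is linear, each iteration has the form z_{k+1} = M z_k + b, so the fixed point solves (I − M)z = b, which GMRES can attack directly over the same Krylov subspace. This is an illustrative toy (random contractive M, not the paper's actual ADMM operator or test problems):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(0)
n = 50
# A contractive iteration matrix M (spectral norm 0.9), chosen arbitrarily.
Q = rng.standard_normal((n, n))
M = 0.9 * Q / np.linalg.norm(Q, 2)
b = rng.standard_normal(n)
z_star = np.linalg.solve(np.eye(n) - M, b)   # exact fixed point

# Plain fixed-point iteration (the role played by basic ADMM here).
z = np.zeros(n)
for _ in range(200):
    z = M @ z + b
fp_err = np.linalg.norm(z - z_star)

# GMRES on (I - M) z = b, using only matrix-vector products with M.
op = LinearOperator((n, n), matvec=lambda v: v - M @ v, dtype=float)
z_gmres, info = gmres(op, b)
gmres_err = np.linalg.norm(z_gmres - z_star)
print(info, fp_err, gmres_err)
```

In exact arithmetic GMRES minimizes the residual over the Krylov subspace that the fixed-point iterates span, so it can never need more matrix-vector products than the plain iteration to reach a given residual.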

    An alternating positive semidefinite splitting preconditioner for the three-by-three block saddle point problems

    Using the idea of the dimensional splitting method, we present an iteration method for solving three-by-three block saddle point problems, which arise in linear programming and in finite element discretization of Maxwell's equations. We prove that the method is unconditionally convergent. The induced preconditioner is then used to accelerate the convergence of the GMRES method for solving the system. Numerical results are presented to compare the performance of the method with some existing ones.
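For concreteness, here is a generic sketch of preconditioned GMRES on a three-by-three block saddle point system of the kind the abstract describes, using a simple block-diagonal Schur-complement preconditioner. This is not the paper's alternating positive semidefinite splitting preconditioner; the sizes and data are made up for illustration:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

# Three-by-three block saddle point system
#     [ A   B^T  0  ] [x]   [f]
#     [ B   0    C^T] [y] = [g]
#     [ 0   C    0  ] [z]   [h]
# with A symmetric positive definite and B, C of full row rank.
rng = np.random.default_rng(1)
n, m, p = 30, 20, 10
A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)  # SPD
B = rng.standard_normal((m, n))
C = rng.standard_normal((p, m))

K = np.block([
    [A,                B.T,              np.zeros((n, p))],
    [B,                np.zeros((m, m)), C.T             ],
    [np.zeros((p, n)), C,                np.zeros((p, p))],
])
rhs = rng.standard_normal(n + m + p)

S1 = B @ np.linalg.solve(A, B.T)     # first Schur complement
S2 = C @ np.linalg.solve(S1, C.T)    # second Schur complement

def apply_prec(r):
    # Solve with blkdiag(A, S1, S2); cheap in practice when the
    # Schur complements are replaced by approximations.
    return np.concatenate([
        np.linalg.solve(A, r[:n]),
        np.linalg.solve(S1, r[n:n + m]),
        np.linalg.solve(S2, r[n + m:]),
    ])

M = LinearOperator((n + m + p, n + m + p), matvec=apply_prec, dtype=float)
x, info = gmres(K, rhs, M=M)
res = np.linalg.norm(K @ x - rhs) / np.linalg.norm(rhs)
print(info, res)
```

In a realistic setting the exact Schur complements would be replaced by cheap approximations; the exact versions are used here only to keep the sketch short.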

    When is the Hermitian/skew-Hermitian part of a matrix a potent matrix?

    This paper deals with the Hermitian part H(A) and skew-Hermitian part S(A) of a complex matrix A. We characterize all complex matrices A such that H(A), respectively S(A), is a potent matrix. Two approaches are used: characterizations of idempotent and tripotent Hermitian matrices of the form [X Y*; Y 0], and a singular value decomposition of A. In addition, the relation between the potency of H(A), respectively S(A), and the normality of A is also studied.
    Supported by the Ministry of Science, Education and Sports of the Republic of Croatia (Grant 037-0372784-2757); the Ministry of Education of Spain (Grant DGI MTM2010-18228); the Universidad Nacional de La Pampa, Argentina (Grant Resol. N. 049/11); and the Ministry of Education of Argentina (PPUA, Grant Resol. 228, SPU, 14-15-222).
    Ilisevic, D.; Thome, N. (2012). When is the Hermitian/skew-Hermitian part of a matrix a potent matrix? Electronic Journal of Linear Algebra, 24:95-112. https://doi.org/10.13001/1081-3810.1582
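A short numerical illustration of the objects studied here: the Hermitian part H(A) = (A + A*)/2, the skew-Hermitian part S(A) = (A − A*)/2, and a potency check X^k = X. The example matrices are made up; idempotent (k = 2) and tripotent (k = 3) Hermitian matrices are exactly the cases the paper's characterizations build on:

```python
import numpy as np

def hermitian_part(A):
    return (A + A.conj().T) / 2

def skew_hermitian_part(A):
    return (A - A.conj().T) / 2

def is_potent(X, k):
    # X is k-potent if X^k = X.
    return np.allclose(np.linalg.matrix_power(X, k), X)

# Every complex matrix decomposes as A = H(A) + S(A).
A = np.array([[1.0 + 2.0j, 3.0], [1.0, -1.0j]])
H, S = hermitian_part(A), skew_hermitian_part(A)
decomposes = np.allclose(H + S, A)

# An idempotent Hermitian matrix (an orthogonal projector) is 2-potent;
# a Hermitian matrix with eigenvalues in {-1, 0, 1} is tripotent.
P = np.array([[0.5, 0.5], [0.5, 0.5]])   # projector: P @ P == P
T = np.diag([1.0, -1.0, 0.0])            # T @ T @ T == T
print(decomposes, is_potent(P, 2), is_potent(T, 3))
```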

    Preconditioners for Krylov subspace methods: An overview

    When simulating a mechanism from science or engineering, or an industrial process, one is frequently required to construct a mathematical model and then solve this model numerically. If accurate numerical solutions are necessary or desirable, this can involve solving large-scale systems of equations. One major class of solution methods is that of preconditioned iterative methods, involving preconditioners which are computationally cheap to apply while also capturing information contained in the linear system. In this article, we give a short survey of the field of preconditioning. We introduce a range of preconditioners for partial differential equations, followed by optimization problems, before discussing preconditioners constructed with less standard objectives in mind.
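The idea of "cheap to apply while capturing information in the system" can be seen in miniature with an incomplete LU preconditioner for GMRES on a discretized PDE. The problem (a 5-point finite-difference Laplacian) and the ILU parameters below are illustrative choices, not taken from the survey:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spilu, LinearOperator, gmres

nx = 30
# Standard 5-point Laplacian on an nx-by-nx grid.
I = sp.identity(nx)
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(nx, nx))
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()
b = np.ones(nx * nx)

# Incomplete LU: an approximate factorization that is cheap to apply.
ilu = spilu(A, drop_tol=1e-4, fill_factor=10)
M = LinearOperator(A.shape, matvec=ilu.solve, dtype=float)

iters = {"plain": 0, "ilu": 0}
def counter(key):
    def cb(_):
        iters[key] += 1
    return cb

x0, info0 = gmres(A, b, callback=counter("plain"), callback_type="pr_norm")
x1, info1 = gmres(A, b, M=M, callback=counter("ilu"), callback_type="pr_norm")
res1 = np.linalg.norm(A @ x1 - b) / np.linalg.norm(b)
print(iters, info1, res1)
```

The preconditioned run needs far fewer iterations than the unpreconditioned one, at the cost of one sparse factorization up front and one triangular solve per iteration.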

    Natural preconditioners for saddle point systems

    The solution of quadratic or locally quadratic extremum problems subject to linear(ized) constraints gives rise to linear systems in saddle point form. This is true whether in the continuous or the discrete setting, so saddle point systems arising from discretization of partial differential equation problems, such as those describing electromagnetic problems or incompressible flow, lead to equations with this structure, as does, for example, the widely used sequential quadratic programming approach to nonlinear optimization.
    This article concerns iterative solution methods for these problems and in particular shows how the problem formulation leads to natural preconditioners which guarantee rapid convergence of the relevant iterative methods. These preconditioners are related to the original extremum problem and their effectiveness -- in terms of rapidity of convergence -- is established here via a proof of general bounds on the eigenvalues of the preconditioned saddle point matrix on which iteration convergence depends.
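A numerical check of the classical eigenvalue result behind such "natural" block preconditioners (due to Murphy, Golub and Wathen): for K = [[A, B^T], [B, 0]] with A symmetric positive definite and B of full row rank, preconditioning with P = blkdiag(A, B A^{-1} B^T) leaves P^{-1}K with only the three eigenvalues 1 and (1 ± √5)/2, so a Krylov method converges in at most three iterations. The data below are random and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 40, 15
A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)  # SPD
B = rng.standard_normal((m, n))                               # full row rank
K = np.block([[A, B.T], [B, np.zeros((m, m))]])

S = B @ np.linalg.solve(A, B.T)   # Schur complement
P = np.block([[A, np.zeros((n, m))], [np.zeros((m, n)), S]])

eigs = np.linalg.eigvals(np.linalg.solve(P, K))
golden = (1 + np.sqrt(5)) / 2
expected = np.array([1.0, golden, 1.0 - golden])

# Every eigenvalue of P^{-1} K lies (numerically) in the expected set;
# the eigenvalue 1 appears with multiplicity n - m.
dist = np.min(np.abs(eigs[:, None] - expected[None, :]), axis=1)
mult_one = int(np.sum(np.abs(eigs - 1.0) < 1e-6))
print(dist.max(), mult_one)
```

In practice the exact Schur complement S is replaced by a spectrally equivalent approximation, which turns the three exact eigenvalues into three tight clusters while keeping the fast convergence the abstract refers to.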