7 research outputs found

    The antitriangular factorisation of saddle point matrices

    Mastronardi and Van Dooren [SIAM J. Matrix Anal. Appl., 34 (2013), pp. 173–196] recently introduced the block antitriangular ("Batman") decomposition for symmetric indefinite matrices. Here we show the simplification of this factorization for saddle point matrices and demonstrate how it represents the common nullspace method. We show that rank-1 updates to the saddle point matrix can be easily incorporated into the factorization and give bounds on the eigenvalues of matrices important in saddle point theory. We show the relation of this factorization to constraint preconditioning and how it transforms but preserves the structure of block diagonal and block triangular preconditioners.
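
    A minimal NumPy sketch of the idea (the block ordering and signs are one possible convention, not necessarily the paper's exact one): for a saddle point matrix M = [[A, B^T], [B, 0]] with B of full row rank, a QR factorization of B^T supplies orthonormal bases Y of range(B^T) and Z of null(B), and the orthogonal matrix built from them brings M to block antitriangular form, with the nullspace-method matrix Z^T A Z appearing as the middle antidiagonal block.

        import numpy as np

        rng = np.random.default_rng(0)
        n, m = 6, 3
        A = rng.standard_normal((n, n)); A = A + A.T              # symmetric (1,1) block
        B = rng.standard_normal((m, n))                           # full row rank with probability 1
        M = np.block([[A, B.T], [B, np.zeros((m, m))]])           # saddle point matrix

        # QR of B^T: Y spans range(B^T), Z spans null(B), so B @ Z = 0
        Qfull, Rfull = np.linalg.qr(B.T, mode="complete")
        Y, Z, R = Qfull[:, :m], Qfull[:, m:], Rfull[:m, :]

        # Orthogonal matrix assembled from the two bases (one possible block ordering)
        Q = np.block([[np.zeros((n, m)), Z, Y],
                      [np.eye(m), np.zeros((m, n))]])

        MA = Q.T @ M @ Q                     # block antitriangular ("Batman") form of M
        print(np.allclose(MA[:m, :n], 0))    # blocks above the antidiagonal vanish
        print(np.allclose(MA[m:n, :m], 0))
        print(np.allclose(MA[m:n, m:n], Z.T @ A @ Z))   # nullspace-method matrix on the antidiagonal
        print(np.allclose(Q @ MA @ Q.T, M))  # M = Q * MA * Q^T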

    The antitriangular factorisation of saddle point matrices

    Mastronardi and Van Dooren recently introduced the block antitriangular ("Batman") decomposition for symmetric indefinite matrices. Here we show the simplification of this factorisation for saddle point matrices and demonstrate how it represents the common nullspace method. We show the relation of this factorisation to constraint preconditioning and how it transforms but preserves the block diagonal structure of block diagonal preconditioning.
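
    A minimal NumPy check of the last claim, reusing the orthogonal transform from the sketch above together with a generic block diagonal preconditioner blkdiag(A, S) with S = B A^{-1} B^T; this particular choice of S is purely illustrative and not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        n, m = 6, 3
        A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)   # SPD (1,1) block
        B = rng.standard_normal((m, n))

        # Same orthogonal transform as above, built from a QR of B^T
        Qfull, _ = np.linalg.qr(B.T, mode="complete")
        Y, Z = Qfull[:, :m], Qfull[:, m:]
        Q = np.block([[np.zeros((n, m)), Z, Y],
                      [np.eye(m), np.zeros((m, n))]])

        # Block diagonal preconditioner blkdiag(A, S), S = B A^{-1} B^T (illustrative choice)
        S = B @ np.linalg.solve(A, B.T)
        P = np.block([[A, np.zeros((n, m))], [np.zeros((m, n)), S]])

        PQ = Q.T @ P @ Q
        # The transformed preconditioner is still block diagonal: the leading m rows/columns
        # (now carrying S) decouple from the trailing n rows/columns (carrying [Z Y]^T A [Z Y]).
        print(np.allclose(PQ[:m, m:], 0), np.allclose(PQ[m:, :m], 0))
        print(np.allclose(PQ[:m, :m], S))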

    A comparative study of null-space factorizations for sparse symmetric saddle point systems

    Null-space methods for solving saddle point systems of equations have long been used to transform an indefinite system into a symmetric positive definite one of smaller dimension. A number of independent works in the literature have identified that we can interpret a null-space method as a matrix factorization. We review these findings, highlight links between them, and bring them into a unified framework. We also investigate the suitability of using null-space factorizations to derive sparse direct methods, and present numerical results for both practical and academic problems.
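
    A minimal NumPy sketch of the generic null-space method discussed above (the orthogonal null-space basis from a QR of B^T is just one of the choices such factorizations can use): solve a smaller positive definite system on the nullspace of B, then recover the multipliers.

        import numpy as np

        rng = np.random.default_rng(2)
        n, m = 8, 3
        A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)   # symmetric positive definite
        B = rng.standard_normal((m, n))                                 # full row rank
        f, g = rng.standard_normal(n), rng.standard_normal(m)

        # Null-space method for [[A, B^T], [B, 0]] [x; y] = [f; g]
        Qfull, Rfull = np.linalg.qr(B.T, mode="complete")
        Y, Z, R = Qfull[:, :m], Qfull[:, m:], Rfull[:m, :]

        x_p = Y @ np.linalg.solve(R.T, g)                 # particular solution: B x_p = g
        Azz = Z.T @ A @ Z                                 # reduced (n-m) x (n-m) SPD matrix
        v = np.linalg.solve(Azz, Z.T @ (f - A @ x_p))     # solve on the nullspace of B
        x = x_p + Z @ v
        y = np.linalg.solve(R, Y.T @ (f - A @ x))         # recover multipliers from B^T y = f - A x

        # Check against a direct solve of the full indefinite system
        M = np.block([[A, B.T], [B, np.zeros((m, m))]])
        sol = np.linalg.solve(M, np.concatenate([f, g]))
        print(np.allclose(np.concatenate([x, y]), sol))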

    "Batman" decomposition of a symmetric indefinite matrix

    We first recapitulate some basic concepts, in particular eigenvalues of symmetric (in general indefinite) matrices and quadratic forms. We then focus on selected subspaces related to symmetric matrices: the so-called null space, neutral subspace, and the nonnegative and nonpositive subspaces. In the thesis we show that not all of these subspaces are uniquely determined in general, but that they can always be chosen so that, for example, the first three are nested. We use this to choose suitable orthonormal bases of these subspaces and of their orthogonal complements. Finally, we show that the orthogonal matrix having these basis vectors as columns (after small modifications) transforms the original symmetric matrix into the so-called lower block antitriangular form. We call the corresponding decomposition the Batman decomposition.
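
    A small NumPy illustration of this nesting (one concrete construction from the eigendecomposition, not the procedure of the thesis): pairing positive and negative eigenvectors, each scaled by the square root of its eigenvalue magnitude, extends the null space to a subspace on which the quadratic form vanishes identically, i.e. a neutral subspace containing the null space.

        import numpy as np

        rng = np.random.default_rng(3)
        # A symmetric indefinite matrix with a nontrivial null space
        D = np.diag([3.0, 1.0, 0.0, 0.0, -2.0, -5.0])
        U, _ = np.linalg.qr(rng.standard_normal((6, 6)))
        M = U @ D @ U.T

        lam, V = np.linalg.eigh(M)
        tol = 1e-10
        null = V[:, np.abs(lam) < tol]             # orthonormal basis of the null space
        pos = V[:, lam > tol]
        neg = V[:, lam < -tol]

        # Pair positive and negative eigenvectors, scaled so the quadratic form cancels:
        # w = v_+ / sqrt(lam_+) + v_- / sqrt(-lam_-) satisfies w^T M w = 0.
        k = min(pos.shape[1], neg.shape[1])
        pairs = [pos[:, i] / np.sqrt(lam[lam > tol][i]) + neg[:, i] / np.sqrt(-lam[lam < -tol][i])
                 for i in range(k)]
        N = np.column_stack([null] + pairs)        # null space nested inside a neutral subspace

        print(np.allclose(N.T @ M @ N, 0))         # the quadratic form vanishes on span(N)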

    Spectral properties of kernel matrices in the flat limit

    Kernel matrices are of central importance to many applied fields. In this manuscript, we focus on spectral properties of kernel matrices in the so-called "flat limit", which occurs when points are close together relative to the scale of the kernel. We establish asymptotic expressions for the determinants of the kernel matrices, which we then leverage to obtain asymptotic expressions for the main terms of the eigenvalues. Analyticity of the eigenprojectors yields expressions for limiting eigenvectors, which are strongly tied to discrete orthogonal polynomials. Both smooth and finitely smooth kernels are covered, with stronger results available in the finite smoothness case. Comment: 40 pages, 8 page
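
    A small numerical illustration of the regime studied (a generic Gaussian-kernel experiment, not the paper's asymptotic expansions): as the inverse length-scale eps shrinks, the eigenvalues of the kernel matrix separate into groups that vanish at different powers of eps.

        import numpy as np

        rng = np.random.default_rng(4)
        x = rng.uniform(size=8)                          # fixed, distinct points in 1D
        D2 = (x[:, None] - x[None, :]) ** 2

        for eps in [1.0, 0.3, 0.1]:
            K = np.exp(-(eps ** 2) * D2)                 # Gaussian kernel; flat as eps -> 0
            ev = np.sort(np.linalg.eigvalsh(K))[::-1]
            print(f"eps = {eps:4.1f}:", ev)
        # As eps decreases, the leading eigenvalue approaches n while the k-th
        # eigenvalue shrinks roughly like eps^(2(k-1)) for a smooth kernel on 1D
        # points, so the kernel matrix becomes severely ill-conditioned.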
