18 research outputs found

    Multilevel quasiseparable matrices in PDE-constrained optimization

    Optimization problems with constraints in the form of a partial differential equation arise frequently in engineering design. The discretization of PDE-constrained optimization problems results in large-scale linear systems of saddle-point type. In this paper we propose and develop a novel approach to solving such systems by exploiting so-called quasiseparable matrices. One may think of a quasiseparable matrix as a discrete analog of the Green's function of a one-dimensional differential operator. A nice feature of such matrices is that almost every algorithm that employs them has linear complexity. We extend the application of quasiseparable matrices to problems in higher dimensions. Namely, we construct a class of preconditioners that can be computed and applied at linear computational cost. Their use with appropriate Krylov methods leads to algorithms of nearly linear complexity.
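
The Green's-function analogy in this abstract can be illustrated numerically: the inverse of an irreducible tridiagonal matrix (the discrete one-dimensional Laplacian below) is quasiseparable, meaning every submatrix drawn strictly from one side of the diagonal has rank at most 1. This is a small NumPy sketch of that well-known property, not code from the paper.

```python
import numpy as np

# Discrete 1-D Laplacian with Dirichlet boundary conditions.
n = 8
T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
Tinv = np.linalg.inv(T)

# Quasiseparable check: every off-diagonal block of T^{-1}
# (rows 0..k-1 against columns k..n-1) has rank at most 1,
# mirroring the separable form u(min(x,y)) v(max(x,y)) of a
# 1-D Green's function.
for k in range(1, n):
    block = Tinv[:k, k:]
    assert np.linalg.matrix_rank(block, tol=1e-10) <= 1
print("all off-diagonal blocks of T^{-1} have rank <= 1")
```

It is this low-rank off-diagonal structure that algorithms for quasiseparable matrices exploit to reach linear complexity.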

    Groups of banded matrices with banded inverses

    A product A = F_1 ... F_N of invertible block-diagonal matrices will be banded with a banded inverse. We establish this factorization with the number N controlled by the bandwidths w and not by the matrix size n. When A is an orthogonal matrix, or a permutation, or banded plus finite rank, the factors F_i have w = 1 and generate the corresponding group. In the case of infinite matrices, conjectures remain open.
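
The basic mechanism behind this result can be sketched numerically: a product of block-diagonal matrices with offset 2x2 blocks is banded, and its inverse (the reversed product of the block-diagonal inverses) is banded too, with bandwidth bounded by the factors rather than by the matrix size. This is an illustrative construction, not the paper's factorization algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10  # the bandwidth bounds below do not grow with n

def block_diag_2x2(offset):
    """Identity with invertible 2x2 blocks placed along the diagonal,
    shifted down by `offset`; such a matrix has bandwidth w = 1."""
    F = np.eye(n)
    for i in range(offset, n - 1, 2):
        F[i:i+2, i:i+2] = rng.normal(size=(2, 2)) + 3 * np.eye(2)
    return F

def bandwidth(M, tol=1e-10):
    rows, cols = np.nonzero(np.abs(M) > tol)
    return int(np.max(np.abs(rows - cols)))

F1, F2 = block_diag_2x2(0), block_diag_2x2(1)
A = F1 @ F2
# A = F1 F2 and A^{-1} = F2^{-1} F1^{-1} are both banded, since each
# block-diagonal factor (and its inverse) has bandwidth 1.
assert bandwidth(A) <= 2
assert bandwidth(np.linalg.inv(A)) <= 2
```

Increasing n leaves both bandwidths unchanged, which is the point of having N controlled by w rather than by the matrix size.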

    Lecture 09: Hierarchically Low Rank and Kronecker Methods

    Exploiting structures of matrices goes beyond identifying their non-zero patterns. In many cases, dense full-rank matrices have low-rank submatrices that can be exploited to construct fast approximate algorithms. In other cases, dense matrices can be decomposed into Kronecker factors that are much smaller than the original matrix. Sparsity is a consequence of the connectivity of the underlying geometry (mesh, graph, interaction list, etc.), whereas the rank-deficiency of submatrices is closely related to the distance within this underlying geometry. For the high-dimensional geometry encountered in data science applications, the curse of dimensionality poses a challenge for rank-structured approaches. On the other hand, models in data science that are formulated as a composition of functions lead to a Kronecker product structure that yields a different kind of fast algorithm. In this lecture, we will look at some examples of when rank structure and Kronecker structure can be useful.
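
The kind of fast algorithm the Kronecker structure enables can be shown in a few lines: the identity (A ⊗ B) vec(X) = vec(A X Bᵀ) applies a Kronecker-structured operator using only the small factors, never forming the full product. A brief NumPy sketch (using NumPy's row-major vec convention):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 20, 30
A = rng.normal(size=(m, m))
B = rng.normal(size=(n, n))
x = rng.normal(size=m * n)

# Naive: form the mn-by-mn Kronecker product explicitly, O(m^2 n^2) storage.
slow = np.kron(A, B) @ x

# Fast: reshape x to an m-by-n matrix and use two small multiplications,
# O(m n (m + n)) work with only the factors A and B in memory.
fast = (A @ x.reshape(m, n) @ B.T).ravel()

assert np.allclose(slow, fast)
```

The same reshaping trick underlies fast matrix-vector products in Kronecker-structured models generally.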

    Inverses of regular Hessenberg matrices

    A new proof of the general representation for the entries of the inverse of any unreduced Hessenberg matrix of finite order is found. This formulation is also extended to the inverses of reduced Hessenberg matrices. Those entries are given by proper Hessenbergians from the original matrix. This justifies both the use of linear recurrences for such computations and some elementary properties of the inverse matrix. As an application of current interest in the theory of orthogonal polynomials on the complex plane, the resolvent matrix associated to a finite Hessenberg matrix in standard form is calculated. The results are illustrated with two examples on the unit disk.
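
A classical fact behind such entrywise representations (due to Ikebe) is that the lower triangular part of the inverse of an unreduced upper Hessenberg matrix agrees with a rank-one matrix, which is what makes linear recurrences sufficient. A numerical check of that fact, not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 7
# Unreduced upper Hessenberg: zero below the first subdiagonal,
# generically nonzero on the subdiagonal itself.
H = np.triu(rng.normal(size=(n, n)) + 2 * np.eye(n), k=-1)
assert np.all(np.abs(np.diag(H, -1)) > 0)

Hinv = np.linalg.inv(H)
# If the on/below-diagonal entries of H^{-1} form a rank-one pattern
# x_i y_j, every 2x2 minor drawn entirely from that region vanishes.
scale = max(np.abs(Hinv).max() ** 2, 1.0)
for i in range(n - 1):
    for j in range(i):  # all four entries lie on or below the diagonal
        minor = Hinv[i, j] * Hinv[i+1, j+1] - Hinv[i, j+1] * Hinv[i+1, j]
        assert abs(minor) < 1e-8 * scale
```
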

    Rank and inertia of submatrices of the Moore-Penrose inverse of a Hermitian matrix
