
    Hierarchical Schur complement preconditioner for the stochastic Galerkin finite element methods

    Use of the stochastic Galerkin finite element methods leads to large systems of linear equations obtained by the discretization of tensor product solution spaces along their spatial and stochastic dimensions. These systems are typically solved iteratively by a Krylov subspace method. We propose a preconditioner which takes advantage of the recursive hierarchy in the structure of the global matrices. In particular, the matrices possess a recursive hierarchical two-by-two structure, with one of the submatrices block diagonal. Each of the diagonal blocks in this submatrix is closely related to the deterministic mean-value problem, and the action of its inverse is approximated in the implementation by inner loops of Krylov iterations. Thus our hierarchical Schur complement preconditioner combines, on each level of the approximation of the hierarchical structure of the global matrix, the idea of the Schur complement with loops of mutually independent inner Krylov iterations and several matrix-vector multiplications for the off-diagonal blocks. Neither the global matrix nor the matrix of the preconditioner needs to be formed explicitly. The ingredients include only a number of stiffness matrices from the truncated Karhunen-Loève expansion and a good preconditioner for the mean-value deterministic problem. We provide a condition number bound for a model elliptic problem, and the performance of the method is illustrated by numerical experiments.
    Comment: 15 pages, 2 figures, 9 tables (updated numerical experiments)
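    The sketch below is not the paper's implementation; it only illustrates the general shape of such a preconditioner in SciPy, on a single level (the recursive hierarchy is not reproduced). All matrices, block sizes, inner-iteration counts and tolerances are illustrative assumptions. A two-by-two block matrix is preconditioned by approximately inverting the (1,1) block with a few inner CG iterations, closing the factorization with a cheap surrogate for the Schur complement, and wrapping the result as a LinearOperator for an outer Krylov solve.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
# Hypothetical SPD (1,1) block: stands in for the block-diagonal submatrix
# whose diagonal blocks resemble the deterministic mean-value problem.
A = sp.diags([4.0 * np.ones(n), -np.ones(n - 1), -np.ones(n - 1)],
             [0, -1, 1], format="csr")
B = 0.1 * sp.random(n, n, density=0.01, random_state=0, format="csr")
C = 3.0 * sp.eye(n, format="csr") + B.T @ B

K = sp.bmat([[A, B], [B.T, C]], format="csr")  # global two-by-two block matrix
S_hat = C.tocsc()                              # crude surrogate for the Schur complement

def inner_solve_A(r, iters=5):
    # Approximate the action of A^{-1} by a few inner CG iterations.
    x, _ = spla.cg(A, r, maxiter=iters)
    return x

def apply_prec(r):
    # Block lower-triangular Schur complement preconditioner:
    #   y1 ~ A^{-1} r1,   y2 ~ S_hat^{-1} (r2 - B^T y1)
    r1, r2 = r[:n], r[n:]
    y1 = inner_solve_A(r1)
    y2 = spla.spsolve(S_hat, r2 - B.T @ y1)
    return np.concatenate([y1, y2])

M = spla.LinearOperator(K.shape, matvec=apply_prec, dtype=float)

rng = np.random.default_rng(0)
b = rng.standard_normal(2 * n)
# The inner Krylov solves make the preconditioner slightly variable; a
# flexible outer method (e.g. FGMRES) is the rigorous choice, and plain
# GMRES is used here only for brevity.
x, info = spla.gmres(K, b, M=M)
print("converged:", info == 0, "residual:", np.linalg.norm(b - K @ x))

    Neither K nor the preconditioner is ever formed densely here; both act only through matrix-vector products, mirroring the matrix-free character described in the abstract.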

    Chebyshev semi-iteration in Preconditioning

    It is widely believed that Krylov subspace iterative methods are better than Chebyshev semi-iterative methods. When the solution of a linear system with a symmetric and positive definite coefficient matrix is required, the Conjugate Gradient method will compute the optimal approximate solution from the appropriate Krylov subspace; that is, it will implicitly compute the optimal polynomial. Hence a semi-iterative method, which requires eigenvalue bounds and computes an explicit polynomial, must, for just a little less computational work, give an inferior result. In this manuscript we identify a specific situation, in the context of preconditioning, in which the Chebyshev semi-iterative method is the method of choice, since it has properties which make it superior to the Conjugate Gradient method.
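    One property commonly cited in this context is that a fixed number of Chebyshev semi-iteration steps started from a zero initial guess applies a fixed polynomial in the matrix, and is therefore a linear operator acting on the right-hand side, whereas the same number of Conjugate Gradient steps is not; linearity is what lets such an iteration sit inside a fixed preconditioner for a standard outer Krylov method. The sketch below checks this under stated assumptions: a diagonal SPD test matrix with exactly known eigenvalue bounds, an illustrative step count, and the standard Chebyshev recurrence (e.g. Saad, Algorithm 12.1). It is not taken from the paper.

import numpy as np
from scipy.sparse.linalg import cg

def chebyshev_semi_iteration(A, b, lmin, lmax, steps):
    # 'steps' Chebyshev semi-iteration steps for A x = b with x0 = 0,
    # using only the eigenvalue bounds [lmin, lmax].
    theta = 0.5 * (lmax + lmin)   # centre of the spectral interval
    delta = 0.5 * (lmax - lmin)   # half-width of the spectral interval
    sigma = theta / delta
    x = np.zeros_like(b)
    r = b - A @ x
    rho = 1.0 / sigma
    d = r / theta
    for _ in range(steps):
        x = x + d
        r = r - A @ d
        rho_new = 1.0 / (2.0 * sigma - rho)
        d = rho_new * rho * d + (2.0 * rho_new / delta) * r
        rho = rho_new
    return x

rng = np.random.default_rng(1)
n = 100
eigs = rng.uniform(1.0, 10.0, n)   # spectrum is known exactly for this test
A = np.diag(eigs)
lmin, lmax = 1.0, 10.0
b1 = rng.standard_normal(n)
b2 = rng.standard_normal(n)
k = 10

# A fixed number of Chebyshev steps is a linear map of b: superposition holds.
cheb = lambda rhs: chebyshev_semi_iteration(A, rhs, lmin, lmax, k)
print("Chebyshev linearity defect:",
      np.linalg.norm(cheb(b1 + b2) - (cheb(b1) + cheb(b2))))

# The same number of CG steps is nonlinear in b: the defect is not tiny.
cg_k = lambda rhs: cg(A, rhs, maxiter=k)[0]
print("CG linearity defect:",
      np.linalg.norm(cg_k(b1 + b2) - (cg_k(b1) + cg_k(b2))))

    The first defect should sit at rounding-error level, while the second in general does not; which iteration is acceptable as an inner solver inside a fixed preconditioner hinges on exactly this distinction.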