
    Inexact Arnoldi residual estimates and decay properties for functions of non-Hermitian matrices

    We derive a priori residual-type bounds for the Arnoldi approximation of a matrix function, together with a strategy for setting the iteration accuracies in the inexact Arnoldi approximation of matrix functions. These results are based on the decay behavior of the entries of functions of banded matrices. Specifically, we use a priori decay bounds for the entries of functions of banded non-Hermitian matrices obtained via Faber polynomial series. Numerical experiments illustrate the quality of the results.
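For context, the (exact-arithmetic) Arnoldi approximation of a matrix function that these bounds concern is f(A)b ≈ ‖b‖ V_m f(H_m) e_1, where V_m spans the Krylov subspace and H_m is the Hessenberg matrix. A minimal NumPy sketch of this standard approximation (function name and breakdown tolerance are illustrative, not from the paper):

```python
import numpy as np
from scipy.linalg import expm

def arnoldi_fA_b(A, b, m, f=expm):
    """Approximate f(A) @ b by the Arnoldi method:
    f(A) b  ~=  ||b|| * V_m @ f(H_m) @ e_1."""
    n = len(b)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):            # modified Gram-Schmidt step
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:           # "happy breakdown": Krylov space is invariant
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    Hm = H[:m, :m]
    e1 = np.zeros(m)
    e1[0] = 1.0
    # evaluate f only on the small m-by-m Hessenberg matrix
    return beta * V[:, :m] @ (f(Hm) @ e1)
```

When m equals the dimension of A (and no breakdown occurs), the approximation is exact up to roundoff, which gives a convenient sanity check.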

    Approximation of functions of large matrices with Kronecker structure

    We consider the numerical approximation of f(A)b, where b ∈ R^N and A is the sum of Kronecker products, that is, A = M_2 ⊗ I + I ⊗ M_1 ∈ R^(N×N). Here f is a regular function such that f(A) is well defined. We derive a computational strategy that significantly lowers the memory requirements and computational effort of the standard approximations, with special emphasis on the exponential function, for which the new procedure becomes particularly advantageous. Our findings are illustrated by numerical experiments with typical functions used in applications.
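For the exponential, the key classical fact is that the two Kronecker summands M_2 ⊗ I and I ⊗ M_1 commute, so exp(A) = exp(M_2) ⊗ exp(M_1), and the large N×N matrix never needs to be formed: with b = vec(B) (column stacking), exp(A)b = vec(exp(M_1) B exp(M_2)^T). A minimal sketch of this identity (illustrative only, not the authors' full procedure):

```python
import numpy as np
from scipy.linalg import expm

def expm_kron_apply(M1, M2, b):
    """Compute exp(M2 (x) I + I (x) M1) @ b without forming the big matrix.
    The two Kronecker terms commute, so the exponential factors as
    exp(M2) (x) exp(M1), and (C (x) D) vec(B) = vec(D @ B @ C.T)."""
    n1, n2 = M1.shape[0], M2.shape[0]
    B = b.reshape(n1, n2, order="F")      # un-vec (column-major stacking)
    X = expm(M1) @ B @ expm(M2).T         # only small-matrix exponentials
    return X.reshape(-1, order="F")       # re-vec
```

Only exponentials of the small factors M_1 and M_2 are needed, which is the source of the memory and cost savings for this function.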

    An overview of block Gram-Schmidt methods and their stability properties

    Block Gram-Schmidt algorithms serve as essential kernels in many scientific computing applications, but for many commonly used variants, a rigorous treatment of their stability properties remains open. This survey provides a comprehensive categorization of block Gram-Schmidt algorithms, particularly those used in Krylov subspace methods to build orthonormal bases one block vector at a time. All known stability results are assembled, and new results are summarized or conjectured for important communication-reducing variants. Additionally, new block versions of low-synchronization variants are derived, and their efficacy and stability are demonstrated for a wide range of challenging examples. Low-synchronization variants appear remarkably stable for s-step-like matrices built with Newton polynomials, pointing towards a new stable and efficient backbone for Krylov subspace methods. Numerical examples are computed with a versatile MATLAB package hosted at https://github.com/katlund/BlockStab, and scripts for reproducing all results in the paper are provided. Block Gram-Schmidt implementations in popular software packages are discussed, along with a number of open problems. An appendix containing all algorithms typeset in a uniform fashion is provided. Comment: 42 pages, 5 tables, 17 figures, 20 algorithms.
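For illustration, the basic block classical Gram-Schmidt (BCGS) variant from which such taxonomies start can be sketched as follows (a minimal NumPy sketch; the BlockStab package above, not this snippet, is the reference implementation):

```python
import numpy as np

def bcgs(X, s):
    """Block Classical Gram-Schmidt: orthonormalize the columns of X
    one block of s columns at a time.  Returns Q with orthonormal
    columns and block-triangular R such that X = Q @ R."""
    n, N = X.shape
    Q = np.zeros((n, N))
    R = np.zeros((N, N))
    for k in range(0, N, s):
        cols = slice(k, min(k + s, N))
        W = X[:, cols].copy()
        # block projection against all previously orthonormalized blocks
        R[:k, cols] = Q[:, :k].T @ W
        W -= Q[:, :k] @ R[:k, cols]
        # intra-block orthogonalization via an (unblocked) QR factorization
        Qk, Rk = np.linalg.qr(W)
        Q[:, cols] = Qk
        R[cols, cols] = Rk
    return Q, R
```

Like its column-wise classical counterpart, plain BCGS can lose orthogonality for ill-conditioned inputs; the stability of such variants and their low-synchronization refinements is exactly what the survey analyzes.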

    Convergence of restarted Krylov subspace methods for Stieltjes functions of matrices

    To approximate f(A)b, the action of a matrix function on a vector, by a Krylov subspace method, restarts may become mandatory due to the storage requirements of the Arnoldi basis or the growing computational cost of evaluating f on a Hessenberg matrix of growing size. A number of restarting methods have been proposed in recent years, with substantial algorithmic advances concerning their stability and computational efficiency. However, the question of under which circumstances convergence of these methods can be guaranteed has remained largely unanswered. In this paper we consider the class of Stieltjes functions and a related class, which contains important functions such as the (inverse) square root and the matrix logarithm. For these classes of functions we present new theoretical results proving convergence for Hermitian positive definite matrices A and arbitrary restart lengths. We also propose a modification of the Arnoldi approximation which guarantees convergence for the same classes of functions and any restart length if A is positive real but not necessarily Hermitian.
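As background, a Stieltjes function such as z^(-1/2) admits the integral representation z^(-1/2) = (2/π) ∫₀^∞ (z + t²)^(-1) dt, so f(A)b becomes an integral of shifted linear-system solutions; this structure underlies such convergence analyses. A minimal quadrature sketch of the representation itself (using direct shifted solves, not the restarted Krylov method of the paper; names and node count are illustrative):

```python
import numpy as np

def inv_sqrt_apply(A, b, nodes=40):
    """Approximate A^(-1/2) @ b for symmetric positive definite A via the
    Stieltjes representation z^(-1/2) = (2/pi) * integral_0^inf (z+t^2)^(-1) dt,
    discretized with the substitution t = tan(theta) and the midpoint rule."""
    n = A.shape[0]
    h = (np.pi / 2) / nodes
    theta = (np.arange(nodes) + 0.5) * h          # midpoints in (0, pi/2)
    t = np.tan(theta)
    w = h / np.cos(theta) ** 2                    # dt = sec^2(theta) d(theta)
    x = np.zeros(n)
    for ti, wi in zip(t, w):
        # each quadrature node costs one shifted linear solve
        x += wi * np.linalg.solve(A + ti**2 * np.eye(n), b)
    return (2.0 / np.pi) * x
```

In practice each shifted solve would itself be handled by a (restarted) Krylov method; the direct solves here just make the integral representation concrete.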