
    On the decay of the inverse of matrices that are sum of Kronecker products

    Decay patterns of matrix inverses have recently attracted considerable interest, due to their relevance in numerical analysis and in applications requiring matrix function approximations. In this paper we analyze the decay pattern of the inverse of banded matrices of the form S = M ⊗ I_n + I_n ⊗ M, where M is tridiagonal, symmetric, and positive definite, I_n is the identity matrix, and ⊗ stands for the Kronecker product. It is well known that the inverses of banded matrices exhibit an exponential decay pattern away from the main diagonal. However, the entries of S^{-1} show a non-monotonic decay, which is not captured by classical bounds. By using an alternative expression for S^{-1}, we derive computable upper bounds that closely capture the actual behavior of its entries. We also show that similar estimates can be obtained when M has a larger bandwidth, or when the sum of Kronecker products involves two different matrices. Numerical experiments illustrating the new bounds are also reported.
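    As an illustration of the setting, here is a minimal NumPy sketch (with an assumed toy tridiagonal M and small n, not the paper's test matrices) that builds the Kronecker sum S and inspects the entry magnitudes along one row of S^{-1}; the decay is exponential overall but jumps back up at block boundaries, which is the non-monotonic behavior the abstract refers to:

    ```python
    import numpy as np

    n = 8
    # Assumed toy example: tridiagonal, symmetric, positive definite M
    # (diagonally dominant: 2.5 on the diagonal, -1 off the diagonal).
    M = 2.5 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    I = np.eye(n)

    # Kronecker sum S = M (x) I_n + I_n (x) M, an n^2 x n^2 banded matrix.
    S = np.kron(M, I) + np.kron(I, M)
    Sinv = np.linalg.inv(S)

    # Entry magnitudes along the first row of S^{-1}: they decay within each
    # n-sized block, then bump back up at the start of the next block,
    # so the decay away from the diagonal is not monotonic.
    row = np.abs(Sinv[0, :])
    ```

    A quick check of the non-monotonicity: `row[n-1]` (far within the first block) is much smaller than `row[n]` (the first entry of the second block), even though it is closer to the diagonal in column index.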

    Flux vector splitting of the inviscid equations with application to finite difference methods

    The conservation-law form of the inviscid gasdynamic equations has the remarkable property that the nonlinear flux vectors are homogeneous functions of degree one. This property readily permits the splitting of flux vectors into subvectors by similarity transformations, so that each subvector has associated with it a specified eigenvalue spectrum. As a consequence of flux vector splitting, new explicit and implicit dissipative finite-difference schemes are developed for first-order hyperbolic systems of equations. Appropriate one-sided spatial differences for each split flux vector are used throughout the computational field, even if the flow is locally subsonic. The results of some preliminary numerical computations are included.
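    The splitting idea can be sketched on a linear model problem u_t + A u_x = 0. The matrix A below is an assumed toy example, not the gasdynamic flux Jacobian, but the pattern is the same: split the eigenvalue spectrum into its non-negative and non-positive parts, and difference each split term one-sidedly in the upwind direction.

    ```python
    import numpy as np

    # Assumed toy system: u_t + A u_x = 0 with symmetric A (eigenvalues +1, -1).
    A = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
    lam, R = np.linalg.eigh(A)             # A symmetric here, so eigh applies

    lam_plus = 0.5 * (lam + np.abs(lam))   # non-negative part of the spectrum
    lam_minus = 0.5 * (lam - np.abs(lam))  # non-positive part
    A_plus = R @ np.diag(lam_plus) @ R.T   # A = A_plus + A_minus
    A_minus = R @ np.diag(lam_minus) @ R.T

    def step(u, dx, dt):
        """One first-order upwind step on a periodic grid; u has shape (N, 2).

        Backward difference for the right-moving part (A_plus), forward
        difference for the left-moving part (A_minus).
        """
        dudx_back = (u - np.roll(u, 1, axis=0)) / dx
        dudx_fwd = (np.roll(u, -1, axis=0) - u) / dx
        return u - dt * (dudx_back @ A_plus.T + dudx_fwd @ A_minus.T)
    ```

    For the Euler equations the same construction goes through the homogeneity property F(U) = A(U) U, so the split fluxes F± = A±(U) U can each be differenced one-sidedly even in subsonic regions, where the eigenvalues have mixed signs.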

    Minimizing Communication for Eigenproblems and the Singular Value Decomposition

    Algorithms have two costs: arithmetic and communication. The latter represents the cost of moving data, either between levels of a memory hierarchy or between processors over a network. Communication often dominates arithmetic and represents a rapidly increasing proportion of the total cost, so we seek algorithms that minimize communication. In [BDHS10], lower bounds were presented on the amount of communication required for essentially all O(n^3)-like algorithms for linear algebra, including eigenvalue problems and the SVD. Conventional algorithms, including those currently implemented in (Sca)LAPACK, perform asymptotically more communication than these lower bounds require. In this paper we present parallel and sequential eigenvalue algorithms (for pencils, nonsymmetric matrices, and symmetric matrices) and SVD algorithms that do attain these lower bounds, and analyze their convergence and communication costs. (Comment: 43 pages, 11 figures.)
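    A back-of-the-envelope sketch of how a BDHS10-style communication lower bound of the form Ω(n³/√M) scales with the fast-memory size M (constants omitted; n and M values are assumptions for illustration only):

    ```python
    def comm_lower_bound(n, M):
        """Words moved, up to a constant, for O(n^3)-like dense linear algebra:
        Omega(n^3 / sqrt(M)), where M is the fast-memory size in words."""
        return n ** 3 / M ** 0.5

    # Doubling fast memory only reduces the required data movement
    # by a factor of sqrt(2), not 2.
    w1 = comm_lower_bound(4096, 2 ** 20)
    w2 = comm_lower_bound(4096, 2 ** 21)
    ratio = w1 / w2  # approximately sqrt(2)
    ```

    This is why attaining the bound matters: no amount of cache growth removes the n³/√M data-movement term, so algorithms that move asymptotically more than this are leaving an unavoidable and growing cost on the table.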

    Representation of the three-body Coulomb Green's function in parabolic coordinates: paths of integration

    The possibility is discussed of using straight-line paths of integration in computing the integral representation of the three-body Coulomb Green's function. In our numerical examples, two different integration contours are considered. It is demonstrated that only one of these straight-line paths ensures that the integral representation is valid.
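    A generic sketch of evaluating an integral along a straight-line path in the complex plane by Gauss–Legendre quadrature; the placeholder integrand below is an assumption for illustration, not the actual three-body Coulomb Green's-function integrand, and for an entire integrand the result is of course path-independent:

    ```python
    import numpy as np

    def line_integral(f, a, b, npts=64):
        """Integrate f along the straight line z(t) = a + t*(b - a), t in [0, 1],
        using npts-point Gauss-Legendre quadrature."""
        t, w = np.polynomial.legendre.leggauss(npts)
        t = 0.5 * (t + 1.0)   # map nodes from [-1, 1] to [0, 1]
        w = 0.5 * w           # rescale weights accordingly
        z = a + t * (b - a)
        return (b - a) * np.sum(w * f(z))

    # Placeholder check with an entire function: integral of exp(z)
    # from 0 to 1 is e - 1 along any path.
    val = line_integral(np.exp, 0.0, 1.0)
    ```

    For integrands with branch cuts or singularities, as in the Green's-function representation, the choice between two such straight-line paths is exactly what the abstract is about: only a path that avoids the singular structure yields a valid representation.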