
    Shifted linear systems in electromagnetics. Part II: Systems with multiple right-hand sides

    We consider the solution of multiply shifted linear systems with multiple right-hand sides. The coefficient matrix is symmetric, complex, and indefinite, and is shifted by different multiples of the identity. Such problems arise in a number of applications, including electromagnetic simulation in the development of microwave and mm-wave circuits and modules. The properties of microwave circuits can be described in terms of their scattering matrix, which is extracted from the orthogonal decomposition of the electric field. We discretize Maxwell's equations on orthogonal grids using the Finite Integration Technique (FIT). Several Krylov subspace methods have been used to solve systems with multiple right-hand sides; we use both the block-QMR method and a symmetric band Lanczos process based on coupled recurrences with polynomial preconditioning. We present a method for providing initial guesses to a linear solver, both for systems with multiple shifts and for systems with multiple right-hand sides, each with a different shift.
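    The key structural fact behind multi-shift solvers is shift invariance: the Krylov subspace K_m(A + sI, b) equals K_m(A, b) for every shift s, so one basis serves all shifted systems. A minimal sketch of this idea (plain symmetric Lanczos on an SPD stand-in matrix, not the paper's block-QMR or band Lanczos method):

    ```python
    import numpy as np

    def lanczos(A, b, m):
        """Symmetric Lanczos: orthonormal basis V of K_m(A, b) and
        tridiagonal T = V^T A V (illustrative; no reorthogonalization)."""
        n = len(b)
        V = np.zeros((n, m))
        alpha = np.zeros(m)
        beta = np.zeros(m - 1)
        V[:, 0] = b / np.linalg.norm(b)
        for j in range(m):
            w = A @ V[:, j]
            alpha[j] = V[:, j] @ w
            w -= alpha[j] * V[:, j]
            if j > 0:
                w -= beta[j - 1] * V[:, j - 1]
            if j + 1 < m:
                beta[j] = np.linalg.norm(w)
                V[:, j + 1] = w / beta[j]
        T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
        return V, T

    # Shift invariance: K_m(A + s*I, b) = K_m(A, b), so one Lanczos run
    # serves every shift; each shifted system becomes a small m-by-m solve.
    n, m = 200, 60
    A = np.diag(np.linspace(1.0, 10.0, n))   # hypothetical SPD test matrix
    rng = np.random.default_rng(0)
    b = rng.standard_normal(n)
    V, T = lanczos(A, b, m)
    e1 = np.zeros(m)
    e1[0] = np.linalg.norm(b)
    xs = {}
    for s in (0.5, 1.0, 2.0):
        y = np.linalg.solve(T + s * np.eye(m), e1)   # shifted tridiagonal solve
        xs[s] = V @ y                                # Galerkin approx. of (A+sI)x = b
    ```

    The matrix-vector products with A, the dominant cost, are paid once; each additional shift costs only a small tridiagonal solve.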

    USRA/RIACS

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on 6 June 1983. RIACS is privately operated by USRA, a consortium of universities with research programs in the aerospace sciences, under a cooperative agreement with NASA. The primary mission of RIACS is to provide research and expertise in computer science and scientific computing to support the scientific missions of NASA ARC. The research carried out at RIACS must change its emphasis from year to year in response to NASA ARC's changing needs and technological opportunities. A flexible scientific staff is provided through a university faculty visitor program, a postdoctoral program, and a student visitor program. Not only does this provide appropriate expertise, but it also introduces scientists from outside NASA to NASA problems. A small group of core RIACS staff provides continuity and interacts with an ARC technical monitor and scientific advisory group to determine the RIACS mission. RIACS activities are reviewed and monitored by a USRA advisory council and an ARC technical monitor. Research at RIACS is currently being done in the following areas: Parallel Computing; Advanced Methods for Scientific Computing; Learning Systems; High Performance Networks and Technology; Graphics, Visualization, and Virtual Environments.

    On Optimal Short Recurrences for Generating Orthogonal Krylov Subspace Bases


    Preconditioned Recycling Krylov subspace methods for self-adjoint problems

    The authors propose a recycling Krylov subspace method for the solution of a sequence of self-adjoint linear systems. Such problems appear, for example, in the Newton process for solving nonlinear equations. Ritz vectors are automatically extracted from one MINRES run and then used for self-adjoint deflation in the next. The method is designed to work with arbitrary inner products and arbitrary self-adjoint positive-definite preconditioners whose inverse can be computed with high accuracy. Numerical experiments with nonlinear Schrödinger equations indicate a substantial decrease in computation time when recycling is used.
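    The deflation step in such recycling schemes can be sketched as a Galerkin projection: recycled (Ritz) vectors W from a previous solve supply an initial guess whose residual is orthogonal to span(W), so the next Krylov run need not rediscover those directions. A minimal sketch under invented test data (the smallest eigenvectors of a synthetic SPD matrix stand in for the recycled Ritz vectors; this is not the paper's full MINRES recycling algorithm):

    ```python
    import numpy as np

    def deflated_guess(A, b, W):
        """Galerkin deflation: project the system onto span(W) to build an
        initial guess x0. By construction W^T (b - A x0) = 0, i.e. the
        remaining residual carries no components along the recycled space."""
        AW = A @ W
        x0 = W @ np.linalg.solve(W.T @ AW, W.T @ b)
        return x0, b - A @ x0

    # Demo: deflate the smallest eigenvectors of a synthetic SPD problem.
    n, k = 100, 4
    rng = np.random.default_rng(1)
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    eigs = np.concatenate([np.full(k, 1e-3), np.linspace(1.0, 2.0, n - k)])
    A = Q @ np.diag(eigs) @ Q.T
    b = rng.standard_normal(n)
    W = Q[:, :k]                  # stand-ins for recycled Ritz vectors
    x0, r0 = deflated_guess(A, b, W)
    ```

    The subsequent iterative solve then starts from x0 and effectively sees a spectrum with the troublesome small eigenvalues removed, which is where the reported speedups come from.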

    Orthogonal matrix polynomials and applications

    Orthogonal matrix polynomials, on the real line or on the unit circle, have properties which are natural generalizations of properties of scalar orthogonal polynomials, appropriately modified for the matrix calculus. We show that orthogonal matrix polynomials, both on the real line and on the unit circle, appear in various places, and we describe some of them. The spectral theory of doubly infinite Jacobi matrices can be described using orthogonal 2×2 matrix polynomials on the real line. Scalar orthogonal polynomials with a Sobolev inner product containing a finite number of derivatives can be studied using matrix orthogonal polynomials on the real line. Orthogonal matrix polynomials on the unit circle are related to unitary block Hessenberg matrices and are very useful in multivariate time series analysis and multichannel signal processing. Finally, we show how orthogonal matrix polynomials can be used for Gaussian quadrature of matrix-valued functions. Parallel algorithms for this purpose have been implemented (using PVM) and some examples are worked out.
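    The quadrature construction described above generalizes the scalar Golub–Welsch procedure, in which Gaussian nodes and weights come from the eigen-decomposition of the Jacobi matrix built from the polynomials' three-term recurrence. A minimal scalar sketch (Gauss–Legendre, weight 1 on [-1, 1]; the matrix-valued case replaces the recurrence coefficients by matrices):

    ```python
    import numpy as np

    def gauss_legendre(n):
        """Golub-Welsch: n-point Gauss-Legendre nodes and weights from the
        symmetric tridiagonal Jacobi matrix of the Legendre three-term
        recurrence. Weights are mu_0 * (first eigenvector component)^2,
        with mu_0 = integral of the weight function = 2."""
        k = np.arange(1, n)
        beta = k / np.sqrt(4.0 * k**2 - 1.0)      # off-diagonal recurrence coeffs
        J = np.diag(beta, 1) + np.diag(beta, -1)  # Jacobi matrix (zero diagonal)
        nodes, V = np.linalg.eigh(J)
        weights = 2.0 * V[0, :]**2
        return nodes, weights

    x, w = gauss_legendre(5)
    val = w @ x**8    # exact for degree <= 2n-1 = 9: integral of x^8 is 2/9
    ```

    The matrix-polynomial version yields quadrature rules for matrix-valued integrands, which is what the PVM-parallelized algorithms mentioned in the abstract compute.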