
    Sameh's parallel eigenvalue algorithm revisited


    A parallel algorithm for the eigenvalues and eigenvectors for a general complex matrix

    A new parallel Jacobi-like algorithm is developed for computing the eigenvalues of a general complex matrix. Most parallel methods for this problem typically display only linear convergence. Sequential norm-reducing algorithms also exist, and they display quadratic convergence in most cases. The new algorithm is a parallel form of the norm-reducing algorithm due to Eberlein. It is proven that the asymptotic convergence rate of this algorithm is quadratic. Numerical experiments are presented which demonstrate the quadratic convergence of the algorithm, and certain situations where the convergence is slow are also identified. The algorithm promises to be very competitive on a variety of parallel architectures.
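To illustrate the Jacobi-sweep structure underlying such methods, the following sketch runs cyclic Jacobi rotations on a small real symmetric matrix. This is a simpler setting than Eberlein's norm-reducing method for general complex matrices (which combines rotations with non-unitary shears); the matrix size and sweep count are arbitrary illustrative choices:

```python
import numpy as np

def offdiag_norm(A):
    """Frobenius norm of the off-diagonal part: the quantity Jacobi sweeps drive to zero."""
    return np.sqrt(max(np.sum(A**2) - np.sum(np.diag(A)**2), 0.0))

def jacobi_sweep(A):
    """One cyclic sweep of classical Jacobi rotations on a symmetric matrix A (in place)."""
    n = A.shape[0]
    for p in range(n - 1):
        for q in range(p + 1, n):
            if A[p, q] == 0.0:
                continue
            # Rotation angle that annihilates the (p, q) entry.
            theta = 0.5 * np.arctan2(2.0 * A[p, q], A[q, q] - A[p, p])
            c, s = np.cos(theta), np.sin(theta)
            J = np.eye(n)
            J[p, p] = J[q, q] = c
            J[p, q], J[q, p] = s, -s
            A[:] = J.T @ A @ J

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
A = 0.5 * (A + A.T)                              # symmetric test matrix
eigs_ref = np.sort(np.linalg.eigvalsh(A))

norms = [offdiag_norm(A)]
for _ in range(8):
    jacobi_sweep(A)
    norms.append(offdiag_norm(A))
# After a few sweeps the off-diagonal norm collapses (asymptotically
# quadratically), leaving the eigenvalues on the diagonal.
```

The rotations within one sweep touch independent index pairs, which is what makes Jacobi-type methods attractive for parallel execution.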

    The parallel computation of the smallest eigenpair of an acoustic problem with damping

    Acoustic problems with damping may give rise to large quadratic eigenproblems. Efficient and parallelizable algorithms are required for solving these problems. The recently proposed Jacobi-Davidson method is well suited for parallel computing: no matrix decomposition and no back or forward substitutions are needed. This paper describes the parallel solution of the smallest eigenpair of a realistic and very large quadratic eigenproblem with the Jacobi-Davidson method.
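For contrast with the decomposition-free Jacobi-Davidson approach, the standard dense route converts the quadratic eigenproblem (λ²M + λC + K)x = 0 into a linear pencil of twice the size. A minimal sketch of that companion-linearization baseline, with small made-up matrices standing in for the paper's large acoustic model:

```python
import numpy as np
from scipy.linalg import eig

# Hypothetical small matrices (mass M, damping C, stiffness K); the paper's
# actual acoustic model is far larger, sparse, and not reproduced here.
rng = np.random.default_rng(1)
n = 5
M = np.eye(n)
K = rng.standard_normal((n, n)); K = K @ K.T + n * np.eye(n)   # SPD stiffness
C = 0.1 * np.eye(n)                                            # light damping

# First companion linearization: A z = lam * B z with z = [x; lam*x].
Z, I = np.zeros((n, n)), np.eye(n)
A = np.block([[Z, I], [-K, -C]])
B = np.block([[I, Z], [Z, M]])

lam, V = eig(A, B)
i = np.argmin(np.abs(lam))               # smallest-magnitude eigenvalue
x = V[:n, i]                             # upper block recovers the eigenvector
x = x / np.linalg.norm(x)
residual = np.linalg.norm((lam[i]**2 * M + lam[i] * C + K) @ x)
```

A dense solve on the doubled pencil is exactly what becomes infeasible at the problem sizes the abstract describes, which motivates the projection-based Jacobi-Davidson alternative.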

    Parallel Krylov Solvers for the Polynomial Eigenvalue Problem in SLEPc

    Polynomial eigenvalue problems are often found in scientific computing applications. When the coefficient matrices of the polynomial are large and sparse, usually only a few eigenpairs are required, and projection methods are the best choice. We focus on Krylov methods that operate on the companion linearization of the polynomial but exploit the block structure with the aim of being memory-efficient in the representation of the Krylov subspace basis. The problem may appear in the form of a low-degree polynomial (quartic or quintic, say) expressed in the monomial basis, or a high-degree polynomial (coming from interpolation of a nonlinear eigenproblem) expressed in a nonmonomial basis. We have implemented a parallel solver in SLEPc covering both cases that is able to compute exterior as well as interior eigenvalues via spectral transformation. We discuss important issues such as scaling and restart and illustrate the robustness and performance of the solver with some numerical experiments. The first author was supported by the Spanish Ministry of Education, Culture and Sport through an FPU grant with reference AP2012-0608. Campos, C.; Román Moltó, J. E. (2016). Parallel Krylov Solvers for the Polynomial Eigenvalue Problem in SLEPc. SIAM Journal on Scientific Computing, 38(5):385-411. https://doi.org/10.1137/15M1022458
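The interior-eigenvalue capability mentioned above rests on a spectral transformation: shift-and-invert replaces the operator A by (A − σI)⁻¹, so eigenvalues near the target σ become dominant for the Krylov iteration. A small sketch of the same idea for a linear sparse eigenproblem, using SciPy rather than SLEPc (the shift σ and the test matrix are illustrative choices):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs

# Sparse test matrix: 1-D Laplacian with known eigenvalues 2 - 2*cos(j*pi/(n+1)).
n = 199
L = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")

# Shift-and-invert: with sigma set, eigs works with (L - sigma*I)^{-1}, so the
# eigenvalues of L closest to the target sigma converge first.
sigma = 1.0
vals, vecs = eigs(L, k=4, sigma=sigma)
vals = np.sort(vals.real)

exact = 2.0 - 2.0 * np.cos(np.arange(1, n + 1) * np.pi / (n + 1))
closest = np.sort(exact[np.argsort(np.abs(exact - sigma))[:4]])
```

The computed values match the four exact eigenvalues nearest σ, eigenvalues that would converge very slowly (or not at all) without the transformation.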

    Jacobi-Davidson type methods for generalized eigenproblems and polynomial eigenproblems: part I

    In this paper we show how the Jacobi-Davidson iterative method can be used to solve generalized eigenproblems. Ideas similar to those for the standard eigenproblem are used, but the projections that are required to reduce the given problem to a small, manageable size need more attention. We show that with proper choices for the projection operators quadratic convergence can be achieved. The advantage of our approach is that none of the involved operators needs to be inverted. It turns out that similar projections can be used for the iterative approximation of selected eigenvalues and eigenvectors of polynomial eigenvalue equations. This approach has already been used with great success for the solution of quadratic eigenproblems associated with acoustic problems.
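The projections in question can be sketched for the standard eigenproblem: one Jacobi-Davidson step solves the correction equation (I − uu*)(A − θI)(I − uu*)t = −r for t ⊥ u, where r = Au − θu. In this sketch no operator is inverted; a dense least-squares solve stands in for the approximate iterative inner solver used in practice, and the matrix, starting vector, and perturbation size are illustrative:

```python
import numpy as np

# Setup: a symmetric matrix and an approximate eigenpair (u, theta) obtained
# by perturbing an exact eigenvector.
rng = np.random.default_rng(2)
n = 50
A = rng.standard_normal((n, n)); A = 0.5 * (A + A.T)
w, V = np.linalg.eigh(A)
lam0 = w[0]                                # target: smallest eigenvalue
u = V[:, 0] + 0.01 * rng.standard_normal(n)
u /= np.linalg.norm(u)

theta = u @ A @ u                          # Rayleigh quotient
r = A @ u - theta * u                      # eigenpair residual

# Correction equation restricted to the orthogonal complement of u.
P = np.eye(n) - np.outer(u, u)
Mc = P @ (A - theta * np.eye(n)) @ P
t, *_ = np.linalg.lstsq(Mc, -r, rcond=None)  # Mc is singular on span(u)
t -= (u @ t) * u                             # enforce t orthogonal to u

u_new = u + t; u_new /= np.linalg.norm(u_new)
theta_new = u_new @ A @ u_new
res_new = np.linalg.norm(A @ u_new - theta_new * u_new)
```

Solved exactly, this step reproduces Rayleigh quotient iteration and the residual drops sharply; solved only approximately, it retains fast convergence without any explicit inverse, which is the property the abstract highlights.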

    The LAPW method with eigendecomposition based on the Hari--Zimmermann generalized hyperbolic SVD

    In this paper we propose an accurate, highly parallel algorithm for the generalized eigendecomposition of a matrix pair (H, S), given in a factored form (F^* J F, G^* G). Matrices H and S are generally complex and Hermitian, and S is positive definite. This type of matrix emerges from the representation of the Hamiltonian of a quantum mechanical system in terms of an overcomplete set of basis functions. This expansion is part of a class of models within the broad field of Density Functional Theory, which is considered the gold standard in condensed matter physics. The overall algorithm consists of four phases, the second and the fourth being optional, where the last two phases are the computation of the generalized hyperbolic SVD of a complex matrix pair (F, G), according to a given matrix J defining the hyperbolic scalar product. If J = I, then these two phases compute the GSVD in parallel very accurately and efficiently. Comment: The supplementary material is available at https://web.math.pmf.unizg.hr/mfbda/papers/sm-SISC.pdf due to its size. This revised manuscript is currently being considered for publication.
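For reference, the generalized eigendecomposition computed from the factors can be checked against a dense solver applied to the assembled pair. This sketch forms (H, S) = (F^* J F, G^* G) for small made-up factors and solves H x = λ S x with SciPy; note that the paper's point is precisely to work on (F, G) without forming H and S, which this check does not reproduce:

```python
import numpy as np
from scipy.linalg import eigh

# Small made-up factors F, G and a signature matrix J; the assembled pair
# (H, S) = (F^* J F, G^* G) is Hermitian with S positive definite.
rng = np.random.default_rng(3)
n = 6
F = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
G = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)) + n * np.eye(n)
J = np.diag([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])

H = F.conj().T @ J @ F                  # Hermitian, generally indefinite
S = G.conj().T @ G                      # Hermitian positive definite

# Dense reference solution of H x = lambda S x.
lam, X = eigh(H, S)                     # real eigenvalues, S-orthonormal X
residual = np.linalg.norm(H @ X - S @ X @ np.diag(lam))
```

Working directly on the factors avoids the squaring of the condition number implicit in forming G^* G, which is where the accuracy advantage of the factored approach comes from.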