
    Differential qd algorithm with shifts for rank-structured matrices

    Although QR iterations dominate in eigenvalue computations, there are several important cases where alternative LR-type algorithms may be preferable. In particular, in the symmetric tridiagonal case the differential qd algorithm with shifts (dqds) proposed by Fernando and Parlett often converges faster while preserving high relative accuracy, which the QR algorithm does not guarantee. In eigenvalue computations for rank-structured matrices the QR algorithm is also a popular choice since, in the symmetric case, the rank structure is preserved. In the unsymmetric case, however, the QR algorithm destroys the rank structure and, hence, LR-type algorithms come into play once again. In the current paper we derive several variants of qd algorithms for quasiseparable matrices. Remarkably, one of them, when applied to Hessenberg matrices, becomes a direct generalization of the dqds algorithm for tridiagonal matrices. It can therefore be applied to such important matrices as companion and confederate matrices, and provides an alternative algorithm for finding the roots of a polynomial represented in a basis of orthogonal polynomials. Results of preliminary numerical experiments are presented.
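    For reference, the classical tridiagonal dqds recurrence of Fernando and Parlett that the paper generalizes can be sketched as follows. This is a minimal illustration, not the paper's quasiseparable algorithm: the helper names are mine, the driver uses zero shift to stay breakdown-free, and production codes (e.g. LAPACK's dlasq routines) add carefully safeguarded shift strategies.

```python
import numpy as np

def dqds_step(q, e, shift):
    """One differential qd step with shift (the Fernando-Parlett dqds recurrence).

    (q, e) are the positive qd arrays of a bidiagonal factorization; the step
    returns the qd arrays of the shifted, similarity-transformed matrix.
    """
    n = len(q)
    q_new, e_new = np.empty(n), np.empty(n - 1)
    d = q[0] - shift
    for k in range(n - 1):
        q_new[k] = d + e[k]
        e_new[k] = e[k] * q[k + 1] / q_new[k]
        d = d * q[k + 1] / q_new[k] - shift
    q_new[n - 1] = d
    return q_new, e_new

def qd_eigenvalues(q, e, tol=1e-14):
    """Toy driver: unshifted qd steps plus deflation of the trailing entry."""
    q, e = np.array(q, float), np.array(e, float)
    eigs, sigma = [], 0.0          # sigma would accumulate shifts (zero here)
    while len(q) > 1:
        if e[-1] < tol * (q[-1] + q[-2]):
            eigs.append(q[-1] + sigma)     # deflate a converged eigenvalue
            q, e = q[:-1], e[:-1]
        else:
            q, e = dqds_step(q, e, 0.0)    # zero shift keeps the sketch breakdown-free
    eigs.append(q[0] + sigma)
    return np.sort(eigs)

# The qd arrays encode an upper bidiagonal B with B[i,i] = sqrt(q[i]) and
# B[i,i+1] = sqrt(e[i]); the computed values are the eigenvalues of B.T @ B.
q, e = np.array([4.0, 3.0, 2.0, 1.0]), np.array([0.5, 0.4, 0.3])
B = np.diag(np.sqrt(q)) + np.diag(np.sqrt(e), 1)
print(qd_eigenvalues(q, e))
print(np.linalg.eigvalsh(B.T @ B))
```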

    A CMV-based eigensolver for companion matrices

    In this paper we present a novel matrix method for polynomial rootfinding. By exploiting the properties of the QR eigenvalue algorithm applied to a suitable CMV-like form of a companion matrix, we design a fast and computationally simple structured QR iteration.
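    For context, the unstructured baseline that such structured iterations accelerate is plain companion-matrix rootfinding: form the companion matrix of a monic polynomial and hand it to a dense QR eigensolver. A minimal NumPy sketch (the helper name is mine; this is the dense O(n^3) approach, not the paper's CMV-based structured iteration) is:

```python
import numpy as np

def companion_roots(coeffs):
    """Roots of p(x) = x^n + c[0] x^{n-1} + ... + c[n-1] via the companion matrix.

    The companion matrix is upper Hessenberg and rank-structured, which is
    exactly the structure that CMV-like structured QR iterations exploit.
    """
    c = np.asarray(coeffs, dtype=complex)
    n = len(c)
    C = np.zeros((n, n), dtype=complex)
    C[0, :] = -c                 # first row holds the negated coefficients
    C[1:, :-1] = np.eye(n - 1)   # subdiagonal of ones
    return np.linalg.eigvals(C)  # dense QR eigensolver

# roots of x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3)
print(np.sort_complex(companion_roots([-6.0, 11.0, -6.0])))
```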

    An Implicit Q Theorem for Hessenberg-like Matrices


    Computing All or Some Eigenvalues of Symmetric ℋℓ-Matrices


    Row Compression and Nested Product Decomposition of a Hierarchical Representation of a Quasiseparable Matrix

    This research introduces a row compression and nested product decomposition of an n×n hierarchical representation of a rank-structured matrix A, which extends the compression and nested product decomposition of a quasiseparable matrix. The hierarchical parameter extraction algorithm of a quasiseparable matrix is efficient, requiring only O(n log n) operations, and is proven backward stable. The row compression consists of a sequence of small Householder transformations that are formed from the low-rank, lower triangular, off-diagonal blocks of the hierarchical representation. The row compression forms a factorization of matrix A, where A = QC, Q is the product of the Householder transformations, and C preserves the low-rank structure in both the lower and upper triangular parts of matrix A. The nested product decomposition is accomplished by applying a sequence of orthogonal transformations to the low-rank, upper triangular, off-diagonal blocks of the compressed matrix C. Both the compression and decomposition algorithms are stable and require O(n log n) operations. At this point, the matrix-vector product and solver algorithms are the only ones fully proven to be backward stable for quasiseparable matrices. By combining the fast matrix-vector product and system solver, linear systems involving the hierarchical representation and its nested product decomposition are solved directly with linear complexity and unconditional stability. Applications in image deblurring and compression, which capitalize on the concepts from the row compression and nested product decomposition algorithms, are shown.
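    To illustrate the core idea only (a single low-rank block, not the full hierarchical algorithm), the small Householder reflectors that triangularize the thin generator of an off-diagonal block also compress its rows. A minimal sketch, assuming a block given in generator form B = U Vᵀ:

```python
import numpy as np

# Hypothetical single-block illustration: the Householder reflectors that
# triangularize the thin generator U also compress the rows of B = U @ V.T,
# leaving at most r nonzero rows. A hierarchical row compression applies such
# small reflectors block by block to the off-diagonal generators.
rng = np.random.default_rng(0)
n, r = 8, 2
U = rng.standard_normal((n, r))
V = rng.standard_normal((n, r))
B = U @ V.T                               # rank-r off-diagonal block

Q, R = np.linalg.qr(U, mode="complete")   # Q = product of r Householder reflectors
compressed = Q.T @ B                      # equals [R[:r] @ V.T; 0]

assert np.allclose(compressed[r:], 0.0)   # rows beyond r are numerically zero
assert np.allclose(Q @ compressed, B)     # orthogonal factorization B = Q * compressed
```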

    Quasiseparable Hessenberg reduction of real diagonal plus low rank matrices and applications

    We present a novel algorithm to perform the Hessenberg reduction of an n×n matrix A of the form A = D + UV*, where D is diagonal with real entries and U and V are n×k matrices with k ≤ n. The algorithm has a cost of O(n²k) arithmetic operations and is based on quasiseparable matrix technology. Applications are shown to solving polynomial eigenvalue problems, and some numerical experiments are reported in order to analyze the stability of the approach.
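    For comparison, the dense, unstructured reduction that such an O(n²k) algorithm is meant to replace costs O(n³). A minimal sketch using SciPy's Householder-based Hessenberg reduction on a randomly generated A = D + UV* (illustrative only; it does not exploit the quasiseparable structure):

```python
import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(1)
n, k = 200, 3
D = np.diag(rng.standard_normal(n))                              # real diagonal part
U = rng.standard_normal((n, k)) + 1j * rng.standard_normal((n, k))
V = rng.standard_normal((n, k)) + 1j * rng.standard_normal((n, k))
A = D + U @ V.conj().T                                           # diagonal plus rank-k matrix

# Dense O(n^3) Hessenberg reduction; the structured algorithm instead exploits
# the quasiseparable structure of A to reach O(n^2 k).
H, Q = hessenberg(A, calc_q=True)
assert np.allclose(Q @ H @ Q.conj().T, A)       # unitary similarity
assert np.allclose(np.tril(H, -2), 0.0)         # upper Hessenberg form
```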

    Computing the k-th Eigenvalue of Symmetric ℋ²-Matrices

    The numerical solution of eigenvalue problems is essential in various areas of science and engineering. In many problem classes, only a small subset of the eigenvalues is of practical interest, so it is unnecessary to compute all of them. Notable examples are electronic structure problems, where the k-th smallest eigenvalue is closely related to the electronic properties of materials. In this paper, we consider the k-th eigenvalue problem for symmetric dense matrices with low-rank off-diagonal blocks. We present a linear-time generalized LDL decomposition of ℋ² matrices and combine it with the bisection eigenvalue algorithm to compute the k-th eigenvalue with controllable accuracy. In addition, if more than one eigenvalue is required, some of the previous computations can be reused to compute the other eigenvalues in parallel. Numerical experiments show that our method is more efficient than the state-of-the-art dense eigenvalue solvers in LAPACK/ScaLAPACK and ELPA. Furthermore, tests on electronic state calculations of carbon nanomaterials demonstrate that our method outperforms the existing HSS-based bisection eigenvalue algorithm on 3D problems.
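    The bisection idea can be illustrated with a dense stand-in: by Sylvester's law of inertia, the number of eigenvalues of A below a shift σ equals the number of negative eigenvalues of the block-diagonal factor in an LDLᵀ factorization of A − σI. A minimal sketch using SciPy's dense ldl (the function names are mine, and the fast ℋ² LDL of the paper would replace the dense factorization):

```python
import numpy as np
from scipy.linalg import ldl

def count_below(A, sigma):
    """Eigenvalues of symmetric A strictly below sigma, via Sylvester's law of inertia."""
    _, d, _ = ldl(A - sigma * np.eye(A.shape[0]))
    return int(np.sum(np.linalg.eigvalsh(d) < 0))   # d is block diagonal (1x1/2x2 blocks)

def kth_eigenvalue(A, k, tol=1e-10):
    """k-th smallest eigenvalue (1-based) of symmetric A by bisection."""
    # Gershgorin bounds give an interval guaranteed to contain the whole spectrum.
    radii = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))
    lo, hi = np.min(np.diag(A) - radii), np.max(np.diag(A) + radii)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if count_below(A, mid) >= k:
            hi = mid        # at least k eigenvalues lie below mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

rng = np.random.default_rng(2)
M = rng.standard_normal((50, 50))
A = 0.5 * (M + M.T)
k = 7
print(kth_eigenvalue(A, k), np.sort(np.linalg.eigvalsh(A))[k - 1])
```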