5 research outputs found

    Differential qd algorithm with shifts for rank-structured matrices

    Although QR iterations dominate in eigenvalue computations, there are several important cases when alternative LR-type algorithms may be preferable. In particular, in the symmetric tridiagonal case the differential qd algorithm with shifts (dqds) proposed by Fernando and Parlett often enjoys faster convergence while preserving high relative accuracy (which is not guaranteed by the QR algorithm). In eigenvalue computations for rank-structured matrices the QR algorithm is also a popular choice since, in the symmetric case, it preserves the rank structure. In the unsymmetric case, however, the QR algorithm destroys the rank structure and, hence, LR-type algorithms come into play once again. In the present paper we derive several variants of qd algorithms for quasiseparable matrices. Remarkably, one of them, when applied to Hessenberg matrices, becomes a direct generalization of the dqds algorithm for tridiagonal matrices. It can therefore be applied to such important matrices as companion and confederate matrices, and provides an alternative algorithm for finding the roots of a polynomial represented in a basis of orthogonal polynomials. Results of preliminary numerical experiments are presented.
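    The tridiagonal dqds iteration that this abstract generalizes fits in a few lines. Below is a minimal illustrative sketch (not the paper's quasiseparable variant): one dqds step maps the qd arrays (q, e), which encode an LU-type bidiagonal factorization of a tridiagonal matrix, to the arrays of a shifted similar matrix; with zero shift, repeated steps drive e to zero and q to the eigenvalues.

```python
def dqds_step(q, e, s=0.0):
    """One differential qd step with shift s (Fernando-Parlett dqds).

    (q, e) encode an LU-type bidiagonal factorization of a tridiagonal
    matrix; the step returns the arrays of the shifted similar matrix.
    """
    n = len(q)
    q_new, e_new = [0.0] * n, [0.0] * (n - 1)
    d = q[0] - s
    for k in range(n - 1):
        q_new[k] = d + e[k]        # new diagonal qd entry
        t = q[k + 1] / q_new[k]
        e_new[k] = e[k] * t        # off-diagonal entry shrinks toward 0
        d = d * t - s              # differential recurrence for d
    q_new[n - 1] = d
    return q_new, e_new

# Zero-shift iteration on q = [4, 1], e = [1]: the corresponding
# tridiagonal matrix T = L*U = [[4, 1], [4, 2]] has eigenvalues
# 3 +/- sqrt(5), which the q array approaches.
q, e = [4.0, 1.0], [1.0]
for _ in range(60):
    q, e = dqds_step(q, e)
```

    In practice the shift s is chosen adaptively (e.g. just below the smallest eigenvalue) to accelerate the linear convergence of the zero-shift iteration.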

    Multilevel quasiseparable matrices in PDE-constrained optimization

    Optimization problems with constraints in the form of a partial differential equation arise frequently in engineering design. The discretization of PDE-constrained optimization problems results in large-scale linear systems of saddle-point type. In this paper we propose and develop a novel approach to solving such systems by exploiting so-called quasiseparable matrices. One may think of an ordinary quasiseparable matrix as a discrete analog of the Green's function of a one-dimensional differential operator. A nice feature of such matrices is that almost every algorithm involving them has linear complexity. We extend the application of quasiseparable matrices to problems in higher dimensions. Namely, we construct a class of preconditioners which can be computed and applied at linear computational cost. Their use with appropriate Krylov methods leads to algorithms of nearly linear complexity.
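    To illustrate the linear-complexity claim, here is a hedged sketch (not the paper's preconditioner) of the O(n) matrix-vector product with an order-one quasiseparable matrix given by generators d, p, q, a (lower part) and g, h, b (upper part); one forward and one backward sweep carry the rank-one coupling, and a dense builder is included only to check the result.

```python
def qs_matvec(d, p, q, a, g, h, b, x):
    """y = A x in O(n) for an order-one quasiseparable A with generators
    A[i][j] = p[i]*a[i-1]*...*a[j+1]*q[j]  (i > j),
    A[i][i] = d[i],
    A[i][j] = g[i]*b[i+1]*...*b[j-1]*h[j]  (i < j).
    All generator lists have length n; edge entries a[0], b[n-1] are
    only ever multiplied by zero states."""
    n = len(x)
    y = [d[i] * x[i] for i in range(n)]
    t = 0.0                                  # running lower-part state
    for i in range(1, n):
        t = a[i - 1] * t + q[i - 1] * x[i - 1]
        y[i] += p[i] * t
    s = 0.0                                  # running upper-part state
    for i in range(n - 2, -1, -1):
        s = b[i + 1] * s + h[i + 1] * x[i + 1]
        y[i] += g[i] * s
    return y

def qs_dense(d, p, q, a, g, h, b):
    """Build the same matrix densely (cubic cost; for checking only)."""
    n = len(d)
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = d[i]
        for j in range(i):
            v = q[j]
            for k in range(j + 1, i):
                v *= a[k]
            A[i][j] = p[i] * v
        for j in range(i + 1, n):
            v = h[j]
            for k in range(i + 1, j):
                v *= b[k]
            A[i][j] = g[i] * v
    return A

# A 4x4 example with arbitrary generators; unused edge entries hold 9.0.
d = [1.0, 2.0, 3.0, 4.0]
p = [0.0, 2.0, 1.0, 3.0]; q = [1.0, 2.0, 1.0, 0.0]; a = [9.0, 0.5, 2.0, 9.0]
g = [1.0, 2.0, 1.0, 0.0]; h = [0.0, 1.0, 3.0, 2.0]; b = [9.0, 1.0, 0.5, 9.0]
x = [1.0, -1.0, 2.0, 0.5]
y_fast = qs_matvec(d, p, q, a, g, h, b, x)
```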

    Matrix-free interior point method for compressed sensing problems

    We consider a class of optimization problems for sparse signal reconstruction which arise in the field of Compressed Sensing (CS). A plethora of approaches and solvers exist for such problems, for example GPSR, FPC_AS, SPGL1, NESTA, ℓ1_ℓs and PDCO, to mention a few. Compressed Sensing applications lead to very well conditioned optimization problems and therefore can be solved easily by simple first-order methods. Interior point methods (IPMs) rely on the Newton method and hence use second-order information. They have numerous advantageous features and one clear drawback: being a second-order approach, they need to solve linear systems of equations, an operation that has (in the general dense case) O(n^3) computational complexity. Attempts have been made to specialize IPMs to sparse reconstruction problems, and they have led to interesting developments implemented in the ℓ1_ℓs and PDCO software packages. We go a few steps further. First, we use the matrix-free interior point method, an approach which redesigns the IPM to avoid the need to explicitly formulate (and store) the Newton equation systems. Second, we exploit the special features of the signal processing matrices within the matrix-free IPM. Two such features are of particular interest: the excellent conditioning of these matrices and the ability to perform inexpensive (low-complexity) matrix-vector multiplications with them. Computational experience with large-scale one-dimensional signals confirms that the new approach is efficient and offers an attractive alternative to other state-of-the-art solvers.
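    The "matrix-free" idea hinges on solving each Newton system with an iterative method that touches the matrix only through matrix-vector products. As an illustration of that principle (a generic sketch, not the paper's solver), here is a plain conjugate gradient routine that accepts the operator as a callback, so the matrix is never formed or stored:

```python
def cg(apply_A, b, tol=1e-10, max_iter=1000):
    """Conjugate gradients for A x = b, A symmetric positive definite.
    The matrix enters only via the callback apply_A(v) -> A @ v."""
    n = len(b)
    x = [0.0] * n
    r = list(b)                      # residual b - A x (x = 0 initially)
    p = list(r)                      # search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        if rs ** 0.5 < tol:
            break
        Ap = apply_A(p)
        alpha = rs / sum(pi * Api for pi, Api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * Api for ri, Api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Solve [[4, 1], [1, 3]] x = [1, 2] purely through matvec callbacks;
# the exact solution is x = [1/11, 7/11].
matvec = lambda v: [4 * v[0] + v[1], v[0] + 3 * v[1]]
x = cg(matvec, [1.0, 2.0])
```

    For CS matrices the callback would apply a fast transform (e.g. a partial DCT) instead of a stored matrix, which is exactly what makes each Newton step inexpensive.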

    Quasiseparable Matrices and Polynomials

    No full text
    The interplay between structured matrices and corresponding systems of polynomials is a classical topic, and two classical matrices, Jacobi (tridiagonal) and unitary Hessenberg, that are often studied in this context are known to correspond to real orthogonal polynomials and Szegő polynomials, respectively. These two polynomial families arise in a wide variety of applications, and their short recurrence relations are often at the heart of a number of fast algorithms involving them. It has been shown recently that a family of low-rank structured matrices called quasiseparable includes both unitary Hessenberg and tridiagonal matrices, thus allowing one to obtain true generalizations of several classical algorithms. Quasiseparable matrices also arise in many applications in linear system theory and control, statistics, mechanics, orthogonal polynomials and other areas. This justifies why quasiseparable matrices have been among the hottest research topics in numerical linear algebra in recent years.
    We present several results obtained for quasiseparable matrices and related areas. First, we describe the results of an error analysis of several published quasiseparable system solvers, which indicate that only one of them is a provably backward stable algorithm while the others are not. To the best of our knowledge, this is the first error analysis result for this type of matrices. Second, we obtain a classification of the subclasses of Hessenberg quasiseparable matrices via the recurrence relations satisfied by their characteristic polynomials, and vice versa. These results let us generalize classical fast Traub-like algorithms for the inversion of polynomial Vandermonde matrices to more general classes of polynomials. Next, we generalize some already classical results for CMV matrices, which are important in orthogonal polynomial theory. We also generalize the celebrated Markel-Gray filter signal flow graph structure and the Kimura structure. Finally, we study the relation between signal flow graphs, quasiseparable matrices and numerical linear algebra algorithms.
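    The short recurrence relations mentioned above are easiest to see in the tridiagonal (Jacobi) case: the characteristic polynomials of the leading principal submatrices satisfy a three-term recurrence. A small illustrative sketch (a textbook fact, not an algorithm from the thesis) evaluating det(T - x*I) this way for a symmetric tridiagonal T:

```python
def tridiag_charpoly(a, b, x):
    """Evaluate det(T - x*I) for the symmetric tridiagonal matrix T with
    diagonal a[0..n-1] and off-diagonal b[0..n-2], via the classical
    three-term recurrence p_k = (a_k - x) p_{k-1} - b_{k-1}^2 p_{k-2}."""
    p_prev, p_curr = 1.0, a[0] - x
    for k in range(1, len(a)):
        p_prev, p_curr = p_curr, (a[k] - x) * p_curr - b[k - 1] ** 2 * p_prev
    return p_curr

# T = [[2, 1], [1, 2]] has eigenvalues 1 and 3, so both are roots of
# its characteristic polynomial, and det(T) = 3 at x = 0.
```

    The Hessenberg quasiseparable classification in the abstract replaces this three-term recurrence with more general short recurrences, which is what connects the matrix subclasses to polynomial families.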

    Differential qd Algorithm with Shifts for Rank-Structured Matrices

    No full text