
    Diagonality Measures of Hermitian Positive-Definite Matrices with Application to the Approximate Joint Diagonalization Problem

    In this paper, we introduce properly invariant diagonality measures of Hermitian positive-definite matrices. These diagonality measures are defined as distances or divergences between a given positive-definite matrix and its diagonal part. We then give closed-form expressions for these diagonality measures and discuss their invariance properties. The diagonality measure based on the log-determinant α-divergence is general enough to include, as a special case, a diagonality criterion used by the signal processing community. These diagonality measures are then used to formulate minimization problems for finding the approximate joint diagonalizer of a given set of Hermitian positive-definite matrices. Numerical computations based on a modified Newton method are presented and commented on.
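A concrete member of this family of measures is the log-determinant criterion log det(Diag(A)) − log det(A), which is nonnegative by Hadamard's inequality and vanishes exactly when A is diagonal. The sketch below is an illustration of that one criterion, not the paper's full divergence family; the function name `logdet_diagonality` is ours.

```python
import numpy as np

def logdet_diagonality(A):
    """Log-det diagonality measure: log det(Diag(A)) - log det(A).

    Nonnegative by Hadamard's inequality, zero exactly when the
    Hermitian positive-definite matrix A is diagonal.
    (Illustrative sketch; the function name is not from the paper.)
    """
    d = np.diag(A).real                   # diagonal of an HPD matrix is real and positive
    _, logdet = np.linalg.slogdet(A)      # log |det A|; det A > 0 for HPD matrices
    return float(np.sum(np.log(d)) - logdet)

# Random Hermitian positive-definite test matrix
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B @ B.conj().T + 4 * np.eye(4)

print(logdet_diagonality(A))              # strictly positive, since A is not diagonal
```

Minimizing such a measure over congruence transforms of a whole matrix set is what yields an approximate joint diagonalizer.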

    Novel Modifications of Parallel Jacobi Algorithms

    We describe two main classes of one-sided trigonometric and hyperbolic Jacobi-type algorithms for computing eigenvalues and eigenvectors of Hermitian matrices. These types of algorithms exhibit significant advantages over many other eigenvalue algorithms. If the matrices permit, both types compute the eigenvalues and eigenvectors with high relative accuracy. We present novel parallelization techniques for both the trigonometric and hyperbolic classes of algorithms, as well as new ideas on how pivoting in each cycle of the algorithm can improve the speed of the parallel one-sided algorithms. These parallelization approaches are applicable to both distributed-memory and shared-memory machines. The numerical testing performed indicates that the hyperbolic algorithms may be superior to the trigonometric ones, although, in theory, the latter seem more natural.
    Comment: Accepted for publication in Numerical Algorithms
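The rotation mechanics underlying all Jacobi-type methods are easiest to see in the classical two-sided cyclic iteration for a real symmetric matrix, sketched below. This is a simplified serial illustration only; the paper's one-sided trigonometric and hyperbolic variants apply analogous plane rotations to a factor of the matrix and are designed for parallel execution.

```python
import numpy as np

def jacobi_eigen(S, sweeps=10, tol=1e-12):
    """Cyclic (two-sided) Jacobi eigenvalue iteration for a real symmetric
    matrix: repeatedly annihilate off-diagonal entries with plane rotations.
    Returns (eigenvalues, accumulated rotation matrix V) with S ~ V diag V^T.
    (A textbook serial sketch, not the paper's parallel one-sided methods.)
    """
    A = S.astype(float).copy()
    n = A.shape[0]
    V = np.eye(n)
    for _ in range(sweeps):
        off = np.sqrt(np.sum(A**2) - np.sum(np.diag(A)**2))
        if off < tol:                     # off-diagonal mass is negligible
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p, q]) < 1e-15:
                    continue
                # Rotation angle chosen to zero out A[p, q]
                theta = 0.5 * np.arctan2(2 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(theta), np.sin(theta)
                J = np.eye(n)
                J[p, p] = J[q, q] = c
                J[p, q], J[q, p] = s, -s
                A = J.T @ A @ J           # two-sided update
                V = V @ J                 # accumulate eigenvectors
    return np.diag(A), V
```

One-sided variants instead update only a tall factor G with G @ J, which is what makes the parallelization strategies described in the abstract possible.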

    Small-Deviation Inequalities for Sums of Random Matrices

    Random matrices have played an important role in many fields, including machine learning, quantum information theory, and optimization. One of the main research focuses is deviation inequalities for the eigenvalues of random matrices. Although large-deviation inequalities for random matrices have been studied intensively, only a few works discuss the small-deviation behavior of random matrices. In this paper, we present small-deviation inequalities for the largest eigenvalues of sums of random matrices. Since the resulting inequalities are independent of the matrix dimension, they are applicable to high-dimensional and even infinite-dimensional cases.
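The quantity being bounded is a probability of the form P(λ_max(X_1 + … + X_k) ≤ ε). A Monte Carlo estimate of such a probability can be sketched as below; the rank-one Gaussian summands and the scaling are arbitrary illustrative choices, and the paper itself derives dimension-free analytic bounds rather than simulation estimates.

```python
import numpy as np

# Illustrative Monte Carlo estimate of a small-deviation probability
# P(lambda_max(X_1 + ... + X_k) <= eps) for a sum of random PSD matrices.
# (Assumed setup: rank-one Gaussian summands; not the paper's model.)
rng = np.random.default_rng(0)
d, k, trials, eps = 8, 10, 1000, 0.5

hits = 0
for _ in range(trials):
    S = np.zeros((d, d))
    for _ in range(k):
        g = rng.standard_normal(d) / np.sqrt(k * d)   # scaled so E[S] = I/d
        S += np.outer(g, g)                           # rank-one PSD summand
    if np.linalg.eigvalsh(S)[-1] <= eps:              # largest eigenvalue below eps?
        hits += 1

print(hits / trials)   # empirical small-deviation probability
```

A dimension-free bound, by contrast, controls this probability with constants that do not grow with d, which is what makes the infinite-dimensional case accessible.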

    Computing a logarithm of a unitary matrix with general spectrum

    We analyze an algorithm for computing a skew-Hermitian logarithm of a unitary matrix. This algorithm is very easy to implement using standard software, and it works well even for unitary matrices with no spectral conditions assumed. Certain examples, with many eigenvalues near -1, lead to very non-Hermitian output for other basic methods of calculating matrix logarithms. Altering the output of these algorithms to force a Hermitian output creates accuracy issues that are avoided by the considered algorithm. A modification is introduced to deal properly with J-skew-symmetric unitary matrices. Applications to numerical studies of topological insulators in two symmetry classes are discussed.
    Comment: Added discussion of Floquet Hamiltonian
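One straightforward way to obtain a skew-Hermitian logarithm of a unitary matrix is through the complex Schur form: a unitary matrix is normal, so its Schur factor is (numerically) diagonal with unimodular entries e^{iθ}, and taking logarithms of those entries gives iθ. The sketch below illustrates this idea; it is a generic construction, not the specific algorithm analyzed in the paper.

```python
import numpy as np
from scipy.linalg import schur, expm

def skew_hermitian_log(U):
    """Skew-Hermitian logarithm of a unitary matrix via the complex Schur
    form U = Q T Q^*. Since U is normal, T is diagonal up to roundoff,
    with unimodular diagonal entries exp(i*theta); the principal logarithm
    is then Q diag(i*theta) Q^*.
    (Illustrative sketch, not the algorithm analyzed in the paper.)
    """
    T, Q = schur(U, output='complex')    # U = Q T Q^*; T diagonal for normal U
    theta = np.angle(np.diag(T))         # principal arguments in (-pi, pi]
    return Q @ np.diag(1j * theta) @ Q.conj().T

# Demo on a random unitary matrix (QR factor of a complex Gaussian matrix)
rng = np.random.default_rng(3)
M = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
U, _ = np.linalg.qr(M)
L = skew_hermitian_log(U)
print(np.linalg.norm(L + L.conj().T))    # ~0: L is skew-Hermitian by construction
```

Because L = Q diag(iθ) Q^* with Q unitary, the result is skew-Hermitian up to roundoff by construction, sidestepping the symmetrization step that causes accuracy issues for other methods.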