13 research outputs found

    Expansion algorithm for the density matrix

    A purification algorithm for expanding the single-particle density matrix in terms of the Hamiltonian operator is proposed. The scheme works with a predefined occupation and requires less than half the number of matrix-matrix multiplications compared to existing methods at low (90%) occupancy. The expansion can be used with a fixed chemical potential, in which case it is an asymmetric generalization of, and a substantial improvement over, grand canonical McWeeny purification. It is shown that the computational complexity, measured as the number of matrix multiplications, is essentially independent of system size even for metallic materials with a vanishing band gap. Comment: 5 pages, 4 figures, to appear in Phys. Rev.
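    The abstract positions the proposed expansion against grand canonical McWeeny purification, so a minimal NumPy sketch of that classic reference scheme may help fix ideas. The initial guess, the chemical potential mu, and all names below are illustrative assumptions, not the algorithm proposed in the paper.

```python
import numpy as np

def mcweeny_purification(H, mu, n_iter=50, tol=1e-10):
    """Classic grand canonical McWeeny purification (reference scheme only,
    not the expansion algorithm proposed in the paper above)."""
    n = H.shape[0]
    evals = np.linalg.eigvalsh(H)
    # Map the spectrum linearly into [0, 1] so that states below the
    # chemical potential mu end up above 1/2 (illustrative initialization).
    lam = 1.0 / max(evals[-1] - mu, mu - evals[0])
    P = 0.5 * np.eye(n) + 0.5 * lam * (mu * np.eye(n) - H)

    for _ in range(n_iter):
        P2 = P @ P                       # two matrix-matrix products
        P_new = 3.0 * P2 - 2.0 * P2 @ P  # per McWeeny step: P <- 3P^2 - 2P^3
        if np.linalg.norm(P_new - P) < tol:
            return P_new
        P = P_new
    return P

# Example: random symmetric "Hamiltonian" with chemical potential mu = 0
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
H = 0.5 * (A + A.T)
P = mcweeny_purification(H, mu=0.0)
print("idempotency error:", np.linalg.norm(P @ P - P))
```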

    On the Convergence of Ritz Pairs and Refined Ritz Vectors for Quadratic Eigenvalue Problems

    For a given subspace, the Rayleigh-Ritz method projects the large quadratic eigenvalue problem (QEP) onto it and produces a small-sized dense QEP. Similar to the Rayleigh-Ritz method for the linear eigenvalue problem, the Rayleigh-Ritz method defines the Ritz values and the Ritz vectors of the QEP with respect to the projection subspace. We analyze the convergence of the method when the angle between the subspace and the desired eigenvector converges to zero. We prove that there is a Ritz value that converges to the desired eigenvalue unconditionally, but the Ritz vector converges conditionally and may fail to converge. To remedy the drawback of possible non-convergence of the Ritz vector, we propose a refined Ritz vector that is mathematically different from the Ritz vector and is proved to converge unconditionally. We construct examples to illustrate our theory. Comment: 20 pages
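    A minimal sketch of the projection step the abstract describes, assuming the QEP has the form (lam^2*M + lam*C + K) x = 0 and V spans the projection subspace: the small dense QEP with coefficients V^T M V, V^T C V, V^T K V is solved via a companion linearization, and the Ritz vectors are lifted back to the large space. The refined Ritz vector of the paper is not reproduced here, and the function names are illustrative.

```python
import numpy as np
import scipy.linalg as la

def rayleigh_ritz_qep(M, C, K, V):
    """Project the QEP (lam^2*M + lam*C + K) x = 0 onto span(V) and return
    the Ritz values and (unrefined) Ritz vectors -- an illustrative sketch."""
    Q, _ = np.linalg.qr(V)                  # orthonormal basis of the subspace
    Mk, Ck, Kk = (Q.T @ X @ Q for X in (M, C, K))

    k = Q.shape[1]
    # First companion linearization of the small dense QEP:
    #   [-Ck -Kk] [lam*y]        [Mk  0] [lam*y]
    #   [ I    0] [  y  ]  = lam [ 0  I] [  y  ]
    A = np.block([[-Ck, -Kk], [np.eye(k), np.zeros((k, k))]])
    B = np.block([[Mk, np.zeros((k, k))], [np.zeros((k, k)), np.eye(k)]])
    lam, Y = la.eig(A, B)

    # The bottom block of each eigenvector is the small Ritz vector y
    # (assumes finite eigenvalues); lift it back to the large space.
    ritz_vecs = Q @ Y[k:, :]
    ritz_vecs /= np.linalg.norm(ritz_vecs, axis=0)
    return lam, ritz_vecs

# Example with random coefficient matrices and a random 6-dimensional subspace
rng = np.random.default_rng(1)
n = 200
M, C, K = (rng.standard_normal((n, n)) for _ in range(3))
lam, X = rayleigh_ritz_qep(M, C, K, rng.standard_normal((n, 6)))
```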

    Two-sided Grassmann-Rayleigh quotient iteration

    The two-sided Rayleigh quotient iteration proposed by Ostrowski computes a pair of corresponding left-right eigenvectors of a matrix C. We propose a Grassmannian version of this iteration, i.e., its iterates are pairs of p-dimensional subspaces instead of one-dimensional subspaces in the classical case. The new iteration generically converges locally cubically to the pairs of left-right p-dimensional invariant subspaces of C. Moreover, Grassmannian versions of the Rayleigh quotient iteration are given for the generalized Hermitian eigenproblem, the Hamiltonian eigenproblem and the skew-Hamiltonian eigenproblem. Comment: The text is identical to a manuscript that was submitted for publication on 19 April 200
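    For context, here is a minimal NumPy sketch of the classical (p = 1) two-sided Rayleigh quotient iteration of Ostrowski that the abstract generalizes; the Grassmannian, subspace-valued version of the paper is not reproduced, and the starting vectors and stopping rule are illustrative.

```python
import numpy as np

def two_sided_rqi(C, x, y, n_iter=20, tol=1e-12):
    """Ostrowski's two-sided Rayleigh quotient iteration (p = 1 case):
    returns an approximate eigenvalue rho with right eigenvector x and
    left eigenvector y of the (generally nonsymmetric) matrix C."""
    I = np.eye(C.shape[0])
    x = x / np.linalg.norm(x)
    y = y / np.linalg.norm(y)
    rho = (y.conj() @ C @ x) / (y.conj() @ x)   # two-sided Rayleigh quotient
    for _ in range(n_iter):
        try:
            x = np.linalg.solve(C - rho * I, x)              # right update
            y = np.linalg.solve((C - rho * I).conj().T, y)   # left update
        except np.linalg.LinAlgError:
            break                       # shift is (numerically) an eigenvalue
        x /= np.linalg.norm(x)
        y /= np.linalg.norm(y)
        rho = (y.conj() @ C @ x) / (y.conj() @ x)
        if np.linalg.norm(C @ x - rho * x) < tol:
            break
    return rho, x, y

# Example: random nonsymmetric matrix, random starting pair
rng = np.random.default_rng(2)
C = rng.standard_normal((40, 40))
rho, x, y = two_sided_rqi(C, rng.standard_normal(40), rng.standard_normal(40))
print("right residual:", np.linalg.norm(C @ x - rho * x))
```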

    Backward Errors for Eigenvalue and Singular Value Decompositions

    We present bounds on the backward errors for the symmetric eigenvalue decomposition and the singular value decomposition in the two-norm and in the Frobenius norm. Through different orthogonal decompositions of the computed eigenvectors we can define different symmetric backward errors for the eigenvalue decomposition. When the computed eigenvectors have a small residual and are close to orthonormal, all backward errors tend to be small. Consequently, it does not matter how exactly a backward error is defined and how exactly the residual and the deviation from orthogonality are measured. Analogous results hold for the singular vectors. We indicate the effect of our error bounds on implementations for eigenvector and singular vector computation. In a more general context we prove that the distance of an appropriately scaled matrix to its orthogonal QR factor is not much larger than its distance to the closest orthogonal matrix. Mathematics Subject Classification (1991): 15A18, 15A23, 15A4..
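    A minimal sketch of the two ingredients the abstract relates, namely the residual of a computed symmetric eigendecomposition and its deviation from orthogonality, both measured in the Frobenius norm; the backward error bounds themselves are in the paper and are not reproduced here.

```python
import numpy as np

def evd_residual_and_orthogonality(A, V, lam):
    """Residual ||A V - V diag(lam)||_F and loss of orthogonality
    ||V^T V - I||_F for a computed symmetric eigendecomposition."""
    residual = np.linalg.norm(A @ V - V * lam)               # Frobenius norm
    orth_loss = np.linalg.norm(V.T @ V - np.eye(V.shape[1]))
    return residual, orth_loss

# Example: a decomposition from np.linalg.eigh has both quantities at
# roundoff level, so any reasonably defined backward error is small too.
rng = np.random.default_rng(3)
B = rng.standard_normal((30, 30))
A = 0.5 * (B + B.T)
lam, V = np.linalg.eigh(A)
print(evd_residual_and_orthogonality(A, V, lam))
```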

    Two perturbation bounds for singular values and eigenvalues


    Semantic spaces: measuring the distance between different subspaces

    Semantic Space models, which provide a numerical representation of words' meanings extracted from a corpus of documents, have been formalized in terms of Hermitian operators over real-valued Hilbert spaces by Bruza et al. [1]. The collapse of a word into a particular meaning has been investigated by applying the notion of quantum collapse of superpositional states [2]. While the semantic association between words in a Semantic Space can be computed by means of the Minkowski distance [3] or the cosine of the angle between the vector representations of each pair of words, a new procedure is needed in order to establish relations between two or more Semantic Spaces. We address the question: how can the distance between different Semantic Spaces be computed? By representing each Semantic Space as a subspace of a more general Hilbert space, the relationship between Semantic Spaces can be computed by means of the subspace distance. Such a distance needs to take into account the difference in dimension between the subspaces. The availability of a distance for comparing different Semantic Subspaces would enable a deeper understanding of the geometry of Semantic Spaces, which would possibly translate into better effectiveness in Information Retrieval tasks.
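    One standard way to realize such a subspace distance is through the principal angles between the two subspaces. The sketch below uses scipy.linalg.subspace_angles and a chordal-type combination of the angles; this particular distance and the random word-vector matrices are illustrative assumptions, not the authors' construction, and only min(dim1, dim2) principal angles exist, so the handling of unequal dimensions here is the simplest possible one.

```python
import numpy as np
from scipy.linalg import subspace_angles

def semantic_subspace_distance(A, B):
    """Distance between the column spaces of A and B, possibly of different
    dimension, built from their principal angles (chordal-type distance)."""
    theta = subspace_angles(A, B)          # principal angles in radians
    return np.linalg.norm(np.sin(theta))

# Example: two "semantic spaces" spanned by random term vectors in a
# 300-dimensional embedding space (dimensions chosen for illustration)
rng = np.random.default_rng(4)
space_a = rng.standard_normal((300, 10))
space_b = rng.standard_normal((300, 7))
print(semantic_subspace_distance(space_a, space_b))
```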