The use of Lanczos's method to solve the large generalized symmetric definite eigenvalue problem
The generalized eigenvalue problem, Kx = λMx, is of significant practical importance, especially in structural engineering, where it arises as the vibration and buckling problem. A new algorithm, LANZ, based on Lanczos's method is developed. LANZ uses a technique called dynamic shifting to improve the efficiency and reliability of the Lanczos algorithm. A new algorithm for solving the tridiagonal matrices that arise when using Lanczos's method is described. A modification of Parlett and Scott's selective orthogonalization algorithm is proposed. Results from an implementation of LANZ on a Convex C-220 show it to be superior to a subspace iteration code.
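The shifted Lanczos strategy described above can be sketched with SciPy's ARPACK-based solver, which also runs a Lanczos iteration in shift-invert mode. This is a stand-in, not the LANZ code; the 1-D Laplacian "stiffness" matrix and identity "mass" matrix are illustrative assumptions:

```python
# Minimal sketch: solving Kx = lambda*Mx with shift-invert Lanczos,
# using scipy.sparse.linalg.eigsh as a stand-in for a LANZ-style solver.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n = 200
# K: symmetric positive definite stiffness (1-D Laplacian), M: identity mass.
K = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")
M = sp.identity(n, format="csc")

# Shift-invert around sigma targets the eigenvalues nearest sigma; the
# dynamic shifting of LANZ amounts to updating sigma as pairs converge.
sigma = 0.0
vals, vecs = eigsh(K, k=4, M=M, sigma=sigma, which="LM")

# Residual check: ||K x - lambda M x|| should be tiny for each pair.
for lam, x in zip(vals, vecs.T):
    assert np.linalg.norm(K @ x - lam * (M @ x)) < 1e-8
print(np.round(vals, 6))
```

The shift-invert transform maps eigenvalues near the shift to extremal ones, which is what lets a Lanczos iteration converge to interior vibration modes quickly.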
Penalized Orthogonal Iteration for Sparse Estimation of Generalized Eigenvalue Problem
We propose a new algorithm for sparse estimation of eigenvectors in generalized eigenvalue problems (GEP). The GEP arises in a number of modern data-analytic situations and statistical methods, including principal component analysis (PCA), multiclass linear discriminant analysis (LDA), canonical correlation analysis (CCA), sufficient dimension reduction (SDR) and invariant co-ordinate selection. We propose to modify the standard generalized orthogonal iteration with a sparsity-inducing penalty for the eigenvectors. To achieve this goal, we generalize the equation-solving step of orthogonal iteration to a penalized convex optimization problem. The resulting algorithm, called penalized orthogonal iteration, provides accurate estimation of the true eigenspace when it is sparse. Also proposed is a computationally more efficient alternative, which works well for PCA and LDA problems. Numerical studies reveal that the proposed algorithms are competitive, and that our tuning procedure works well. We demonstrate applications of the proposed algorithm to obtain sparse estimates for PCA, multiclass LDA, CCA and SDR. Supplementary materials are available online.
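A minimal sketch of the idea, under stated assumptions: this is not the authors' solver, but it replaces the equation-solving step of generalized orthogonal iteration with a proximal-gradient lasso solve, followed by re-orthogonalization via QR. The toy matrices and the penalty parameter tau are invented for illustration:

```python
# Sketch of a penalized orthogonal iteration for A v = lambda B v:
# the exact solve step is replaced by proximal-gradient minimization of
# 0.5*tr(Y'BY) - tr(Y'AQ) + tau*||Y||_1, which soft-thresholds toward sparsity.
import numpy as np

def soft_threshold(X, t):
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def penalized_orthogonal_iteration(A, B, k, tau=0.05, n_iter=50, inner=10):
    rng = np.random.default_rng(0)
    n = A.shape[0]
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    step = 1.0 / np.linalg.norm(B, 2)          # gradient step size
    for _ in range(n_iter):
        Y = Q.copy()
        for _ in range(inner):                  # proximal-gradient inner loop
            grad = B @ Y - A @ Q
            Y = soft_threshold(Y - step * grad, step * tau)
        Q, _ = np.linalg.qr(Y)                  # re-orthogonalize
    return Q

# Toy problem: a sparse leading eigenspace supported on six coordinates.
n, k = 30, 2
B = np.eye(n)
U = np.zeros((n, k)); U[:3, 0] = 1 / np.sqrt(3); U[3:6, 1] = 1 / np.sqrt(3)
A = 5.0 * U @ U.T + 0.01 * np.eye(n)
Q = penalized_orthogonal_iteration(A, B, k)

# The estimated basis should concentrate on the first six coordinates.
energy = np.sum(Q[:6] ** 2) / np.sum(Q ** 2)
print(round(energy, 3))
```

With B = I this reduces to a sparse power iteration; the generalized case only changes the quadratic term being minimized in the inner loop.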
On pole-swapping algorithms for the eigenvalue problem
Pole-swapping algorithms, which are generalizations of the QZ algorithm for the generalized eigenvalue problem, are studied. A new modular (and therefore more flexible) convergence theory that applies to all pole-swapping algorithms is developed. A key component of all such algorithms is a procedure that swaps two adjacent eigenvalues in a triangular pencil. An improved swapping routine is developed, and its superiority over existing methods is demonstrated by a backward error analysis and numerical tests. The modularity of the new convergence theory and the generality of the pole-swapping approach shed new light on bi-directional chasing algorithms, optimally packed shifts, and bulge pencils, and allow the design of novel algorithms.
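The swapping step has a classical counterpart in reordering a QZ decomposition; a brief sketch using SciPy, where the pencil below is random and purely illustrative:

```python
# Sketch: QZ decomposition of a pencil (A, B) and eigenvalue reordering.
# ordqz moves selected eigenvalues to the top-left of the triangular pencil,
# a routine built from exactly the kind of adjacent swaps discussed above.
import numpy as np
from scipy.linalg import ordqz, qz

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
B = rng.standard_normal((5, 5))

# QZ: A = Q @ AA @ Z.T, B = Q @ BB @ Z.T with (AA, BB) (quasi-)triangular.
AA, BB, Q, Z = qz(A, B, output="real")
assert np.allclose(Q @ AA @ Z.T, A)

# Reorder so eigenvalues inside the unit circle come first ("iuc").
AAs, BBs, alpha, beta, Qs, Zs = ordqz(A, B, sort="iuc", output="real")
eig_sorted = alpha / beta                     # generalized eigenvalues
assert np.allclose(Qs @ AAs @ Zs.T, A)        # still a valid decomposition
print(np.round(np.abs(eig_sorted), 3))
```

Pole-swapping generalizes this picture by allowing poles other than zero and infinity to move through the pencil, but the backward-error concerns of the swap itself are the same.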
Residual Component Analysis
Probabilistic principal component analysis (PPCA) seeks a low dimensional representation of a data set in the presence of independent spherical Gaussian noise, Σ = σ²I. The maximum likelihood solution for the model is an eigenvalue problem on the sample covariance matrix. In this paper we consider the situation where the data variance is already partially explained by other factors, e.g. covariates of interest, or temporal correlations leaving some residual variance. We decompose the residual variance into its components through a generalized eigenvalue problem, which we call residual component analysis (RCA). We show that canonical covariates analysis (CCA) is a special case of our algorithm and explore a range of new algorithms that arise from the framework. We illustrate the ideas on a gene expression time series data set and the recovery of human pose from silhouette.
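The core RCA computation, a generalized symmetric-definite eigenproblem contrasting the sample covariance with the covariance already explained by known factors, can be sketched as follows. The factor structure and covariances here are invented for illustration, not data from the paper:

```python
# Sketch: residual components as generalized eigenvectors S v = lambda Sigma v,
# where S is the sample covariance and Sigma is the explained-plus-noise model.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n, d = 500, 6

# A known factor explains variance along e1; residual structure lies along e2.
known = np.outer(rng.standard_normal(n), np.eye(d)[0]) * 3.0
resid = np.outer(rng.standard_normal(n), np.eye(d)[1]) * 2.0
X = known + resid + 0.1 * rng.standard_normal((n, d))

S = np.cov(X, rowvar=False)
Sigma = 9.0 * np.outer(np.eye(d)[0], np.eye(d)[0]) + 0.1**2 * np.eye(d)

vals, vecs = eigh(S, Sigma)        # generalized symmetric-definite solve
top = vecs[:, -1]                  # residual component: largest lambda
top = top / np.linalg.norm(top)
print(np.round(np.abs(top), 2))    # expected to concentrate on coordinate 2
```

Directions whose variance is fully explained by Sigma get generalized eigenvalue near 1, so the large eigenvalues isolate precisely the unexplained, residual structure.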
Generalized Pseudospectral Shattering and Inverse-Free Matrix Pencil Diagonalization
We present a randomized, inverse-free algorithm for producing an approximate diagonalization of any matrix pencil (A, B). The bulk of the algorithm rests on a randomized divide-and-conquer eigensolver for the generalized eigenvalue problem originally proposed by Ballard, Demmel, and Dumitriu [Technical Report 2010]. We demonstrate that this divide-and-conquer approach can be formulated to succeed with high probability as long as the input pencil is sufficiently well-behaved, which is accomplished by generalizing the recent pseudospectral shattering work of Banks, Garza-Vargas, Kulkarni, and Srivastava [Foundations of Computational Mathematics 2022]. In particular, we show that perturbing and scaling (A, B) regularizes its pseudospectra, allowing divide-and-conquer to run over a simple random grid and in turn producing an accurate diagonalization of (A, B) in the backward error sense. The main result of the paper states the existence of a randomized algorithm that with high probability (and in exact arithmetic) produces invertible S, T and diagonal D such that ||A - SDT⁻¹|| ≤ ε and ||B - ST⁻¹|| ≤ ε in at most O(T_MM(n) log²(n/ε)) operations, where T_MM(n) is the asymptotic complexity of matrix multiplication. This not only provides a new set of guarantees for highly parallel generalized eigenvalue solvers but also establishes nearly matrix multiplication time as an upper bound on the complexity of exact arithmetic matrix pencil diagonalization.
Comment: 58 pages, 8 figures, 2 tables
From FNS to HEIV: A link between two vision parameter estimation methods
Copyright © 2004 IEEE. Problems requiring accurate determination of parameters from image-based quantities arise often in computer vision. Two recent, independently developed frameworks for estimating such parameters are the FNS and HEIV schemes. Here, it is shown that FNS and a core version of HEIV are essentially equivalent, solving a common underlying equation via different means. The analysis is driven by the search for a nondegenerate form of a certain generalized eigenvalue problem and effectively leads to a new derivation of the relevant case of the HEIV algorithm. This work may be seen as an extension of previous efforts to rationalize and interrelate a spectrum of estimators, including the renormalization method of Kanatani and the normalized eight-point method of Hartley.
Wojciech Chojnacki, Michael J. Brooks, Anton van den Hengel, and Darren Gawley
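At each iteration, schemes of this kind reduce to extracting the eigenvector attached to the smallest generalized eigenvalue of a data-dependent pencil. A minimal sketch of that core extraction step with toy symmetric matrices; M and N below are random stand-ins, not the vision-derived matrices of FNS or HEIV:

```python
# Sketch: the generalized-eigenvector step common to FNS/HEIV-style iterations.
# theta is taken as the eigenvector of the smallest eigenvalue of M v = lambda N v.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
d = 4
G = rng.standard_normal((d, d))
M = G @ G.T                      # toy symmetric "scatter" matrix
H = rng.standard_normal((d, d))
N = H @ H.T + d * np.eye(d)      # toy symmetric positive definite weight matrix

vals, vecs = eigh(M, N)
theta = vecs[:, 0]               # eigenvector of the smallest eigenvalue
# Verify it satisfies M theta = lambda_min N theta.
assert np.allclose(M @ theta, vals[0] * (N @ theta))
print(round(vals[0], 6))
```

In the actual estimators, M and N are recomputed from the data and the current theta, and this extraction is repeated to a fixed point; the equivalence result above concerns that shared fixed-point equation.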