
    First-order approximation of Gram-Schmidt orthonormalization beats deflation in coupled PCA learning rules

    Möller R. First-order approximation of Gram-Schmidt orthonormalization beats deflation in coupled PCA learning rules. Neurocomputing. 2006;69(13-15):1582-1590. In coupled learning rules for principal component analysis, eigenvectors and eigenvalues are simultaneously estimated in a coupled system of equations. Coupled single-neuron rules have favorable convergence properties. For the estimation of multiple eigenvectors, orthonormalization methods have to be applied: either full Gram-Schmidt orthonormalization, its first-order approximation as used in Oja's stochastic gradient ascent algorithm, or deflation as in Sanger's generalized Hebbian algorithm. This paper reports the observation that a first-order approximation of Gram-Schmidt orthonormalization is superior to the standard deflation procedure in coupled learning rules. The first-order approximation exhibits a smaller orthonormality error and produces eigenvectors and eigenvalues of better quality. This improvement is essential for applications where multiple principal eigenvectors have to be estimated simultaneously rather than sequentially. Moreover, loss of orthonormality may have a harmful effect on subsequent processing stages, like the computation of distance measures for competition in local PCA methods. © 2005 Elsevier B.V. All rights reserved.
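    The two multi-neuron schemes contrasted in the abstract can be illustrated side by side. The sketch below (a hypothetical test setup, not the paper's coupled rules or experiments) implements Sanger's generalized Hebbian algorithm (deflation) and Oja's stochastic gradient ascent rule (first-order Gram-Schmidt approximation) on synthetic Gaussian data, and reports the orthonormality error ||W Wᵀ − I|| for each; the data dimensions, learning rate, and covariance spectrum are all assumed for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical setup: 5-D Gaussian data with a known covariance spectrum.
    d, m, n_steps, lr = 5, 3, 20000, 0.01
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    cov_sqrt = Q @ np.diag(np.sqrt([5.0, 3.0, 1.0, 0.5, 0.1])) @ Q.T

    def gha_step(W, x, lr):
        """Sanger's GHA: deflation via lower-triangular feedback.
        dw_j = lr * y_j * (x - sum_{i<=j} y_i w_i)."""
        y = W @ x
        W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

    def sga_step(W, x, lr):
        """Oja's SGA: first-order approximation of Gram-Schmidt.
        dw_j = lr * y_j * (x - y_j w_j - 2 * sum_{i<j} y_i w_i)."""
        y = W @ x
        L = np.tril(np.outer(y, y), -1)  # strictly lower part, weighted by 2
        W += lr * (np.outer(y, x) - (np.diag(y**2) + 2.0 * L) @ W)

    def ortho_error(W):
        """Deviation of the weight rows from orthonormality."""
        return np.linalg.norm(W @ W.T - np.eye(W.shape[0]))

    W_gha = 0.1 * rng.standard_normal((m, d))
    W_sga = W_gha.copy()
    for _ in range(n_steps):
        x = cov_sqrt @ rng.standard_normal(d)
        gha_step(W_gha, x, lr)
        sga_step(W_sga, x, lr)

    print("GHA orthonormality error:", ortho_error(W_gha))
    print("SGA orthonormality error:", ortho_error(W_sga))
    ```

    Both rules drive the rows of W toward the leading eigenvectors; the paper's point is that, when embedded in coupled eigenvector/eigenvalue learning rules, the first-order Gram-Schmidt variant keeps the weight rows closer to orthonormal than deflation does.
    
    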

    System- and Data-Driven Methods and Algorithms

    The increasing complexity of models used to predict real-world systems creates a need for algorithms that replace complex models with far simpler ones while preserving the accuracy of the predictions. This two-volume handbook covers methods as well as applications. The first volume focuses on real-time control theory, data assimilation, real-time visualization, high-dimensional state spaces, and the interaction of different reduction techniques.