
    Recurrent neural networks for solving matrix algebra problems

    The aim of this dissertation is the application of recurrent neural networks (RNNs) to solving selected problems of matrix algebra, with particular reference to the computation of generalized inverses and the solution of matrix equations with constant (time-invariant) coefficient matrices. We examine the correspondence between the dynamic state equations of recurrent neural networks for computing generalized inverses and the integral representations of those inverses. The recurrent neural networks are composed of independent parts (sub-networks) that can operate simultaneously, so parallel and distributed processing is achieved; in this way, computational advantages over existing sequential algorithms can be attained in real-time applications. We investigate and exploit an analogy between the scaled hyperpower family (SHPI family) of iterative methods for computing the matrix inverse and the discretization of Zhang Neural Network (ZNN) models. Based on this analogy, we define a class of ZNN models corresponding to the family of hyperpower iterative methods for computing generalized inverses. The Matlab Simulink implementation of the introduced ZNN models is described for the scaled hyperpower methods of orders 2 and 3. Finally, we present a Matlab Simulink model of a hybrid recursive neural implicit dynamics and compare it in simulation with the existing Zhang dynamics for real-time matrix inversion; the simulation results confirm the superior convergence of the hybrid model compared to the Zhang model.
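
    As a minimal illustration of the kind of dynamic state equation involved, the following Matlab script integrates a standard gradient-based recurrent dynamics whose equilibrium is the inverse of a constant nonsingular matrix. This is a sketch only: the specific model dX/dt = -gamma*A'*(A*X - I), the gain gamma, and the test matrix are illustrative assumptions, not the dissertation's Simulink models.

        % Gradient-based recurrent dynamics for constant matrix inversion (sketch).
        % The state X(t) evolves by dX/dt = -gamma*A'*(A*X - I); for nonsingular A
        % the equilibrium is X = inv(A). ode45 stands in for a Simulink solver.
        A     = [4 1; 2 3];                      % illustrative nonsingular matrix
        n     = size(A,1);
        gamma = 10;                              % design gain: larger gamma, faster decay
        rhs   = @(t, x) reshape(-gamma * (A' * (A*reshape(x, n, n) - eye(n))), [], 1);
        [~, xs] = ode45(rhs, [0 5], zeros(n*n, 1));   % integrate from the zero state
        X = reshape(xs(end, :), n, n);           % recover the matrix-valued state
        fprintf('||A*X - I||_F = %.3e\n', norm(A*X - eye(n), 'fro'));

    For rectangular or rank-deficient A, the same dynamics started from the zero state is known to drive X toward the Moore-Penrose inverse, which is the generalized-inverse setting the dissertation targets.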
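    The discretization analogy between the SHPI family and ZNN models can likewise be sketched in a few lines; again this is an assumption-laden illustration rather than the dissertation's implementation. An Euler step of the ZNN-type model dX/dt = gamma*X*(I - A*X) with step size beta/gamma reproduces the scaled order-2 hyperpower step, and beta = 1 recovers the classical Newton-Schulz iteration:

        % Scaled order-2 hyperpower iteration, read as a discretized ZNN-type model.
        % One step X <- X + beta*X*(I - A*X) is an Euler step of the continuous
        % dynamics dX/dt = gamma*X*(I - A*X) with step size beta/gamma; beta = 1
        % gives the Newton-Schulz (hyperpower order 2) method.
        A    = [4 1 0; 2 5 1; 0 1 3];            % illustrative nonsingular matrix
        I    = eye(size(A,1));
        beta = 1.0;                              % scaling factor of the SHPI step
        X    = A' / (norm(A,1)*norm(A,Inf));     % classical guess with ||I - A*X|| < 1
        for k = 1:50
            X = X + beta*X*(I - A*X);            % scaled hyperpower step of order 2
        end
        fprintf('||A*X - I||_F = %.3e\n', norm(A*X - I, 'fro'));

    With beta = 1 the residual I - A*X is squared at every step (quadratic convergence), which is the discrete counterpart of the exponential error decay of the continuous ZNN dynamics.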