
    Numerical Implementation of Gradient Algorithms

    A numerical method for the computational implementation of gradient dynamical systems is presented. The method builds on geometric integration, a family of numerical methods that aim to preserve the dynamical properties of the original ordinary differential equation under discretization. In particular, the proposed method belongs to the class of discrete gradient methods, which replace the gradient of the continuous equation with a discrete gradient, yielding a map that possesses the same Lyapunov function as the original dynamical system and thus preserves its qualitative properties regardless of the step size. In this work, we apply a discrete gradient method to the implementation of Hopfield neural networks. Contrary to most geometric integration methods, the proposed algorithm can be rewritten in explicit form, which considerably improves its performance and stability. Simulation results show that preserving the Lyapunov function leads to improved performance compared to the conventional discretization.
    Funding: Spanish Government project no. TIN2010-16556; Junta de Andalucía project no. P08-TIC-04026; Agencia Española de Cooperación Internacional para el Desarrollo project no. A2/038418/1
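    The key property of a discrete gradient method can be sketched on a toy problem. The example below is not the paper's algorithm for Hopfield networks; it is a minimal illustration, assuming a plain gradient flow dx/dt = -∇V(x) with a quadratic energy V(x) = ½xᵀAx, for which the midpoint discrete gradient A(x + x')/2 satisfies the defining identity ∇̄V(x, x')·(x' − x) = V(x') − V(x) exactly, so the energy decreases at every step no matter how large the step size:

```python
import numpy as np

# Gradient flow dx/dt = -grad V(x) with quadratic energy
# V(x) = 0.5 x^T A x, A symmetric positive definite (illustrative values).
# Midpoint discrete gradient: dgrad(x, x') = A (x + x') / 2.
# Scheme: x' = x - h * dgrad(x, x'), implicit in general, but for a
# quadratic V it can be solved in closed (explicit) form.

A = np.array([[2.0, 0.5],
              [0.5, 1.0]])

def V(x):
    return 0.5 * x @ A @ x

def step(x, h):
    # (I + h A / 2) x' = (I - h A / 2) x  -- a linear solve per step
    n = len(x)
    return np.linalg.solve(np.eye(n) + 0.5 * h * A,
                           (np.eye(n) - 0.5 * h * A) @ x)

x = np.array([1.0, -2.0])
energies = [V(x)]
for _ in range(20):
    x = step(x, h=5.0)          # deliberately large step size
    energies.append(V(x))

# The discrete gradient identity gives V(x') - V(x) = -h * |dgrad|^2,
# so V decreases monotonically regardless of h.
assert all(e1 < e0 for e0, e1 in zip(energies, energies[1:]))
```

    An explicit Euler step with the same h = 5 would diverge here; the discrete gradient scheme keeps the Lyapunov function decreasing, which is the qualitative property the abstract refers to.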

    Statics and dynamics of an Ashkin-Teller neural network with low loading

    An Ashkin-Teller neural network, allowing for two types of neurons, is considered in the case of low loading, as a function of the strength of the respective couplings between these neurons. The storage and retrieval of embedded patterns built from the two types of neurons, with different degrees of (in)dependence, is studied. In particular, thermodynamic properties, including the existence and stability of Mattis states, are discussed. Furthermore, the dynamic behaviour is examined by deriving flow equations for the macroscopic overlap. It is found that for linked patterns the model shows better retrieval properties than a corresponding Hopfield model.
    Comment: 20 pages, 6 figures, LaTeX with postscript figures in one tar.gz file

    Delay-independent stability in bidirectional associative memory networks

    It is shown that if the neuronal gains are small compared with the synaptic connection weights, then a bidirectional associative memory network with axonal signal transmission delays converges to the equilibria associated with the exogenous inputs to the network. Both discrete and continuously distributed delays are considered; the asymptotic stability is global in the state space of neuronal activations and is also independent of the delays.
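    The small-gain condition can be illustrated with a toy discrete-time bidirectional associative memory. This is a sketch under stated assumptions, not the delayed model of the paper: the weights, inputs, and gain value are arbitrary illustrative choices, and a small gain g (relative to the weight norm) makes the two-layer update a contraction, so the state settles at a unique equilibrium determined by the inputs:

```python
import numpy as np

# Toy discrete-time BAM: layers x and y coupled through W and W^T,
# with exogenous inputs I and J. With a small neuronal gain g
# (g * ||W|| < 1) the update map is a contraction, so iteration
# converges to the unique equilibrium set by the inputs.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))          # illustrative synaptic weights
I = rng.normal(size=3)               # illustrative exogenous inputs
J = rng.normal(size=4)
g = 0.1                              # small neuronal gain

def f(u):
    return np.tanh(g * u)            # activation with slope (gain) g

x, y = np.zeros(3), np.zeros(4)
for _ in range(200):
    # parallel update of both layers
    x, y = f(W @ y + I), f(W.T @ x + J)

# At equilibrium one more update leaves the state unchanged.
assert np.allclose(x, f(W @ y + I), atol=1e-8)
assert np.allclose(y, f(W.T @ x + J), atol=1e-8)
```

    The contraction argument is gain-based rather than delay-based, which is consistent with the abstract's point that convergence does not depend on the delay values, only on the gains being small relative to the weights.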

    Pattern reconstruction and sequence processing in feed-forward layered neural networks near saturation

    The dynamics and the stationary states of the competition between pattern reconstruction and asymmetric sequence processing are studied here in an exactly solvable feed-forward layered neural network model of binary units and patterns near saturation. Earlier work by Coolen and Sherrington on parallel dynamics far from saturation is extended here to account for finite stochastic noise due to a Hebbian and a sequential learning rule. Phase diagrams are obtained with stationary states and quasi-periodic non-stationary solutions. The relevant dependence of these diagrams and of the quasi-periodic solutions on the stochastic noise and on the initial inputs for the overlaps is explicitly discussed.
    Comment: 9 pages, 7 figures
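    The macroscopic overlap used as the order parameter in such layered models can be shown on a toy example. This sketch is far from saturation (p ≪ N) and uses only a plain Hebbian rule, so it illustrates pattern reconstruction and the overlap observable, not the saturated regime or the sequential rule analysed in the paper; all sizes are illustrative assumptions:

```python
import numpy as np

# Toy feed-forward layered network of binary units with Hebbian
# couplings between consecutive layers, storing p random patterns.
# The macroscopic overlap m_l = (1/N) sum_i xi_i S_i^l measures how
# close layer l is to the stored pattern.
rng = np.random.default_rng(1)
N, p, layers = 500, 5, 10            # far from saturation: p << N
xi = rng.choice([-1, 1], size=(p, N))

# Same Hebbian coupling matrix between every pair of adjacent layers
Jmat = xi.T @ xi / N

# Feed a noisy version of pattern 0 into the first layer (15% flips)
S = np.where(rng.random(N) < 0.85, xi[0], -xi[0])
overlaps = [xi[0] @ S / N]
for _ in range(layers):
    S = np.sign(Jmat @ S)            # deterministic layer-to-layer update
    overlaps.append(xi[0] @ S / N)

# Pattern reconstruction: the overlap grows from ~0.7 toward 1
assert overlaps[-1] > overlaps[0]
```

    Near saturation (p of order N) the crosstalk between patterns becomes macroscopic and this simple picture breaks down, which is where the exact solution studied in the abstract is needed.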