
    Finite-time Anti-synchronization of Memristive Stochastic BAM Neural Networks with Probabilistic Time-varying Delays

    This paper investigates drive-response finite-time anti-synchronization for memristive bidirectional associative memory neural networks (MBAMNNs). Firstly, a class of MBAMNNs with mixed probabilistic time-varying delays and stochastic perturbations is formulated and analyzed. Secondly, a nonlinear control law is constructed to guarantee drive-response finite-time anti-synchronization of the neural networks. Thirdly, by employing inequality techniques and constructing an appropriate Lyapunov function, several anti-synchronization criteria are derived. Finally, a numerical simulation is provided to demonstrate the effectiveness of the proposed mechanism.
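The drive-response scheme above can be sketched on a scalar toy system (not the paper's MBAMNN model): with anti-synchronization error e = y + x, a finite-time nonlinear controller of the form u = -k·sign(e)·|e|^α with 0 < α < 1 drives e to zero in finite time. The dynamics and the gains k, α below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate(k=2.0, alpha=0.5, dt=1e-3, steps=5000):
    """Euler simulation of a toy drive system x and response system y."""
    x, y = 0.8, 0.5
    for _ in range(steps):
        fx = -x + np.tanh(x)                 # drive dynamics (assumed)
        fy = -y + np.tanh(y)                 # matching response dynamics
        e = y + x                            # anti-synchronization error
        u = -k * np.sign(e) * abs(e) ** alpha  # finite-time control law
        x += dt * fx
        y += dt * (fy + u)
    return y + x                             # final error e(T)

print(abs(simulate()))  # error is driven close to zero
```

Because tanh is odd, the uncontrolled terms cancel exactly on the anti-synchronization manifold y = -x, so the sign-power term alone forces finite-time convergence of e.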

    Turing instability and pattern formation of a fractional Hopfield reaction–diffusion neural network with transmission delay

    It is well known that integer-order neural networks with diffusion have rich spatial and temporal dynamical behaviors, including Turing patterns and Hopf bifurcation. Recently, some studies have indicated that fractional calculus can depict the memory and hereditary attributes of neural networks more accurately. In this paper, we mainly investigate the Turing pattern in a delayed reaction–diffusion neural network with a Caputo-type fractional derivative. In particular, we find that this fractional neural network can form steady spatial patterns even when its first-derivative counterpart cannot develop any steady pattern, which implies that the temporal fractional derivative contributes to pattern formation. Numerical simulations show that both the fractional derivative and the time delay influence the shape of the Turing patterns.
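Simulating such fractional reaction-diffusion systems requires discretizing the fractional derivative. A common choice (an assumption here, not necessarily the paper's scheme) is the Grünwald-Letnikov approximation D^α f(t_n) ≈ h^(-α) Σ_k c_k f(t_{n-k}), with coefficients c_0 = 1 and c_k = c_{k-1}·(1 - (α + 1)/k):

```python
def gl_derivative(samples, alpha, h):
    """Grunwald-Letnikov fractional derivative of order alpha,
    evaluated at the last of the equally spaced samples (spacing h)."""
    c, total = 1.0, samples[-1]
    for k in range(1, len(samples)):
        c *= 1.0 - (alpha + 1.0) / k     # binomial coefficient recursion
        total += c * samples[-1 - k]
    return total / h ** alpha
```

For α = 1 the coefficients collapse to (1, -1, 0, 0, ...), so the formula reduces to the ordinary backward difference (f_n - f_{n-1}) / h, which is why the integer-order model appears as the limiting case of the fractional one.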

    Stability in N-Layer recurrent neural networks

    Starting with the theory developed by Hopfield, Cohen-Grossberg and Kosko, the study of associative memories is extended to N-layer recurrent neural networks. The stability of different multilayer networks is demonstrated under specified bounding hypotheses. The analysis involves theorems for the additive as well as the multiplicative models for continuous and discrete N-layer networks. These demonstrations are based on continuous and discrete Liapunov theory. The thesis develops autoassociative and heteroassociative memories. It points out the link between all recurrent networks of this type. The discrete case is analyzed using the threshold signal function as the activation function. A general approach for studying the stability and convergence of the multilayer recurrent networks is developed.
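The Lyapunov argument that the thesis generalizes to N layers can be illustrated on the single-layer discrete case (a minimal sketch, not the thesis's multilayer model): with symmetric weights, zero self-coupling, and threshold (sign) activation, the energy E = -½ sᵀWs never increases under asynchronous updates.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(W, s):
    return -0.5 * s @ W @ s

def run(n=16):
    """Asynchronous threshold updates of a discrete Hopfield-type network."""
    W = rng.standard_normal((n, n))
    W = (W + W.T) / 2.0                  # symmetric weights
    np.fill_diagonal(W, 0.0)             # zero self-coupling
    s = rng.choice([-1.0, 1.0], size=n)
    energies = [energy(W, s)]
    for _ in range(5):
        for i in range(n):               # update one unit at a time
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
            energies.append(energy(W, s))
    return energies

es = run()
assert all(b <= a + 1e-9 for a, b in zip(es, es[1:]))  # E is non-increasing
```

Since E is bounded below and decreases (or stays flat) at every flip, the state must settle into a fixed point, which is the convergence result the bounding hypotheses extend to the multilayer additive and multiplicative models.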

    Spiking Neural Networks for Inference and Learning: A Memristor-based Design Perspective

    On metrics of density and power efficiency, neuromorphic technologies have the potential to surpass mainstream computing technologies in tasks where real-time functionality, adaptability, and autonomy are essential. While algorithmic advances in neuromorphic computing are proceeding successfully, the potential of memristors to improve neuromorphic computing has not yet borne fruit, primarily because they are often used as a drop-in replacement for conventional memory. However, interdisciplinary approaches anchored in machine learning theory suggest that multifactor plasticity rules matching neural and synaptic dynamics to the device capabilities can take better advantage of memristor dynamics and their stochasticity. Furthermore, such plasticity rules generally show much higher performance than classical Spike Time Dependent Plasticity (STDP) rules. This chapter reviews recent developments in learning with spiking neural network models and their possible implementation with memristor-based hardware.
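The classical pair-based STDP rule that the chapter contrasts with multifactor plasticity can be sketched as follows: potentiate when the presynaptic spike precedes the postsynaptic one, depress otherwise, with exponentially decaying magnitude. The amplitudes and time constant below are illustrative values, not taken from the chapter.

```python
import math

def stdp(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for spike-timing difference dt = t_post - t_pre (ms)."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau)   # pre before post: potentiation
    return -a_minus * math.exp(dt / tau)      # post before pre: depression

print(stdp(10.0))   # positive (potentiation)
print(stdp(-10.0))  # negative (depression)
```

Because the update depends only on pairwise spike timing, this rule ignores factors such as neuromodulation, firing rate, and device-level stochasticity; multifactor rules add exactly those terms, which is why they map better onto memristor dynamics.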