33 research outputs found

    Recent Advances and Applications of Fractional-Order Neural Networks

    This paper focuses on the growth, development, and future of various forms of fractional-order neural networks. Multiple advances in structure, learning algorithms, and methods are critically investigated and summarized, together with recent trends in the dynamics of such networks. The forms of fractional-order neural networks considered in this study are Hopfield, cellular, memristive, complex-valued, and quaternion-valued networks. Further, the application of fractional-order neural networks to computational problems such as system identification, control, optimization, and stability analysis is critically analyzed and discussed.
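
    As a rough illustration of the kind of dynamics these networks build on (not code from the survey itself), the sketch below simulates a small fractional-order Hopfield network using a Grünwald-Letnikov discretization of the fractional derivative; the order, weights, step size, and network size are arbitrary choices.

    import numpy as np

    # Sketch: fractional-order Hopfield network  D^alpha x = -x + W*tanh(x) + I,
    # integrated with the explicit Grünwald-Letnikov (GL) scheme.
    # All parameters below are illustrative, not taken from the survey.

    def gl_coefficients(alpha, n):
        """c_j = (-1)^j * binom(alpha, j), computed by the standard recurrence."""
        c = np.zeros(n)
        c[0] = 1.0
        for j in range(1, n):
            c[j] = c[j - 1] * (1.0 - (1.0 + alpha) / j)
        return c

    def simulate(alpha=0.9, h=0.01, steps=5000, seed=0):
        rng = np.random.default_rng(seed)
        n = 3                                        # three neurons
        W = rng.uniform(-2.0, 2.0, size=(n, n))      # random synaptic weights
        I = np.zeros(n)                              # no external input
        c = gl_coefficients(alpha, steps + 1)

        x = np.zeros((steps + 1, n))
        x[0] = rng.standard_normal(n)                # random initial state
        for k in range(1, steps + 1):
            f = -x[k - 1] + W @ np.tanh(x[k - 1]) + I
            # GL step: x_k = h^alpha * f(x_{k-1}) - sum_{j=1..k} c_j * x_{k-j}
            memory = c[1:k + 1] @ x[k - 1::-1]
            x[k] = h**alpha * f - memory
        return x

    print(simulate()[-1])                            # final neuron states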

    A switching control for finite-time synchronization of memristor-based BAM neural networks with stochastic disturbances

    This paper deals with finite-time stochastic synchronization for a class of memristor-based bidirectional associative memory neural networks (MBAMNNs) with time-varying delays and stochastic disturbances. First, based on the physical properties of the memristor and the circuit of MBAMNNs, an MBAMNN model with more reasonable switching conditions is established. Then, based on the theory of Filippov solutions, and by using Lyapunov–Krasovskii functionals and stochastic analysis techniques, a sufficient condition is given to ensure finite-time stochastic synchronization of MBAMNNs under a certain controller. Next, through a further discussion, an error-dependent switching controller is given to shorten the stochastic settling time. Finally, numerical simulations are carried out to illustrate the effectiveness of the theoretical results.
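
    A rough drive-response sketch of the kind of synchronization scheme described above (not the paper's model or controller): two identical BAM-style networks, whose weights switch with the state to mimic memristive behaviour, are coupled through a sign-based switching controller. All sizes, weights, gains, and the switching threshold are placeholder choices, and delays and stochastic disturbances are omitted.

    import numpy as np

    def f(v):
        return np.tanh(v)

    def mem_weight(W_low, W_high, state, thresh=1.0):
        """State-dependent (memristive-style) switching between two weight modes."""
        mask = (np.abs(state) <= thresh)[:, None]
        return np.where(mask, W_low, W_high)

    def bam_rhs(x, y, A, B, a=1.0, b=1.0):
        dx = -a * x + mem_weight(A[0], A[1], x) @ f(y)
        dy = -b * y + mem_weight(B[0], B[1], y) @ f(x)
        return dx, dy

    def simulate(h=1e-3, steps=20000, k1=2.0, k2=1.5, seed=1):
        rng = np.random.default_rng(seed)
        n, m = 2, 2
        A = (rng.uniform(-1, 1, (n, m)), rng.uniform(-1, 1, (n, m)))  # two weight modes
        B = (rng.uniform(-1, 1, (m, n)), rng.uniform(-1, 1, (m, n)))
        x, y = rng.standard_normal(n), rng.standard_normal(m)         # drive system
        xr, yr = rng.standard_normal(n), rng.standard_normal(m)       # response system
        for _ in range(steps):
            dx, dy = bam_rhs(x, y, A, B)
            dxr, dyr = bam_rhs(xr, yr, A, B)
            ex, ey = xr - x, yr - y
            # sign-based switching controller: drives the error to zero in finite time
            ux = -k1 * ex - k2 * np.sign(ex)
            uy = -k1 * ey - k2 * np.sign(ey)
            x, y = x + h * dx, y + h * dy
            xr, yr = xr + h * (dxr + ux), yr + h * (dyr + uy)
        return np.max(np.abs(np.concatenate([xr - x, yr - y])))

    print("final synchronization error:", simulate())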

    Dynamics meets Morphology: towards Dymorph Computation

    In this dissertation, approaches are presented for both technically exploiting and investigating biological principles with oscillators in the context of electrical engineering, in particular neuromorphic engineering. Dynamics and morphology were explicitly selected as important neuronal principles that shape information processing in the human brain and distinguish it from other technical systems. The aspects and principles selected here are adaptation during the encoding of stimuli, the comparatively low signal transmission speed, the continuous formation and elimination of connections, and highly complex, partly chaotic, dynamics. The selection of these phenomena and properties led to the development of a sensory unit that encodes mechanical stress into a series of voltage pulses using a MOSFET augmented with AlScN. The circuit is based on a leaky integrate-and-fire neuron model and features adaptation of the pulse frequency. Furthermore, the slow signal transmission speed of biological systems motivated the investigation of a temporal delay in the feedback of the output pulses of a relaxation oscillator. In this system, stable pulse patterns that form due to so-called jittering bifurcations could be observed; in particular, it was possible to induce switching between different stable pulse patterns. In the further course of the work, first steps towards time-varying coupling of dynamical systems are investigated. It was shown that, in a system consisting of dimethyl sulfoxide and zinc acetate, oscillators can be used to force the formation of filaments; the resulting filaments then lead to a change in the dynamics of the oscillators. Finally, it is shown that in a system with chaotic dynamics, extending it with a memristive device can lead to a transient stabilisation of the dynamics, a behaviour that can be identified as repeated passes through Hopf bifurcations.
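
    As a purely conceptual illustration of the adaptive pulse-frequency encoding described above (not of the AlScN-augmented MOSFET circuit itself), the sketch below implements a leaky integrate-and-fire neuron with spike-frequency adaptation that converts a stimulus amplitude into a pulse train; all constants are arbitrary.

    import numpy as np

    def encode(stimulus, h=1e-4, tau_m=0.01, tau_a=0.2,
               v_th=1.0, v_reset=0.0, delta_a=0.3):
        """Leaky integrate-and-fire neuron with spike-frequency adaptation.

        dv/dt = (-v + stimulus - a) / tau_m   membrane integration
        da/dt = -a / tau_a                    slowly decaying adaptation variable
        A threshold crossing emits a pulse, resets v, and increments a,
        so a sustained stimulus yields a decreasing pulse rate (adaptation).
        """
        v, a, spike_times = 0.0, 0.0, []
        for k, s in enumerate(stimulus):
            v += h * (-v + s - a) / tau_m
            a += h * (-a) / tau_a
            if v >= v_th:
                spike_times.append(k * h)
                v = v_reset
                a += delta_a
        return spike_times

    # Step stimulus standing in for a constant mechanical stress:
    # the pulse rate starts high and then adapts downwards.
    t = np.arange(0.0, 1.0, 1e-4)
    spikes = encode(np.where(t > 0.1, 3.0, 0.0))
    print(len(spikes), "pulses; first inter-pulse intervals:",
          np.diff(spikes[:5]).round(4))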

    Mittag-Leffler state estimator design and synchronization analysis for fractional order BAM neural networks with time delays

    This paper deals with the extended design of a Mittag-Leffler state estimator and adaptive synchronization for fractional-order BAM neural networks (FBNNs) with time delays. With the aid of the Lyapunov direct approach and a Razumikhin-type method, a suitable fractional-order Lyapunov functional is constructed and a new set of sufficient conditions is derived for estimating the neuron states via available output measurements such that the ensuing estimator error system is globally Mittag-Leffler stable. Then, an adaptive feedback control rule is designed under which the considered FBNNs achieve Mittag-Leffler adaptive synchronization by means of fractional-order inequality techniques. Moreover, the adaptive feedback control may be utilized even when no exact information about the system parameters is available. Finally, two numerical simulations are given to demonstrate the effectiveness of the theoretical results.
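
    The adaptive idea can be sketched roughly as follows (again with a Grünwald-Letnikov discretization, a delay-free network, and a simple integer-order gain-adaptation law; all values are illustrative and none of this reproduces the paper's estimator or conditions): a response network is driven towards the original one by a feedback term whose gain grows with the synchronization error, so no exact knowledge of the system parameters is required.

    import numpy as np

    def gl_coefficients(alpha, n):
        c = np.zeros(n)
        c[0] = 1.0
        for j in range(1, n):
            c[j] = c[j - 1] * (1.0 - (1.0 + alpha) / j)
        return c

    def adaptive_sync(alpha=0.95, h=0.005, steps=4000, gamma=5.0, seed=2):
        rng = np.random.default_rng(seed)
        n = 3
        W = rng.uniform(-1.5, 1.5, (n, n))
        c = gl_coefficients(alpha, steps + 1)
        x = np.zeros((steps + 1, n)); x[0] = rng.standard_normal(n)   # drive network
        z = np.zeros((steps + 1, n)); z[0] = rng.standard_normal(n)   # response network
        k = 0.0                                                       # adaptive gain
        for t in range(1, steps + 1):
            e = z[t - 1] - x[t - 1]
            fx = -x[t - 1] + W @ np.tanh(x[t - 1])
            fz = -z[t - 1] + W @ np.tanh(z[t - 1]) - k * e   # adaptive feedback control
            x[t] = h**alpha * fx - c[1:t + 1] @ x[t - 1::-1]
            z[t] = h**alpha * fz - c[1:t + 1] @ z[t - 1::-1]
            k += h * gamma * float(e @ e)                    # gain grows with the error
        return np.max(np.abs(z[-1] - x[-1])), k

    err, gain = adaptive_sync()
    print(f"final error {err:.2e}, adapted gain {gain:.2f}")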

    A Survey on Reservoir Computing and its Interdisciplinary Applications Beyond Traditional Machine Learning

    Reservoir computing (RC), first applied to temporal signal processing, is a recurrent neural network in which neurons are randomly connected. Once initialized, the connection strengths remain unchanged. Such a simple structure turns RC into a non-linear dynamical system that maps low-dimensional inputs into a high-dimensional space. The model's rich dynamics, linear separability, and memory capacity then enable a simple linear readout to generate adequate responses for various applications. RC spans areas far beyond machine learning, since it has been shown that its complex dynamics can be realized in various physical hardware implementations and biological devices, which yields greater flexibility and shorter computation times. Moreover, the neuronal responses triggered by the model's dynamics shed light on brain mechanisms that exploit similar dynamical processes. While the literature on RC is vast and fragmented, here we conduct a unified review of RC's recent developments from machine learning to physics, biology, and neuroscience. We first review the early RC models and then survey the state-of-the-art models and their applications. We further introduce studies on modeling the brain's mechanisms with RC. Finally, we offer new perspectives on RC development, including reservoir design, unification of coding frameworks, physical RC implementations, and the interaction between RC, cognitive neuroscience, and evolution. Comment: 51 pages, 19 figures, IEEE Access
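
    A compact echo-state-network sketch of the RC recipe outlined above: a reservoir whose random recurrent weights are fixed after initialization, with only a linear readout trained, here by ridge regression on a toy next-step prediction task. The reservoir size, spectral radius, regularization constant, and task are arbitrary choices, not taken from the survey.

    import numpy as np

    def esn_demo(n_res=200, spectral_radius=0.9, ridge=1e-6, seed=0):
        rng = np.random.default_rng(seed)

        # Toy task: one-step-ahead prediction of a quasi-periodic signal.
        t = np.arange(3000)
        u = np.sin(0.1 * t) * np.cos(0.031 * t)
        target = u[1:]

        # Fixed random reservoir: drawn once, never trained.
        W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
        W = rng.standard_normal((n_res, n_res))
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

        # Drive the reservoir and collect its high-dimensional states.
        states = np.zeros((len(u) - 1, n_res))
        x = np.zeros(n_res)
        for k in range(len(u) - 1):
            x = np.tanh(W @ x + W_in[:, 0] * u[k])
            states[k] = x

        # Only the linear readout is trained, via ridge regression.
        washout = 100                                 # discard the initial transient
        S, y = states[washout:], target[washout:]
        W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)

        mse = np.mean((S @ W_out - y) ** 2)
        print(f"readout mean-squared error on the toy task: {mse:.2e}")

    esn_demo()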