
    Almost periodic solutions of retarded SICNNs with functional response on piecewise constant argument

    We consider a new model for shunting inhibitory cellular neural networks (SICNNs): retarded functional differential equations with a piecewise constant argument. The existence and exponential stability of almost periodic solutions are investigated, and an illustrative example is provided. Comment: 24 pages, 1 figure
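The distinguishing ingredient above, the piecewise constant argument, means the feedback term reads the state only at the start of each unit interval, via gamma(t) = floor(t). A minimal scalar sketch of this mechanism (the equation, parameters, and helper `simulate_pca` are illustrative assumptions, not the paper's SICNN model):

```python
import numpy as np

def simulate_pca(x0=2.0, a=1.0, b=0.5, dt=0.001, T=30.0):
    """Euler simulation of x'(t) = -a*x(t) + b*tanh(x(floor(t))):
    the feedback sees the state frozen at the start of each unit interval."""
    n = int(T / dt)
    steps_per_unit = int(round(1.0 / dt))
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        j = ((i - 1) // steps_per_unit) * steps_per_unit  # index of x(floor(t))
        x[i] = x[i - 1] + dt * (-a * x[i - 1] + b * np.tanh(x[j]))
    return x

x = simulate_pca()
# With a > b, the only equilibrium of this toy equation is x = 0,
# and the solution decays to it despite the piecewise constant feedback.
```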

    Recent Advances and Applications of Fractional-Order Neural Networks

    This paper surveys the growth, development, and future of various forms of fractional-order neural networks. Multiple advances in structure, learning algorithms, and methods are critically investigated and summarized, including recent trends in the dynamics of such networks. The forms of fractional-order neural networks considered in this study are Hopfield, cellular, memristive, complex-valued, and quaternion-valued networks. Further, the application of fractional-order neural networks to computational problems such as system identification, control, optimization, and stability analysis is critically analyzed and discussed.

    A unified view on weakly correlated recurrent networks

    The diversity of neuron models used in contemporary theoretical neuroscience to investigate specific properties of covariances raises the question of how these models relate to each other. In particular, it is hard to distinguish between generic properties and peculiarities due to the abstracted model. Here we present a unified view on pairwise covariances in recurrent networks in the irregular regime. We consider the binary neuron model, the leaky integrate-and-fire model, and the Hawkes process. We show that linear approximation maps each of these models to one of two classes of linear rate models, including the Ornstein-Uhlenbeck process as a special case. The classes differ in the location of the additive noise in the rate dynamics, which is on the output side for the spiking models and on the input side for the binary model. Both classes allow closed-form solutions for the covariance. For output noise, the covariance separates into an echo term and a term due to correlated input. The unified framework enables us to transfer results between models. For example, we generalize the binary model and the Hawkes process to the presence of conduction delays and simplify the derivations of established results. Our approach is applicable to general network structures and suitable for population averages. The derived averages are exact for fixed out-degree network architectures and approximate for fixed in-degree. We demonstrate how taking fluctuations into account in the linearization procedure increases the accuracy of the effective theory, and we explain the class-dependent differences between covariances in the time and frequency domains. Finally, we show that the oscillatory instability emerging in networks of integrate-and-fire models with delayed inhibitory feedback is a model-invariant feature: the same structure of poles in the complex frequency plane determines the population power spectra.
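The Ornstein-Uhlenbeck process named above as a special case of the linear rate classes has the closed-form stationary variance sigma^2 * tau / 2, which a quick Euler-Maruyama simulation can confirm (a hedged sketch; all parameters here are hypothetical, not taken from the paper):

```python
import numpy as np

def simulate_ou(tau=10.0, sigma=1.0, dt=0.1, T=20000.0, seed=0):
    """Euler-Maruyama simulation of the Ornstein-Uhlenbeck process
    dx = -(x / tau) dt + sigma dW."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    x = np.zeros(n)
    noise = sigma * np.sqrt(dt) * rng.standard_normal(n)
    for i in range(1, n):
        x[i] = x[i - 1] - (x[i - 1] / tau) * dt + noise[i]
    return x

x = simulate_ou()
var_theory = 1.0 ** 2 * 10.0 / 2.0      # sigma^2 * tau / 2
var_empirical = x[len(x) // 2:].var()   # discard the transient first half
```

The empirical variance should match the closed form up to sampling and discretization error.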

    Neural Bursting and Synchronization Emulated by Neural Networks and Circuits

    © 2021 IEEE - All rights reserved. This is the accepted manuscript version of an article which has been published in final form at https://doi.org/10.1109/TCSI.2021.3081150
    Nowadays, research on modeling, simulation, and realization of brain-like systems that reproduce brain behaviors has become an urgent requirement. In this paper, neural bursting and synchronization are imitated by two neural network models based on the Hopfield neural network (HNN). The first model consists of four neurons and realizes neural bursting firings. Theoretical analysis and numerical simulation show that this simple neural network can generate abundant bursting dynamics, including multiple periodic bursting firings with different numbers of spikes per burst, multiple coexisting bursting firings, and multiple chaotic bursting firings with different amplitudes. The second model simulates neural synchronization using a coupled network composed of two of the above small neural networks. The synchronization dynamics of the coupled network is proved theoretically using Lyapunov stability theory. Extensive simulation results show that the coupled network can produce different types of synchronous behavior depending on the synaptic coupling strength, such as anti-phase bursting synchronization, anti-phase spiking synchronization, and complete bursting synchronization. Finally, two neural network circuits are designed and implemented to show the effectiveness and potential of the constructed neural networks. Peer reviewed
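As a toy illustration of the continuous HNN-type dynamics x' = -x + W tanh(x) underlying models like the above, here is a four-neuron Euler simulation. The weight matrix is a hypothetical placeholder (the abstract gives no parameters), so the sketch only checks the generic boundedness guaranteed by the bounded activation, not bursting itself:

```python
import numpy as np

def hnn_step(x, W, dt=0.01):
    """One Euler step of the continuous Hopfield-type ODE x' = -x + W @ tanh(x)."""
    return x + dt * (-x + W @ np.tanh(x))

# Hypothetical 4-neuron weight matrix; the paper's bursting-generating
# parameters are not given in the abstract.
W = np.array([[ 0.0,  2.0, -1.2,  0.0],
              [-2.0,  0.0,  0.0,  1.0],
              [ 1.5,  0.0,  0.0, -0.8],
              [ 0.0, -1.0,  0.9,  0.0]])
x = np.array([0.1, -0.2, 0.3, 0.05])
traj = np.empty((5000, 4))
for k in range(5000):
    x = hnn_step(x, W)
    traj[k] = x
# Because tanh is bounded, each unit satisfies |x_i| <= sum_j |W_ij| here.
bound = np.abs(W).sum(axis=1).max()
```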

    The complexity of dynamics in small neural circuits

    Mean-field theory is a powerful tool for studying large neural networks. However, when the system is composed of only a few neurons, macroscopic differences between the mean-field approximation and the real behavior of the network can arise. Here we introduce a study of the dynamics of a small firing-rate network with excitatory and inhibitory populations, in terms of local and global bifurcations of the neural activity. Our approach is analytically tractable in many respects and sheds new light on the finite-size effects of the system. In particular, we focus on the formation of multiple branching solutions of the neural equations through spontaneous symmetry breaking, since this phenomenon considerably increases the complexity of the dynamical behavior of the network. For these reasons, branching points may reveal important mechanisms through which neurons interact and process information that are not accounted for by the mean-field approximation. Comment: 34 pages, 11 figures. Supplementary materials added, colors of figures 8 and 9 fixed, results unchanged
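A minimal two-population excitatory-inhibitory firing-rate model of the kind studied above can be sketched as follows. The sigmoid gain and the weights here are hypothetical and deliberately weak, so this toy network contracts to a single stable fixed point rather than showing the symmetry-breaking branches analyzed in the paper:

```python
import numpy as np

def f(x):
    """Sigmoidal rate function with maximum slope 1/4."""
    return 1.0 / (1.0 + np.exp(-x))

def step(E, I, dt=0.05):
    """One Euler step of Wilson-Cowan-style rate equations
    E' = -E + f(wEE*E - wEI*I + hE),  I' = -I + f(wIE*E - wII*I + hI)."""
    # Hypothetical weights, weak enough (loop gain below 1) to guarantee
    # contraction to a unique fixed point.
    wEE, wEI, wIE, wII = 1.2, 1.0, 1.0, 0.5
    hE, hI = 0.5, 0.2
    dE = -E + f(wEE * E - wEI * I + hE)
    dI = -I + f(wIE * E - wII * I + hI)
    return E + dt * dE, I + dt * dI, dE, dI

E, I = 0.0, 0.0
for _ in range(4000):
    E, I, dE, dI = step(E, I)
# At the fixed point both time derivatives vanish and the rates lie in (0, 1).
```

Strengthening the weights beyond the contraction regime is what opens the door to the bifurcations and branching solutions the paper analyzes.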

    Global exponential periodicity of nonlinear neural networks with multiple time-varying delays

    Global exponential periodicity of nonlinear neural networks with multiple time-varying delays is investigated. Such neural networks cannot be written in vector-matrix form because of the multiple delays. Although neural networks with multiple time-varying delays have been investigated by the Lyapunov-Krasovskii functional method in the literature, sufficient conditions in linear matrix inequality (LMI) form have not previously been obtained. Two sets of sufficient conditions in LMI form are established via a Lyapunov-Krasovskii functional to ensure that two arbitrary solutions of the network attract each other exponentially; this is the key prerequisite for proving the existence, uniqueness, and global exponential stability of periodic solutions. Examples demonstrate the effectiveness of the established results, and a comparison shows that previous results are not applicable to the systems in these examples.
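The attract-each-other property established above (two arbitrary solutions converging exponentially, the key step toward a unique globally stable periodic solution) can be illustrated on a hypothetical scalar network with two bounded time-varying delays. Parameters are chosen so the decay rate a dominates the summed feedback gains; this is a sketch, not the paper's LMI conditions:

```python
import numpy as np

def simulate(x0, a=2.0, b=(0.4, 0.4), dt=0.01, T=60.0):
    """Euler simulation of the scalar delayed network
    x'(t) = -a*x(t) + b1*tanh(x(t - tau1(t))) + b2*tanh(x(t - tau2(t))) + sin(t)
    with bounded time-varying delays tau1, tau2 in [0.5, 1.5] and a constant
    initial history x0. Since a > |b1| + |b2|, solutions attract each other."""
    n = int(T / dt)
    hist = int(1.5 / dt) + 1            # history buffer for the maximal delay
    x = np.full(n + hist, float(x0))
    for i in range(hist, n + hist - 1):
        t = (i - hist) * dt
        tau1 = 1.0 + 0.5 * np.sin(t)
        tau2 = 1.0 + 0.5 * np.cos(t)
        delayed = b[0] * np.tanh(x[i - int(tau1 / dt)]) \
                + b[1] * np.tanh(x[i - int(tau2 / dt)])
        x[i + 1] = x[i] + dt * (-a * x[i] + delayed + np.sin(t))
    return x

xa = simulate(2.0)    # two solutions from very different initial histories
xb = simulate(-1.0)
gap = abs(xa[-1] - xb[-1])  # exponentially small by time T
```

Both trajectories settle onto the same periodic response to the sin(t) forcing, which is exactly the behavior the LMI conditions certify for the vector case.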