
    Almost periodic solutions of retarded SICNNs with functional response on piecewise constant argument

    We consider a new model for shunting inhibitory cellular neural networks (SICNNs): retarded functional differential equations with piecewise constant argument. The existence and exponential stability of almost periodic solutions are investigated, and an illustrative example is provided. (Comment: 24 pages, 1 figure)
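
    A minimal sketch of the kind of system this abstract refers to may help: the snippet below Euler-integrates a small SICNN whose shunting inhibition is evaluated at a piecewise constant argument gamma(t) = floor(t). The grid size, decay rates, weights, activation, and inputs are illustrative assumptions, not the parameters or the exact deviating argument used in the paper.

```python
import numpy as np

# Illustrative Euler simulation of a small shunting inhibitory cellular
# neural network (SICNN) whose inhibitory coupling is evaluated at the
# piecewise constant argument gamma(t) = floor(t).  Grid size, parameters,
# activation f, and inputs L are assumptions made for this sketch; the
# neighborhood sum runs over all cells instead of an r-neighborhood.

def simulate_sicnn(T=20.0, dt=0.01, n=3, seed=0):
    rng = np.random.default_rng(seed)
    a = 1.0 + rng.random((n, n))             # decay rates a_ij > 0
    C = 0.1 * rng.random((n, n, n, n))       # inhibition weights C_ij^kl >= 0
    f = np.tanh                              # bounded activation
    L = lambda t: 0.5 * np.sin(2 * np.pi * t) * np.ones((n, n))  # external inputs

    steps = int(T / dt)
    x = np.zeros((steps + 1, n, n))
    x[0] = rng.standard_normal((n, n))
    for k in range(steps):
        t = k * dt
        # state sampled at the piecewise constant argument gamma(t) = floor(t)
        x_gamma = x[int(round(np.floor(t) / dt))]
        shunt = np.einsum('ijkl,kl->ij', C, f(x_gamma))   # shunting inhibition
        x[k + 1] = x[k] + dt * (-a * x[k] - shunt * x[k] + L(t))
    return x

if __name__ == "__main__":
    print("final state:\n", simulate_sicnn()[-1])
```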

    Nonlinear analysis of dynamical complex networks

    Complex networks are composed of a large number of highly interconnected dynamical units and therefore exhibit very complicated dynamics. Examples of such complex networks include the Internet, that is, a network of routers or domains; the World Wide Web (WWW), that is, a network of websites; the brain, that is, a network of neurons; and an organization, that is, a network of people. Since the introduction of the small-world network principle, a great deal of research has focused on how the asymptotic behavior of interconnected oscillatory agents depends on the structural properties of complex networks. It has been found that the general structure of the interaction network may play a crucial role in the emergence of synchronization phenomena in fields as varied as physics, technology, and the life sciences.
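
    As an illustration of the structure/synchronization link described above (not taken from the article), the sketch below simulates Kuramoto phase oscillators coupled through a Watts-Strogatz small-world graph and reports the order parameter r for a few coupling strengths; all parameter values are assumptions chosen for the demo.

```python
import numpy as np
import networkx as nx

# Kuramoto phase oscillators on a Watts-Strogatz small-world graph: a toy
# illustration (not from the article) of how network structure shapes the
# emergence of synchronization.  All parameters are illustrative.

def kuramoto_order_parameter(K, n=200, k=6, p=0.1, T=20.0, dt=0.01, seed=1):
    rng = np.random.default_rng(seed)
    A = nx.to_numpy_array(nx.watts_strogatz_graph(n, k, p, seed=seed))
    omega = rng.standard_normal(n)           # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, n)     # initial phases
    for _ in range(int(T / dt)):
        coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        theta = theta + dt * (omega + (K / k) * coupling)
    return np.abs(np.exp(1j * theta).mean())  # order parameter r in [0, 1]

if __name__ == "__main__":
    for K in (0.5, 2.0, 8.0):
        print(f"K = {K:4.1f}   r = {kuramoto_order_parameter(K):.3f}")
```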

    Mean-field equations for stochastic firing-rate neural fields with delays: Derivation and noise-induced transitions

    In this manuscript we analyze the collective behavior of mean-field limits of large-scale, spatially extended stochastic neuronal networks with delays. Rigorously, the asymptotic regime of such systems is characterized by a very intricate stochastic delayed integro-differential McKean-Vlasov equation that remains impenetrable, leaving the stochastic collective dynamics of such networks poorly understood. In order to study these macroscopic dynamics, we analyze networks of firing-rate neurons, i.e., neurons with linear intrinsic dynamics and sigmoidal interactions. In that case, we prove that the solution of the mean-field equation is Gaussian, hence characterized by its first two moments, and that these two quantities satisfy a set of coupled delayed integro-differential equations. These equations are similar to the usual neural field equations and incorporate the noise level as a parameter, allowing the analysis of noise-induced transitions. Through bifurcation analysis we identify several qualitative transitions due to noise in the mean-field limit. In particular, stabilization of spatially homogeneous solutions, synchronized oscillations, bumps, chaotic dynamics, and wave or bump splitting are exhibited and arise from static or dynamic Turing-Hopf bifurcations. These surprising phenomena allow further exploration of the role of noise in the nervous system. (Comment: updated to the latest published version; clarified the spatial dependence of the Brownian motion.)
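
    The Gaussian moment reduction can be illustrated with a space-free, single-population caricature (an assumption on top of the paper's spatially extended model): with linear intrinsic dynamics, sigmoidal coupling, and additive noise, the mean and variance close on themselves, and the Gaussian expectation of the sigmoid can be evaluated by Gauss-Hermite quadrature. The parameters J, tau_d, and sigma below are illustrative.

```python
import numpy as np

# Space-free caricature of the Gaussian moment reduction: a single delayed
# firing-rate population with linear intrinsic dynamics, sigmoidal coupling
# S = tanh, and additive noise of intensity sigma.  The mean m and variance
# v satisfy closed delayed equations; E[S(X)] for Gaussian X is evaluated
# with Gauss-Hermite quadrature.  All parameters are assumptions.

def gaussian_mean_tanh(m, v, nodes, weights):
    # E[tanh(X)] for X ~ N(m, v), via Gauss-Hermite quadrature
    x = m + np.sqrt(2.0 * max(v, 0.0)) * nodes
    return (weights * np.tanh(x)).sum() / np.sqrt(np.pi)

def simulate_moments(J=2.0, tau_d=1.0, sigma=0.5, T=40.0, dt=0.01):
    nodes, weights = np.polynomial.hermite.hermgauss(40)
    steps, lag = int(T / dt), int(tau_d / dt)
    m = np.zeros(steps + 1)
    v = np.zeros(steps + 1)
    m[0] = 0.1
    for k in range(steps):
        kd = max(k - lag, 0)                              # delayed index
        drive = gaussian_mean_tanh(m[kd], v[kd], nodes, weights)
        m[k + 1] = m[k] + dt * (-m[k] + J * drive)        # delayed mean equation
        v[k + 1] = v[k] + dt * (-2.0 * v[k] + sigma ** 2) # variance equation
    return m, v

if __name__ == "__main__":
    m, v = simulate_moments()
    print("asymptotic mean ~ %.3f, variance ~ %.3f" % (m[-1], v[-1]))
```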

    Multi-almost periodicity and invariant basins of general neural networks under almost periodic stimuli

    In this paper, we investigate the convergence dynamics of $2^N$ almost periodic encoded patterns of general neural networks (GNNs) subjected to external almost periodic stimuli, including almost periodic delays. Invariant regions are established for the existence of $2^N$ almost periodic encoded patterns under two classes of activation functions. By employing the property of an $\mathscr{M}$-cone and an inequality technique, attracting basins are estimated and some criteria are derived for the networks to converge exponentially toward the $2^N$ almost periodic encoded patterns. The obtained results are new; they extend and generalize the corresponding results in the previous literature. (Comment: 28 pages, 4 figures)
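
    A rough illustration of this kind of multistability (an assumption-laden toy, not the GNN model of the paper): an N-neuron network with strong self-excitation, weak cross-coupling, and a small almost periodic input, started from each of the 2^N sign corners, settles into a distinct oscillation whose sign pattern is preserved.

```python
import numpy as np
from itertools import product

# Toy multistability demo: an N-neuron network with strong self-excitation,
# weak random cross-coupling, and a small almost periodic input.  Each of
# the 2^N sign corners evolves into a distinct oscillation with the same
# sign pattern.  Parameters are illustrative, not the paper's GNN model.

def run(x0, T=60.0, dt=0.01, a=3.0, eps=0.1, seed=0):
    n = len(x0)
    rng = np.random.default_rng(seed)
    W = 0.05 * rng.standard_normal((n, n))    # weak cross-coupling
    np.fill_diagonal(W, 0.0)
    x = np.array(x0, dtype=float)
    for k in range(int(T / dt)):
        t = k * dt
        I = eps * (np.sin(t) + np.sin(np.sqrt(2.0) * t))  # almost periodic input
        x = x + dt * (-x + a * np.tanh(x) + W @ np.tanh(x) + I)
    return x

if __name__ == "__main__":
    N = 3
    for corner in product((-2.0, 2.0), repeat=N):
        xT = run(corner)
        print(np.sign(corner).astype(int), "->", np.sign(xT).astype(int))
```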

    Dynamical Behavior of Nonautonomous Stochastic Reaction-Diffusion Neural Network Models

    This brief investigates nonautonomous stochastic reaction-diffusion neural-network models with S-type distributed delays. First, the existence and uniqueness of the mild solution are studied under the Lipschitz condition without the linear growth condition. Due to the presence of a nonautonomous reaction-diffusion term and an infinite-dimensional Wiener process, the criteria for the well-posedness of the models are established based on evolution system theory. Then, the S-type distributed delay, which is an infinite delay, is handled by the truncation method, and sufficient conditions for global exponential stability are obtained by constructing a simple Lyapunov-Krasovskii functional candidate. Finally, neural-network examples and an illustrative example are given to show the applications of the obtained results.
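
    For intuition only, the sketch below combines finite differences in space with an Euler-Maruyama step in time for a 1-D stochastic reaction-diffusion state with a distributed delay truncated to a finite memory window, loosely echoing the truncation method mentioned above. Kernel, coefficients, and boundary conditions are illustrative assumptions; the brief's models involve S-type distributed delays and an infinite-dimensional Wiener process.

```python
import numpy as np

# Finite differences in space + Euler-Maruyama in time for a 1-D stochastic
# reaction-diffusion state u(x, t) with a distributed delay truncated to a
# finite memory window.  Kernel, coefficients, and periodic boundary
# conditions are illustrative assumptions for this sketch only.

def simulate(T=5.0, dt=0.001, nx=50, D=0.1, a=1.0, w=0.8,
             lam=2.0, mem=2.0, sigma=0.05, seed=0):
    rng = np.random.default_rng(seed)
    dx = 1.0 / nx
    n_mem = int(mem / dt)
    ker = lam * np.exp(-lam * dt * np.arange(n_mem))
    ker /= ker.sum()                           # normalized (truncated) delay weights
    hist = np.zeros((n_mem, nx))               # recent history of u, newest first
    u = 0.1 * rng.standard_normal(nx)
    for _ in range(int(T / dt)):
        lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2   # diffusion
        delayed = (ker[:, None] * np.tanh(hist)).sum(axis=0)       # distributed delay
        noise = sigma * np.sqrt(dt) * rng.standard_normal(nx)      # additive noise
        u = u + dt * (D * lap - a * u + w * delayed) + noise
        hist = np.roll(hist, 1, axis=0)
        hist[0] = u
    return u

if __name__ == "__main__":
    print("sup-norm of the final state:", np.abs(simulate()).max())
```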

    Further analysis of stability of uncertain neural networks with multiple time delays

    This paper studies the robust stability of uncertain neural networks with multiple time delays with respect to the class of nondecreasing activation functions. By using the Lyapunov functional and homeomorphism mapping theorems, we derive a new delay-independent sufficient condition for the existence, uniqueness, and global asymptotic stability of the equilibrium point of delayed neural networks with uncertain network parameters. The condition obtained for robust stability establishes a matrix-norm relationship among the network parameters of the neural system, and therefore it can easily be verified. We also present some constructive numerical examples that compare the proposed result with results previously published in the corresponding literature. These comparative examples show that our new condition can be considered an alternative to the previous results, as it defines a new set of network parameters ensuring the robust stability of delayed neural networks.
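
    Matrix-norm conditions of this flavor are straightforward to check numerically. The sketch below tests one commonly used delay-independent bound for an interval delayed network x'(t) = -Cx(t) + Af(x(t)) + Bf(x(t - tau)) + u, namely ell*(||A_c|| + ||dA|| + ||B_c|| + ||dB||) < min_i c_i, where A_c and dA are the center and radius of the interval matrix [A_lo, A_hi]; this is an assumed illustrative condition from the same literature, not necessarily the exact condition derived in the paper.

```python
import numpy as np

# Illustrative matrix-norm robust-stability test for an interval delayed
# network x'(t) = -C x(t) + A f(x(t)) + B f(x(t - tau)) + u, with
# C, A, B ranging over interval matrices and f Lipschitz with constant ell.
# The bound coded here is a commonly used sufficient condition of this
# type, assumed for illustration; it is not necessarily the paper's.

def robust_stability_check(C_lo, A_lo, A_hi, B_lo, B_hi, ell=1.0):
    A_c, dA = (A_hi + A_lo) / 2.0, (A_hi - A_lo) / 2.0   # center / radius of A
    B_c, dB = (B_hi + B_lo) / 2.0, (B_hi - B_lo) / 2.0   # center / radius of B
    spec = lambda M: np.linalg.norm(M, 2)                 # spectral norm
    bound = ell * (spec(A_c) + spec(dA) + spec(B_c) + spec(dB))
    return bound < C_lo.min(), bound, C_lo.min()

if __name__ == "__main__":
    C_lo = np.array([3.0, 3.5])                           # lower bounds of c_i
    A_lo = np.array([[0.1, -0.2], [0.0, 0.1]])
    A_hi = np.array([[0.3, 0.0], [0.2, 0.3]])
    B_lo = np.array([[0.2, 0.1], [-0.1, 0.2]])
    B_hi = np.array([[0.4, 0.3], [0.1, 0.4]])
    ok, bound, c_min = robust_stability_check(C_lo, A_lo, A_hi, B_lo, B_hi)
    print(f"bound = {bound:.3f}, min c = {c_min:.3f}, robustly stable: {ok}")
```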