508 research outputs found

    Nonlinear analysis of dynamical complex networks

    Copyright © 2013 Zidong Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

    Complex networks are composed of a large number of highly interconnected dynamical units and therefore exhibit very complicated dynamics. Examples of such complex networks include the Internet (a network of routers or domains), the World Wide Web (a network of websites), the brain (a network of neurons), and an organization (a network of people). Since the introduction of the small-world network principle, a great deal of research has focused on how the asymptotic behavior of interconnected oscillatory agents depends on the structural properties of the complex network. It has been found that the general structure of the interaction network may play a crucial role in the emergence of synchronization phenomena in fields as diverse as physics, technology, and the life sciences.
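    The dependence of synchronization on network structure highlighted above can be seen in a toy numerical experiment. The sketch below is a minimal, purely illustrative assumption rather than anything taken from the surveyed papers: it simulates Kuramoto phase oscillators on a regular ring lattice and on a small-world rewiring of it, and compares the resulting order parameter.

# Minimal illustrative sketch: synchronization of coupled phase oscillators
# (Kuramoto model) on two interaction networks of the same size and degree
# but different structure. All modelling choices here are assumptions made
# for illustration; they are not taken from the papers listed on this page.
import numpy as np
import networkx as nx

def order_parameter(theta):
    # Kuramoto order parameter r in [0, 1]; values near 1 indicate synchrony.
    return np.abs(np.mean(np.exp(1j * theta)))

def simulate(graph, coupling=2.0, dt=0.01, steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    n = graph.number_of_nodes()
    A = nx.to_numpy_array(graph)
    deg = np.maximum(A.sum(axis=1), 1.0)
    omega = rng.normal(0.0, 0.5, n)          # natural frequencies
    theta = rng.uniform(0.0, 2 * np.pi, n)   # initial phases
    for _ in range(steps):
        # d(theta_i)/dt = omega_i + (K / deg_i) * sum_j A_ij sin(theta_j - theta_i)
        diff = np.sin(theta[None, :] - theta[:, None])
        theta = theta + dt * (omega + coupling * (A * diff).sum(axis=1) / deg)
    return order_parameter(theta)

n = 100
ring = nx.watts_strogatz_graph(n, k=4, p=0.0, seed=1)         # regular ring lattice
small_world = nx.watts_strogatz_graph(n, k=4, p=0.1, seed=1)  # small-world rewiring
print("ring lattice r =", round(simulate(ring), 3))
print("small world  r =", round(simulate(small_world), 3))

    Comparing the two printed order parameters for identical oscillators and coupling strength gives a rough sense of how rewiring a few links toward a small-world topology can change the onset of synchrony.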

    Multi-almost periodicity and invariant basins of general neural networks under almost periodic stimuli

    In this paper, we investigate the convergence dynamics of $2^N$ almost periodic encoded patterns of general neural networks (GNNs) subjected to external almost periodic stimuli, including almost periodic delays. Invariant regions are established for the existence of $2^N$ almost periodic encoded patterns under two classes of activation functions. By employing the property of an $\mathscr{M}$-cone and an inequality technique, attracting basins are estimated and criteria are derived for the networks to converge exponentially toward the $2^N$ almost periodic encoded patterns. The obtained results are new; they extend and generalize the corresponding results in the previous literature.

    Comment: 28 pages, 4 figures
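    As a rough illustration of the kind of multistability this abstract refers to, the sketch below simulates a hypothetical two-neuron Hopfield-type network with strong self-excitation driven by an almost periodic stimulus; different initial conditions settle into different sign patterns, giving $2^2 = 4$ coexisting almost periodic trajectories. The model and all parameters are assumptions for demonstration only, not the GNN model or the conditions analysed in the paper.

# Hypothetical two-neuron example: multistability under an almost periodic
# stimulus. Strong self-connections make each neuron settle near +/- a fixed
# level, so 2^2 = 4 distinct almost periodic trajectories coexist, selected
# by the initial condition. Illustration only; not the paper's GNN model.
import numpy as np

def stimulus(t):
    # Almost periodic input: sum of sinusoids with incommensurate frequencies.
    return 0.1 * (np.sin(t) + np.sin(np.sqrt(2.0) * t))

def simulate(x0, t_end=60.0, dt=0.001):
    W = np.array([[2.0, 0.1],
                  [0.1, 2.0]])          # strong self-excitation -> multistability
    x = np.array(x0, dtype=float)
    for k in range(int(t_end / dt)):
        t = k * dt
        # x' = -x + W tanh(x) + I(t)   (delayed terms omitted for brevity)
        x = x + dt * (-x + W @ np.tanh(x) + stimulus(t))
    return x

for x0 in [(1.0, 1.0), (1.0, -1.0), (-1.0, 1.0), (-1.0, -1.0)]:
    print(x0, "->", np.round(simulate(x0), 3))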

    Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis

    This is the post-print version of the article. Copyright 2007 Elsevier Ltd.

    This Letter is concerned with the exponential stability analysis problem for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, and the feasibility of such an LMI can be easily checked using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition.

    This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
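    The specific LMI derived in the Letter is not reproduced here, but the workflow it describes, namely encoding a stability condition as an LMI and checking its feasibility numerically, can be sketched with an off-the-shelf solver. The sketch below uses Python's cvxpy in place of the Matlab LMI Toolbox step mentioned above, applied to a standard delay-independent stability LMI for a generic discrete-time delayed linear system x(k+1) = A x(k) + Ad x(k-d); the matrices and the LMI itself are illustrative assumptions, not the condition of the Letter.

# Illustrative sketch only: check feasibility of a standard delay-independent
# stability LMI for x(k+1) = A x(k) + Ad x(k - d), built from the
# Lyapunov-Krasovskii functional V(k) = x(k)' P x(k) + sum_{i=k-d}^{k-1} x(i)' Q x(i).
# This is NOT the LMI derived in the Letter; it only mirrors the workflow.
import numpy as np
import cvxpy as cp

A  = np.array([[0.5, 0.1],
               [0.0, 0.4]])
Ad = np.array([[0.10, 0.00],
               [0.05, 0.10]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6

# Delta V < 0 for all states iff this block matrix is negative definite.
M = cp.bmat([[A.T @ P @ A - P + Q, A.T @ P @ Ad],
             [Ad.T @ P @ A,        Ad.T @ P @ Ad - Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               M << -eps * np.eye(2 * n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("LMI status:", prob.status)   # 'optimal' means the LMI is feasible

    If the solver reports a feasible point, P.value and Q.value form a Lyapunov–Krasovskii certificate for this toy system; the Letter's own condition additionally accounts for the time-varying delay bounds and the generalized activation-function description.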

    Design of exponential state estimators for neural networks with mixed time delays

    This is the post-print version of the article. Copyright 2007 Elsevier Ltd.

    In this Letter, the state estimation problem is dealt with for a class of recurrent neural networks (RNNs) with mixed discrete and distributed delays. The activation functions are assumed to be neither monotonic, nor differentiable, nor bounded. We aim at designing a state estimator that reconstructs the neuron states from available output measurements such that the dynamics of the estimation error is globally exponentially stable in the presence of the mixed time delays. By using a Lyapunov–Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions that guarantee the existence of the state estimators. We show that both the existence conditions and the explicit expression of the desired estimator can be characterized in terms of the solution to an LMI. A simulation example is exploited to show the usefulness of the derived LMI-based stability conditions.

    This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China under Grants 05KJB110154 and BK2006064, and the National Natural Science Foundation of China under Grants 10471119 and 10671172.
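    As a rough illustration of what such an estimator does, the sketch below simulates a Luenberger-type observer for a small continuous-time RNN with a single discrete delay, driven by the measured output of the true network. The network matrices, output map, and gain K are hypothetical values chosen by hand; in the Letter the gain is obtained from the solution of the LMI, and the model also includes distributed delays, which are omitted here for brevity.

# Hypothetical example: Luenberger-type state estimator
#   x_hat' = -A x_hat + W f(x_hat) + Wd f(x_hat(t - tau)) + K (y - C x_hat)
# for a small delayed RNN. Matrices and gain K are picked by hand for
# illustration; in the Letter the gain comes from solving an LMI.
import numpy as np

A  = np.diag([1.0, 1.2])                  # self-decay rates
W  = np.array([[0.2, -0.3], [0.1, 0.2]])  # instantaneous connection weights
Wd = np.array([[0.1, 0.0], [-0.1, 0.1]])  # delayed connection weights
C  = np.array([[1.0, 0.0]])               # only the first neuron is measured
K  = np.array([[1.0], [0.5]])             # hypothetical estimator gain
tau, dt, T = 0.5, 0.001, 20.0
f = np.tanh
d = int(tau / dt)

x_hist  = [np.array([1.0, -1.0])] * (d + 1)   # true state history
xh_hist = [np.zeros(2)] * (d + 1)             # estimator state history
errs = []
for k in range(int(T / dt)):
    x, xh = x_hist[-1], xh_hist[-1]
    xd, xhd = x_hist[-1 - d], xh_hist[-1 - d]
    y = C @ x                                  # available output measurement
    dx  = -A @ x  + W @ f(x)  + Wd @ f(xd)
    dxh = -A @ xh + W @ f(xh) + Wd @ f(xhd) + K @ (y - C @ xh)
    x_hist.append(x + dt * dx)
    xh_hist.append(xh + dt * dxh)
    errs.append(np.linalg.norm(x - xh))
print("initial error:", round(errs[0], 3), " final error:", round(errs[-1], 6))

    In this toy setup the estimation error shrinks as the observer tracks the true state from the scalar measurement; the contribution of the Letter is to certify such exponential convergence, and to produce the gain, for the much more general class of delayed RNNs described above.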