
    Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis

    This is the post-print version of the article; the official published version can be obtained from the link below. Copyright 2007 Elsevier Ltd.

    This Letter is concerned with the exponential stability analysis problem for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the recently commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, where the feasibility of such an LMI can be easily checked using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition.

    This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
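In the delay-free linear special case, an LMI stability test of this kind reduces to the discrete Lyapunov condition A^T P A - P < 0 with P > 0. A minimal sketch of that check (the connection matrix below is illustrative and is not the Letter's actual LMI, which also involves the delay terms):

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Illustrative 2x2 connection matrix (not taken from the Letter).
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])

# Solve the discrete Lyapunov equation A^T P A - P = -Q with Q = I.
# A positive-definite solution P certifies global exponential stability
# of x(k+1) = A x(k), the delay-free linear analogue of the LMI test.
Q = np.eye(2)
P = solve_discrete_lyapunov(A.T, Q)

spectral_radius = max(abs(np.linalg.eigvals(A)))
p_is_positive_definite = bool(np.all(np.linalg.eigvalsh(P) > 0))
```

Equivalently, exponential stability of the linear system holds exactly when the spectral radius of A is below one; the Lyapunov solve gives the certificate that the LMI formulation generalizes.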

    General criteria for asymptotic and exponential stabilities of neural network models with unbounded delays

    For a family of differential equations with infinite delay, we give sufficient conditions for the global asymptotic and global exponential stability of an equilibrium point. This family includes most of the delayed neural network models of Cohen–Grossberg type, with both bounded and unbounded distributed delays, for which general asymptotic and exponential stability criteria are derived. As illustrations, the results are applied to several concrete models studied in the literature, and a comparison of results is given.

    Fundação para a Ciência e a Tecnologia (FCT) - 2009-ISFL-1-209; Universidade do Minho, Centro de Matemática (CMAT)
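A simple concrete instance of such a delayed model is the scalar Hopfield-type equation x'(t) = -a x(t) + c tanh(x(t - tau)), which is globally exponentially stable at the origin whenever a > |c| (tanh has Lipschitz constant 1). A minimal Euler simulation under illustrative parameter values, not a reproduction of the paper's examples:

```python
import numpy as np

# Scalar delayed Hopfield-type equation x'(t) = -a*x(t) + c*tanh(x(t - tau)).
# For a > |c| the origin is globally exponentially stable; the values
# below are illustrative.
a, c, tau = 1.0, 0.5, 1.0
dt = 0.01
steps = 5000
delay_steps = int(tau / dt)

# History buffer: constant initial condition x(t) = 2 on [-tau, 0].
x = np.full(steps + delay_steps + 1, 2.0)
for k in range(delay_steps, delay_steps + steps):
    x[k + 1] = x[k] + dt * (-a * x[k] + c * np.tanh(x[k - delay_steps]))

final_value = abs(x[-1])  # should have decayed close to 0 by t = 50
```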

    Intrinsic adaptation in autonomous recurrent neural networks

    A massively recurrent neural network responds, on one side, to input stimuli and, on the other side, is autonomously active in the absence of sensory inputs. Stimulus and information processing depend crucially on the qualia of the autonomous-state dynamics of the ongoing neural activity. This default neural activity may be dynamically structured in time and space, showing regular, synchronized, bursting or chaotic activity patterns. We study the influence of non-synaptic plasticity on the default dynamical state of recurrent neural networks. The non-synaptic adaptation considered acts on intrinsic neural parameters, such as the threshold and the gain, and is driven by the optimization of the information entropy. In the presence of the intrinsic adaptation processes, we observe three distinct and globally attracting dynamical regimes: a regular synchronized, an overall chaotic and an intermittent bursting regime. The intermittent bursting regime is characterized by intervals of regular flow, which are quite insensitive to external stimuli, interspersed with chaotic bursts that respond sensitively to input signals. We discuss these findings in the context of self-organized information processing and critical brain dynamics.

    Comment: 24 pages, 8 figures
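A well-known example of entropy-driven intrinsic plasticity is Triesch's gradient rule for a sigmoid neuron, which adapts the gain and threshold so that the output distribution approaches an exponential with target mean firing rate mu. A sketch of that rule under Gaussian input statistics (this is one standard rule of the type described, not necessarily the paper's exact adaptation scheme; all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Triesch-style intrinsic plasticity for y = sigmoid(a*x + b): gradient
# updates pushing the output distribution toward an exponential with
# mean mu, which maximizes output entropy at fixed mean rate.
a, b = 1.0, 0.0          # gain and threshold (intrinsic parameters)
mu, eta = 0.2, 0.01      # target mean firing rate, learning rate

for _ in range(20000):
    x = rng.normal(0.0, 1.0)          # illustrative input statistics
    y = sigmoid(a * x + b)
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + y * y / mu)
    da = eta / a + db * x
    a += da
    b += db

# After adaptation the mean output rate should sit near the target mu.
xs = rng.normal(0.0, 1.0, 10000)
mean_rate = float(sigmoid(a * xs + b).mean())
```

Applied to every neuron of a recurrent network, a rule of this kind couples the intrinsic parameters to the ongoing activity and can reshape the default dynamical state, which is the setting the paper investigates.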

    Multi-almost periodicity and invariant basins of general neural networks under almost periodic stimuli

    In this paper, we investigate the convergence dynamics of 2^N almost periodic encoded patterns of general neural networks (GNNs) subjected to external almost periodic stimuli, including almost periodic delays. Invariant regions are established for the existence of 2^N almost periodic encoded patterns under two classes of activation functions. By employing the property of the M-cone and an inequality technique, attracting basins are estimated and some criteria are derived for the networks to converge exponentially toward the 2^N almost periodic encoded patterns. The obtained results are new; they extend and generalize the corresponding results in the previous literature.

    Comment: 28 pages, 4 figures

    Boundedness and global exponential stability for delayed differential equations with applications

    The boundedness of solutions for a class of n-dimensional differential equations with distributed delays is established by assuming the existence of instantaneous negative feedbacks which dominate the delay effect. As an important by-product, some criteria for global exponential stability of equilibria are obtained. The results are illustrated with applications to delayed neural networks and population dynamics models.

    POCI 2010; CMAT; Fundação para a Ciência e a Tecnologia (FCT) - SFRH/BD/29563/2006; CMAF; FEDER
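Dominance conditions of this kind are typically checked componentwise: for a system x_i'(t) = -a_i x_i(t) + sum_j b_ij f_j(x_j(t - tau_ij)) with Lipschitz constants L_j, a sufficient criterion of the "instantaneous feedback dominates the delay effect" type requires a_i > sum_j |b_ij| L_j for every i. A numerical sketch with illustrative matrices (not taken from the paper):

```python
import numpy as np

# Componentwise dominance check: a_i > sum_j |b_ij| * L_j for all i.
# All values below are illustrative.
a = np.array([2.0, 1.5, 3.0])          # instantaneous decay rates
B = np.array([[ 0.3, -0.5,  0.2],
              [ 0.4,  0.1, -0.6],
              [-0.2,  0.8,  0.5]])     # delayed connection weights
L = np.array([1.0, 1.0, 1.0])          # Lipschitz constants of the f_j

dominates = bool(np.all(a > np.abs(B) @ L))
```

When `dominates` is true, criteria of this family yield boundedness of all solutions and, as the abstract notes, global exponential stability of the equilibrium as a by-product.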