303 research outputs found

    On asymptotic stability of discrete-time non-autonomous delayed Hopfield neural networks

    In this paper, we obtain some sufficient conditions for determining the asymptotic stability of discrete-time non-autonomous delayed Hopfield neural networks by utilizing the Lyapunov functional method. An example is given to show the validity of the results.
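    For context, discrete-time non-autonomous delayed Hopfield networks of the kind analyzed here are commonly written in a form similar to the following (an illustrative template, not necessarily the exact model of the paper):

    \[
      x_i(n+1) = a_i(n)\,x_i(n) + \sum_{j=1}^{m} b_{ij}(n)\, f_j\bigl(x_j(n-\tau_{ij}(n))\bigr) + I_i(n),
      \qquad i = 1,\dots,m,
    \]

    where $a_i(n)$ are time-dependent self-feedback coefficients, $b_{ij}(n)$ the delayed connection weights, $f_j$ the activation functions, $\tau_{ij}(n)$ bounded delays, and $I_i(n)$ external inputs; a Lyapunov functional is then used to bound the solutions of this difference system.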

    Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis

    This Letter is concerned with the analysis problem of exponential stability for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is of a time-varying nature, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the recently commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, where the feasibility of such an LMI can be easily checked using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
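    As a rough illustration of how such an LMI feasibility check is carried out in practice, the sketch below uses Python with cvxpy in place of the Matlab LMI Toolbox mentioned above. The system matrices and the particular delay-independent Lyapunov–Krasovskii condition are hypothetical placeholders, not the DRNN condition derived in the Letter.

    import numpy as np
    import cvxpy as cp

    # Hypothetical linear delayed system x(k+1) = A x(k) + B x(k - tau)
    A = np.array([[0.5, 0.1],
                  [0.0, 0.4]])
    B = np.array([[0.10, 0.00],
                  [0.05, 0.10]])
    n = A.shape[0]

    # Decision variables of a Lyapunov-Krasovskii functional
    # V(k) = x(k)' P x(k) + sum_{s=k-tau}^{k-1} x(s)' Q x(s)
    P = cp.Variable((n, n), symmetric=True)
    Q = cp.Variable((n, n), symmetric=True)
    eps = 1e-6

    # Delay-independent condition: the block matrix below must be negative definite
    M = cp.bmat([[A.T @ P @ A - P + Q, A.T @ P @ B],
                 [B.T @ P @ A,         B.T @ P @ B - Q]])

    constraints = [P >> eps * np.eye(n),
                   Q >> eps * np.eye(n),
                   M << -eps * np.eye(2 * n)]
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()
    print("LMI feasible:", prob.status == cp.OPTIMAL)

    If the solver reports feasibility, the corresponding P and Q certify exponential stability of the hypothetical delayed system; the Letter performs the analogous check for its DRNN-specific LMI.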

    General criteria for asymptotic and exponential stabilities of neural network models with unbounded delays

    For a family of differential equations with infinite delay, we give sufficient conditions for the global asymptotic and global exponential stability of an equilibrium point. This family includes most of the delayed neural network models of Cohen-Grossberg type, with both bounded and unbounded distributed delays, for which general asymptotic and exponential stability criteria are derived. As illustrations, the results are applied to several concrete models studied in the literature, and a comparison of results is given. Fundação para a Ciência e a Tecnologia (FCT) - 2009-ISFL-1-209; Universidade do Minho, Centro de Matemática (CMAT).
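    A representative member of such a family, shown only to fix ideas (the paper's setting is more general), is the Cohen-Grossberg model with unbounded distributed delays:

    \[
      \dot{x}_i(t) = -a_i\bigl(x_i(t)\bigr)\Bigl[\, b_i\bigl(x_i(t)\bigr)
        - \sum_{j=1}^{n} c_{ij} \int_{0}^{\infty} f_j\bigl(x_j(t-s)\bigr)\, d\eta_{ij}(s) \Bigr],
      \qquad i = 1,\dots,n,
    \]

    where $a_i$ are amplification functions, $b_i$ stabilizing functions, $c_{ij}$ connection weights, $f_j$ activation functions, and $\eta_{ij}$ are bounded-variation functions describing the delay distribution on $[0,\infty)$.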

    Convergence of asymptotic systems of non-autonomous neural network models with infinite distributed delays

    In this paper we investigate the global convergence of solutions of non-autonomous Hopfield neural network models with discrete time-varying delays, infinite distributed delays, and possibly unbounded coefficient functions. Instead of using Lyapunov functionals, we explore intrinsic features between the non-autonomous systems and their asymptotic systems to ensure the boundedness and global convergence of the solutions of the studied models. Our results are new and complement known results in the literature. The theoretical analysis is illustrated with some examples and numerical simulations. The paper was supported by the Research Centre of Mathematics of the University of Minho with Portuguese funds from the Fundação para a Ciência e a Tecnologia, through the Project PEstOE/MAT/UI0013/2014. The author thanks the referee for valuable comments.
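    A minimal numerical illustration of this kind of non-autonomous delayed Hopfield dynamics, with hypothetical coefficients and a single constant delay rather than the time-varying and distributed delays treated in the paper, can be obtained with an explicit Euler scheme:

    import numpy as np

    # Hypothetical non-autonomous delayed Hopfield model (not the paper's examples):
    #   x_i'(t) = -d_i(t) x_i(t) + sum_j a_ij(t) tanh(x_j(t - tau)) + I_i(t)
    dt, T, tau = 0.01, 50.0, 1.0
    steps, delay_steps = int(T / dt), int(tau / dt)
    n = 2
    history = np.zeros((steps + delay_steps, n))
    history[:delay_steps] = np.array([1.0, -0.5])          # constant initial function on [-tau, 0]

    d = lambda t: 2.0 + np.sin(t)                           # time-varying decay rate
    A = lambda t: 0.3 * np.array([[np.cos(t), 0.2], [0.1, np.sin(t)]])
    I = lambda t: np.array([0.1 * np.exp(-t), 0.0])         # fading external input

    for k in range(delay_steps, steps + delay_steps - 1):
        t = (k - delay_steps) * dt
        x, x_del = history[k], history[k - delay_steps]
        dx = -d(t) * x + A(t) @ np.tanh(x_del) + I(t)
        history[k + 1] = x + dt * dx                        # explicit Euler step

    print("final state:", history[-1])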

    Multi-almost periodicity and invariant basins of general neural networks under almost periodic stimuli

    In this paper, we investigate the convergence dynamics of 2^N almost periodic encoded patterns of general neural networks (GNNs) subjected to external almost periodic stimuli, including almost periodic delays. Invariant regions are established for the existence of 2^N almost periodic encoded patterns under two classes of activation functions. By employing the property of the M-cone and an inequality technique, attracting basins are estimated and some criteria are derived for the networks to converge exponentially toward 2^N almost periodic encoded patterns. The obtained results are new; they extend and generalize the corresponding results in the previous literature. Comment: 28 pages, 4 figures.

    Global asymptotic stability for neural network models with distributed delays

    In this paper, we obtain the global asymptotic stability of the zero solution of a general n-dimensional delayed differential system by imposing a condition of dominance of the non-delayed terms which cancels the delayed effect. We consider several delayed differential systems in general settings, which allow us to study, as subclasses, the well-known neural network models of Hopfield, Cohen-Grossberg, bidirectional associative memory, and static models with S-type distributed delays. For these systems, we establish sufficient conditions for the existence of a unique equilibrium and its global asymptotic stability, without using the Lyapunov functional technique. Our results improve and generalize some existing ones. Fundação para a Ciência e a Tecnologia (FCT).
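    To give a flavour of a "dominance of the non-delayed terms" condition on a simple Hopfield-type subclass (an illustrative special case, not the general criterion of the paper), consider

    \[
      \dot{x}_i(t) = -\beta_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\bigl(x_j(t-\tau_{ij})\bigr) + I_i,
      \qquad i = 1,\dots,n,
    \]

    with each $f_j$ Lipschitz with constant $L_j$. A typical sufficient condition of this type asks the instantaneous decay rates to dominate the delayed coupling, e.g.

    \[
      \beta_i > \sum_{j=1}^{n} |a_{ij}|\, L_j, \qquad i = 1,\dots,n,
    \]

    which gives a unique equilibrium and its global asymptotic stability irrespective of the size of the delays.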

    Nonuniform behavior and stability of Hopfield neural networks with delay

    Based on a new abstract result on the behavior of nonautonomous delayed equations, we obtain a stability result for the solutions of a general discrete nonautonomous Hopfield neural network model with delay. As an application we improve some existing results on the stability of Hopfield models. Antonio J. G. Bento and Cesar M. Silva were partially supported by FCT through CMA-UBI (project PEst-OE/MAT/UI0212/2013). Jose J. Oliveira was supported by the Research Centre of Mathematics of the University of Minho with Portuguese funds from the Fundação para a Ciência e a Tecnologia, through the Project PEstOE/MAT/UI0013/2014.

    Recent Advances and Applications of Fractional-Order Neural Networks

    This paper focuses on the growth, development, and future of various forms of fractional-order neural networks. Multiple advances in structure, learning algorithms, and methods have been critically investigated and summarized. This also includes recent trends in the dynamics of various fractional-order neural networks. The forms of fractional-order neural networks considered in this study are Hopfield, cellular, memristive, complex, and quaternion-valued networks. Further, the applications of fractional-order neural networks in computational fields such as system identification, control, optimization, and stability have been critically analyzed and discussed.
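    For readers unfamiliar with the term, fractional-order neural networks replace the integer-order time derivative by a fractional one, most commonly the Caputo derivative of order $\alpha \in (0,1)$. A generic fractional Hopfield-type model (an illustration, not a specific model from the survey) and the Caputo derivative read

    \[
      {}^{C}\!D^{\alpha} x_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\bigl(x_j(t)\bigr) + I_i,
      \qquad
      {}^{C}\!D^{\alpha} x(t) = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{x'(s)}{(t-s)^{\alpha}}\, ds .
    \]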