111 research outputs found

    Generalized non-autonomous Cohen-Grossberg neural network model

    Full text link
    In the present paper, we investigate both the global exponential stability and the existence of a periodic solution for a general differential equation with unbounded distributed delays. The main stability criterion depends on the dominance of the non-delay terms over the delay terms. The criterion for the existence of a periodic solution is obtained by applying the coincidence degree theorem. We use the main results to obtain criteria for the existence and global exponential stability of periodic solutions of a generalized higher-order periodic Cohen-Grossberg neural network model with discrete time-varying delays and infinite distributed delays. Additionally, we provide a comparison with results in the literature and a numerical simulation to illustrate the effectiveness of some of our results. (Comment: 30 pages.)
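
    The convergence to a periodic solution described above can be illustrated numerically. The following is a minimal hypothetical sketch (parameters and equation chosen for illustration, not taken from the paper): a scalar delayed equation with a 2*pi-periodic input, integrated by forward Euler, whose trajectory settles onto a periodic orbit when the non-delay decay dominates the delay gain.

```python
import math

# Hypothetical example: x'(t) = -a*x(t) + b*tanh(x(t - tau)) + sin(t).
# With a > b, the non-delay term dominates and the solution is expected
# to converge to a 2*pi-periodic orbit.
a, b, tau = 2.0, 0.5, 1.0
dt = 0.001
steps = int(40.0 / dt)          # integrate on [0, 40]
delay = int(tau / dt)           # delay expressed in grid points

x = [1.0] * (delay + 1)         # constant initial history on [-tau, 0]
for n in range(steps):
    t = n * dt
    xn = x[-1]                  # current state x(t)
    xd = x[-1 - delay]          # delayed state x(t - tau)
    x.append(xn + dt * (-a * xn + b * math.tanh(xd) + math.sin(t)))

# Compare states one period (2*pi) apart near the end of the run; a small
# gap indicates the trajectory has settled onto a periodic solution.
period = int(round(2 * math.pi / dt))
gap = abs(x[-1] - x[-1 - period])
```

    After the transient dies out, `gap` is small, consistent with convergence to a periodic orbit.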

    Boundedness and stability for Cohen–Grossberg neural network with time-varying delays

    Get PDF
    Abstract: In this paper, a model is considered to describe the dynamics of a Cohen–Grossberg neural network with variable coefficients and time-varying delays. Uniform ultimate boundedness and uniform boundedness are studied for the model by utilizing the Hardy inequality. Combining the Halanay inequality with the Lyapunov functional method, some new sufficient conditions are derived for the model to be globally exponentially stable. The activation functions are not assumed to be differentiable or strictly increasing. Moreover, no assumption on the symmetry of the connection matrices is necessary. These criteria are important in signal processing and in the design of networks.
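
    The Halanay inequality underlying such stability criteria can be illustrated on a scalar example (parameters are hypothetical, not from the paper): for x'(t) = -a*x(t) + b*x(t - tau) with a > b > 0, solutions decay exponentially regardless of the bounded delay tau.

```python
import math

# Halanay-type decay: a = 2 dominates b = 1, so the solution of the
# delayed equation should decay exponentially despite the delay tau = 1.
a, b, tau = 2.0, 1.0, 1.0
dt = 0.01
delay = int(tau / dt)                   # delay in grid points
x = [1.0] * (delay + 1)                 # constant history equal to 1 on [-tau, 0]
for n in range(int(20.0 / dt)):         # forward Euler on [0, 20]
    x.append(x[-1] + dt * (-a * x[-1] + b * x[-1 - delay]))
```

    With these values the Halanay decay rate lambda solves lambda = a - b*exp(lambda*tau) (roughly 0.44 here), so x(20) is already far below its initial value.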

    Bifurcational Behavior of a Cohen-Grossberg Neural Network of Two Neurons with Impulsive Effects

    Get PDF
    Abstract: In this paper, a Cohen-Grossberg neural network composed of two neurons with nonisochronous impulsive effects is proposed and investigated. By employing Mawhin's coincidence theorem, we first show the existence of semi-trivial periodic solutions. On this basis, sufficient conditions ensuring the asymptotic stability of the semi-trivial periodic solutions are derived by using Floquet theory for impulsive differential equations. Finally, we extend the method…

    Stability analysis of impulsive stochastic Cohen–Grossberg neural networks with mixed time delays

    Get PDF
    This is the post-print version of the article. The official published version can be obtained from the link. Copyright 2008 Elsevier Ltd. In this paper, the problem of stability analysis for a class of impulsive stochastic Cohen–Grossberg neural networks with mixed delays is considered. The mixed time delays comprise both time-varying and infinite distributed delays. By employing a combination of M-matrix theory and stochastic analysis techniques, a sufficient condition is obtained to ensure the existence, uniqueness, and exponential p-stability of the equilibrium point for the addressed impulsive stochastic Cohen–Grossberg neural network with mixed delays. The proposed method, which does not make use of a Lyapunov functional, is shown to be simple yet effective for analyzing the stability of impulsive or stochastic neural networks with variable and/or distributed delays. We then extend our main results to the case where the parameters contain interval uncertainties. Moreover, the exponential convergence rate index, which depends on the system parameters, is estimated. An example is given to show the effectiveness of the obtained results. This work was supported by the Natural Science Foundation of CQ CSTC under Grant 2007BB0430, the Scientific Research Fund of Chongqing Municipal Education Commission under Grant KJ070401, an International Joint Project sponsored by the Royal Society of the UK and the National Natural Science Foundation of China, and the Alexander von Humboldt Foundation of Germany.
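
    M-matrix conditions of the kind used here are easy to check numerically. The sketch below (an illustrative helper, not the paper's criterion) tests the classical characterization: A is a nonsingular M-matrix iff A = s*I - B with B entrywise nonnegative and s greater than the spectral radius of B.

```python
import numpy as np

def is_nonsingular_M_matrix(A):
    """Check whether A is a nonsingular M-matrix via A = s*I - B,
    B >= 0 entrywise, s > spectral radius of B."""
    A = np.asarray(A, dtype=float)
    off = A - np.diag(np.diag(A))
    if np.any(off > 0):                 # off-diagonal entries must be <= 0
        return False
    s = float(np.max(np.diag(A)))
    B = s * np.eye(A.shape[0]) - A      # B >= 0 by construction
    rho = float(np.max(np.abs(np.linalg.eigvals(B))))
    return s > rho
```

    For instance, [[2, -1], [-1, 2]] passes the test, while [[1, -2], [-2, 1]] fails because the delay-coupling entries are too large relative to the diagonal.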

    Global asymptotic stability of nonautonomous Cohen-Grossberg neural network models with infinite delays

    Get PDF
    For a general Cohen-Grossberg neural network model with potentially unbounded time-varying coefficients and infinite distributed delays, we give sufficient conditions for its global asymptotic stability. The model studied is general enough to include, as subclasses, most of the famous neural network models, such as the Cohen-Grossberg, Hopfield, and bidirectional associative memory models. Contrary to what is usual in the literature, we do not use Lyapunov functionals in the proofs. As illustration, the results are applied to several concrete models studied in the literature, and a comparison shows that our results give new global stability criteria for several neural network models and improve some earlier publications. The second author's research was supported by the Research Centre of Mathematics of the University of Minho with Portuguese Funds from the "Fundação para a Ciência e a Tecnologia", through the project PEstOE/MAT/UI0013/2014. The authors thank the referee for valuable comments.

    Global stability of a Cohen-Grossberg neural network with both time-varying and continuous distributed delays

    Get PDF
    In this paper, a generalized neural network of Cohen-Grossberg type with both discrete time-varying and distributed unbounded delays is considered. Based on M-matrix theory, sufficient conditions are established to ensure the existence and global attractivity of an equilibrium point. The global exponential stability of the equilibrium is also addressed, but for the model with bounded discrete time-varying delays. A comparison shows that these results generalize and improve some earlier publications. Fundação para a Ciência e a Tecnologia (FCT); Universidade do Minho, Centro de Matemática (CMAT).

    An LMI approach to global asymptotic stability of the delayed Cohen-Grossberg neural network via nonsmooth analysis

    Get PDF
    In this paper, a linear matrix inequality (LMI) approach to the global asymptotic stability of the delayed Cohen-Grossberg neural network is investigated by means of nonsmooth analysis. Several new sufficient conditions are presented to ascertain the uniqueness of the equilibrium point and the global asymptotic stability of the neural network. It is noted that the results herein require neither the smoothness of the behaved functions and the activation functions nor the boundedness of the activation functions. In addition, the theoretical analysis shows that the condition ensuring the global asymptotic stability of the neural network also implies the uniqueness of the equilibrium. The obtained results improve many earlier ones and are easy to apply. Some simulation results are shown to substantiate the theoretical results.
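
    The LMI idea can be sketched in its simplest, undelayed form (this is an illustrative special case, not the paper's LMI): for x' = A x, asymptotic stability is certified by a symmetric P > 0 with A^T P + P A < 0. For a fixed Q > 0 one can solve the Lyapunov equation A^T P + P A = -Q directly via vectorization and then check positive definiteness of P.

```python
import numpy as np

def lyapunov_certificate(A, Q=None):
    """Solve A^T P + P A = -Q and report whether P is positive definite."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    if Q is None:
        Q = np.eye(n)
    I = np.eye(n)
    # vec(A^T P + P A) = (I kron A^T + A^T kron I) vec(P)
    M = np.kron(I, A.T) + np.kron(A.T, I)
    P = np.linalg.solve(M, -Q.flatten()).reshape(n, n)
    P = 0.5 * (P + P.T)                           # symmetrize roundoff
    positive = bool(np.all(np.linalg.eigvalsh(P) > 0))
    return P, positive

# Hurwitz example: A = diag(-1, -2) yields P = diag(0.5, 0.25) > 0.
P, ok = lyapunov_certificate([[-1.0, 0.0], [0.0, -2.0]])
```

    A nonsmooth or delayed setting replaces this algebraic solve with an LMI feasibility problem, but the certificate structure is the same.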

    Global exponential stability of nonautonomous neural network models with unbounded delays

    Get PDF
    For a nonautonomous class of n-dimensional differential systems with infinite delays, we give sufficient conditions for global exponential stability, without showing the existence of an equilibrium point, a periodic solution, or an almost periodic solution. We apply our main result to several concrete neural network models studied in the literature, and a comparison of results is given. Contrary to what is usual in the literature about neural networks, the assumption of bounded coefficients is not needed to obtain global exponential stability. Finally, we present numerical examples to illustrate the effectiveness of our results. The paper was supported by the Research Center of Mathematics of the University of Minho with Portuguese Funds from the FCT - "Fundação para a Ciência e a Tecnologia", through the Project UID/MAT/00013/2013. The author thanks the referees for valuable comments.

    Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis

    Get PDF
    This is the post-print version of the article. The official published version can be obtained from the link below. Copyright 2007 Elsevier Ltd. This Letter is concerned with the problem of exponential stability analysis for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the recently commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, where the feasibility of such an LMI can be easily checked by using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
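
    The discrete-time counterpart of the Lyapunov certificate can be sketched in its delay-free special case (illustrative only; the paper's LMI additionally handles the time-varying delay via a Lyapunov–Krasovskii functional): for x_{k+1} = A x_k, exponential stability is certified by P > 0 with A^T P A - P < 0, and for fixed Q > 0 the equation A^T P A - P = -Q can be solved by vectorization.

```python
import numpy as np

def discrete_lyapunov_certificate(A, Q=None):
    """Solve A^T P A - P = -Q and report whether P is positive definite."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    if Q is None:
        Q = np.eye(n)
    # vec(A^T P A - P) = (A^T kron A^T - I) vec(P)
    M = np.kron(A.T, A.T) - np.eye(n * n)
    P = np.linalg.solve(M, -Q.flatten()).reshape(n, n)
    P = 0.5 * (P + P.T)                           # symmetrize roundoff
    return P, bool(np.all(np.linalg.eigvalsh(P) > 0))

# Schur-stable example: A = 0.5*I gives P = (4/3)*I > 0.
P, ok = discrete_lyapunov_certificate([[0.5, 0.0], [0.0, 0.5]])
```

    Feasibility of the full delayed LMI is what the Matlab LMI Toolbox checks in the paper; the algebraic solve above is the idea behind that test.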

    Robustness analysis of Cohen-Grossberg neural network with piecewise constant argument and stochastic disturbances

    Get PDF
    Robustness of neural networks has been a hot topic in recent years. This paper studies the robustness of the global exponential stability of Cohen-Grossberg neural networks with a piecewise constant argument and stochastic disturbances, and discusses whether the Cohen-Grossberg neural networks can still maintain global exponential stability under the perturbation of a piecewise constant argument and stochastic disturbances. By using stochastic analysis theory and inequality techniques, the interval length of the piecewise constant argument and the upper bound of the noise intensity are derived by solving transcendental equations. Finally, we offer several examples to illustrate the efficacy of the findings.
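
    The setting can be illustrated by a small Monte Carlo experiment (all parameters are hypothetical, not from the paper): a scalar Cohen-Grossberg-type equation dx = (-a*x + c*tanh(x(gamma(t)))) dt + sigma*x dW, where gamma(t) = floor(t) is a piecewise constant argument, simulated by Euler-Maruyama. For small noise intensity sigma with a > c, the mean-square state should remain small, consistent with retained exponential stability.

```python
import math
import random

random.seed(0)
a, c, sigma = 2.0, 0.5, 0.1
dt, T, paths = 0.01, 10.0, 200
steps = int(T / dt)
per_unit = int(1.0 / dt)                # grid steps per unit interval

second_moment = 0.0
for _ in range(paths):
    x = 1.0
    x_piece = 1.0                       # x(gamma(t)): state at the last integer time
    for n in range(steps):
        if n % per_unit == 0:
            x_piece = x                 # refresh the piecewise constant argument
        dW = random.gauss(0.0, math.sqrt(dt))
        x += dt * (-a * x + c * math.tanh(x_piece)) + sigma * x * dW
    second_moment += x * x
second_moment /= paths                  # Monte Carlo estimate of E[x(T)^2]
```

    Sweeping sigma upward (or lengthening the constancy intervals of gamma) until the estimated second moment stops decaying gives a numerical feel for the robustness bounds the paper derives analytically.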