177 research outputs found
Anti-periodic solution for fuzzy Cohen–Grossberg neural networks with time-varying and distributed delays
In this paper, by using a continuation theorem of coincidence degree theory and a differential inequality, we establish sufficient conditions ensuring the existence and global exponential stability of anti-periodic solutions for a class of fuzzy Cohen–Grossberg neural networks with time-varying and distributed delays. In addition, we present an illustrative example to show the feasibility of the obtained results.
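The abstract does not reproduce the model itself; for orientation, the classical Cohen–Grossberg form with time-varying delays (without the fuzzy min/max terms these papers add) is conventionally written as below. The symbols are standard conventions, not taken from the paper:

```latex
\dot{x}_i(t) = -a_i\bigl(x_i(t)\bigr)\Bigl[\, b_i\bigl(x_i(t)\bigr)
  - \sum_{j=1}^{n} c_{ij}\, f_j\bigl(x_j(t)\bigr)
  - \sum_{j=1}^{n} d_{ij}\, f_j\bigl(x_j(t-\tau_{ij}(t))\bigr) + I_i \Bigr],
\qquad i = 1,\dots,n,
```

where $a_i$ are the amplification functions, $b_i$ the self-regulating (behaved) functions, $f_j$ the activation functions, $\tau_{ij}(t)$ the time-varying delays, and $I_i$ the external inputs.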
pth moment exponential stability of stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays
In this paper, stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays are investigated. By using a Lyapunov function and the Itô differential formula, sufficient conditions for the pth moment exponential stability of such networks are established. An example is given to illustrate the feasibility of the main theoretical findings, and the paper ends with a brief conclusion.
New Stability Criterion for Takagi-Sugeno Fuzzy Cohen-Grossberg Neural Networks with Probabilistic Time-Varying Delays
A new global asymptotic stability criterion for Takagi-Sugeno fuzzy Cohen-Grossberg neural networks with probabilistic time-varying delays is derived, in which the diffusion term plays its role. Because the boundedness conditions on the amplification functions are dropped, the main result is novel to some extent. There is a further novelty in the method: the Lyapunov-Krasovskii functional is a positive definite form of pth powers, which differs from those used in the existing literature. Moreover, a numerical example illustrates the effectiveness of the proposed methods.
Global asymptotic stability of nonautonomous Cohen-Grossberg neural network models with infinite delays
For a general Cohen-Grossberg neural network model with potentially unbounded time-varying coefficients and infinite distributed delays, we give sufficient conditions for its global asymptotic stability. The model studied is general enough to include, as subclasses, most of the well-known neural network models, such as the Cohen-Grossberg, Hopfield, and bidirectional associative memory networks. Contrary to what is usual in the literature, the proofs do not use Lyapunov functionals. As illustrated, the results are applied to several concrete models studied in the literature, and a comparison of results shows that our results give new global stability criteria for several neural network models and improve on some earlier publications. The second author's research was supported by the Research Centre of Mathematics of the University of Minho with Portuguese funds from the "Fundação para a Ciência e a Tecnologia", through the project PEstOE/MAT/UI0013/2014. The authors thank the referee for valuable comments.
Global exponential stability of nonautonomous neural network models with unbounded delays
For a nonautonomous class of n-dimensional differential systems with infinite delays, we give sufficient conditions for global exponential stability, without showing the existence of an equilibrium point, a periodic solution, or an almost periodic solution. We apply our main result to several concrete neural network models studied in the literature, and a comparison of results is given. Contrary to what is usual in the literature on neural networks, the assumption of bounded coefficients is not needed to obtain global exponential stability. Finally, we present numerical examples to illustrate the effectiveness of our results. The paper was supported by the Research Centre of Mathematics of the University of Minho with Portuguese funds from the FCT - "Fundação para a Ciência e a Tecnologia", through the project UID/MAT/00013/2013. The author thanks the referees for valuable comments.
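For reference, the notion of global exponential stability used throughout these abstracts is standard (the definition below is the usual one and is not quoted from any of the papers): there exist constants $M \ge 1$ and $\lambda > 0$ such that any two solutions with initial conditions $\varphi$ and $\psi$ satisfy

```latex
\bigl\| x(t; t_0, \varphi) - x(t; t_0, \psi) \bigr\|
  \le M\, e^{-\lambda (t - t_0)}\, \| \varphi - \psi \|,
\qquad t \ge t_0,
```

so that all trajectories converge toward one another at a uniform exponential rate, regardless of initial data.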
Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis
This is the post-print version of the article; the official published version can be obtained from the link below. Copyright 2007 Elsevier Ltd. This Letter is concerned with the analysis problem of exponential stability for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the recently commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, where the feasibility of such an LMI can be easily checked using the numerically efficient MATLAB LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of the Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
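The Letter's delay-dependent LMI is not reproduced in the abstract. As a rough sketch of the underlying idea — numerically certifying exponential stability of a discrete-time system through a Lyapunov matrix condition — one can check the delay-free case in Python with SciPy in place of the MATLAB LMI Toolbox. The state matrix A below is a made-up example, not taken from the paper:

```python
# Sketch: certify exponential stability of x(k+1) = A x(k) via a
# discrete-time Lyapunov condition A^T P A - P < 0 with P > 0.
# This mirrors the role LMI feasibility plays for the delayed DRNN,
# but only for the delay-free case; A is a hypothetical example.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# A made-up Schur-stable state matrix (spectral radius < 1).
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])

# Solve the discrete Lyapunov equation A^T P A - P = -Q with Q = I.
# solve_discrete_lyapunov(a, q) solves a X a^T - X + q = 0, so pass a = A.T.
Q = np.eye(2)
P = solve_discrete_lyapunov(A.T, Q)

# A positive definite P is the stability certificate.
eigenvalues = np.linalg.eigvalsh(P)
print(eigenvalues.min() > 0)  # True: the certificate exists
```

When a strictly positive definite P exists, the quadratic form x^T P x decreases along trajectories, which is exactly the kind of certificate the Letter encodes (in a far richer, delay-dependent form) as an LMI.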
General criteria for asymptotic and exponential stabilities of neural network models with unbounded delays
For a family of differential equations with infinite delay, we give sufficient conditions for the global asymptotic and global exponential stability of an equilibrium point. This family includes most of the delayed neural network models of Cohen-Grossberg type, with both bounded and unbounded distributed delays, for which general asymptotic and exponential stability criteria are derived. As illustrations, the results are applied to several concrete models studied in the literature, and a comparison of results is given. Fundação para a Ciência e a Tecnologia (FCT) - 2009-ISFL-1-209; Universidade do Minho, Centro de Matemática (CMAT).
A delay-dependent LMI approach to dynamics analysis of discrete-time recurrent neural networks with time-varying delays
This is the post-print version of the article; the official published version can be obtained from the link below. Copyright 2007 Elsevier Ltd. In this Letter, the analysis problem for the existence and stability of periodic solutions is investigated for a class of general discrete-time recurrent neural networks with time-varying delays. For the neural networks under study, a generalized activation function is considered, and the traditional assumptions on the boundedness, monotonicity, and differentiability of the activation functions are removed. By employing the latest free-weighting matrix method, an appropriate Lyapunov–Krasovskii functional is constructed and several sufficient conditions are established to ensure the existence, uniqueness, and global exponential stability of the periodic solution for the addressed neural network. The conditions depend on both the lower and upper bounds of the time-varying delays. Furthermore, the conditions are expressed in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI Toolbox in MATLAB. Two simulation examples are given to show the effectiveness and reduced conservatism of the proposed criteria. This work was supported in part by the National Natural Science Foundation of China under Grant 50608072, an International Joint Project sponsored by the Royal Society of the UK and the National Natural Science Foundation of China, and the Alexander von Humboldt Foundation of Germany.