Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis
This is the post-print version of the article; the official published version can be obtained from the link below. Copyright 2007 Elsevier Ltd.
This Letter is concerned with the exponential stability analysis problem for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, where the feasibility of such an LMI can be checked easily using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition.
This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
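In the delay-free linear special case, an LMI-based exponential stability test of the kind described above reduces to a discrete Lyapunov condition: the system x(k+1) = A x(k) is exponentially stable iff there exists P > 0 with AᵀPA − P = −Q for some Q > 0. A minimal sketch with SciPy (the matrix A is an illustrative placeholder, not taken from the paper):

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical weight matrix of a two-neuron delay-free linearization
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])

Q = np.eye(2)
# Solve the discrete Lyapunov equation  A^T P A - P = -Q
P = solve_discrete_lyapunov(A.T, Q)

# A symmetric positive-definite P certifies exponential stability
eigs = np.linalg.eigvalsh((P + P.T) / 2)
stable = bool(np.all(eigs > 0))
print(stable)  # True for this A (spectral radius 0.6 < 1)
```

The full delayed, generalized-activation case of the Letter requires a Lyapunov–Krasovskii functional and an LMI solver rather than this closed-form equation; the sketch only illustrates the underlying feasibility idea.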
Robust synchronization of an array of coupled stochastic discrete-time delayed neural networks
Copyright 2008 IEEE. This material is posted here with permission of the IEEE; internal or personal use is permitted, but permission to reprint/republish for advertising, promotional, or redistribution purposes must be obtained from the IEEE by writing to [email protected].
This paper is concerned with the robust synchronization problem for an array of coupled stochastic discrete-time neural networks with time-varying delay. Each individual neural network is subject to parameter uncertainty, stochastic disturbance, and time-varying delay, where the norm-bounded parameter uncertainties enter both the state and weight matrices, the stochastic disturbance takes the form of a scalar Wiener process, and the time delay enters the activation function. For the array of coupled neural networks, constant coupling and delayed coupling are considered simultaneously. We aim to establish easy-to-verify conditions under which the addressed neural networks are synchronized. By using the Kronecker product as an effective tool, a linear matrix inequality (LMI) approach is developed to derive several sufficient criteria ensuring that the coupled delayed neural networks are globally, robustly, exponentially synchronized in the mean square. The LMI-based conditions obtained depend on both the lower and the upper bound of the time-varying delay, and can be solved efficiently via the Matlab LMI Toolbox. Two numerical examples are given to demonstrate the usefulness of the proposed synchronization scheme.
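The Kronecker-product device mentioned above packs an array of identical coupled networks into a single linear map: with N copies of an n-dimensional network A and an outer coupling matrix G (zero row sums) acting through an inner coupling Γ, the stacked state evolves under I_N ⊗ A + G ⊗ Γ. A deterministic toy simulation (all matrices are illustrative placeholders, without the stochastic and uncertain terms of the paper):

```python
import numpy as np

# Hypothetical data: 3 coupled two-neuron networks
N, n = 3, 2
A = np.array([[0.4, 0.1],
              [0.0, 0.3]])          # individual network weight matrix
G = np.array([[-2.0, 1.0, 1.0],
              [ 1.0, -2.0, 1.0],
              [ 1.0, 1.0, -2.0]])   # outer coupling, zero row sums
Gamma = 0.1 * np.eye(n)             # inner coupling

# Kronecker product collapses the N coupled copies into one linear map
M = np.kron(np.eye(N), A) + np.kron(G, Gamma)

# Iterate: the states of the copies converge toward one another
x = np.random.default_rng(0).normal(size=N * n)
for _ in range(200):
    x = M @ x
spread = np.ptp(x.reshape(N, n), axis=0)  # per-coordinate spread across copies
```

For this choice, every matrix A + μΓ (μ an eigenvalue of G) has spectral radius below one, so the spread across copies decays exponentially, which is the synchronization notion the LMI criteria certify in far more general settings.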
On Exponential Periodicity And Stability of Nonlinear Neural Networks With Variable Coefficients And Distributed Delays
The exponential periodicity and stability of continuous nonlinear neural networks with variable coefficients and distributed delays are investigated by employing the Young inequality technique and the Lyapunov method.
Some new sufficient conditions ensuring the existence and uniqueness of a periodic solution for a general class of neural systems are obtained, without assuming that the activation functions are bounded, differentiable, or strictly increasing. Moreover, symmetry of the connection matrix is not required. Thus, we generalize and improve some previous works, and the criteria are easy to check and apply in practice.
A delay-dependent LMI approach to dynamics analysis of discrete-time recurrent neural networks with time-varying delays
Copyright 2007 Elsevier Ltd.
In this Letter, the analysis problem for the existence and stability of periodic solutions is investigated for a class of general discrete-time recurrent neural networks with time-varying delays. For the neural networks under study, a generalized activation function is considered, and the traditional assumptions on the boundedness, monotonicity, and differentiability of the activation functions are removed. By employing the latest free-weighting matrix method, an appropriate Lyapunov–Krasovskii functional is constructed and several sufficient conditions are established to ensure the existence, uniqueness, and global exponential stability of the periodic solution for the addressed neural network. The conditions depend on both the lower and the upper bound of the time-varying delays. Furthermore, the conditions are expressed in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Two simulation examples are given to show the effectiveness and reduced conservatism of the proposed criteria.
This work was supported in part by the National Natural Science Foundation of China under Grant 50608072, an International Joint Project sponsored by the Royal Society of the UK and the National Natural Science Foundation of China, and the Alexander von Humboldt Foundation of Germany.
Robust stability for stochastic Hopfield neural networks with time delays
Copyright 2006 Elsevier Ltd.
In this paper, the asymptotic stability analysis problem is considered for a class of uncertain stochastic neural networks with time delays and parameter uncertainties. The delays are time-invariant, and the uncertainties are norm-bounded and enter all the network parameters. The aim of this paper is to establish easily verifiable conditions under which the delayed neural network is robustly asymptotically stable in the mean square for all admissible parameter uncertainties. By employing a Lyapunov–Krasovskii functional and conducting stochastic analysis, a linear matrix inequality (LMI) approach is developed to derive the stability criteria. The proposed criteria can be checked readily using some standard numerical packages, and no tuning of parameters is required. Examples are provided to demonstrate the effectiveness and applicability of the proposed criteria.
This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, and the Alexander von Humboldt Foundation of Germany.
Boundedness and stability for Cohen–Grossberg neural networks with time-varying delays
In this paper, a model is considered to describe the dynamics of Cohen–Grossberg neural networks with variable coefficients and time-varying delays. Uniform ultimate boundedness and uniform boundedness are studied for the model by utilizing the Hardy inequality. Combining the Halanay inequality with the Lyapunov functional method, some new sufficient conditions are derived for the model to be globally exponentially stable. The activation functions are not assumed to be differentiable or strictly increasing. Moreover, no assumption on the symmetry of the connection matrices is necessary. These criteria are important in signal processing and the design of networks.
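The Halanay-type argument invoked above can be illustrated with a scalar discrete recursion: if v(k+1) = a·v(k) + b·max over a delay window and a + b < 1, then v decays geometrically. A minimal numerical sketch (the constants a, b, d are illustrative, not from the paper):

```python
# Discrete Halanay-type recursion (illustrative scalar sketch):
#   v(k+1) = a*v(k) + b*max(v(k-d), ..., v(k)),  with a + b < 1
a, b, d = 0.5, 0.3, 4
v = [1.0] * (d + 1)                    # constant positive initial history
for k in range(100):
    v.append(a * v[-1] + b * max(v[-(d + 1):]))

# a + b < 1 forces geometric decay despite the delayed maximum term
ratio = v[-1] / v[-11]                 # decay factor over ten steps, below 1
```

In the paper this inequality is applied to a Lyapunov functional of the full vector Cohen–Grossberg model rather than to a scalar sequence; the sketch only shows why the a + b < 1 balance yields exponential decay.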
Stochastic stability of uncertain Hopfield neural networks with discrete and distributed delays
Copyright 2006 Elsevier Ltd.
This Letter is concerned with the global asymptotic stability analysis problem for a class of uncertain stochastic Hopfield neural networks with discrete and distributed time delays. By utilizing a Lyapunov–Krasovskii functional, using the well-known S-procedure, and conducting stochastic analysis, we show that the addressed neural networks are robustly, globally, asymptotically stable if a convex optimization problem is feasible. The stability criteria are then derived in terms of linear matrix inequalities (LMIs), which can be solved effectively by some standard numerical packages. The main results are also extended to the multiple time-delay case. Two numerical examples are given to demonstrate the usefulness of the proposed global stability condition.
This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, and the Alexander von Humboldt Foundation of Germany.
Almost periodic solutions of retarded SICNNs with functional response on piecewise constant argument
We consider a new model for shunting inhibitory cellular neural networks: retarded functional differential equations with piecewise constant argument. The existence and exponential stability of almost periodic solutions are investigated. An illustrative example is provided. Comment: 24 pages, 1 figure