237 research outputs found

    Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis

    This Letter is concerned with the analysis problem of exponential stability for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the recently commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, where the feasibility of such an LMI can be easily checked by using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
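
    Where the abstract above mentions checking LMI feasibility numerically, the following is a minimal, hypothetical sketch of such a check. It uses CVXPY rather than the Matlab LMI Toolbox named in the abstract, and it encodes only a generic, delay-independent condition for a toy linear delayed system x(k+1) = A x(k) + Ad x(k - tau), not the paper's DRNN model or its actual LMI; the matrices A and Ad are illustrative assumptions.

```python
# Hypothetical sketch: delay-independent stability LMI for
# x(k+1) = A x(k) + Ad x(k - tau), checked with CVXPY (not the paper's LMI).
import cvxpy as cp
import numpy as np

A  = np.array([[0.4, 0.1], [0.0, 0.3]])    # toy nominal matrix (assumed)
Ad = np.array([[0.1, 0.0], [0.05, 0.1]])   # toy delayed-term matrix (assumed)
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)    # Lyapunov matrix
Q = cp.Variable((n, n), symmetric=True)    # weight of the delayed sum term

# Block LMI obtained from Delta V < 0 for V(k) = x'Px + sum of x'Qx terms:
# [[A'PA - P + Q,  A'P Ad], [Ad'P A,  Ad'P Ad - Q]] < 0
M = cp.bmat([[A.T @ P @ A - P + Q, A.T @ P @ Ad],
             [Ad.T @ P @ A,        Ad.T @ P @ Ad - Q]])
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               M << -eps * np.eye(2 * n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print("LMI feasible, stability certified:", prob.status == cp.OPTIMAL)
```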

    Dynamical Behavior of Nonautonomous Stochastic Reaction-Diffusion Neural Network Models

    This brief investigates nonautonomous stochastic reaction-diffusion neural-network models with S-type distributed delays. First, the existence and uniqueness of the mild solution are studied under the Lipschitz condition without the linear growth condition. Due to the presence of a nonautonomous reaction-diffusion term and an infinite-dimensional Wiener process, the criteria for the well-posedness of the models are established based on evolution system theory. Then, the S-type distributed delay, which is an infinite delay, is handled by the truncation method, and sufficient conditions for global exponential stability are obtained by constructing a simple Lyapunov–Krasovskii functional candidate. Finally, neural-network examples and an illustrative example are given to show the applications of the obtained results.
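
    For readers unfamiliar with S-type distributed delays, the display below is an illustrative form (assumed notation, not necessarily the paper's exact model) of such an infinite-delay term and of the truncation idea mentioned above, where eta_ij is a nondecreasing function of bounded variation on (-infinity, 0]:

```latex
% Illustrative S-type distributed delay term and its truncation.
\text{S-type term: } \int_{-\infty}^{0} f_j\bigl(u_j(t+\theta,x)\bigr)\,\mathrm{d}\eta_{ij}(\theta),
\qquad
\text{truncated: } \int_{-r}^{0} f_j\bigl(u_j(t+\theta,x)\bigr)\,\mathrm{d}\eta_{ij}(\theta),
\quad r \to \infty .
```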

    Asymptotic stability for neural networks with mixed time-delays: The discrete-time case

    This paper is concerned with the stability analysis problem for a new class of discrete-time recurrent neural networks with mixed time-delays. The mixed time-delays, which consist of both discrete and distributed time-delays, are addressed, for the first time, when analyzing the asymptotic stability of discrete-time neural networks. The activation functions are not required to be differentiable or strictly monotonic. The existence of the equilibrium point is first proved under mild conditions. By constructing a new Lyapunov–Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the discrete-time neural networks to be globally asymptotically stable. As an extension, we further consider the stability analysis problem for the same class of neural networks but with state-dependent stochastic disturbances. All the conditions obtained are expressed in terms of LMIs whose feasibility can be easily checked by using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition. This work was supported in part by the Biotechnology and Biological Sciences Research Council (BBSRC) of the UK under Grants BB/C506264/1 and 100/EGM17735, the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grants GR/S27658/01 and EP/C524586/1, an International Joint Project sponsored by the Royal Society of the UK, the Natural Science Foundation of Jiangsu Province of China under Grant BK2007075, the National Natural Science Foundation of China under Grant 60774073, and the Alexander von Humboldt Foundation of Germany.
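
    As a point of reference, a representative discrete-time network with both a discrete time-varying delay and an infinitely distributed delay can be written as below; this is a generic form for illustration only, and the notation (A, W_0, W_1, W_2, mu_d, J) is assumed rather than taken from the paper:

```latex
% Generic discrete-time neural network with mixed time-delays (illustrative).
x(k+1) = A\,x(k) + W_0 f\bigl(x(k)\bigr) + W_1 f\bigl(x(k-\tau(k))\bigr)
       + W_2 \sum_{d=1}^{\infty} \mu_d\, f\bigl(x(k-d)\bigr) + J,
\qquad \sum_{d=1}^{\infty} \mu_d < \infty,
```

    where tau(k) is the bounded time-varying discrete delay and the infinite weighted sum models the distributed delay.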

    Stability analysis of discrete-time recurrent neural networks with stochastic delay

    This paper is concerned with the stability analysis of discrete-time recurrent neural networks (RNNs) with time delays treated as random variables drawn from some probability distribution. By introducing the variation probability of the time delay, a common delayed discrete-time RNN system is transformed into one with stochastic parameters. Improved conditions for the mean square stability of these systems are obtained by employing new Lyapunov functions, and novel techniques are used to achieve delay dependence. The merit of the proposed conditions lies in their reduced conservatism, which is made possible by considering not only the range of the time delays but also the variation probability distribution. A numerical example is provided to show the advantages of the proposed conditions.
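
    The "variation probability" device mentioned above can be illustrated as follows: when the random delay falls into one of two ranges with known probability, a Bernoulli variable can be introduced so that the delayed RNN becomes a system with stochastic parameters. The display below is a generic illustration with assumed notation, not the exact system of the paper:

```latex
% Illustrative Bernoulli reformulation of a randomly varying delay.
x(k+1) = A\,x(k) + \beta(k)\, B f\bigl(x(k-\tau_1(k))\bigr)
       + \bigl(1-\beta(k)\bigr)\, B f\bigl(x(k-\tau_2(k))\bigr),
\qquad \Pr\{\beta(k)=1\} = \mathbb{E}\{\beta(k)\} = \beta_0 .
```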

    Multi-almost periodicity and invariant basins of general neural networks under almost periodic stimuli

    In this paper, we investigate the convergence dynamics of $2^N$ almost periodic encoded patterns of general neural networks (GNNs) subjected to external almost periodic stimuli, including almost periodic delays. Invariant regions are established for the existence of $2^N$ almost periodic encoded patterns under two classes of activation functions. By employing the property of the $\mathscr{M}$-cone and an inequality technique, attracting basins are estimated and some criteria are derived for the networks to converge exponentially toward the $2^N$ almost periodic encoded patterns. The obtained results are new; they extend and generalize the corresponding results in the previous literature.

    Global asymptotic stability of nonautonomous Cohen-Grossberg neural network models with infinite delays

    For a general Cohen-Grossberg neural network model with potentially unbounded time-varying coefficients and infinite distributed delays, we give sufficient conditions for its global asymptotic stability. The model studied is general enough to include, as subclasses, most of the famous neural network models such as the Cohen-Grossberg, Hopfield, and bidirectional associative memory networks. Contrary to what is usual in the literature, we do not use Lyapunov functionals in the proofs. As illustrated, the results are applied to several concrete models studied in the literature, and a comparison shows that our results give new global stability criteria for several neural network models and improve some earlier publications. The second author's research was supported by the Research Centre of Mathematics of the University of Minho with Portuguese Funds from the "Fundação para a Ciência e a Tecnologia", through the project PEstOE/MAT/UI0013/2014. The authors thank the referee for valuable comments.
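
    For orientation, the classical Cohen-Grossberg form (written here with discrete delays only and with assumed notation a_i, b_i, c_ij, d_ij, I_i) is shown below; Hopfield networks are recovered, for instance, by taking a_i constant and b_i linear. This is an illustrative form, not the exact nonautonomous infinite-delay model of the paper:

```latex
% Classical Cohen-Grossberg form (illustrative; discrete delays only).
\dot{x}_i(t) = -a_i\bigl(x_i(t)\bigr)\Bigl[\, b_i\bigl(x_i(t)\bigr)
  - \sum_{j=1}^{n} c_{ij}\, f_j\bigl(x_j(t)\bigr)
  - \sum_{j=1}^{n} d_{ij}\, g_j\bigl(x_j(t-\tau_{ij})\bigr) - I_i \Bigr],
\qquad i = 1,\dots,n .
```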

    Nonlinear dynamics of full-range CNNs with time-varying delays and variable coefficients

    In this article, the dynamical behaviours of full-range cellular neural networks (FRCNNs) with variable coefficients and time-varying delays are considered. Firstly, an improved model of the FRCNNs is proposed, and the existence and uniqueness of the solution are studied by means of differential inclusions and set-valued analysis. Secondly, by using the Hardy inequality, matrix analysis, and the Lyapunov functional method, we obtain some criteria for global exponential stability (GES). Finally, some examples are provided to verify the correctness of the theoretical results.
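
    As a purely numerical illustration of the "full-range" idea (the state is constrained to the hypercube [-1, 1]^n), the sketch below integrates a small delayed network with a variable coefficient using an explicit Euler step followed by projection onto the hypercube. All coefficients, the delay, and the projection-based realization are assumptions for illustration, not the model or method of the article:

```python
# Hypothetical sketch: Euler simulation of a small "full-range"-style network,
# keeping the state inside [-1, 1]^n by projection at every step.
import numpy as np

n, h, steps = 3, 0.01, 5000
tau_max = 0.5                       # assumed bound on the time-varying delay
hist = int(tau_max / h)             # length of the history buffer

rng = np.random.default_rng(0)
A = rng.uniform(-0.5, 0.5, (n, n))  # toy instantaneous interconnections
B = rng.uniform(-0.3, 0.3, (n, n))  # toy delayed interconnections

x = np.zeros((steps + hist + 1, n))
x[: hist + 1] = rng.uniform(-1, 1, n)            # constant initial function

for k in range(hist, hist + steps):
    t = (k - hist) * h
    a_t = 1.0 + 0.2 * np.sin(t)                  # variable self-decay coefficient
    tau_t = 0.25 * (1.0 + np.sin(2.0 * t))       # time-varying delay in [0, 0.5]
    d = int(round(tau_t / h))
    xdot = -a_t * x[k] + A @ x[k] + B @ x[k - d] + 0.1
    x[k + 1] = np.clip(x[k] + h * xdot, -1.0, 1.0)   # "full-range" projection

print("final state:", x[-1])
```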

    Synchronization between Bidirectional Coupled Nonautonomous Delayed Cohen-Grossberg Neural Networks

    Based on a suitable Lyapunov function and the properties of M-matrices, sufficient conditions for complete synchronization of bidirectionally coupled nonautonomous Cohen-Grossberg neural networks are obtained. The method for analysing synchronization avoids the complicated error system of the Cohen-Grossberg neural networks. Two numerical examples are given to show the effectiveness of the proposed synchronization method.
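
    Since the synchronization criteria above are stated via M-matrix properties, the following is a small, self-contained sketch of how such a condition can be checked numerically; the characterization used (a Z-matrix whose eigenvalues all have positive real part) is a standard one, and the test matrix is a toy example rather than one from the paper:

```python
# Hypothetical sketch: numerical test for a nonsingular M-matrix.
import numpy as np

def is_nonsingular_m_matrix(m: np.ndarray, tol: float = 1e-12) -> bool:
    """Z-matrix (non-positive off-diagonal entries) whose eigenvalues all
    have positive real part -- one standard characterization."""
    off_diag = m - np.diag(np.diag(m))
    if np.any(off_diag > tol):                 # positive off-diagonal entry
        return False
    return bool(np.all(np.linalg.eigvals(m).real > tol))

M = np.array([[ 2.0, -0.5, -0.3],
              [-0.4,  1.5, -0.2],
              [-0.1, -0.6,  1.8]])             # strictly diagonally dominant Z-matrix
print(is_nonsingular_m_matrix(M))              # True for this toy example
```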
    • …