
    Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis

    This Letter is concerned with the analysis problem of exponential stability for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the Lipschitz conditions commonly used in recent work. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, and the feasibility of such an LMI can be checked easily with the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
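
    As a rough illustration of how such an LMI feasibility test can be run numerically, the Python sketch below poses a much-simplified, delay-independent condition for the linear delayed part x(k+1) = A x(k) + B x(k - d(k)) and checks it with CVXPY rather than the Matlab LMI Toolbox mentioned above; the matrices A and B and the Lyapunov-Krasovskii candidate are illustrative assumptions, not the condition derived in the Letter.

    # Minimal sketch, assuming a linear delayed system x(k+1) = A x(k) + B x(k - d(k));
    # the Letter's actual LMI also accounts for the nonlinear activation terms and the
    # time-varying delay bounds. A and B below are hypothetical placeholders.
    import numpy as np
    import cvxpy as cp

    n = 2
    A = np.array([[0.4, 0.1], [0.0, 0.3]])    # hypothetical state matrix
    B = np.array([[0.1, 0.05], [0.05, 0.1]])  # hypothetical delayed-connection matrix

    P = cp.Variable((n, n), symmetric=True)
    Q = cp.Variable((n, n), symmetric=True)

    # Block LMI obtained from the candidate functional
    # V(k) = x(k)' P x(k) + sum_{i=k-d}^{k-1} x(i)' Q x(i)  (constant-delay simplification)
    M = cp.bmat([[A.T @ P @ A - P + Q, A.T @ P @ B],
                 [B.T @ P @ A,         B.T @ P @ B - Q]])

    eps = 1e-6
    constraints = [P >> eps * np.eye(n),
                   Q >> eps * np.eye(n),
                   M << -eps * np.eye(2 * n)]
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()
    print("LMI status:", prob.status)  # "optimal" means the LMI is feasible

    If the solver reports a feasible (optimal) status, the simplified delayed system is asymptotically stable for any constant delay; the Letter's conditions refine this idea for the full nonlinear model with a time-varying delay.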

    Robust synchronization of an array of coupled stochastic discrete-time delayed neural networks

    This paper is concerned with the robust synchronization problem for an array of coupled stochastic discrete-time neural networks with time-varying delay. Each individual neural network is subject to parameter uncertainty, stochastic disturbance, and time-varying delay, where the norm-bounded parameter uncertainties exist in both the state and weight matrices, the stochastic disturbance takes the form of a scalar Wiener process, and the time delay enters into the activation function. For the array of coupled neural networks, constant coupling and delayed coupling are considered simultaneously. We aim to establish easy-to-verify conditions under which the addressed neural networks are synchronized. By using the Kronecker product as an effective tool, a linear matrix inequality (LMI) approach is developed to derive several sufficient criteria ensuring that the coupled delayed neural networks are globally, robustly, and exponentially synchronized in the mean square. The LMI-based conditions obtained depend not only on the lower bound but also on the upper bound of the time-varying delay, and can be solved efficiently via the Matlab LMI Toolbox. Two numerical examples are given to demonstrate the usefulness of the proposed synchronization scheme.
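
    As a minimal sketch of the Kronecker-product bookkeeping mentioned above, the Python snippet below assembles the compact state matrix of an array of N identical linear nodes; the matrices A, Gamma, and G are hypothetical, and the uncertainty, stochastic, and delayed-coupling terms treated in the paper are omitted.

    # Minimal sketch: compact form of N coupled identical linear nodes via the
    # Kronecker product. All matrices are hypothetical placeholders.
    import numpy as np

    N, n = 3, 2                              # number of nodes, state dimension per node
    A = np.array([[0.5, 0.1], [0.0, 0.4]])   # hypothetical nodal state matrix
    Gamma = np.eye(n)                        # inner (per-state-variable) coupling matrix
    G = np.array([[-2.0,  1.0,  1.0],        # outer coupling matrix with zero row sums
                  [ 1.0, -2.0,  1.0],        # (ring of three nodes)
                  [ 1.0,  1.0, -2.0]])

    # Stacked dynamics of the coupled linear part: x(k+1) = (I_N (x) A + G (x) Gamma) x(k)
    A_array = np.kron(np.eye(N), A) + np.kron(G, Gamma)
    print(A_array.shape)  # (N*n, N*n), i.e. (6, 6)

    Writing the array in this stacked form is what allows a single LMI in the stacked state to certify synchronization of all N nodes at once.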

    Delay-dependent criterion for exponential stability analysis of neural networks with time-varying delays

    This note investigates the problem of exponential stability of neural networks with time-varying delays. To derive a less conservative stability condition, a novel augmented Lyapunov-Krasovskii functional (LKF) that includes triple- and quadruple-integral terms is employed. To reduce the complexity of the stability test, the convex combination method is utilized to derive an improved delay-dependent stability criterion in the form of linear matrix inequalities (LMIs). The superiority of the proposed approach is demonstrated by two comparative examples.
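
    For concreteness, a representative augmented functional of this type (not necessarily the exact one employed in the note), written for a constant delay bound h > 0, is

    V(t) = x^{T}(t) P x(t)
         + \int_{t-h}^{t} x^{T}(s) Q x(s)\, ds
         + \int_{-h}^{0}\int_{\theta}^{0}\int_{t+\lambda}^{t} \dot{x}^{T}(s) R \dot{x}(s)\, ds\, d\lambda\, d\theta
         + \int_{-h}^{0}\int_{\theta}^{0}\int_{\lambda}^{0}\int_{t+\mu}^{t} \dot{x}^{T}(s) S \dot{x}(s)\, ds\, d\mu\, d\lambda\, d\theta,

    with P, Q, R, S symmetric positive definite; the triple- and quadruple-integral terms are what introduce the additional delay information that typically reduces conservatism.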

    New delay-dependent stability criteria for recurrent neural networks with time-varying delays

    This work is concerned with the delay-dependent stability problem for recurrent neural networks with time-varying delays. A new, improved delay-dependent stability criterion expressed in terms of linear matrix inequalities is derived by constructing a dedicated Lyapunov-Krasovskii functional and utilizing the Wirtinger inequality and a convex combination approach. Moreover, a further improved delay-dependent stability criterion is established by means of a new partitioning method for the bounding conditions on the activation function, together with certain new activation function conditions. Finally, the application of these novel results to an illustrative example from the literature is investigated, and their effectiveness is shown via comparison with existing recent results.
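
    For reference, the Wirtinger-based integral inequality commonly used in this setting (the exact variant employed in the paper may differ) states that, for any matrix R = R^{T} > 0, any a < b, and any differentiable vector function x,

    \int_{a}^{b} \dot{x}^{T}(s) R \dot{x}(s)\, ds \;\ge\; \frac{1}{b-a}\left( \chi_{1}^{T} R \chi_{1} + 3\, \chi_{2}^{T} R \chi_{2} \right),
    \qquad
    \chi_{1} = x(b) - x(a), \qquad
    \chi_{2} = x(b) + x(a) - \frac{2}{b-a}\int_{a}^{b} x(s)\, ds.

    Because the classical Jensen inequality discards the \chi_{2} term, bounds of this type are tighter and typically translate into larger admissible delay bounds in the resulting LMIs.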