    Restrictions and Stability of Time-Delayed Dynamical Networks

    This paper deals with the global stability of time-delayed dynamical networks. We show that for a time-delayed dynamical network with non-distributed delays, the network and the corresponding non-delayed network are either both globally stable or both unstable. We demonstrate that this may not be the case if the network's delays are distributed. The main tool in our analysis is a new procedure of dynamical network restrictions, which is useful in that it allows for improved estimates of a dynamical network's global stability. Moreover, it is a computationally simpler and much more effective means of analyzing the stability of dynamical networks than the procedure of isospectral network expansions introduced in [Isospectral graph transformations, spectral equivalence, and global stability of dynamical networks. Nonlinearity, 25 (2012) 211-254]. The effectiveness of our approach is illustrated by applications to various classes of Cohen-Grossberg neural networks. (Comment: 32 pages, 9 figures)
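    As a toy illustration of the delay-independence phenomenon described above (a hypothetical scalar example, not the paper's restriction procedure), consider x'(t) = -a*x(t) + b*tanh(x(t - tau)): for a > |b| the origin is globally stable for every constant, non-distributed delay tau. A minimal Euler-integration sketch in Python:

```python
import math
from collections import deque

def simulate_delayed_unit(a=2.0, b=1.0, tau=1.0, x0=5.0,
                          dt=0.001, t_end=30.0):
    """Euler-integrate x'(t) = -a*x(t) + b*tanh(x(t - tau)).

    Toy model (not the paper's method): for a > |b| the origin is
    globally stable for every constant, non-distributed delay tau.
    """
    lag = int(round(tau / dt))                         # delay in steps
    history = deque([x0] * (lag + 1), maxlen=lag + 1)  # x on [t - tau, t]
    x = x0
    for _ in range(int(t_end / dt)):
        x_delayed = history[0]                         # oldest entry = x(t - tau)
        x = x + dt * (-a * x + b * math.tanh(x_delayed))
        history.append(x)                              # evicts the oldest entry
    return x
```

    Changing tau (e.g. tau=3.0) does not change the outcome here: the trajectory still decays to the unique equilibrium x* = 0, mirroring the stability equivalence for non-distributed delays that the abstract describes.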

    Global exponential stability of nonautonomous neural network models with continuous distributed delays

    For a family of non-autonomous differential equations with distributed delays, we give sufficient conditions for the global exponential stability of an equilibrium point. This family includes most of the delayed Hopfield-type neural network models with time-varying coefficients and distributed delays, and for these models we establish sufficient conditions for their global exponential stability. The existence and global exponential stability of a periodic solution is also addressed. A comparison shows that these results are general, new, and extend some earlier publications. Funded by Fundação para a Ciência e a Tecnologia (FCT).

    Global exponential stability of generalized recurrent neural networks with discrete and distributed delays

    This is the post-print version of the article; the official published version can be obtained from the link below. Copyright 2006 Elsevier Ltd. This paper is concerned with the analysis problem of the global exponential stability of a class of recurrent neural networks (RNNs) with mixed discrete and distributed delays. We first prove the existence and uniqueness of the equilibrium point under mild conditions, assuming neither differentiability nor strict monotonicity for the activation function. Then, by employing a new Lyapunov–Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the RNNs to be globally exponentially stable. The global exponential stability of the delayed RNNs can therefore be checked easily with the numerically efficient Matlab LMI toolbox, and no tuning of parameters is required. A simulation example is exploited to show the usefulness of the derived LMI-based stability conditions. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, and the Alexander von Humboldt Foundation of Germany.
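    The LMI route sketched in the abstract above ultimately reduces to certifying that certain symmetric matrices are positive definite. As a hypothetical, stripped-down illustration (Sylvester's criterion applied to a tiny Lyapunov inequality; not the paper's actual LMI, which is solved with the Matlab LMI toolbox):

```python
def det(m):
    """Determinant by Laplace expansion (fine for tiny matrices)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def is_positive_definite(m):
    """Sylvester's criterion: every leading principal minor is > 0."""
    return all(det([row[:k] for row in m[:k]]) > 0
               for k in range(1, len(m) + 1))

# Hypothetical 2x2 'stability certificate': for the Hurwitz matrix
# A = [[-3, 1], [0, -2]] and the trial solution P = I, check that
# Q = -(A^T P + P A) is positive definite, i.e. that the Lyapunov
# inequality A^T P + P A < 0 holds.
A = [[-3.0, 1.0], [0.0, -2.0]]
Q = [[-2.0 * A[0][0], -(A[1][0] + A[0][1])],
     [-(A[0][1] + A[1][0]), -2.0 * A[1][1]]]
feasible = is_positive_definite(Q)  # True
```

    Real LMI solvers (the Matlab LMI toolbox mentioned above, or interior-point SDP codes) search for P rather than checking a fixed guess, but the feasibility question they answer has this same shape.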

    On global asymptotic stability of neural networks with discrete and distributed delays

    This is the post-print version of the article; the official published version can be obtained from the link below. Copyright 2005 Elsevier Ltd. In this Letter, the global asymptotic stability analysis problem is investigated for a class of neural networks with discrete and distributed time-delays. The purpose of the problem is to determine the asymptotic stability by employing some easy-to-test conditions. It is shown, via Lyapunov–Krasovskii stability theory, that the class of neural networks under consideration is globally asymptotically stable if a quadratic matrix inequality involving several parameters is feasible. Furthermore, a linear matrix inequality (LMI) approach is exploited to transform the addressed stability analysis problem into a convex optimization problem, and sufficient conditions for the neural networks to be globally asymptotically stable are then derived in terms of a linear matrix inequality, which can be readily solved using the Matlab LMI toolbox. Two numerical examples are provided to show the usefulness of the proposed global stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, and the Alexander von Humboldt Foundation of Germany.

    Stochastic stability of uncertain Hopfield neural networks with discrete and distributed delays

    This is the post-print version of the article; the official published version can be obtained from the link below. Copyright 2006 Elsevier Ltd. This Letter is concerned with the global asymptotic stability analysis problem for a class of uncertain stochastic Hopfield neural networks with discrete and distributed time-delays. By utilizing a Lyapunov–Krasovskii functional, using the well-known S-procedure, and conducting stochastic analysis, we show that the addressed neural networks are robustly, globally, asymptotically stable if a convex optimization problem is feasible. The stability criteria are then derived in terms of linear matrix inequalities (LMIs), which can be effectively solved by standard numerical packages. The main results are also extended to the multiple time-delay case. Two numerical examples are given to demonstrate the usefulness of the proposed global stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, and the Alexander von Humboldt Foundation of Germany.

    Global stability of Hopfield neural networks under dynamical thresholds with distributed delays

    We study the dynamical behavior of a class of Hopfield neural networks with distributed delays under dynamical thresholds. Some new criteria ensuring the existence, uniqueness, and global asymptotic stability of the equilibrium point are derived. In these results, we do not require the activation functions to satisfy the Lipschitz condition, nor to be bounded, differentiable, or monotone nondecreasing. Moreover, symmetry of the connection matrix is not necessary either. Thus, our results improve some previous works in the literature. These conditions are of great importance in the design and application of globally asymptotically stable Hopfield neural networks involving distributed delays under dynamical thresholds.
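    For the common exponential delay kernel K(s) = gamma*e^(-gamma*s), the distributed-delay term y(t) = \int_0^inf K(s) x(t - s) ds satisfies the auxiliary ODE y' = gamma*(x - y) (the "linear chain trick"), so a distributed-delay Hopfield unit can be simulated with plain ODEs. A hypothetical one-neuron sketch, not the exact model studied above:

```python
import math

def hopfield_distributed(a=2.0, w=1.0, gamma=1.0, x0=4.0,
                         dt=0.001, t_end=40.0):
    """x'(t) = -a*x(t) + w*tanh(y(t)), where y is the exponentially
    distributed delay y(t) = int_0^inf gamma*e^(-gamma*s) x(t-s) ds.

    The linear chain trick replaces the integral by y' = gamma*(x - y);
    the constant history x(t) = x0 for t <= 0 gives y(0) = x0.
    """
    x, y = x0, x0
    for _ in range(int(t_end / dt)):
        dx = -a * x + w * math.tanh(y)
        dy = gamma * (x - y)
        x, y = x + dt * dx, y + dt * dy
    return x
```

    With a = 2 > w = 1 the only equilibrium is x* = 0, and the simulated state decays to it for any gamma > 0, consistent with the kind of delay-kernel-robust global stability the criteria above are after.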

    A New Globally Exponential Stability Criterion for Neural Networks with Discrete and Distributed Delays

    This paper concerns the problem of the global exponential stability of neural networks with discrete and distributed delays. A novel criterion for the global exponential stability of neural networks is derived by employing Lyapunov stability theory, homomorphic mapping theory, and matrix theory. The proposed result improves upon previously reported global stability results. Finally, two illustrative numerical examples are given to show the effectiveness of our results.

    Anti-periodic solution for fuzzy Cohen–Grossberg neural networks with time-varying and distributed delays

    In this paper, by using a continuation theorem of coincidence degree theory and a differential inequality, we establish sufficient conditions ensuring the existence and global exponential stability of anti-periodic solutions for a class of fuzzy Cohen–Grossberg neural networks with time-varying and distributed delays. In addition, we present an illustrative example to show the feasibility of the obtained results.
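    An anti-periodic solution satisfies x(t + T) = -x(t), and is therefore 2T-periodic. As a toy scalar illustration (hypothetical; not the fuzzy Cohen–Grossberg model above): every solution of x' = -x + cos t is attracted to x*(t) = (cos t + sin t)/2, which is anti-periodic with T = pi because the forcing obeys cos(t + pi) = -cos(t):

```python
import math

def x_at(t_final, x0=3.0, dt=1e-4):
    """Euler-integrate x'(t) = -x(t) + cos(t) from x(0) = x0 and
    return x(t_final).

    Since cos(t + pi) = -cos(t), the attracting steady state
    x*(t) = (cos t + sin t)/2 satisfies x*(t + pi) = -x*(t).
    """
    x, t = x0, 0.0
    for _ in range(int(t_final / dt)):
        x += dt * (-x + math.cos(t))
        t += dt
    return x

# Once the e^(-t) transient has died out, x(t + pi) is close to -x(t):
a, b = x_at(20.0), x_at(20.0 + math.pi)
```

    The existence results above play the same role for the much harder fuzzy Cohen–Grossberg setting: they guarantee that such an attracting anti-periodic solution exists and that all trajectories converge to it exponentially.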