52 research outputs found

    pth moment exponential stability of stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays

    In this paper, stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays are investigated. By using a Lyapunov function and the Itô differential formula, some sufficient conditions for the pth moment exponential stability of such stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays are established. An example is given to illustrate the feasibility of the main theoretical findings, and the paper ends with a brief conclusion.
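    The pth moment exponential stability studied above requires E|x(t)|^p to decay exponentially for a stochastic delayed network. A minimal Monte Carlo sketch, using an Euler–Maruyama scheme on a hypothetical scalar stochastic delay equation (the model, parameters, and constant initial history below are illustrative, not taken from the paper):

```python
import numpy as np

# Hypothetical scalar example: dx = (-a*x(t) + b*x(t - tau)) dt + sigma*x(t) dW(t),
# with constant history x(s) = 1 for s <= 0. All parameters are illustrative.
rng = np.random.default_rng(0)
a, b, sigma, tau = 2.0, 0.5, 0.3, 1.0
dt, T, paths, p = 0.01, 10.0, 2000, 2
lag = int(tau / dt)          # number of grid points covering the delay
steps = int(T / dt)

x = np.ones((paths, steps + lag + 1))
for k in range(lag, lag + steps):
    drift = -a * x[:, k] + b * x[:, k - lag]      # delayed drift term
    dW = rng.normal(0.0, np.sqrt(dt), paths)      # Brownian increments
    x[:, k + 1] = x[:, k] + drift * dt + sigma * x[:, k] * dW

moment = np.mean(np.abs(x) ** p, axis=0)          # Monte Carlo estimate of E|x(t)|^p
print(moment[lag], moment[-1])                    # the p-th moment should decay toward 0
```

    The observed decay of the sample moment is only an empirical illustration; the paper's Lyapunov/Itô conditions are what guarantee the decay analytically.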

    New Stability Criterion for Takagi-Sugeno Fuzzy Cohen-Grossberg Neural Networks with Probabilistic Time-Varying Delays

    A new global asymptotic stability criterion for Takagi-Sugeno fuzzy Cohen-Grossberg neural networks with probabilistic time-varying delays is derived, in which the diffusion term is allowed to play its role. Because the boundedness conditions on the amplification functions are removed, the main result is novel to some extent. There is a further methodological novelty: the Lyapunov-Krasovskii functional is a positive definite form of p powers, which differs from those in the existing literature. Moreover, a numerical example illustrates the effectiveness of the proposed methods

    Novel Lagrange sense exponential stability criteria for time-delayed stochastic Cohen–Grossberg neural networks with Markovian jump parameters: A graph-theoretic approach

    This paper concerns exponential stability in the Lagrange sense for a class of stochastic Cohen–Grossberg neural networks (SCGNNs) with Markovian jump and mixed time-delay effects. A systematic approach to constructing a global Lyapunov function for SCGNNs with mixed time delays and Markovian jumping is provided by combining the Lyapunov method with results from graph theory. Moreover, by using some inequality techniques in Lyapunov-type and coefficient-type theorems, we obtain two kinds of sufficient conditions ensuring global exponential stability (GES) in the Lagrange sense for the addressed SCGNNs. Finally, some examples with numerical simulations are given to demonstrate the effectiveness of the acquired results

    LMI Approach to Exponential Stability and Almost Sure Exponential Stability for Stochastic Fuzzy Markovian-Jumping Cohen-Grossberg Neural Networks with Nonlinear p-Laplace Diffusion

    The robust exponential stability of delayed fuzzy Markovian-jumping Cohen-Grossberg neural networks (CGNNs) with nonlinear p-Laplace diffusion is studied. The fuzzy mathematical model makes it difficult to set up LMI criteria for stability, and the stochastic functional differential equation model with nonlinear diffusion makes it harder still. To study the stability of fuzzy CGNNs with diffusion, we have to construct a Lyapunov-Krasovskii functional in non-matrix form, whereas stochastic mathematical formulae are always described in matrix form. By way of variational methods in W1,p(Ω), the Itô formula, the Dynkin formula, the semi-martingale convergence theorem, the Schur Complement Theorem, and LMI techniques, LMI-based criteria for robust exponential stability and almost sure robust exponential stability are finally obtained; their feasibility can be efficiently computed and confirmed with the MATLAB LMI toolbox. It is worth mentioning that even corollaries of the main results of this paper improve some recent related results. Moreover, some numerical examples are presented to illustrate the effectiveness and reduced conservatism of the proposed method, owing to the significant improvement in the allowable upper bounds of the time delays
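    The Schur Complement Theorem invoked above is the standard device for turning a nonlinear matrix inequality into an LMI: for symmetric blocks, [[Q, S], [S^T, R]] ≻ 0 holds iff R ≻ 0 and Q − S R⁻¹ S^T ≻ 0. A minimal NumPy sketch with illustrative matrices (not the paper's actual criterion matrices):

```python
import numpy as np

def is_pos_def(M, tol=1e-9):
    """Check symmetric positive definiteness via the eigenvalues of the symmetric part."""
    return bool(np.all(np.linalg.eigvalsh((M + M.T) / 2) > tol))

# Illustrative matrices, not taken from the paper's stability criteria.
Q = np.array([[4.0, 1.0], [1.0, 3.0]])
R = np.array([[5.0, 0.0], [0.0, 2.0]])
S = np.array([[1.0, 0.5], [0.0, 1.0]])

# Schur complement equivalence: block > 0  iff  R > 0 and Q - S R^{-1} S^T > 0.
block = np.block([[Q, S], [S.T, R]])
lhs = is_pos_def(block)
rhs = is_pos_def(R) and is_pos_def(Q - S @ np.linalg.inv(R) @ S.T)
print(lhs, rhs)   # the two feasibility tests agree
```

    In practice, a dedicated LMI solver (e.g. the MATLAB LMI toolbox mentioned in the abstract) searches over the unknown matrices; the check above only verifies feasibility for fixed candidates.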

    Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis

    This is the post-print version of the article; the official published version can be obtained from the link below. Copyright 2007 Elsevier Ltd. This Letter is concerned with the exponential stability analysis problem for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic; furthermore, the description of the activation functions is more general than the commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, where the feasibility of such an LMI can be easily checked by using the numerically efficient MATLAB LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119)
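    For a discrete-time system, the kind of LMI condition described above reduces, in the linear case x(k+1) = A x(k), to finding P ≻ 0 with AᵀPA − P ≺ 0. A minimal NumPy sketch that builds such a P by the convergent Lyapunov series (the matrix A is a hypothetical stable example, not from the Letter):

```python
import numpy as np

# A hypothetical stable discrete-time state matrix (spectral radius < 1).
A = np.array([[0.5, 0.1], [-0.2, 0.4]])
Q = np.eye(2)

# Solve the discrete Lyapunov equation A^T P A - P = -Q via the series
# P = sum_k (A^T)^k Q A^k, which converges exactly when rho(A) < 1.
P = np.zeros_like(Q)
term = Q.copy()
for _ in range(200):
    P += term
    term = A.T @ term @ A

residual = A.T @ P @ A - P + Q
print(np.linalg.eigvalsh(P))      # eigenvalues of P are positive: P > 0
print(np.max(np.abs(residual)))   # residual ~ 0: A^T P A - P = -Q < 0 is satisfied
```

    General-purpose LMI solvers handle the full delayed, nonlinear criteria of the Letter; the series construction above only illustrates why feasibility of the LMI certifies exponential decay in the simplest case.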

    Stability Analysis for Stochastic Markovian Jump Reaction-Diffusion Neural Networks with Partially Known Transition Probabilities and Mixed Time Delays

    The stability problem is addressed for a new class of stochastic Markovian jump reaction-diffusion neural networks with partially known transition probabilities and mixed time delays. New stability conditions are established in terms of linear matrix inequalities (LMIs). To reduce the conservatism of the stability conditions, an improved Lyapunov-Krasovskii functional and free-connection weighting matrices are introduced. The obtained results depend on the delays and on the measure of the space and are therefore less conservative than delay-independent and space-independent ones. An example is given to show the effectiveness of the obtained results

    Further analysis of stability of uncertain neural networks with multiple time delays

    This paper studies the robust stability of uncertain neural networks with multiple time delays with respect to the class of nondecreasing activation functions. By using the Lyapunov functional and homeomorphism mapping theorems, we derive a new delay-independent sufficient condition for the existence, uniqueness, and global asymptotic stability of the equilibrium point of delayed neural networks with uncertain network parameters. The condition obtained for robust stability establishes a matrix-norm relationship between the network parameters of the neural system and can therefore be easily verified. We also present some constructive numerical examples to compare the proposed result with results in the previously published literature. These comparisons show that our new condition can be considered an alternative to the previous literature results, as it defines a new set of network parameters ensuring the robust stability of delayed neural networks.
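    A matrix-norm relationship of the kind described above is easy to verify numerically. The sketch below checks a classical norm-based sufficient condition for a delayed Hopfield-type model x' = -Cx + Af(x) + Bf(x(t-τ)) + u with ℓ-Lipschitz activations, namely ℓ(‖A‖₂ + ‖B‖₂) < min c_i; this template and all matrices are illustrative assumptions, not the paper's exact criterion:

```python
import numpy as np

# Hypothetical network data; the inequality below is a textbook norm-based
# sufficient condition, not necessarily the one derived in this paper.
C = np.diag([2.0, 2.5])                  # self-feedback rates c_i > 0
A = np.array([[0.3, -0.2], [0.1, 0.4]])  # connection weight matrix
B = np.array([[0.2, 0.1], [-0.1, 0.2]])  # delayed connection weight matrix
ell = 1.0                                # Lipschitz constant of the activations

# Spectral-norm test: ell * (||A||_2 + ||B||_2) < min_i c_i.
lhs = ell * (np.linalg.norm(A, 2) + np.linalg.norm(B, 2))
robustly_stable = bool(lhs < np.min(np.diag(C)))
print(lhs, robustly_stable)
```

    The appeal of norm-based conditions, as the abstract notes, is exactly this: the check is a handful of norm computations rather than a search over Lyapunov matrices.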