
    Stability analysis for delayed quaternion-valued neural networks via nonlinear measure approach

    In this paper, the existence of equilibria and the stability of quaternion-valued neural networks (QVNNs) with time delay are considered. First, the QVNNs are equivalently transformed into four real-valued systems. Then, based on Lyapunov theory, the nonlinear measure approach, and inequality techniques, some sufficient criteria are derived that ensure the existence and uniqueness of the equilibrium point as well as the global stability of delayed QVNNs. In addition, the criteria are presented in the form of linear matrix inequalities (LMIs), which can be easily checked with the LMI toolbox in MATLAB. Finally, two simulation examples are presented to verify the effectiveness of the obtained results, and two comparison examples show that the obtained results are less conservative.
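    The reduction of a QVNN to four real-valued systems rests on the standard real 4×4 representation of quaternion multiplication: each quaternion state is stored as four real components, and left-multiplication by a quaternion weight becomes a real matrix acting on that component vector. A minimal sketch in Python with illustrative numbers (not the paper's networks):

    ```python
    import numpy as np

    def quat_to_real_matrix(q):
        """Real 4x4 representation of left-multiplication by the quaternion
        q = (q0, q1, q2, q3) = q0 + q1*i + q2*j + q3*k (Hamilton product)."""
        q0, q1, q2, q3 = q
        return np.array([
            [q0, -q1, -q2, -q3],
            [q1,  q0, -q3,  q2],
            [q2,  q3,  q0, -q1],
            [q3, -q2,  q1,  q0],
        ])

    # quaternion product w * x computed via the real representation
    w = np.array([1.0, 2.0, 0.5, -1.0])   # illustrative quaternion weight
    x = np.array([0.0, 1.0, 1.0, 0.0])    # illustrative quaternion state
    wx = quat_to_real_matrix(w) @ x
    ```

    Applying this componentwise to every weight and state of a QVNN yields the four coupled real-valued systems the abstract refers to, at which point real-valued Lyapunov and LMI machinery applies directly.
    
    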

    LMI Approach to Exponential Stability and Almost Sure Exponential Stability for Stochastic Fuzzy Markovian-Jumping Cohen-Grossberg Neural Networks with Nonlinear p-Laplace Diffusion

    The robust exponential stability of delayed fuzzy Markovian-jumping Cohen-Grossberg neural networks (CGNNs) with nonlinear p-Laplace diffusion is studied. The fuzzy mathematical model makes it difficult to set up LMI stability criteria, and the stochastic functional differential equation model with nonlinear diffusion makes it harder still. To study the stability of fuzzy CGNNs with diffusion, a Lyapunov-Krasovskii functional must be constructed in non-matrix form, whereas stochastic formulae are usually written in matrix form. By means of variational methods in W1,p(Ω), the Itô formula, the Dynkin formula, the semi-martingale convergence theorem, the Schur complement theorem, and LMI techniques, LMI-based criteria for robust exponential stability and almost sure exponential robust stability are obtained; their feasibility can be efficiently computed and confirmed with the MATLAB LMI toolbox. It is worth mentioning that even corollaries of the main results improve some recent related results. Moreover, numerical examples illustrate the effectiveness and reduced conservatism of the proposed method, owing to a significant improvement in the allowable upper bounds of the time delays.

    Systems of differential equations and neural networks with delays and impulses

    Department of Mathematics & Statistics, College of Science, Sultan Qaboos University, Muscat, Sultanate of Oman, and IMI-BAS (Institute of Mathematics and Informatics, Bulgarian Academy of Sciences), 16 June 2014; award of the degree "Doctor of Sciences" to Valery Covachev in the scientific specialty 01.01.13, Mathematical Modelling and Applications of Mathematics. [Covachev Valery Hristov; Ковачев Валерий Христов]

    Stability and dissipativity analysis of static neural networks with time delay

    This paper is concerned with stability and dissipativity analysis for static neural networks (NNs) with time delay. Improved delay-dependent stability criteria are established for static NNs with time-varying or time-invariant delay using the delay partitioning technique. Based on these criteria, several delay-dependent sufficient conditions are given that guarantee the dissipativity of static NNs with time delay. All results in this paper depend not only on the time delay but also on the number of delay partitions. Examples are given to illustrate the effectiveness and reduced conservatism of the proposed results.
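    Several abstracts above state their criteria as LMIs checkable with MATLAB's LMI toolbox. The simplest instance of the underlying idea, the Lyapunov inequality A^T P + P A ≺ 0 for an undelayed linear system, can be sketched outside MATLAB with SciPy by solving the corresponding Lyapunov equation and testing positive definiteness; the matrix A below is illustrative only, not from any of the papers:

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    # Hurwitz system matrix of a small linear test system (illustrative data)
    A = np.array([[-2.0, 1.0],
                  [0.0, -3.0]])

    # Solve the Lyapunov equation A^T P + P A = -I for P
    # (solve_continuous_lyapunov solves a X + X a^H = q, so pass a = A^T).
    P = solve_continuous_lyapunov(A.T, -np.eye(2))

    # The system x' = A x is exponentially stable iff P is symmetric
    # positive definite, i.e. the LMI {P > 0, A^T P + P A < 0} is feasible.
    stable = bool(np.all(np.linalg.eigvalsh(P) > 0))
    ```

    The delayed, fuzzy, and stochastic criteria in the papers above are genuinely richer LMIs (with multiple decision matrices), but feasibility checking follows this same pattern.
    
    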

    Impulsive stabilization of stochastic functional differential equations

    This paper investigates impulsive stabilization of stochastic delay differential equations. Both moment and almost sure exponential stability criteria are established using the Lyapunov-Razumikhin method. It is shown that an unstable stochastic delay system can be successfully stabilized by impulses. The results can be easily applied to stochastic systems with arbitrarily large delays. An example with its numerical simulation is presented to illustrate the main results.
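    The stabilization mechanism can be illustrated numerically: an Euler-Maruyama simulation of a scalar equation with unstable drift and delayed multiplicative noise, with and without contracting impulses x(t_k+) = c·x(t_k). All coefficients are illustrative, not the paper's example:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    a, b = 0.5, 0.2              # unstable drift, delayed diffusion (illustrative)
    tau, dt, T = 0.5, 0.01, 20.0
    c, h = 0.5, 1.0              # impulse gain |c| < 1, applied every h seconds

    d = int(round(tau / dt))     # delay in steps
    n = int(round(T / dt))
    m = int(round(h / dt))       # steps between impulses

    x_free = np.ones(n + d + 1)  # trajectory without impulses
    x_imp = np.ones(n + d + 1)   # trajectory with impulses (same noise path)
    for k in range(d, n + d):
        dW = rng.normal(0.0, np.sqrt(dt))   # shared Brownian increment
        # Euler-Maruyama step for dx = a*x(t) dt + b*x(t - tau) dW
        x_free[k + 1] = x_free[k] + a * x_free[k] * dt + b * x_free[k - d] * dW
        x_imp[k + 1] = x_imp[k] + a * x_imp[k] * dt + b * x_imp[k - d] * dW
        if (k + 1 - d) % m == 0:            # impulse instants t = h, 2h, ...
            x_imp[k + 1] *= c
    ```

    Per unit time the free solution grows roughly like e^a while each impulse multiplies the state by c; since e^a · c < 1 here, the impulsed trajectory contracts toward zero while the free one diverges, which is the qualitative content of the stabilization result.
    
    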

    Stability analysis of discrete-time recurrent neural networks with stochastic delay

    This paper is concerned with the stability analysis of discrete-time recurrent neural networks (RNNs) whose time delays are random variables drawn from some probability distribution. By introducing the variation probability of the time delay, a common delayed discrete-time RNN system is transformed into one with stochastic parameters. Improved conditions for the mean square stability of these systems are obtained by employing new Lyapunov functions, and novel techniques are used to achieve delay dependence. The merit of the proposed conditions lies in their reduced conservatism, made possible by considering not only the range of the time delays but also their probability distribution. A numerical example shows the advantages of the proposed conditions. © 2009 IEEE.
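    The randomly delayed system class is easy to simulate: below, a scalar discrete-time recurrence whose delay is 1 or 2 steps according to a Bernoulli draw, with the empirical mean square estimated over many runs. Weights and probabilities are illustrative, not the paper's example:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    a, b = 0.5, 0.3        # recurrent and delayed weights (illustrative scalars)
    p = 0.7                # P(delay = 1); the delay is 2 otherwise
    runs, steps = 200, 60

    msq = np.zeros(steps + 1)      # empirical E[x_k^2] over the runs
    for _ in range(runs):
        x = [1.0, 1.0, 1.0]        # constant initial history x_{-2}, x_{-1}, x_0
        traj = [1.0]
        for _ in range(steps):
            d = 1 if rng.random() < p else 2       # random delay this step
            x.append(a * x[-1] + b * x[-1 - d])    # x_{k+1} = a x_k + b x_{k-d}
            traj.append(x[-1])
        msq += np.square(traj)
    msq /= runs
    ```

    Here |a| + |b| < 1, so every realization contracts and the empirical mean square decays regardless of the delay distribution; the point of the paper's criteria is to certify mean square stability in less conservative cases, where the distribution of the delay matters.
    
    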

    Exponential Lag Synchronization of Cohen-Grossberg Neural Networks with Discrete and Distributed Delays on Time Scales

    In this article, we investigate exponential lag synchronization results for Cohen-Grossberg neural networks (C-GNNs) with discrete and distributed delays on an arbitrary time domain by applying feedback control. We formulate the problem using time scales theory so that the results apply to any uniform or non-uniform time domain. We also provide a comparison showing that the obtained results unify and generalize existing ones. Our main tools are the unified matrix-measure theory and the Halanay inequality. In the last section, we provide two simulated examples on different time domains to show the effectiveness and generality of the analytical results. (20 pages, 18 figures)
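    The Halanay inequality used above says that if v'(t) ≤ -a·v(t) + b·sup over [t-τ, t] of v, with a > b > 0, then v decays exponentially despite the delayed term. A sketch checking this numerically on the worst case (equality), with illustrative coefficients and a simple Euler scheme on the continuous time scale:

    ```python
    import numpy as np

    # Halanay-type comparison system: v'(t) = -a*v(t) + b * sup_{[t-tau, t]} v,
    # with a > b > 0; the theory predicts exponential decay. Illustrative values.
    a_, b_, tau, dt, T = 2.0, 1.0, 1.0, 0.01, 10.0
    d = int(round(tau / dt))
    n = int(round(T / dt))

    v = np.ones(n + d + 1)         # v = 1 on the initial interval [-tau, 0]
    for k in range(d, n + d):
        window_sup = v[k - d : k + 1].max()        # sup over [t - tau, t]
        v[k + 1] = v[k] + dt * (-a_ * v[k] + b_ * window_sup)
    ```

    Even though the delayed term keeps pushing v up with its largest recent value, the decay rate a dominates the gain b, and v(T) ends up a small fraction of v(0); this is the one-line scalar mechanism behind the matrix-measure synchronization estimates in the paper.
    
    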

    Training Winner-Take-All Simultaneous Recurrent Neural Networks

    The winner-take-all (WTA) network is useful in database management, very large scale integration (VLSI) design, and digital processing. The synthesis procedure for WTA on a single-layer, fully connected architecture with a sigmoid transfer function is still not fully explored. We discuss the use of simultaneous recurrent networks (SRNs) trained by Kalman filter algorithms for the task of finding the maximum among N numbers. Simulations demonstrate the effectiveness of our training approach under a shared-weight SRN architecture. A more general SRN also succeeds in solving a real classification application on car engine data.
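    For readers unfamiliar with WTA dynamics, the classical MAXNET iteration (not the paper's trained SRN) shows how mutual inhibition alone finds the maximum of N nonnegative numbers: each unit is suppressed by the summed activity of the others until only the largest survives.

    ```python
    import numpy as np

    def maxnet_winner(values, eps=None, max_iters=1000):
        """Classical MAXNET winner-take-all over nonnegative inputs: lateral
        inhibition drives every unit except the largest to zero; returns the
        index of the winning unit."""
        x = np.asarray(values, dtype=float).copy()
        n = x.size
        if eps is None:
            eps = 1.0 / (2 * n)    # inhibition strength; must be < 1/(n-1)
        for _ in range(max_iters):
            # each unit is inhibited by the sum of all *other* activations
            x = np.maximum(0.0, x - eps * (x.sum() - x))
            if np.count_nonzero(x) <= 1:
                break              # only the winner remains active
        return int(np.argmax(x))

    winner = maxnet_winner([0.2, 0.9, 0.4, 0.7])   # unit 1 holds the maximum
    ```

    The update preserves the ordering of the activations (it is an order-preserving affine map followed by clipping at zero), so the winner is never overtaken; the SRN approach in the paper instead learns this competitive behavior with trained shared weights.
    
    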