
    LMI Approach to Exponential Stability and Almost Sure Exponential Stability for Stochastic Fuzzy Markovian-Jumping Cohen-Grossberg Neural Networks with Nonlinear p-Laplace Diffusion

    The robust exponential stability of delayed fuzzy Markovian-jumping Cohen-Grossberg neural networks (CGNNs) with nonlinear p-Laplace diffusion is studied. The fuzzy mathematical model makes it difficult to set up LMI criteria for stability, and the stochastic functional differential equation model with nonlinear diffusion makes it harder still. To study the stability of fuzzy CGNNs with diffusion, a Lyapunov-Krasovskii functional must be constructed in non-matrix form, whereas stochastic formulae are usually written in matrix form. By means of variational methods in W1,p(Ω), the Itô formula, the Dynkin formula, the semimartingale convergence theorem, the Schur complement theorem, and the LMI technique, LMI-based criteria for robust exponential stability and almost sure exponential robust stability are obtained; their feasibility can be checked efficiently with the MATLAB LMI toolbox. It is worth mentioning that even the corollaries of the main results improve some recent related results. Moreover, numerical examples are presented to illustrate the effectiveness and reduced conservatism of the proposed method, owing to the significant improvement in the allowable upper bounds of the time delays.
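The LMI-based certificates described above reduce, in the simplest linear case, to finding a positive definite matrix P satisfying a Lyapunov inequality. As a minimal numerical sketch (with a hypothetical system matrix, not one from the paper), the following solves the Lyapunov equation AᵀP + PA = -Q by vectorisation and checks that P is positive definite, which certifies exponential stability of dx/dt = Ax:

```python
import numpy as np

# Hypothetical Hurwitz system matrix (illustrative, not from the paper).
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
Q = np.eye(2)  # any symmetric positive definite choice

# Solve A^T P + P A = -Q via vectorisation:
# (I (x) A^T + A^T (x) I) vec(P) = -vec(Q).
n = A.shape[0]
M = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
P = np.linalg.solve(M, -Q.flatten()).reshape(n, n)
P = 0.5 * (P + P.T)  # symmetrise against round-off

# Exponential stability certificate: P must be positive definite.
eigs = np.linalg.eigvalsh(P)
print("P positive definite:", bool(np.all(eigs > 0)))
```

For the full fuzzy, Markovian-jumping, diffusion-driven model of the paper the inequalities are far richer, which is why a dedicated LMI solver such as the MATLAB LMI toolbox is used there.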

    Practical Exponential Stability of Impulsive Stochastic Reaction-Diffusion Systems With Delays


    Dynamical Behaviors of Stochastic Hopfield Neural Networks with Both Time-Varying and Continuously Distributed Delays

    This paper investigates the dynamical behaviors of stochastic Hopfield neural networks with both time-varying and continuously distributed delays. By employing Lyapunov functional theory and linear matrix inequalities, some novel criteria on asymptotic stability, ultimate boundedness, and weak attractors are derived. Finally, an example is given to illustrate the correctness and effectiveness of the theoretical results.

    Stochastic Dynamics of Nonautonomous Cohen-Grossberg Neural Networks

    This paper is devoted to the study of the stochastic stability of a class of Cohen-Grossberg neural networks in which the interconnections and delays are time-varying. With the help of a Lyapunov function, the Burkholder-Davis-Gundy inequality, and the Borel-Cantelli lemma, a set of novel sufficient conditions for pth moment exponential stability and almost sure exponential stability of the trivial solution of the system is derived. Compared with previously published results, the method does not resort to the Razumikhin-type theorem or the semimartingale convergence theorem, and the results developed here are more general than those reported in some earlier papers. An illustrative example is also given to show the effectiveness of the obtained results.
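Almost sure exponential stability can be illustrated on a toy example, independent of the networks studied above. For the scalar SDE dx = -a·x dt + σ·x dW (an illustrative choice, not a model from the paper), the explicit solution gives the sample Lyapunov exponent -a - σ²/2; when this is negative, trajectories decay exponentially almost surely. A short Euler-Maruyama simulation estimates it:

```python
import numpy as np

# Illustrative scalar SDE: dx = -a*x dt + sigma*x dW.
# Its sample Lyapunov exponent is -a - sigma**2/2 (here -1.125 < 0),
# so the trivial solution is almost surely exponentially stable.
a, sigma = 1.0, 0.5
x0, T, dt = 1.0, 10.0, 1e-3
rng = np.random.default_rng(0)

x = x0
for _ in range(int(T / dt)):
    dW = rng.normal(0.0, np.sqrt(dt))
    x += -a * x * dt + sigma * x * dW   # Euler-Maruyama step

# (1/T) log|x_T| estimates the sample Lyapunov exponent.
lyap_est = np.log(abs(x)) / T
print(f"estimated Lyapunov exponent: {lyap_est:.3f}")
```

On a single path of length T = 10 the estimate fluctuates around the theoretical value -1.125; longer horizons tighten it.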

    Exponential Stability for a Class of Stochastic Reaction-Diffusion Hopfield Neural Networks with Delays

    This paper studies the asymptotic behavior of a class of delayed reaction-diffusion Hopfield neural networks driven by finite-dimensional Wiener processes. Some new sufficient conditions are established to guarantee the mean square exponential stability of this system by using Poincaré's inequality and stochastic analysis techniques. The proof of the almost sure exponential stability of the system is carried out using the Burkholder-Davis-Gundy inequality, the Chebyshev inequality, and the Borel-Cantelli lemma. Finally, an example is given to illustrate the effectiveness of the proposed approach, and the simulation is carried out in MATLAB.
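The kind of simulation mentioned above can be sketched in a few lines for a simplified model. The following (an illustrative 1-D stochastic reaction-diffusion equation with multiplicative noise, not the networks of the paper) discretises du = (D·u_xx - a·u) dt + σ·u dW with finite differences in space and Euler-Maruyama in time, under zero Dirichlet boundary conditions, and checks that the spatial L² energy decays:

```python
import numpy as np

# Illustrative SPDE: du = (D*u_xx - a*u) dt + sigma*u dW,
# u(0,t) = u(1,t) = 0 (Dirichlet), simulated on one sample path.
D_, a, sigma = 0.1, 1.0, 0.2
nx, dx = 49, 1.0 / 50
dt, nsteps = 1e-4, 20000          # D*dt/dx**2 = 0.025 < 0.5: stable scheme
rng = np.random.default_rng(1)

xs = np.linspace(dx, 1 - dx, nx)  # interior grid points
u = np.sin(np.pi * xs)            # smooth initial profile
energy0 = np.sum(u**2) * dx       # discrete L2 energy at t = 0

for _ in range(nsteps):
    lap = (np.roll(u, 1) + np.roll(u, -1) - 2 * u) / dx**2
    lap[0] = (u[1] - 2 * u[0]) / dx**2      # zero boundary on the left
    lap[-1] = (u[-2] - 2 * u[-1]) / dx**2   # zero boundary on the right
    dW = rng.normal(0.0, np.sqrt(dt))
    u = u + dt * (D_ * lap - a * u) + sigma * u * dW

energyT = np.sum(u**2) * dx
print("energy decayed:", energyT < energy0)
```

With these parameters the drift dominates the noise, so the energy shrinks by several orders of magnitude over the simulated horizon, consistent with mean square exponential stability.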

    Stability and synchronization of discrete-time neural networks with switching parameters and time-varying delays


    Razumikhin-type theorem for stochastic functional differential systems via vector Lyapunov function

    This paper is concerned with the input-to-state stability of stochastic functional differential systems (SFDSs). By using stochastic analysis techniques, Razumikhin techniques, and the vector Lyapunov function method, a vector Razumikhin-type theorem on input-to-state stability for SFDSs is established. Novel sufficient criteria for pth moment exponential input-to-state stability are obtained from this theorem, and when the input is zero, an improved criterion on exponential stability follows. Two examples are provided to demonstrate the validity of the obtained results.

    Nonlinear dynamics of pattern recognition and optimization

    We associate learning in living systems with the shaping of the velocity vector field of a dynamical system in response to external, generally random, stimuli. We consider various approaches to implement a system that is able to adapt the whole vector field, rather than just parts of it - a drawback of the most common current learning systems: artificial neural networks. This leads us to propose the mathematical concept of self-shaping dynamical systems. To begin, there is an empty phase space with no attractors, and thus a zero velocity vector field. Upon receiving the random stimulus, the vector field deforms and eventually becomes smooth and deterministic, despite the random nature of the applied force, while the phase space develops various geometrical objects. We consider the simplest of these - gradient self-shaping systems, whose vector field is the gradient of some energy function, which under certain conditions develops into the multi-dimensional probability density distribution of the input. We explain how self-shaping systems are relevant to artificial neural networks. Firstly, we show that they can potentially perform pattern recognition tasks typically implemented by Hopfield neural networks, but without any supervision and on-line, and without developing spurious minima in the phase space. Secondly, they can reconstruct the probability density distribution of input signals, like probabilistic neural networks, but without the need for new training patterns to have to enter the network as new hardware units. We therefore regard self-shaping systems as a generalisation of the neural network concept, achieved by abandoning the "rigid units - flexible couplings'' paradigm and making the vector field fully flexible and amenable to external force. It is not clear how such systems could be implemented in hardware, and so this new concept presents an engineering challenge. 
It could also become an alternative paradigm for the modelling of both living and learning systems. Mathematically, it is interesting to ask how a self-shaping system could develop non-trivial objects in the phase space, such as periodic orbits or chaotic attractors. We investigate how a delayed vector field could form such objects. We show that this method produces chaos in a class of systems that have very simple dynamics in the non-delayed case. We also demonstrate the coexistence of bounded and unbounded solutions, depending on the initial conditions and the value of the delay. Finally, we speculate about how such a method could be used in global optimization.
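The idea that a delay alone can turn simple dynamics chaotic has a classic illustration: the Mackey-Glass delay equation (our illustrative choice, not the thesis's own construction). Without the delay the system is a trivially stable scalar ODE; with τ = 17 and the standard parameters it is chaotic. A minimal forward-Euler integration with a history buffer:

```python
import numpy as np

# Mackey-Glass delay equation (illustrative example of a delayed
# vector field producing complex dynamics):
#   dx/dt = beta*x(t - tau) / (1 + x(t - tau)**10) - gamma*x(t)
beta, gamma, tau = 0.2, 0.1, 17.0
dt, T = 0.1, 500.0

lag = int(tau / dt)            # delay expressed in time steps
n = int(T / dt)
x = np.empty(n + lag)
x[:lag] = 1.2                  # constant history on [-tau, 0]

for i in range(lag, n + lag):  # forward Euler with a history buffer
    xd = x[i - lag]            # delayed state x(t - tau)
    x[i] = x[i - 1] + dt * (beta * xd / (1.0 + xd**10) - gamma * x[i - 1])

# The trajectory stays bounded but never settles to an equilibrium.
tail = x[len(x) // 2:]
print("range of late trajectory:", tail.min(), tail.max())
```

The non-delayed limit (τ = 0) of the same equation relaxes monotonically to a fixed point, so all of the irregular behaviour here is contributed by the delayed term, in the spirit of the construction investigated above.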