178 research outputs found

    State estimation for discrete-time neural networks with Markov-mode-dependent lower and upper bounds on the distributed delays

    Get PDF
    Copyright © 2012 Springer Verlag. This paper is concerned with the state estimation problem for a new class of discrete-time neural networks with Markovian jumping parameters and mixed time-delays. The parameters of the neural networks under consideration switch over time subject to a Markov chain. The networks involve both a discrete time-varying delay and a mode-dependent distributed time-delay whose upper and lower boundaries depend on the Markov chain. By constructing novel Lyapunov-Krasovskii functionals, sufficient conditions are first established to guarantee exponential stability in mean square for the addressed discrete-time neural networks with Markovian jumping parameters and mixed time-delays. Then, the state estimation problem is addressed for the same neural network, where the goal is to design a state estimator such that the estimation error approaches zero exponentially in mean square. The derived conditions for both the stability and the existence of the desired estimators are expressed as matrix inequalities that can be solved by semi-definite programming. A numerical simulation example is exploited to demonstrate the usefulness of the main results obtained. This work was supported in part by the Royal Society of the U.K., the National Natural Science Foundation of China under Grants 60774073 and 61074129, and the Natural Science Foundation of Jiangsu Province of China under Grant BK2010313.
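
    The stability and estimator-existence conditions above are matrix inequalities solvable by semi-definite programming. A minimal sketch of how such an LMI feasibility check can be run numerically is given below, using Python with cvxpy; the system matrix and the simple discrete-time Lyapunov inequality A^T P A - P < 0 are illustrative placeholders, not the paper's actual conditions.

    import numpy as np
    import cvxpy as cp

    # Placeholder system matrix; the cited paper's conditions involve
    # Markov-mode-dependent matrices and delay terms, omitted here.
    A = np.array([[0.5, 0.1],
                  [0.0, 0.3]])
    n = A.shape[0]

    P = cp.Variable((n, n), symmetric=True)   # Lyapunov matrix to be found
    eps = 1e-6
    constraints = [
        P >> eps * np.eye(n),                  # P positive definite
        A.T @ P @ A - P << -eps * np.eye(n),   # discrete-time Lyapunov LMI
    ]
    problem = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
    problem.solve()
    print("LMI feasible:", problem.status == cp.OPTIMAL)

    The mode-dependent and delay-dependent terms of the actual criteria enlarge these LMIs considerably, but the feasibility-check workflow stays the same.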

    Finite-time Stability, Dissipativity and Passivity Analysis of Discrete-time Neural Networks with Time-varying Delays

    Get PDF
    The time-varying delay of a neural network reflects the dynamic properties of a neural cell and is described by neural functional and neural delay differential equations, in which the derivative terms involve both the current and the past state. The objective of this paper is to analyse discrete-time neural networks with such time-varying delays. A delay-dependent condition is provided to ensure that the considered discrete-time neural networks with time-varying delays are finite-time stable, dissipative, and passive. Using a new Lyapunov-Krasovskii functional together with the free-weighting matrix approach and linear matrix inequality (LMI) techniques, improved sufficient criteria for finite-time stability, dissipativity, and passivity of discrete-time neural networks with time-varying delays are constructed. An effective LMI-based approach is derived by choosing an appropriate Lyapunov functional. Finally, the new finite-time stability, dissipativity, and passivity criteria, expressed in the form of linear matrix inequalities (LMIs), are shown to be effective for discrete-time neural networks with time-varying delays.
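
    For reference, a generic Lyapunov-Krasovskii functional of the kind used in such delay-dependent analyses (the paper's exact functional likely contains additional terms) can be written, for a state x(k) and a delay satisfying \tau_m \le \tau(k) \le \tau_M, as

    V(k) = x^{T}(k) P x(k)
           + \sum_{i=k-\tau(k)}^{k-1} x^{T}(i) Q x(i)
           + \sum_{j=-\tau_M+1}^{-\tau_m} \sum_{i=k+j}^{k-1} x^{T}(i) R x(i),
    \qquad P, Q, R \succ 0.

    Requiring the forward difference \Delta V(k) = V(k+1) - V(k) to be negative, suitably augmented with the dissipativity or passivity supply rate, is what leads to conditions expressible as LMIs.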

    A delay-dividing approach to robust stability of uncertain stochastic complex-valued Hopfield delayed neural networks

    Full text link
    In scientific disciplines and other engineering applications, most systems involve uncertainties, because uncertain parameters are unavoidable when modeling physical systems. In view of this, it is important to investigate dynamical systems with uncertain parameters. In the present study, a delay-dividing approach is devised to study the robust stability issue of uncertain neural networks. Specifically, the uncertain stochastic complex-valued Hopfield neural network (USCVHNN) with time delay is investigated. Here, the uncertainties of the system parameters are norm-bounded. Based on the Lyapunov approach and the homeomorphism principle, sufficient conditions for the global asymptotic stability of the USCVHNN are derived. To perform this derivation, the complex-valued neural network (CVNN) is divided into two parts, real and imaginary, using the delay-dividing approach. All the criteria are expressed as linear matrix inequalities (LMIs). Two examples confirm the usefulness of the proposed delay-dividing approach for the USCVHNN model.
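
    The real/imaginary splitting that underlies this treatment of the complex-valued network can be illustrated on the linear part of the dynamics: a complex weight matrix W = W_R + i W_I acting on a state z = x + i y is equivalent to a doubled real-valued system. A small numerical check of this standard identity, with arbitrary placeholder values rather than the paper's model, follows.

    import numpy as np

    # Placeholder complex weight matrix and state (not from the cited paper).
    W = np.array([[1.0 + 2.0j, 0.5 - 1.0j],
                  [0.0 + 0.3j, -1.0 + 0.0j]])
    z = np.array([1.0 - 1.0j, 2.0 + 0.5j])

    # Split W and z into real and imaginary parts.
    WR, WI = W.real, W.imag
    x, y = z.real, z.imag

    # Equivalent real-valued representation of multiplication by W.
    W_real = np.block([[WR, -WI],
                       [WI,  WR]])
    v = np.concatenate([x, y])

    # W @ z, written in stacked real coordinates, matches W_real @ v.
    lhs = np.concatenate([(W @ z).real, (W @ z).imag])
    assert np.allclose(lhs, W_real @ v)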

    Stability analysis for delayed quaternion-valued neural networks via nonlinear measure approach

    Get PDF
    In this paper, the existence and stability analysis of quaternion-valued neural networks (QVNNs) with time delay is considered. Firstly, the QVNNs are equivalently transformed into four real-valued systems. Then, based on Lyapunov theory, the nonlinear measure approach, and inequality techniques, some sufficient criteria are derived to ensure the existence and uniqueness of the equilibrium point as well as the global stability of the delayed QVNNs. In addition, the provided criteria are presented in the form of linear matrix inequalities (LMIs), which can be easily checked by the LMI toolbox in MATLAB. Finally, two simulation examples are presented to verify the effectiveness of the obtained results. Moreover, the reduced conservatism of the obtained results is shown by two comparison examples.
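
    The equivalent transformation of a quaternion-valued system into four real-valued ones rests on the real 4x4 representation of quaternion (Hamilton) multiplication. A small numerical check of that representation is sketched below, with placeholder values rather than the paper's transformation of the full network.

    import numpy as np

    def left_mult_matrix(q):
        # Real 4x4 matrix L(q) such that the Hamilton product q * p equals
        # L(q) @ [p0, p1, p2, p3] for p = p0 + p1*i + p2*j + p3*k.
        a, b, c, d = q
        return np.array([[a, -b, -c, -d],
                         [b,  a, -d,  c],
                         [c,  d,  a, -b],
                         [d, -c,  b,  a]])

    def hamilton(q, p):
        # Quaternion (Hamilton) product, written component-wise.
        a1, b1, c1, d1 = q
        a2, b2, c2, d2 = p
        return np.array([a1*a2 - b1*b2 - c1*c2 - d1*d2,
                         a1*b2 + b1*a2 + c1*d2 - d1*c2,
                         a1*c2 - b1*d2 + c1*a2 + d1*b2,
                         a1*d2 + b1*c2 - c1*b2 + d1*a2])

    q = np.array([0.5, -1.0, 2.0, 0.3])   # placeholder quaternions
    p = np.array([1.0, 0.2, -0.7, 1.5])
    assert np.allclose(hamilton(q, p), left_mult_matrix(q) @ p)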

    Quantized passive filtering for switched delayed neural networks

    Get PDF
    The issue of quantized passive filtering for switched delayed neural networks with noise interference is studied in this paper. Both arbitrary and semi-Markov switching rules are taken into account. By choosing suitable Lyapunov functionals and applying several inequality techniques, sufficient conditions are proposed to ensure that the filter error system is not only exponentially stable but also exponentially passive from the noise interference to the output error. The gain matrix of the proposed quantized passive filter can be determined from a feasible solution of linear matrix inequalities, which are computationally tractable with the help of popular convex optimization tools. Finally, two numerical examples are given to illustrate the usefulness of the quantized passive filter design methods.
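
    Quantization of filter signals is often modelled with a logarithmic quantizer whose error obeys a sector bound, which is what makes it compatible with LMI-based passivity analysis; the abstract does not specify the quantizer used, so the sketch below is an assumption rather than the paper's exact model.

    import numpy as np

    def log_quantizer(v, rho=0.5, u0=1.0):
        # Logarithmic quantizer with density rho (0 < rho < 1): levels are
        # +/- u0 * rho**j, and the error obeys the sector bound
        # |q(v) - v| <= delta * |v| with delta = (1 - rho) / (1 + rho).
        # Assumed here for illustration; the cited paper may define its
        # quantizer differently.
        v = np.asarray(v, dtype=float)
        delta = (1.0 - rho) / (1.0 + rho)
        out = np.zeros_like(v)
        nz = np.abs(v) > 0
        j = np.floor(np.log(np.abs(v[nz]) * (1.0 - delta) / u0) / np.log(rho))
        out[nz] = np.sign(v[nz]) * u0 * rho ** j
        return out

    print(log_quantizer([0.37, -1.8, 0.0]))   # -> [ 0.25 -2.    0.  ]

    Under such a sector bound, the quantization error is typically absorbed into the same norm-bounded-uncertainty machinery that the LMI conditions already handle.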