
    On the validity of memristor modeling in the neural network literature

    An analysis of the literature shows that two types of non-memristive models have been widely used in the modeling of so-called "memristive" neural networks. Here, we demonstrate that such models have nothing in common with the concept of memristive elements: they describe either non-linear resistors or certain bi-state systems, all of which are devices without memory. Therefore, the results presented in a significant number of publications are at least questionable, if not completely irrelevant to the actual field of memristive neural networks.
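
    As a quick sanity check on the distinction drawn above, the toy Python sketch below contrasts a charge-controlled memristive element, whose voltage depends on the history of the drive current through an internal state, with a memoryless non-linear resistor, whose voltage is an instantaneous function of the current alone. All parameter values are illustrative assumptions, not taken from the paper.

    import numpy as np

    dt, T = 1e-4, 0.1
    t = np.arange(0.0, T, dt)
    i = 1e-3 * np.sin(2 * np.pi * 50 * t)          # sinusoidal drive current

    # Memristive element: the resistance depends on an internal state w derived
    # from the accumulated charge, so the v-i curve is a pinched hysteresis loop.
    R_on, R_off, q_max = 100.0, 16e3, 1e-5
    q = np.cumsum(i) * dt                           # internal state: charge history
    w = np.clip(q / q_max, 0.0, 1.0)
    v_memristive = (R_on * w + R_off * (1 - w)) * i

    # Memoryless non-linear resistor: v is an instantaneous function of i alone.
    v_resistor = 1e3 * i + 5e7 * i**3

    # Same current value at different times: the memristive device answers with
    # different voltages (history dependence), the non-linear resistor does not.
    same_i = np.isclose(i, i[25], atol=1e-6)
    print("memristive voltage spread  :", np.ptp(v_memristive[same_i]))
    print("non-linear resistor spread :", np.ptp(v_resistor[same_i]))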

    pth moment exponential stability of stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays

    In this paper, stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays are investigated. By using a Lyapunov function and the Itô differential formula, some sufficient conditions for the pth moment exponential stability of such stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays are established. An example is given to illustrate the feasibility of the main theoretical findings, and the paper ends with a brief conclusion. The methodology and the achieved results are presented.
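
    The pth moment exponential stability asserted above means that E|x(t)|^p decays like C·exp(-λt) for some λ > 0. The sketch below is only a hedged illustration of that notion: it Monte Carlo-estimates E|x(t)|^p with an Euler-Maruyama scheme for a scalar stochastic delayed equation standing in for the network model; the coefficients and the tanh activation are assumptions, not the system studied in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    a, b, sigma, tau, p = 2.0, 0.5, 0.3, 0.2, 2.0   # assumed coefficients
    dt, T, n_paths = 1e-3, 5.0, 1000
    n_steps, n_lag = int(T / dt), int(tau / dt)

    x = np.full((n_paths, n_steps + 1), 1.0)        # constant initial history = 1
    for k in range(n_steps):
        delayed = x[:, max(k - n_lag, 0)]           # x(t - tau)
        drift = -a * x[:, k] + b * np.tanh(delayed)
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)  # Brownian increments
        x[:, k + 1] = x[:, k] + drift * dt + sigma * x[:, k] * dW

    # E|x(t)|^p should decay roughly exponentially in t.
    moment = np.mean(np.abs(x) ** p, axis=0)
    for t_check in (1.0, 2.0, 4.0):
        print(f"E|x({t_check})|^p ≈ {moment[int(t_check / dt)]:.3e}")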

    New Stability Criterion for Takagi-Sugeno Fuzzy Cohen-Grossberg Neural Networks with Probabilistic Time-Varying Delays

    A new global asymptotic stability criterion for Takagi-Sugeno fuzzy Cohen-Grossberg neural networks with probabilistic time-varying delays is derived, in which the diffusion term is allowed to play its role. Because the boundedness conditions on the amplification functions are removed, the main result is novel to some extent. A further methodological novelty is that the Lyapunov-Krasovskii functional takes a positive definite form of p-th powers, which differs from those in the existing literature. Moreover, a numerical example illustrates the effectiveness of the proposed methods.
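
    Purely to illustrate what a positive definite form of p-th powers looks like, as opposed to the usual quadratic form x(t)^T P x(t) with a positive definite matrix P, a generic Lyapunov-Krasovskii functional of this type for a delayed system can be written (this is an assumed generic form, not the exact functional constructed in the paper) as

        V(t) = \sum_{i=1}^{n} q_i |x_i(t)|^{p}
             + \sum_{i=1}^{n} r_i \int_{t-\tau(t)}^{t} |x_i(s)|^{p} \, \mathrm{d}s,
        \qquad q_i, r_i > 0, \quad p \ge 2.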

    Design of exponential state estimators for neural networks with mixed time delays

    In this Letter, the state estimation problem is dealt with for a class of recurrent neural networks (RNNs) with mixed discrete and distributed delays. The activation functions are assumed to be neither monotonic, nor differentiable, nor bounded. We aim at designing a state estimator to estimate the neuron states, through available output measurements, such that the dynamics of the estimation error is globally exponentially stable in the presence of mixed time delays. By using a Lyapunov–Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions guaranteeing the existence of the state estimators. We show that both the existence conditions and the explicit expression of the desired estimator can be characterized in terms of the solution to an LMI. A simulation example is exploited to show the usefulness of the derived LMI-based stability conditions. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China under Grants 05KJB110154 and BK2006064, and the National Natural Science Foundation of China under Grants 10471119 and 10671172.
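
    A minimal sketch of the LMI-based design idea, written for a simplified linear delayed error system rather than the delayed RNN of the Letter: for de/dt = (A - LC)e(t) + Ad e(t - tau), a standard delay-independent Lyapunov-Krasovskii argument leads to an LMI in (P, Q, Y), and the estimator gain is recovered as L = P^{-1} Y. The matrices, the CVXPY formulation and the eps margins below are assumptions made only to keep the sketch runnable.

    import cvxpy as cp
    import numpy as np

    A  = np.array([[-2.0, 0.3], [0.1, -1.5]])       # assumed system matrix
    Ad = np.array([[ 0.2, 0.1], [0.0,  0.3]])       # assumed delayed matrix
    C  = np.array([[1.0, 0.0]])                     # assumed output matrix
    n, m = A.shape[0], C.shape[0]

    P = cp.Variable((n, n), symmetric=True)
    Q = cp.Variable((n, n), symmetric=True)
    Y = cp.Variable((n, m))                         # change of variable Y = P L

    # Block LMI: [[A'P + PA - YC - C'Y' + Q,  P Ad], [Ad' P,  -Q]] < 0
    upper_left = A.T @ P + P @ A - Y @ C - C.T @ Y.T + Q
    lmi = cp.bmat([[upper_left, P @ Ad],
                   [Ad.T @ P,   -Q]])

    eps = 1e-6
    constraints = [P >> eps * np.eye(n), Q >> eps * np.eye(n),
                   lmi << -eps * np.eye(2 * n)]
    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()

    if prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE):
        L = np.linalg.solve(P.value, Y.value)       # estimator gain L = P^{-1} Y
        print("LMI feasible, estimator gain L =\n", L)
    else:
        print("LMI infeasible for these matrices")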

    Further analysis of stability of uncertain neural networks with multiple time delays

    This paper studies the robust stability of uncertain neural networks with multiple time delays with respect to the class of nondecreasing activation functions. By using the Lyapunov functional and homeomorphism mapping theorems, we derive a new delay-independent sufficient condition for the existence, uniqueness, and global asymptotic stability of the equilibrium point of delayed neural networks with uncertain network parameters. The condition obtained for robust stability establishes a matrix-norm relationship between the network parameters of the neural system, and therefore it can easily be verified. We also present some constructive numerical examples to compare the proposed result with those previously published in the corresponding literature. These comparative examples show that our new condition can be considered an alternative to the previous results, as it defines a new set of network parameters ensuring the robust stability of delayed neural networks.
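
    To show how easily a matrix-norm test of this general flavor is verified numerically, the snippet below checks a classical, conservative delay-independent condition for a Hopfield-type delayed network dx/dt = -Cx + A f(x(t)) + B f(x(t - tau)) + u with activations of Lipschitz constant ell, namely min_i c_i > ell (||A|| + ||B||) in the spectral norm. This is not the condition derived in the paper; the matrices are assumed values.

    import numpy as np

    C = np.diag([3.0, 2.5, 4.0])                    # positive self-decay rates
    A = np.array([[ 0.2, -0.1,  0.3],
                  [ 0.1,  0.2, -0.2],
                  [-0.3,  0.1,  0.1]])              # instantaneous weights (assumed)
    B = 0.5 * np.eye(3)                             # delayed weights (assumed)
    ell = 1.0                                       # Lipschitz constant of f

    lhs = np.min(np.diag(C))
    rhs = ell * (np.linalg.norm(A, 2) + np.linalg.norm(B, 2))
    print(f"min c_i = {lhs:.3f},  ell*(||A|| + ||B||) = {rhs:.3f}")
    print("delay-independent test passed" if lhs > rhs else "test inconclusive")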

    Stability of reaction–diffusion systems with stochastic switching

    In this paper, we investigate the stability of reaction–diffusion systems with stochastic switching. Two types of switched models are considered: (i) Markov switching and (ii) independent and identically distributed switching. By means of the ergodic property of the Markov chain, the Dynkin formula and the Fubini theorem, together with the Lyapunov direct method, some sufficient conditions are obtained to ensure that the zero solution of reaction–diffusion systems with Markov switching is almost surely exponentially stable or exponentially stable in the mean square. By using Theorem 7.3 in [R. Durrett, Probability: Theory and Examples, Duxbury Press, Belmont, CA, 2005], we also investigate the stability of reaction–diffusion systems with independent and identically distributed switching. Meanwhile, an example with simulations is provided to certify that the stochastic switching plays an essential role in the stability of the systems.
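
    A toy simulation of the phenomenon such an example certifies: a scalar state switching between an unstable mode (a = +0.5) and a stable mode (a = -2.0) under a two-state Markov chain decays almost surely whenever the stationary-average growth rate is negative. The rates and gains below are illustrative assumptions, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(1)
    a = np.array([0.5, -2.0])                       # per-mode growth rates
    exit_rate = np.array([1.0, 1.0])                # Markov-chain switching rates
    dt, T = 1e-3, 50.0

    x, mode = 1.0, 0
    for _ in range(int(T / dt)):
        if rng.random() < exit_rate[mode] * dt:     # jump of the Markov chain
            mode = 1 - mode
        x += a[mode] * x * dt                       # Euler step of dx/dt = a_mode * x

    # Empirical Lyapunov exponent vs. stationary average pi_0*a_0 + pi_1*a_1
    print("empirical exponent :", np.log(abs(x)) / T)
    print("stationary average :", 0.5 * a[0] + 0.5 * a[1])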

    Finite-time Anti-synchronization of Memristive Stochastic BAM Neural Networks with Probabilistic Time-varying Delays

    This paper investigates drive-response finite-time anti-synchronization for memristive bidirectional associative memory neural networks (MBAMNNs). Firstly, a class of MBAMNNs with mixed probabilistic time-varying delays and stochastic perturbations is formulated and analyzed. Secondly, a nonlinear control law is constructed and utilized to guarantee drive-response finite-time anti-synchronization of the neural networks. Thirdly, by employing some inequality techniques and constructing an appropriate Lyapunov function, some anti-synchronization criteria are derived. Finally, a numerical simulation is provided to demonstrate the effectiveness of the proposed mechanism.
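
    The sketch below is a stripped-down deterministic stand-in for the scheme (no memristive weights, delays or stochastic perturbations, and the gains are assumptions): the fractional-power term in the control law is what typically turns asymptotic convergence of the anti-synchronization error e = y + x into convergence in finite time.

    import numpy as np

    def f(z):                                       # odd, Lipschitz-1 activation
        return np.tanh(z)

    dt, T = 1e-4, 3.0
    k1, k2, alpha = 3.0, 2.0, 0.5                   # controller gains (assumed)
    x = np.array([0.8, -0.5])                       # drive state
    y = np.array([0.2,  0.6])                       # response state

    for _ in range(int(T / dt)):
        e = y + x                                   # anti-synchronization error
        u = -k1 * e - k2 * np.sign(e) * np.abs(e) ** alpha
        dx = -x + 2.0 * f(x)                        # drive system (toy dynamics)
        dy = -y + 2.0 * f(y) + u                    # controlled response system
        x, y = x + dx * dt, y + dy * dt

    print("drive state x   :", x)                   # settles at a nonzero point
    print("error ||y + x|| :", np.linalg.norm(y + x))   # close to 0: y tracks -x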

    LMI Approach to Exponential Stability and Almost Sure Exponential Stability for Stochastic Fuzzy Markovian-Jumping Cohen-Grossberg Neural Networks with Nonlinear p-Laplace Diffusion

    The robust exponential stability of delayed fuzzy Markovian-jumping Cohen-Grossberg neural networks (CGNNs) with nonlinear p-Laplace diffusion is studied. The fuzzy mathematical model brings great difficulty in setting up LMI criteria for stability, and the stochastic functional differential equation model with nonlinear diffusion makes it even harder. To study the stability of fuzzy CGNNs with diffusion, we have to construct a Lyapunov-Krasovskii functional in non-matrix form, whereas stochastic mathematical formulae are usually described in matrix form. By way of some variational methods in W^{1,p}(Ω), the Itô formula, the Dynkin formula, the semi-martingale convergence theorem, the Schur complement theorem, and the LMI technique, LMI-based criteria for robust exponential stability and almost sure exponential robust stability are finally obtained, the feasibility of which can be efficiently computed and confirmed with the MATLAB LMI toolbox. It is worth mentioning that even the corollaries of the main results of this paper improve some recent related results. Moreover, some numerical examples are presented to illustrate the effectiveness and reduced conservatism of the proposed method, owing to the significant improvement in the allowable upper bounds of the time delays.
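
    As a hedged illustration of how an "allowable upper bound of the time delay" is obtained in practice, the sketch below bisects on the delay h, re-checking the feasibility of a delay-dependent LMI at each step. The LMI used is the classical Jensen-inequality condition for the simplified system dx/dt = A x(t) + Ad x(t - h), solved with CVXPY rather than the MATLAB LMI toolbox; A and Ad are assumed benchmark values, and this is not the fuzzy p-Laplace criterion of the paper.

    import cvxpy as cp
    import numpy as np

    A  = np.array([[-2.0,  0.0], [ 0.0, -0.9]])     # assumed benchmark matrices
    Ad = np.array([[-1.0,  0.0], [-1.0, -1.0]])
    n = A.shape[0]

    def lmi_feasible(h):
        """Jensen-based delay-dependent stability LMI for delay h."""
        P = cp.Variable((n, n), symmetric=True)
        Q = cp.Variable((n, n), symmetric=True)
        R = cp.Variable((n, n), symmetric=True)
        blk = cp.bmat([
            [A.T @ P + P @ A + Q - R, P @ Ad + R,  h * A.T @ R],
            [Ad.T @ P + R,            -Q - R,      h * Ad.T @ R],
            [h * R @ A,               h * R @ Ad,  -R]])
        eps = 1e-6
        cons = [P >> eps * np.eye(n), Q >> 0, R >> eps * np.eye(n),
                blk << -eps * np.eye(3 * n)]
        prob = cp.Problem(cp.Minimize(0), cons)
        prob.solve()
        return prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)

    lo, hi = 0.0, 5.0                               # bisection on the delay bound
    for _ in range(12):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if lmi_feasible(mid) else (lo, mid)
    print(f"largest certified delay bound ≈ {lo:.3f}")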