602 research outputs found

    On the validity of memristor modeling in the neural network literature

    An analysis of the literature shows that there are two types of non-memristive models that have been widely used in the modeling of so-called "memristive" neural networks. Here, we demonstrate that such models have nothing in common with the concept of memristive elements: they describe either non-linear resistors or certain bi-state systems, all of which are devices without memory. Therefore, the results presented in a significant number of publications are at least questionable, if not completely irrelevant to the actual field of memristive neural networks.
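
    For reference, the contrast the authors draw can be made precise with Chua and Kang's standard definition of a current-controlled memristive one-port (a textbook form, not an equation quoted from the paper):

    $$ v(t) = R\big(x(t), i(t)\big)\, i(t), \qquad \dot{x}(t) = f\big(x(t), i(t)\big), $$

    where the internal state $x$ supplies the memory. A non-linear resistor is the degenerate case $v = R(i)\, i$ with no state equation, and a bi-state element whose state does not depend on the input history likewise carries no memory in this sense.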

    Nonlinear analysis of dynamical complex networks

    Copyright © 2013 Zidong Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Complex networks are composed of a large number of highly interconnected dynamical units and therefore exhibit very complicated dynamics. Examples of such complex networks include the Internet, that is, a network of routers or domains; the World Wide Web (WWW), that is, a network of websites; the brain, that is, a network of neurons; and an organization, that is, a network of people. Since the introduction of the small-world network principle, a great deal of research has focused on the dependence of the asymptotic behavior of interconnected oscillatory agents on the structural properties of complex networks. It has been found that the general structure of the interaction network may play a crucial role in the emergence of synchronization phenomena in various fields such as physics, technology, and the life sciences.

    Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis

    This is the postprint version of the article. The official published version can be obtained from the link below. Copyright 2007 Elsevier Ltd. This Letter is concerned with the analysis problem of exponential stability for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the recently commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, where the feasibility of such an LMI can be easily checked by using the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
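
    The LMI feasibility test mentioned above can be reproduced with any semidefinite-programming front end. The sketch below is illustrative only: it checks the standard discrete-time Lyapunov inequality A'PA - P < 0 for a delay-free linearization rather than the delay-dependent LMI derived in the paper, and it uses the Python library cvxpy instead of the Matlab LMI Toolbox cited in the abstract.

    # Illustrative LMI feasibility check (not the condition derived in the paper):
    # find P = P' > 0 such that A' P A - P < 0 for a delay-free linear system.
    import numpy as np
    import cvxpy as cp

    A = np.array([[0.5, 0.1],
                  [-0.2, 0.3]])   # hypothetical system matrix
    n = A.shape[0]
    eps = 1e-6

    P = cp.Variable((n, n), symmetric=True)
    constraints = [P >> eps * np.eye(n),                    # P positive definite
                   A.T @ P @ A - P << -eps * np.eye(n)]     # Lyapunov LMI
    problem = cp.Problem(cp.Minimize(0), constraints)       # pure feasibility problem
    problem.solve(solver=cp.SCS)

    print("LMI feasible:", problem.status == cp.OPTIMAL)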

    pth moment exponential stability of stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays

    In this paper, stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays are investigated. By using a Lyapunov function and the Itô differential formula, some sufficient conditions for the pth moment exponential stability of such stochastic fuzzy Cohen–Grossberg neural networks with discrete and distributed delays are established. An example is given to illustrate the feasibility of our main theoretical findings. Finally, the paper ends with a brief conclusion summarizing the methodology and the achieved results.
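
    For orientation, the stability notion being established has the following standard definition (a textbook form, not quoted from the paper): the equilibrium $x^{*}$ is $p$th moment exponentially stable if there exist constants $M \ge 1$ and $\lambda > 0$ such that

    $$ \mathbb{E}\,\|x(t;\phi) - x^{*}\|^{p} \;\le\; M \sup_{s \in [-\tau, 0]} \mathbb{E}\,\|\phi(s) - x^{*}\|^{p}\, e^{-\lambda t}, \qquad t \ge 0, $$

    for every admissible initial function $\phi$; the case $p = 2$ is mean-square exponential stability.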

    Stability analysis of impulsive stochastic Cohen–Grossberg neural networks with mixed time delays

    This is the postprint version of the article. The official published version can be obtained from the link. Copyright 2008 Elsevier Ltd. In this paper, the problem of stability analysis for a class of impulsive stochastic Cohen–Grossberg neural networks with mixed delays is considered. The mixed time delays comprise both the time-varying and infinite distributed delays. By employing a combination of the M-matrix theory and stochastic analysis technique, a sufficient condition is obtained to ensure the existence, uniqueness, and exponential p-stability of the equilibrium point for the addressed impulsive stochastic Cohen–Grossberg neural network with mixed delays. The proposed method, which does not make use of the Lyapunov functional, is shown to be simple yet effective for analyzing the stability of impulsive or stochastic neural networks with variable and/or distributed delays. We then extend our main results to the case where the parameters contain interval uncertainties. Moreover, the exponential convergence rate index is estimated, which depends on the system parameters. An example is given to show the effectiveness of the obtained results. This work was supported by the Natural Science Foundation of CQ CSTC under grant 2007BB0430, the Scientific Research Fund of Chongqing Municipal Education Commission under Grant KJ070401, an International Joint Project sponsored by the Royal Society of the UK and the National Natural Science Foundation of China, and the Alexander von Humboldt Foundation of Germany.
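
    The M-matrix condition underlying the analysis is straightforward to verify numerically. The following sketch (an illustration under assumed data, not the paper's procedure) tests one standard characterization: a Z-matrix, i.e. a matrix with non-positive off-diagonal entries, is a nonsingular M-matrix exactly when all of its eigenvalues have positive real parts.

    # Check whether a square matrix is a nonsingular M-matrix.
    import numpy as np

    def is_nonsingular_m_matrix(m: np.ndarray, tol: float = 1e-10) -> bool:
        off_diag = m - np.diag(np.diag(m))
        if np.any(off_diag > tol):                 # must be a Z-matrix
            return False
        return bool(np.all(np.linalg.eigvals(m).real > tol))

    # Hypothetical example: a diagonally dominant Z-matrix.
    m = np.array([[ 2.0, -0.5],
                  [-0.3,  1.5]])
    print(is_nonsingular_m_matrix(m))              # expected: True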

    New Stability Criterion for Takagi-Sugeno Fuzzy Cohen-Grossberg Neural Networks with Probabilistic Time-Varying Delays

    A new global asymptotic stability criterion for Takagi-Sugeno fuzzy Cohen-Grossberg neural networks with probabilistic time-varying delays is derived, in which the diffusion term can play its role. Because the boundedness conditions on the amplification functions are removed, the main result is novel to some extent. A further methodological novelty is that the Lyapunov-Krasovskii functional is a positive definite form of pth powers, which differs from those used in the existing literature. Moreover, a numerical example illustrates the effectiveness of the proposed methods.
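
    A functional of this kind typically combines pth powers of the states with delayed integral terms, for example (an illustrative sketch only; the paper's exact construction may differ):

    $$ V(t, x_t) = \sum_{i=1}^{n} |x_i(t)|^{p} + \sum_{i=1}^{n} c_i \int_{t-\tau(t)}^{t} |x_i(s)|^{p}\, ds, \qquad c_i > 0,\; p \ge 2. $$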

    Existence and stability of a periodic solution of a general difference equation with applications to neural networks with a delay in the leakage terms

    In this paper, a new global exponential stability criterion is obtained for a general multidimensional delay difference equation using induction arguments. In the case that the difference equation is periodic, we prove the existence of a periodic solution by constructing a type of Poincaré map. The results are used to obtain stability criteria for a general discrete-time neural network model with a delay in the leakage terms. As particular cases, we obtain new stability criteria for neural network models recently studied in the literature, in particular for low-order and high-order Hopfield and Bidirectional Associative Memory (BAM) networks. Comment: 20 pages, 3 figures.
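
    A prototypical discrete-time network with a delay in the leakage term, written here only as a generic illustration of the class the abstract refers to, is

    $$ x_i(n+1) = a_i\, x_i(n-\sigma) + \sum_{j=1}^{m} b_{ij}\, f_j\big(x_j(n-\tau_{ij}(n))\big) + I_i(n), \qquad i = 1, \dots, m, $$

    where $\sigma \ge 1$ is the leakage delay; criteria of the kind described give conditions on $a_i$, $b_{ij}$ and the activations $f_j$ under which solutions converge exponentially, and a periodic input $I_i(n)$ yields a periodic solution.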

    Combined Convex Technique on Delay-Distribution-Dependent Stability for Delayed Neural Networks

    Together with the Lyapunov-Krasovskii functional approach and an improved delay-partitioning idea, one novel sufficient condition is derived that guarantees a class of delayed neural networks to be asymptotically stable in the mean-square sense, in which the probabilistic variable delay and both limits on its variation can be measured. By combining the reciprocal convex technique with the convex combination technique, the criterion is presented via LMIs; its solvability depends heavily on the sizes of both the time-delay range and its variation, and the criterion can become much less conservative than existing ones by thinning the delay intervals. Finally, four numerical examples demonstrate that the proposed idea reduces conservatism more effectively than some earlier reported results.
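
    The reciprocal convex technique referred to above is usually applied through the two-term reciprocally convex combination lemma (stated here for reference; it is a standard result, not quoted from the paper): for $R \succ 0$, any matrix $S$ satisfying $\begin{bmatrix} R & S \\ S^{\top} & R \end{bmatrix} \succeq 0$, and any $\alpha \in (0, 1)$,

    $$ \frac{1}{\alpha}\, x_1^{\top} R\, x_1 + \frac{1}{1-\alpha}\, x_2^{\top} R\, x_2 \;\ge\; \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}^{\top} \begin{bmatrix} R & S \\ S^{\top} & R \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}, $$

    which bounds the $1/\alpha$ and $1/(1-\alpha)$ terms produced by splitting the delay interval with a single LMI constraint.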