101 research outputs found

    Improved synchronization analysis of competitive neural networks with time-varying delays

    Synchronization and control are two very important aspects of any dynamical system. Among the various kinds of nonlinear systems, the competitive neural network holds an important place because of its applications in diverse fields. The model is general enough to include, as subclasses, the most famous neural network models such as competitive neural networks, cellular neural networks and Hopfield neural networks. In this paper, the problem of designing a feedback controller that guarantees synchronization of competitive neural networks with time-varying delays is investigated. The goal of this work is to derive an existence criterion for a controller achieving exponential synchronization between drive and response neutral-type competitive neural networks with time-varying delays. The method used in this brief constructs a feedback control gain matrix by means of Lyapunov stability theory, and the synchronization conditions are given in terms of LMIs. To the best of our knowledge, the results presented here are novel and generalize some previous results. Numerical simulations are also presented graphically to validate the effectiveness and advantages of the theoretical results.
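
    As a rough illustration of the drive-response setup described above (the notation below is generic and simplified to a single state layer; it is an assumption, not taken from the paper, whose competitive networks also carry long-term memory states), exponential synchronization under linear feedback can be sketched as

        \dot{x}(t) = -C x(t) + A f(x(t)) + B f(x(t-\tau(t)))              % drive system
        \dot{y}(t) = -C y(t) + A f(y(t)) + B f(y(t-\tau(t))) + u(t)       % response system
        u(t) = K e(t), \qquad e(t) = y(t) - x(t)                          % linear feedback controller
        \|e(t)\| \le M e^{-\varepsilon t} \sup_{s \in [-\bar{\tau}, 0]} \|e(s)\|, \qquad M \ge 1, \ \varepsilon > 0

    where the LMI conditions determine an admissible gain matrix K.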

    Piecewise pseudo almost periodic solutions of interval general BAM neural networks with mixed time-varying delays and impulsive perturbations

    This paper is concerned with piecewise pseudo almost periodic solutions of a class of interval general BAM neural networks with mixed time-varying delays and impulsive perturbations. By adopting the exponential dichotomy of linear differential equations and the fixed point theory of contraction mappings, sufficient conditions for the existence of piecewise pseudo almost periodic solutions of these networks are obtained. By using differential inequality techniques and mathematical induction, the global exponential stability of the piecewise pseudo almost periodic solutions is then established. An example is given to illustrate the effectiveness of the results obtained in the paper.
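
    For orientation only (the symbols below are assumptions, not definitions quoted from the abstract), an impulsive BAM network of the kind studied here can be written schematically as

        \dot{x}_i(t) = -a_i(t) x_i(t) + \sum_j p_{ji}(t) f_j(y_j(t - \tau_{ji}(t))) + I_i(t), \quad t \ne t_k
        \dot{y}_j(t) = -b_j(t) y_j(t) + \sum_i q_{ij}(t) g_i(x_i(t - \sigma_{ij}(t))) + J_j(t), \quad t \ne t_k
        \Delta x_i(t_k) = \gamma_{ik}(x_i(t_k^-)), \qquad \Delta y_j(t_k) = \delta_{jk}(y_j(t_k^-))

    and the fixed-point step amounts to showing that an associated solution operator is a contraction on the space of piecewise pseudo almost periodic functions.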

    Global exponential periodicity of nonlinear neural networks with multiple time-varying delays

    Global exponential periodicity of nonlinear neural networks with multiple time-varying delays is investigated. Such neural networks cannot be written in vector-matrix form because of the presence of multiple delays. Although neural networks with multiple time-varying delays have been investigated with the Lyapunov-Krasovskii functional method in the literature, sufficient conditions in linear matrix inequality form have not been obtained. Here, two sets of sufficient conditions in linear matrix inequality form are established via Lyapunov-Krasovskii functionals to ensure that two arbitrary solutions of the delayed network attract each other exponentially, which is a key prerequisite for proving the existence, uniqueness, and global exponential stability of periodic solutions. Examples are provided to demonstrate the effectiveness of the established results; a comparison with previous results shows that those results are not applicable to the systems in these examples.
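
    A minimal sketch of the Lyapunov-Krasovskii argument (generic notation assumed here, not quoted from the paper): for two arbitrary solutions x(t) and y(t), set z(t) = x(t) - y(t) and build a functional of the form

        V(t) = z^{\top}(t) P z(t) + \sum_{k=1}^{m} \int_{t - \tau_k(t)}^{t} z^{\top}(s) Q_k z(s) \, ds, \qquad P \succ 0, \ Q_k \succ 0

    The LMI conditions are chosen so that \dot{V}(t) \le -\epsilon \|z(t)\|^2 for some \epsilon > 0, which yields \|x(t) - y(t)\| \le M e^{-\alpha t} and hence the mutual exponential attraction used to establish the periodic solution.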

    Global exponential stability of nonautonomous neural network models with unbounded delays

    For a nonautonomous class of n-dimensional differential systems with infinite delays, we give sufficient conditions for global exponential stability without requiring the existence of an equilibrium point, a periodic solution, or an almost periodic solution. We apply our main result to several concrete neural network models studied in the literature, and a comparison of results is given. Contrary to what is usual in the neural network literature, the assumption of bounded coefficients is not needed to obtain global exponential stability. Finally, we present numerical examples to illustrate the effectiveness of our results. The paper was supported by the Research Center of Mathematics of the University of Minho with Portuguese funds from FCT - “Fundação para a Ciência e a Tecnologia”, through the Project UID/MAT/00013/2013. The author thanks the referees for valuable comments.
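
    As an informal reminder (the exact formulation in the paper may differ), global exponential stability here means that any two solutions of the nonautonomous delayed system approach each other exponentially:

        \|x(t, t_0, \varphi) - x(t, t_0, \psi)\| \le M \|\varphi - \psi\|_{\infty} e^{-\alpha (t - t_0)}, \qquad t \ge t_0

    for some constants M \ge 1 and \alpha > 0 that do not depend on the initial functions \varphi and \psi.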

    Modelling and Contractivity of Neural-Synaptic Networks with Hebbian Learning

    This paper is concerned with the modelling and analysis of two of the most commonly used recurrent neural network models (i.e., the Hopfield neural network and the firing-rate neural network) with dynamic recurrent connections undergoing Hebbian learning rules. To capture the synaptic sparsity of neural circuits, we propose a low-dimensional formulation. We then characterize certain key dynamical properties. First, we give biologically inspired forward invariance results. Then, we give sufficient conditions for the non-Euclidean contractivity of the models. Our contraction analysis yields stability and robustness of time-varying trajectories for networks with both excitatory and inhibitory synapses governed by both Hebbian and anti-Hebbian rules. For each model, we propose a contractivity test based upon biologically meaningful quantities, e.g., the neural and synaptic decay rates, the maximum in-degree, and the maximum synaptic strength. We also show that the models satisfy Dale's Principle. Finally, we illustrate the effectiveness of our results via a numerical example. (24 pages, 4 figures)
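
    To fix ideas (a schematic sketch; the paper's low-dimensional formulation and exact coefficients are not reproduced here), a Hopfield-type network coupled with Hebbian synaptic dynamics can be written as

        \dot{x}_i = -a_i x_i + \sum_j W_{ij} \Phi_j(x_j) + u_i, \qquad
        \dot{W}_{ij} = -b_{ij} W_{ij} + \eta \, \Phi_i(x_i) \Phi_j(x_j)

    and a contractivity test then asks that a suitable (non-Euclidean) matrix measure of the joint Jacobian in (x, W) be uniformly negative.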

    Symmetries, Stability, and Control in Nonlinear Systems and Networks

    This paper discusses the interplay of symmetries and stability in the analysis and control of nonlinear dynamical systems and networks. Specifically, it combines standard results on symmetries and equivariance with recent convergence analysis tools based on nonlinear contraction theory and virtual dynamical systems. This synergy between structural properties (symmetries) and convergence properties (contraction) is illustrated in the contexts of network motifs arising, e.g., in genetic networks, of invariance to environmental symmetries, and of imposing different patterns of synchrony in a network. (16 pages, second version)
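
    For context (generic notation, not quoted from the paper), the contraction condition behind these convergence tools requires a uniformly negative matrix measure of the Jacobian,

        \mu\left(\frac{\partial f}{\partial x}(x, t)\right) \le -\lambda < 0 \quad \text{for all } x, t
        \;\Rightarrow\; \|x_1(t) - x_2(t)\| \le e^{-\lambda (t - t_0)} \|x_1(t_0) - x_2(t_0)\|

    so that all trajectories of \dot{x} = f(x, t) converge to one another exponentially; symmetries (equivariance) then allow this convergence to be transferred to virtual systems encoding the desired synchrony pattern.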