309 research outputs found

    Stability analysis of impulsive stochastic Cohen–Grossberg neural networks with mixed time delays

    This is the post-print version of the article; the official published version can be obtained from the link. Copyright 2008 Elsevier Ltd. In this paper, the problem of stability analysis for a class of impulsive stochastic Cohen–Grossberg neural networks with mixed delays is considered. The mixed time delays comprise both time-varying and infinite distributed delays. By employing a combination of M-matrix theory and stochastic analysis techniques, a sufficient condition is obtained to ensure the existence, uniqueness, and exponential p-stability of the equilibrium point for the addressed impulsive stochastic Cohen–Grossberg neural network with mixed delays. The proposed method, which does not make use of a Lyapunov functional, is shown to be simple yet effective for analyzing the stability of impulsive or stochastic neural networks with variable and/or distributed delays. We then extend our main results to the case where the parameters contain interval uncertainties. Moreover, the exponential convergence rate index, which depends on the system parameters, is estimated. An example is given to show the effectiveness of the obtained results. This work was supported by the Natural Science Foundation of CQ CSTC under Grant 2007BB0430, the Scientific Research Fund of Chongqing Municipal Education Commission under Grant KJ070401, an International Joint Project sponsored by the Royal Society of the UK and the National Natural Science Foundation of China, and the Alexander von Humboldt Foundation of Germany.
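    The M-matrix condition in criteria of this kind can be checked mechanically: a nonsingular M-matrix is a Z-matrix (non-positive off-diagonal entries) whose leading principal minors are all positive. As a rough illustration (a generic check of my own, not the paper's specific criterion matrix), a minimal pure-Python sketch:

```python
def is_nonsingular_M_matrix(A):
    """Check the nonsingular M-matrix property: A is a Z-matrix
    (off-diagonal entries <= 0) with all leading principal minors > 0."""
    n = len(A)
    # Z-matrix test: every off-diagonal entry must be non-positive
    for i in range(n):
        for j in range(n):
            if i != j and A[i][j] > 0:
                return False

    # small recursive determinant by cofactor expansion (fine for tiny matrices)
    def det(M):
        if len(M) == 1:
            return M[0][0]
        return sum((-1) ** j * M[0][j]
                   * det([row[:j] + row[j + 1:] for row in M[1:]])
                   for j in range(len(M)))

    # leading principal minors: determinants of the top-left k-by-k blocks
    return all(det([row[:k] for row in A[:k]]) > 0 for k in range(1, n + 1))

# Diagonally dominant Z-matrix -> nonsingular M-matrix (minors 3 and 5)
print(is_nonsingular_M_matrix([[3.0, -1.0], [-1.0, 2.0]]))   # True
# Z-matrix whose second minor is 1 - 4 = -3 -> not an M-matrix
print(is_nonsingular_M_matrix([[1.0, -2.0], [-2.0, 1.0]]))   # False
```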

    Asymptotic Stability and Exponential Stability of Impulsive Delayed Hopfield Neural Networks

    A criterion for the uniform asymptotic stability of the equilibrium point of impulsive delayed Hopfield neural networks is presented by using Lyapunov functions and the linear matrix inequality approach. The criterion is a less restrictive version of a recent result. By constructing an extended impulsive Halanay inequality, we also analyze the exponential stability of impulsive delayed Hopfield neural networks. Some new sufficient conditions ensuring exponential stability of the equilibrium point of impulsive delayed Hopfield neural networks are obtained. An example showing the effectiveness of the present criterion is given.
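    The Halanay mechanism behind such exponential estimates can be seen in the extremal scalar case (a toy sketch with illustrative parameters, not the paper's network): when the instantaneous decay rate a exceeds the delayed gain b, the solution of x'(t) = -a x(t) + b sup of x over [t - tau, t] decays exponentially.

```python
# Euler simulation of the extremal delayed equation behind Halanay's
# inequality: x'(t) = -a*x(t) + b * sup_{t-tau <= s <= t} x(s).
# With a > b > 0 the trajectory decays exponentially. All parameters
# here are illustrative choices.
a, b, tau, h, T = 2.0, 1.0, 1.0, 0.01, 20.0
delay = int(tau / h)
x = [1.0] * (delay + 1)                  # constant initial history on [-tau, 0]
for n in range(int(T / h)):
    sup = max(x[-(delay + 1):])          # sup of x over the last delay window
    x.append(x[-1] + h * (-a * x[-1] + b * sup))
print(x[-1])                             # far below the initial value 1.0
```

The decay rate is the root of lambda = a - b * exp(lambda * tau), roughly 0.44 for these parameters, so by t = 20 the state is a few times 1e-4.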

    Delay-dependent criterion for exponential stability analysis of neural networks with time-varying delays

    This note investigates the problem of exponential stability of neural networks with time-varying delays. To derive a less conservative stability condition, a novel augmented Lyapunov–Krasovskii functional (LKF) which includes triple- and quadruple-integral terms is employed. In order to reduce the complexity of the stability test, the convex combination method is utilized to derive an improved delay-dependent stability criterion in the form of linear matrix inequalities (LMIs). The superiority of the proposed approach is demonstrated by two comparative examples.

    Global exponential stability of impulsive high-order Hopfield type neural networks with delays

    In this paper, we investigate the global exponential stability of impulsive high-order Hopfield type neural networks with delays. By establishing impulsive delay differential inequalities and using the Lyapunov method, two sufficient conditions that guarantee global exponential stability of these networks are given, and the exponential convergence rate is also obtained. A numerical example is given to demonstrate the validity of the results.
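    The trade-off such conditions capture, continuous decay versus impulsive perturbation, already shows up in a scalar toy model (an illustration with made-up parameters, not the paper's high-order network): decay at rate a between impulses beats jumps of size d spaced dt_imp apart whenever a > ln(d)/dt_imp.

```python
import math

# Toy impulsive system: x'(t) = -a*x(t) between impulse times t_k = k*dt_imp,
# with jumps x(t_k+) = d*x(t_k), d > 1. The state still decays exponentially
# because the continuous decay dominates: a > ln(d)/dt_imp.
a, d, dt_imp, K = 1.0, 1.5, 1.0, 10   # decay rate, jump gain, impulse spacing
x = 1.0
for k in range(K):
    x *= math.exp(-a * dt_imp)        # exact flow between impulses
    x *= d                            # impulsive jump
rate = a - math.log(d) / dt_imp       # effective exponential rate, ~0.59 here
print(x, math.exp(-rate * K * dt_imp))
```

Because both factors are exact, x equals exp(-rate * K * dt_imp) up to rounding, making the effective convergence rate explicit.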

    Almost periodic solutions of retarded SICNNs with functional response on piecewise constant argument

    We consider a new model for shunting inhibitory cellular neural networks: retarded functional differential equations with piecewise constant argument. The existence and exponential stability of almost periodic solutions are investigated. An illustrative example is provided. Comment: 24 pages, 1 figure.
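    Why a piecewise constant argument makes stability tractable can be seen in the simplest scalar case (a toy equation of my own, far simpler than a SICNN): on each interval [n, n+1) the equation x'(t) = -a x(t) + b x(floor(t)) solves in closed form, so the dynamics reduce to a linear map on the values at integer times.

```python
import math

# Toy equation with piecewise constant argument:
#   x'(t) = -a*x(t) + b*x(floor(t)).
# Integrating over [n, n+1) gives x(n+1) = m * x(n) with the one-step
# multiplier m below; |m| < 1 means exponential decay of the integer-time
# values. Parameters are illustrative.
a, b = 2.0, 1.0
m = math.exp(-a) + (b / a) * (1.0 - math.exp(-a))   # one-step multiplier
x = 1.0
for n in range(30):
    x *= m
print(abs(m) < 1.0, x)   # |m| < 1 here, so x(30) is tiny
```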

    Recent Advances and Applications of Fractional-Order Neural Networks

    This paper focuses on the growth, development, and future of various forms of fractional-order neural networks. Multiple advances in structure, learning algorithms, and methods have been critically investigated and summarized. This also includes the recent trends in the dynamics of various fractional-order neural networks. The multiple forms of fractional-order neural networks considered in this study are Hopfield, cellular, memristive, complex, and quaternion-valued based networks. Further, the application of fractional-order neural networks in various computational fields such as system identification, control, optimization, and stability has been critically analyzed and discussed.
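    The basic ingredient of such networks, a fractional derivative in the neuron dynamics, can be discretized directly. A hedged sketch (my own minimal example using a Grünwald–Letnikov scheme for the scalar fractional relaxation equation, not code from the survey):

```python
# Explicit Gruenwald-Letnikov discretization of the fractional relaxation
# equation D^alpha (x - x(0)) = -lam * x(t), the scalar building block of
# fractional-order neuron dynamics. Parameters are illustrative.
alpha, lam, h, N = 0.8, 1.0, 0.01, 500
# GL coefficients c_k = (-1)^k * binom(alpha, k), via the standard recurrence
c = [1.0]
for k in range(1, N + 1):
    c.append(c[-1] * (1.0 - (1.0 + alpha) / k))
x = [1.0]                                # x(0) = 1
for n in range(1, N + 1):
    # sum_{k=0}^{n} c_k * (x_{n-k} - x0) = -lam * h^alpha * x_{n-1}
    hist = sum(c[k] * (x[n - k] - x[0]) for k in range(1, n + 1))
    x.append(x[0] - lam * h**alpha * x[n - 1] - hist)
print(x[1], x[-1])   # slow, power-law-like decay toward zero
```

Unlike the exponential decay of the integer-order equation, the trajectory follows a Mittag-Leffler profile, decaying fast at first and then with a heavy tail, which is exactly the memory effect fractional-order networks exploit.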

    Synthesis of neural networks for spatio-temporal spike pattern recognition and processing

    The advent of large-scale neural computational platforms has highlighted the lack of algorithms for the synthesis of neural structures to perform predefined cognitive tasks. The Neural Engineering Framework offers one such synthesis, but it is most effective for a spike-rate representation of neural information, and it requires a large number of neurons to implement simple functions. We describe a neural network synthesis method that generates synaptic connectivity for neurons which process time-encoded neural signals, and which makes very sparse use of neurons. The method allows the user to specify, arbitrarily, neuronal characteristics such as axonal and dendritic delays and synaptic transfer functions, and then solves for the optimal input-output relationship using computed dendritic weights. The method may be used for batch or online learning and has an extremely fast optimization process. We demonstrate its use in generating a network to recognize speech which is sparsely encoded as spike times. Comment: In submission to Frontiers in Neuromorphic Engineering.