
    Global exponential stability of impulsive high-order Hopfield type neural networks with delays

    In this paper, we investigate the global exponential stability of impulsive high-order Hopfield type neural networks with delays. By establishing impulsive delay differential inequalities and using the Lyapunov method, two sufficient conditions guaranteeing global exponential stability of these networks are given, and the exponential convergence rate is obtained as well. A numerical example demonstrates the validity of the results.
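    For orientation, a minimal sketch of the class of systems meant here, written in the standard notation for impulsive high-order Hopfield type networks (the symbols below are generic placeholders, not the paper's specific assumptions):

        \dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, f_j\big(x_j(t-\tau_{ij})\big) + \sum_{j=1}^{n}\sum_{l=1}^{n} b_{ijl}\, f_j\big(x_j(t-\tau_{ij})\big)\, f_l\big(x_l(t-\tau_{il})\big) + I_i, \qquad t \neq t_k,
        \Delta x_i(t_k) = x_i(t_k^{+}) - x_i(t_k^{-}) = J_{ik}\big(x_i(t_k^{-})\big), \qquad k = 1, 2, \ldots

    Here c_i > 0 are decay rates, a_{ij} and b_{ijl} are first- and second-order connection weights, \tau_{ij} \ge 0 are transmission delays, and J_{ik} describes the impulsive jumps; global exponential stability means every solution approaches the equilibrium at a rate O(e^{-\lambda t}) for some \lambda > 0, which is the convergence rate referred to above.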

    Global Exponential Stability of Weighted Pseudo-Almost Periodic Solutions of Neutral Type High-Order Hopfield Neural Networks with Distributed Delays

    Some sufficient conditions are obtained for the existence, uniqueness, and global exponential stability of weighted pseudo-almost periodic solutions to a class of neutral-type high-order Hopfield neural networks with distributed delays by employing the fixed point theorem and differential inequality techniques. The results of this paper are new and complement previously known results. Moreover, an example is given to show the effectiveness of the proposed method and results.
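    As background, a weighted pseudo-almost periodic function is commonly defined as an almost periodic function perturbed by a term that vanishes in a weighted ergodic mean; one standard formulation (notation assumed here, not quoted from the paper) is

        f(t) = h(t) + \varphi(t), \qquad \lim_{r \to \infty} \frac{1}{\int_{-r}^{r} \rho(t)\,dt} \int_{-r}^{r} \lVert \varphi(t) \rVert\, \rho(t)\, dt = 0,

    where h is almost periodic and \rho is a positive weight function. Existence and uniqueness of such a solution is then typically obtained by showing that the relevant solution map is a contraction on this function space, which is where the fixed point theorem enters; the differential inequalities supply the exponential stability estimate.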

    Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis

    This Letter is concerned with the exponential stability analysis problem for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, and the feasibility of such an LMI can be checked easily using the numerically efficient MATLAB LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China (05KJB110154), the NSF of Jiangsu Province of China (BK2006064), and the National Natural Science Foundation of China (10471119).
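    Conditions of this kind are attractive because the resulting LMI can be checked numerically by any semidefinite-programming solver, not only the MATLAB LMI Toolbox mentioned above. As an illustration only, the snippet below tests a textbook delay-independent Lyapunov-Krasovskii LMI for a linear comparison system x(k+1) = A x(k) + Ad x(k - d) with constant delay d; it is not the specific LMI derived in this Letter, and the matrices are placeholders.

import numpy as np
import cvxpy as cp

# Placeholder system matrices for x(k+1) = A x(k) + Ad x(k - d).
n = 2
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
Ad = np.array([[0.10, 0.00],
               [0.05, 0.10]])

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6

# Block LMI coming from V(k) = x(k)^T P x(k) + sum_{i=k-d}^{k-1} x(i)^T Q x(i):
# feasibility of P > 0, Q > 0 and M < 0 gives exponential stability of the
# comparison system (a standard result, shown here purely for illustration).
M = cp.bmat([[A.T @ P @ A - P + Q, A.T @ P @ Ad],
             [Ad.T @ P @ A,        Ad.T @ P @ Ad - Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               M << -eps * np.eye(2 * n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print("LMI feasible:", prob.status == cp.OPTIMAL)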

    Non-Convex Multi-species Hopfield models

    In this work we introduce a multi-species generalization of the Hopfield model for associative memory, where neurons are divided into groups and both inter-group and intra-group pairwise interactions are considered, with different intensities. Thus, this system contains two of the main ingredients of modern deep neural network architectures: Hebbian interactions to store patterns of information and multiple layers coding different levels of correlations. The model is completely solvable in the low-load regime with a suitable generalization of the Hamilton-Jacobi technique, even though the Hamiltonian can be a non-definite quadratic form of the magnetizations. The family of multi-species Hopfield models includes, as special cases, the three-layer Restricted Boltzmann Machine (RBM) with Gaussian hidden layer and the Bidirectional Associative Memory (BAM) model. Comment: This is a pre-print of an article published in J. Stat. Phys.
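    To fix ideas, a sketch of the kind of Hamiltonian involved, assuming the usual Hebbian conventions (the grouping and the interaction intensities below are generic placeholders rather than the paper's exact definitions): with binary neurons \sigma_i = \pm 1 partitioned into groups A, B, \ldots, quenched patterns \xi_i^{\mu}, and group-dependent intensities \alpha_{AB},

        H_N(\sigma) = -\frac{1}{2N} \sum_{A,B} \alpha_{AB} \sum_{\mu=1}^{P} \sum_{i \in A} \sum_{j \in B} \xi_i^{\mu} \xi_j^{\mu} \sigma_i \sigma_j = -\frac{N}{2} \sum_{\mu=1}^{P} \sum_{A,B} \alpha_{AB}\, \frac{|A|\,|B|}{N^{2}}\, m_A^{\mu} m_B^{\mu} \quad (\text{up to diagonal terms}),

    where m_A^{\mu} = \frac{1}{|A|} \sum_{i \in A} \xi_i^{\mu} \sigma_i are the group (Mattis) magnetizations. When the matrix (\alpha_{AB}) is not positive definite, this quadratic form in the magnetizations is non-definite, which is the non-convexity referred to in the title.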

    Design of exponential state estimators for neural networks with mixed time delays

    In this Letter, the state estimation problem is dealt with for a class of recurrent neural networks (RNNs) with mixed discrete and distributed delays. The activation functions are assumed to be neither monotonic, nor differentiable, nor bounded. We aim at designing a state estimator that estimates the neuron states through available output measurements, such that the dynamics of the estimation error is globally exponentially stable in the presence of the mixed time delays. By using a Lyapunov–Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions guaranteeing the existence of the state estimators. We show that both the existence conditions and the explicit expression of the desired estimator can be characterized in terms of the solution to an LMI. A simulation example is exploited to show the usefulness of the derived LMI-based stability conditions. This work was supported in part by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant GR/S27658/01, the Nuffield Foundation of the UK under Grant NAL/00630/G, the Alexander von Humboldt Foundation of Germany, the Natural Science Foundation of Jiangsu Education Committee of China under Grants 05KJB110154 and BK2006064, and the National Natural Science Foundation of China under Grants 10471119 and 10671172.
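    The estimator sought in this kind of result is of the familiar Luenberger (observer) type; a generic template for an RNN with a discrete delay \tau(t) and a distributed delay over [t-h, t] might read (notation assumed here; the exact structure and assumptions in the Letter may differ):

        \dot{\hat{x}}(t) = -A \hat{x}(t) + W_0 f(\hat{x}(t)) + W_1 f\big(\hat{x}(t-\tau(t))\big) + W_2 \int_{t-h}^{t} f(\hat{x}(s))\, ds + K \big[\, y(t) - C \hat{x}(t) \,\big],

    where y(t) = C x(t) (possibly plus a known output nonlinearity) is the available measurement and K is the estimator gain. The design problem is to choose K so that the error e(t) = x(t) - \hat{x}(t) satisfies \lVert e(t) \rVert \le M e^{-\lambda t} for some M, \lambda > 0; in LMI-based results the admissible gain is typically recovered from the matrices solving the LMI, e.g. K = P^{-1} X for decision variables P and X.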

    Dynamical Analysis for High-Order Delayed Hopfield Neural Networks with Impulses

    The global exponential stability and uniform stability of the equilibrium point for high-order delayed Hopfield neural networks with impulses are studied. By utilizing the Lyapunov functional method, properties of negative definite matrices, and the linear matrix inequality approach, some new stability criteria for such systems are derived. The criteria depend on the size of the delays and impulses. Two examples are also given to illustrate the effectiveness of our results.
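    Criteria of this type are easy to sanity-check numerically. The sketch below integrates a toy first-order delayed Hopfield system with impulsive state jumps (parameters are placeholders chosen only for illustration, and the high-order terms are omitted) and prints the decay of the state norm towards the zero equilibrium:

import numpy as np

# Toy system: dx/dt = -C x(t) + A tanh(x(t - tau)), with impulses x -> 0.8 x at t = 1, 2, ...
C = np.diag([2.0, 2.5])
A = np.array([[0.30, -0.20],
              [0.10,  0.25]])
tau, dt, T = 0.5, 0.001, 10.0

steps = int(T / dt)
hist = int(tau / dt)                      # number of history samples on [-tau, 0)
x = np.zeros((hist + steps + 1, 2))
x[:hist + 1] = np.array([1.0, -0.8])      # constant initial function on [-tau, 0]

impulse_steps = {hist + int(round(t / dt)) for t in np.arange(1.0, T, 1.0)}

for k in range(hist, hist + steps):       # explicit Euler with a delay buffer
    dx = -C @ x[k] + A @ np.tanh(x[k - hist])
    x[k + 1] = x[k] + dt * dx
    if (k + 1) in impulse_steps:          # impulsive jump x -> 0.8 x at t_k
        x[k + 1] *= 0.8

for t in (0.0, 2.0, 4.0, 6.0, 8.0, 10.0):
    print(f"t = {t:4.1f}   ||x(t)|| = {np.linalg.norm(x[hist + int(t / dt)]):.3e}")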

    Collective stability of networks of winner-take-all circuits

    The neocortex has a remarkably uniform neuronal organization, suggesting that common principles of processing are employed throughout its extent. In particular, the patterns of connectivity observed in the superficial layers of the visual cortex are consistent with the recurrent excitation and inhibitory feedback required for cooperative-competitive circuits such as the soft winner-take-all (WTA). WTA circuits offer interesting computational properties such as selective amplification, signal restoration, and decision making. But these properties depend on the signal gain derived from positive feedback, so there is a critical trade-off between providing feedback strong enough to support sophisticated computations and maintaining overall circuit stability. We consider the question of how to reason about stability in very large distributed networks of such circuits. We approach this problem by approximating the regular cortical architecture as many interconnected cooperative-competitive modules. We demonstrate that by properly understanding the behavior of this small computational module, one can reason about the stability and convergence of very large networks composed of these modules. We obtain parameter ranges in which the WTA circuit operates in a high-gain regime, is stable, and can be aggregated arbitrarily to form large stable networks. We use nonlinear Contraction Theory to establish conditions for stability in the fully nonlinear case, and verify these solutions using numerical simulations. The derived bounds allow modes of operation in which the WTA network is multi-stable and exhibits state-dependent persistent activities. Our approach is sufficiently general to reason systematically about the stability of any network, biological or technological, composed of networks of small modules that express competition through shared inhibition. Comment: 7 figures.
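    For readers who want to experiment, a minimal soft WTA module of the kind discussed here can be written as a rate model with self-excitation and a shared inhibitory unit (the equations and parameter values below are a common textbook form chosen for illustration, not the specific circuit or parameter ranges analyzed in the paper):

import numpy as np

# Soft winner-take-all: N excitatory units with self-excitation alpha, all
# inhibited by one shared inhibitory unit driven by their summed activity.
N = 5
alpha, beta1, beta2 = 1.2, 2.0, 1.0      # E self-excitation, I -> E, E -> I weights
tau, dt = 0.01, 0.0005
relu = lambda u: np.maximum(u, 0.0)      # threshold-linear activation

x = np.zeros(N)                          # excitatory firing rates
y = 0.0                                  # shared inhibitory rate
inputs = np.array([1.0, 1.2, 0.9, 1.5, 1.1])   # the fourth unit receives the largest input

for _ in range(int(0.5 / dt)):           # integrate for 0.5 s with explicit Euler
    dx = (-x + relu(alpha * x - beta1 * y + inputs)) / tau
    dy = (-y + relu(beta2 * np.sum(x))) / tau
    x, y = x + dt * dx, y + dt * dy

print("steady-state rates:", np.round(x, 3))   # the largest-input unit dominates

    With the self-excitation kept below the level at which the positive feedback outruns the shared inhibition, the module settles into a state in which essentially only the strongest-input unit stays active; raising alpha further is exactly the gain-versus-stability trade-off discussed above.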