
    Chaos and Asymptotical Stability in Discrete-time Neural Networks

    This paper theoretically proves, by applying Marotto's theorem, that both transiently chaotic neural networks (TCNN) and discrete-time recurrent neural networks (DRNN) have a chaotic structure. A significant property of TCNN and DRNN is that they have only one fixed point when the absolute values of the self-feedback connection weights in TCNN and of the difference time in DRNN are sufficiently large. We show that this unique fixed point can actually evolve into a snap-back repeller, which generates chaotic structure, if several conditions are satisfied. On the other hand, by using Lyapunov functions, we also derive sufficient conditions for the asymptotic stability of symmetric versions of both TCNN and DRNN, under which they asymptotically converge to a fixed point. Furthermore, generic bifurcations are also considered in this paper. Since TCNN and DRNN are simple and general rather than special models, the obtained theoretical results hold for a wide class of discrete-time neural networks. To better demonstrate the theoretical results, several numerical simulations are provided as illustrative examples.
    Comment: This paper will be published in Physica D. Figures should be requested from the first author.
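    As a rough illustration of the mechanism described in this abstract, the sketch below iterates a single transiently chaotic neuron of Chen-Aihara type with an annealed self-feedback term; the update rule and all parameter values are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of one transiently chaotic neuron (Chen-Aihara type);
# parameters are illustrative assumptions, not from the paper.
import numpy as np

def tcnn_neuron(steps=2000, k=0.9, beta=0.002, z0=0.08, I0=0.65, eps=1/250):
    """While the self-feedback z(t) is large, the unique fixed point acts as
    a snap-back repeller and the iterates wander chaotically; as z(t) decays
    to zero the dynamics converge, i.e. the chaos is transient."""
    y, z = 0.283, z0
    xs = []
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-y / eps))  # neuron output (steep sigmoid)
        y = k * y - z * (x - I0)            # damped state with self-feedback
        z = (1.0 - beta) * z                # exponentially annealed feedback
        xs.append(x)
    return np.array(xs)

trace = tcnn_neuron()
print(trace[:5])   # irregular early iterates
print(trace[-5:])  # settled late iterates, after z has decayed
```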

    Non-Euclidean Contraction Analysis of Continuous-Time Neural Networks

    Critical questions in dynamical neuroscience and machine learning are related to the study of continuous-time neural networks and their stability, robustness, and computational efficiency. These properties can be simultaneously established via a contraction analysis. This paper develops a comprehensive non-Euclidean contraction theory for continuous-time neural networks. First, for non-Euclidean ℓ1/ℓ∞ logarithmic norms, we establish quasiconvexity with respect to positive diagonal weights and closed-form worst-case expressions over certain matrix polytopes. Second, for locally Lipschitz maps (e.g., arising as activation functions), we show that their one-sided Lipschitz constant equals the essential supremum of the logarithmic norm of their Jacobian. Third and finally, we apply these general results to classes of continuous-time neural networks, including Hopfield, firing-rate, Persidskii, Lur'e, and other models. For each model, we compute the optimal contraction rate and corresponding weighted non-Euclidean norm via a linear program or, in some special cases, via a Hurwitz condition on the Metzler majorant of the synaptic matrix. Our non-Euclidean analysis also establishes absolute, connective, and total contraction properties.
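    To make the Metzler-majorant route concrete, here is a small sketch assuming a Hopfield-type model x' = -x + W phi(x) with 1-Lipschitz activations; the matrix W is an arbitrary example, and using the dominant eigenvector of the majorant as the diagonal weight is the standard Perron argument rather than the paper's full construction.

```python
# Sketch: l-infinity logarithmic norm and the Metzler-majorant Hurwitz test
# for x' = -x + W*phi(x) with 1-Lipschitz activations; W is an example.
import numpy as np

def mu_inf(A):
    """l-infinity log norm: max_i (a_ii + sum over j != i of |a_ij|)."""
    off = np.abs(A).astype(float)
    np.fill_diagonal(off, 0.0)
    return np.max(np.diag(A) + off.sum(axis=1))

def metzler_majorant(A):
    """Keep the diagonal, take absolute values off the diagonal."""
    M = np.abs(A).astype(float)
    np.fill_diagonal(M, np.diag(A))
    return M

W = np.array([[0.2, -0.4],
              [0.3,  0.1]])
Jmaj = -np.eye(2) + metzler_majorant(W)       # majorant of every Jacobian
alpha = np.max(np.linalg.eigvals(Jmaj).real)  # Hurwitz iff alpha < 0
# For an (irreducible) Metzler matrix the dominant eigenvector is positive,
# and weighting by it makes mu_inf attain the spectral abscissa alpha.
vals, vecs = np.linalg.eig(Jmaj)
eta = np.abs(vecs[:, np.argmax(vals.real)].real)
D = np.diag(eta)
print("contraction rate:", -alpha)
print("weighted mu_inf :", mu_inf(np.linalg.inv(D) @ Jmaj @ D))
```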

    Cluster synchronization in an ensemble of neurons interacting through chemical synapses

    In networks of periodically firing spiking neurons interconnected by chemical synapses, we analyze cluster states, in which the ensemble of neurons is subdivided into a few clusters and neurons within each cluster exhibit perfect synchronization. To clarify the stability of a cluster state, we decompose the linear stability of the solution into two types: stability of the mean state and stabilities of the individual clusters. Computing Floquet matrices for these stabilities, we determine the total stability of a cluster state for any type of neuron and any interaction strength, even when the network size is infinitely large. First, we apply this stability analysis to synchronization in a large ensemble of integrate-and-fire (IF) neurons. In the one-cluster state we find a change of stability of a cluster, which elucidates that in-phase synchronization of IF neurons occurs only with inhibitory synapses. Then, we investigate the entrainment of two clusters of IF neurons with different excitability. IF neurons with fast-decaying synapses show low entrainment capability, which is explained by a pitchfork bifurcation appearing in the two-cluster state as the synaptic decay time constant changes. Second, we analyze the one-cluster state of Hodgkin-Huxley (HH) neurons and discuss the difference in synchronization properties between IF neurons and HH neurons.
    Comment: Notation for the Jacobi matrix is changed. Accepted for publication in Phys. Rev.
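    The Floquet computation underlying this analysis can be sketched generically: integrate the variational equation along one period of the orbit and read the multipliers off the monodromy matrix. In the sketch below the van der Pol oscillator stands in for a periodically firing neuron model, and the period is a rough numerical value for mu = 1; both are illustrative assumptions.

```python
# Sketch: Floquet multipliers of a periodic orbit via the monodromy matrix,
# integrating dPhi/dt = J(x(t)) Phi over one period alongside the state.
import numpy as np
from scipy.integrate import solve_ivp

MU = 1.0

def f(x):
    """Van der Pol field, a stand-in for a spiking-neuron limit cycle."""
    return np.array([x[1], MU * (1 - x[0] ** 2) * x[1] - x[0]])

def jac(x):
    return np.array([[0.0, 1.0],
                     [-2 * MU * x[0] * x[1] - 1.0, MU * (1 - x[0] ** 2)]])

def rhs(t, y):
    x, Phi = y[:2], y[2:].reshape(2, 2)
    return np.concatenate([f(x), (jac(x) @ Phi).ravel()])

x0 = np.array([2.0, 0.0])                 # point near the limit cycle
T = 6.66                                  # approximate period for MU = 1
y0 = np.concatenate([x0, np.eye(2).ravel()])
sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-9, atol=1e-9)
M = sol.y[2:, -1].reshape(2, 2)           # monodromy (Floquet) matrix
print(np.linalg.eigvals(M))               # one multiplier ~ 1 (phase shift);
                                          # stable if the rest lie in |z| < 1
```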

    Sensitivity and stability: A signal propagation sweet spot in a sheet of recurrent centre crossing neurons

    In this paper we demonstrate that signal propagation across a laminar sheet of recurrent neurons is maximised when two conditions are met. First, neurons must be in the so-called centre crossing configuration. Second, the network’s topology and weights must be such that the network comprises strongly coupled nodes, yet lies within the weakly coupled regime. We develop tools from linear stability analysis with which to describe this regime, and use them to examine the apparent tension between the sensitivity and instability of centre crossing networks.
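    A minimal sketch of the centre-crossing configuration, assuming the standard CTRNN equations with a logistic activation: setting each bias to minus half the sum of its incoming weights centres the neuron's sigmoid over its attainable input range. The weights are random illustrative values, not the paper's laminar sheet.

```python
# Sketch: centre-crossing biases for a CTRNN, y' = -y + W @ sigma(y + theta),
# where W[i, j] is the weight from neuron j to neuron i; values illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 16
W = rng.normal(0.0, 1.5, (n, n))

# Since sigma lies in [0, 1], neuron i's input lies between the sums of its
# negative and of its positive incoming weights; theta_i = -(row_i sum)/2
# centres the sigmoid over that range (the centre-crossing condition).
theta = -W.sum(axis=1) / 2.0

def sigma(x):
    return 1.0 / (1.0 + np.exp(-x))

y = rng.normal(0.0, 0.1, n)
dt = 0.01
for _ in range(2000):
    y = y + dt * (-y + W @ sigma(y + theta))  # forward-Euler integration
print(np.round(sigma(y + theta)[:5], 3))      # first few neuron outputs
```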

    Non-Euclidean Contractivity of Recurrent Neural Networks

    Critical questions in dynamical neuroscience and machine learning are related to the study of recurrent neural networks and their stability, robustness, and computational efficiency. These properties can be simultaneously established via a contraction analysis. This paper develops a comprehensive contraction theory for recurrent neural networks. First, for non-Euclidean ℓ1/ℓ∞ logarithmic norms, we establish quasiconvexity with respect to positive diagonal weights and closed-form worst-case expressions over certain matrix polytopes. Second, for locally Lipschitz maps (e.g., arising as activation functions), we show that their one-sided Lipschitz constant equals the essential supremum of the logarithmic norm of their Jacobian. Third and finally, we apply these general results to classes of recurrent neural circuits, including Hopfield, firing-rate, Persidskii, Lur’e, and other models. For each model, we compute the optimal contraction rate and corresponding weighted non-Euclidean norm via a linear program or, in some special cases, via a Hurwitz condition on the Metzler majorant of the synaptic matrix. Our non-Euclidean analysis also establishes absolute, connective, and total contraction properties.
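    This abstract names the same two computational routes as the arXiv version above; the earlier sketch covered the Hurwitz test on the Metzler majorant, so the sketch below covers the linear-programming route instead. For a Metzler matrix M, a rate c is achievable in some weighted l-infinity norm iff some eta > 0 satisfies (M + cI) eta <= 0, so we bisect on c with a feasibility LP at each step; the matrix and the normalisation eta >= 1 are illustrative assumptions.

```python
# Sketch: optimal weighted l-infinity contraction rate via linear programming.
# For a Metzler matrix M, rate c is achievable iff some eta > 0 satisfies
# (M + c*I) eta <= 0; we bisect on c. M below is an example majorant.
import numpy as np
from scipy.optimize import linprog

W = np.array([[0.2, -0.4],
              [0.3,  0.1]])
M = -np.eye(2) + np.abs(W)        # Metzler majorant of -I + W
n = M.shape[0]

def feasible(rate):
    """Feasibility LP: minimise 0 s.t. (M + rate*I) eta <= 0, eta >= 1."""
    res = linprog(c=np.zeros(n), A_ub=M + rate * np.eye(n), b_ub=np.zeros(n),
                  bounds=[(1.0, None)] * n, method="highs")
    return res.success, res.x

lo, hi, eta = 0.0, 10.0, None
for _ in range(60):               # bisection on the contraction rate
    mid = 0.5 * (lo + hi)
    ok, x = feasible(mid)
    if ok:
        lo, eta = mid, x
    else:
        hi = mid
print("optimal rate ~", round(lo, 4), "with weights eta =", eta)
```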

    Consensus analysis of multiagent networks via aggregated and pinning approaches

    This is the post-print version of the article. Copyright © 2011 IEEE.
    In this paper, the consensus problem of multiagent nonlinear directed networks (MNDNs) is discussed in the case that an MNDN does not have a spanning tree to reach consensus of all nodes. By using Lie algebra theory, a linear node-and-node pinning method is proposed to achieve consensus of an MNDN for all nonlinear functions satisfying a given set of conditions. Based on some optimal algorithms, large-size networks are aggregated into small-size ones. Then, by applying principal minor theory to the small-size networks, a sufficient condition is given to reduce the number of controlled nodes. Finally, simulation results are given to illustrate the effectiveness of the developed criteria.
    This work was jointly supported by CityU under a research grant (7002355) and GRF funding (CityU 101109).
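    To see why pinning is needed when no spanning tree exists, consider a toy sketch: a four-node digraph made of two disconnected pairs cannot reach consensus by diffusion alone, but pinning one node in each component to a shared reference restores agreement. The graph, gains, and linear dynamics below are illustrative simplifications of the paper's nonlinear setting.

```python
# Sketch: pinning control on a network with no spanning tree. Two isolated
# components each get one pinned node, driving all states to the reference.
import numpy as np

A = np.zeros((4, 4))
A[0, 1] = A[1, 0] = 1.0      # component {0, 1}
A[2, 3] = A[3, 2] = 1.0      # component {2, 3}; no spanning tree overall
L = np.diag(A.sum(axis=1)) - A               # graph Laplacian

pin = np.array([1.0, 0.0, 1.0, 0.0])         # pinning gains: nodes 0 and 2
s = 0.5                                      # common reference state

x = np.array([0.0, 1.0, 2.0, 3.0])           # initial disagreement
dt = 0.01
for _ in range(5000):
    x = x + dt * (-L @ x - pin * (x - s))    # diffusion + pinning feedback
print(np.round(x, 4))                        # all states approach s = 0.5
```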

    Global Exponential Stability of Delayed Periodic Dynamical Systems

    In this paper, we discuss delayed periodic dynamical systems and compare the capability of criteria for global exponential stability in terms of various L^p (1 ≤ p < ∞) norms. A general approach to investigating global exponential stability in terms of these norms is given, and sufficient conditions ensuring global exponential stability are derived. The resulting stability criteria are then compared. More importantly, it is pointed out that sufficient conditions in terms of the L^1 norm are enough and easy to implement in practice.
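    As an example of how easily an L^1-type condition can be checked, the sketch below implements a column-sum criterion for a delayed network with Lipschitz activations. The specific inequality d_j > L_j * sum_i (|a_ij| + |b_ij|) and all matrices are illustrative assumptions standing in for the paper's family of criteria.

```python
# Sketch: a column-sum (L^1-type) sufficient test for global exponential
# stability of x_i' = -d_i x_i + sum_j a_ij f_j(x_j(t)) + b_ij f_j(x_j(t - tau)),
# with each f_j Lipschitz of constant Lip_j. All values are illustrative.
import numpy as np

d = np.array([2.5, 2.5, 3.0])               # decay rates
A = np.array([[ 0.3, -0.2,  0.1],
              [ 0.1,  0.4, -0.3],
              [-0.2,  0.1,  0.5]])           # instantaneous weights
B = 0.5 * np.ones((3, 3))                    # delayed weights
Lip = np.ones(3)                             # Lipschitz constants of the f_j

col = (np.abs(A) + np.abs(B)).sum(axis=0)    # column sums, one per neuron j
print("column sums:", col)                   # here: [2.1, 2.2, 2.4]
print("L^1 criterion satisfied:", bool(np.all(d > Lip * col)))
```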