
    Non-Euclidean Contractivity of Recurrent Neural Networks

    Critical questions in dynamical neuroscience and machine learning concern recurrent neural networks and their stability, robustness, and computational efficiency. These properties can be established simultaneously via a contraction analysis. This paper develops a comprehensive contraction theory for recurrent neural networks. First, for non-Euclidean ℓ1/ℓ∞ logarithmic norms, we establish quasiconvexity with respect to positive diagonal weights and closed-form worst-case expressions over certain matrix polytopes. Second, for locally Lipschitz maps (e.g., those arising as activation functions), we show that their one-sided Lipschitz constant equals the essential supremum of the logarithmic norm of their Jacobian. Third and finally, we apply these general results to classes of recurrent neural circuits, including Hopfield, firing-rate, Persidskii, Lur’e, and other models. For each model, we compute the optimal contraction rate and the corresponding weighted non-Euclidean norm via a linear program or, in some special cases, via a Hurwitz condition on the Metzler majorant of the synaptic matrix. Our non-Euclidean analysis also establishes absolute, connective, and total contraction properties.
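Two quantities named in this abstract, the ℓ∞ logarithmic norm and the Metzler majorant, have simple closed forms and can be computed directly. The following is a minimal numpy sketch; the matrix `A` is a made-up example, not one from the paper:

```python
import numpy as np

def log_norm_inf(A):
    """l-infinity logarithmic norm: mu_inf(A) = max_i (a_ii + sum_{j != i} |a_ij|)."""
    A = np.asarray(A, dtype=float)
    off_diag = np.abs(A).sum(axis=1) - np.abs(np.diag(A))
    return float(np.max(np.diag(A) + off_diag))

def metzler_majorant(A):
    """Metzler majorant: keep the diagonal, take absolute values off the diagonal."""
    A = np.asarray(A, dtype=float)
    M = np.abs(A)
    np.fill_diagonal(M, np.diag(A))
    return M

# hypothetical synaptic matrix
A = np.array([[-3.0, 1.0],
              [-0.5, -2.0]])

mu = log_norm_inf(A)          # negative value certifies contraction in the inf-norm
M = metzler_majorant(A)
hurwitz = np.all(np.linalg.eigvals(M).real < 0)
```

A negative logarithmic norm certifies contraction in the corresponding norm; the paper goes further, optimizing the rate over weighted norms via a linear program.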

    Stability and synchronization of discrete-time Markovian jumping neural networks with mixed mode-dependent time delays

    Copyright [2009] IEEE. In this paper, we introduce a new class of discrete-time neural networks (DNNs) with Markovian jumping parameters and mode-dependent mixed time delays (both discrete and distributed). Specifically, the parameters of the DNNs switch from one mode to another at different times according to a Markov chain, and the mixed time delays consist of both discrete and distributed delays that depend on the Markovian jumping mode. We first address the stability analysis problem for these networks. A special inequality is developed to account for the mixed time delays in the discrete-time setting, and a novel Lyapunov-Krasovskii functional is put forward to reflect the mode-dependent time delays. Sufficient conditions guaranteeing stochastic stability are established in terms of linear matrix inequalities (LMIs). We then turn to the synchronization problem for an array of identical coupled Markovian jumping neural networks with mixed mode-dependent time delays. By utilizing Lyapunov stability theory and the Kronecker product, it is shown that the synchronization problem is solvable if several LMIs are feasible. Hence, in contrast to the commonly used matrix-norm methods (such as the M-matrix method), a unified LMI approach is developed to solve both the stability analysis and synchronization problems for this class of neural networks, where the LMIs can be solved efficiently with the MATLAB LMI toolbox. Two numerical examples illustrate the usefulness and effectiveness of the main results.
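The paper's LMIs involve delays and Markov modes and cannot be reproduced from the abstract alone. As a minimal delay-free sketch of the underlying idea, the basic discrete-time Lyapunov condition (find P > 0 with AᵀPA - P = -Q for some Q > 0) can be solved in plain numpy via Kronecker vectorization, a tool the paper also exploits; the matrix `A` below is hypothetical:

```python
import numpy as np

def solve_discrete_lyapunov(A, Q):
    """Solve A^T P A - P = -Q via Kronecker vectorization:
    (I - A^T kron A^T) vec(P) = vec(Q), with column-major vec."""
    n = A.shape[0]
    K = np.eye(n * n) - np.kron(A.T, A.T)
    vecP = np.linalg.solve(K, Q.flatten(order="F"))
    return vecP.reshape((n, n), order="F")

# a made-up Schur-stable mode matrix (spectral radius < 1)
A = np.array([[0.5, 0.1],
              [0.0, 0.8]])
Q = np.eye(2)

# P is symmetric positive definite, certifying asymptotic
# stability of x_{k+1} = A x_k
P = solve_discrete_lyapunov(A, Q)
```

Dedicated semidefinite-programming solvers generalize this to the inequality form with the extra Lyapunov-Krasovskii terms the paper requires.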

    Sensitivity and stability: A signal propagation sweet spot in a sheet of recurrent centre crossing neurons

    In this paper we demonstrate that signal propagation across a laminar sheet of recurrent neurons is maximised when two conditions are met. First, the neurons must be in the so-called centre-crossing configuration. Second, the network’s topology and weights must be such that the network comprises strongly coupled nodes yet remains within the weakly coupled regime. We develop tools from linear stability analysis with which to describe this regime, and use them to examine the apparent tension between the sensitivity and instability of centre-crossing networks.
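For concreteness, in a standard continuous-time recurrent network the centre-crossing configuration chooses each bias so that the sigmoid sits at its inflection point at equilibrium, and linear stability analysis then inspects the Jacobian there. A minimal sketch for a hypothetical two-neuron network (weights and time constants are made-up, not from the paper):

```python
import numpy as np

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

# hypothetical weights; tau_i = 1 for simplicity
W = np.array([[4.0, -6.0],
              [6.0, 0.0]])
tau = np.ones(2)

# centre-crossing biases: theta_i = -(sum_j w_ij) / 2
theta = -W.sum(axis=1) / 2.0

# at centre crossing there is an equilibrium where every sigmoid
# sits at its inflection point: sigma(y* + theta) = 1/2
y_star = W @ (0.5 * np.ones(2))

# Jacobian of tau_i y_i' = -y_i + sum_j w_ij sigma(y_j + theta_j)
# at y*, using sigma'(0) = 1/4
J = np.diag(1.0 / tau) @ (-np.eye(2) + 0.25 * W)
eigs = np.linalg.eigvals(J)
```

Eigenvalues with small negative real parts (here a lightly damped complex pair) sit near the sensitivity/instability boundary the paper analyses.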

    Synchronization of coupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delays

    Copyright @ 2013 IEEE. In this paper, the synchronization problem is studied for an array of N identical delayed neutral-type neural networks with Markovian jumping parameters. The coupled networks involve both mode-dependent discrete time delays and mode-dependent unbounded distributed time delays. All the network parameters, including the coupling matrix, also depend on the Markovian jumping mode. By introducing novel Lyapunov-Krasovskii functionals and using some analytical techniques, sufficient conditions are derived to guarantee that the coupled networks are asymptotically synchronized in mean square. The derived conditions are closely related to the discrete time delays, the distributed time delays, the mode transition probabilities, and the coupling structure of the networks. The obtained criteria are given in terms of matrix inequalities that can be solved efficiently by semidefinite programming. Numerical simulations further demonstrate the effectiveness of the proposed approach. This work was supported in part by the Royal Society of the U.K., the National Natural Science Foundation of China under Grants 61074129, 61174136 and 61134009, and the Natural Science Foundation of Jiangsu Province of China under Grants BK2010313 and BK2011598.
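The role of the Kronecker product in analysing an array of identical coupled networks can be shown in the simplest delay-free linear setting: diagonalizing the coupling matrix decouples the array into independent low-dimensional blocks. All matrices below are made-up examples; the paper's actual conditions additionally involve delays and Markov modes:

```python
import numpy as np

# hypothetical node dynamics A, inner coupling Gamma, coupling gain c,
# and a path-graph Laplacian L for a 3-node array
A = np.array([[0.0, 1.0],
              [-2.0, -1.0]])
Gamma = np.eye(2)
c = 1.5
L = np.array([[1.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 1.0]])
G = -c * L

# coupled array: x' = (I_N kron A + G kron Gamma) x
big = np.kron(np.eye(3), A) + np.kron(G, Gamma)

# diagonalizing G decouples the array: the spectrum of `big` is the
# union of spec(A + lam_k * Gamma) over the eigenvalues lam_k of G
lams = np.linalg.eigvalsh(G)
block_eigs = np.concatenate([np.linalg.eigvals(A + lam * Gamma) for lam in lams])
```

The lam = 0 block is the synchronized motion itself; the remaining transverse blocks determine whether trajectories converge to it.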