
    Analyzing Stability of Equilibrium Points in Neural Networks: A General Approach

    Networks of coupled neural systems are an important class of models in computational neuroscience. In some applications, the equilibrium points of these networks must remain stable under parameter variations. Here we present a general methodology that yields explicit constraints on the coupling strengths ensuring stability of the equilibrium point. Two models of coupled excitatory-inhibitory oscillators illustrate the approach. (Comment: 20 pages, 4 figures)
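The core numerical check behind such stability analyses can be sketched as: linearize the network around an equilibrium and test whether every Jacobian eigenvalue has a negative real part. The sketch below assumes a single Wilson-Cowan-style excitatory-inhibitory pair with illustrative parameter values; it is not the paper's specific model or its explicit coupling constraints.

```python
import numpy as np

def jacobian(w_ee, w_ei, w_ie, w_ii, tau_e=1.0, tau_i=1.0, gain=0.25):
    # Linearization of a Wilson-Cowan-style E-I pair around a fixed point;
    # `gain` is the slope of the firing-rate nonlinearity at equilibrium.
    return np.array([
        [(-1.0 + gain * w_ee) / tau_e, -gain * w_ei / tau_e],
        [  gain * w_ie / tau_i,        (-1.0 - gain * w_ii) / tau_i],
    ])

def is_stable(J):
    # The equilibrium is locally asymptotically stable when every
    # eigenvalue of the Jacobian has strictly negative real part.
    return bool(np.all(np.linalg.eigvals(J).real < 0.0))
```

For example, an uncoupled pair is stable, strong self-excitation destabilizes it, and sufficiently strong cross-coupling restores stability via the negative trace / positive determinant of the 2x2 Jacobian.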

    Short term synaptic depression improves information transfer in perceptual multistability

    Competitive neural networks are often used to model the dynamics of perceptual bistability. Switching between percepts can occur through fluctuations and/or a slow adaptive process. Here, we analyze switching statistics in competitive networks with short-term synaptic depression and noise. We start by analyzing a ring model that yields spatially structured solutions and complement this with a study of a space-free network whose populations are coupled with mutual inhibition. Dominance times arising from depression-driven switching can be approximated using a separation of timescales in both the ring and space-free models. For purely noise-driven switching, we use energy arguments to justify how dominance times depend exponentially on input strength. We also show that a combination of depression and noise generates realistic distributions of dominance times. Unimodal distributions of dominance times are more easily differentiated from one another using Bayesian sampling, suggesting that depression-induced switching transfers more information about stimuli than noise-driven switching. Finally, we analyze a competitive network model of perceptual tristability, showing that depression generates a memory of previous percepts based on the ordering of percepts. (Comment: 26 pages, 15 figures)
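The depression-driven switching mechanism in a space-free two-population model can be sketched as follows. Two populations inhibit each other through synapses whose efficacy depresses while the presynaptic population is active; once the dominant population's synapses are sufficiently depressed, the suppressed population escapes. The firing-rate function, parameter values, and forward-Euler integration below are illustrative assumptions, not the paper's exact formulation, and noise is omitted so the alternations are purely depression-driven.

```python
import numpy as np

def simulate(T=20.0, dt=1e-3, I=0.5, w=1.0, c=2.0,
             tau_u=0.01, tau_a=1.0, gain=0.05):
    f = lambda x: 1.0 / (1.0 + np.exp(-x / gain))  # steep firing-rate function
    u = np.array([1.0, 0.0])   # population activities (pop 0 starts dominant)
    a = np.array([1.0, 1.0])   # synaptic resources (1 = fully recovered)
    dominant, switch_times = 0, [0.0]
    for k in range(int(T / dt)):
        inh = w * a[::-1] * u[::-1]               # depressed cross-inhibition
        u += dt * (-u + f(I - inh)) / tau_u       # rate dynamics
        a += dt * (1.0 - a - c * a * u) / tau_a   # depression/recovery
        d = int(u[1] > u[0])
        if d != dominant:                         # percept switch detected
            dominant = d
            switch_times.append((k + 1) * dt)
    return np.diff(switch_times)                  # dominance durations

durations = simulate()
```

With these parameters the input satisfies w/(1 + c) < I < w, so each percept is suppressed while the rival's synapses are fresh but escapes once they depress, yielding sustained alternations.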

    Dynamical Analysis of DTNN with Impulsive Effect

    We present a dynamical analysis of discrete-time delayed neural networks with impulsive effect. Under impulsive effect, we derive new criteria for the invariance and attractivity of discrete-time neural networks by using a decomposition approach and delay difference inequalities. Our results improve or extend existing ones.
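The flavor of such attractivity criteria can be illustrated on a scalar discrete-time delayed network subject to impulsive state jumps. The contraction condition |a| + |b| < 1 used below is a standard textbook-style sufficient condition for delayed difference systems; the specific system and impulse schedule are illustrative assumptions, not the paper's criteria.

```python
import numpy as np

def simulate_dtnn(steps=200, tau=2):
    # x(k+1) = a*x(k) + b*tanh(x(k - tau)), with an impulsive state jump
    # every 25 steps. Since |a| + |b| = 0.8 < 1 and the jumps are
    # infrequent, the origin remains globally attracting.
    a, b = 0.5, 0.3
    x = [1.0] * (tau + 1)          # constant initial history
    for k in range(steps):
        nxt = a * x[-1] + b * np.tanh(x[-1 - tau])
        if (k + 1) % 25 == 0:      # impulsive effect: multiplicative jump
            nxt *= 1.5
        x.append(nxt)
    return x

trajectory = simulate_dtnn()
```

Despite the impulses repeatedly kicking the state, the per-step contraction dominates and the trajectory decays toward zero.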

    Modelling and Contractivity of Neural-Synaptic Networks with Hebbian Learning

    This paper is concerned with the modelling and analysis of two of the most commonly used recurrent neural network models (the Hopfield neural network and the firing-rate neural network) with dynamic recurrent connections undergoing Hebbian learning rules. To capture the synaptic sparsity of neural circuits, we propose a low-dimensional formulation. We then characterize key dynamical properties. First, we give biologically inspired forward-invariance results. Then, we give sufficient conditions for non-Euclidean contractivity of the models. Our contraction analysis yields stability and robustness of time-varying trajectories for networks with both excitatory and inhibitory synapses governed by both Hebbian and anti-Hebbian rules. For each model, we propose a contractivity test based on biologically meaningful quantities, e.g., the neural and synaptic decay rates, the maximum in-degree, and the maximum synaptic strength. We then show that the models satisfy Dale's Principle. Finally, we illustrate the effectiveness of our results via a numerical example. (Comment: 24 pages, 4 figures)
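A simplified version of such a contractivity test can be sketched for a Hopfield-type network with frozen (non-plastic) weights: if the neural decay rate exceeds the gain times the maximum synaptic strength times the in-degree, the ell-infinity matrix measure of the Jacobian is negative, so the dynamics are contracting and any two trajectories converge toward each other. The condition and parameters below are an illustrative sketch in the spirit of the paper, not its exact test; the Hebbian weight dynamics are held fixed for simplicity.

```python
import numpy as np

def contraction_margin(decay, gain, w_max, in_degree):
    # Sufficient condition (sketch): x' = -decay*x + W @ tanh(gain*x) is
    # contracting in the ell-infinity norm whenever
    # decay > gain * w_max * in_degree, since the Jacobian measure is
    # bounded above by -decay + gain * w_max * in_degree.
    return decay - gain * w_max * in_degree

rng = np.random.default_rng(0)
n, decay, gain, w_max = 5, 1.0, 1.0, 0.1
W = rng.uniform(-w_max, w_max, (n, n))   # frozen synaptic weights, |w| <= w_max

def step(x, dt=0.01):
    # Forward-Euler step of the Hopfield-type dynamics.
    return x + dt * (-decay * x + W @ np.tanh(gain * x))

# Two different initial conditions flow toward each other under contraction.
x, y = rng.normal(size=n), rng.normal(size=n)
for _ in range(2000):
    x, y = step(x), step(y)
```

Here the margin is 1.0 - 1.0 * 0.1 * 5 = 0.5 > 0, so trajectories shrink toward each other at rate at least 0.5, independently of initial conditions.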