
    Analysis of Synaptic Scaling in Combination with Hebbian Plasticity in Several Simple Networks

    Conventional synaptic plasticity in combination with synaptic scaling is a biologically plausible plasticity rule that guides the development of synapses toward stability. Here we analyze the development of synaptic connections and the resulting activity patterns in different feed-forward and recurrent neural networks with plasticity and scaling. We show under which constraints an external input given to a feed-forward network forms an input trace similar to a cell assembly (Hebb, 1949) by enhancing synaptic weights to larger stable values compared with the rest of the network. For instance, a weak input creates a weaker representation in the network than a strong input, which produces a trace along large parts of the network. These processes are strongly influenced by the underlying connectivity. For example, when embedding recurrent structures (excitatory rings, etc.) into a feed-forward network, the input trace is extended into more distant layers, while inhibition shortens it. These findings provide a better understanding of the dynamics of generic network structures in which plasticity is combined with scaling. This also makes it possible to use the rule to construct an artificial network with certain desired storage properties.
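The combined rule described above can be illustrated with a minimal sketch: a Hebbian correlation term plus a scaling term that pushes postsynaptic activity toward a target. The parameter names (eta, gamma, v_target) and the exact form of the scaling term are illustrative assumptions, not the abstract's specific equations.

```python
import numpy as np

def plasticity_step(w, pre, post, eta=0.01, gamma=0.001, v_target=0.1):
    """One weight update combining Hebbian plasticity with synaptic scaling.

    w    : weight vector onto one postsynaptic neuron
    pre  : presynaptic firing rates (same shape as w)
    post : postsynaptic firing rate (scalar)
    """
    hebb = eta * post * pre                  # correlation-driven growth
    scaling = gamma * (v_target - post) * w  # multiplicative drive toward target rate
    return w + hebb + scaling
```

Because the scaling term is proportional to the weight itself, it damps runaway Hebbian growth while preserving the relative weight differences that encode the input trace.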

    Phenomenological modeling of diverse and heterogeneous synaptic dynamics at natural density

    This chapter sheds light on the synaptic organization of the brain from the perspective of computational neuroscience. It provides an introductory overview of how to account for empirical data in mathematical models, implement them in software, and perform simulations reflecting experiments. This path is demonstrated with respect to four key aspects of synaptic signaling: the connectivity of brain networks, synaptic transmission, synaptic plasticity, and the heterogeneity across synapses. Each step and aspect of the modeling and simulation workflow comes with its own challenges and pitfalls, which are highlighted and addressed in detail. Comment: 35 pages, 3 figures.

    Synaptic plasticity in a recurrent neural network for versatile and adaptive behaviors of a walking robot

    Walking animals such as insects can effectively perform complex behaviors with little neural computing. They can walk around their environment, escape from corners and deadlocks, and avoid or climb over obstacles. While performing all these behaviors, they can also adapt their movements to deal with unknown situations. As a consequence, they successfully navigate through their complex environment. These versatile and adaptive abilities result from the integration of several ingredients embedded in their sensorimotor loop. Biological studies reveal that these ingredients include neural dynamics, plasticity, sensory feedback, and biomechanics. Generating such versatile and adaptive behaviors for a walking robot is a challenging task. In this study, we present a bio-inspired approach to solve this task. Specifically, the approach combines neural mechanisms with plasticity, sensory feedback, and biomechanics. The neural mechanisms consist of adaptive neural sensory processing and modular neural locomotion control. The sensory processing is based on a small recurrent network consisting of two fully connected neurons. Online correlation-based learning with synaptic scaling is applied to adequately change the connections of the network. By doing so, we can effectively exploit neural dynamics (i.e., hysteresis effects and single attractors) in the network to generate different turning angles with short-term memory for a biomechanical walking robot. The turning information is transmitted as descending steering signals to the locomotion control, which translates the signals into motor actions. As a result, the robot can walk around and adapt its turning angle to avoid obstacles in different situations, as well as escape from sharp corners or deadlocks. Backbone joint control embedded in the locomotion control allows the robot to climb over small obstacles. Consequently, it can successfully explore and navigate in complex environments.
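The sensory-processing core described above, a two-neuron fully connected recurrent network adapted by correlation-based learning with scaling, can be sketched as follows. The update rule, parameter values, and the particular scaling form are assumptions for illustration; the paper's actual controller includes sensory preprocessing and motor mappings not shown here.

```python
import numpy as np

def step(W, x, inputs, mu=0.01, gamma=0.001):
    """One update of a 2-neuron recurrent network with online
    correlation-based learning plus a weight-scaling term."""
    x_new = np.tanh(W @ x + inputs)          # recurrent dynamics (bounded activity)
    dW = mu * np.outer(x_new, x) \
         + gamma * (1.0 - np.abs(W)) * W     # correlation term + scaling toward |w| ~ 1
    return W + dW, x_new

# drive the network with a constant sensory input for a few hundred steps
rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.5, size=(2, 2))
x = np.zeros(2)
for _ in range(200):
    W, x = step(W, x, inputs=np.array([0.3, -0.2]))
```

The recurrent connections give the network hysteresis: its state depends on input history, which is what provides the short-term memory used for steering.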

    Firing rate homeostasis counteracts changes in stability of recurrent neural networks caused by synapse loss in Alzheimer's disease

    The impairment of cognitive function in Alzheimer's disease is clearly correlated with synapse loss. However, the mechanisms underlying this correlation are only poorly understood. Here, we investigate how the loss of excitatory synapses in sparsely connected random networks of spiking excitatory and inhibitory neurons alters their dynamical characteristics. Beyond the effects on the network's activity statistics, we find that the loss of excitatory synapses onto excitatory neurons shifts the network dynamics toward the stable regime. The decreased sensitivity to small perturbations of time-varying inputs can be considered an indication of reduced computational capacity. A full recovery of network performance can be achieved by firing rate homeostasis, here implemented by an up-scaling of the remaining excitatory-excitatory synapses. By analysing the stability of the linearized network dynamics, we explain how homeostasis can simultaneously maintain the network's firing rate and its sensitivity to small perturbations.
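The homeostatic compensation described above amounts to deleting a fraction of excitatory-excitatory synapses and then multiplicatively rescaling the survivors so that each neuron's summed excitatory input is restored. A minimal sketch, with an assumed 30% loss rate and uniform initial weights (the paper uses spiking network simulations, not this reduced weight-matrix picture):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.uniform(0.1, 1.0, size=(100, 100))   # E->E weight matrix (rows = targets)
total_before = W.sum(axis=1)                 # per-neuron summed input before loss

# synapse loss: remove ~30% of connections at random
mask = rng.random(W.shape) > 0.3
W_lesioned = W * mask

# firing rate homeostasis: per-neuron multiplicative up-scaling of survivors
factor = total_before / W_lesioned.sum(axis=1)
W_recovered = W_lesioned * factor[:, None]
```

Because the rescaling is multiplicative per neuron, it restores total input drive (and thus the firing rate) without reintroducing the lost connections, which is why the network's dynamical regime can also recover.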

    An Improved Test for Detecting Multiplicative Homeostatic Synaptic Scaling

    Homeostatic scaling of synaptic strengths is essential for maintenance of network “gain”, but it also poses a risk of losing the distinctions among relative synaptic weights, which are possible cellular correlates of memory storage. Multiplicative scaling of all synapses has been proposed as a mechanism that would preserve the relative weights among them, because all would be proportionately adjusted. It is crucial for this hypothesis that all synapses be affected identically, but whether or not this actually occurs is difficult to determine directly. Mathematical tests for multiplicative synaptic scaling are presently carried out on distributions of miniature synaptic current amplitudes, but the accuracy of the test procedure has not been fully validated. We now show that the existence of an amplitude threshold for empirical detection of miniature synaptic currents limits the use of the most common method for detecting multiplicative changes. Our new method circumvents the problem by discarding the potentially distorting subthreshold values after computational scaling. This new method should be useful in assessing the underlying neurophysiological nature of a homeostatic synaptic scaling transformation, and therefore in evaluating its functional significance.
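The key idea, discarding values that fall below the detection threshold only after computational scaling, so both samples share the same effective cutoff, can be sketched as follows. The synthetic lognormal amplitudes, the quantile-based comparison, and all parameter values are assumptions for illustration, not the authors' exact statistical procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
threshold = 5.0       # empirical detection threshold for mEPSC amplitudes
true_factor = 1.3     # multiplicative scaling applied to the "treated" condition

# underlying (unobservable) amplitude population; recordings are truncated
population = rng.lognormal(2.0, 0.4, size=20000)
control = population[population >= threshold]              # recorded control
treated = population * true_factor
treated = treated[treated >= threshold]                    # recorded treated

def quantile_mismatch(control, treated, factor, threshold):
    """Scale control by a candidate factor, then cut BOTH samples at the
    scaled threshold before comparing their quantiles."""
    scaled = control * factor                    # cutoff moves to factor * threshold
    treated_cut = treated[treated >= factor * threshold]
    q = np.linspace(0.05, 0.95, 19)
    return np.max(np.abs(np.quantile(scaled, q) - np.quantile(treated_cut, q)))
```

At the correct factor the two truncated distributions coincide; naively comparing the raw recorded distributions (factor fixed at 1) leaves a large mismatch caused entirely by the threshold, which is the distortion the method removes.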

    Synaptic Plasticity in Neural Networks Needs Homeostasis with a Fast Rate Detector

    Hebbian changes of excitatory synapses are driven by, and further enhance, correlations between pre- and postsynaptic activities. Hence, Hebbian plasticity forms a positive feedback loop that can lead to instability in simulated neural networks. To keep activity at healthy, low levels, plasticity must therefore incorporate homeostatic control mechanisms. We find in numerical simulations of recurrent networks with a realistic triplet-based spike-timing-dependent plasticity rule (triplet STDP) that homeostasis has to detect rate changes on a timescale of seconds to minutes to keep the activity stable. We confirm this result in a generic mean-field formulation of network activity and homeostatic plasticity. Our results strongly suggest the existence of a homeostatic regulatory mechanism that reacts to firing rate changes on the order of seconds to minutes.
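A rate detector with a tunable timescale is commonly modeled as a leaky integrator of spike counts; the abstract's claim is that its time constant must lie in the seconds-to-minutes range for homeostasis to stabilize triplet STDP. A minimal sketch of such a detector (the leaky-integrator form and parameter values are assumptions, not the paper's exact model):

```python
import numpy as np

def rate_estimate(spike_counts, dt=0.001, tau=10.0):
    """Online firing rate estimate r_hat from per-bin spike counts.

    dt  : bin width in seconds
    tau : detector time constant in seconds (here 10 s, within the
          seconds-to-minutes range the study identifies as necessary)
    """
    r_hat = 0.0
    trace = np.empty(len(spike_counts))
    for i, n in enumerate(spike_counts):
        r_hat += (dt / tau) * (n / dt - r_hat)   # leaky integration toward n/dt
        trace[i] = r_hat
    return trace

# deterministic check: expected counts for a constant 10 Hz rate
counts = np.full(200_000, 10.0 * 0.001)          # 200 s of 1 ms bins
trace = rate_estimate(counts)
```

In a homeostatic loop, r_hat would then modulate plasticity, e.g. scaling weights by a term proportional to (r_target - r_hat); the slower tau is, the longer runaway Hebbian growth goes undetected.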

    Depression-Biased Reverse Plasticity Rule Is Required for Stable Learning at Top-down Connections

    Top-down synapses are ubiquitous throughout neocortex and play a central role in cognition, yet little is known about their development and specificity. During sensory experience, lower neocortical areas are activated before higher ones, causing top-down synapses to experience a preponderance of post-synaptic activity preceding pre-synaptic activity. This timing pattern is the opposite of that experienced by bottom-up synapses, which suggests that different versions of spike-timing dependent plasticity (STDP) rules may be required at top-down synapses. We consider a two-layer neural network model and investigate which STDP rules can lead to a distribution of top-down synaptic weights that is stable and diverse and that avoids strong loops. We introduce a temporally reversed rule (rSTDP) in which top-down synapses are potentiated if post-synaptic activity precedes pre-synaptic activity. Combining analytical work and integrate-and-fire simulations, we show that only depression-biased rSTDP (and not classical STDP) produces stable and diverse top-down weights. The conclusions did not change upon addition of homeostatic mechanisms, multiplicative STDP rules, or weak external input to the top neurons. Our prediction of rSTDP at top-down synapses, which are distally located, is supported by recent neurophysiological evidence showing the existence of temporally reversed STDP at synapses distal to the post-synaptic cell body.
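The depression-biased rSTDP window can be written down directly: it is the classical exponential STDP window mirrored in time, with the depression amplitude exceeding the potentiation amplitude. The amplitudes and time constant below are illustrative assumptions; the sign convention is dt = t_post - t_pre.

```python
import numpy as np

def rstdp_window(dt, a_plus=0.8, a_minus=1.0, tau=20.0):
    """Weight change for a pre/post spike pair under depression-biased rSTDP.

    dt : t_post - t_pre in ms. In classical STDP, dt > 0 (pre before post)
    potentiates; here the window is temporally reversed, so dt < 0
    (post before pre) potentiates. Depression bias: a_minus > a_plus.
    """
    if dt < 0:
        return a_plus * np.exp(dt / tau)     # post precedes pre: potentiation
    return -a_minus * np.exp(-dt / tau)      # pre precedes post: depression
```

Since sensory drive makes post-before-pre the dominant timing at top-down synapses, this reversed window lets those synapses strengthen under natural activity while the depression bias keeps the weight distribution bounded.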

    Integrating Hebbian and homeostatic plasticity: the current state of the field and future research directions

    We summarize here the results presented, and the subsequent discussion, from the meeting on Integrating Hebbian and Homeostatic Plasticity held at the Royal Society in April 2016. We first outline the major themes and results presented at the meeting. We then provide a synopsis of the outstanding questions that emerged from the discussion at the end of the meeting, and finally suggest the directions of research that we believe are most promising for developing an understanding of how these two forms of plasticity interact to facilitate functional changes in the brain. This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'.