Neural circuit function redundancy in brain disorders
Redundancy is a ubiquitous property of the nervous system: vastly different configurations of cellular and synaptic components can enable the same neural circuit functions. However, until recently, very little brain disorder research has considered the implications of this characteristic when designing experiments or interpreting data. Here, we first summarise the evidence for redundancy in healthy brains, explaining redundancy and three related sub-concepts: sloppiness, dependencies and multiple solutions. We then lay out key implications for brain disorder research, covering recent examples of redundancy effects in experimental studies on psychiatric disorders. Finally, we give predictions for future experiments based on these concepts.
Functional consequences of pre- and postsynaptic expression of synaptic plasticity
Growing experimental evidence shows that both homeostatic and Hebbian synaptic plasticity can be expressed presynaptically as well as postsynaptically. In this review, we start by discussing this evidence and the methods used to determine expression loci. Next, we discuss the functional consequences of this diversity in pre- and postsynaptic expression of both homeostatic and Hebbian synaptic plasticity. In particular, we explore the functional consequences of a biologically tuned model of pre- and postsynaptically expressed spike-timing-dependent plasticity complemented with postsynaptic homeostatic control. The pre- and postsynaptic expression in this model predicts (i) more reliable receptive fields and sensory perception, (ii) rapid recovery of forgotten information (memory savings), and (iii) reduced response latencies, compared with a model with postsynaptic expression only. Finally, we discuss open questions that will require a considerable research effort to better elucidate how the specific locus of expression of homeostatic and Hebbian plasticity alters synaptic and network computations. This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'.
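To make the distinction between expression loci concrete, the following minimal sketch (not the model from the review) drives a standard Tsodyks-Markram-style depressing synapse with a short spike train. Changing the quantal amplitude q (postsynaptic locus) scales every response uniformly, whereas changing the release probability U (presynaptic locus) also alters the short-term dynamics, visible in the paired-pulse ratio. All parameter values are illustrative assumptions.

```python
# Minimal sketch (not the review's model): a Tsodyks-Markram-style depressing
# synapse driven by a regular spike train, showing why presynaptic expression
# (changing release probability U) alters short-term dynamics, while
# postsynaptic expression (changing quantal amplitude q) scales all responses
# uniformly. Parameter values are illustrative assumptions.
import numpy as np

def psp_train(U, q, n_spikes=5, isi=20.0, tau_rec=200.0):
    """Peak postsynaptic responses to a train of presynaptic spikes.

    U       -- release probability (presynaptic locus)
    q       -- quantal amplitude   (postsynaptic locus)
    isi     -- inter-spike interval (ms)
    tau_rec -- vesicle recovery time constant (ms)
    """
    x = 1.0                      # fraction of available synaptic resources
    peaks = []
    for _ in range(n_spikes):
        peaks.append(q * U * x)  # response amplitude on this spike
        x -= U * x               # resources consumed by release
        x += (1.0 - x) * (1.0 - np.exp(-isi / tau_rec))  # recovery until next spike
    return np.array(peaks)

baseline = psp_train(U=0.3, q=1.0)
post_ltp = psp_train(U=0.3, q=1.5)   # postsynaptic potentiation: uniform scaling
pre_ltp  = psp_train(U=0.45, q=1.0)  # presynaptic potentiation: stronger depression

print("baseline :", np.round(baseline, 3))
print("post LTP :", np.round(post_ltp, 3))
print("pre  LTP :", np.round(pre_ltp, 3))
print("paired-pulse ratio (baseline / post / pre):",
      round(baseline[1] / baseline[0], 3),
      round(post_ltp[1] / post_ltp[0], 3),
      round(pre_ltp[1] / pre_ltp[0], 3))
```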
Pre- and postsynaptically expressed spike-timing-dependent plasticity contribute differentially to neuronal learning
A plethora of experimental studies have shown that long-term synaptic plasticity can be expressed pre- or postsynaptically depending on a range of factors such as developmental stage, synapse type, and activity patterns. The functional consequences of this diversity are not clear, although it is understood that whereas postsynaptic expression of plasticity predominantly affects synaptic response amplitude, presynaptic expression alters both synaptic response amplitude and short-term dynamics. In most models of neuronal learning, long-term synaptic plasticity is implemented as changes in connective weights. The consideration of long-term plasticity as a fixed change in amplitude corresponds more closely to post- than to presynaptic expression, which means theoretical outcomes based on this choice of implementation may have a postsynaptic bias. To explore the functional implications of the diversity of expression of long-term synaptic plasticity, we adapted a model of long-term plasticity, more specifically spike-timing-dependent plasticity (STDP), such that it was expressed either independently pre- or postsynaptically, or in a mixture of both ways. We compared pair-based standard STDP models and a biologically tuned triplet STDP model, and investigated the outcomes in a minimal setting, using two different learning schemes: in the first, inputs were triggered at different latencies, and in the second, a subset of inputs were temporally correlated. We found that presynaptic changes adjusted the speed of learning, while postsynaptic expression was more efficient at regulating spike timing and frequency. When combining both expression loci, postsynaptic changes amplified the response range, while presynaptic plasticity allowed control over postsynaptic firing rates, potentially providing a form of activity homeostasis. Our findings highlight how the seemingly innocuous choice of implementing synaptic plasticity by single weight modification may unwittingly introduce a postsynaptic bias in modelling outcomes. We conclude that pre- and postsynaptically expressed plasticity are not interchangeable, but enable complementary functions.
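As an illustration of what mixed expression could look like in a model, the sketch below implements a plain pair-based STDP rule in which each weight update is split between a presynaptic variable p (release probability) and a postsynaptic variable q (response amplitude), with an assumed mixing parameter rho. This is only a sketch, not the pair-based or biologically tuned triplet models used in the paper; all names and constants are illustrative.

```python
# Minimal pair-based STDP sketch with the total weight change split between a
# presynaptic factor (release probability p) and a postsynaptic factor
# (quantal amplitude q), so the effective weight is w = p * q. This is an
# illustrative assumption, not the models used in the paper; rho, A_plus,
# A_minus and the time constants are hypothetical parameters.
import numpy as np

rng = np.random.default_rng(0)
A_plus, A_minus = 0.01, 0.012    # learning rates (assumed)
tau_plus = tau_minus = 20.0      # trace time constants, ms
rho = 0.5                        # 0 -> purely postsynaptic, 1 -> purely presynaptic
dt = 1.0                         # time step, ms

p, q = 0.5, 1.0                  # presynaptic and postsynaptic state variables
x_pre, x_post = 0.0, 0.0         # exponential spike traces

T = 1000
pre_spikes  = rng.random(T) < 0.02   # ~20 Hz Poisson-like presynaptic input
post_spikes = rng.random(T) < 0.02   # ~20 Hz postsynaptic firing (imposed here)

for t in range(T):
    # exponential decay of the pre- and postsynaptic traces
    x_pre  += -dt / tau_plus  * x_pre
    x_post += -dt / tau_minus * x_post
    if pre_spikes[t]:
        x_pre += 1.0
        dw = -A_minus * x_post            # post-before-pre pairing: depression
        p = float(np.clip(p + rho * dw, 0.0, 1.0))
        q = max(q + (1.0 - rho) * dw, 0.0)
    if post_spikes[t]:
        x_post += 1.0
        dw = A_plus * x_pre               # pre-before-post pairing: potentiation
        p = float(np.clip(p + rho * dw, 0.0, 1.0))
        q = max(q + (1.0 - rho) * dw, 0.0)

print(f"final p={p:.3f}, q={q:.3f}, effective weight w=p*q={p*q:.3f}")
```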
Unsupervised learning in neural networks with short range synapses
Different areas of the brain are involved in specific aspects of the information being processed, both in learning and in memory formation. For example, the hippocampus is important in the consolidation of information from short-term to long-term memory, while emotional memory seems to be handled by the amygdala. On the microscopic scale, the underlying structures in these areas differ in the kinds of neurons involved, in their connectivity, or in their degree of clustering, but at this level learning and memory are attributed to modifications of neuronal synapses mediated by long-term potentiation and long-term depression. In this work we explore the properties of a short-range synaptic connection network: a nearest-neighbor lattice composed mostly of excitatory neurons with a fraction of inhibitory ones. The mechanism of synaptic modification responsible for the emergence of memory is spike-timing-dependent plasticity (STDP), a Hebbian-like rule in which potentiation or depression is acquired when causally or non-causally ordered spikes occur at a synapse connecting two neurons. The system is intended to store and recognize memories associated with spatial external inputs presented as simple geometrical forms. The synaptic modifications, comprising STDP and a homeostasis rule, are continuously applied to the excitatory connections. We explore the different scenarios under which a network with short-range connections can accomplish the task of storing and recognizing simple connected patterns.
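For concreteness, the sketch below builds the kind of connectivity the abstract describes: a 2D nearest-neighbor lattice in which a fixed fraction of cells is tagged inhibitory, yielding the short-range weight matrix on which STDP and the homeostasis rule would then act (only the excitatory connections are plastic in the paper). Lattice size, inhibitory fraction and weight ranges are assumptions, not the paper's values.

```python
# Illustrative sketch (not the paper's exact network): a 2D nearest-neighbor
# lattice of spiking units in which a fixed fraction of cells is marked
# inhibitory, producing the short-range connectivity on which STDP and
# homeostasis would act. Sizes, fractions and weight ranges are assumptions.
import numpy as np

def lattice_connectivity(n=20, inhibitory_fraction=0.2, seed=0):
    """Return (weights, is_inhibitory) for an n x n nearest-neighbor lattice."""
    rng = np.random.default_rng(seed)
    N = n * n
    is_inhibitory = rng.random(N) < inhibitory_fraction
    W = np.zeros((N, N))
    for i in range(n):
        for j in range(n):
            src = i * n + j
            sign = -1.0 if is_inhibitory[src] else 1.0
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # 4 nearest neighbors
                ni, nj = i + di, j + dj
                if 0 <= ni < n and 0 <= nj < n:
                    W[src, ni * n + nj] = sign * rng.uniform(0.5, 1.0)
    # In the paper only the excitatory connections would undergo STDP/homeostasis.
    return W, is_inhibitory

W, inh = lattice_connectivity()
print("connections per neuron (mean):", W.astype(bool).sum(axis=1).mean())
print("inhibitory cells:", inh.sum(), "of", inh.size)
```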
Strategies to associate memories by unsupervised learning in neural networks
In this work we study the effects of three different strategies for associating memories in a neural network composed of both excitatory and inhibitory spiking neurons, which are randomly connected through recurrent excitatory and inhibitory synapses. The system is intended to store a number of memories, associated with spatial external inputs. The strategies consist of presenting the input patterns across trials in: i) an ordered sequence; ii) a random sequence; iii) clustered sequences. In addition, an order parameter indicating the correlation between the trials' activities is introduced to compute associative memory capacities and the quality of memory retrieval.
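The abstract does not spell out the order parameter, so the following is only a plausible sketch: each trial's activity is summarised as a vector of per-neuron firing rates, and the order parameter is taken as the mean within-pattern trial correlation minus the mean between-pattern correlation. The function names and the toy data are hypothetical.

```python
# Plausible sketch of a trial-correlation order parameter (the paper's exact
# definition is not given in the abstract): trials evoked by the same pattern
# should be more correlated with each other than with trials of other patterns.
import numpy as np

def order_parameter(rates, labels):
    """Mean within-pattern minus mean between-pattern trial correlation.

    rates  -- (n_trials, n_neurons) per-trial firing-rate vectors
    labels -- (n_trials,) pattern identity of each trial
    """
    C = np.corrcoef(rates)                      # pairwise Pearson correlations
    labels = np.asarray(labels)
    same = labels[:, None] == labels[None, :]   # trials with the same pattern
    off_diag = ~np.eye(len(labels), dtype=bool) # exclude self-correlations
    return C[same & off_diag].mean() - C[~same].mean()

# Toy usage: 3 patterns x 10 trials, 100 neurons, noisy pattern-specific rates.
rng = np.random.default_rng(1)
patterns = rng.random((3, 100))
labels = np.repeat([0, 1, 2], 10)
rates = patterns[labels] + 0.3 * rng.standard_normal((30, 100))
print("order parameter:", round(order_parameter(rates, labels), 3))
```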
Spike timing analysis in neural networks with unsupervised synaptic plasticity
The synaptic plasticity rules that sculpt a neural network architecture are key elements for understanding cortical processing, as they may explain the emergence of stable, functional activity while avoiding runaway excitation. For an associative memory framework, they should be built so as to enable the network to reproduce a robust spatio-temporal trajectory in response to an external stimulus. Still, how these rules may be implemented in recurrent networks, and how they relate to the networks' capacity for pattern recognition, remains unclear. We studied the effects of three phenomenological unsupervised rules in sparsely connected recurrent networks for associative memory: spike-timing-dependent plasticity, short-term plasticity and homeostatic scaling. The stability of the system is monitored during the learning process as the mean firing rate converges to a value determined by the homeostatic scaling. Afterwards, it is possible to measure how efficiently the activity following each initial stimulus is recovered. This is evaluated by a measure of the correlation between spike timings, and we analysed the full memory separation capacity and limitations of this system.
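The homeostatic scaling mentioned here can be sketched as a multiplicative rescaling of each neuron's incoming excitatory weights towards a target mean firing rate, which is the kind of mechanism that would drive the convergence described above. The update form and all constants below are assumptions; the paper's exact rule may differ.

```python
# Sketch of homeostatic synaptic scaling (assumed form, not the paper's exact
# rule): incoming excitatory weights of each neuron are multiplicatively
# rescaled so that its running-average firing rate approaches a target value.
import numpy as np

def homeostatic_scaling(W_exc, mean_rates, target_rate, eta=0.01):
    """Multiplicatively scale each neuron's incoming excitatory weights.

    W_exc       -- (N, N) excitatory weight matrix, W_exc[i, j] = j -> i
    mean_rates  -- (N,) running-average firing rate of each neuron (Hz)
    target_rate -- desired firing rate (Hz)
    eta         -- scaling rate per update
    """
    factor = 1.0 + eta * (target_rate - mean_rates) / target_rate
    return W_exc * factor[:, None]   # scale all inputs to neuron i by factor[i]

# Toy usage: neurons firing above target are scaled down, and vice versa.
rng = np.random.default_rng(2)
W = rng.uniform(0.0, 0.5, size=(5, 5))
rates = np.array([2.0, 5.0, 8.0, 5.0, 3.0])
W_new = homeostatic_scaling(W, rates, target_rate=5.0)
print("incoming weight sums before:", np.round(W.sum(axis=1), 2))
print("incoming weight sums after :", np.round(W_new.sum(axis=1), 2))
```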
Learning and retrieval behavior in recurrent neural networks with pre-synaptic dependent homeostatic plasticity
The plastic character of brain synapses is considered to be one of the foundations for the formation of memories. Numerous kinds of such phenomena are currently described in the literature, but their role in the development of information pathways in neural networks with recurrent architectures is still not completely clear. In this paper we study the role of an activity-based process, called pre-synaptic dependent homeostatic scaling, in the organization of networks that yield precisely timed spiking patterns. It encodes spatio-temporal information in the synaptic weights as it associates a learned input with a specific response. We introduce a correlation measure to evaluate the precision of the spiking patterns and explore the effects of different inhibitory interactions and learning parameters. We find that long learning periods are important for improving the network's learning capacity, and we discuss this ability in the presence of distinct inhibitory currents.
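The correlation measure used to evaluate spike-timing precision is not specified in the abstract; a common choice, sketched below under that assumption, is to convolve each trial's spike train with a Gaussian kernel and average the pairwise Pearson correlations between trials, so that more reproducible timing yields values closer to one. The kernel width and toy data are illustrative.

```python
# Sketch of a spike-timing precision measure (assumed form, not necessarily
# the paper's): smooth each trial's spike train with a Gaussian kernel and
# average the pairwise Pearson correlations across trials.
import numpy as np

def smoothed_train(spike_times, t_max, dt=1.0, sigma=5.0):
    """Bin spike times (ms) and convolve with a Gaussian kernel of width sigma (ms)."""
    bins = np.arange(0.0, t_max + dt, dt)
    counts, _ = np.histogram(spike_times, bins=bins)
    half = int(4 * sigma / dt)
    t = np.arange(-half, half + 1) * dt
    kernel = np.exp(-t**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    return np.convolve(counts, kernel, mode="same")

def timing_precision(trials, t_max):
    """Mean pairwise correlation of smoothed spike trains across trials."""
    smoothed = np.array([smoothed_train(tr, t_max) for tr in trials])
    C = np.corrcoef(smoothed)
    off_diag = ~np.eye(len(trials), dtype=bool)
    return C[off_diag].mean()

# Toy usage: jittered repetitions of the same underlying spike pattern.
rng = np.random.default_rng(3)
template = np.sort(rng.uniform(0, 500, size=20))              # spike times in ms
trials = [template + rng.normal(0, 2.0, size=20) for _ in range(10)]
print("timing precision:", round(timing_precision(trials, t_max=500), 3))
```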