
    Emergence of Connectivity Motifs in Networks of Model Neurons with Short- and Long-term Plastic Synapses

    Recent evidence in rodent cerebral cortex and olfactory bulb suggests that the short-term dynamics of excitatory synaptic transmission are correlated with stereotypical connectivity motifs. It was observed that neurons with short-term facilitating synapses form predominantly reciprocal pairwise connections, while neurons with short-term depressing synapses form unidirectional pairwise connections. The cause of these structural differences in synaptic microcircuits is unknown. We propose that these connectivity motifs emerge from the interactions between short-term synaptic dynamics (SD) and long-term spike-timing dependent plasticity (STDP). While the impact of STDP on SD was shown in vitro, the mutual interactions between STDP and SD in large networks are still the subject of intense research. We formulate a computational model by combining SD and STDP, which faithfully captures short- and long-term dependence on both spike times and frequency. As a proof of concept, we simulate recurrent networks of spiking neurons with random initial connection efficacies and where synapses are either all short-term facilitating or all depressing. For identical background inputs, and as a direct consequence of internally generated activity, we find that networks with depressing synapses evolve unidirectional connectivity motifs, while networks with facilitating synapses evolve reciprocal connectivity motifs. This holds for heterogeneous networks including both facilitating and depressing synapses. Our study highlights the conditions under which SD-STDP might explain the correlation between facilitation and reciprocal connectivity motifs, as well as between depression and unidirectional motifs. We further suggest experiments for the validation of the proposed mechanism.
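    The two interacting mechanisms can be sketched with the standard Tsodyks-Markram short-term model plus a pairwise exponential STDP kernel. This is a minimal illustration, not the paper's calibrated model; all parameter values (U, tau_rec, tau_facil, a_plus, a_minus, tau) are assumptions.

    ```python
    import numpy as np

    def sd_release(spike_times, U=0.5, tau_rec=0.8, tau_facil=0.05):
        """Tsodyks-Markram short-term dynamics: effective release u*x per spike.
        Large U with slow tau_rec gives depression; small U with slow tau_facil
        gives facilitation. Parameter values are illustrative."""
        u, x, last_t = U, 1.0, None
        releases = []
        for t in spike_times:
            if last_t is not None:
                dt = t - last_t
                x = 1.0 + (x - 1.0) * np.exp(-dt / tau_rec)   # resources recover
                u = U + (u - U) * np.exp(-dt / tau_facil)     # facilitation decays
            u += U * (1.0 - u)        # facilitation jump at the spike
            r = u * x                 # effective release
            x -= r                    # deplete resources
            releases.append(r)
            last_t = t
        return releases

    def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=0.02):
        """Pairwise exponential STDP: weight change for dt = t_post - t_pre."""
        return a_plus * np.exp(-dt / tau) if dt > 0 else -a_minus * np.exp(dt / tau)
    ```

    With these defaults a regular 10 ms spike train yields successively weaker releases (depression), whereas e.g. U=0.1, tau_facil=0.5, tau_rec=0.1 yields successively stronger ones (facilitation).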

    Input-specific maturation of synaptic dynamics of parvalbumin interneurons in primary visual cortex

    Cortical networks consist of local recurrent circuits and long-range pathways from other brain areas. Parvalbumin-positive interneurons (PVNs) regulate the dynamic operation of local ensembles as well as the temporal precision of afferent signals. The synaptic recruitment of PVNs that supports these circuit operations is not well understood. Here we demonstrate that the synaptic dynamics of PVN recruitment in mouse visual cortex are customized according to input source, with distinct maturation profiles. Whereas the long-range inputs to PVNs show strong short-term depression throughout postnatal maturation, local inputs from nearby pyramidal neurons progressively lose such depression. This enhanced local recruitment depends on PVN-mediated reciprocal inhibition and results from both pre- and postsynaptic mechanisms, including calcium-permeable AMPA receptors at PVN postsynaptic sites. Although short-term depression of long-range inputs is well-suited for afferent signal detection, the robust dynamics of local inputs may facilitate rapid and proportional PVN recruitment in regulating local circuit operations.

    A three-threshold learning rule approaches the maximal capacity of recurrent neural networks

    Understanding the theoretical foundations of how memories are encoded and retrieved in neural populations is a central challenge in neuroscience. A popular theoretical scenario for modeling memory function is the attractor neural network scenario, whose prototype is the Hopfield model. The model has a poor storage capacity compared with the capacity achieved with perceptron learning algorithms. Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal, relying only upon locally accessible information. The fully-connected network consists of excitatory binary neurons with plastic recurrent connections and non-plastic inhibitory feedback stabilizing the network dynamics; the memory patterns are presented online as strong afferent currents, producing a bimodal distribution for the neuron synaptic inputs. Synapses corresponding to active inputs are modified as a function of the value of the local fields with respect to three thresholds. Above the highest threshold, and below the lowest threshold, no plasticity occurs. In between these two thresholds, potentiation/depression occurs when the local field is above/below an intermediate threshold. We simulated and analyzed a network of binary neurons implementing this rule and measured its storage capacity for different sizes of the basins of attraction. The storage capacity obtained through numerical simulations is shown to be close to the value predicted by analytical calculations. We also measured the dependence of capacity on the strength of external inputs. Finally, we quantified the statistics of the resulting synaptic connectivity matrix, and found that both the fraction of zero weight synapses and the degree of symmetry of the weight matrix increase with the number of stored patterns. Comment: 24 pages, 10 figures, to be published in PLOS Computational Biology.
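    The per-pattern update described above can be sketched as follows. Threshold values and the learning rate are placeholders, and the sketch covers only the weight update onto one neuron, not the full network simulation.

    ```python
    import numpy as np

    def three_threshold_update(w, pre_active, h, thresholds=(-1.0, 0.0, 1.0), eta=0.05):
        """Three-threshold rule for the weights onto one postsynaptic neuron.

        w          : weight vector of its recurrent inputs
        pre_active : boolean vector, True where the presynaptic input is active
        h          : local field (summed synaptic input) of the neuron
        thresholds : (theta_low, theta_mid, theta_high) -- placeholder values

        No plasticity above theta_high or below theta_low; in between,
        potentiation if h > theta_mid, depression otherwise. Only synapses
        with active presynaptic inputs are modified.
        """
        theta_low, theta_mid, theta_high = thresholds
        if h <= theta_low or h >= theta_high:
            return w                           # field already robust: no change
        w = w.copy()
        sign = 1.0 if h > theta_mid else -1.0
        w[pre_active] += sign * eta
        return w
    ```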

    Signal processing in local neuronal circuits based on activity-dependent noise and competition

    We study the characteristics of weak signal detection by a recurrent neuronal network with plastic synaptic coupling. It is shown that in the presence of an asynchronous component in synaptic transmission, the network acquires selectivity with respect to the frequency of weak periodic stimuli. For non-periodic frequency-modulated stimuli, the response is quantified by the mutual information between input (signal) and output (network's activity), and is optimized by synaptic depression. Introducing correlations in signal structure resulted in a decrease of input-output mutual information. Our results suggest that in neural systems with plastic connectivity, information is not merely carried passively by the signal; rather, the information content of the signal itself might determine the mode of its processing by a local neuronal circuit. Comment: 15 pages, 4 figures, in press for "Chaos".
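    The input-output mutual information used here can, in the simplest case, be estimated with a plug-in histogram estimator over the two signals; the bin count and the test signals below are arbitrary illustrative choices, not the paper's method.

    ```python
    import numpy as np

    def mutual_information(x, y, bins=16):
        """Plug-in mutual information estimate (in bits) from the joint
        histogram of two 1-D signals sampled at the same times."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()                              # joint distribution
        px = pxy.sum(axis=1, keepdims=True)           # marginal of x
        py = pxy.sum(axis=0, keepdims=True)           # marginal of y
        nz = pxy > 0                                  # avoid log(0)
        return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
    ```

    For independent signals the estimate is near zero (up to the positive plug-in bias); for identical signals it approaches the entropy of the common signal.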

    Learning to Discriminate Through Long-Term Changes of Dynamical Synaptic Transmission

    Short-term synaptic plasticity is modulated by long-term synaptic changes. There is, however, no general agreement on the computational role of this interaction. Here, we derive a learning rule for the release probability and the maximal synaptic conductance in a circuit model with combined recurrent and feedforward connections that allows learning to discriminate among natural inputs. Short-term synaptic plasticity thereby provides a nonlinear expansion of the input space of a linear classifier, whereas the random recurrent network serves to decorrelate the expanded input space. Computer simulations reveal that the twofold increase in the number of input dimensions through short-term synaptic plasticity improves the performance of a standard perceptron by up to 100%. The distributions of release probabilities and maximal synaptic conductances at the capacity limit strongly depend on the balance between excitation and inhibition. The model also suggests a new computational interpretation of spikes evoked by stimuli outside the classical receptive field. These neuronal activities may reflect decorrelation of the expanded stimulus space by intracortical synaptic connections.
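    The twofold expansion idea can be caricatured as follows: each input rate is paired with its steady-state depressed transmission, doubling the dimension seen by a linear classifier. The steady-state formula and parameters are illustrative assumptions, and this sketch omits the recurrent decorrelation stage.

    ```python
    import numpy as np

    def stp_expand(rates, U=0.5, tau_rec=0.5):
        """Concatenate raw rates with their steady-state depressed transmission.
        x_ss = 1 / (1 + U*r*tau_rec) is the stationary resource level of a
        depressing synapse driven at rate r; U*r*x_ss saturates with r,
        providing a nonlinear second copy of each input dimension."""
        rates = np.asarray(rates, dtype=float)
        x_ss = 1.0 / (1.0 + U * rates * tau_rec)
        return np.concatenate([rates, U * rates * x_ss])

    def train_perceptron(X, y, epochs=100, eta=0.1):
        """Plain perceptron on (possibly expanded) inputs; labels in {-1, +1}."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                if yi * (xi @ w + b) <= 0:   # misclassified or on the boundary
                    w += eta * yi * xi
                    b += eta * yi
        return w, b
    ```

    The saturating second copy grows sublinearly with rate, which is what makes the expansion nonlinear rather than a rescaled duplicate of the input.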

    Dynamical Synapses Enhance Neural Information Processing: Gracefulness, Accuracy and Mobility

    Experimental data have revealed that neuronal connection efficacy exhibits two forms of short-term plasticity, namely, short-term depression (STD) and short-term facilitation (STF). They have time constants residing between fast neural signaling and rapid learning, and may serve as substrates for neural systems manipulating temporal information on relevant time scales. The present study investigates the impact of STD and STF on the dynamics of continuous attractor neural networks (CANNs) and their potential roles in neural information processing. We find that STD endows the network with slow-decaying plateau behaviors: the network, initially stimulated to an active state, decays to a silent state very slowly on the time scale of STD rather than on the time scale of neural signaling. This provides a mechanism for neural systems to hold sensory memory easily and shut off persistent activities gracefully. With STF, we find that the network can hold a memory trace of external inputs in the facilitated neuronal interactions, which provides a way to stabilize the network response to noisy inputs, leading to improved accuracy in population decoding. Furthermore, we find that STD increases the mobility of the network states. The increased mobility enhances the tracking performance of the network in response to time-varying stimuli, leading to anticipative neural responses. In general, we find that STD and STF tend to have opposite effects on network dynamics and complementary computational advantages, suggesting that the brain may employ a strategy of weighting them differentially depending on the computational purpose. Comment: 40 pages, 17 figures.
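    A minimal single-population caricature of the slow-time-scale effect: a rate unit with depressing recurrent feedback keeps firing well after a brief stimulus ends, on a scale set by the recovery constant tau_d rather than by the fast membrane constant tau_m. All parameters are illustrative assumptions; the paper analyzes full continuous attractor networks, not this scalar reduction.

    ```python
    import numpy as np

    def run(J, T=1.0, dt=0.001, tau_m=0.01, tau_d=0.5, U=0.5, stim_end=0.2):
        """Rate unit m with depressing recurrent gain J*x; external drive I=1
        until stim_end, then zero. Returns the firing-rate trace."""
        m, x = 0.0, 1.0
        trace = []
        for i in range(int(T / dt)):
            I = 1.0 if i * dt < stim_end else 0.0
            m += (dt / tau_m) * (-m + np.tanh(J * x * m + I))  # fast rate dynamics
            x += dt * ((1.0 - x) / tau_d - U * x * m)          # slow resource dynamics
            trace.append(m)
        return np.array(trace)
    ```

    With recurrent coupling (J=2) the activity 100 ms after stimulus offset is still near its offset value; without recurrence (J=0) it collapses within a few membrane time constants.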