Dynamic Adaptive Computation: Tuning network states to task requirements
Neural circuits are able to perform computations under very diverse
conditions and requirements. The required computations impose clear constraints
on their fine-tuning: a rapid and maximally informative response to stimuli in
general requires decorrelated baseline neural activity. Such network dynamics
is known as asynchronous-irregular. In contrast, spatio-temporal integration of
information requires maintenance and transfer of stimulus information over
extended time periods. This can be realized at criticality, a phase transition
where correlations, sensitivity and integration time diverge. Being able to
flexibly switch, or even combine the above properties in a task-dependent
manner would present a clear functional advantage. We propose that cortex
operates in a "reverberating regime" because it is particularly favorable for
ready adaptation of computational properties to context and task. This
reverberating regime enables cortical networks to interpolate between the
asynchronous-irregular and the critical state by small changes in effective
synaptic strength or excitation-inhibition ratio. These changes directly adapt
computational properties, including sensitivity, amplification, integration
time and correlation length within the local network. We review recent
converging evidence that cortex in vivo operates in the reverberating regime,
and that various cortical areas have adapted their integration times to
processing requirements. In addition, we propose that neuromodulation enables a
fine-tuning of the network, so that local circuits can either decorrelate or
integrate, and quench or maintain their input depending on task. We argue that
this task-dependent tuning, which we call "dynamic adaptive computation",
presents a central organization principle of cortical networks and discuss
first experimental evidence.
Comment: 6 pages + references, 2 figures
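The interpolation described above is often formalized as a branching process, in which a branching parameter m tunes the network between the asynchronous-irregular (m ≪ 1) and critical (m → 1) regimes, with the intrinsic timescale diverging as m approaches 1. A minimal sketch of this picture (the Poisson update and all parameter values are illustrative assumptions, not the authors' model):

```python
import numpy as np

def simulate_branching(m, h, steps, rng):
    """Simulate a branching network: A[t+1] ~ Poisson(m*A[t] + h).

    m : branching parameter (m -> 1 approaches criticality,
        small m gives fast, decorrelated dynamics)
    h : mean external input per time step
    """
    A = np.zeros(steps)
    A[0] = h / (1.0 - m)          # start near the stationary mean
    for t in range(steps - 1):
        A[t + 1] = rng.poisson(m * A[t] + h)
    return A

rng = np.random.default_rng(0)
for m in (0.5, 0.98):
    # intrinsic timescale of rate fluctuations: tau = -dt / ln(m)
    tau = -1.0 / np.log(m)
    A = simulate_branching(m, h=10.0 * (1.0 - m), steps=5000, rng=rng)
    print(f"m={m}: tau={tau:.1f} steps, mean activity={A.mean():.1f}")
```

Small changes in m (i.e., in effective synaptic strength) leave the mean activity fixed here but change the integration time by more than an order of magnitude, which is the adaptability the abstract emphasizes.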
Connectivity reflects coding: A model of voltage-based spike-timing-dependent-plasticity with homeostasis
Electrophysiological connectivity patterns in cortex often show a few strong connections in a sea of weak connections. In some brain areas a large fraction of strong connections are bidirectional, in others they are mainly unidirectional. In order to explain these connectivity patterns, we use a model of Spike-Timing-Dependent Plasticity where synaptic changes depend on presynaptic spike arrival and the postsynaptic membrane potential. The model describes several nonlinear effects in STDP experiments, as well as the voltage dependence of plasticity under voltage clamp and classical paradigms of LTP/LTD induction. We show that in a simulated recurrent network of spiking neurons our plasticity rule leads not only to receptive field development, but also to connectivity patterns that reflect the neural code: for temporal coding paradigms strong connections are predominantly unidirectional, whereas they are bidirectional under rate coding. Thus, variable connectivity patterns in the brain could reflect different coding principles across brain areas.
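The core of such a rule can be sketched as a single weight update that combines a depression term gated by presynaptic spikes and a potentiation term gated by postsynaptic depolarization (a Clopath-style rule; all thresholds, amplitudes, and trace variables below are illustrative placeholders, not the paper's fitted values):

```python
# Minimal sketch of one time step of a voltage-based plasticity rule.
# LTD: triggered by a presynaptic spike when a slow voltage average is
#      above a low threshold.
# LTP: requires a presynaptic trace plus momentary and recent
#      depolarization above their thresholds.
def dw(pre_spike, u, u_bar_minus, u_bar_plus, x_bar,
       theta_minus=-70.6, theta_plus=-45.3,
       A_LTD=1e-4, A_LTP=1e-4):
    """Weight change for one time step (per unit dt).

    pre_spike   : 1 if a presynaptic spike arrives now, else 0
    u           : postsynaptic membrane potential (mV)
    u_bar_minus : slow low-pass filter of u, gates LTD
    u_bar_plus  : fast low-pass filter of u, gates LTP
    x_bar       : low-pass-filtered presynaptic spike train
    """
    ltd = -A_LTD * pre_spike * max(u_bar_minus - theta_minus, 0.0)
    ltp = (A_LTP * x_bar * max(u - theta_plus, 0.0)
           * max(u_bar_plus - theta_minus, 0.0))
    return ltd + ltp

# Strong depolarization with recent presynaptic activity: LTP dominates.
print(dw(pre_spike=0, u=-20.0, u_bar_minus=-60.0, u_bar_plus=-30.0, x_bar=0.5))
# Presynaptic spike against a near-resting neuron: LTD dominates.
print(dw(pre_spike=1, u=-65.0, u_bar_minus=-60.0, u_bar_plus=-65.0, x_bar=0.0))
```

Because potentiation is gated by depolarization rather than by postsynaptic spike timing alone, two neurons driving each other at high rates can both potentiate (bidirectional motifs under rate coding), whereas strict temporal ordering favors one direction.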
Modeling the Contributions of the Exocytotic Machinery and Receptor Desensitization to Short- and Long-Term Plasticity of Synapses Between Neocortical Pyramidal Neurons
Short-term synaptic depression (STD) refers to the progressive decrease in synaptic efficacy during a spike train. This decrease may be explained in terms of presynaptic and postsynaptic processes, such as a decrease in the probability of transmitter release, and postsynaptic receptor desensitization. STD may be very strong, and is release-dependent in neocortical pyramid-pyramid synapses. Using a stochastic synapse model, we suggest that the main source of depression in these synapses is the step of vesicle priming, while vesicle depletion and postsynaptic receptor desensitization are proposed to play a lesser role. Our results suggest that vesicle priming may explain not only the release-dependent nature of STD, but also the observation that an average of about one vesicle per active zone is released in central synapses, without positing forced univesicular release. We propose that the latter phenomenon is due to a low priming probability. Our results also explain the effect of paired pre- and postsynaptic activity on STD. In neocortical pyramid-pyramid synapses pairing induces a form of long-term potentiation that has been described as a redistribution of synaptic efficacy (RSE). We propose that RSE is due to a pairing-induced increase in the probability that a primed vesicle will undergo release in response to a presynaptic action potential. This increase may be due to an increased Ca^2+ influx through voltage-gated Ca^2+ channels, or to an increased sensitivity of primed vesicles to this influx. The results were obtained by constraining the model with experimentally observed levels of release probability and other synaptic variables.
Defense Advanced Research Projects Agency and the Office of Naval Research (N00014-95-1-0409); Office of Naval Research (N00014-95-1-0657)
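The priming-limited picture can be caricatured with a toy two-step model: each active zone holds at most one primed vesicle, refilling (priming) is slow, and release per spike is probabilistic. All probabilities here are illustrative, and this is a sketch of the model class, not the authors' full stochastic synapse model:

```python
import numpy as np

def spike_train_release(n_spikes, isi_steps, p_prime, p_rel, n_sites, rng):
    """Vesicles released per spike across a train, over n_sites active zones.

    Between spikes, an empty site becomes primed with probability
    p_prime per time step; on each presynaptic spike, every primed
    vesicle is released with probability p_rel. With slow priming,
    depression reflects the refilling bottleneck rather than
    depletion of a large pool.
    """
    primed = np.ones(n_sites, dtype=bool)    # start fully primed
    released = []
    for _ in range(n_spikes):
        rel = primed & (rng.random(n_sites) < p_rel)
        released.append(int(rel.sum()))
        primed &= ~rel                        # released sites are now empty
        for _ in range(isi_steps):            # slow refilling between spikes
            primed |= (~primed) & (rng.random(n_sites) < p_prime)
    return released

rng = np.random.default_rng(1)
out = spike_train_release(n_spikes=10, isi_steps=5, p_prime=0.02,
                          p_rel=0.9, n_sites=1000, rng=rng)
print(out[0], out[-1])   # strong first response, depressed steady state
```

In this caricature, a low steady-state priming occupancy also yields roughly one released vesicle per active zone per spike on average, without any rule forcing univesicular release.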
The Role of Synaptic Tagging and Capture for Memory Dynamics in Spiking Neural Networks
Memory serves to process and store information about experiences such that this information can be
used in future situations. The transfer from transient storage into long-term memory, which retains
information for hours, days, and even years, is called consolidation. In brains, information is primarily
stored via alteration of synapses, so-called synaptic plasticity. While these changes are at first in a
transient early phase, they can be transferred to a late phase, meaning that they become stabilized
over the course of several hours. This stabilization has been explained by so-called synaptic tagging
and capture (STC) mechanisms. To store and recall memory representations, emergent dynamics arise
from the synaptic structure of recurrent networks of neurons. This happens through so-called cell
assemblies, which feature particularly strong synapses. It has been proposed that the stabilization
of such cell assemblies by STC corresponds to so-called synaptic consolidation, which is observed in
humans and other animals in the first hours after acquiring a new memory. The exact connection
between the physiological mechanisms of STC and memory consolidation remains, however, unclear.
It is equally unknown which influence STC mechanisms exert on further cognitive functions that guide
behavior. On timescales of minutes to hours (that is, the timescales of STC) such functions include
memory improvement, modification of memories, interference and enhancement of similar memories,
and transient priming of certain memories. Thus, diverse memory dynamics may be linked to STC,
which can be investigated by employing theoretical methods based on experimental data from the
neuronal and the behavioral level.
In this thesis, we present a theoretical model of STC-based memory consolidation in recurrent networks of spiking neurons, which are particularly suited to reproduce biologically realistic dynamics.
Furthermore, we combine the STC mechanisms with calcium dynamics, which have been found to
guide the major processes of early-phase synaptic plasticity in vivo. In three included research articles as well as additional sections, we develop this model and investigate how it can account for a
variety of behavioral effects. We find that the model enables the robust implementation of the cognitive memory functions mentioned above. The main steps to achieve this are: 1. demonstrating the formation, consolidation, and improvement of memories represented by cell assemblies, 2. showing that
neuromodulator-dependent STC can retroactively control whether information is stored in a temporal
or rate-based neural code, and 3. examining the interaction of multiple cell assemblies with transient and
attractor dynamics in different organizational paradigms.
In summary, we demonstrate several ways by which STC controls the late-phase synaptic structure
of cell assemblies. Linking these structures to functional dynamics, we show that our STC-based model
implements functionality that can be related to long-term memory. Thereby, we provide a basis for the
mechanistic explanation of various neuropsychological effects.
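The interplay of early- and late-phase weights under tagging and capture can be written in reduced form: a large early-phase change sets a tag and decays, and while both the tag and plasticity-related proteins are present, the change is transferred to a stable late-phase weight. Thresholds, time constants, and the protein variable below are illustrative assumptions, not the thesis' fitted model:

```python
# Minimal sketch of synaptic tagging and capture (STC) dynamics.
def stc_step(h, z, p, dt, tau_h=3600.0, tau_z=3600.0,
             theta_tag=0.2, theta_pro=0.5):
    """One Euler step of a reduced STC model.

    h : early-phase weight change (transient, decays back to 0)
    z : late-phase weight (stable on long timescales)
    p : protein concentration (supplied by strong stimulation)
    """
    tagged = abs(h) > theta_tag          # tag set by a large early change
    dh = -h / tau_h * dt                 # early phase decays over hours
    # capture: consolidate only while tagged AND proteins are available
    dz = (p * h * dt / tau_z) if (tagged and p > theta_pro) else 0.0
    return h + dh, z + dz

h, z, p = 1.0, 0.0, 1.0                  # strong early change + proteins
for _ in range(10_000):                  # ~10 h with dt = 3.6 s
    h, z = stc_step(h, z, p, dt=3.6)
print(f"h={h:.3f}, z={z:.3f}")           # h decayed; change captured in z
```

With p below the protein threshold, the same loop leaves z at zero and the early-phase change simply fades, which is the transient-versus-consolidated distinction the abstract describes.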
Regulation of Irregular Neuronal Firing by Autaptic Transmission
The importance of self-feedback autaptic transmission in modulating
spike-time irregularity is still poorly understood. By using a biophysical
model that incorporates autaptic coupling, we here show that self-innervation
of neurons participates in the modulation of irregular neuronal firing,
primarily by regulating the occurrence frequency of burst firing. In
particular, we find that both excitatory and electrical autapses increase the
occurrence of burst firing, thus reducing neuronal firing regularity. In
contrast, inhibitory autapses suppress burst firing and therefore tend to
improve the regularity of neuronal firing. Importantly, we show that these
findings are independent of the firing properties of individual neurons, and as
such can be observed for neurons operating in different modes. Our results
provide an insightful mechanistic understanding of how different types of
autapses shape irregular firing at the single-neuron level, and they highlight
the functional importance of autaptic self-innervation in taming and modulating
neurodynamics.
Comment: 27 pages, 8 figures
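As a rough illustration of the setup (not the paper's biophysical model), an Izhikevich neuron with a delayed self-feedback current shows how the sign of the autapse enters the dynamics; the gain, delay, and pulse shape here are arbitrary choices:

```python
import numpy as np
from collections import deque

def izhikevich_with_autapse(g_aut, delay_steps, steps=20000, dt=0.5,
                            I_ext=10.0):
    """Izhikevich neuron with a delayed autaptic (self-feedback) current.

    g_aut > 0 mimics an excitatory autapse, g_aut < 0 an inhibitory one;
    the delayed pulse is a crude stand-in for chemical self-transmission.
    Returns the list of spike times (ms).
    """
    a, b, c, d = 0.02, 0.2, -65.0, 8.0       # regular-spiking parameters
    v, u = -65.0, b * -65.0
    spikes = deque([0.0] * delay_steps)       # delay line for self-feedback
    spike_times = []
    for t in range(steps):
        I = I_ext + g_aut * spikes.popleft()  # delayed autaptic input
        v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)
        fired = v >= 30.0
        if fired:
            v, u = c, u + d                   # spike reset
            spike_times.append(t * dt)
        spikes.append(10.0 if fired else 0.0)
    return spike_times

# Coefficient of variation of inter-spike intervals as an irregularity measure
for g in (-2.0, 0.0, 2.0):
    isi = np.diff(izhikevich_with_autapse(g, delay_steps=40))
    print(f"g_aut={g:+.1f}: CV of ISI = {isi.std() / isi.mean():.2f}")
```

This sketch only exposes the mechanism (each spike re-enters the neuron after a conduction/synaptic delay); reproducing the burst-frequency effects in the paper requires its full biophysical model.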