Importance of vesicle release stochasticity in neuro-spike communication
The aim of this paper is to propose a stochastic model for the vesicle release process, a part of neuro-spike communication. To this end, we study the biological events occurring in this process and use microphysiological simulations to observe how these events function. Since the most important source of variability in vesicle release probability is the opening of voltage-dependent calcium channels (VDCCs) followed by the influx of calcium ions through these channels, we propose a stochastic model for this event, while using a deterministic model for the other sources of variability. To capture the stochasticity of the calcium influx into the pre-synaptic neuron in our model, we study its statistics and find that it can be modeled by a distribution defined in terms of the Normal and Logistic distributions. This work was supported in part by ERC project MINERVA (ERC-2013-CoG #616922), EU project CIRCLE (EU-H2020-FET-Open #665564), and the TÜBİTAK graduate scholarship program (BIDEB-2215). MCell development is supported by the NIGMS-funded (P41GM103712) National Center for Multiscale Modeling of Biological Systems (MMBioS).
Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines
Recent studies have shown that synaptic unreliability is a robust and
sufficient mechanism for inducing the stochasticity observed in cortex. Here,
we introduce Synaptic Sampling Machines, a class of neural network models that
uses synaptic stochasticity as a means of Monte Carlo sampling and unsupervised
learning. Similar to the original formulation of Boltzmann machines, these
models can be viewed as a stochastic counterpart of Hopfield networks, but
where stochasticity is induced by a random mask over the connections. Synaptic
stochasticity plays the dual role of an efficient mechanism for sampling, and a
regularizer during learning akin to DropConnect. A local synaptic plasticity
rule implementing an event-driven form of contrastive divergence enables the
learning of generative models in an on-line fashion. Synaptic sampling machines
perform equally well using discrete-timed artificial units (as in Hopfield
networks) or continuous-timed leaky integrate & fire neurons. The learned
representations are remarkably sparse and robust to reductions in bit precision
and synapse pruning: removal of more than 75% of the weakest connections
followed by cursory re-learning causes a negligible performance loss on
benchmark classification tasks. The spiking neuron-based synaptic sampling
machines outperform existing spike-based unsupervised learners, while
potentially offering substantial advantages in terms of power and complexity,
and are thus promising models for on-line learning in brain-inspired hardware.
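The random-mask mechanism described above can be sketched in a few lines. This is an illustrative sketch, not the paper's implementation: the function name, shapes, and the rescaling by the release probability are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_forward(x, W, p_release=0.5):
    """One forward pass with Bernoulli synaptic failures: each connection
    independently transmits with probability p_release (a random mask over
    the connections), rescaled so the mean drive matches W @ x."""
    mask = rng.random(W.shape) < p_release
    return (W * mask) @ x / p_release

# Averaging many stochastic passes approaches the deterministic product W @ x,
# which is what makes unreliable synapses usable for Monte Carlo sampling.
W = rng.normal(size=(3, 5))
x = rng.normal(size=5)
samples = np.stack([stochastic_forward(x, W) for _ in range(5000)])
```

In the Synaptic Sampling Machine setting the mask doubles as a DropConnect-style regularizer, since every pass trains a different sparse subnetwork.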
Statistical approaches for synaptic characterization
Synapses are fascinatingly complex transmission units. One of the fundamental features of synaptic transmission is its stochasticity, as neurotransmitter release exhibits variability and possible failures. It is also quantised: postsynaptic responses to presynaptic stimulations are built up of several similar quanta of current, each arising from the release of one presynaptic vesicle. Moreover, synapses are dynamic transmission units, as their activity depends on the history of previous spikes and stimulations, a phenomenon known as synaptic plasticity. Finally, synapses exhibit a very broad range of dynamics, features, and connection strengths, depending on neuromodulator concentration [5], the age of the subject [6], their localization in the CNS or the PNS, or the type of neurons [7].
Addressing the complexity of synaptic transmission is a relevant problem for both biologists and theoretical neuroscientists. From a biological perspective, a finer understanding of transmission mechanisms would make it possible to study diseases that may be synapse-related, or to determine the locus of plasticity and homeostasis. From a theoretical perspective, different normative explanations for synaptic stochasticity have been proposed, including its possible role in uncertainty encoding, energy-efficient computation, or generalization during learning. A precise description of synaptic transmission will be critical for validating these theories and for understanding the functional relevance of this probabilistic and dynamical release.
A central issue, which is common to all these areas of research, is the problem of synaptic characterization. Synaptic characterization (also called synaptic interrogation [8]) refers to a set of methods for exploring synaptic functions, inferring the value of synaptic parameters, and assessing features such as plasticity and modes of release. This doctoral work sits at the crossroads of experimental and theoretical neuroscience: its main aim is to develop statistical tools and methods to improve synaptic characterization, and hence to bring quantitative solutions to biological questions.
In this thesis, we focus on model-based approaches to quantifying synaptic transmission, for which different methods are reviewed in Chapter 3. By fitting a generative model of postsynaptic currents to experimental data, it is possible to infer the values of the synapse's parameters. By performing model selection, we can compare different models of a synapse and thus quantify its features. The main goal of this thesis is thus to develop theoretical and statistical tools to improve the efficiency of both model fitting and model selection.
A first question that often arises when recording synaptic currents is how to precisely observe and measure a quantal transmission. As mentioned above, synaptic transmission has been observed to be quantised. Indeed, the opening of a single presynaptic vesicle (and the release of the neurotransmitters it contains) will create a stereotypical postsynaptic current q, which is called the quantal amplitude. As the number of activated presynaptic vesicles increases, the total postsynaptic current will increase in step-like increments of amplitude q. Hence, at chemical synapses, the postsynaptic responses to presynaptic stimulations are built up of k quanta of current, where k is a random variable corresponding to the number of open vesicles. Excitatory postsynaptic currents (EPSCs) thus follow a multimodal distribution, where each component has its mean located at a multiple kq, with k ∈ ℕ, and a width corresponding to the recording noise σ. If σ is large with respect to q, these components will fuse into a unimodal distribution, preventing the identification of quantal transmission and the computation of q. How can we characterize the regime of parameters in which quantal transmission can be identified? This question led us to define a practical identifiability criterion for statistical models, which is presented in Chapter 4. In doing so, we also derive a mean-field approach for fast likelihood computation (Appendix A) and discuss the possibility of using the Bayesian Information Criterion (a classically used model selection criterion) with correlated observations (Appendix B).
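The fusing of quantal peaks described above is easy to reproduce numerically. The following sketch draws EPSC amplitudes from a binomial-Gaussian mixture; the parameter values are arbitrary illustrations, not the thesis's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_epsc(n_trials, n_sites=5, p=0.5, q=10.0, sigma=1.0):
    """EPSC amplitude = k * q + noise, where k ~ Binomial(n_sites, p) is the
    number of released vesicles, q the quantal amplitude (pA), and sigma the
    recording-noise standard deviation."""
    k = rng.binomial(n_sites, p, size=n_trials)
    return k * q + rng.normal(0.0, sigma, size=n_trials)

# sigma << q: the amplitude histogram shows distinct peaks at multiples of q.
# sigma ~ q: the peaks fuse into a unimodal distribution and q is no longer
# identifiable from the amplitude histogram alone.
low_noise = simulate_epsc(100_000, sigma=1.0)
high_noise = simulate_epsc(100_000, sigma=8.0)
```

Both samples share the same mean (n_sites · p · q); only the identifiability of q differs, which is exactly the regime question the practical identifiability criterion addresses.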
A second question that is especially relevant for experimentalists is how to optimally stimulate the presynaptic cell in order to maximize the informativeness of the recordings. The parameters of a chemical synapse (namely, the number of presynaptic vesicles N, their release probability p, the quantal amplitude q, the short-term depression time constant τD, etc.) cannot be measured directly, but can be estimated from the synapse's postsynaptic responses to evoked stimuli. However, these estimates critically depend on the stimulation protocol being used. For instance, if inter-spike intervals are too large, no short-term plasticity will appear in the recordings; conversely, too high a stimulation frequency will deplete the presynaptic vesicles and yield poorly informative postsynaptic currents. How can we perform Optimal Experiment Design (OED) for synaptic characterization? We develop an Efficient Sampling-Based Bayesian Active Learning (ESB-BAL) framework, which is efficient enough to be used in real-time biological experiments (Chapter 5), and we propose a link between our definition of practical identifiability and Optimal Experiment Design for model selection (Chapter 6).
Finally, a third biological question to which we ought to bring a theoretical answer is how to make sense of the observed organization of synaptic proteins. Microscopy observations have shown that presynaptic release sites and postsynaptic receptors are organized in ring-like patterns, which are disrupted upon genetic mutations. In Chapter 7, we propose a normative approach to this protein organization, and suggest that it might optimize a certain biological cost function (e.g. the mean current or SNR after vesicle release).
The different theoretical tools and methods developed in this thesis are general enough to be applicable not only to synaptic characterization, but also to different experimental settings and systems studied in physiology. Overall, we expect to democratize and simplify the use of quantitative and normative approaches in biology, thus reducing the cost of experimentation in physiology and paving the way to more systematic and automated experimental designs.
Temporal and spatial factors affecting synaptic transmission in cortex
Synaptic transmission in cortex depends on both the history of synaptic activity and the location of individual anatomical contacts within the dendritic tree. This thesis analyses key aspects of the roles of both these factors and, in particular, extends many of the results for deterministic synaptic transmission to a more naturalistic stochastic framework.
Firstly, I consider how correlations in neurotransmitter vesicle occupancy arising from synchronous activity in a presynaptic population interact with the number of independent release sites, a parameter recently shown to be modified during long-term plasticity. I study a model of multiple-release-site short-term plasticity and derive exact results for the postsynaptic voltage variance. Using approximate results for the postsynaptic firing rate in the limits of low and high correlations, I demonstrate that short-term depression leads to a maximum response for an intermediate number of presynaptic release sites, and that this in turn leads to a tuning-curve response peaked at an optimal presynaptic synchrony set by the number of neurotransmitter release sites per presynaptic neuron. As the nervous system operates under constraints of efficient metabolism it is likely that this phenomenon provides an activity-dependent constraint on network architecture.
Secondly, I consider how synapses exhibiting short-term plasticity transmit spike trains when spike times are autocorrelated. I derive exact results for vesicle occupancy and postsynaptic voltage variance in the case that spiking is a renewal process, with uncorrelated interspike intervals (ISIs). The vesicle occupancy predictions are tested experimentally and shown to be in good agreement with the theory. I demonstrate that neurotransmitter is released at a higher rate when the presynaptic spike train is more regular, but that positively autocorrelated spike trains are better drivers of the postsynaptic voltage when the vesicle release probability is low. I provide accurate approximations to the postsynaptic firing rate, allowing future studies of neuronal circuits and networks with dynamic synapses to incorporate physiologically relevant spiking statistics.
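A minimal single-site sketch of this regularity effect, assuming exponential refilling of an empty release site; the function, parameter values, and refill rule are illustrative assumptions, not the models fitted in the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)

def release_rate(isis, p_release=0.8, tau_d=0.5):
    """Fraction of spikes that trigger release at one depressing release site.
    After a release the site is empty; it refills during the next interspike
    interval with probability 1 - exp(-isi / tau_d)."""
    has_vesicle, releases = True, 0
    for isi in isis:
        if has_vesicle and rng.random() < p_release:
            releases += 1
            has_vesicle = False
        if not has_vesicle:
            has_vesicle = rng.random() < 1.0 - np.exp(-isi / tau_d)
    return releases / len(isis)

rate = 5.0                                       # mean firing rate (Hz) for both trains
regular = np.full(20_000, 1.0 / rate)            # perfectly regular train
poisson = rng.exponential(1.0 / rate, 20_000)    # irregular (Poisson) train
```

Because the refill probability is concave in the interval length, irregular trains refill less on average (Jensen's inequality), so the regular train releases at a higher rate, consistent with the result quoted above.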
Thirdly, I develop a Bayesian inference method for synaptic parameters. This expands on recent Bayesian approaches in that the likelihood function is exact for both the quantal and dynamic synaptic parameters. This means that it can be used to directly estimate parameters for common synaptic models with few release sites. I apply the method to simulated and real data; demonstrating a substantial improvement over analysis techniques that are based around the mean and variance.
Finally, I consider a spatially extended neuron model where the dendrites taper away from the soma. I derive an accurate asymptotic solution for the voltage profile in a dendritic cable of arbitrary radius profile and use this to determine the profile that optimally transfers voltages to the soma. I find a precise quadratic form that matches results from non-parametric numerical optimisation. The equation predicts diameter profiles from reconstructed cells, suggesting that dendritic diameters optimise passive transfer of synaptic currents.
Biologically plausible attractor networks
Attractor networks have shown much promise as a neural network architecture
that can describe many aspects of brain function. Much of the field of study
around these networks has coalesced around pioneering work done by John
Hopfield, and therefore many approaches have been strongly linked to the field
of statistical physics. In this thesis I use existing theoretical and statistical notions
of attractor networks, and introduce several biologically inspired extensions
to an attractor network for which a mean-field solution has been previously
derived. This attractor network is a computational neuroscience model
that accounts for decision-making in the situation of two competing stimuli.
By basing our simulation studies on such a network, we are able to study situations where mean-
field solutions have been derived, and use these as the starting
case, which we then extend with large scale integrate-and-fire attractor network
simulations. The simulations are large enough to provide evidence that the results
apply to networks of the size found in the brain. One factor that has been
highlighted by previous research to be very important to brain function is that
of noise. Spiking-related noise is seen to be a factor that influences processes
such as decision-making, signal detection, short-term memory, and memory
recall even with the quite large networks found in the cerebral cortex, and this
thesis aims to measure the effects of noise on biologically plausible attractor
networks. Our results are obtained using a spiking neural network made up
of integrate-and-fire neurons, and we focus our results on the stochastic transition
that this network undergoes. In this thesis we examine two such processes
that are biologically relevant, but for which no mean-field solutions yet
exist: graded firing rates, and diluted connectivity. Representations in the cortex
are often graded, and we find that noise in these networks may be larger than
with binary representations. In further investigations it was shown that diluted
connectivity reduces the effects of noise in the situation where the number of
synapses onto each neuron is held constant. In this thesis we also use the same
attractor network framework to investigate the Communication through Coherence
hypothesis. The Communication through Coherence hypothesis states
that synchronous oscillations, especially in the gamma range, can facilitate communication
between neural systems. It is shown that information transfer from
one network to a second network occurs for a much lower strength of synaptic
coupling between the networks than is required to produce coherence. Thus,
information transmission can occur before any coherence is produced. This indicates
that coherence is not needed for information transmission between coupled
networks. This raises a major question about the Communication through
Coherence hypothesis. Overall, the results provide substantial contributions
towards understanding the operation of attractor neuronal networks in the brain.
Short-Term Plasticity at the Schaffer Collateral: A New Model with Implications for Hippocampal Processing
A new mathematical model of short-term synaptic plasticity (STP) at the Schaffer collateral is introduced. Like other models of STP, the new model relates short-term synaptic plasticity to an interaction between facilitative and depressive dynamic influences. Unlike previous models, the new model successfully simulates facilitative and depressive dynamics within the framework of the synaptic vesicle cycle. The novelty of the model lies in the description of a competitive interaction between calcium-sensitive proteins for binding sites on the vesicle release machinery. By attributing specific molecular causes to observable presynaptic effects, the new model of STP can predict the effects of specific alterations to the presynaptic neurotransmitter release mechanism. This understanding will guide further experiments into presynaptic functionality, and may contribute insights into the development of pharmaceuticals that target illnesses manifesting aberrant synaptic dynamics, such as Fragile-X syndrome and schizophrenia. The new model of STP will also add realism to brain circuit models that simulate cognitive processes such as attention and memory. The hippocampal processing loop is an example of a brain circuit involved in memory formation. The hippocampus filters and organizes large amounts of spatio-temporal data in real time according to contextual significance. The role of synaptic dynamics in the hippocampal system is speculated to help keep the system close to a region of instability that increases encoding capacity and discriminating capability. In particular, synaptic dynamics at the Schaffer collateral are proposed to coordinate the output of the highly dynamic CA3 region of the hippocampus with the phase-code in the CA1 that modulates communication between the hippocampus and the neocortex.
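The abstract does not give the model's equations, but the facilitation/depression interaction it builds on is conventionally captured by the Tsodyks-Markram formalism. The following is a sketch of one classic discrete-update form of that formalism, with arbitrary parameter values, not the new model itself.

```python
import numpy as np

def tsodyks_markram(spike_times, U=0.2, tau_rec=0.5, tau_fac=0.1):
    """Relative PSC amplitudes under a classic discrete STP update:
    u is the fraction of resources released per spike (facilitates),
    x is the available resource pool (depletes and recovers)."""
    u, x, last_t = U, 1.0, None
    amplitudes = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)  # resource recovery
            u = U + (u - U) * np.exp(-dt / tau_fac)      # facilitation decay
        amplitudes.append(u * x)   # amplitude of this PSC (relative units)
        x -= u * x                 # depression: resources consumed by release
        u += U * (1.0 - u)         # facilitation: u steps toward 1 after a spike
        last_t = t
    return amplitudes

amps = tsodyks_markram([0.0, 0.05, 0.10, 0.15, 0.20])   # a 20 Hz train
```

With these parameters facilitation dominates the first few responses before depletion takes over, which is the kind of interaction the new model re-derives from the vesicle cycle.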
Signalling properties at single synapses and within the interneuronal network in the CA1 region of the rodent hippocampus
Understanding how the complexity of connections among the neurons in the brain is
established and modified in an experience- and activity-dependent way is a challenging
task of Neuroscience. Although much progress has been made in recent decades in
characterising the basic mechanisms of synaptic transmission, a full comprehension of
how information is transferred and processed by neurons has not yet been achieved.
In the present study, theoretical tools and patch clamp experiments were used to further
investigate synaptic transmission, focusing on quantal transmission at single synapses
and on different types of signalling at the level of a particular interneuronal network in
the CA1 area of the rodent hippocampus.
The simultaneous release of more than one vesicle from an individual presynaptic active
zone is a typical mechanism that can affect the strength and reliability of synaptic
transmission. At many central synapses, however, release caused by a single presynaptic
action potential is limited to one vesicle (univesicular release). The likelihood of
multivesicular release at a particular synapse has been tied to release probability (Pr), and
whether it can occur at Schaffer collateral–CA1 synapses, at which Pr ranges widely, is
controversial. In contrast with previous findings, evidence of multivesicular release at this
synapse has recently been obtained at late developmental stages; in the newborn
hippocampus, however, it is still difficult to find strong evidence in either direction.
In order to address this point, in the first part of this study a simple and general stochastic
model of synaptic release has been developed and analytically solved. The model
solution gives analytical mathematical expressions relating basic quantal parameters with
average values of quantities that can be measured experimentally. Comparing these
quantities with the experimental measurements makes it possible to determine the most probable values
of the quantal parameters and to discriminate the univesicular from the multivesicular
mode of glutamate release. The model has been validated with data previously collected
at glutamatergic CA3–CA1 synapses in the hippocampus of newborn (P1–P5) rats.
The results strongly support a multivesicular type of release process requiring a variable
pool of immediately releasable vesicles. Moreover, by computing quantities that are
functions of the model parameters, the mean amplitude of the synaptic response to the release of a single vesicle (Q) was estimated to be 5-10 pA, in very good agreement with
experimental findings. In addition, a multivesicular type of release was supported by
several lines of experimental evidence: a high variability of the amplitude of successes, with a
coefficient of variation ranging from 0.12 to 0.73; an average potency ratio a2/a1 between
the second and first responses to a pair of stimuli greater than 1; and changes in the
potency of the synaptic response to the first stimulus when the release probability was
modified by increasing or decreasing the extracellular calcium concentration. This work
indicates that at glutamatergic CA3-CA1 synapses of the neonatal rat hippocampus a
single action potential may induce the release of more than one vesicle from the same
release site.
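The coefficient of variation and potency ratio cited above are simple summary statistics; the following sketch shows how they are computed. The amplitude values are made up for illustration, not the thesis's data.

```python
import numpy as np

def coefficient_of_variation(successes):
    """CV of success amplitudes: sample standard deviation / mean."""
    a = np.asarray(successes, dtype=float)
    return a.std(ddof=1) / a.mean()

# Hypothetical success amplitudes (pA) for the first and second responses
# to paired-pulse stimulation.
first = [12.0, 18.0, 9.0, 22.0, 15.0]
second = [20.0, 25.0, 17.0, 28.0, 24.0]

cv_first = coefficient_of_variation(first)
potency_ratio = np.mean(second) / np.mean(first)   # a2/a1
```

Under strict univesicular release the potency of successes is pinned at Q, so a high CV of successes and a potency ratio above 1 both point toward multivesicular release, which is why they serve as evidence above.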
In a more systemic approach to the analysis of communication between neurons, it is
interesting to investigate more complex, network interactions. GABAergic interneurons
constitute a heterogeneous group of cells which exert a powerful control on network
excitability and are responsible for the oscillatory behaviour crucial for information
processing in the brain. They have been differently classified according to their
morphological, neurochemical and physiological characteristics.
In the second part of this study, whole cell patch clamp recordings were used to further
characterize, in transgenic mice expressing EGFP in a subpopulation of GABAergic
interneurons containing somatostatin (GIN mice), the functional properties of EGFP-positive
cells in the stratum oriens of the CA1 region of the hippocampus, in slice cultures
obtained from P8 animals. These cells showed passive and active membrane
properties similar to those found in stratum oriens interneurons projecting to stratum
lacunosum-moleculare. Moreover, they exhibited different firing patterns which were
maintained upon membrane depolarization: irregular (48%), regular (30%) and clustered
(22%). Paired recordings from EGFP-positive cells often revealed electrical coupling
(47% of the cases), which was abolished by carbenoxolone (200 mM). On average, the
coupling coefficient was 0.21 ± 0.07. When electrical coupling was particularly strong, it
acted as a powerful low-pass filter, thus contributing to alter the output of individual
cells. The dynamic interaction between cells with various firing patterns may differently
control GABAergic signalling, leading, as suggested by simulation data, to a wide range
of interneuronal communication. In additional paired recordings of a presynaptic EGFP-positive interneuron and a postsynaptic principal cell, trains of action potentials in
interneurons rarely evoked GABAergic postsynaptic currents (3/45 pairs); these currents had small
amplitudes and slow kinetics and, at 20 Hz, exhibited short-term depression. In
contrast, excitatory connections between principal cells and EGFP-positive interneurons
were found more often (17/55 pairs) and exhibited a frequency and use-dependent
facilitation, particularly in the gamma band. In conclusion, it appears that EGFP-positive
interneurons in stratum oriens of GIN mice constitute a heterogeneous population of cells
interconnected via electrical synapses, exhibiting particular features in their chemical and
electrical synaptic signalling. Moreover, the dynamic interaction between these
interneurons may differentially affect target cells and neuronal communication within the
hippocampal network.
Noise induced processes in neural systems
Real neurons, and their networks, are far too complex to be described exactly by simple
deterministic equations. Any description of their dynamics must therefore incorporate noise
to some degree. It is my thesis that the nervous system is organized in such a way that its
performance is optimal, subject to this constraint. I further contend that neuronal dynamics
may even be enhanced by noise, when compared with their deterministic counterparts.
To support my thesis I will present and analyze three case studies. I will show how noise
might (i) extend the dynamic range of mammalian cold-receptors and other cells that
exhibit a temperature-dependent discharge; (ii) feature in the perception of ambiguous
figures such as the Necker cube; (iii) alter the discharge pattern of single cells
- …