
    Stochastic Modelling of Calcium Dynamics

    Calcium (Ca2+) is a ubiquitous second messenger in eukaryotic cells. Inositol trisphosphate (IP3)-induced Ca2+ signalling via IP3 receptors (IP3Rs) is one of the most universal signalling systems used by cells to transmit information. Ca2+ signalling is noisy and fundamentally stochastic. Yet modelling of IP3-induced Ca2+ signalling has in the past relied heavily on deterministic approaches with ordinary differential equations, established as rate equations using spatially averaged Ca2+ concentrations. These approaches neglect the defining features of Ca2+ signalling: noise and fluctuations. In this thesis, we propose a stochastic model of Ca2+ spike generation in terms of a linear state chain whose state variable is the number of open clusters and which includes recovery from negative feedback. We identify a Ca2+ spike with reaching a critical state for the first time, and its interspike interval with the first-passage time (FPT) to that state. To this end, we first develop a general mathematical framework for analytically computing first-passage times on such a linear chain. We find, among other results, a substantially reduced coefficient of variation (CV) with a pronounced minimum as a function of the chain length N, which we term the resonant length. Positive feedback is then added to the model, and it is applied directly to various cell types. The model is fundamentally stochastic, captures all available general observations on Ca2+ signalling, and provides insight into the relationship between agonist strength and puff rates.
Also, we specifically study single Ca2+ spikes in spines of Purkinje neurons, which are assumed to be important for motor learning and memory, using MCell to simulate a reaction-diffusion system in a complex 3D Purkinje spine geometry. The model successfully reproduces experimental findings on the properties of Ca2+ spikes. Ataxia, a pathological condition that results in, for example, a loss of fine motor control and is assumed to be caused by malfunctioning IP3Rs, is also modelled, and a possible route to recovery of normal Ca2+ spikes is suggested.
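
    The interspike-interval calculation described above is a first-passage problem on a linear chain. As a rough illustration of that idea, the Python sketch below evaluates the standard nested-sum formula for the mean first-passage time from state 0 to state N on a birth-death chain; the opening and closing rates are hypothetical placeholders, not the state-dependent IP3R cluster rates of the thesis, which also includes recovery from negative feedback.

```python
import numpy as np

def mean_fpt_birth_death(lam, mu, N):
    """Mean first-passage time from state 0 to state N on a birth-death
    chain with opening (birth) rates lam[k] for k = 0..N-1 and closing
    (death) rates mu[k] for k = 1..N-1 (state 0 is reflecting).
    Uses the standard nested-sum formula."""
    total = 0.0
    for k in range(N):                 # sum of mean passage times k -> k+1
        step = 0.0
        for j in range(k + 1):
            prod = 1.0
            for i in range(j + 1, k + 1):
                prod *= mu[i] / lam[i]
            step += prod / lam[j]
        total += step
    return total

# Hypothetical example: a chain of N = 10 states with constant rates
N = 10
lam = np.full(N, 2.0)   # rate of opening one more cluster
mu = np.full(N, 1.0)    # rate of closing one cluster (mu[0] is unused)
print(mean_fpt_birth_death(lam, mu, N))
```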

    Learning in clustered spiking networks

    Neurons spike on a millisecond time scale while behaviour typically spans hundreds of milliseconds to seconds and longer. Neurons have to bridge this time gap when computing and learning behaviours of interest. Recent computational work has shown that neural circuits can bridge this gap when connected in specific ways, and that the required connectivity patterns can develop through plasticity rules typically considered biologically plausible. In this thesis, we focus on one type of connectivity in which excitatory neurons are grouped into clusters. Strong recurrent connectivity within the clusters reverberates the activity and prolongs the time scales in the network. In this way, the clusters of neurons become the basic functional units of the circuit, in line with a growing number of experimental studies. We study a general architecture in which plastic synapses connect the clustered network to a read-out network, and we demonstrate its usefulness for two different problems: 1) learning and replaying sequences; 2) learning statistical structure. The time scales in both problems range from hundreds of milliseconds to seconds, and we address them through simulation and analysis of spiking networks. We show that the clustered organization circumvents the need for biologically implausible mathematical optimization and instead allows the use of unsupervised spike-timing-dependent plasticity rules. Additionally, we make qualitative links to experimental findings and offer predictions for both problems studied. Finally, we speculate about future directions that could extend our findings.
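
    The unsupervised plasticity referred to above is of the pair-based spike-timing-dependent type. As a minimal, illustrative sketch (amplitudes, time constants, and spike times are not taken from the thesis), the following snippet applies such a rule to a single plastic synapse connecting the clustered network to the read-out network.

```python
import numpy as np

def stdp_dw(delta_t, A_plus=0.01, A_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for a single pre/post spike pair.
    delta_t = t_post - t_pre (ms): potentiation when the postsynaptic
    spike follows the presynaptic one, depression otherwise.
    Amplitudes and time constants are illustrative."""
    if delta_t >= 0:
        return A_plus * np.exp(-delta_t / tau_plus)
    return -A_minus * np.exp(delta_t / tau_minus)

# A single plastic read-out synapse updated over all pre/post spike pairs
pre_spikes = np.array([10.0, 35.0, 60.0])    # ms, hypothetical
post_spikes = np.array([12.0, 70.0])         # ms, hypothetical
w = 0.5
for t_pre in pre_spikes:
    for t_post in post_spikes:
        w += stdp_dw(t_post - t_pre)
w = np.clip(w, 0.0, 1.0)   # keep the weight within hard bounds
print(w)
```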

    The enhanced rise and delayed fall of memory in a model of synaptic integration: extension to discrete state synapses

    Integrate-and-express models of synaptic plasticity propose that synapses may act as low-pass filters, integrating synaptic plasticity induction signals in order to discern trends before expressing synaptic plasticity. We have previously shown that synaptic filtering strongly controls destabilizing fluctuations in developmental models. When applied to palimpsest memory systems that learn new memories by forgetting old ones, we have also shown that with binary-strength synapses, integrative synapses lead to an initial rise in the memory signal before its fall back to equilibrium. Such an initial rise is in dramatic contrast to nonintegrative synapses, in which the memory signal falls monotonically. We now extend our earlier analysis of palimpsest memories with synaptic filters to the more general case of discrete-state, multilevel synapses. We derive exact results for the memory signal dynamics and then consider various simplifying approximations. We show that multilevel synapses enhance the initial rise in the memory signal and then delay its subsequent fall by inducing a plateau-like region in the memory signal. Such dynamics significantly increase memory lifetimes, defined by a signal-to-noise ratio (SNR). We derive expressions for the optimal choices of synaptic parameters (filter size, number of strength states, number of synapses) that maximize SNR memory lifetimes. However, we find that with memory lifetimes defined via mean first-passage times, such optimality conditions do not exist, suggesting that optimality may be an artifact of SNR-based measures.
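
    For readers unfamiliar with integrate-and-express synapses, the following sketch shows the basic filtering idea with a multilevel, discrete-state synapse: induction signals are accumulated, and a strength step is expressed only when the accumulated count reaches a threshold. The threshold, number of states, and induction statistics are illustrative and are not the parameters analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def filtered_synapse(inductions, theta=5, n_states=4):
    """Integrate-and-express synapse sketch: a filter counts plasticity
    induction signals (+1 potentiating, -1 depressing); only when the
    count reaches +theta or -theta does the discrete strength move up or
    down one level (n_states levels in total), after which the filter
    resets.  Parameter values are illustrative."""
    strength, counter = n_states // 2, 0
    for s in inductions:
        counter += s
        if counter >= theta:
            strength = min(strength + 1, n_states - 1)
            counter = 0
        elif counter <= -theta:
            strength = max(strength - 1, 0)
            counter = 0
    return strength

# Random, unbiased induction signals: the filter suppresses fluctuations
inductions = rng.choice([+1, -1], size=1000)
print(filtered_synapse(inductions))
```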

    Sample Path Analysis of Integrate-and-Fire Neurons

    Computational neuroscience is concerned with answering two intertwined questions that are based on the assumption that spatio-temporal patterns of spikes form the universal language of the nervous system. First, what function does a specific neural circuit perform in the elaboration of a behavior? Second, how do neural circuits process behaviorally relevant information? Non-linear systems analysis has proven instrumental in understanding the coding strategies of early neural processing in various sensory modalities. Yet, at higher levels of integration, it fails to help in deciphering the response of assemblies of neurons to complex naturalistic stimuli. While neural activity can be assumed to be primarily driven by the stimulus at early stages of processing, at the cortical level the intrinsic activity of neural circuits interacts with their high-dimensional input and transforms it in a stochastic, non-linear fashion. As a consequence, any attempt to fully understand the brain through a systems-analysis approach alone becomes illusory. However, it is increasingly advocated that neural noise plays a constructive role in neural processing, facilitating information transmission. This prompts us to seek insight into the neural code by studying the stochasticity of neuronal activity, which is viewed as biologically relevant. Such an endeavor requires the design of guiding theoretical principles to assess the potential benefits of neural noise. In this context, meeting the requirements of biological relevance and computational tractability while providing a stochastic description of neural activity prescribes the adoption of the integrate-and-fire model. In this thesis, building on the path-wise description of neuronal activity, we propose to further the stochastic analysis of the integrate-and-fire model through a combination of numerical and theoretical techniques. To begin, we expand upon the path-wise construction of linear diffusions as inhomogeneous Markov chains, which offers a natural setting for describing leaky integrate-and-fire neurons. Based on the theoretical analysis of the first-passage problem, we then explore the interplay between internal neuronal noise and the statistics of injected perturbations at the single-unit level, and examine its implications for neural coding. At the population level, we also develop an exact event-driven implementation of a Markov network of perfect integrate-and-fire neurons with time-delayed, instantaneous interactions and arbitrary topology. We hope our approach will provide new paradigms for understanding how sensory inputs perturb intrinsic neural activity, and will help accomplish the goal of developing new techniques for identifying relevant patterns of population activity. From a perturbative perspective, our study shows how injecting frozen noise in different flavors can help characterize internal neuronal noise, which is presumably functionally relevant to information processing. From a simulation perspective, our event-driven framework is amenable to scrutinizing the stochastic behavior of simple recurrent motifs as well as the temporal dynamics of large-scale networks under spike-timing-dependent plasticity.
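
    The path-wise viewpoint can be made concrete with a simulated sample path of a noisy leaky integrate-and-fire neuron. The Euler-Maruyama sketch below is only a discretised stand-in for the exact constructions developed in the thesis; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def lif_sample_path(T=1.0, dt=1e-4, tau=0.02, mu=1.2, sigma=0.2,
                    v_th=1.0, v_reset=0.0):
    """Euler-Maruyama sample path of a leaky integrate-and-fire neuron,
    tau dV = (mu - V) dt + sigma sqrt(tau) dW, with threshold-and-reset.
    Returns the spike times.  Parameter values are illustrative."""
    n_steps = int(T / dt)
    v, spikes = v_reset, []
    for i in range(n_steps):
        v += (mu - v) / tau * dt + sigma * np.sqrt(dt / tau) * rng.standard_normal()
        if v >= v_th:
            spikes.append(i * dt)   # record crossing time, then reset
            v = v_reset
    return np.array(spikes)

print(lif_sample_path()[:5])
```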

    Implications of stochastic ion channel gating and dendritic spine plasticity for neural information processing and storage

    On short timescales, the brain represents, transmits, and processes information through the electrical activity of its neurons. On long timescales, the brain stores information in the strengths of the synaptic connections between its neurons. This thesis examines the surprising implications of two separate, well-documented microscopic processes — the stochastic gating of ion channels and the plasticity of dendritic spines — for neural information processing and storage. Electrical activity in neurons is mediated by many small membrane proteins called ion channels. Although single ion channels are known to open and close stochastically, the macroscopic behaviour of populations of ion channels is often approximated as deterministic, based on the assumption that the intrinsic noise introduced by stochastic ion channel gating is so weak as to be negligible. In this study we take advantage of newly developed, efficient computer simulation methods to examine cases where this assumption breaks down. We find that ion channel noise can mediate spontaneous action potential firing in small nerve fibres, and we explore its possible implications for neuropathic pain disorders of peripheral nerves. We then characterise the magnitude of ion channel noise for single neurons in the central nervous system and demonstrate through simulation that channel noise is sufficient to corrupt synaptic integration, spike timing, and spike reliability in dendritic neurons. The second topic concerns neural information storage. Learning and memory in the brain have long been believed to be mediated by changes in the strengths of synaptic connections between neurons — a phenomenon termed synaptic plasticity. Most excitatory synapses in the brain are hosted on small membrane structures called dendritic spines, and plasticity of these synapses depends on calcium concentration changes within the dendritic spine. In the last decade, it has become clear that spines are highly dynamic structures that appear and disappear, and can shrink and enlarge on rapid timescales, and that this spine structural plasticity is intimately linked to synaptic plasticity: small spines host weak synapses, and large spines host strong synapses. Because spine size is one factor that determines synaptic calcium concentration, spine structural plasticity is likely to influence the rules of synaptic plasticity. We theoretically study the consequences of this observation and find that different spine-size to synaptic-strength relationships can lead to qualitative differences in long-term synaptic strength dynamics and information storage. This novel theory unifies much existing, disparate data, including the unimodal distribution of synaptic strengths, the saturation of synaptic plasticity, and the stability of strong synapses.
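
    The contrast between stochastic channel gating and its deterministic approximation is easy to see in a toy model. The Gillespie-style sketch below simulates a small population of two-state channels exactly; the rates and population size are arbitrary illustrative values, not the channel kinetics used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)

def gillespie_two_state(n_channels=100, alpha=5.0, beta=10.0, T=1.0):
    """Exact stochastic simulation (Gillespie) of a population of two-state
    ion channels, closed <-> open, with per-channel opening rate alpha and
    closing rate beta (1/s).  Returns event times and open-channel counts.
    Rates are illustrative, not fitted to any channel type."""
    t, n_open = 0.0, 0
    times, opens = [0.0], [0]
    while t < T:
        r_open = alpha * (n_channels - n_open)   # total opening propensity
        r_close = beta * n_open                  # total closing propensity
        r_total = r_open + r_close
        t += rng.exponential(1.0 / r_total)      # waiting time to next event
        if rng.random() < r_open / r_total:
            n_open += 1
        else:
            n_open -= 1
        times.append(t)
        opens.append(n_open)
    return np.array(times), np.array(opens)

times, opens = gillespie_two_state()
print(opens.mean())   # fluctuates around n_channels * alpha / (alpha + beta)
```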

    Mesoscale Systems, Finite Size Effects, and Balanced Neural Networks

    Cortical populations are typically in an asynchronous state, sporadically interrupted by brief epochs of coordinated population activity. Current cortical models are at a loss to explain this combination of states. At one extreme are network models in which recurrent inhibition dynamically stabilizes an asynchronous low-activity state. While these networks are widely used, they cannot produce the coherent population-wide activity reported in a variety of datasets. At the other extreme are models in which short-term synaptic depression between excitatory neurons can generate the epochs of population-wide activity. However, in these networks inhibition plays only a perfunctory role in network stability, which is at odds with many reports across cortex. In this study we analyze spontaneously active in vitro preparations of primary auditory cortex that show dynamics emblematic of this mixture of states. To capture this complex population activity we consider models in which large excitation is balanced by recurrent inhibition, yet we include short-term synaptic depression dynamics of the excitatory connections. This model gives very rich nonlinear behavior that mimics the core features of the in vitro data, including the possibility of low-frequency (2-12 Hz) rhythmic dynamics within population events. Our study extends balanced network models to account for nonlinear, population-wide correlated activity, thereby providing a critical step towards a mechanistic theory of realistic cortical activity. We further investigate an extension of this model that exhibits clearly non-Arrhenius behavior, whereby lower-noise systems may escape a stable state faster. We show that this behavior is due to the system-size-dependent vector field, which intrinsically links noise and dynamics.
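
    The short-term depression dynamics invoked above can be sketched in a few lines. The snippet below follows a Tsodyks-Markram-style resource variable for one excitatory connection driven by a regular presynaptic train; the time constant and utilization fraction are illustrative and are not fitted to the in vitro data analyzed here.

```python
import numpy as np

def synaptic_depression(spike_times, tau_D=0.2, U=0.5, dt=1e-3, T=2.0):
    """Short-term depression sketch: the available resource x recovers
    towards 1 with time constant tau_D (s) and is depleted by a fraction
    U at every presynaptic spike.  The effective drive of a spike is
    proportional to U * x at spike time.  Values are illustrative."""
    n = int(T / dt)
    x = 1.0
    trace = np.empty(n)
    spike_idx = set(np.round(np.asarray(spike_times) / dt).astype(int).tolist())
    for i in range(n):
        x += (1.0 - x) / tau_D * dt      # recovery between spikes
        if i in spike_idx:
            x -= U * x                   # depletion at a presynaptic spike
        trace[i] = x
    return trace

# 20 Hz presynaptic train: the resource, and hence the drive, runs down
trace = synaptic_depression(spike_times=np.arange(0.1, 2.0, 0.05))
print(trace[::500])
```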

    Contributions of synaptic filters to models of synaptically stored memory

    The question of how neural systems encode memories in one shot without immediately disrupting previously stored information has puzzled theoretical neuroscientists for years, and it is the central topic of this thesis. Previous work on this topic has proposed that synapses update probabilistically in response to plasticity-inducing stimuli, effectively delaying the degradation of old memories in the face of ongoing memory storage. Indeed, experiments have shown that synapses do not immediately respond to plasticity-inducing stimuli, since these must be presented many times before synaptic plasticity is expressed. Such a delay could be due to the stochastic nature of synaptic plasticity, or perhaps because induction signals are integrated before overt strength changes occur. The latter approach has previously been applied to control fluctuations in neural development by low-pass filtering induction signals before plasticity is expressed. In this thesis we consider memory dynamics in a mathematical model with synapses that integrate plasticity induction signals to a threshold before expressing plasticity. We report novel recall dynamics and considerable improvements in memory lifetimes compared with a prominent model of synaptically stored memory. With integrating synapses the memory trace initially rises before reaching a maximum and then falls. The memory signal dissociates into separate oblivescence and reminiscence components, with reminiscence initially dominating recall. Furthermore, we find that integrating synapses possess natural timescales that can be used to study the transition to late-phase plasticity under spaced repetition patterns known to lead to optimal storage conditions. We find that threshold-crossing statistics differentiate between massed and spaced memory repetition patterns. However, isolated integrative synapses obtain an insufficient statistical sample to detect the stimulation pattern within a few memory repetitions. We extend the model to consider the cooperation of well-known intracellular signalling pathways in detecting storage conditions by utilizing the profile of postsynaptic depolarization. We find that neuron-wide signalling and local synaptic signals can be combined to detect optimal storage conditions that lead to stable forms of plasticity in a synapse-specific manner. These models can be further extended to consider heterosynaptic and neuromodulatory interactions for late-phase plasticity.
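
    As a caricature of how threshold-crossing statistics can separate massed from spaced repetition, the sketch below runs a leaky counter over two repetition schedules and counts its threshold crossings. This is a simplified stand-in, not the filter model analysed in the thesis; the threshold, decay time constant, and schedules are hypothetical.

```python
import numpy as np

def threshold_crossings(induction_times, theta=3, tau_decay=10.0):
    """Count upward threshold crossings of a leaky counter that integrates
    potentiating induction signals: the counter decays exponentially with
    time constant tau_decay (s) between inductions and jumps by 1 at each
    induction; crossing theta resets it.  Values are illustrative."""
    f, last_t, crossings = 0.0, 0.0, 0
    for t in np.sort(np.asarray(induction_times, dtype=float)):
        f *= np.exp(-(t - last_t) / tau_decay)   # decay since last induction
        f += 1.0                                 # one more induction signal
        last_t = t
        if f >= theta:
            crossings += 1
            f = 0.0
    return crossings

massed = np.linspace(0.0, 2.0, 10)     # 10 repetitions within 2 s
spaced = np.linspace(0.0, 300.0, 10)   # 10 repetitions over 5 min
print(threshold_crossings(massed), threshold_crossings(spaced))
```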

    Memory nearly on a spring: A mean first passage time approach to memory lifetimes

    We study memory lifetimes in a perceptron-based framework with binary synapses, defining the memory lifetime as the mean first-passage time for the perceptron's total input to fall below its firing threshold. Working with the simplest memory-related model of synaptic plasticity, we may obtain exact results for memory lifetimes or, working in the continuum limit, good analytical approximations that afford either much qualitative insight or extremely good quantitative agreement. In one particular limit, we find that memory dynamics reduce to the well-understood Ornstein-Uhlenbeck process. We show that, asymptotically, the lifetimes of memories grow logarithmically in the number of synapses when the perceptron's firing threshold is zero, reproducing standard results from signal-to-noise ratio analyses. However, this result is only asymptotically valid, and we show that extending its application outside the range of its validity leads to a massive overestimate of the minimum number of synapses required for successful memory encoding. In the case that the perceptron's firing threshold is positive, we find the remarkable result that memory lifetimes are strictly bounded from above. Asymptotically, the dependence of memory lifetimes on the number of synapses drops out entirely, and this asymptotic result provides a strict upper bound on memory lifetimes away from the asymptotic regime. The classic logarithmic growth of memory lifetimes in the simplest, palimpsest memories is therefore untypical and non-generic: memory lifetimes are typically strictly bounded from above.
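
    In the Ornstein-Uhlenbeck limit mentioned above, the memory lifetime becomes a first-passage problem that is easy to estimate numerically. The Monte Carlo sketch below estimates the mean first-passage time of an OU process below a threshold; the starting value, threshold, and noise parameters are illustrative and are not the quantities derived exactly in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def ou_mfpt_below(theta=0.0, h0=1.0, tau=1.0, sigma=0.5, dt=1e-3,
                  n_paths=500, t_max=50.0):
    """Monte Carlo estimate of the mean first-passage time of an
    Ornstein-Uhlenbeck process dh = -(h/tau) dt + sigma dW below a
    threshold theta, starting at h0.  A crude stand-in for the memory
    signal falling below the perceptron's firing threshold; paths that
    have not crossed by t_max are counted at t_max, which slightly
    biases the estimate.  Parameter values are illustrative."""
    fpts = []
    for _ in range(n_paths):
        h, t = h0, 0.0
        while h > theta and t < t_max:
            h += -(h / tau) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
        fpts.append(t)
    return np.mean(fpts)

print(ou_mfpt_below())
```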