
    A biophysical model explains the spontaneous bursting behavior in the developing retina

    During early development, waves of activity propagate across the retina and play a key role in the proper wiring of the early visual system. During stage II, these waves are triggered by a transient network of neurons, called Starburst Amacrine Cells (SACs), which show a bursting activity that disappears upon further maturation. While several models have attempted to reproduce retinal waves, none of them is able to mimic the rhythmic autonomous bursting of individual SACs and reveal how these cells change their intrinsic properties during development. Here, we introduce a mathematical model, grounded in biophysics, which enables us to reproduce the bursting activity of SACs and to propose a plausible, generic, and robust mechanism that generates it. The core parameters controlling repetitive firing are fast depolarizing voltage-gated calcium channels and hyperpolarizing voltage-gated potassium channels. The quiescent phase of bursting is controlled by a slow afterhyperpolarization (sAHP), mediated by calcium-dependent potassium channels. Based on a bifurcation analysis, we show how biophysical parameters regulating calcium and potassium activity control the spontaneously occurring fast oscillatory activity followed by long refractory periods in individual SACs. We make a testable experimental prediction on the role of voltage-dependent potassium channels in the excitability of SACs and on how this excitability evolves during development. We also propose an explanation of how SACs can exhibit large variability in their bursting periods, as observed experimentally within a SAC network as well as across species, while relying on a single, simple mechanism. As we discuss, these observations at the cellular level have a deep impact on the description of retinal waves. Comment: 25 pages, 13 figures, submitted
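
    A hedged illustration of the mechanism described above: a generic conductance-based current-balance equation with a fast depolarizing calcium current, a fast hyperpolarizing potassium current, and a slow calcium-gated potassium current producing the sAHP. The specific gating variables and the fourth-power sAHP gate below are assumptions for illustration, not the paper's exact formulation.

        % Illustrative current balance for a bursting SAC (assumed form):
        % fast Ca and K currents generate the burst; a slow Ca-dependent K
        % current (sAHP) terminates it and sets the long quiescent phase.
        C \frac{dV}{dt} = - g_{Ca}\, m_{\infty}(V)\,(V - V_{Ca})
                          - g_{K}\, n\,(V - V_{K})
                          - g_{sAHP}\, R^{4}\,(V - V_{K})
                          - g_{L}\,(V - V_{L})
        % n : fast voltage-gated potassium gating variable
        % R : slow calcium-dependent gating variable mediating the sAHP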

    Stochastic Representations of Ion Channel Kinetics and Exact Stochastic Simulation of Neuronal Dynamics

    In this paper we provide two representations for stochastic ion channel kinetics, and compare the performance of exact simulation with a commonly used numerical approximation strategy. The first representation we present is a random time change representation, popularized by Thomas Kurtz, with the second being analogous to a "Gillespie" representation. Exact stochastic algorithms are provided for the different representations, which are preferable to either (a) fixed time step or (b) piecewise constant propensity algorithms, which still appear in the literature. As examples, we provide versions of the exact algorithms for the Morris-Lecar conductance based model, and detail the error induced, both in a weak and a strong sense, by the use of approximate algorithms on this model. We include ready-to-use implementations of the random time change algorithm in both XPP and Matlab. Finally, through the consideration of parametric sensitivity analysis, we show how the representations presented here are useful in the development of further computational methods. The general representations and simulation strategies provided here are known in other parts of the sciences, but less so in the present setting. Comment: 39 pages, 6 figures, appendix with XPP and Matlab code
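
    As a toy illustration of the "Gillespie" representation mentioned above, the sketch below simulates a homogeneous population of two-state (closed/open) channels at a fixed membrane potential, so the transition rates are constant. The paper's exact algorithms additionally handle voltage-dependent, time-varying rates; the rate values and function name here are purely illustrative.

        # Gillespie-style exact simulation of N two-state ion channels
        # (closed <-> open) with constant opening rate alpha and closing
        # rate beta (i.e., fixed membrane potential).
        import numpy as np

        def gillespie_channels(n_total=40, alpha=0.8, beta=1.2, t_end=50.0, seed=0):
            """Return (times, open_counts) for a population of two-state channels."""
            rng = np.random.default_rng(seed)
            t, n_open = 0.0, 0
            times, opens = [t], [n_open]
            while t < t_end:
                a_open = alpha * (n_total - n_open)   # propensity: a closed channel opens
                a_close = beta * n_open               # propensity: an open channel closes
                a_tot = a_open + a_close
                t += rng.exponential(1.0 / a_tot)     # exponential waiting time to next event
                if rng.random() < a_open / a_tot:     # pick which transition fires
                    n_open += 1
                else:
                    n_open -= 1
                times.append(t)
                opens.append(n_open)
            return np.array(times), np.array(opens)

        if __name__ == "__main__":
            t, n = gillespie_channels()
            print("mean open fraction:", (n / 40).mean())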

    Statistical-Mechanical Measure of Stochastic Spiking Coherence in A Population of Inhibitory Subthreshold Neurons

    By varying the noise intensity, we study stochastic spiking coherence (i.e., collective coherence between noise-induced neural spiking) in an inhibitory population of subthreshold neurons (which cannot fire spontaneously without noise). This stochastic spiking coherence may be well visualized in the raster plot of neural spikes. For a coherent case, partially-occupied "stripes" (composed of spikes and indicating collective coherence) are formed in the raster plot. This partial occupation occurs due to "stochastic spike skipping," which is well shown in the multi-peaked interspike interval histogram. The main purpose of our work is to quantitatively measure the degree of stochastic spiking coherence seen in the raster plot. We introduce a new spike-based coherence measure M_s by considering the occupation pattern and the pacing pattern of spikes in the stripes. In particular, the pacing degree between spikes is determined in a statistical-mechanical way by quantifying the average contribution of (microscopic) individual spikes to the (macroscopic) ensemble-averaged global potential. This "statistical-mechanical" measure M_s is in contrast to conventional measures such as the "thermodynamic" order parameter (which concerns the time-averaged fluctuations of the macroscopic global potential), the "microscopic" correlation-based measure (based on the cross-correlation between the microscopic individual potentials), and the measures of precise spike timing (based on the peri-stimulus time histogram). In terms of M_s, we quantitatively characterize the stochastic spiking coherence, and find that M_s reflects the degree of collective spiking coherence seen in the raster plot very well. Hence, the "statistical-mechanical" spike-based measure M_s may serve as a useful tool for quantifying the degree of stochastic spiking coherence. Comment: 16 pages, 5 figures, to appear in J. Comput. Neurosci.
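
    Read in this way, the measure plausibly has the following structure (a hedged reconstruction from the description above, not necessarily the paper's exact definition): each stripe i in the raster plot contributes an occupation degree O_i and a pacing degree P_i, and M_s is their average product.

        % Assumed structure of the spike-based coherence measure M_s:
        % O_i : fraction of neurons firing in stripe i (occupation degree)
        % P_i : average contribution of the spikes in stripe i to the
        %       ensemble-averaged global potential (pacing degree)
        M_s = \frac{1}{N_{\mathrm{stripes}}} \sum_{i=1}^{N_{\mathrm{stripes}}} O_i \, P_i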

    Dynamics of Coupled Noisy Neural Oscillators with Heterogeneous Phase Resetting Curves

    Pulse-coupled phase oscillators have been utilized in a variety of contexts. Motivated by neuroscience, we study a network of pulse-coupled phase oscillators receiving independent and correlated noise. An additional physiological attribute, heterogeneity, is incorporated in the phase resetting curve (PRC), which is a vital entity for modeling the biophysical dynamics of oscillators. An accurate probability density or mean field description is high dimensional, requiring reduction methods for tractability. We present a reduction method to capture the pairwise synchrony via the probability density of the phase differences, and explore the robustness of the method. We find the reduced methods can capture some of the synchronous dynamics in these networks. The variance of the noisy period (or spike times) in this network is also considered. In particular, we find phase oscillators with predominantly positive PRCs (type 1) have larger variance with inhibitory pulse-coupling than those with larger negative regions (type 2), but with excitatory pulse-coupling the opposite holds: type 1 oscillators have lower variability than type 2. Analysis of this phenomenon is provided via an asymptotic approximation with weak noise and weak coupling, where we demonstrate how the individual PRC alters variability with pulse-coupling. We make comparisons of the phase oscillators to full oscillator networks and discuss the utility and shortcomings.
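
    The type 1 versus type 2 contrast described above can be illustrated with a toy simulation of two noisy pulse-coupled phase oscillators, each shifted by a phase resetting curve when the other fires. The PRC shapes and all parameter values below are illustrative assumptions, not the paper's models.

        # Two noisy phase oscillators; when one crosses phase 1 (fires), the
        # other is shifted by coupling * PRC(phase).  Negative coupling mimics
        # inhibitory pulse-coupling.
        import numpy as np

        def prc_type1(theta):
            return 1.0 - np.cos(2 * np.pi * theta)   # predominantly positive PRC

        def prc_type2(theta):
            return -np.sin(2 * np.pi * theta)        # PRC with a negative region

        def simulate(prc, coupling=-0.02, sigma=0.05, omega=1.0,
                     t_end=200.0, dt=1e-3, seed=0):
            """Return spike times of oscillator 0."""
            rng = np.random.default_rng(seed)
            theta = np.array([0.0, 0.3])
            spikes = []
            for step in range(int(t_end / dt)):
                theta += omega * dt + sigma * np.sqrt(dt) * rng.standard_normal(2)
                for i in (0, 1):
                    if theta[i] >= 1.0:                               # oscillator i fires
                        theta[i] -= 1.0
                        theta[1 - i] += coupling * prc(theta[1 - i])  # pulse coupling
                        if i == 0:
                            spikes.append(step * dt)
            return np.array(spikes)

        if __name__ == "__main__":
            for name, prc in (("type 1", prc_type1), ("type 2", prc_type2)):
                isi = np.diff(simulate(prc))
                print(name, "ISI variance:", round(float(isi.var()), 5))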

    Amplification of asynchronous inhibition-mediated synchronization by feedback in recurrent networks

    Synchronization of 30-80 Hz oscillatory activity of the principal neurons in the olfactory bulb (mitral cells) is believed to be important for odor discrimination. Previous theoretical studies of these fast rhythms in other brain areas have proposed that principal neuron synchrony can be mediated by short-latency, rapidly decaying inhibition. This phasic inhibition provides a narrow time window for the principal neurons to fire, thus promoting synchrony. However, in the olfactory bulb, the inhibitory granule cells produce long-lasting, small-amplitude, asynchronous, and aperiodic inhibitory input, and thus the narrow time window that is required to synchronize spiking does not exist. Instead, it has been suggested that correlated output of the granule cells could serve to synchronize uncoupled mitral cells through a mechanism called "stochastic synchronization", wherein the synchronization arises through correlation of inputs to two neural oscillators. Almost all work on synchrony due to correlations presumes that the correlation is imposed and fixed. Building on theory and experiments that we and others have developed, we show that increased synchrony in the mitral cells could produce an increase in granule cell activity for those granule cells that share a synchronous group of mitral cells. Common granule cell input increases the input correlation to the mitral cells and hence their synchrony, providing a positive feedback loop in correlation. Thus we demonstrate the emergence and temporal evolution of input correlation in recurrent networks with feedback. We explore several theoretical models of this idea, ranging from spiking models to an analytically tractable model. © 2010 Marella, Ermentrout
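
    The "correlation of inputs" ingredient of stochastic synchronization can be sketched as follows: two cells receive inputs sharing a common component with correlation c, and the output correlation of two leaky integrators grows with c. The feedback loop emphasized in the abstract (synchrony increasing the shared granule-cell input and hence c) is not modeled here; c is held fixed, and all parameters are illustrative.

        # Two leaky integrators driven by partially correlated noise: the
        # fraction c of shared input sets how correlated their outputs become.
        import numpy as np

        def output_correlation(c, tau=10.0, dt=0.1, n_steps=100_000, seed=0):
            rng = np.random.default_rng(seed)
            shared = rng.standard_normal(n_steps)
            v = np.zeros((2, n_steps))
            for i in range(2):
                private = rng.standard_normal(n_steps)
                drive = np.sqrt(c) * shared + np.sqrt(1.0 - c) * private
                for t in range(1, n_steps):
                    v[i, t] = v[i, t - 1] + dt * (-v[i, t - 1] / tau + drive[t])
            return np.corrcoef(v[0], v[1])[0, 1]

        if __name__ == "__main__":
            for c in (0.0, 0.2, 0.5, 0.8):
                print(f"input correlation {c:.1f} -> output correlation {output_correlation(c):.2f}")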

    Deep hybrid modeling of neuronal dynamics using generative adversarial networks

    Mechanistic modeling and machine learning methods are powerful techniques for approximating biological systems and making accurate predictions from data. However, when used in isolation, these approaches suffer from distinct shortcomings: model and parameter uncertainty limit mechanistic modeling, whereas machine learning methods disregard the underlying biophysical mechanisms. This dissertation constructs Deep Hybrid Models that address these shortcomings by combining deep learning with mechanistic modeling. In particular, this dissertation uses Generative Adversarial Networks (GANs) to provide an inverse mapping of data to mechanistic models and identifies the distributions of mechanistic model parameters consistent with the data. Chapter 1 provides background information on the major ideas that are important for this dissertation. It introduces parameter inference techniques and highlights some of the methodologies available for solving stochastic inverse problems. Chapter 2 starts with a brief overview of the Hodgkin-Huxley model, and then introduces other conductance-based models that are used in the dissertation. The first part of Chapter 3 focuses on methodologies for global sensitivity analysis and global optimization, in particular Sobol sensitivity analysis and Differential Evolution. The second part of this chapter explains how the Markov chain Monte Carlo (MCMC) algorithm can be used for parameter inference and then introduces a novel parameter inference tool based on conditional Generative Adversarial Networks (cGANs). In Chapter 4, the performance of cGAN and MCMC is compared on synthetic targets. Chapter 5 then uses cGAN to infer biophysical parameters from experimental data recorded at the single-cell and network levels from neurons involved in the regulation of circadian (~24-hour) rhythms and from brain regions associated with neurodegenerative diseases. Finally, conclusions and suggestions for further research are presented in Chapter 6.
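
    A minimal sketch of the cGAN-based parameter-inference idea described above, written with PyTorch: the generator maps noise plus a data summary (the condition) to candidate mechanistic-model parameters, while the discriminator tries to tell simulated (parameters, summary) pairs from generated ones. The architecture, dimensions, and the simulate_summary stand-in for "run the mechanistic model and summarize its output" are all hypothetical; they do not reproduce the dissertation's networks or models.

        # Conditional GAN sketch: learn an inverse mapping from data summaries
        # to distributions over mechanistic-model parameters.
        import torch
        import torch.nn as nn

        N_NOISE, N_COND, N_PARAM = 8, 16, 4

        generator = nn.Sequential(
            nn.Linear(N_NOISE + N_COND, 64), nn.ReLU(),
            nn.Linear(64, N_PARAM),
        )
        discriminator = nn.Sequential(
            nn.Linear(N_PARAM + N_COND, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )
        loss_fn = nn.BCEWithLogitsLoss()
        opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)
        opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)

        PROJ = torch.randn(N_PARAM, N_COND)           # fixed toy "forward model"

        def simulate_summary(params):
            """Hypothetical stand-in for running the mechanistic model."""
            return torch.tanh(params @ PROJ)

        for step in range(1000):
            true_params = torch.rand(32, N_PARAM)     # samples from the prior
            cond = simulate_summary(true_params)      # matching data summaries
            noise = torch.randn(32, N_NOISE)
            fake_params = generator(torch.cat([noise, cond], dim=1))

            # Discriminator: real (params, summary) pairs vs generated ones.
            d_real = discriminator(torch.cat([true_params, cond], dim=1))
            d_fake = discriminator(torch.cat([fake_params.detach(), cond], dim=1))
            loss_d = (loss_fn(d_real, torch.ones_like(d_real))
                      + loss_fn(d_fake, torch.zeros_like(d_fake)))
            opt_d.zero_grad(); loss_d.backward(); opt_d.step()

            # Generator: produce parameters the discriminator accepts as real.
            d_fake = discriminator(torch.cat([fake_params, cond], dim=1))
            loss_g = loss_fn(d_fake, torch.ones_like(d_fake))
            opt_g.zero_grad(); loss_g.backward(); opt_g.step()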

    Macroscopic Models and Phase Resetting of Coupled Biological Oscillators

    This thesis concerns the derivation and analysis of macroscopic mathematical models for coupled biological oscillators. Circadian rhythms, heart beats, and brain waves are all examples of biological rhythms formed through the aggregation of the rhythmic contributions of thousands of cellular oscillations. These systems evolve in an extremely high-dimensional phase space having at least as many degrees of freedom as the number of oscillators. This high-dimensionality often contrasts with the low-dimensional behavior observed on the collective or macroscopic scale. Moreover, the macroscopic dynamics are often of greater interest in biological applications. Therefore, it is imperative that mathematical techniques are developed to extract low-dimensional models for the macroscopic behavior of these systems. One such mathematical technique is the Ott-Antonsen ansatz. The Ott-Antonsen ansatz may be applied to high-dimensional systems of heterogeneous coupled oscillators to derive an exact low-dimensional description of the system in terms of macroscopic variables. We apply the Ott-Antonsen technique to determine the sensitivity of collective oscillations to perturbations with applications to neuroscience. The power of the Ott-Antonsen technique comes at the expense of several limitations which could limit its applicability to biological systems. To address this we compare the Ott-Antonsen ansatz with experimental measurements of circadian rhythms and numerical simulations of several other biological systems. This analysis reveals that a key assumption of the Ott-Antonsen approach is violated in these systems. However, we discover a low-dimensional structure in these data sets and characterize its emergence through a simple argument depending only on general phase-locking behavior in coupled oscillator systems. We further demonstrate the structure's emergence in networks of noisy heterogeneous oscillators with complex network connectivity. We show how this structure may be applied as an ansatz to derive low-dimensional macroscopic models for oscillator population activity. This approach allows for the incorporation of cellular-level experimental data into the macroscopic model whose parameters and variables can then be directly associated with tissue- or organism-level properties, thereby elucidating the core properties driving the collective behavior of the system. We first apply our ansatz to study the impact of light on the mammalian circadian system. To begin we derive a low-dimensional macroscopic model for the core circadian clock in mammals. Significantly, the variables and parameters in our model have physiological interpretations and may be compared with experimental results. We focus on the effect of four key factors which help shape the mammalian phase response to light: heterogeneity in the population of oscillators, the structure of the typical light phase response curve, the fraction of oscillators which receive direct light input and changes in the coupling strengths associated with seasonal day-lengths. We find these factors can explain several experimental results and provide insight into the processing of light information in the mammalian circadian system. In a second application of our ansatz we derive a pair of low-dimensional models for human circadian rhythms. We fit the model parameters to measurements of light sensitivity in human subjects, and validate these parameter fits with three additional data sets. 
We compare our model predictions with those made by previous phenomenological models for human circadian rhythms. We find our models make new predictions concerning the amplitude dynamics of the human circadian clock and the light entrainment properties of the clock. These results could have applications to the development of light-based therapies for circadian disorders. PhD thesis, Applied and Interdisciplinary Mathematics, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/138766/1/khannay_1.pd
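
    For reference, the textbook setting for the Ott-Antonsen ansatz mentioned above is the Kuramoto model with Lorentzian-distributed natural frequencies (center \omega_0, half-width \Delta); the thesis's circadian models differ in detail, so this is only the generic form of the reduction, not its specific results.

        % Kuramoto model and its complex order parameter (standard example):
        \dot{\theta}_j = \omega_j + \frac{K}{N} \sum_{k=1}^{N} \sin(\theta_k - \theta_j),
        \qquad
        z = \frac{1}{N} \sum_{j=1}^{N} e^{i\theta_j}
        % On the Ott-Antonsen manifold, for Lorentzian g(\omega), the order
        % parameter z obeys the exact two-dimensional equation
        \dot{z} = (-\Delta + i\omega_0)\, z + \frac{K}{2}\left(z - \bar{z}\, z^{2}\right)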