
    Data-true Characterization Of Neuronal Models

    In this thesis, a weighted least squares approach is initially presented to estimate the parameters of an adaptive quadratic neuronal model. By casting the discontinuities in the state variables at the spiking instants as an impulse train driving the system dynamics, the neuronal output is represented as a linearly parameterized model that depends on filtered versions of the input current and the output voltage at the cell membrane. A prediction-error-based weighted least squares method is formulated for the model. This method allows for rapid estimation of model parameters under a persistently exciting input current injection. Simulation results show the feasibility of this approach to predict multiple neuronal firing patterns. Results of the method using data from a detailed ion-channel-based model revealed issues that served as the basis for the more robust resonate-and-fire model presented next. A second method is proposed to overcome some of the issues found in the adaptive quadratic model. The original quadratic model is replaced by a linear resonate-and-fire model with a stochastic threshold that is both computationally efficient and suitable for larger network simulations. The parameter estimation method presented here consists of different stages in which the set of parameters is divided into two. The first set of parameters is assumed to represent the subthreshold dynamics of the model and is estimated using a nonlinear least squares algorithm, while the second set, associated with the threshold and reset parameters, is estimated using maximum likelihood formulations. The validity of the estimation method is then tested using detailed Hodgkin-Huxley model data as well as experimental voltage recordings from rat motoneurons.
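    As a rough, self-contained illustration of the kind of estimation step described above (not the thesis' actual procedure or parameter values), the sketch below builds filtered regressors from an input current and an impulse train marking reset instants, then solves a prediction-error weighted least squares problem on synthetic data; the filter time constants, the weighting rule, and the "true" parameters are all assumptions of this example.

```python
# Minimal sketch of prediction-error weighted least squares for a
# linearly parameterized neuron model.  Filter constants, weighting rule
# and the synthetic "ground truth" are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
dt, n = 1e-4, 20000                      # 2 s of data at 0.1 ms resolution
t = np.arange(n) * dt

# Persistently exciting input current: a sum of sinusoids plus noise.
I = 5.0 * np.sin(2 * np.pi * 7 * t) + 3.0 * np.sin(2 * np.pi * 31 * t) \
    + rng.normal(0.0, 1.0, n)

# Impulse train marking (assumed) spike/reset instants.
spikes = np.zeros(n)
spikes[rng.choice(n, 40, replace=False)] = 1.0 / dt

def lowpass(x, tau):
    """First-order filter; stands in for the filtered regressors."""
    y = np.zeros_like(x)
    a = dt / tau
    for k in range(1, len(x)):
        y[k] = y[k - 1] + a * (x[k - 1] - y[k - 1])
    return y

# Regressor matrix: filtered input, filtered impulse train, constant.
Phi = np.column_stack([lowpass(I, 5e-3), lowpass(spikes, 5e-3), np.ones(n)])

theta_true = np.array([1.8, 0.04, -65.0])          # assumed "true" parameters
v = Phi @ theta_true + rng.normal(0.0, 0.5, n)     # noisy voltage-like output

# Weighted least squares: down-weight samples right after a reset, where
# the linear model is least reliable (an assumption of this sketch).
w = np.ones(n)
w[lowpass(spikes, 2e-3) > 1.0] = 0.1
W = np.sqrt(w)[:, None]
theta_hat, *_ = np.linalg.lstsq(W * Phi, np.sqrt(w) * v, rcond=None)
print("estimated parameters:", theta_hat)
```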

    Characterization of a Spiking Neuron Model via a Linear Approach

    In the past decade, characterizing spiking neuron models has been extensively researched as an essential issue in computational neuroscience. In this thesis, we examine the estimation problem for two different neuron models. In Chapter 2, we propose a modified Izhikevich model with an adaptive threshold. In our two-stage estimation approach, a linear least squares method and a linear model of the threshold are derived to predict the location of neuronal spikes. However, the desired results are not obtained and the predicted model is unsuccessful in duplicating the spike locations. Chapter 3 focuses on the parameter estimation problem of a multi-timescale adaptive threshold (MAT) neuronal model. Using the dynamics of a non-resetting leaky integrator equipped with an adaptive threshold, a constrained iterative linear least squares method is implemented to fit the model to the reference data. Through manipulation of the system dynamics, the threshold voltage can be obtained as a realizable model that is linear in the unknown parameters. This linearly parameterized realizable model is then used inside a prediction-error-based framework to identify the threshold parameters with the purpose of predicting the precise firing times of single neurons. This estimation scheme is evaluated using both synthetic data obtained from an exact model and experimental data obtained from in vitro rat somatosensory cortical neurons. Results show the ability of this approach to fit the MAT model to different types of reference data.
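    A minimal forward simulation of a MAT-style neuron, assuming the published model structure (a non-resetting leaky integrator plus a threshold that jumps at each spike and relaxes on two time scales); the specific parameter values and input below are illustrative guesses, not the fitted values from the thesis.

```python
# Non-resetting leaky integrator with a two-timescale adaptive threshold
# (MAT-style).  All numerical values are illustrative assumptions.
import numpy as np

dt = 0.1                          # ms
T = 1000.0                        # ms
n = int(T / dt)
tau_m, R = 10.0, 50.0             # membrane time constant (ms), resistance (MOhm)
omega = -50.0                     # resting threshold (mV)
alpha = np.array([0.2, 2.0])      # threshold jumps (mV) for the two time scales
tau_th = np.array([200.0, 10.0])  # threshold decay time constants (ms)
t_ref = 2.0                       # refractory period (ms)

rng = np.random.default_rng(1)
I = 0.3 + 0.2 * rng.normal(size=n)   # noisy input current (nA), assumed

V = np.full(n, -65.0)
H = np.zeros(2)                   # the two threshold components
spike_times = []
last_spike = -np.inf

for k in range(1, n):
    # Non-resetting leaky integrator for the membrane potential.
    V[k] = V[k - 1] + dt / tau_m * (-(V[k - 1] + 65.0) + R * I[k - 1])
    H *= np.exp(-dt / tau_th)     # decay of the two threshold components
    theta = omega + H.sum()
    t = k * dt
    if V[k] >= theta and (t - last_spike) > t_ref:
        spike_times.append(t)
        H += alpha                # spike-triggered threshold jumps; V is not reset
        last_spike = t

print(f"{len(spike_times)} spikes, first few at (ms): {spike_times[:5]}")
```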

    Low-dimensional models of single neurons: A review

    The classical Hodgkin-Huxley (HH) point-neuron model of action potential generation is four-dimensional. It consists of four ordinary differential equations describing the dynamics of the membrane potential and of three gating variables associated with a transient sodium current and a delayed-rectifier potassium current. Conductance-based models of HH type are higher-dimensional extensions of the classical HH model. They include a number of supplementary state variables associated with other ionic current types and are able to describe additional phenomena such as subthreshold oscillations, mixed-mode oscillations (subthreshold oscillations interspersed with spikes), clustering, and bursting. In this manuscript we discuss biophysically plausible and phenomenological reduced models that preserve the biophysical and/or dynamic description of models of HH type and their ability to produce complex phenomena, but with a lower number of effective dimensions (state variables). We describe several representative models. We also describe systematic and heuristic methods for deriving reduced models from models of HH type.
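    For concreteness, here is a forward-Euler integration of the classical four-dimensional HH system mentioned above, using the standard textbook squid-axon parameters in the modern voltage convention; the step current and the crude spike-counting heuristic are choices of this sketch.

```python
# Classical four-dimensional Hodgkin-Huxley point-neuron model,
# integrated with forward Euler (standard textbook parameters).
import numpy as np

C, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3        # uF/cm^2, mS/cm^2
E_Na, E_K, E_L = 50.0, -77.0, -54.4              # mV

def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)

dt, T = 0.01, 50.0                               # ms
steps = int(T / dt)
V, m, h, n = -65.0, 0.05, 0.6, 0.32              # initial conditions
I_ext = 10.0                                     # uA/cm^2 step current
spikes = 0
above = False
for k in range(steps):
    I_ion = (g_Na * m**3 * h * (V - E_Na)
             + g_K * n**4 * (V - E_K)
             + g_L * (V - E_L))
    V += dt * (I_ext - I_ion) / C                # membrane equation
    m += dt * (a_m(V) * (1 - m) - b_m(V) * m)    # three gating variables
    h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
    n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
    if V > 0.0 and not above:                    # crude spike count
        spikes += 1
    above = V > 0.0

print("spikes in 50 ms:", spikes)
```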

    A new Mathematical Framework to Understand Single Neuron Computations

    An important feature of the nervous system is its ability to adapt to new stimuli. This adaptation allows for optimal encoding of the incoming information by dynamically changing the coding strategy based on the inputs arriving at the neuron. At the level of single cells, this widespread phenomenon is often referred to as spike-frequency adaptation, since it manifests as a history-dependent modulation of the neuron's firing frequency. In this thesis I focus on how a neuron is able to adapt its activity to a specific input as well as on the function of such adaptive mechanisms. To study these adaptive processes, different approaches have been used, from empirical observations of neural activity to detailed modeling of single cells. Here, I approach these problems using simplified threshold models. In particular, I introduce a new generalization of the integrate-and-fire model (GIF) along with a convex fitting method allowing for efficient estimation of the model parameters. Despite its relative simplicity, I show that this neuron model is able to reproduce neuronal behavior with a high degree of accuracy. Moreover, using this method I was able to show that cortical neurons are equipped with two distinct adaptation mechanisms. The first is a spike-triggered current that captures the complex influx of ions generated after the emission of a spike; the second is a movement of the firing threshold, which possibly reflects the slow inactivation of sodium channels induced by spiking activity. The precise dynamics of these adaptation processes are cell-type specific, explaining the differences in firing activity reported across neuron types. Consequently, neuronal types can be classified based on the model parameters. In pyramidal neurons, spike-dependent adaptation lasts for seconds and follows scale-free dynamics, which are optimally tuned to encode the natural inputs that pyramidal neurons receive in vivo. Finally, using an extended version of the GIF model, I show that adaptation is not only a spike-dependent phenomenon but also acts at the subthreshold level. In pyramidal neurons, the dynamics of the firing threshold are influenced by the subthreshold membrane potential. Spike-dependent and voltage-dependent adaptation interact in an activity-dependent way to ultimately shape the filtering properties of the membrane as a function of the input statistics. Equipped with such a mechanism, pyramidal neurons behave as integrators at low inputs and as coincidence detectors at high inputs, maintaining sensitivity to input fluctuations across all regimes.
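    A minimal sketch of a GIF-type neuron with the two adaptation mechanisms discussed above, a spike-triggered current and a moving firing threshold, combined with stochastic (escape-noise) spike emission; the exponential kernel shapes and all numerical values are assumptions of the example, not the fitted cell-type-specific parameters.

```python
# Generalized integrate-and-fire sketch: spike-triggered adaptation current,
# moving threshold, and stochastic spike emission.  Numbers are illustrative.
import numpy as np

rng = np.random.default_rng(2)
dt, T = 0.1, 2000.0                  # ms
n = int(T / dt)
C, g_L, E_L = 0.2, 0.01, -70.0       # nF, uS, mV
V_T0, dV = -50.0, 2.0                # baseline threshold, sharpness (mV)
lambda0 = 1.0                        # escape rate at threshold (1/ms)
V_reset, t_ref = -65.0, 4.0          # mV, ms

# Spike-triggered kernels sampled on a time grid (assumed exponentials).
tk = np.arange(0, 500.0, dt)
eta = 0.1 * np.exp(-tk / 100.0)      # adaptation current (nA)
gamma = 5.0 * np.exp(-tk / 50.0)     # threshold movement (mV)

I = 0.15 + 0.1 * rng.normal(size=n)  # fluctuating input current (nA)
eta_sum = np.zeros(n + len(tk))      # summed spike-triggered current
gamma_sum = np.zeros(n + len(tk))    # summed threshold movement

V, spike_times, last_spike = E_L, [], -np.inf
for k in range(n):
    t = k * dt
    if t - last_spike < t_ref:
        V = V_reset
        continue
    V += dt * (-g_L * (V - E_L) + I[k] - eta_sum[k]) / C
    V_T = V_T0 + gamma_sum[k]
    p_spike = 1.0 - np.exp(-lambda0 * np.exp((V - V_T) / dV) * dt)
    if rng.random() < p_spike:
        spike_times.append(t)
        eta_sum[k:k + len(tk)] += eta        # add kernels into the future
        gamma_sum[k:k + len(tk)] += gamma
        V, last_spike = V_reset, t

print(f"{len(spike_times)} spikes; mean rate "
      f"{1000.0 * len(spike_times) / T:.1f} Hz")
```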

    Simulation and Theory of Large-Scale Cortical Networks

    Cerebral cortex is composed of intricate networks of neurons. These neuronal networks are strongly interconnected: every neuron receives, on average, input from thousands or more presynaptic neurons. In fact, to support such a number of connections, a majority of the volume in the cortical gray matter is filled by axons and dendrites. Besides the networks, neurons themselves are also highly complex. They possess an elaborate spatial structure and support various types of active processes and nonlinearities. In the face of such complexity, it seems necessary to abstract away some of the details and to investigate simplified models. In this thesis, such simplified models of neuronal networks are examined on varying levels of abstraction. Neurons are modeled as point neurons, both rate-based and spike-based, and networks are modeled as block-structured random networks. Crucially, on this level of abstraction, the models are still amenable to analytical treatment using the framework of dynamical mean-field theory. The main focus of this thesis is to leverage the analytical tractability of random networks of point neurons in order to relate the network structure and the neuron parameters to the dynamics of the neurons (in physics parlance, to bridge across the scales from neurons to networks). More concretely, four different models are investigated: 1) fully connected feedforward networks and vanilla recurrent networks of rate neurons; 2) block-structured networks of rate neurons in continuous time; 3) block-structured networks of spiking neurons; and 4) a multi-scale, data-based network of spiking neurons. We consider the first class of models in the light of Bayesian supervised learning and compute their kernel in the infinite-size limit. In the second class of models, we connect dynamical mean-field theory with large-deviation theory, calculate beyond-mean-field fluctuations, and perform parameter inference. For the third class of models, we develop a theory for the autocorrelation time of the neurons. Lastly, we consolidate data across multiple modalities into a layer- and population-resolved model of human cortex and compare its activity with cortical recordings. In two detours from the investigation of these four network models, we examine the distribution of neuron densities in cerebral cortex and present a software toolbox for mean-field analyses of spiking networks.
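    As a toy version of the second model class (block-structured networks of rate neurons in continuous time), the sketch below draws a two-population random connectivity matrix with block-dependent statistics and integrates standard rate dynamics; block sizes, weight statistics, and inputs are illustrative assumptions, and no mean-field calculation is carried out here.

```python
# Two-block (E/I) random rate network with block-dependent connection
# statistics and dynamics  tau dx/dt = -x + W phi(x) + input.
import numpy as np

rng = np.random.default_rng(3)
sizes = [400, 100]                        # neurons per block (E, I), assumed
N = sum(sizes)
# Mean and std of synaptic weights for each (target, source) block pair.
mean_w = np.array([[0.2, -0.8],
                   [0.2, -0.8]]) / N
std_w = np.array([[1.0, 1.0],
                  [1.0, 1.0]]) / np.sqrt(N)

W = np.empty((N, N))
starts = np.cumsum([0] + sizes)
for a in range(2):                        # target block
    for b in range(2):                    # source block
        ra = slice(starts[a], starts[a + 1])
        rb = slice(starts[b], starts[b + 1])
        W[ra, rb] = mean_w[a, b] + std_w[a, b] * rng.normal(size=(sizes[a], sizes[b]))

phi = np.tanh                             # transfer function
tau, dt, steps = 10.0, 0.5, 4000          # ms
x = rng.normal(0.0, 0.1, N)
ext = np.concatenate([np.full(sizes[0], 0.5), np.full(sizes[1], 0.3)])

for _ in range(steps):
    x += dt / tau * (-x + W @ phi(x) + ext)

# Population-averaged activity per block: the kind of quantity a
# dynamical mean-field description predicts.
for a, name in enumerate(["E", "I"]):
    print(name, "mean rate:", phi(x[starts[a]:starts[a + 1]]).mean())
```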

    Automated High-Throughput Characterization of Single Neurons by Means of Simplified Spiking Models

    Single-neuron models are useful not only for studying the emergent properties of neural circuits in large-scale simulations, but also for extracting and summarizing, in a principled way, the information contained in electrophysiological recordings. Here we demonstrate that, using a convex optimization procedure we previously introduced, a Generalized Integrate-and-Fire model can be accurately fitted with a limited amount of data. The model is capable of predicting both the spiking activity and the subthreshold dynamics of different cell types, and can be used for online characterization of neuronal properties. A protocol is proposed that, combined with emerging technologies for automatic patch-clamp recordings, permits automated, in vitro high-throughput characterization of single neurons.
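    The subthreshold part of such a fit is convex because the membrane equation is linear in its parameters; the sketch below illustrates this with ordinary least squares on synthetic leaky-integrator data (no spikes and no spike-triggered current, which are simplifications of this example), recovering capacitance, leak conductance, and reversal potential. All numbers are assumptions.

```python
# Convex (linear least squares) subthreshold fit: regress dV/dt on V and I
# to recover leak and capacitance from synthetic leaky-integrator data.
import numpy as np

rng = np.random.default_rng(4)
dt, n = 0.1, 50000                       # ms, samples
g_L, C, E_L = 0.01, 0.2, -70.0           # "true" parameters: uS, nF, mV

t = np.arange(n) * dt
I = 0.1 + 0.08 * np.sin(2 * np.pi * 5e-3 * t) \
        + 0.05 * np.sin(2 * np.pi * 23e-3 * t)    # injected current (nA)
noise = 0.02 * rng.normal(size=n)                 # unmodeled noise current

V = np.empty(n)
V[0] = E_L
for k in range(n - 1):
    V[k + 1] = V[k] + dt * (-g_L * (V[k] - E_L) + I[k] + noise[k]) / C

# Linear model:  dV/dt = a*V + b*I + c  with a = -g_L/C, b = 1/C, c = g_L*E_L/C.
dVdt = np.diff(V) / dt
Phi = np.column_stack([V[:-1], I[:-1], np.ones(n - 1)])
(a, b, c), *_ = np.linalg.lstsq(Phi, dVdt, rcond=None)

C_hat = 1.0 / b
gL_hat = -a * C_hat
EL_hat = c * C_hat / gL_hat
print(f"C ~ {C_hat:.3f} nF, g_L ~ {gL_hat:.4f} uS, E_L ~ {EL_hat:.1f} mV")
```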

    The Dynamics of Adapting Neurons

    How do neurons dynamically encode and process information? Each neuron communicates in its own distinctive language, made of long silences punctuated by occasional spikes. The spikes are prompted by the pooled effect of a population of presynaptic neurons. To understand the operations performed by single neurons is to create a quantitative description of their dynamics. The results presented in this thesis describe the necessary elements for such a quantitative description of single neurons. Almost all chapters can be unified under the theme of adaptation. Neuronal adaptation plays an important role in the transduction of a given stimulation into a spike train. The work described here shows how adaptation is brought about by every spike in a stereotypical fashion. The spike-triggered adaptation is then measured in three main types of cortical neurons. I analyze in detail how the different adaptation profiles can reproduce the diversity of firing patterns observed in real neurons. I also summarize the most recent results concerning spike-time prediction in real neurons, resulting in a well-founded single-neuron model. This model is then analyzed to understand how populations can encode time-dependent signals and how time-dependent signals can be decoded from the activity of populations. Finally, two lines of investigation in progress are described: the first expands the study of spike-triggered adaptation to longer time scales, and the second extends the quantitative neuron models to models with active dendrites.
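    To illustrate how a spike-triggered adaptation profile shapes the firing pattern, the sketch below drives the same leaky integrate-and-fire neuron with a constant current under two assumed adaptation profiles (weak and fast versus strong and slow) and prints the resulting interspike intervals; all parameters are illustrative, not the measured cortical profiles.

```python
# Same LIF neuron, two spike-triggered adaptation profiles: a strong, slow
# profile stretches successive interspike intervals (spike-frequency adaptation).
import numpy as np

def simulate(a_jump, a_tau, T=500.0, dt=0.1):
    """LIF with an adaptation current that jumps by a_jump (nA) at each
    spike and decays with time constant a_tau (ms)."""
    g_L, C, E_L = 0.01, 0.2, -70.0       # uS, nF, mV
    V_th, V_reset = -50.0, -65.0
    I_ext = 0.35                         # constant drive (nA)
    V, w = E_L, 0.0
    spikes = []
    for k in range(int(T / dt)):
        V += dt * (-g_L * (V - E_L) + I_ext - w) / C
        w += dt * (-w / a_tau)
        if V >= V_th:
            spikes.append(k * dt)
            V = V_reset
            w += a_jump                  # spike-triggered adaptation
    return np.diff(spikes)               # interspike intervals (ms)

print("weak/fast adaptation ISIs (ms):", np.round(simulate(0.02, 30.0), 1))
print("strong/slow adaptation ISIs (ms):", np.round(simulate(0.1, 300.0), 1))
```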