158 research outputs found

    Sample Path Analysis of Integrate-and-Fire Neurons

    Computational neuroscience is concerned with answering two intertwined questions that are based on the assumption that spatio-temporal patterns of spikes form the universal language of the nervous system. First, what function does a specific neural circuit perform in the elaboration of a behavior? Second, how do neural circuits process behaviorally relevant information? Non-linear system analysis has proven instrumental in understanding the coding strategies of early neural processing in various sensory modalities. Yet, at higher levels of integration, it fails to help in deciphering the response of assemblies of neurons to complex naturalistic stimuli. While neural activity can be assumed to be primarily driven by the stimulus at early stages of processing, at the cortical level the intrinsic activity of neural circuits interacts with their high-dimensional input and transforms it in a stochastic, non-linear fashion. As a consequence, any attempt to fully understand the brain through a system-analysis approach becomes illusory. However, it is increasingly advocated that neural noise plays a constructive role in neural processing, facilitating information transmission. This prompts us to seek insight into the neural code by studying the stochasticity of neuronal activity, which is viewed as biologically relevant. Such an endeavor requires the design of guiding theoretical principles to assess the potential benefits of neural noise. In this context, meeting the requirements of biological relevance and computational tractability, while providing a stochastic description of neural activity, prescribes the adoption of the integrate-and-fire model. In this thesis, building on the path-wise description of neuronal activity, we propose to further the stochastic analysis of the integrate-and-fire model through a combination of numerical and theoretical techniques. 
To begin, we expand upon the path-wise construction of linear diffusions, which offers a natural setting to describe leaky integrate-and-fire neurons, as inhomogeneous Markov chains. Based on the theoretical analysis of the first-passage problem, we then explore the interplay between the internal neuronal noise and the statistics of injected perturbations at the single-unit level, and examine its implications for neural coding. At the population level, we also develop an exact event-driven implementation of a Markov network of perfect integrate-and-fire neurons with delayed, instantaneous interactions and arbitrary topology. We hope our approach will provide new paradigms to understand how sensory inputs perturb intrinsic neural activity, and will accomplish the goal of developing a new technique for identifying relevant patterns of population activity. From a perturbative perspective, our study shows how injecting frozen noise in different flavors can help characterize internal neuronal noise, which is presumably functionally relevant to information processing. From a simulation perspective, our event-driven framework is well suited to scrutinizing the stochastic behavior of simple recurrent motifs as well as the temporal dynamics of large-scale networks under spike-timing-dependent plasticity.
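The leaky integrate-and-fire dynamics that anchor this analysis can be sketched with a simple Euler-Maruyama discretization. This is an illustrative toy rather than the thesis's own path-wise construction, and every parameter value (`mu`, `sigma`, `tau`, the threshold and reset) is an assumption chosen for demonstration.

```python
import numpy as np

def simulate_lif(mu=1.5, sigma=0.3, tau=0.02, v_th=1.0, v_reset=0.0,
                 dt=1e-4, t_max=1.0, seed=0):
    """Euler-Maruyama simulation of a noisy leaky integrate-and-fire neuron.

    Membrane dynamics: dV = (mu - V)/tau dt + sigma sqrt(dt/tau) N(0, 1).
    A spike is recorded when V reaches v_th, after which V resets to v_reset.
    """
    rng = np.random.default_rng(seed)
    v = v_reset
    spike_times = []
    for i in range(int(t_max / dt)):
        v += (mu - v) * dt / tau + sigma * np.sqrt(dt / tau) * rng.standard_normal()
        if v >= v_th:                 # first passage to threshold
            spike_times.append(i * dt)
            v = v_reset               # instantaneous reset
    return np.array(spike_times)

# With suprathreshold drive (mu > v_th) the neuron fires repeatedly, with
# noise-induced jitter in its first-passage (interspike) times.
spikes = simulate_lif()
```

The interspike intervals of this sketch are exactly the first-passage times of a linear diffusion to the threshold, which is the problem the abstract's theoretical analysis addresses.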

    Nonstationary stochastic resonance viewed through the lens of information theory

    In biological systems, information is frequently transferred with Poisson-like spike processes (shot noise) modulated in time by information-carrying signals. How, then, can one quantify information transfer at the output for such nonstationary input signals of finite duration? Is there some minimal input signal duration relative to its strength? Can such signals be better detected when immersed in noise stemming from the surroundings by increasing the stochastic intensity? These are some basic questions which we attempt to address within an analytical theory based on the Kullback-Leibler information concept applied to random processes.
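For homogeneous Poisson spike processes the Kullback-Leibler information has a closed form that already hints at the duration-versus-strength trade-off raised above. The sketch below is a generic illustration of that formula, not the paper's analysis; the detection criterion `d_star` and all rate values are hypothetical.

```python
import math

def kl_poisson(r1, r0, T):
    """Kullback-Leibler information D(P1 || P0) between two homogeneous
    Poisson spike processes with rates r1 (signal present) and r0
    (background), observed over a window of duration T:
        D = T * (r1 * ln(r1 / r0) - r1 + r0),
    which grows linearly with the observation time T."""
    return T * (r1 * math.log(r1 / r0) - r1 + r0)

def min_duration(r1, r0, d_star):
    """Shortest observation window for which the accumulated information
    reaches a detection criterion d_star (a hypothetical threshold)."""
    return d_star / (r1 * math.log(r1 / r0) - r1 + r0)

# A stronger signal (larger rate contrast) needs a shorter window to
# accumulate the same amount of discriminating information.
t_weak = min_duration(12.0, 10.0, 3.0)
t_strong = min_duration(20.0, 10.0, 3.0)
```

Because the information grows linearly in `T`, a fixed criterion translates directly into a minimal signal duration that shrinks as the rate contrast grows.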

    Minimum-error, energy-constrained source coding by sensory neurons

    Neural coding, the process by which neurons represent, transmit, and manipulate physical signals, is critical to the function of the nervous system. Despite years of study, neural coding is still not fully understood. Efforts to model neural coding could improve both the understanding of the nervous system and the design of artificial devices which interact with neurons. Sensory receptors and neurons transduce physical signals into a sequence of action potentials, called a spike train. The principles which underlie the translation from signal to spike train are still under investigation. From the perspective of an organism, neural codes which maximize the fidelity of the encoded signal (minimize encoding error) provide a competitive advantage. Selective pressure over evolutionary timescales has likely encouraged neural codes which minimize encoding error. At the same time, neural coding is metabolically expensive, which suggests that selective pressure would also encourage neural codes which minimize energy. Based on these assumptions, this work proposes a principle of neural coding which captures the trade-off between error and energy as a constrained optimization problem of minimizing encoding error while satisfying a constraint on energy. A solution to the proposed optimization problem is derived in the limit of high spike-rates. The solution is to track the instantaneous reconstruction error, and to time spikes when the error crosses a threshold value. In the limit of large signals, the threshold level is a constant, but in general it is signal dependent. This coding model, called the neural source coder, implies that neurons should be able to track reconstruction error internally, using the error signal to precisely time spikes. Mathematically, this model is similar to existing adaptive threshold models, but it provides a new way to understand coding by sensory neurons. 
Comparing the predictions of the neural source coder to experimental data recorded from a peripheral neuron, the coder is able to predict spike times with considerable accuracy. Intriguingly, this is also true for a cortical neuron which has a low spike-rate. Reconstructions using the neural source coder show lower error than other spiking neuron models. The neural source coder also predicts the asymmetric spike-rate adaptation seen in sensory neurons (the primary-like response). An alternative expression for the neural source coder is as an instantaneous-rate coder of a rate function which depends on the signal, signal derivative, and encoding parameters. The instantaneous rate closely predicts experimental peri-stimulus time histograms. The addition of a stochastic threshold to the neural source coder accounts for the spike-time jitter observed in experimental datasets. Jittered spike trains from the neural source coder show long-term interval statistics which closely match experimental recordings from a peripheral neuron. Moreover, the spike trains have strongly anti-correlated intervals, a feature observed in experimental data. Interestingly, jittered spike trains do not improve reconstruction error for an individual neuron, but reconstruction error is reduced in simulations of small populations of independent neurons. This suggests that jittered spike trains provide a method for small populations of sensory neurons to improve encoding error. Finally, a sound coding method for applying the neural source coder to timing spikes for cochlear implants is proposed. For each channel of the cochlear implant, a neural source coder can be used to time pulses to follow the patterns expected by peripheral neurons. Simulations show reduced reconstruction error compared to standard approaches using the signal envelope. 
Initial experiments with normal-hearing subjects show that a vocoder simulating this cochlear implant sound coding approach results in better speech perception thresholds when compared to a standard noise vocoder. Although further experiments with cochlear implant users are critical, initial results encourage further study of the proposed sound-coding method. Overall, the proposed principle of minimum-error, energy-constrained encoding for sensory neural coding can be implemented by a spike-timing model with a feedback loop which computes reconstruction error. This model of neural source coding predicts a wide range of experimental observations from both peripheral and cortical neurons. The close agreement between experimental data and the predictions of the neural source coder suggests a fundamental principle underlying neural coding.
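The error-tracking encoding principle described above can be sketched as follows. This is a minimal toy version, assuming a leaky reconstruction built from fixed spike-triggered pulses and a constant threshold; all names and parameter values are illustrative, not the authors' implementation.

```python
import numpy as np

def source_code(signal, theta=0.1, tau=0.05, dt=1e-3):
    """Toy error-tracking encoder: the running reconstruction s_hat is a
    leaky sum of fixed spike-triggered pulses, and a spike is emitted
    whenever the instantaneous reconstruction error s - s_hat reaches
    the threshold theta."""
    s_hat = 0.0
    spikes, recon = [], np.empty_like(signal)
    decay = np.exp(-dt / tau)
    for i, s in enumerate(signal):
        s_hat *= decay                 # leaky decay of the reconstruction
        if s - s_hat >= theta:         # error crossed threshold: spike now
            s_hat += theta             # each spike adds a fixed pulse
            spikes.append(i * dt)
        recon[i] = s_hat
    return np.array(spikes), recon

# Encode a slow sinusoid; the reconstruction tracks the signal to within
# roughly one threshold step.
t = np.arange(0.0, 1.0, 1e-3)
signal = 0.5 + 0.3 * np.sin(2 * np.pi * 2 * t)
spikes, recon = source_code(signal)
```

Because spikes are timed by error crossings rather than by a clock, spike timing in this sketch carries information about both the signal and its derivative, consistent with the instantaneous-rate reading of the model.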

    Discrimination and control in stochastic neuron models

    Major topics of great interest in neuroscience involve understanding how the brain performs stimulus coding, perceptual discrimination, and movement control through neuronal activity. Many researchers are designing biophysical and psychological experiments to study the activities of neurons in the presence of various stimuli, and attempts have been made to link neural responses to the human perceptual and behavioral levels. In addition, mathematical models and neural networks have been developed to investigate how neurons respond and communicate with each other. In this thesis, my aim is to understand how the central nervous system performs discrimination tasks and achieves precise control of movement using noisy neural signals. I have studied, through both experimental and modelling approaches, how neurons respond to external stimuli, working on three aspects in detail. The first is the neuronal coding mechanism for input stimuli with different temporal frequencies. Intracellular recordings of single neurons were performed with patch-clamp techniques to study neural activity in the rat somatosensory cortex in vitro, and the simplest possible neural model, the integrate-and-fire model, was used to simulate the observations. The simulation results were closely consistent with the experimental observations. Another focus of this work is the link between the psychophysical response and its simultaneous neural discharges. I derived that under a widely accepted psychophysical law (Weber's law), the neural activities are less variable than a Poisson process (which is often used to describe the neuronal spiking process). My work shows quantitatively how psychophysical behaviour reflects intrinsic neural activity. Finally, the focus is on the control of movements by neural signals. A generalized approach to solving optimal movement control problems is proposed in my work, where pulses are used as neural signals to achieve precise control. 
The simulation results clearly illustrate the advantage of this generalized control. In this thesis, I have proposed novel yet simple approaches to study and explain the underlying mechanisms behind the complexity of the neural system, through three examples of sensory discrimination and neural movement control.
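The sub-Poisson variability claim can be illustrated by comparing Fano factors: a Poisson process has a Fano factor of 1, while a more regular spike train falls below it. The gamma-order-2 renewal process below is only a stand-in for sub-Poisson activity, and the rates and trial counts are assumptions for demonstration.

```python
import numpy as np

def fano_factor(counts):
    """Fano factor (variance / mean) of trial-by-trial spike counts:
    1 for a Poisson process, below 1 for more regular, sub-Poisson spiking."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean()

rng = np.random.default_rng(42)
T, rate, n_trials = 1.0, 50.0, 2000

# Poisson spiking: counts ~ Poisson(rate * T), Fano factor near 1.
poisson_counts = rng.poisson(rate * T, size=n_trials)

# Sub-Poisson stand-in: a gamma(order 2) renewal process, whose intervals
# are sums of two exponentials, regularizing the train (Fano near 0.5).
def gamma2_count(rng, rate, T):
    t, n = 0.0, 0
    while True:
        t += rng.gamma(shape=2.0, scale=1.0 / (2.0 * rate))
        if t > T:
            return n
        n += 1

gamma_counts = [gamma2_count(rng, rate, T) for _ in range(n_trials)]
```

Counting spikes over repeated trials and comparing the two Fano factors makes the "less variable than Poisson" statement directly measurable.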

    Inference of a mesoscopic population model from population spike trains

    To understand how rich dynamics emerge in neural populations, we require models exhibiting a wide range of activity patterns while remaining interpretable in terms of connectivity and single-neuron dynamics. However, it has been challenging to fit such mechanistic spiking networks at the single-neuron scale to empirical population data. To close this gap, we propose to fit such data at the mesoscopic scale, using a mechanistic but low-dimensional and hence statistically tractable model. The mesoscopic representation is obtained by approximating a population of neurons as multiple homogeneous 'pools' of neurons, and modelling the dynamics of the aggregate population activity within each pool. We derive the likelihood of both single-neuron and connectivity parameters given this activity, which can then be used either to optimize parameters by gradient ascent on the log-likelihood, or to perform Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. We illustrate this approach using a model of generalized integrate-and-fire neurons for which mesoscopic dynamics have been previously derived, and show that both single-neuron and connectivity parameters can be recovered from simulated data. In particular, our inference method extracts posterior correlations between model parameters, which define parameter subsets able to reproduce the data. We compute the Bayesian posterior for combinations of parameters using MCMC sampling and investigate how the approximations inherent to a mesoscopic population model impact the accuracy of the inferred single-neuron parameters.
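The MCMC side of such an inference pipeline can be sketched with a random-walk Metropolis sampler on a deliberately simplified stand-in likelihood (a single Poisson rate per pool, not the mesoscopic population likelihood derived in the paper); all names and values below are illustrative assumptions.

```python
import numpy as np

def metropolis(log_post, x0, n_samples=5000, step=0.1, seed=1):
    """Random-walk Metropolis sampler, the simplest MCMC scheme of the
    kind used to draw posterior samples over model parameters."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

# Toy stand-in for the mesoscopic likelihood: infer the log firing rate
# of a single pool from observed spike counts (flat prior on the log-rate).
rng = np.random.default_rng(0)
counts = rng.poisson(lam=8.0, size=200)

def log_post(log_rate):
    # Poisson log-likelihood up to an additive constant.
    return float(np.sum(counts * log_rate - np.exp(log_rate)))

samples = metropolis(log_post, x0=0.0)
rate_est = float(np.exp(np.mean(samples[2000:])))  # discard burn-in
```

The same accept/reject loop, with the toy `log_post` replaced by a full population likelihood, yields the posterior correlations between parameters that the abstract describes.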

    Neuromorphic Engineering Editors' Pick 2021

    This collection showcases well-received spontaneous articles from the past couple of years, which have been specially handpicked by our Chief Editors, Profs. André van Schaik and Bernabé Linares-Barranco. The work presented here highlights the broad diversity of research performed across the section and aims to put a spotlight on the main areas of interest. All research presented here displays strong advances in theory, experiment, and methodology with applications to compelling problems. This collection aims to further support Frontiers’ strong community by recognizing highly deserving authors.

    The Dynamics of Adapting Neurons

    How do neurons dynamically encode and process information? Each neuron communicates in its distinctive language of long silences punctuated by occasional spikes. The spikes are prompted by the pooled effect of a population of pre-synaptic neurons. To understand the operations performed by single neurons, we must create a quantitative description of their dynamics. The results presented in this thesis describe the necessary elements for a quantitative description of single neurons. Almost all chapters can be unified under the theme of adaptation. Neuronal adaptation plays an important role in the transduction of a given stimulation into a spike train. The work described here shows how every spike brings adaptation in a stereotypical fashion. The spike-triggered adaptation is then measured in three main types of cortical neurons. I analyze in detail how the different adaptation profiles can reproduce the diversity of firing patterns observed in real neurons. I also summarize the most recent results concerning spike-time prediction in real neurons, resulting in a well-founded single-neuron model. This model is then analyzed to understand how populations can encode time-dependent signals and how time-dependent signals can be decoded from the activity of populations. Finally, two lines of investigation in progress are described: the first expands the study of spike-triggered adaptation to longer time scales, and the second extends the quantitative neuron models to models with active dendrites.
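Spike-triggered adaptation of the kind described above can be sketched with a leaky integrate-and-fire neuron augmented by an adaptation current. This is a generic textbook-style toy, not the thesis's fitted model, and every parameter value is an assumption chosen for demonstration.

```python
import numpy as np

def simulate_adaptive_if(i_ext=1.3, tau=0.02, tau_w=0.2, b=0.2,
                         v_th=1.0, v_reset=0.0, dt=1e-4, t_max=2.0):
    """Leaky integrate-and-fire neuron with a spike-triggered adaptation
    current w: every spike increments w by b in a stereotypical fashion,
    and w decays back with time constant tau_w, slowing later firing."""
    v, w = v_reset, 0.0
    spike_times = []
    for i in range(int(t_max / dt)):
        v += (i_ext - v - w) * dt / tau   # adaptation opposes the drive
        w += -w * dt / tau_w              # adaptation decays between spikes
        if v >= v_th:
            spike_times.append(i * dt)
            v = v_reset
            w += b                        # stereotypical spike-triggered jump
    return np.array(spike_times)

# A constant input produces spike-frequency adaptation: interspike
# intervals lengthen as the adaptation variable builds up.
spikes = simulate_adaptive_if()
isis = np.diff(spikes)
```

The shape and time scale of the spike-triggered jump in `w` is exactly the kind of adaptation profile that, per the abstract, distinguishes firing patterns across cortical neuron types.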