721 research outputs found

    Enhanced Sensitivity to Rapid Input Fluctuations by Nonlinear Threshold Dynamics in Neocortical Pyramidal Neurons

    The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, firing threshold dynamics adaptively adjust the effective timescale of somatic integration so as to preserve sensitivity to rapid signals over a broad range of input statistics. To this end, we introduce a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation, which outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter (describing somatic integration) and the spike-history filter (accounting for spike-frequency adaptation) dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights into the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold, due to Na+-channel inactivation, regulate the sensitivity to rapid input fluctuations.
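    As a concrete illustration of the moving-threshold mechanism described above, the following is a minimal sketch, not the authors' fitted model: a leaky integrate-and-fire neuron whose firing threshold jumps after each spike and relaxes back towards baseline. All parameter values are illustrative assumptions.

```python
# Minimal sketch of an integrate-and-fire neuron with a moving firing threshold.
# Parameter values are illustrative assumptions, not fitted values from the study.
import numpy as np

def simulate_gif(I, dt=0.1, tau_m=20.0, E_L=-70.0, R=100.0,
                 theta0=-50.0, delta_theta=10.0, tau_theta=50.0, V_reset=-65.0):
    """Simulate membrane voltage and a moving threshold driven by a current trace I (nA)."""
    V, theta = E_L, theta0
    spike_times = []
    for step, i_t in enumerate(I):
        # leaky integration of the membrane potential (dt, tau_m in ms; R in MOhm)
        V += dt / tau_m * (E_L - V + R * i_t)
        # between spikes, the threshold relaxes back towards its baseline value
        theta += dt / tau_theta * (theta0 - theta)
        if V >= theta:
            spike_times.append(step * dt)
            V = V_reset               # voltage reset after a spike
            theta += delta_theta      # spike-triggered threshold increase
    return np.array(spike_times)

# Example: response to a noisy, in vivo-like current (2 s at dt = 0.1 ms)
rng = np.random.default_rng(0)
I = 0.3 + 0.1 * rng.standard_normal(20_000)
print(len(simulate_gif(I)), "spikes")
```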

    Intrinsic gain modulation and adaptive neural coding

    In many cases, the computation of a neural system can be reduced to a receptive field, or a set of linear filters, and a thresholding function, or gain curve, which determines the firing probability; this is known as a linear/nonlinear model. In some forms of sensory adaptation, these linear filters and gain curve adjust very rapidly to changes in the variance of a randomly varying driving input. An apparently similar but previously unrelated issue is the observation of gain control by background noise in cortical neurons: the slope of the firing rate versus current (f-I) curve changes with the variance of background random input. Here, we show a direct correspondence between these two observations by relating variance-dependent changes in the gain of f-I curves to characteristics of the changing empirical linear/nonlinear model obtained by sampling. In the case that the underlying system is fixed, we derive expressions relating the change in gain with respect to both mean and variance to the receptive fields obtained from reverse correlation on a white-noise stimulus. Using two conductance-based model neurons that display distinct gain modulation properties through a simple change in parameters, we show that the coding properties of both models quantitatively satisfy the predicted relationships. Our results describe how both variance-dependent gain modulation and adaptive neural computation result from intrinsic nonlinearity. Comment: 24 pages, 4 figures, 1 supporting information
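    For readers unfamiliar with the linear/nonlinear framework used above, the sketch below shows how an empirical LN model can be sampled from a white-noise experiment: the linear filter is estimated as the spike-triggered average and the gain curve as a binned firing probability. The synthetic neuron and all constants are assumptions for illustration only.

```python
# Minimal sketch of sampling an empirical linear/nonlinear (LN) model by reverse
# correlation on a white-noise stimulus. The "true" system is hypothetical and is
# used only to generate example spikes.
import numpy as np

rng = np.random.default_rng(1)
T, L = 100_000, 50                         # number of time bins, filter length
stim = rng.standard_normal(T)              # white-noise stimulus

# hypothetical underlying system: exponential filter followed by a sigmoidal gain curve
true_filter = np.exp(-np.arange(L) / 10.0)
drive = np.convolve(stim, true_filter, mode="full")[:T]
spikes = rng.random(T) < 1.0 / (1.0 + np.exp(-(drive - 2.0)))

# linear stage: spike-triggered average (reverse correlation)
spike_bins = np.nonzero(spikes)[0]
spike_bins = spike_bins[spike_bins >= L]
sta = np.mean([stim[t - L:t][::-1] for t in spike_bins], axis=0)

# nonlinear stage: firing probability as a function of the filtered stimulus
proj = np.convolve(stim, sta, mode="full")[:T]
edges = np.quantile(proj, np.linspace(0.0, 1.0, 11))
idx = np.digitize(proj, edges[1:-1])       # decile index, 0..9
gain_curve = [spikes[idx == k].mean() for k in range(10)]
print(np.round(gain_curve, 3))
```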

    Computational principles of single neuron adaptation

    Cortical neurons continuously transform sets of incoming spike trains into output spike trains. This input-output transformation is referred to as single-neuron computation and constitutes one of the most fundamental processes in the brain. A deep understanding of single-neuron dynamics is therefore required to study how neural circuits support complex behaviors such as sensory perception, learning and memory. The results presented in this thesis focus on single-neuron computation. In particular, I address the question of how and why cortical neurons adapt their coding strategies to the statistical properties of their inputs. A new spiking model and a new fitting procedure are introduced that enable reliable nonparametric feature extraction from in vitro intracellular recordings. By applying this method to a new set of data from L5 pyramidal neurons, I found that cortical neurons adapt their firing rate over multiple timescales, ranging from tens of milliseconds to tens of seconds. This behavior results from two cellular processes, which are triggered by the emission of individual action potentials and decay according to a power law. An analysis performed on in vivo intracellular recordings further indicates that power-law adaptation is near-optimally tuned to efficiently encode the natural inputs received by single neurons in biologically relevant situations. These results shed light on the functional role of spike-frequency adaptation in the cortex. The second part of this thesis focuses on the long-standing question of whether cortical neurons act as temporal integrators or coincidence detectors. According to standard theories relying on simplified spiking models, cortical neurons are expected to feature both coding strategies, depending on the statistical properties of their inputs. A model-based analysis performed on a second set of in vitro recordings demonstrates that the spike initiation dynamics implement a complex form of adaptation that makes cortical neurons act as coincidence detectors, regardless of the input statistics. This result indicates that cortical neurons are well suited to support a temporal code in which the relevant information is carried by the precise timing of spikes. The spiking model introduced in this thesis was not designed to study one particular aspect of single-neuron computation and achieves good performance in predicting the spiking activity of different neuronal types. The proposed method for parameter estimation is efficient and only requires a limited amount of data. If applied to large datasets, the mathematical framework presented in this thesis could therefore lead to automated, high-throughput single-neuron characterization.
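    A rough sketch of the power-law adaptation discussed above: a spike-triggered kernel that decays as a power law, compared with a sum of exponentials spanning tens of milliseconds to tens of seconds. The functional forms and constants are illustrative assumptions, not the kernels fitted in the thesis.

```python
# Minimal sketch of a power-law spike-triggered adaptation kernel and a
# sum-of-exponentials approximation; all constants are illustrative assumptions.
import numpy as np

def power_law_kernel(t, amplitude=1.0, t0=10.0, exponent=0.8):
    """Adaptation strength t milliseconds after a spike, decaying as a power law."""
    return amplitude * (1.0 + t / t0) ** (-exponent)

def multi_exp_kernel(t, taus=(30.0, 300.0, 3_000.0, 30_000.0),
                     weights=(0.55, 0.25, 0.13, 0.07)):
    """Sum of exponentials spanning several decades of timescales (in ms)."""
    return sum(w * np.exp(-t / tau) for w, tau in zip(weights, taus))

# Compare the two kernels on a log-spaced grid from 1 ms to ~30 s
t = np.logspace(0, 4.5, 200)
print(np.corrcoef(np.log(power_law_kernel(t)), np.log(multi_exp_kernel(t)))[0, 1])
```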

    A new Mathematical Framework to Understand Single Neuron Computations

    An important feature of the nervous system is its ability to adapt to new stimuli. This adaptation allows for optimal encoding of incoming information by dynamically changing the coding strategy based on the inputs a neuron receives. At the level of single cells, this widespread phenomenon is often referred to as spike-frequency adaptation, since it manifests as a history-dependent modulation of the neuron's firing frequency. In this thesis I focus on how a neuron is able to adapt its activity to a specific input, as well as on the function of such adaptive mechanisms. Different approaches have been used to study these adaptive processes, from empirical observations of neural activity to detailed modeling of single cells. Here, I approach these problems using simplified threshold models. In particular, I introduce a new generalization of the integrate-and-fire model (GIF) along with a convex fitting method allowing for efficient estimation of model parameters. Despite its relative simplicity, I show that this neuron model is able to reproduce neuronal behavior with a high degree of accuracy. Moreover, using this method I was able to show that cortical neurons are equipped with two distinct adaptation mechanisms. The first is a spike-triggered current that captures the complex flux of ions generated after the emission of a spike; the second is a movement of the firing threshold, which possibly reflects the slow inactivation of sodium channels induced by spiking activity. The precise dynamics of these adaptation processes are cell-type specific, explaining the differences in firing activity reported across neuron types. Consequently, neuronal types can be classified based on model parameters. In pyramidal neurons, spike-dependent adaptation lasts for seconds and follows scale-free dynamics that are optimally tuned to encode the natural inputs pyramidal neurons receive in vivo. Finally, using an extended version of the GIF model, I show that adaptation is not only a spike-dependent phenomenon but also acts at the subthreshold level. In pyramidal neurons, the dynamics of the firing threshold are influenced by the subthreshold membrane potential. Spike-dependent and voltage-dependent adaptation interact in an activity-dependent way to ultimately shape the filtering properties of the membrane according to the input statistics. Equipped with such a mechanism, pyramidal neurons behave as integrators for weak inputs and as coincidence detectors for strong inputs, maintaining sensitivity to input fluctuations across all regimes.
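    The coupled threshold dynamics described above can be sketched as a single update rule in which the threshold tracks a voltage-dependent steady state between spikes and jumps after each spike. The functional form and parameters are illustrative assumptions, not the fitted GIF model.

```python
# Minimal sketch of a firing threshold with voltage-dependent (subthreshold) and
# spike-dependent adaptation; the functional form and parameters are assumptions.
def update_threshold(theta, V, spiked, dt=0.1, tau_theta=30.0,
                     theta_base=-50.0, beta=0.2, V_half=-60.0, delta=8.0):
    """One Euler step (dt in ms) of the threshold dynamics."""
    # the steady-state threshold rises when the membrane is depolarized,
    # mimicking slow sodium-channel inactivation
    theta_inf = theta_base + beta * max(V - V_half, 0.0)
    theta += dt / tau_theta * (theta_inf - theta)
    if spiked:
        theta += delta        # spike-triggered threshold increase
    return theta

# Example: the threshold creeps upward while the cell sits at a depolarized potential
theta = -50.0
for _ in range(5_000):        # 500 ms at dt = 0.1 ms
    theta = update_threshold(theta, V=-55.0, spiked=False)
print(round(theta, 2))
```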

    Two-photon imaging and analysis of neural network dynamics

    The glow of a starry night sky, the smell of a freshly brewed cup of coffee or the sound of ocean waves breaking on the beach are representations of the physical world that have been created by the dynamic interactions of thousands of neurons in our brains. How the brain mediates perceptions, creates thoughts, stores memories and initiates actions remains one of the most profound puzzles in biology, if not all of science. A key to a mechanistic understanding of how the nervous system works is the ability to analyze the dynamics of neuronal networks in the living organism in the context of sensory stimulation and behaviour. Dynamic brain properties have been fairly well characterized on the microscopic level of individual neurons and on the macroscopic level of whole brain areas, largely with the help of various electrophysiological techniques. However, our understanding of the mesoscopic level comprising local populations of hundreds to thousands of neurons (so-called 'microcircuits') remains comparably poor. In large part, this has been due to the technical difficulties involved in recording from large networks of neurons with single-cell spatial resolution and near-millisecond temporal resolution in the brain of living animals. In recent years, two-photon microscopy has emerged as a technique which meets many of these requirements and has thus become the method of choice for the interrogation of local neural circuits. Here, we review the state of research in the field of two-photon imaging of neuronal populations, covering the topics of microscope technology, suitable fluorescent indicator dyes, staining techniques, and in particular analysis techniques for extracting relevant information from the fluorescence data. We expect that functional analysis of neural networks using two-photon imaging will help to decipher fundamental operational principles of neural microcircuits. Comment: 36 pages, 4 figures, accepted for publication in Reports on Progress in Physics
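    As one example of the analysis steps such reviews cover, the sketch below converts a raw somatic fluorescence trace into a relative change (dF/F) using a sliding-percentile baseline. The window length and percentile are illustrative assumptions, not values prescribed by the review.

```python
# Minimal sketch of computing Delta F / F from a fluorescence trace with a
# sliding-percentile baseline; window length and percentile are assumptions.
import numpy as np

def delta_f_over_f(trace, fs=30.0, window_s=30.0, percentile=8):
    """Return dF/F for a 1-D fluorescence trace sampled at fs Hz."""
    half = int(window_s * fs / 2)
    baseline = np.array([
        np.percentile(trace[max(0, i - half):i + half + 1], percentile)
        for i in range(len(trace))
    ])
    return (trace - baseline) / baseline

# Example: synthetic 60 s trace with one calcium-transient-like bump
t = np.arange(0, 60, 1 / 30.0)
noise = np.random.default_rng(2).normal(0.0, 1.0, t.size)
trace = 100.0 + 20.0 * np.exp(-((t - 30.0) ** 2) / 2.0) + noise
print(round(delta_f_over_f(trace).max(), 3))
```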

    Inhibition Controls Asynchronous States of Neuronal Networks

    Computations in cortical circuits require action potentials from excitatory and inhibitory neurons. In this mini-review, I first provide a quick overview of findings indicating that GABAergic neurons play a fundamental role in coordinating spikes and generating synchronized network activity. Next, I argue that these observations helped popularize the notion that network oscillations require a high degree of spike correlation among interneurons which, in turn, produces synchronous inhibition of the local microcircuit. The aim of this text is to discuss some recent experimental and computational findings that support a complementary view: one in which interneurons participate actively in producing asynchronous states in cortical networks. This requires a proper mixture of shared excitation and inhibition leading to asynchronous activity between neighboring cells. Such a contribution from interneurons would be extremely important because it would tend to reduce the spike correlation between neighboring pyramidal cells, a drop in redundancy that could enhance the information-processing capacity of neural networks.
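    A minimal sketch of the asynchrony measure implied above: the pairwise correlation of binned spike counts between neighboring cells, where values near zero indicate an asynchronous state. The spike trains below are hypothetical.

```python
# Minimal sketch of pairwise spike-count correlation; the spike trains are hypothetical.
import numpy as np

def pairwise_count_correlation(spikes_a, spikes_b, t_max, bin_ms=50.0):
    """Pearson correlation of spike counts in bins of width bin_ms (times in ms)."""
    edges = np.arange(0.0, t_max + bin_ms, bin_ms)
    counts_a, _ = np.histogram(spikes_a, edges)
    counts_b, _ = np.histogram(spikes_b, edges)
    return np.corrcoef(counts_a, counts_b)[0, 1]

rng = np.random.default_rng(3)
a = np.sort(rng.uniform(0, 10_000, 50))    # 50 spikes over 10 s, times in ms
b = np.sort(rng.uniform(0, 10_000, 50))
print(round(pairwise_count_correlation(a, b, t_max=10_000), 3))
```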

    Determination of the dynamic gain function of cortical interneurons with distinct electrical types


    Fractals in the Nervous System: Conceptual Implications for Theoretical Neuroscience

    This essay is presented with two principal objectives in mind: first, to document the prevalence of fractals at all levels of the nervous system, giving credence to the notion of their functional relevance; and second, to draw attention to the as yet unresolved issues of the detailed relationships among power-law scaling, self-similarity, and self-organized criticality. As regards criticality, I will document that it has become a pivotal reference point in neurodynamics. Furthermore, I will emphasize the not yet fully appreciated significance of allometric control processes. For dynamic fractals, I will assemble reasons for attributing to them the capacity to adapt task execution to contextual changes across a range of scales. The final section consists of general reflections on the implications of the reviewed data and identifies what appear to be issues of fundamental importance for future research in the rapidly evolving topic of this review.

    Investigating the role of fast-spiking interneurons in neocortical dynamics

    PhD Thesis. Fast-spiking interneurons are the largest interneuronal population in the neocortex. It is well documented that this population is crucial for many functions of the neocortex, subserving aspects of neural computation such as gain control and enabling dynamic phenomena such as the generation of high-frequency oscillations. Fast-spiking interneurons, which mainly comprise the parvalbumin-expressing, soma-targeting basket cells, are also implicated in pathological dynamics, such as the propagation of seizures and the impaired coordination of activity in schizophrenia. In the present thesis, I investigate the role of fast-spiking interneurons in such dynamic phenomena using computational and experimental techniques. First, I introduce a neural mass model of the neocortical microcircuit featuring divisive inhibition, a gain-control mechanism thought to be delivered mainly by the soma-targeting interneurons. Its dynamics were analysed at the onset of chaos and during the phenomena of entrainment and long-range synchronization. It is demonstrated that the mechanism of divisive inhibition reduces the sensitivity of the network to parameter changes and enhances the stability and flexibility of oscillations. Next, in vitro electrophysiology was used to investigate the propagation of activity in the network of electrically coupled fast-spiking interneurons. Experimental evidence suggests that these interneurons and their gap junctions are involved in the propagation of seizures. Using multi-electrode array recordings and optogenetics, I investigated the possibility of such propagating activity under conditions of raised extracellular K+ concentration, as occurs during seizures. Propagated activity was recorded, and the involvement of gap junctions was confirmed by pharmacological manipulations. Finally, the interaction between two oscillations was investigated. Two oscillations with different frequencies were induced in cortical slices by directly activating the pyramidal cells using optogenetics. Their interaction suggested the possibility of a coincidence detection mechanism at the circuit level. Pharmacological manipulations were used to explore the role of the inhibitory interneurons in this phenomenon; the results, however, showed that the observed phenomenon was not a result of synaptic activity. Nevertheless, the experiments provided some insights into the excitability of the tissue through scattered light during optogenetic stimulation. This investigation provides new insights into the role of fast-spiking interneurons in the neocortex. In particular, it is suggested that the gain-control mechanism is important for the physiological oscillatory dynamics of the network and that the gap junctions between these interneurons can potentially contribute to the inhibitory restraint during a seizure. Funded by the Wellcome Trust.
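    A minimal sketch contrasting subtractive and divisive inhibition in a single firing-rate unit, the gain-control mechanism the neural mass model above builds on. The transfer function and constants are illustrative assumptions, not the thesis model.

```python
# Minimal sketch of subtractive versus divisive inhibition in a firing-rate unit;
# the transfer function and constants are illustrative assumptions.
import numpy as np

def rate_subtractive(excitation, inhibition, gain=1.0, threshold=1.0):
    """Inhibition shifts the input-output curve to the right."""
    return np.maximum(gain * (excitation - inhibition) - threshold, 0.0)

def rate_divisive(excitation, inhibition, gain=1.0, threshold=1.0, k=1.0):
    """Inhibition scales the slope (gain) of the input-output curve."""
    return np.maximum(gain * excitation / (1.0 + k * inhibition) - threshold, 0.0)

E = np.linspace(0.0, 10.0, 6)
print(rate_subtractive(E, inhibition=2.0))
print(rate_divisive(E, inhibition=2.0))
```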