
    Extracting non-linear integrate-and-fire models from experimental data using dynamic I–V curves

    The dynamic I–V curve method was recently introduced for the efficient experimental generation of reduced neuron models. The method extracts the response properties of a neuron while it is subject to a naturalistic stimulus that mimics in vivo-like fluctuating synaptic drive. The resulting history-dependent transmembrane current is then projected onto a one-dimensional current–voltage relation that provides the basis for a tractable non-linear integrate-and-fire model. An attractive feature of the method is that it can be used in spike-triggered mode to quantify the distinct patterns of post-spike refractoriness seen in different classes of cortical neuron. The method is first illustrated using a conductance-based model and is then applied experimentally to generate reduced models of cortical layer-5 pyramidal cells and interneurons, in injected-current and injected-conductance protocols. The resulting low-dimensional neuron models—of the refractory exponential integrate-and-fire type—provide highly accurate predictions of spike times. The method therefore provides a useful tool for the construction of tractable models and for the rapid experimental classification of cortical neurons.
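The exponential integrate-and-fire dynamics underlying such reduced models can be sketched in a few lines. This is a minimal forward-Euler integration with illustrative parameter values (not the fitted values from the paper, and without the refractory terms):

```python
import math

def simulate_eif(I, t_max=0.2, dt=1e-5,
                 C=250e-12, g_L=10e-9, E_L=-70e-3,
                 V_T=-50e-3, delta_T=2e-3,
                 V_cut=0e-3, V_reset=-60e-3):
    """Forward-Euler integration of an exponential integrate-and-fire neuron.

    C dV/dt = -g_L (V - E_L) + g_L * delta_T * exp((V - V_T)/delta_T) + I
    A spike is registered when V crosses V_cut, after which V is reset.
    Parameter values are illustrative, in SI units (F, S, V, A, s).
    """
    V = E_L
    spike_times = []
    t = 0.0
    while t < t_max:
        dV = (-g_L * (V - E_L)
              + g_L * delta_T * math.exp((V - V_T) / delta_T)
              + I) / C
        V += dt * dV
        if V >= V_cut:          # spike: record time and reset
            spike_times.append(t)
            V = V_reset
        t += dt
    return spike_times
```

With no input the voltage rests near E_L and no spikes occur; a suprathreshold current drive produces regular firing.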

    Membrane resonance enables stable and robust gamma oscillations

    Neuronal mechanisms underlying beta/gamma oscillations (20-80 Hz) are not completely understood. Here, we show that in vivo beta/gamma oscillations in the cat visual cortex sometimes exhibit remarkably stable frequency even when inputs fluctuate dramatically. Enhanced frequency stability is associated with stronger oscillations measured in individual units and larger power in the local field potential. Simulations of neuronal circuitry demonstrate that membrane properties of inhibitory interneurons strongly determine the characteristics of emergent oscillations. Exploration of networks containing either integrator or resonator inhibitory interneurons revealed that: (i) Resonance, as opposed to integration, promotes robust oscillations with large power and stable frequency via a mechanism called RING (Resonance INduced Gamma); resonance favors synchronization by reducing phase delays between interneurons and imposes bounds on oscillation cycle duration; (ii) Stability of frequency and robustness of the oscillation also depend on the relative timing of excitatory and inhibitory volleys within the oscillation cycle; (iii) RING can reproduce characteristics of both Pyramidal INterneuron Gamma (PING) and INterneuron Gamma (ING), transcending such classifications; (iv) In RING, robust gamma oscillations are promoted by slow inputs but impaired by fast ones. The results suggest that interneuronal membrane resonance can be an important ingredient for the generation of robust gamma oscillations with stable frequency.
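The integrator-versus-resonator distinction can be illustrated with a linearized membrane impedance: an integrator (pure RC membrane) has its impedance peak at 0 Hz, while adding a slow opposing conductance branch shifts the peak to a nonzero resonance frequency. All parameter values below are hypothetical, chosen only to make the contrast visible:

```python
import math

def impedance_peak_hz(C=1e-9, g=50e-9, g_res=0.0, tau_res=0.02):
    """Frequency (Hz, on a 0-200 Hz grid) where membrane impedance |Z| peaks.

    Linearized membrane: leak conductance g and capacitance C in parallel,
    plus an optional slow resonant branch (gain g_res, time constant tau_res)
    that opposes slow voltage changes, as resonant currents like I_h or
    low-threshold K+ do in the linear regime.
    """
    best_f, best_mag = 0.0, 0.0
    for f in range(0, 201):
        w = 2 * math.pi * f
        Y = g + 1j * w * C                        # admittance of the RC part
        if g_res > 0:
            Y += g_res / (1 + 1j * w * tau_res)   # slow opposing branch
        mag = abs(1 / Y)                          # impedance magnitude
        if mag > best_mag:
            best_f, best_mag = f, mag
    return best_f
```

With `g_res=0` the peak sits at 0 Hz (integrator); with a sizable `g_res` it moves to a nonzero frequency (resonator).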

    Entropy-based parametric estimation of spike train statistics

    We consider the evolution of a network of neurons, focusing on the asymptotic behavior of spike dynamics rather than membrane-potential dynamics. In this context the spike response is not sought as a deterministic response but as a conditional probability: "reading out the code" consists of inferring such a probability. This probability is computed from empirical raster plots using the framework of thermodynamic formalism in ergodic theory. This gives us a parametric statistical model in which the probability has the form of a Gibbs distribution. In this respect, the approach generalizes the seminal and profound work of Schneidman and collaborators. A minimal presentation of the formalism is reviewed here, and a general algorithmic estimation method is proposed that yields fast convergent implementations. It is also made explicit how several spike observables (entropy, rate, synchronizations, correlations) are given in closed form from the parametric estimation. This paradigm allows us not only to estimate the spike statistics, given a design choice, but also to compare different models, thus answering comparative questions about the neural code such as: "Are correlations (or time synchrony, or a given set of spike patterns, ...) significant with respect to rate coding only?" A numerical validation of the method is proposed, and the perspectives regarding spike-train code analysis are discussed. Comment: 37 pages, 8 figures, submitted.
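The Gibbs form of a spike-pattern distribution can be made concrete for a toy network: enumerate the binary spike patterns, weight each by an Ising-like potential, normalize by the partition function, and read off probabilities and entropy in closed form. This is only a sketch of the distributional form, not the paper's estimation algorithm (which handles time-dependent potentials and empirical rasters):

```python
import itertools
import math

def gibbs_spike_model(h, J):
    """Gibbs distribution over binary spike patterns of N = len(h) neurons.

    P(sigma) is proportional to exp( sum_i h[i]*sigma_i
                                   + sum_{i<j} J[(i,j)]*sigma_i*sigma_j ).
    Returns (probabilities dict keyed by pattern, entropy in bits).
    """
    N = len(h)
    weights = {}
    for sigma in itertools.product((0, 1), repeat=N):
        logw = sum(h[i] * sigma[i] for i in range(N))
        logw += sum(Jij * sigma[i] * sigma[j] for (i, j), Jij in J.items())
        weights[sigma] = math.exp(logw)
    Z = sum(weights.values())                     # partition function
    probs = {s: w / Z for s, w in weights.items()}
    entropy = -sum(p * math.log2(p) for p in probs.values() if p > 0)
    return probs, entropy
```

With all parameters zero the distribution is uniform over the 2^N patterns and the entropy is exactly N bits; nonzero pairwise terms J lower it, which is how correlation significance can be quantified against rate coding alone.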

    The chronotron: a neuron that learns to fire temporally-precise spike patterns

    In many cases, neurons process information carried by the precise timing of spikes. Here we show how neurons can learn to generate specific, temporally precise output spikes in response to input spike patterns, thus processing and memorizing information that is fully temporally coded, both as input and as output. We introduce two new supervised learning rules for spiking neurons with temporal coding of information (chronotrons): one that is analytically derived and highly efficient, and one that has a high degree of biological plausibility. We show how chronotrons can learn to classify their inputs, and we study their memory capacity.
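The flavor of supervised spike-time learning can be sketched with a simple delta rule: drive the membrane potential, modeled as a weighted sum of postsynaptic-potential kernels, up to threshold exactly at the desired spike time. The kernel shape, parameters, and the rule itself below are illustrative assumptions in the same spirit, not the paper's chronotron learning rules:

```python
import math

def psp(t, tau=0.01):
    """Normalized alpha-function postsynaptic potential kernel (t in seconds)."""
    return (t / tau) * math.exp(1.0 - t / tau) if t > 0 else 0.0

def train_to_fire_at(input_times, t_target, theta=1.0, eta=0.1, epochs=200):
    """Delta-rule sketch: adapt weights until the summed PSPs of the input
    spikes reach the firing threshold theta at the target spike time."""
    w = [0.0] * len(input_times)
    for _ in range(epochs):
        u = sum(w[i] * psp(t_target - ti) for i, ti in enumerate(input_times))
        err = theta - u
        if abs(err) < 1e-3:
            break
        for i, ti in enumerate(input_times):
            # credit each input by how much its PSP contributes at t_target
            w[i] += eta * err * psp(t_target - ti)
    return w
```

Inputs arriving after the target time have zero kernel value at `t_target` and so receive no weight change, which already captures the causal structure of temporal credit assignment.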

    Inhibitory synchrony as a mechanism for attentional gain modulation

    Recordings from area V4 of monkeys have revealed that when the focus of attention is on a visual stimulus within the receptive field of a cortical neuron, two distinct changes can occur: the firing rate of the neuron can change, and there can be an increase in the coherence between spikes and the local field potential in the gamma-frequency range (30-50 Hz). The hypothesis explored here is that these observed effects of attention could be a consequence of changes in the synchrony of local interneuron networks. We performed computer simulations of a Hodgkin-Huxley-type neuron driven by a constant depolarizing current, I, representing visual stimulation, and by a modulatory inhibitory input representing the effects of attention via local interneuron networks. We observed that the neuron's firing rate and the coherence of its output spike train with the synaptic inputs were modulated by the degree of synchrony of the inhibitory inputs. The model suggests that the observed changes in firing rate and coherence of neurons in the visual cortex could be controlled by top-down inputs that regulate the coherence in the activity of a local inhibitory network discharging at gamma frequencies. Comment: J. Physiology (Paris), in press, 11 figures.
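The key manipulation, varying the synchrony of the inhibitory input population, can be sketched without a full Hodgkin-Huxley simulation: jitter the spike times of a set of gamma-locked interneurons and measure the modulation depth of their summed synaptic conductance. Tight synchrony produces deep troughs of disinhibition each cycle; heavy jitter flattens the conductance toward a constant. All parameter values are illustrative:

```python
import math
import random

def inhibitory_conductance_modulation(jitter_sd, n_inputs=50, freq=40.0,
                                      tau=0.005, dt=1e-4, t_max=0.1, seed=0):
    """Peak-to-trough depth (max - min) of the summed inhibitory conductance
    when n_inputs interneurons fire at a gamma frequency (Hz) with Gaussian
    spike-time jitter of standard deviation jitter_sd (s)."""
    rng = random.Random(seed)              # seeded for reproducibility
    period = 1.0 / freq
    n_steps = int(t_max / dt)
    g = [0.0] * n_steps
    for _ in range(n_inputs):
        t_spike = 0.0
        while t_spike < t_max:
            t0 = t_spike + rng.gauss(0.0, jitter_sd)   # jittered spike time
            for k in range(n_steps):
                dtk = k * dt - t0
                if dtk > 0:
                    g[k] += math.exp(-dtk / tau)       # exponential synapse
            t_spike += period
    return max(g) - min(g)
```

Sub-millisecond jitter yields a strongly modulated conductance, while jitter on the order of half the gamma period nearly abolishes the modulation, which is the axis along which the abstract's attention effect operates.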

    Soma-Axon Coupling Configurations That Enhance Neuronal Coincidence Detection

    Coincidence detector neurons transmit timing information by responding preferentially to concurrent synaptic inputs. Principal cells of the medial superior olive (MSO) in the mammalian auditory brainstem are superb coincidence detectors. They encode sound-source location with high temporal precision, distinguishing submillisecond timing differences among inputs. We investigate computationally how dynamic coupling between the input region (soma and dendrite) and the spike-generating output region (axon and axon initial segment) can enhance coincidence detection in MSO neurons. To do this, we formulate a two-compartment neuron model and extensively characterize coincidence detection sensitivity throughout a parameter space of coupling configurations. We focus on the interaction between coupling configuration and two currents that provide dynamic, voltage-gated, negative feedback in the subthreshold voltage range: sodium current with rapid inactivation, and low-threshold potassium current, IKLT. These currents reduce synaptic summation and can prevent spike generation unless inputs arrive with near simultaneity. We show that strong soma-to-axon coupling promotes the negative-feedback effects of sodium inactivation and is, therefore, advantageous for coincidence detection. Furthermore, the feedforward combination of strong soma-to-axon coupling and weak axon-to-soma coupling enables spikes to be generated efficiently (few sodium channels needed) and with rapid recovery that enhances high-frequency coincidence detection. These observations detail the functional benefit of the strongly feedforward configuration that has been observed in physiological studies of MSO neurons. We find that IKLT further enhances coincidence detection sensitivity, but with effects that depend on coupling configuration. For instance, in models with weak soma-to-axon and weak axon-to-soma coupling, IKLT in the axon enhances coincidence detection more effectively than IKLT in the soma. By using a minimal model of soma-to-axon coupling, we connect structure, dynamics, and computation. Although we consider the particular case of MSO coincidence detectors, our method for creating and exploring a parameter space of two-compartment models can be applied to other neurons.
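The asymmetry between forward (soma-to-axon) and backward (axon-to-soma) coupling can be illustrated with the steady-state voltages of a passive two-compartment circuit, a static simplification of the paper's dynamic model with hypothetical conductance values. Inject DC current into one compartment and read the voltage ratio in the other:

```python
def coupling_strengths(g_soma, g_axon, g_c):
    """Steady-state voltage attenuation between two passive compartments
    joined by coupling conductance g_c (all conductances in siemens).

    From 0 = -g_other * V_other + g_c * (V_in - V_other), the ratio
    V_other / V_in = g_c / (g_other + g_c) in each direction.
    Returns (forward, backward) = (soma-to-axon, axon-to-soma) attenuation.
    """
    forward = g_c / (g_axon + g_c)    # V_axon / V_soma for somatic input
    backward = g_c / (g_soma + g_c)   # V_soma / V_axon for axonal input
    return forward, backward
```

A leaky soma combined with a compact axon (g_soma much larger than g_axon) yields the strongly feedforward configuration described above: somatic signals reach the axon almost unattenuated while axonal voltages barely propagate back.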

    Scaling of a large-scale simulation of synchronous slow-wave and asynchronous awake-like activity of a cortical model with long-range interconnections

    Cortical synapse organization supports a range of dynamic states on multiple spatial and temporal scales, from synchronous slow-wave activity (SWA), characteristic of deep sleep or anesthesia, to fluctuating, asynchronous activity during wakefulness (AW). Such dynamic diversity poses a challenge for producing efficient large-scale simulations that embody realistic metaphors of short- and long-range synaptic connectivity. In fact, during SWA and AW different spatial extents of the cortical tissue are active in a given timespan and at different firing rates, which implies a wide variety of loads of local computation and communication. A balanced evaluation of simulation performance and robustness should therefore include tests on a variety of cortical dynamic states. Here, we demonstrate performance scaling of our proprietary Distributed and Plastic Spiking Neural Networks (DPSNN) simulation engine in both SWA and AW for bidimensional grids of neural populations, which reflect the modular organization of the cortex. We explored networks of up to 192x192 modules, each composed of 1250 integrate-and-fire neurons with spike-frequency adaptation and exponentially decaying inter-modular synaptic connectivity with varying spatial decay constant. For the largest networks, the total number of synapses was over 70 billion. The execution platform included up to 64 dual-socket nodes, each socket mounting eight Intel Xeon Haswell processor cores running at a 2.40 GHz clock rate. Network initialization time, memory usage, and execution time showed good scaling performance from 1 to 1024 processes, implemented using the standard Message Passing Interface (MPI) protocol. We achieved simulation speeds between 2.3x10^9 and 4.1x10^9 synaptic events per second for both cortical states in the explored range of inter-modular interconnections. Comment: 22 pages, 9 figures, 4 tables.
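The scale of the problem follows from simple arithmetic on the figures quoted above (192x192 modules of 1250 neurons, about 70 billion synapses). The mean firing rate below is an assumed illustrative value, not a figure from the abstract:

```python
def network_load(grid_side=192, neurons_per_module=1250,
                 synapses_total=70e9, mean_rate_hz=3.0):
    """Back-of-envelope load estimate for a modular cortical grid.

    Returns (total neurons, mean synapses per neuron, synaptic events per
    second of simulated biological time). mean_rate_hz is an assumed
    network-average firing rate, used only for illustration.
    """
    n_neurons = grid_side * grid_side * neurons_per_module
    syn_per_neuron = synapses_total / n_neurons
    # each synapse delivers one event per presynaptic spike
    events_per_sec = synapses_total * mean_rate_hz
    return n_neurons, syn_per_neuron, events_per_sec
```

At the defaults this gives about 46 million neurons with roughly 1500 synapses each, and around 2x10^11 synaptic events per simulated second, which puts the reported throughput of 2.3-4.1x10^9 events per wall-clock second into perspective.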
