
    Learning intrinsic excitability in medium spiny neurons

    We present an unsupervised, local, activation-dependent learning rule for intrinsic plasticity (IP) that adjusts the composition of ion channel conductances of single neurons in a use-dependent way. We use a single-compartment conductance-based model of medium spiny striatal neurons to show the effects of the parametrization of individual ion channels on the neuronal activation function. We show that parameter changes within physiological ranges are sufficient to create an ensemble of neurons with significantly different activation functions. We emphasize that the effects of intrinsic neuronal variability on spiking behavior require a distributed mode of synaptic input and can be eliminated by strongly correlated input. We show how variability and adaptivity in ion channel conductances can be used to store patterns without an additional contribution from synaptic plasticity (SP). The adaptation of the spike response may result in either "positive" or "negative" pattern learning. However, read-out of the stored information depends on a distributed pattern of synaptic activity that lets intrinsic variability determine the spike response. We briefly discuss the implications of this conditional memory for learning and addiction. Comment: 20 pages, 8 figures.
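
    The paper's rule is not reproduced here, but its basic ingredients can be illustrated with a toy model: a single-compartment, conductance-based neuron whose recent firing rate feeds back onto one of its ion-channel conductances. The sketch below assumes a single adaptable inward-rectifier-like potassium conductance and a simple homeostatic update toward a target rate; the channel kinetics, learning rate, and target are illustrative assumptions, not the paper's parameters.

```python
# Minimal sketch (not the authors' model or learning rule): a single-
# compartment, conductance-based neuron in which one ion-channel
# conductance is updated by a local, activity-dependent intrinsic-
# plasticity (IP) rule.  Channel kinetics, learning rate, and target
# rate are illustrative assumptions.
import numpy as np

dt, C_m     = 0.1, 1.0        # ms, uF/cm^2
E_leak, E_K = -80.0, -90.0    # mV
g_leak      = 0.1             # mS/cm^2
g_kir       = 0.5             # mS/cm^2, adaptable inward-rectifier-like conductance
eta         = 1e-3            # IP learning rate (assumed)
target      = 10.0            # target firing rate in Hz (assumed set point)

def kir_gate(v):
    """Illustrative inward-rectifier activation: open at hyperpolarized V."""
    return 1.0 / (1.0 + np.exp((v + 80.0) / 10.0))

def firing_rate(i_ext, g_kir, t_max=1000.0):
    """Integrate the membrane equation and count threshold crossings (Hz)."""
    v, spikes = E_leak, 0
    for _ in np.arange(0.0, t_max, dt):
        i_leak = g_leak * (v - E_leak)
        i_kir = g_kir * kir_gate(v) * (v - E_K)
        v += dt / C_m * (i_ext - i_leak - i_kir)
        if v > -40.0:   # crude spike-and-reset instead of explicit Na/K spike currents
            spikes += 1
            v = E_leak
    return spikes / (t_max / 1000.0)

# Use-dependent IP: nudge the adaptable conductance so that the firing
# rate under a noisy ("distributed") input drifts toward the target rate.
rng = np.random.default_rng(0)
for step in range(50):
    rate = firing_rate(i_ext=5.0 + 0.5 * rng.standard_normal(), g_kir=g_kir)
    g_kir = float(np.clip(g_kir + eta * (rate - target), 0.05, 5.0))
print("adapted g_kir:", round(g_kir, 3))
```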

    Power-Law Inter-Spike Interval Distributions Infer a Conditional Maximization of Entropy in Cortical Neurons

    The brain is thought to use a relatively small amount of energy for its efficient information processing. Under a severe restriction on energy consumption, the maximization of mutual information (MMI), which is adequate for designing artificial processing machines, may not be suitable for the brain. MMI attempts to send information as accurately as possible, and this usually requires a sufficient energy supply for establishing clearly discretized communication bands. Here, we derive an alternative hypothesis for the neural code from neuronal activity recorded juxtacellularly in the sensorimotor cortex of behaving rats. Our hypothesis states that in vivo cortical neurons maximize the entropy of neuronal firing under two constraints, one limiting the energy consumption (as assumed previously) and one restricting the uncertainty in output spike sequences at a given firing rate. Thus, the conditional maximization of firing-rate entropy (CMFE) solves a tradeoff between energy cost and noise in the neuronal response. In short, the CMFE sends a rich variety of information through broader communication bands (i.e., widely distributed firing rates) at the cost of accuracy. We demonstrate that the CMFE is reflected in the long-tailed, typically power-law, distributions of inter-spike intervals obtained for the majority of recorded neurons. In other words, the power-law tails are more consistent with the CMFE than with the MMI. Thus, we propose a mathematical principle by which cortical neurons may represent information about synaptic input in their output spike trains.
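
    The argument hinges on comparing long-tailed inter-spike-interval (ISI) distributions against short-tailed alternatives. As a rough illustration of that kind of comparison (not the authors' analysis code), the sketch below fits the tail of an ISI sample with maximum-likelihood exponential and power-law models and reports their log-likelihoods; the tail cutoff x_min and the toy data are assumptions.

```python
# Hedged sketch: compare how well an exponential and a Pareto-type
# power-law tail describe a set of inter-spike intervals (ISIs).
# The cutoff x_min and the synthetic data are illustrative assumptions.
import numpy as np

def fit_tail(isis, x_min=0.1):
    """MLE parameters and log-likelihoods for the ISI tail (ISIs >= x_min, seconds)."""
    tail = np.asarray(isis, dtype=float)
    tail = tail[tail >= x_min]
    n = tail.size

    # Exponential tail: p(x) = lam * exp(-lam * (x - x_min))
    lam = 1.0 / np.mean(tail - x_min)
    ll_exp = n * np.log(lam) - lam * np.sum(tail - x_min)

    # Power-law tail: p(x) = (alpha - 1) / x_min * (x / x_min) ** (-alpha)
    alpha = 1.0 + n / np.sum(np.log(tail / x_min))
    ll_pow = n * np.log((alpha - 1.0) / x_min) - alpha * np.sum(np.log(tail / x_min))

    return {"lambda": lam, "alpha": alpha,
            "logL_exponential": ll_exp, "logL_powerlaw": ll_pow}

# Toy usage: long-tailed ISIs (Pareto draws) favour the power-law model.
rng = np.random.default_rng(0)
isis = 0.1 * (1.0 + rng.pareto(1.5, size=5000))
print(fit_tail(isis))
```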

    Spontaneous Local Gamma Oscillation Selectively Enhances Neural Network Responsiveness

    Synchronized oscillations are commonly observed in many neuronal systems and may play an important role in the response properties of the system. We have studied how spontaneous oscillatory activity affects the responsiveness of a neuronal network, using a neural network model of the visual cortex built from Hodgkin-Huxley type excitatory (E-) and inhibitory (I-) neurons. When the isotropic local E-I and I-E synaptic connections were sufficiently strong, the network commonly generated gamma-frequency oscillatory firing patterns in response to random feed-forward (FF) input spikes. This spontaneous oscillatory network activity injects a periodic local current that can amplify a weak synaptic input and enhance the network's responsiveness. When E-E connections were added, we found that the oscillation strength can be modulated by varying the FF input strength, without any changes in single-neuron properties or interneuron connectivity. The response modulation is proportional to the oscillation strength, which leads to self-regulation such that the cortical network selectively amplifies FF inputs according to their strength, without requiring any adaptation mechanism. We show that this selective cortical amplification is controlled by E-E cell interactions. We also found that the response amplification is spatially localized, suggesting that the responsiveness modulation may also be spatially selective. Together, these results point to a general mechanism by which neural oscillatory activity can enhance the selectivity of a neural network to FF inputs.
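
    The study itself simulates networks of Hodgkin-Huxley E- and I-cells; as a much cruder stand-in, the sketch below uses a two-population Wilson-Cowan-style rate model to illustrate the underlying idea that a local E-I loop driven by feed-forward input can oscillate at roughly gamma frequencies, with oscillation strength depending on the FF drive. All weights and time constants are assumed values, not the paper's.

```python
# Hedged sketch: a two-population E-I rate model (Wilson-Cowan style)
# standing in for the paper's spiking Hodgkin-Huxley network, used only
# to show oscillations whose strength varies with feed-forward drive.
import numpy as np

def ei_loop(ff_drive, t_max=1.0, dt=1e-4):
    """Simulate excitatory (E) and inhibitory (I) population rates."""
    tau_e, tau_i = 0.005, 0.010          # s; E faster than I
    w_ee, w_ei, w_ie = 2.0, 3.0, 2.5     # E->E, I->E, E->I couplings (assumed)
    f = lambda x: np.tanh(np.maximum(x, 0.0))   # saturating rate nonlinearity
    r_e, r_i, trace = 0.0, 0.0, []
    for _ in range(int(t_max / dt)):
        de = (-r_e + f(w_ee * r_e - w_ei * r_i + ff_drive)) / tau_e
        di = (-r_i + f(w_ie * r_e)) / tau_i
        r_e, r_i = r_e + dt * de, r_i + dt * di
        trace.append(r_e)
    return np.array(trace)

# Standard deviation of the E rate after a transient, as a crude proxy
# for oscillation strength at different FF input strengths.
for drive in (0.5, 1.0, 2.0):
    print(drive, round(float(ei_loop(drive)[5000:].std()), 4))
```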

    The response of a classical Hodgkin–Huxley neuron to an inhibitory input pulse

    A population of uncoupled neurons can often be brought close to synchrony by a single strong inhibitory input pulse affecting all neurons equally. This mechanism is thought to underlie some brain rhythms, in particular gamma-frequency (30–80 Hz) oscillations in the hippocampus and neocortex. Here we show that synchronization by an inhibitory input pulse often fails for populations of classical Hodgkin–Huxley neurons. Our reasoning suggests that, in general, synchronization by inhibitory input pulses can fail when the transition of the target neurons from rest to spiking involves a Hopf bifurcation, especially when inhibition is shunting rather than hyperpolarizing. Surprisingly, synchronization is more likely to fail when the inhibitory pulse is stronger or longer-lasting. These findings have potential implications for the question of which neurons participate in brain rhythms, in particular in gamma oscillations.
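
    The single-neuron ingredient of this result can be explored directly: a classical Hodgkin–Huxley neuron firing tonically under constant current, hit by a square inhibitory conductance pulse whose reversal potential is set to a shunting or a hyperpolarizing value. The sketch below uses standard HH parameters; the pulse timing, strength, and reversal potentials are illustrative choices, not those studied in the paper.

```python
# Hedged sketch: a classical Hodgkin-Huxley neuron, driven to fire
# tonically by a constant current, receiving a square inhibitory
# conductance pulse.  Pulse timing, strength, and reversal potential
# (shunting vs. hyperpolarizing) are illustrative parameters.
import numpy as np

# Standard HH parameters (mV, ms, uF/cm^2, mS/cm^2, uA/cm^2)
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4

def alpha_beta(v):
    am = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
    an = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(v + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

def run(i_drive=10.0, g_pulse=1.0, e_inh=-65.0,
        pulse_on=60.0, pulse_off=65.0, t_max=150.0, dt=0.01):
    """Return spike times (ms) of an HH neuron hit by an inhibitory pulse.

    e_inh = -65 mV approximates shunting inhibition, -80 mV hyperpolarizing.
    """
    v, m, h, n = -65.0, 0.05, 0.6, 0.32
    spikes, above = [], False
    for step in range(int(t_max / dt)):
        t = step * dt
        am, bm, ah, bh, an, bn = alpha_beta(v)
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        g_syn = g_pulse if pulse_on <= t < pulse_off else 0.0
        i_ion = (g_Na * m**3 * h * (v - E_Na) + g_K * n**4 * (v - E_K)
                 + g_L * (v - E_L) + g_syn * (v - e_inh))
        v += dt * (i_drive - i_ion) / C_m
        if v > 0.0 and not above:      # record upward zero crossings as spikes
            spikes.append(round(t, 2))
        above = v > 0.0
    return spikes

# Compare post-pulse spike times for shunting vs. hyperpolarizing inhibition.
print("shunting:       ", run(e_inh=-65.0))
print("hyperpolarizing:", run(e_inh=-80.0))
```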