
    A universal model for spike-frequency adaptation

    Spike-frequency adaptation is a prominent feature of neural dynamics. Among other mechanisms, various ionic currents modulating spike generation cause this type of neural adaptation. Prominent examples are voltage-gated potassium currents (M-type currents), the interplay of calcium currents and intracellular calcium dynamics with calcium-gated potassium channels (AHP-type currents), and the slow recovery from inactivation of the fast sodium current. While recent modeling studies have focused on the effects of specific adaptation currents, we derive a universal model for the firing-frequency dynamics of an adapting neuron that is independent of the specific adaptation process and spike generator. The model is completely defined by the neuron's onset f-I curve, the steady-state f-I curve, and the time constant of adaptation. For a specific neuron, these parameters can easily be determined from electrophysiological measurements without any pharmacological manipulations. At the same time, the simplicity of the model allows one to analyze mathematically how adaptation influences signal processing at the single-neuron level. In particular, we elucidate the specific nature of the high-pass filter properties caused by spike-frequency adaptation. The model is limited to firing frequencies higher than the reciprocal adaptation time constant and to moderate fluctuations of the adaptation and the input current. As an extension of the model, we introduce a framework for combining an arbitrary spike generator with a generalized adaptation current.
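
    The structure of such a firing-rate adaptation model can be sketched in a few lines: the instantaneous rate is the onset f-I curve evaluated at an effective input reduced by an adaptation variable, and the adaptation variable relaxes with the adaptation time constant toward a rate-dependent level, so that the steady-state rate follows a lower, adapted f-I curve. The sketch below is a generic illustration with assumed curves and parameters (threshold-linear f-I curve, subtractive adaptation), not the specific formulation of the paper.

```python
# Minimal sketch of a firing-rate model with spike-frequency adaptation.
# Assumptions (not from the paper): a threshold-linear onset f-I curve and
# a subtractive adaptation variable A with time constant tau_a.
import numpy as np

def f_onset(I):
    """Illustrative onset f-I curve: threshold-linear (Hz per unit input)."""
    return np.maximum(I - 10.0, 0.0) * 20.0

tau_a = 0.1     # adaptation time constant (s), assumed
alpha = 0.3     # strength of the rate-driven adaptation, assumed
dt = 1e-4       # integration time step (s)
n = 10000       # number of time steps (1 s total)

# step increase of the input current halfway through the simulation
I = np.where(np.arange(n) < n // 2, 20.0, 30.0)

A = 0.0
rate = np.empty(n)
for k in range(n):
    f = f_onset(I[k] - A)              # onset f-I curve applied to the effective input
    A += dt / tau_a * (alpha * f - A)  # adaptation relaxes toward a rate-dependent level
    rate[k] = f

# At the step the rate jumps (onset response given by the onset f-I curve)
# and then decays toward the adapted steady state with time constant ~tau_a.
print("before step: %.1f Hz, onset: %.1f Hz, adapted: %.1f Hz"
      % (rate[n // 2 - 1], rate[n // 2], rate[-1]))
```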

    Fractals in the Nervous System: Conceptual Implications for Theoretical Neuroscience

    This essay is presented with two principal objectives in mind: first, to document the prevalence of fractals at all levels of the nervous system, giving credence to the notion of their functional relevance; and second, to draw attention to the still unresolved issues of the detailed relationships among power-law scaling, self-similarity, and self-organized criticality. As regards criticality, I will document that it has become a pivotal reference point in neurodynamics. Furthermore, I will emphasize the not yet fully appreciated significance of allometric control processes. For dynamic fractals, I will assemble reasons for attributing to them the capacity to adapt task execution to contextual changes across a range of scales. The final section consists of general reflections on the implications of the reviewed data and identifies what appear to be issues of fundamental importance for future research in the rapidly evolving topic of this review.

    Logarithmic distributions prove that intrinsic learning is Hebbian

    In this paper, we present data on the lognormal distributions of spike rates, synaptic weights, and intrinsic excitability (gain) for neurons in various brain areas, such as auditory or visual cortex, hippocampus, cerebellum, striatum, and midbrain nuclei. We find a remarkable consistency of heavy-tailed, specifically lognormal, distributions for rates, weights, and gains in all brain areas examined. Differences in connectivity (strongly recurrent cortex vs. feed-forward striatum and cerebellum), neurotransmitter (GABA in striatum vs. glutamate in cortex), and level of activation (low in cortex, high in Purkinje cells and midbrain nuclei) turn out to be irrelevant for this feature. Logarithmic-scale distributions of weights and gains thus appear to be a general, functional property in all cases analyzed. We then created a generic neural model to investigate adaptive learning rules that create and maintain lognormal distributions. We demonstrate conclusively that not only the weights but also the intrinsic gains need strong Hebbian learning in order to produce and maintain the experimentally attested distributions. This provides a solution to the long-standing question about the type of plasticity exhibited by intrinsic excitability.
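
    One intuition behind the link between Hebbian (activity-proportional, hence multiplicative) updates and lognormal distributions is Gibrat-style multiplicative growth: if each update scales a weight by a random factor, the log-weights accumulate additive random increments and become approximately Gaussian. The toy sketch below illustrates only this generic effect under assumed parameters; it is not the paper's model of weights and intrinsic gains.

```python
# Toy demonstration: repeated multiplicative (Hebbian-like) updates push a
# weight population toward a lognormal shape. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_synapses = 10_000
w = np.full(n_synapses, 1.0)           # start with identical weights

for _ in range(500):
    # multiplicative update: each weight is scaled by a small random factor,
    # as happens when the change is proportional to the current weight
    w *= np.exp(rng.normal(loc=0.0, scale=0.05, size=n_synapses))
    w /= w.mean()                       # crude normalization to keep weights bounded

log_w = np.log(w)
print("mean(log w) = %.3f, std(log w) = %.3f" % (log_w.mean(), log_w.std()))
# A histogram of log_w is approximately Gaussian, i.e. w is ~lognormal.
```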

    The Impact of Spike-Frequency Adaptation on Balanced Network Dynamics

    A dynamic balance between strong excitatory and inhibitory neuronal inputs is hypothesized to play a pivotal role in information processing in the brain. While there is evidence of a balanced operating regime in several cortical areas and in idealized neuronal network models, it is important for the theory of balanced networks to be reconciled with more physiological neuronal modeling assumptions. In this work, we examine the impact of spike-frequency adaptation, observed widely across neurons in the brain, on balanced dynamics. We incorporate adaptation into binary and integrate-and-fire neuronal network models, analyzing the theoretical effect of adaptation in the large-network limit and performing an extensive numerical investigation of the model's adaptation parameter space. Our analysis demonstrates that balance is well preserved for moderate adaptation strength, even if the entire network exhibits adaptation. In the common physiological case in which only excitatory neurons undergo adaptation, we show that the balanced operating regime in fact widens relative to the non-adaptive case. We hypothesize that spike-frequency adaptation may have been selected through evolution to robustly facilitate balanced dynamics across diverse cognitive operating states.
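
    A standard way to endow an integrate-and-fire neuron with spike-frequency adaptation, in models of this kind, is a spike-triggered adaptation current that is incremented at each spike and decays between spikes. The sketch below shows this mechanism for a single leaky integrate-and-fire neuron with assumed parameters; it is only an illustration of the adaptation mechanism, not the paper's balanced-network model.

```python
# Sketch: leaky integrate-and-fire neuron with a spike-triggered adaptation
# current (incremented by b at each spike, decaying with time constant tau_w).
# All parameters are illustrative assumptions.
import numpy as np

dt, T = 1e-4, 2.0                 # time step and duration (s)
tau_m, v_th, v_reset = 0.02, 1.0, 0.0
tau_w, b = 0.3, 0.15              # adaptation time constant and increment
I_ext = 1.5                       # constant external drive (suprathreshold)

v, w = 0.0, 0.0
spikes = []
for k in range(int(T / dt)):
    v += dt / tau_m * (-v + I_ext - w)   # membrane driven by input minus adaptation current
    w -= dt / tau_w * w                  # adaptation decays between spikes
    if v >= v_th:
        v = v_reset
        w += b                           # spike-triggered increment
        spikes.append(k * dt)

isi = np.diff(np.array(spikes))
# Interspike intervals lengthen over time: the firing rate adapts.
print("first ISI: %.4f s, last ISI: %.4f s" % (isi[0], isi[-1]))
```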

    Noise-induced synchronization and anti-resonance in excitable systems: Implications for information processing in Parkinson's Disease and Deep Brain Stimulation

    We study the statistical physics of a surprising phenomenon arising in large networks of excitable elements in response to noise: while at low noise solutions remain in the vicinity of the resting state, and at large noise solutions show asynchronous activity, the network displays orderly, perfectly synchronized periodic responses at intermediate noise levels. We show that this phenomenon is fundamentally stochastic and collective in nature. Indeed, for noise and coupling within specific ranges, an asymmetry in the transition rates between a resting and an excited regime progressively builds up, leading to an increase in the fraction of excited neurons that eventually triggers a chain reaction associated with a macroscopic synchronized excursion and a collective return to rest, where this process starts afresh, thus yielding the observed periodic synchronized oscillations. We further uncover a novel anti-resonance phenomenon: noise-induced synchronized oscillations disappear when the system is driven by periodic stimulation with a frequency within a specific range. In that anti-resonance regime, the system is optimal with respect to measures of information capacity. This observation provides a new hypothesis accounting for the efficiency of Deep Brain Stimulation therapies in Parkinson's disease, a neurodegenerative disease characterized by increased synchronization of brain motor circuits. We further discuss the universality of these phenomena in the class of stochastic networks of excitable elements with confining coupling, and illustrate this universality by analyzing various classical models of neuronal networks. Altogether, these results uncover universal mechanisms supporting a regularizing impact of noise in excitable systems, reveal a novel anti-resonance phenomenon in these systems, and propose a new hypothesis for the efficiency of high-frequency stimulation in Parkinson's disease.
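
    The mechanism described above (noise-dependent escape from rest, facilitation by the fraction of excited units, and a collective return to rest) can be caricatured by a discrete-time stochastic network of three-state units (rest, excited, refractory). The code below is such a caricature with assumed rates and coupling, not the authors' model; whether it produces recurring synchronized excursions depends on the chosen noise level and coupling.

```python
# Caricature of a network of excitable units with three states:
# rest (0) -> excited (1) -> refractory (2) -> rest. The escape rate from rest
# grows with the noise level and with the fraction of excited units
# (excitatory coupling). All rates are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, steps, dt = 2000, 30000, 1e-3
sigma = 1.0            # noise level (compare small / intermediate / large)
coupling = 20.0        # collective facilitation of the rest -> excited transition
r_exc_to_ref = 10.0    # rate of leaving the excited state (1/s)
r_ref_to_rest = 2.0    # rate of recovering from refractoriness (1/s)

state = np.zeros(N, dtype=np.int8)   # all units start at rest
frac_excited = np.empty(steps)
for k in range(steps):
    f = np.mean(state == 1)
    r_escape = 0.2 * sigma + coupling * f       # noise-driven plus collective term
    u = rng.random(N)
    go_excited = (state == 0) & (u < r_escape * dt)
    go_refract = (state == 1) & (u < r_exc_to_ref * dt)
    go_rest = (state == 2) & (u < r_ref_to_rest * dt)
    state[go_excited] = 1
    state[go_refract] = 2
    state[go_rest] = 0
    frac_excited[k] = f

# For suitable sigma and coupling, frac_excited shows recurring collective
# excursions; at very low sigma the network stays near rest, and at very high
# sigma activity becomes asynchronous.
print("mean fraction excited: %.3f" % frac_excited.mean())
```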

    Input-driven components of spike-frequency adaptation can be unmasked in vivo

    Spike-frequency adaptation affects the response characteristics of many sensory neurons, and different biophysical processes contribute to this phenomenon. Many cellular mechanisms underlying adaptation are triggered by the spike output of the neuron in a feedback manner (e.g., specific potassium currents that are primarily activated by spiking activity). In contrast, other components of adaptation may be caused, in a feedforward way, by the sensory or synaptic input that the neuron receives. Examples include the viscoelasticity of mechanoreceptors, transducer adaptation in hair cells, and short-term synaptic depression. For a functional characterization of spike-frequency adaptation, it is essential to understand how adaptation depends on the input and output of the neuron. Here, we demonstrate how an input-driven component of adaptation can be uncovered in vivo from recordings of spike trains in an insect auditory receptor neuron, even if the total adaptation is dominated by output-driven components. Our method is based on identifying different inputs that yield the same output and switching suddenly between these inputs. In particular, we determined, for different sound frequencies, the intensities required to yield a predefined steady-state firing rate of the neuron. We then found that switching between these sound frequencies causes transient deviations of the firing rate. These firing-rate deflections are evidence of input-driven adaptation and can be used to quantify how this adaptation component affects the neural activity. Based on previous knowledge of the processes in auditory transduction, we conclude that for the investigated auditory receptor neurons this adaptation phenomenon is of mechanical origin.
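
    The logic of the switching protocol can be illustrated with a toy rate model that contains an output-driven adaptation variable (driven by the firing rate) and a separate input-driven adaptation variable for each of two input channels. If the two channels are tuned to the same steady-state rate, switching between them produces a transient rate deflection only when an input-driven component is present. The model and all parameters below are illustrative assumptions, not the receptor-neuron model of the paper.

```python
# Toy sketch of the switching protocol: two input channels tuned to the same
# adapted steady-state rate, with output-driven adaptation (A_out) shared by
# both and input-driven adaptation (A_in) private to each channel.
import numpy as np

def f_onset(x):
    return np.maximum(x, 0.0) * 50.0            # assumed onset f-I curve (Hz)

dt, n = 1e-4, 40000                             # 4 s simulated in 0.1 ms steps
tau_out, a_out = 0.10, 0.4                      # output-driven (spike-driven) adaptation
tau_in, a_in = 0.30, 0.5                        # input-driven (e.g. mechanical) adaptation
drive = (2.0, 2.0)                              # channel drives tuned to equal steady states

A_out, A_in = 0.0, [0.0, 0.0]
rate = np.empty(n)
for k in range(n):
    ch = int(k * dt) % 2                        # switch the active channel every second
    I_eff = drive[ch] * (1.0 - A_in[ch])        # input-driven adaptation scales the drive
    f = f_onset(I_eff - A_out)
    A_out += dt / tau_out * (a_out * f / 100.0 - A_out)
    for c in (0, 1):                            # a channel adapts only while it is driven
        target = a_in if c == ch else 0.0
        A_in[c] += dt / tau_in * (target - A_in[c])
    rate[k] = f

# The unused channel recovers from its input-driven adaptation, so every switch
# produces a transient rate deflection before the rate settles back to the
# common adapted steady state.
print("just before a switch: %.1f Hz, just after: %.1f Hz"
      % (rate[10000 - 10], rate[10000 + 10]))
```

    Setting a_in to zero removes the transient, which corresponds to the control case in which all adaptation is output-driven and switching between equally effective inputs leaves the rate unchanged.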

    Universal Statistical Behavior of Neural Spike Trains

    We construct a model that predicts the statistical properties of spike trains generated by a sensory neuron. The model describes the combined effects of the neuron's intrinsic properties, the noise in its surroundings, and the external driving stimulus. We show that the spike trains exhibit universal statistical behavior over short times, modulated by a strongly stimulus-dependent behavior over long times. These predictions are confirmed in experiments on H1, a motion-sensitive neuron in the fly visual system.