
    Inhibition of rhythmic neural spiking by noise: the occurrence of a minimum in activity with increasing noise

    The effects of noise on neuronal dynamical systems are of much current interest. Here, we investigate noise-induced changes in the rhythmic firing activity of single Hodgkin–Huxley neurons. With additive input current, there is, in the absence of noise, a critical mean value µ = µc above which sustained periodic firing occurs. With initial conditions as resting values, for a range of values of the mean µ near the critical value, we have found that the firing rate is greatly reduced by noise, even of quite small amplitudes. Furthermore, the firing rate may undergo a pronounced minimum as the noise increases. This behavior has the opposite character to stochastic resonance and coherence resonance. We found that these phenomena occurred even when the initial conditions were chosen randomly or when the noise was switched on at a random time, indicating the robustness of the results. We also examined the effects of conductance-based noise on Hodgkin–Huxley neurons and obtained similar results, leading to the conclusion that the phenomena occur across a wide range of neuronal dynamical systems. Further, these phenomena will occur in diverse applications where a stable limit cycle coexists with a stable focus.
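
    The inhibitory effect described above can be explored with a short stochastic simulation. The sketch below, with illustrative parameters and an Euler-Maruyama integrator (not the authors' code), counts spikes of a classic Hodgkin-Huxley neuron driven by a current of mean mu and noise amplitude sigma; sweeping sigma at a mean just above the critical value is how one would look for the reported minimum in firing rate.

```python
import numpy as np

def hh_spike_count(mu, sigma, T=200.0, dt=0.01, seed=0):
    """Euler-Maruyama simulation of a single Hodgkin-Huxley neuron driven by
    an input current with mean mu and white-noise amplitude sigma.
    Returns the number of spikes (upward crossings of 0 mV) in T ms."""
    rng = np.random.default_rng(seed)
    # Classic squid-axon parameters (mV, ms, uF/cm^2, mS/cm^2)
    C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3
    ENa, EK, EL = 50.0, -77.0, -54.4
    V, m, h, n = -65.0, 0.053, 0.596, 0.317   # resting initial conditions
    spikes, above, sqdt = 0, False, np.sqrt(dt)
    for _ in range(int(T / dt)):
        am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
        bm = 4.0 * np.exp(-(V + 65) / 18)
        ah = 0.07 * np.exp(-(V + 65) / 20)
        bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
        an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
        bn = 0.125 * np.exp(-(V + 65) / 80)
        I_ion = (gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK)
                 + gL * (V - EL))
        V += dt * (mu - I_ion) / C + sigma * sqdt * rng.standard_normal() / C
        m += dt * (am * (1 - m) - bm * m)
        h += dt * (ah * (1 - h) - bh * h)
        n += dt * (an * (1 - n) - bn * n)
        if V > 0 and not above:
            spikes += 1
        above = V > 0
    return spikes
```

    To reproduce the phenomenon, one would fix mu slightly above the critical value (around 6.4 µA/cm² for these parameters, per the abstract) and plot the spike count against sigma.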

    Branching dendrites with resonant membrane: a “sum-over-trips” approach

    Dendrites form the major components of neurons. They are complex branching structures that receive and process thousands of synaptic inputs from other neurons. It is well known that dendritic morphology plays an important role in the function of dendrites. Another important contribution to the response characteristics of a single neuron comes from the intrinsic resonant properties of dendritic membrane. In this paper we combine the effects of dendritic branching and resonant membrane dynamics by generalising the “sum-over-trips” approach (Abbott et al., Biol. Cybern. 66:49–60, 1991). To illustrate how this formalism can shed light on the role of architecture and resonances in determining neuronal output we consider dual recording and reconstruction data from a rat CA1 hippocampal pyramidal cell. Specifically we explore the way in which an Ih current contributes to a voltage overshoot at the soma.
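
    A minimal sketch of the resonant-membrane ingredient: linearizing a slow restorative current such as Ih about rest gives a quasi-active admittance whose impedance magnitude peaks at a nonzero frequency. The parameters (gL, g1, tau) below are invented for illustration, not taken from the reconstructed CA1 cell.

```python
import numpy as np

def membrane_impedance(freq_hz, C=1.0, gL=0.1, g1=0.3, tau=50.0):
    """Magnitude of the quasi-active membrane impedance when a slow
    restorative current (such as Ih) is linearized about rest:
    Y(w) = gL + i*w*C + g1 / (1 + i*w*tau).
    Units: uF/cm^2, mS/cm^2, ms; parameters are illustrative only."""
    w = 2.0 * np.pi * freq_hz / 1000.0        # Hz -> rad/ms
    Y = gL + 1j * w * C + g1 / (1.0 + 1j * w * tau)
    return np.abs(1.0 / Y)

freqs = np.linspace(0.0, 20.0, 2001)
z = membrane_impedance(freqs)
f_res = freqs[np.argmax(z)]                   # resonant frequency in Hz
```

    The peak of |Z| at a nonzero frequency is the single-compartment analogue of the resonance that the sum-over-trips formalism propagates through a branched tree.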

    Local variation of hashtag spike trains and popularity in Twitter

    We draw a parallel between hashtag time series and neuron spike trains. In each case, the process presents complex dynamic patterns, including temporal correlations, burstiness, and other forms of nonstationarity. We propose the adoption of the so-called local variation in order to uncover salient dynamics while properly detrending the time-dependent features of a signal. The methodology is tested on both real and randomized hashtag spike trains, and shows that popular hashtags exhibit more regular, less bursty behavior, suggesting the measure's potential use for predicting online popularity in social media. Comment: 7 pages, 7 figures.
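
    The local-variation measure itself is easy to state and compute. The sketch below follows the standard Shinomoto-style definition over consecutive inter-event intervals; the interpretation of the values follows the abstract.

```python
import numpy as np

def local_variation(event_times):
    """Local variation Lv of an event train:
    Lv = 3 * <((T_i - T_{i+1}) / (T_i + T_{i+1}))^2> over consecutive
    inter-event intervals T_i.  Lv ~ 1 for a Poisson train, Lv < 1 for
    regular trains, Lv > 1 for bursty trains."""
    isi = np.diff(np.sort(np.asarray(event_times, dtype=float)))
    if isi.size < 2:
        raise ValueError("need at least three events")
    return 3.0 * np.mean(((isi[:-1] - isi[1:]) / (isi[:-1] + isi[1:])) ** 2)
```

    Because each term compares only adjacent intervals, a slow drift in the overall rate (the time-dependent feature the abstract detrends for) barely moves Lv, while genuine burstiness raises it above 1.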

    A Markovian event-based framework for stochastic spiking neural networks

    In spiking neural networks, information is conveyed by the spike times, which depend on the intrinsic dynamics of each neuron, the input it receives, and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks, and in particular the ability to deduce the next spike time from a spike train, and therefore to describe the network activity based only on the spike times, without reference to the membrane-potential process. To study this question rigorously, we introduce and analyse an event-based description of networks of noisy integrate-and-fire neurons, i.e. one based on the computation of spike times. We show that the firing times of the neurons in the network constitute a Markov chain whose transition probability is related to the probability distribution of the interspike intervals of the neurons in the network. Where the Markovian model can be developed, we derive the transition probability explicitly for classical neural network models, such as linear integrate-and-fire neurons with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays, and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of deterministic spiking neural networks.
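
    A single-neuron sketch of the event-based picture, under simplifying assumptions (one isolated leaky integrate-and-fire neuron, Euler-Maruyama integration, illustrative parameters): because the voltage resets at every spike, the interspike intervals are independent draws from one first-passage-time density, which is the transition kernel of the Markov chain of spike times in this degenerate one-neuron case.

```python
import numpy as np

def lif_isis(mu=1.5, sigma=0.3, tau=1.0, vth=1.0, vr=0.0,
             n_spikes=200, dt=1e-3, seed=1):
    """Euler-Maruyama simulation of a noisy leaky integrate-and-fire neuron,
    dV = (-V/tau + mu) dt + sigma dW, with threshold vth and reset vr.
    Returns the interspike intervals; the reset makes them i.i.d. samples
    from the first-passage-time density of V from vr to vth."""
    rng = np.random.default_rng(seed)
    sqdt = np.sqrt(dt)
    v, t, last, isis = vr, 0.0, 0.0, []
    while len(isis) < n_spikes:
        v += dt * (mu - v / tau) + sigma * sqdt * rng.standard_normal()
        t += dt
        if v >= vth:
            isis.append(t - last)
            last, v = t, vr
    return np.array(isis)
```

    In a network, the article's construction replaces this single density with a transition probability that also conditions on which neuron fired last and on synaptic state; the renewal case above is the simplest instance.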

    Information transmission in oscillatory neural activity

    Periodic neural activity not locked to the stimulus or to motor responses is usually ignored. Here, we present new tools for modeling and quantifying information transmission based on periodic neural activity that occurs with quasi-random phase relative to the stimulus. We propose a model that reproduces characteristic features of oscillatory spike trains, such as histograms of inter-spike intervals and phase locking of spikes to an oscillatory influence. The model is based on an inhomogeneous Gamma process governed by a density function that is the product of the usual stimulus-dependent rate and a quasi-periodic function. Further, we present an analysis method generalizing the direct method (Rieke et al., 1999; Brenner et al., 2000) to assess the information content in such data. We demonstrate these tools on recordings from relay cells in the lateral geniculate nucleus of the cat. Comment: 18 pages, 8 figures, to appear in Biological Cybernetics.
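
    The proposed model class can be sampled by time rescaling. The sketch below, with an invented rate function, draws an inhomogeneous Gamma process whose density is a stimulus envelope times a quasi-periodic factor, as described in the abstract; it is a generic sampler, not the authors' fitted model.

```python
import numpy as np

def inhomogeneous_gamma_train(rate_fn, order, t_max, dt=1e-3, seed=0):
    """Sample an inhomogeneous Gamma process by time rescaling: draw
    Gamma(order, 1/order) waiting times (mean 1) in operational time
    Lambda(t) = int_0^t rate(s) ds, then map back to real time.  The
    spike count follows the prescribed rate while the train is more
    regular than Poisson for order > 1."""
    rng = np.random.default_rng(seed)
    t_grid = np.arange(0.0, t_max, dt)
    lam = np.maximum(rate_fn(t_grid), 0.0)
    Lam = np.cumsum(lam) * dt                      # operational time
    spikes, s = [], 0.0
    while True:
        s += rng.gamma(order, 1.0 / order)
        idx = np.searchsorted(Lam, s)
        if idx >= t_grid.size:
            return np.array(spikes)
        spikes.append(t_grid[idx])

# Illustrative density: stimulus-dependent envelope times a periodic factor
rate = lambda t: 40.0 * (1.0 + 0.5 * np.sin(2.0 * np.pi * 8.0 * t))
train = inhomogeneous_gamma_train(rate, order=4, t_max=10.0)
```

    Adding jitter to the phase of the periodic factor would give the quasi-random phase relationship the paper emphasizes.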

    The what and where of adding channel noise to the Hodgkin-Huxley equations

    One of the most celebrated successes in computational biology is the Hodgkin-Huxley framework for modeling electrically active cells. This framework, expressed through a set of differential equations, synthesizes the impact of ionic currents on a cell's voltage -- and the highly nonlinear impact of that voltage back on the currents themselves -- into the rapid push and pull of the action potential. Later studies confirmed that these cellular dynamics are orchestrated by individual ion channels, whose conformational changes regulate the conductance of each ionic current. Thus, kinetic equations familiar from physical chemistry are the natural setting for describing conductances; for small-to-moderate numbers of channels, these predict fluctuations in conductances and stochasticity in the resulting action potentials. At first glance, the kinetic equations provide a far more complex (and higher-dimensional) description than the original Hodgkin-Huxley equations. This has prompted more than a decade of efforts to capture channel fluctuations with noise terms added to the Hodgkin-Huxley equations. Many of these approaches, while intuitively appealing, produce quantitative errors when compared to kinetic equations; others, as only very recently demonstrated, are both accurate and relatively simple. We review what works, what doesn't, and why, seeking to build a bridge to well-established results for the deterministic Hodgkin-Huxley equations. As such, we hope that this review will speed emerging studies of how channel noise modulates electrophysiological dynamics and function. We supply user-friendly Matlab simulation code for these stochastic versions of the Hodgkin-Huxley equations on the ModelDB website (accession number 138950) and at http://www.amath.washington.edu/~etsb/tutorials.html. Comment: 14 pages, 3 figures, review article.
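
    One family of noise-term constructions of the kind this review compares can be sketched in a few lines: Fox-Lu style subunit noise on a single gating variable at clamped voltage. The rate functions are the classic HH potassium rates; the channel count N is the knob that sets the fluctuation size. This is an illustration of the general idea, not a verdict on which variant is accurate.

```python
import numpy as np

def stochastic_n_gate(V, n_channels, T=100.0, dt=0.01, seed=0):
    """Subunit-noise (Fox-Lu style) diffusion approximation for the K+
    activation variable n at a clamped voltage V:
    dn = (a(1-n) - b*n) dt + sqrt((a(1-n) + b*n)/N) dW."""
    rng = np.random.default_rng(seed)
    a = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10.0))
    b = 0.125 * np.exp(-(V + 65) / 80.0)
    n = a / (a + b)                                # start at steady state
    sqdt = np.sqrt(dt)
    trace = np.empty(int(T / dt))
    for i in range(trace.size):
        drift = a * (1 - n) - b * n
        diffusion = np.sqrt(max(a * (1 - n) + b * n, 0.0) / n_channels)
        n += drift * dt + diffusion * sqdt * rng.standard_normal()
        n = min(max(n, 0.0), 1.0)                  # keep n in [0, 1]
        trace[i] = n
    return trace
```

    The fluctuations shrink as 1/sqrt(N); comparing such traces against exact Markov-chain channel simulations is precisely the kind of test the review surveys.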

    Intrinsic gain modulation and adaptive neural coding

    In many cases, the computation of a neural system can be reduced to a receptive field, or a set of linear filters, and a thresholding function, or gain curve, which determines the firing probability; this is known as a linear/nonlinear model. In some forms of sensory adaptation, these linear filters and gain curve adjust very rapidly to changes in the variance of a randomly varying driving input. An apparently similar but previously unrelated issue is the observation of gain control by background noise in cortical neurons: the slope of the firing rate vs current (f-I) curve changes with the variance of background random input. Here, we show a direct correspondence between these two observations by relating variance-dependent changes in the gain of f-I curves to characteristics of the changing empirical linear/nonlinear model obtained by sampling. When the underlying system is fixed, we derive expressions relating the change in gain with respect to both mean and variance to the receptive fields obtained by reverse correlation on a white-noise stimulus. Using two conductance-based model neurons that display distinct gain modulation properties through a simple change in parameters, we show that the coding properties of both models quantitatively satisfy the predicted relationships. Our results describe how both variance-dependent gain modulation and adaptive neural computation result from intrinsic nonlinearity. Comment: 24 pages, 4 figures, 1 supporting information.
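
    The "empirical linear/nonlinear model obtained by sampling" can be illustrated with a spike-triggered average on white noise. Everything below (the exponential filter, the sigmoidal gain curve, all rates) is invented for illustration; the point is only that reverse correlation recovers the LN filter up to scale, which is the object the paper's gain relations are built from.

```python
import numpy as np

def spike_triggered_average(stim, spikes, n_lags):
    """Average stimulus window preceding each spike; for Gaussian white
    noise this recovers the linear filter of a linear/nonlinear (LN)
    model up to an overall scale (Bussgang's theorem)."""
    sta, count = np.zeros(n_lags), 0
    for t in np.flatnonzero(spikes):
        if t >= n_lags - 1:
            sta += stim[t - n_lags + 1 : t + 1]
            count += 1
    return sta / max(count, 1)

# Simulate an LN neuron with an assumed exponential filter + sigmoidal gain
rng = np.random.default_rng(2)
n, n_lags = 200_000, 20
k = np.exp(-np.arange(n_lags) / 5.0)        # filter, k[0] = most recent lag
k /= np.linalg.norm(k)
stim = rng.standard_normal(n)
drive = np.convolve(stim, k)[:n]            # drive[t] = sum_j k[j] stim[t-j]
p_spike = 0.1 / (1.0 + np.exp(-4.0 * (drive - 1.0)))   # gain curve
spikes = rng.random(n) < p_spike
k_hat = spike_triggered_average(stim, spikes, n_lags)[::-1]  # align with k
```

    Repeating the estimate at two stimulus variances, and reading off the slope of the recovered gain curve, is the sampling experiment the abstract relates to f-I gain changes.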

    Stochastic Theory of Early Viral Infection: Continuous versus Burst Production of Virions

    Viral production from infected cells can occur continuously or in a burst that generally kills the cell. For HIV infection, both modes of production have been suggested. Standard viral dynamic models formulated as sets of ordinary differential equations cannot distinguish between these two modes of viral production, as the predicted dynamics are identical as long as infected cells produce the same total number of virions over their lifespan. Here we show that in stochastic models of viral infection the two modes of viral production yield different early-time dynamics. Further, we analytically determine the probability that infections initiated with any number of virions and infected cells reach extinction, the state in which both the virion and infected-cell populations vanish, and show that this too has different solutions for continuous and burst production. We also compute the distribution of times to establish infection, as well as the distribution of times to extinction, starting from either a single virion or a single infected cell, for both modes of virion production.
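
    A Gillespie-style sketch of the two production modes, under strong simplifications (target cells are never depleted, so early infection is a branching process; the rates are illustrative, not HIV estimates). Because the extinction probability depends only on the order of events, the event times need not be drawn.

```python
import numpy as np

def extinction_prob(burst, n_runs=1000, beta=2.0, c=3.0, delta=1.0, p=10.0,
                    threshold=100, seed=3):
    """Stochastic simulation of early infection from a single free virion.
    Continuous mode: each infected cell releases virions at rate p and dies
    at rate delta.  Burst mode: the cell releases p/delta virions only at
    death, so both modes have the same expected total output per cell.
    Returns the fraction of runs in which virions and infected cells both
    vanish before the total population reaches `threshold`."""
    rng = np.random.default_rng(seed)
    burst_size = int(round(p / delta))
    extinct = 0
    for _ in range(n_runs):
        v, i = 1, 0                                # one virion, no cells
        while 0 < v + i < threshold:
            r_inf, r_clr, r_die = beta * v, c * v, delta * i
            r_prd = 0.0 if burst else p * i
            u = rng.random() * (r_inf + r_clr + r_die + r_prd)
            if u < r_inf:                          # virion infects a cell
                v, i = v - 1, i + 1
            elif u < r_inf + r_clr:                # virion cleared
                v -= 1
            elif u < r_inf + r_clr + r_die:        # infected cell dies
                i -= 1
                if burst:
                    v += burst_size
            else:                                  # continuous production
                v += 1
        extinct += (v + i == 0)
    return extinct / n_runs
```

    For these illustrative rates, a branching-process calculation gives an extinction probability of 0.70 for continuous production versus roughly 0.60 for burst production, matching the abstract's point that the two modes differ stochastically even with identical mean output.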

    First-passage times in complex scale-invariant media

    How long does it take a random walker to reach a given target point? This quantity, known as the first-passage time (FPT), has led to a growing number of theoretical investigations over the last decade. The importance of FPTs stems from the crucial role played by first-encounter properties in many real situations, including transport in disordered media, neuron firing dynamics, the spreading of diseases, and target search processes. Most methods for determining FPT properties in confining domains have been limited to effective 1D geometries or, in dimensions larger than one, to homogeneous media. Here we propose a general theory that allows one to accurately evaluate the mean FPT (MFPT) in complex media. Remarkably, this analytical approach provides a universal scaling dependence of the MFPT on both the volume of the confining domain and the source-target distance. The analysis is applicable to a broad range of stochastic processes characterized by length-scale-invariant properties. Our theoretical predictions are confirmed by numerical simulations for several emblematic models of disordered media, fractals, anomalous diffusion, and scale-free networks. Comment: Submitted version. Supplementary Information available on the Nature website.
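
    The joint dependence of the MFPT on domain size and source-target distance can be checked on the simplest confined example, an unbiased nearest-neighbour walk on a ring, where the exact MFPT from distance r to the target is r(N - r); this toy case is ours, not the paper's, but it shows the two scaling variables at work.

```python
import numpy as np

def mfpt_ring(n_sites, start, n_walks=2000, seed=4):
    """Monte Carlo mean first-passage time of an unbiased nearest-neighbour
    random walk on a ring of n_sites nodes, from `start` to node 0.
    The exact answer is r * (n_sites - r) with r the starting distance,
    so the MFPT grows with both domain size and source-target distance."""
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(n_walks):
        pos, steps = start, 0
        while pos != 0:
            pos = (pos + (1 if rng.random() < 0.5 else -1)) % n_sites
            steps += 1
        total += steps
    return total / n_walks
```

    On fractals and disordered media the same experiment applies, with the walk dimension replacing the simple diffusive exponent in the scaling law the paper derives.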