
    Effect of noise on neuron transient response

    A good approximation to the integrate-and-fire model with diffusive noise can be obtained using a noisy threshold model. This allows the response of a population of noisy neurons to a current transient to be described by a linear filter. Here we apply these analytical results to the peristimulus time histogram (PSTH) of a single neuron. The effect of the noise on the PSTH in our model is similar to that seen in the experimental findings of Poliakov et al. (J. Physiol. 495, Pt. 1 (1996) 143–157) on hypoglossal and cat lumbar motoneurons and could help in interpreting their results.
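
    A minimal simulation sketch of the setting described above: a population of leaky integrate-and-fire neurons driven by white ("diffusive") noise receives a brief current transient, and the PSTH is estimated from the pooled spikes. The linear-filter theory itself is not reproduced, and all parameter values below are illustrative assumptions (Python with NumPy).

        import numpy as np

        rng = np.random.default_rng(0)

        dt, T = 0.1e-3, 0.5              # time step (s) and total duration (s)
        n_steps = int(T / dt)
        n_neurons = 2000                 # population size used to build the PSTH

        tau_m, v_rest, v_thresh, v_reset = 10e-3, 0.0, 1.0, 0.0
        mu, sigma = 0.8, 0.3             # mean drive and diffusive-noise amplitude

        t = np.arange(n_steps) * dt
        transient = 0.4 * np.exp(-((t - 0.25) / 2e-3) ** 2)   # brief current pulse at 250 ms

        v = np.zeros(n_neurons)
        spike_counts = np.zeros(n_steps)

        for k in range(n_steps):
            noise = sigma * np.sqrt(dt / tau_m) * rng.standard_normal(n_neurons)
            v += dt / tau_m * (-(v - v_rest) + mu + transient[k]) + noise
            fired = v >= v_thresh
            spike_counts[k] = fired.sum()
            v[fired] = v_reset

        # PSTH: population firing rate (spikes/s per neuron) in 1-ms bins
        bin_size = int(1e-3 / dt)
        psth = spike_counts.reshape(-1, bin_size).sum(axis=1) / (n_neurons * 1e-3)
        print("peak PSTH rate (Hz):", psth.max())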

    Detection of subthreshold pulses in neurons with channel noise

    Neurons are subject to various kinds of noise. In addition to synaptic noise, the stochastic opening and closing of ion channels is an intrinsic noise source that affects the signal-processing properties of the neuron. In this paper, we study the response of a stochastic Hodgkin-Huxley neuron to transient subthreshold input pulses. We find that the average response time decreases but its variance increases as the amplitude of the channel noise increases. For single-pulse detection, we show that channel noise enables a neuron to detect subthreshold signals, and that an optimal membrane area (or channel-noise intensity) exists at which a single neuron achieves its best performance. The detection ability of a single neuron is, however, limited by large errors. We therefore test a simple neuronal network that enhances the pulse-detection ability of neurons and find that dozens of neurons can detect subthreshold pulses perfectly. Intrinsic stochastic resonance is also found, both at the level of single neurons and at the level of networks. At the network level, the detection ability can be optimized with respect to the number of neurons comprising the network. Comment: 14 pages, 9 figures.
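
    A rough sketch of the detection experiment, with a leaky integrate-and-fire neuron standing in for the stochastic Hodgkin-Huxley model and the channel-noise amplitude assumed to scale as 1/sqrt(membrane area). A trial counts as a hit if the first spike falls in a short window after the subthreshold pulse; the noise-scaling constant and all other parameters are assumptions, not the paper's values.

        import numpy as np

        rng = np.random.default_rng(1)
        dt, T = 0.05e-3, 60e-3
        n_steps = int(T / dt)
        tau_m, v_th = 10e-3, 1.0
        pulse_on, pulse_off, pulse_amp = 20e-3, 25e-3, 0.7   # subthreshold pulse
        window = 15e-3                                       # detection window after pulse onset

        def detection_stats(area_um2, n_trials=500):
            """Hit rate and first-spike-latency jitter for a given membrane area."""
            noise_amp = 4.0 / np.sqrt(area_um2)    # assumed channel-noise scaling with area
            v = np.zeros(n_trials)
            first_spike = np.full(n_trials, np.nan)
            for k in range(n_steps):
                t = k * dt
                I = pulse_amp if pulse_on <= t < pulse_off else 0.0
                v += dt / tau_m * (-v + I) + noise_amp * np.sqrt(dt / tau_m) * rng.standard_normal(n_trials)
                fired = v >= v_th
                first_spike[fired & np.isnan(first_spike)] = t
                v[fired] = 0.0                     # reset after a spike
            valid = ~np.isnan(first_spike)
            latency = first_spike[valid] - pulse_on
            in_window = (latency >= 0) & (latency <= window)
            hit_rate = in_window.sum() / n_trials
            jitter = np.std(latency[in_window]) if in_window.any() else np.nan
            return hit_rate, jitter

        for area in (10.0, 100.0, 1000.0):         # small, intermediate, large membrane area
            rate, jitter = detection_stats(area)
            print(f"area = {area:6.0f} um^2   hit rate = {rate:.2f}   latency std = {jitter}")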

    Complexity without chaos: Plasticity within random recurrent networks generates robust timing and motor control

    It is widely accepted that the complex dynamics characteristic of recurrent neural circuits contribute in a fundamental manner to brain function. Progress in understanding and exploiting the computational power of recurrent dynamics has been slow for two main reasons: nonlinear recurrent networks often exhibit chaotic behavior, and most known learning rules do not work robustly in recurrent networks. Here we address both problems by demonstrating how random recurrent networks (RRNs) that initially exhibit chaotic dynamics can be tuned through a supervised learning rule to generate locally stable patterns of neural activity that are both complex and robust to noise. The outcome is a novel neural network regime that exhibits both transiently stable and chaotic trajectories. We further show that the recurrent learning rule dramatically increases the ability of RRNs to generate complex spatiotemporal motor patterns and accounts for recent experimental data showing a decrease in neural variability in response to stimulus onset.
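
    A short sketch of the starting point of this work: an untrained random recurrent rate network with gain g > 1 is chaotic, so two trajectories separated by a tiny perturbation diverge. The supervised learning rule itself is not reproduced here; the network size, gain, and time constants are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(2)
        N, g = 500, 1.5                  # network size and synaptic gain (g > 1: chaotic regime)
        tau, dt, n_steps = 10e-3, 1e-3, 2000

        J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # random recurrent weight matrix
        x1 = 0.5 * rng.standard_normal(N)
        x2 = x1 + 1e-6 * rng.standard_normal(N)            # tiny perturbation of the same state

        separation = np.empty(n_steps)
        for k in range(n_steps):
            x1 += dt / tau * (-x1 + J @ np.tanh(x1))
            x2 += dt / tau * (-x2 + J @ np.tanh(x2))
            separation[k] = np.linalg.norm(x1 - x2)

        print("separation after 1 step:", separation[0])
        print("separation after %d steps:" % n_steps, separation[-1])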

    Stochastic Resonance of Ensemble Neurons for Transient Spike Trains: A Wavelet Analysis

    Using the wavelet transform (WT), we have analyzed the response of an ensemble of N (N = 1, 10, 100 and 500) Hodgkin-Huxley (HH) neurons to transient M-pulse spike trains (M = 1–3) with independent Gaussian noises. The cross-correlation between the input and output signals is expressed in terms of the WT expansion coefficients. The signal-to-noise ratio (SNR) is evaluated using the denoising method within the WT, by which the noise contribution is extracted from the output signals. Although the response of a single (N = 1) neuron to subthreshold transient signals with noise is quite unreliable, the transmission fidelity assessed by the cross-correlation and the SNR improves markedly as N increases: a population of neurons plays an indispensable role in stochastic resonance (SR) for transient spike inputs. It is also shown that in a large-scale ensemble, the transmission fidelity for suprathreshold transient spikes is not significantly degraded by the weak noise that is responsible for SR for subthreshold inputs. Comment: 20 pages, 4 figures.
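
    A toy illustration of the denoising-based SNR estimate, assuming the PyWavelets package (pywt) is available: a noisy transient signal is decomposed with a discrete wavelet transform, small coefficients are soft-thresholded with a universal threshold, and the SNR is estimated from the denoised signal versus the removed residual. This is not the paper's exact wavelet procedure, and the test signal is synthetic.

        import numpy as np
        import pywt

        rng = np.random.default_rng(3)
        fs, T = 1000.0, 1.0
        t = np.arange(0.0, T, 1.0 / fs)

        # Toy "output": three transient pulses buried in Gaussian noise
        clean = sum(np.exp(-((t - t0) / 0.01) ** 2) for t0 in (0.2, 0.45, 0.7))
        noisy = clean + 0.4 * rng.standard_normal(t.size)

        coeffs = pywt.wavedec(noisy, 'db4', level=5)
        # Universal threshold estimated from the finest-scale detail coefficients
        sigma_hat = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma_hat * np.sqrt(2.0 * np.log(noisy.size))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, 'db4')[: noisy.size]

        residual = noisy - denoised                          # estimated noise contribution
        snr_db = 10.0 * np.log10(np.sum(denoised ** 2) / np.sum(residual ** 2))
        corr = np.corrcoef(clean, denoised)[0, 1]
        print(f"estimated SNR: {snr_db:.1f} dB   correlation with clean signal: {corr:.3f}")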

    Synthetic reverberating activity patterns embedded in networks of cortical neurons

    Synthetic reverberating activity patterns are generated experimentally by stimulating a subset of neurons embedded in a spontaneously active network of cortical cells in vitro. The neurons are artificially connected by means of a conditional stimulation matrix, forming a synthetic local circuit with predefined, programmable connectivity and time delays. Possible uses of this experimental design are demonstrated by analyzing the sensitivity of these deterministic activity patterns to transmission delays and to the nature of the ongoing network dynamics. Comment: 8 pages, 5 figures.
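
    A schematic sketch of the conditional-stimulation idea: whenever a source neuron spikes, a stimulus is scheduled for its targets after a programmable delay, overlaying a synthetic circuit on the ongoing activity. The event-queue toy below only illustrates the scheduling logic, not the experimental setup itself; the connectivity, delays, and spontaneous rates are assumed.

        import heapq
        import numpy as np

        rng = np.random.default_rng(4)
        n_neurons, T = 5, 2.0                 # neurons in the recorded subset, session length (s)
        spont_rate = 1.0                      # spontaneous firing rate per neuron (Hz)

        # conn[i][j] = stimulation delay (s) from a spike of neuron i to neuron j, or None
        conn = [[None] * n_neurons for _ in range(n_neurons)]
        conn[0][1], conn[1][2], conn[2][0] = 0.05, 0.05, 0.05   # a 3-neuron reverberating loop

        events = []                           # priority queue of (time, neuron, kind)
        for i in range(n_neurons):            # seed each neuron with spontaneous Poisson spikes
            t = rng.exponential(1.0 / spont_rate)
            while t < T:
                heapq.heappush(events, (t, i, "spontaneous"))
                t += rng.exponential(1.0 / spont_rate)

        log = []
        while events:
            t, i, kind = heapq.heappop(events)
            if t > T:
                break                         # all remaining events fall after the session end
            log.append((round(t, 3), i, kind))
            for j, delay in enumerate(conn[i]):
                if delay is not None:         # conditional stimulation of the target neuron
                    heapq.heappush(events, (t + delay, j, "stimulated"))

        for entry in log[:20]:
            print(entry)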

    Revisiting chaos in stimulus-driven spiking networks: signal encoding and discrimination

    Highly connected recurrent neural networks often produce chaotic dynamics, meaning that their precise activity is sensitive to small perturbations. What are the consequences for how such networks encode streams of temporal stimuli? On the one hand, chaos is a strong source of randomness, suggesting that small changes in stimuli will be obscured by intrinsically generated variability. On the other hand, recent work shows that the type of chaos that occurs in spiking networks can have a surprisingly low-dimensional structure, suggesting that there may be "room" for fine stimulus features to be precisely resolved. Here we show that strongly chaotic networks produce patterned spikes that reliably encode time-dependent stimuli: using a decoder sensitive to spike times on timescales of tens of milliseconds, one can easily distinguish responses to very similar inputs. Moreover, recurrence serves to distribute signals throughout chaotic networks so that small groups of cells can encode substantial information about signals arriving elsewhere. We conclude that the presence of strong chaos in recurrent networks does not prohibit precise stimulus encoding. Comment: 8 figures.
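
    A toy version of the kind of decoder mentioned above: spike trains are smoothed with an exponential kernel (time constant of roughly 10 ms) and each response is assigned to the stimulus whose template it is closest to. The spike trains here are synthetic stand-ins rather than output of a chaotic spiking network, and all parameters are assumptions.

        import numpy as np

        rng = np.random.default_rng(5)
        dt, T, tau = 1e-3, 1.0, 10e-3          # time step, trial length, decoder time constant
        n_bins = int(T / dt)

        def filtered(spike_times):
            """Spike train convolved with an exponential kernel (van Rossum-like)."""
            trace = np.zeros(n_bins)
            trace[(np.asarray(spike_times) / dt).astype(int)] = 1.0
            kernel = np.exp(-np.arange(0.0, 5 * tau, dt) / tau)
            return np.convolve(trace, kernel)[:n_bins]

        def jittered(template, jitter=2e-3):
            """A noisy single-trial response: the template spikes with small timing jitter."""
            t = np.asarray(template) + rng.normal(0.0, jitter, len(template))
            return np.clip(t, 0.0, T - dt)

        # Two very similar stimuli -> two spike-time templates differing by a few ms
        template_a = np.sort(rng.uniform(0.0, T - dt, 30))
        template_b = np.clip(template_a + rng.normal(0.0, 5e-3, 30), 0.0, T - dt)
        ref_a, ref_b = filtered(template_a), filtered(template_b)

        correct = 0
        for _ in range(200):
            label = rng.integers(2)
            resp = filtered(jittered(template_a if label == 0 else template_b))
            guess = 0 if np.linalg.norm(resp - ref_a) < np.linalg.norm(resp - ref_b) else 1
            correct += guess == label
        print("discrimination accuracy:", correct / 200)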

    Emergence of Synchronous Oscillations in Neural Networks Excited by Noise

    The presence of noise in nonlinear dynamical systems can play a constructive role, increasing the degree of order and coherence or improving the performance of the system. Examples of this positive influence in biological systems are impulse transmission in neurons and the synchronization of a neural network. By numerically integrating the Fokker-Planck equation, we show a self-induced synchronized oscillation. Such an oscillatory state appears in a neural network coupled through a feedback term when the system is excited by noise and the noise strength lies within a certain range. Comment: 12 pages, 18 figures.
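
    A minimal numerical sketch of the method named in the abstract: forward-Euler, finite-difference integration of a one-dimensional Fokker-Planck equation whose drift includes a feedback term through the mean activity. The bistable drift, feedback gain, and noise strength are illustrative assumptions, and whether a self-sustained oscillation of the mean actually develops depends on those choices; the sketch only shows the integration scheme.

        import numpy as np

        x = np.linspace(-3.0, 3.0, 301)
        dx = x[1] - x[0]
        dt, n_steps = 2e-4, 20000
        D, k = 0.25, 0.8                        # noise strength and feedback gain (assumed)

        P = np.exp(-((x - 1.0) ** 2) / 0.1)     # initial density concentrated in one well
        P /= P.sum() * dx

        mean_trace = np.empty(n_steps)
        for n in range(n_steps):
            m = np.sum(x * P) * dx              # mean activity <x>(t)
            mean_trace[n] = m
            drift = x - x ** 3 - k * m          # bistable drift plus mean-field feedback
            flux = drift * P                    # advective probability flux
            dP = -np.gradient(flux, dx) + D * np.gradient(np.gradient(P, dx), dx)
            P = np.clip(P + dt * dP, 0.0, None)
            P /= P.sum() * dx                   # renormalize total probability to 1

        print("range of mean activity:", mean_trace.min(), mean_trace.max())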

    Double Inverse Stochastic Resonance with Dynamic Synapses

    We investigate the behavior of a model neuron that receives a biophysically realistic noisy postsynaptic current based on uncorrelated spiking activity from a large number of afferents. We show that, with static synapses, such noise can give rise to inverse stochastic resonance (ISR) as a function of the presynaptic firing rate. We compare this with the case of dynamic synapses featuring short-term synaptic plasticity and show that the interval of presynaptic firing rates over which ISR exists can be extended or diminished. We consider both short-term depression and facilitation. Interestingly, we find that a double inverse stochastic resonance (DISR), with two distinct wells centered at different presynaptic firing rates, can appear. Comment: 12 pages, 7 figures.
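
    A sketch of the dynamic-synapse ingredient: a Tsodyks-Markram-style short-term plasticity model driven by a Poisson presynaptic train generates the postsynaptic current. Only the synaptic current is computed; the postsynaptic neuron and the ISR measurement are omitted, and the time constants and release parameters are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(6)
        dt, T = 0.1e-3, 2.0
        n_steps = int(T / dt)

        rate = 20.0                        # presynaptic firing rate (Hz)
        tau_d, tau_f, U = 0.2, 0.6, 0.2    # recovery / facilitation time constants, baseline release
        tau_s, A = 5e-3, 1.0               # synaptic current time constant and scale

        x, u, I = 1.0, U, 0.0              # available resources, release probability, current
        I_trace = np.empty(n_steps)
        for k in range(n_steps):
            if rng.random() < rate * dt:   # presynaptic Poisson spike
                u += U * (1.0 - u)         # facilitation: transient rise in release probability
                I += A * u * x             # current jump set by the released resources
                x -= u * x                 # depression: resources consumed by the release
            x += dt * (1.0 - x) / tau_d    # recovery of resources
            u += dt * (U - u) / tau_f      # facilitation decays back to baseline
            I -= dt * I / tau_s            # synaptic current decay
            I_trace[k] = I

        print("mean synaptic current:", I_trace.mean())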