    The impact of spike timing variability on the signal-encoding performance of neural spiking models

    It remains unclear whether the variability of neuronal spike trains in vivo arises from biological noise sources or represents highly precise encoding of temporally varying synaptic input signals. Determining the variability of spike timing can provide fundamental insights into the strategies the brain uses to represent and transmit information in the form of discrete spike trains. In this study, we employ a signal estimation paradigm to determine how variability in spike timing affects the encoding of random time-varying signals. We assess this for two types of spiking models: an integrate-and-fire model with random threshold and a more biophysically realistic stochastic ion channel model. Using the coding fraction and mutual information as information-theoretic measures, we quantify the efficacy of optimal linear decoding of random inputs from the model outputs and study the relationship between this efficacy and the variability of the output spike train. Our findings suggest that variability does not necessarily hinder signal decoding for the biophysically plausible encoders examined, and that the functional role of spiking variability depends intimately on the nature of the encoder and the signal processing task: variability can either enhance or impede decoding performance.
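
    A minimal sketch of the signal estimation paradigm described above (not the authors' code; the model parameters, input statistics, and the exponential decoding kernel standing in for the full optimal linear filter are all assumptions for illustration):

        import numpy as np

        rng = np.random.default_rng(0)
        dt, T, tau_m = 1e-4, 5.0, 0.02          # time step (s), duration (s), membrane tau (s)
        n = int(T / dt)

        # Random time-varying input: low-pass-filtered Gaussian noise
        signal = np.convolve(rng.standard_normal(n),
                             np.exp(-np.arange(500) * dt / 0.01), 'same')
        signal /= signal.std()

        # Integrate-and-fire encoder with a random (noisy) threshold
        v, spikes = 0.0, np.zeros(n)
        for i in range(n):
            v += dt / tau_m * (-v + 1.0 + signal[i])
            theta = 1.0 + 0.1 * rng.standard_normal()   # spike-timing variability
            if v >= theta:
                spikes[i], v = 1.0, 0.0

        # Linear decoding: exponential kernel plus a least-squares gain
        est = np.convolve(spikes, np.exp(-np.arange(1000) * dt / 0.02), 'same')
        est -= est.mean()
        recon = est * (signal @ est) / (est @ est)

        # Coding fraction: 1 - RMS error / signal standard deviation
        cf = 1.0 - np.sqrt(np.mean((signal - recon) ** 2)) / signal.std()
        print(f"coding fraction ~ {cf:.2f}")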

    Signal Propagation in Feedforward Neuronal Networks with Unreliable Synapses

    In this paper, we systematically investigate both synfire propagation and firing rate propagation in feedforward neuronal networks coupled in an all-to-all fashion. In contrast to most earlier work, where only reliable synaptic connections were considered, we mainly examine the effects of unreliable synapses on both types of neural activity propagation. We first study networks composed of purely excitatory neurons. Our results show that both the successful transmission probability and the excitatory synaptic strength largely influence the propagation of these two types of neural activity, and that suitable tuning of these synaptic parameters allows the considered network to support stable signal propagation. We also find that noise has significant but different impacts on the two types of propagation: additive Gaussian white noise tends to reduce the precision of synfire activity, whereas noise of appropriate intensity can enhance the performance of firing rate propagation. Further simulations indicate that the propagation dynamics of the considered neuronal network is not determined simply by the average amount of neurotransmitter received by each neuron at a given instant, but is also largely influenced by the stochastic nature of neurotransmitter release. Second, we compare our results with those obtained in corresponding feedforward neuronal networks connected with reliable synapses but in a random coupling fashion, and confirm that differences can be observed between these two feedforward network models. Finally, we study signal propagation in feedforward neuronal networks consisting of both excitatory and inhibitory neurons, and demonstrate that inhibition also plays an important role in signal propagation in the considered networks. Comment: 33 pages, 16 figures; Journal of Computational Neuroscience (published).
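
    The core mechanism, Bernoulli synaptic transmission in an all-to-all feedforward chain, can be sketched as follows (a toy rate-propagation model with assumed parameters, not the paper's conductance-based network):

        import numpy as np

        rng = np.random.default_rng(1)
        layers, width = 10, 100
        p_release, w, theta = 0.5, 0.12, 3.0   # transmission prob., synaptic strength, threshold
        sigma_noise = 0.5                       # additive Gaussian noise

        # Fraction of neurons spiking in layer 0 (the injected packet)
        active = rng.random(width) < 0.8

        rates = [active.mean()]
        for _ in range(1, layers):
            n_active = active.sum()
            # Each downstream neuron draws successful releases from its active inputs
            released = rng.binomial(n_active, p_release, size=width)  # unreliable synapses
            drive = w * released + sigma_noise * rng.standard_normal(width)
            active = drive > theta
            rates.append(active.mean())

        print("fraction active per layer:", np.round(rates, 2))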

    Supervised Learning in Spiking Neural Networks with Phase-Change Memory Synapses

    Spiking neural networks (SNNs) are artificial computational models inspired by the brain's ability to naturally encode and process information in the time domain. The added temporal dimension is believed to render them more computationally efficient than conventional artificial neural networks, though their full computational capabilities are yet to be explored. Recently, computational memory architectures based on non-volatile memory crossbar arrays have shown great promise for implementing parallel computations in artificial and spiking neural networks. In this work, we experimentally demonstrate for the first time the feasibility of realizing high-performance, event-driven, in-situ supervised learning systems using nanoscale and stochastic phase-change synapses. Our SNN is trained to recognize audio signals of alphabets encoded as spikes in the time domain and to generate spike trains at precise time instances representing the pixel intensities of the corresponding images. Moreover, with a statistical model capturing the experimental behavior of the devices, we investigate architectural and system-level solutions for improving the training and inference performance of our computational-memory-based system. Combining the computational potential of supervised SNNs with the parallel compute power of computational memory, this work paves the way for the next generation of efficient brain-inspired systems.
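
    The training principle, spike-based supervised updates written to stochastic synapses, might be sketched as below. This is a toy model: the multiplicative write-noise term standing in for phase-change-memory programming stochasticity, the learning rule, and all parameters are assumptions, not the paper's experimental setup:

        import numpy as np

        rng = np.random.default_rng(2)
        T, n_in = 500, 50                      # timesteps, input channels
        lr, write_noise = 0.02, 0.3            # learning rate; assumed PCM write stochasticity

        x = (rng.random((T, n_in)) < 0.05).astype(float)   # input spike raster
        w = 0.1 * rng.random(n_in)                          # synaptic conductances
        target = np.zeros(T); target[::50] = 1              # desired spike times

        for epoch in range(30):
            v, trace, out = 0.0, np.zeros(n_in), np.zeros(T)
            for t in range(T):
                trace = 0.9 * trace + x[t]                  # presynaptic eligibility trace
                v = 0.9 * v + w @ x[t]                      # leaky integration
                if v > 1.0:
                    out[t], v = 1.0, 0.0
                # spike-based delta rule with stochastic, bounded weight writes
                dw = lr * (target[t] - out[t]) * trace
                w = np.clip(w + dw * (1 + write_noise * rng.standard_normal(n_in)), 0, 1)
            if epoch % 10 == 0:
                print(epoch, "spike-time error:", np.abs(target - out).sum())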

    Statistical-Mechanical Measure of Stochastic Spiking Coherence in A Population of Inhibitory Subthreshold Neurons

    By varying the noise intensity, we study stochastic spiking coherence (i.e., collective coherence between noise-induced neural spikings) in an inhibitory population of subthreshold neurons (which cannot fire spontaneously without noise). This stochastic spiking coherence may be well visualized in the raster plot of neural spikes. For a coherent case, partially-occupied "stripes" (composed of spikes and indicating collective coherence) are formed in the raster plot. This partial occupation occurs due to "stochastic spike skipping", which is clearly seen in the multi-peaked interspike interval histogram. The main purpose of our work is to quantitatively measure the degree of stochastic spiking coherence seen in the raster plot. We introduce a new spike-based coherence measure M_s that considers both the occupation pattern and the pacing pattern of spikes in the stripes. In particular, the pacing degree between spikes is determined in a statistical-mechanical way by quantifying the average contribution of (microscopic) individual spikes to the (macroscopic) ensemble-averaged global potential. This "statistical-mechanical" measure M_s is in contrast to conventional measures such as the "thermodynamic" order parameter (which concerns the time-averaged fluctuations of the macroscopic global potential), the "microscopic" correlation-based measure (based on the cross-correlation between the microscopic individual potentials), and measures of precise spike timing (based on the peri-stimulus time histogram). In terms of M_s, we quantitatively characterize stochastic spiking coherence, and find that M_s reflects the degree of collective spiking coherence seen in the raster plot very well. Hence, the "statistical-mechanical" spike-based measure M_s may be usefully applied to quantify the degree of stochastic spiking coherence. Comment: 16 pages, 5 figures, to appear in J. Comput. Neurosci.
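
    A toy computation of a stripe-based measure in the spirit of M_s (the synthetic raster, jitter scale, and the use of the stripe phase for the pacing degree are simplifying assumptions; the paper derives the pacing degree from the ensemble-averaged global potential):

        import numpy as np

        rng = np.random.default_rng(3)
        N, n_stripes, period = 100, 20, 10.0        # neurons, stripes, global period (ms)

        # Toy raster: in each stripe a random subset of neurons spikes,
        # jittered around the stripe center (partial occupation = spike skipping)
        spike_times, spike_ids = [], []
        for k in range(n_stripes):
            firing = rng.random(N) < 0.6
            jitter = 1.0 * rng.standard_normal(firing.sum())
            spike_times.extend(k * period + jitter)
            spike_ids.extend(np.flatnonzero(firing))
        spike_times = np.asarray(spike_times)
        spike_ids = np.asarray(spike_ids)

        M_s_terms = []
        for k in range(n_stripes):
            in_stripe = np.abs(spike_times - k * period) < period / 2
            O_k = len(set(spike_ids[in_stripe])) / N                 # occupation degree
            phases = 2 * np.pi * (spike_times[in_stripe] - k * period) / period
            P_k = np.cos(phases).mean()                              # pacing degree
            M_s_terms.append(O_k * P_k)

        print("M_s ~", np.mean(M_s_terms))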

    Revisiting chaos in stimulus-driven spiking networks: signal encoding and discrimination

    Highly connected recurrent neural networks often produce chaotic dynamics, meaning their precise activity is sensitive to small perturbations. What are the consequences for how such networks encode streams of temporal stimuli? On the one hand, chaos is a strong source of randomness, suggesting that small changes in stimuli will be obscured by intrinsically generated variability. On the other hand, recent work shows that the type of chaos that occurs in spiking networks can have a surprisingly low-dimensional structure, suggesting that there may be "room" for fine stimulus features to be precisely resolved. Here we show that strongly chaotic networks produce patterned spikes that reliably encode time-dependent stimuli: using a decoder sensitive to spike times on timescales of tens of milliseconds, one can easily distinguish responses to very similar inputs. Moreover, recurrence serves to distribute signals throughout chaotic networks, so that small groups of cells can encode substantial information about signals arriving elsewhere. We conclude that the presence of strong chaos in recurrent networks does not prohibit precise stimulus encoding. Comment: 8 figures.
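
    A sketch of the kind of spike-time-sensitive decoder referred to above: van Rossum-style exponential filtering with an assumed ~20 ms kernel and nearest-template classification. Illustrative only, not the paper's network or decoder:

        import numpy as np

        rng = np.random.default_rng(4)
        T, dt, tau = 1.0, 1e-3, 0.02      # s; tau sets the decoder's temporal sensitivity
        n = int(T / dt)
        kernel = np.exp(-np.arange(int(5 * tau / dt)) * dt / tau)

        def filtered(spike_times):
            # Spike train -> smoothed trace (van Rossum-style filtering)
            s = np.zeros(n)
            s[(np.asarray(spike_times) / dt).astype(int)] = 1
            return np.convolve(s, kernel)[:n]

        # Two very similar stimuli -> two response templates differing by small shifts
        base = np.sort(rng.uniform(0, T - dt, 30))
        templates = [filtered(base), filtered(np.clip(base + 0.015, 0, T - dt))]

        # Noisy trial from stimulus 0: jitter each spike by a few ms
        trial = filtered(np.clip(base + 0.003 * rng.standard_normal(30), 0, T - dt))

        dists = [np.linalg.norm(trial - tmpl) for tmpl in templates]
        print("decoded stimulus:", int(np.argmin(dists)), "distances:", np.round(dists, 2))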

    Consequences of converting graded to action potentials upon neural information coding and energy efficiency

    Information is encoded in neural circuits using both graded and action potentials, with conversion between them occurring within single neurons and across successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na+ and K+ channels, with generator potential and graded potential models lacking voltage-gated Na+ channels. We identify three causes of information loss in the generator potential that are by-products of action potential generation: (1) the voltage-gated Na+ channels necessary for action potential generation increase intrinsic noise; (2) they introduce non-linearities; and (3) the finite duration of the action potential creates a 'footprint' in the generator potential that obscures incoming signals. These three processes reduce the information rate of the generator potential by ~50%, to ~3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of their lower information rates, generator potentials are substantially less energy efficient than graded potentials; however, both are an order of magnitude more efficient than spike trains, owing to the higher energy costs and low information content of spikes. This emphasizes that converting analogue to digital carries a two-fold cost: information loss and cost inflation.
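
    The efficiency argument reduces to bits per unit energy. The numbers below are invented solely to mirror the stated ratios (~50% information loss in the generator potential, ~3x spike-train information, ~10x spike energy cost); they are not measurements from the paper:

        # Illustrative numbers only (assumed): information rate (bits/s)
        # and relative energy consumption per second
        graded    = {"info": 1200.0, "energy": 1.0}
        generator = {"info":  600.0, "energy": 1.0}   # ~50% loss, similar energy cost
        spikes    = {"info":  200.0, "energy": 10.0}  # low info, ~10x energy cost

        for name, m in [("graded", graded), ("generator", generator), ("spikes", spikes)]:
            print(f"{name:9s} efficiency = {m['info'] / m['energy']:7.1f} bits per unit energy")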

    Transient Resetting: A Novel Mechanism for Synchrony and Its Biological Examples

    The study of synchronization in biological systems is essential for understanding the rhythmic phenomena of living organisms at both the molecular and cellular levels. In this paper, using simple dynamical systems theory, we present a novel mechanism, named transient resetting, for the synchronization of uncoupled biological oscillators by stimuli. This mechanism not only unifies and extends many existing results on (deterministic and stochastic) stimulus-induced synchrony, but may also play an important role in biological rhythms. We argue that transient resetting is a possible mechanism for synchronization in many biological organisms, and that it might further be exploited in medical therapies for rhythmic disorders. Examples of the synchronization of neural and circadian oscillators are presented to verify our hypothesis. Comment: 17 pages, 7 figures.
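
    The idea can be illustrated with uncoupled phase oscillators: a strong transient stimulus resets every phase toward a common value, and synchrony persists after the stimulus is withdrawn. A minimal sketch with assumed parameters, not the paper's models:

        import numpy as np

        rng = np.random.default_rng(5)
        n_osc, dt, steps = 20, 0.01, 3000
        omega = 1.0 + 0.02 * rng.standard_normal(n_osc)   # slightly heterogeneous frequencies
        phase = 2 * np.pi * rng.random(n_osc)             # uncoupled, initially desynchronized

        def order_param(phi):
            return np.abs(np.exp(1j * phi).mean())        # 1 = perfect synchrony

        for step in range(steps):
            drive = 0.0
            if 1000 <= step < 1200:                       # transient stimulus epoch
                drive = 5.0 * np.sin(-phase)              # pulls every phase toward 0 (resetting)
            phase += dt * (omega + drive)
            if step in (999, 1199, 2999):
                print(f"step {step}: synchrony r = {order_param(phase):.2f}")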

    Comparison of Langevin and Markov channel noise models for neuronal signal generation

    The stochastic opening and closing of voltage-gated ion channels produces noise in neurons. The effect of this noise on neuronal performance has been modelled using either an approximate (Langevin) model, based on stochastic differential equations, or an exact model, based on a Markov process description of channel gating. Yet whether the Langevin model accurately reproduces the channel noise produced by the Markov model remains unclear. Here we present a comparison between Langevin and Markov models of channel noise in neurons, using single-compartment Hodgkin-Huxley models containing either both Na+ and K+ voltage-gated ion channels or only K+ channels. The performance of the Langevin and Markov models was quantified over a range of stimulus statistics, membrane areas and channel numbers. We find that, in comparison to the Markov model, the Langevin model underestimates the noise contributed by voltage-gated ion channels and consequently overestimates information rates for both spiking and non-spiking membranes. The difference between the two models persists even as the number of channels increases. This suggests that the Langevin model may not be suitable for accurately simulating channel noise in neurons, even in simulations with large numbers of ion channels.
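
    To make the two modelling approaches concrete, here is a sketch comparing a Markov (binomial switching) and a Langevin (SDE with state-dependent noise) description of a population of two-state gates at fixed voltage; rates and channel counts are assumed. In this linear setting the two agree closely; the discrepancies reported in the paper arise in the full voltage-gated Hodgkin-Huxley setting:

        import numpy as np

        rng = np.random.default_rng(6)
        N, dt, steps = 200, 1e-5, 50000        # channels, time step (s), steps
        alpha, beta = 100.0, 100.0             # opening/closing rates (1/s), fixed voltage

        # Markov: count open gates, binomial transitions each step
        n_open = N // 2
        markov = np.empty(steps)
        for i in range(steps):
            n_open += rng.binomial(N - n_open, alpha * dt) - rng.binomial(n_open, beta * dt)
            markov[i] = n_open / N

        # Langevin: SDE for the open fraction with state-dependent noise
        x = 0.5
        langevin = np.empty(steps)
        for i in range(steps):
            drift = alpha * (1 - x) - beta * x
            noise = np.sqrt(max(alpha * (1 - x) + beta * x, 0) / N)
            x = np.clip(x + drift * dt + noise * np.sqrt(dt) * rng.standard_normal(), 0, 1)
            langevin[i] = x

        print("std (Markov):  ", markov.std().round(4))
        print("std (Langevin):", langevin.std().round(4))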

    Shaping bursting by electrical coupling and noise

    Gap-junctional coupling is an important means of communication between neurons and other excitable cells. Strong electrical coupling synchronizes activity across cell ensembles. Surprisingly, in the presence of noise, the synchronous oscillations generated by an electrically coupled network may differ qualitatively from the oscillations produced by the uncoupled individual cells forming the network. A prominent example of such behavior is the synchronized bursting in the islets of Langerhans formed by pancreatic β-cells, which in isolation are known to exhibit irregular spiking. At the heart of this intriguing phenomenon lies denoising, a remarkable ability of electrical coupling to diminish the effects of noise acting on individual cells. In this paper, we derive quantitative estimates characterizing denoising in electrically coupled networks of conductance-based models of square-wave bursting cells. Our analysis reveals the interplay of the intrinsic properties of the individual cells and the network topology, and their respective contributions to this important effect. In particular, we show that networks on graphs with large algebraic connectivity or small total effective resistance are better equipped for denoising. As a by-product of the analysis, we analytically estimate the rate at which trajectories converge to the synchronization subspace and the stability of that subspace to random perturbations. These estimates reveal the role of network topology in synchronization. The analysis is complemented by numerical simulations of electrically coupled conductance-based networks. Taken together, these results explain the mechanisms underlying synchronization and denoising in an important class of biological models.
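
    The two graph quantities invoked above are standard and easy to compute from the network Laplacian; a small sketch (hypothetical 6-node topologies) comparing a ring with an all-to-all graph:

        import numpy as np

        def laplacian(A):
            return np.diag(A.sum(1)) - A

        def algebraic_connectivity(A):
            # Second-smallest eigenvalue of the graph Laplacian
            return np.sort(np.linalg.eigvalsh(laplacian(A)))[1]

        def total_effective_resistance(A):
            # Kirchhoff index: n * trace of the Laplacian pseudoinverse
            return len(A) * np.trace(np.linalg.pinv(laplacian(A)))

        n = 6
        ring = np.zeros((n, n))
        for i in range(n):
            ring[i, (i + 1) % n] = ring[(i + 1) % n, i] = 1
        full = np.ones((n, n)) - np.eye(n)

        for name, A in [("ring", ring), ("all-to-all", full)]:
            print(f"{name:10s} lambda_2 = {algebraic_connectivity(A):.2f}, "
                  f"R_total = {total_effective_resistance(A):.2f}")

    A larger algebraic connectivity and a smaller total effective resistance (as the all-to-all graph shows here) correspond, per the paper, to networks better equipped for denoising.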