
    Neuronal Synchronization Can Control the Energy Efficiency of Inter-Spike Interval Coding

    The role of synchronous firing in sensory coding and cognition remains controversial. While some studies of attentional tasks, focusing on its mechanistic consequences, suggest that synchronization dynamically boosts sensory processing, others failed to find significant synchronization levels in such tasks. We attempt to understand both lines of evidence within a coherent theoretical framework. We conceptualize synchronization as an independent control parameter to study how a postsynaptic neuron transmits the average firing activity of a presynaptic population in the presence of synchronization. We apply the Berger-Levy theory of energy-efficient information transmission to interpret simulations of a Hodgkin-Huxley-type postsynaptic neuron model in which the firing rate and synchronization level of the presynaptic population were varied independently. We find that for a fixed presynaptic firing rate the simulated postsynaptic interspike interval distribution depends on the synchronization level and is well described by a generalized extreme value distribution. Across synchronization levels of 15% to 50%, the mutual information per unit cost, evaluated at the optimal distribution of presynaptic firing rates, peaks at a synchronization level of ~30%. These results suggest that the statistics and energy efficiency of neuronal communication channels, through which the input rate is communicated, can be dynamically adapted by the synchronization level. Comment: 47 pages, 14 figures, 2 tables
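
A minimal sketch of the distribution-fitting step this abstract describes: fitting a generalized extreme value (GEV) distribution to interspike intervals. The surrogate data and all parameter values below are illustrative assumptions, not the paper's Hodgkin-Huxley simulation output.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Surrogate interspike intervals (ms). The paper fits ISIs from an
# HH-type model; here we simply draw from a GEV so the fit is well-posed.
# Note scipy's shape convention: c = -xi (negative c = heavy right tail).
isi = stats.genextreme.rvs(c=-0.1, loc=50.0, scale=10.0, size=5000,
                           random_state=rng)

# Maximum-likelihood fit of a generalized extreme value distribution.
shape, loc, scale = stats.genextreme.fit(isi)

# Rough goodness-of-fit check via a Kolmogorov-Smirnov test.
ks = stats.kstest(isi, "genextreme", args=(shape, loc, scale))
print(shape, loc, scale, ks.pvalue)
```

In practice the fitted shape parameter (and how it shifts with the synchronization level) is the quantity of interest, since it summarizes how the ISI tail changes.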

    The Physics of Living Neural Networks

    Improvements in technique, in conjunction with an evolution of the theoretical and conceptual approach to neuronal networks, provide a new perspective on living neurons in culture. Organization and connectivity are being measured quantitatively, along with other physical quantities such as information, and are being related to function. In this review we first discuss some of these advances, which enable elucidation of structural aspects. We then discuss two recent experimental models that yield some conceptual simplicity. A one-dimensional network enables precise quantitative comparison to analytic models, for example of propagation and information transport. A two-dimensional percolating network gives quantitative information on the connectivity of cultured neurons. The physical quantities that emerge as essential characteristics of the network in vitro are propagation speeds, synaptic transmission, and information creation and capacity. Potential application to neuronal devices is discussed. Comment: PACS: 87.18.Sn, 87.19.La, 87.80.-y, 87.80.Xa, 64.60.Ak Keywords: complex systems, neuroscience, neural networks, transport of information, neural connectivity, percolation http://www.weizmann.ac.il/complex/tlusty/papers/PhysRep2007.pdf http://www.weizmann.ac.il/complex/EMoses/pdf/PhysRep-448-56.pd
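
The two-dimensional percolating network mentioned above can be illustrated with a minimal site-percolation sketch: occupy lattice sites at random and measure the largest connected cluster. Lattice size and occupation probabilities are illustrative choices, not values from the review.

```python
import numpy as np
from collections import deque

def largest_cluster_fraction(n=60, p=0.6, seed=0):
    """Fraction of all sites belonging to the largest cluster of an
    n x n site-percolation lattice, each site occupied with prob. p."""
    rng = np.random.default_rng(seed)
    occupied = rng.random((n, n)) < p
    seen = np.zeros_like(occupied)
    best = 0
    for i in range(n):
        for j in range(n):
            if occupied[i, j] and not seen[i, j]:
                # Breadth-first search over 4-connected neighbours.
                size, q = 0, deque([(i, j)])
                seen[i, j] = True
                while q:
                    x, y = q.popleft()
                    size += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        u, v = x + dx, y + dy
                        if (0 <= u < n and 0 <= v < n
                                and occupied[u, v] and not seen[u, v]):
                            seen[u, v] = True
                            q.append((u, v))
                best = max(best, size)
    return best / (n * n)

# Above the 2D site-percolation threshold (~0.593) a giant cluster appears.
print(largest_cluster_fraction(p=0.7), largest_cluster_fraction(p=0.4))
```

In the experimental analogue, the occupation probability plays the role of effective neuronal connectivity, inferred from how network-wide activity appears as connectivity is varied.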

    Consequences of converting graded to action potentials upon neural information coding and energy efficiency

    Information is encoded in neural circuits using both graded and action potentials, converting between them within single neurons and across successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na+ and K+ channels, with generator potential and graded potential models lacking voltage-gated Na+ channels. We identify three causes of information loss in the generator potential that are by-products of action potential generation: (1) the voltage-gated Na+ channels necessary for action potential generation increase intrinsic noise and (2) introduce non-linearities, and (3) the finite duration of the action potential creates a ‘footprint’ in the generator potential that obscures incoming signals. These three processes reduce information rates by ~50% in generator potentials, to ~3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of their lower information rates, generator potentials are substantially less energy efficient than graded potentials. However, both are an order of magnitude more efficient than spike trains, due to the higher energy costs and low information content of spikes, emphasizing that there is a two-fold cost of converting analogue to digital: information loss and cost inflation.
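
The efficiency comparison rests on a simple ratio: information rate divided by metabolic cost rate. The sketch below illustrates that bookkeeping; the numbers are made-up placeholders chosen only to reproduce the qualitative ordering the abstract reports, not the study's measurements.

```python
# Placeholder figures: graded and generator potentials cost ~10x less
# than spikes, and the generator potential carries ~3x the spike-train
# information rate, as in the abstract's qualitative summary.
channels = {
    "graded potential":    {"bits_per_s": 1200.0, "cost_per_s": 1.0},
    "generator potential": {"bits_per_s": 600.0,  "cost_per_s": 1.2},
    "spike train":         {"bits_per_s": 200.0,  "cost_per_s": 10.0},
}

for name, ch in channels.items():
    efficiency = ch["bits_per_s"] / ch["cost_per_s"]  # bits per energy unit
    print(f"{name}: {efficiency:.0f} bits per unit energy")
```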

    Detecting and Estimating Signals in Noisy Cable Structures, II: Information Theoretical Analysis

    This is the second in a series of articles that seek to recast classical single-neuron biophysics in information-theoretical terms. Classical cable theory focuses on analyzing the voltage or current attenuation of a synaptic signal as it propagates from its dendritic input location to the spike initiation zone. On the other hand, we are interested in analyzing the amount of information lost about the signal in this process due to the presence of various noise sources distributed throughout the neuronal membrane. We use a stochastic version of the linear one-dimensional cable equation to derive closed-form expressions for the second-order moments of the fluctuations of the membrane potential associated with different membrane current noise sources: thermal noise, noise due to the random opening and closing of sodium and potassium channels, and noise due to the presence of “spontaneous” synaptic input. We consider two different scenarios. In the signal estimation paradigm, the time course of the membrane potential at a location on the cable is used to reconstruct the detailed time course of a random, band-limited current injected some distance away. Estimation performance is characterized in terms of the coding fraction and the mutual information. In the signal detection paradigm, the membrane potential is used to determine whether a distant synaptic event occurred within a given observation interval. In the light of our analytical results, we speculate that the length of weakly active apical dendrites might be limited by the information loss due to the accumulated noise between distal synaptic input sites and the soma and that the presence of dendritic nonlinearities probably serves to increase dendritic information transfer
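
The signal estimation paradigm can be sketched in a scalar toy version: a Gaussian signal observed through additive membrane noise, reconstructed with the optimal linear (Wiener) gain, from which the coding fraction and a Gaussian-channel information bound follow. All values are illustrative assumptions, not results from the cable model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
signal = rng.normal(0.0, 1.0, n)   # injected current (toy: white Gaussian)
noise = rng.normal(0.0, 0.5, n)    # accumulated membrane noise
observed = signal + noise          # membrane potential at the recording site

# Optimal linear (scalar Wiener) estimate of the signal.
gain = np.var(signal) / (np.var(signal) + np.var(noise))
estimate = gain * observed

# Coding fraction: 1 - (rms reconstruction error / signal std).
rmse = np.sqrt(np.mean((signal - estimate) ** 2))
coding_fraction = 1.0 - rmse / np.std(signal)

# Gaussian-channel mutual information, bits per sample.
snr = np.var(signal) / np.var(noise)
info = 0.5 * np.log2(1.0 + snr)
print(coding_fraction, info)
```

In the paper the same two quantities are computed from the cable equation's frequency-dependent transfer function and noise spectra rather than from a scalar channel.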

    Information Transmission in Cercal Giant Interneurons Is Unaffected by Axonal Conduction Noise

    What are the fundamental constraints on the precision and accuracy with which nervous systems can process information? One constraint must reflect the intrinsic “noisiness” of the mechanisms that transmit information between nerve cells. Most neurons transmit information through the probabilistic generation and propagation of spikes along axons, and recent modeling studies suggest that noise from spike propagation might pose a significant constraint on the rate at which information could be transmitted between neurons. However, the magnitude and functional significance of this noise source in actual cells remains poorly understood. We measured variability in conduction time along the axons of identified neurons in the cercal sensory system of the cricket Acheta domesticus, and used information theory to calculate the effects of this variability on sensory coding. We found that the variability in spike propagation speed is not large enough to constrain the accuracy of neural encoding in this system
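
A back-of-the-envelope Gaussian-channel calculation illustrates why small conduction-time jitter need not constrain a spike-latency code: the jitter variance simply adds to the intrinsic encoding noise. All numbers here are assumptions for illustration, not measurements from the cricket cercal system.

```python
import numpy as np

signal_sd = 2.0    # stimulus-driven spread of spike latencies (ms), assumed
sensory_sd = 1.0   # intrinsic encoding noise at spike initiation (ms), assumed
jitter_sd = 0.15   # axonal conduction-time jitter (ms), assumed

def bits_per_spike(noise_var):
    """Capacity of a Gaussian timing channel, in bits per spike."""
    return 0.5 * np.log2(1.0 + signal_sd**2 / noise_var)

# Information about latency before vs. after propagation noise is added.
before = bits_per_spike(sensory_sd**2)
after = bits_per_spike(sensory_sd**2 + jitter_sd**2)
print(f"loss from jitter: {before - after:.3f} bits per spike")
```

Because variances add, jitter that is small relative to the intrinsic noise produces a nearly negligible information loss, consistent with the abstract's conclusion.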

    Simulation of networks of spiking neurons: A review of tools and strategies

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of spikes. We overview the different simulators and simulation environments presently available (restricted to those that are freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with the aim of allowing the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin-Huxley-type and integrate-and-fire models, interacting through current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool for a given modeling problem related to spiking neural networks. Comment: 49 pages, 24 figures, 1 table; review article, Journal of Computational Neuroscience, in press (2007)
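
A minimal clock-driven (fixed-timestep) leaky integrate-and-fire neuron illustrates the simplest kind of model and integration strategy the review benchmarks. Parameters are illustrative, not taken from the paper's benchmark suite.

```python
def lif_clock_driven(i_ext=1.5, t_max=100.0, dt=0.1,
                     tau=10.0, v_rest=0.0, v_th=1.0, v_reset=0.0):
    """Clock-driven (forward-Euler) leaky integrate-and-fire neuron
    driven by a constant current; returns spike times in ms."""
    v, spikes = v_rest, []
    for step in range(int(t_max / dt)):
        # Euler update of tau * dV/dt = -(V - v_rest) + I_ext.
        v += dt / tau * (-(v - v_rest) + i_ext)
        if v >= v_th:
            spikes.append(step * dt)
            v = v_reset
    return spikes

# NB: spike times are only resolved to within dt; event-driven schemes
# instead solve for the exact threshold-crossing times between events.
print(lif_clock_driven())
```

This dt-limited spike-timing precision is exactly the issue the review examines for plasticity rules that depend on exact spike times.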