475 research outputs found
Rate region analysis of multi-terminal neuronal nanoscale molecular communication channel
© 2017 IEEE. In this paper, we investigate the communication channel capacity among hippocampal pyramidal neurons. To this aim, we study the processes involved in this communication and model them with realistic communication system components based on existing reports in the physiology literature. We consider the communication between two neurons and reveal the effects of multiple terminals between these neurons on the achievable rate per spike. To this end, we derive the power spectral density (PSD) of the signal in the output neuron and use it to calculate the rate region of the channel. Moreover, we evaluate the impact of vesicle availability on the achievable rate by deriving the expected number of available vesicles in the input neuron using a realistic vesicle release model. Simulation results show that the number of vesicles available for release does not affect the achievable rate of neuro-spike communication under the univesicular release model. However, in neurons where multiple vesicles can be released from each synaptic terminal, the achievable rate is significantly affected by vesicle depletion. Moreover, we show that increasing the number of synaptic terminals between two neurons strengthens the synaptic connection. Hence, it is an important factor in learning and memory, which, based on synaptic connectivity, occur in the hippocampal region of the brain.
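The vesicle-availability result can be illustrated with a toy depletion-replenishment recursion; this is a minimal sketch, assuming a fixed release fraction per spike and partial refill between spikes (the parameters `p_release` and `k_refill` are illustrative, not the paper's realistic release model):

```python
def expected_pool(n0, total, p_release, k_refill, steps):
    """Expected number of readily releasable vesicles over a spike train.
    Each spike releases a fraction p_release of the current pool; between
    spikes a fraction k_refill of the deficit is replenished."""
    n = float(n0)
    history = [n]
    for _ in range(steps):
        n = n * (1.0 - p_release)        # release on a spike
        n = n + k_refill * (total - n)   # partial replenishment
        history.append(n)
    return history

pool = expected_pool(n0=10, total=10, p_release=0.3, k_refill=0.1, steps=200)
# The recursion converges to the fixed point k*N / (1 - (1-p)(1-k)).
steady = 0.1 * 10 / (1 - (1 - 0.3) * (1 - 0.1))
```

Under multivesicular release the steady-state pool sits well below the full pool, which is how depletion caps the achievable rate; under univesicular release at most one vesicle per terminal is consumed per spike, so availability is rarely limiting.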
Neuronal Synchronization Can Control the Energy Efficiency of Inter-Spike Interval Coding
The role of synchronous firing in sensory coding and cognition remains
controversial. While studies focusing on its mechanistic consequences in attentional tasks suggest that synchronization dynamically boosts sensory processing, others have failed to find significant synchronization levels in such tasks. We attempt to understand both lines of evidence within a coherent
theoretical framework. We conceptualize synchronization as an independent
control parameter to study how the postsynaptic neuron transmits the average
firing activity of a presynaptic population, in the presence of
synchronization. We apply the Berger-Levy theory of energy efficient
information transmission to interpret simulations of a Hodgkin-Huxley-type
postsynaptic neuron model, where we varied the firing rate and synchronization
level in the presynaptic population independently. We find that for a fixed
presynaptic firing rate the simulated postsynaptic interspike interval
distribution depends on the synchronization level and is well-described by a
generalized extreme value distribution. For synchronization levels of 15% to 50%, we find the optimal distribution of presynaptic firing rates and show that the mutual information per unit cost is maximized at a synchronization level of ~30%. These results suggest that the statistics and energy efficiency of neuronal communication channels, through which the input rate is communicated, can be dynamically adapted by the synchronization level.
Comment: 47 pages, 14 figures, 2 tables
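The abstract reports that postsynaptic interspike-interval (ISI) distributions are well described by a generalized extreme value (GEV) distribution. A minimal sketch of inverse-transform sampling from a GEV, with hypothetical location, scale, and shape parameters (not fitted values from the paper):

```python
import math
import random

def gev_quantile(u, mu, sigma, xi):
    """Inverse CDF of the generalized extreme value distribution.
    For shape xi != 0: mu + sigma * ((-ln u)**(-xi) - 1) / xi;
    the xi -> 0 limit is the Gumbel quantile mu - sigma * ln(-ln u)."""
    if abs(xi) < 1e-12:
        return mu - sigma * math.log(-math.log(u))
    return mu + sigma * ((-math.log(u)) ** (-xi) - 1.0) / xi

def sample_isi(n, mu, sigma, xi, seed=0):
    """Draw n synthetic ISI values by inverse-transform sampling."""
    rng = random.Random(seed)
    return [gev_quantile(rng.random(), mu, sigma, xi) for _ in range(n)]

isis = sample_isi(1000, mu=20.0, sigma=5.0, xi=0.1)  # ms, illustrative
```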
Impacts of Spike Shape Variations on Synaptic Communication.
Understanding the communication-theoretic capabilities of information transmission among neurons, known as neuro-spike communication, is a significant step in developing bio-inspired solutions for nanonetworking. In this paper, we focus on a part of this communication known as synaptic transmission for pyramidal neurons in the Cornu Ammonis area of the hippocampus and propose a communication-based model for it that includes the effects of spike shape variation on neural calcium signaling and on the vesicle release process downstream of it. To this aim, we find the impact of spike shape variation on the opening of voltage-dependent calcium channels, which control the release of vesicles from the pre-synaptic neuron by changing the influx of calcium ions. Moreover, we derive the structure of the optimum receiver based on Neyman-Pearson detection to find the effects of spike shape variations on the functionality of neuro-spike communication. Numerical results show that changes in both spike width and amplitude affect the detection error probability, and that these two factors do not control the performance of the system independently. Hence, a proper model for neuro-spike communication should capture the effects of spike shape variations during axonal transmission on both synaptic propagation and spike generation mechanisms to accurately explain the performance of this communication paradigm.
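The Neyman-Pearson receiver mentioned above fixes a false-alarm rate and maximizes detection probability. A generic sketch for a binary Gaussian hypothesis test (H0: no vesicle release; H1: release shifts the observed mean); the means and noise level are assumptions for illustration, not the paper's derived statistics:

```python
from statistics import NormalDist

def np_detector(mean0, mean1, sigma, pfa):
    """Neyman-Pearson detector for H0: N(mean0, sigma) vs
    H1: N(mean1, sigma), with mean1 > mean0. Returns the decision
    threshold and the detection probability at false-alarm rate pfa."""
    z = NormalDist().inv_cdf(1.0 - pfa)            # standard-normal quantile
    tau = mean0 + z * sigma                        # amplitude threshold
    pd = 1.0 - NormalDist(mean1, sigma).cdf(tau)   # detection probability
    return tau, pd

tau, pd = np_detector(mean0=0.0, mean1=2.0, sigma=1.0, pfa=0.05)
```

A wider or lower-amplitude spike reduces the effective separation between the two hypotheses (via reduced calcium influx), lowering the detection probability at the same false-alarm rate; this is the sense in which spike shape variations degrade performance.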
An Optimized Structure-Function Design Principle Underlies Efficient Signaling Dynamics in Neurons.
Dynamic signaling on branching axons is critical for rapid and efficient communication between neurons in the brain. Efficient signaling in axon arbors depends on a trade-off between the time it takes action potentials to reach synaptic terminals (temporal cost) and the amount of cellular material associated with the wiring path length of the neuron's morphology (material cost). However, where the balance between structural and dynamical considerations lies, and which design principle neurons optimize to preserve it, remain elusive. In this work, we introduce a novel analysis that compares morphology and signaling dynamics in axonal networks to address this open problem. We show that in Basket cell neurons the design principle being optimized is the ratio between the refractory period of the membrane and the action potential latencies between the initial segment and the synaptic terminals. Our results suggest that the convoluted paths taken by axons reflect a design compensation by the neuron, slowing signaling latencies in order to optimize this ratio. Deviations in this ratio may result in a breakdown of signaling efficiency in the cell. These results pave the way for new approaches to investigating more complex neurophysiological phenomena that involve neuronal structure-function relationships.
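The design principle above is a ratio of two measurable quantities. A minimal sketch of computing it per terminal, assuming a constant conduction speed along each axonal path (the numbers are illustrative, not Basket cell measurements):

```python
def refractory_latency_ratios(path_lengths_um, speed_um_per_ms, refractory_ms):
    """Ratio of membrane refractory period to action-potential latency
    from the initial segment to each synaptic terminal, with latency
    approximated as path_length / conduction_speed."""
    return [refractory_ms / (length / speed_um_per_ms)
            for length in path_lengths_um]

# Longer (more convoluted) paths slow signaling and pull the ratio down.
ratios = refractory_latency_ratios([200.0, 400.0, 800.0],
                                   speed_um_per_ms=500.0, refractory_ms=1.5)
```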
The Physics of Living Neural Networks
Improvements in technique in conjunction with an evolution of the theoretical
and conceptual approach to neuronal networks provide a new perspective on
living neurons in culture. Organization and connectivity are being measured
quantitatively along with other physical quantities such as information, and
are being related to function. In this review we first discuss some of these
advances, which enable elucidation of structural aspects. We then discuss two
recent experimental models that yield some conceptual simplicity. A
one-dimensional network enables precise quantitative comparison to analytic
models, for example of propagation and information transport. A two-dimensional
percolating network gives quantitative information on connectivity of cultured
neurons. The physical quantities that emerge as essential characteristics of
the network in vitro are propagation speeds, synaptic transmission, information
creation and capacity. Potential application to neuronal devices is discussed.
Comment: PACS: 87.18.Sn, 87.19.La, 87.80.-y, 87.80.Xa, 64.60.Ak. Keywords: complex systems, neuroscience, neural networks, transport of information, neural connectivity, percolation
http://www.weizmann.ac.il/complex/tlusty/papers/PhysRep2007.pdf
http://www.weizmann.ac.il/complex/EMoses/pdf/PhysRep-448-56.pd
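The percolating-network measurement mentioned above can be sketched with a standard bond-percolation computation on a square lattice; a toy union-find implementation (illustrative, not the graded connectivity protocol used with cultured neurons):

```python
import random

def giant_cluster_fraction(n, p_bond, seed=0):
    """Fraction of sites in the largest connected cluster of an n x n
    square lattice whose bonds are kept independently with probability
    p_bond (union-find with path halving)."""
    rng = random.Random(seed)
    parent = list(range(n * n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for r in range(n):
        for c in range(n):
            i = r * n + c
            if c + 1 < n and rng.random() < p_bond:
                union(i, i + 1)   # bond to right neighbour
            if r + 1 < n and rng.random() < p_bond:
                union(i, i + n)   # bond to lower neighbour

    sizes = {}
    for i in range(n * n):
        root = find(i)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values()) / (n * n)

# Above the 2-D bond-percolation threshold (p = 0.5) a giant cluster appears.
frac = giant_cluster_fraction(40, 0.6, seed=0)
```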
Consequences of converting graded to action potentials upon neural information coding and energy efficiency
Information is encoded in neural circuits using both graded and action potentials, converting between them within single neurons and successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na+ and K+ channels, with generator potential and graded potential models lacking voltage-gated Na+ channels. We identify three causes of information loss in the generator potential that are by-products of action potential generation: (1) the voltage-gated Na+ channels necessary for action potential generation increase intrinsic noise and (2) introduce non-linearities, and (3) the finite duration of the action potential creates a ‘footprint’ in the generator potential that obscures incoming signals. These three processes reduce information rates in generator potentials by ~50%, to ~3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of their lower information rates, generator potentials are substantially less energy efficient than graded potentials. However, both are an order of magnitude more efficient than spike trains due to the higher energy costs and low information content of spikes, emphasizing that there is a two-fold cost of converting analogue to digital: information loss and cost inflation.
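The headline efficiency comparison follows from simple arithmetic on the abstract's rough relative figures (information ~3x higher for generator potentials, energy ~10x lower; arbitrary units, not the paper's measured values):

```python
# Relative figures paraphrased from the abstract (arbitrary units).
spike_info, spike_energy = 1.0, 10.0   # spike trains: baseline info, high cost
gen_info, gen_energy = 3.0, 1.0        # generator potentials: ~3x info, ~1/10 cost

eff_spike = spike_info / spike_energy  # bits per unit energy for spikes
eff_gen = gen_info / gen_energy        # bits per unit energy for generator potentials
ratio = eff_gen / eff_spike            # order-of-magnitude advantage (~30x here)
```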
Detecting and Estimating Signals in Noisy Cable Structures, II: Information Theoretical Analysis
This is the second in a series of articles that seek to recast classical single-neuron biophysics in information-theoretical terms. Classical cable theory focuses on analyzing the voltage or current attenuation of a synaptic signal as it propagates from its dendritic input location to the spike initiation zone. On the other hand, we are interested in analyzing the amount of information lost about the signal in this process due to the presence of various noise sources distributed throughout the neuronal membrane. We use a stochastic version of the linear one-dimensional cable equation to derive closed-form expressions for the second-order moments of the fluctuations of the membrane potential associated with different membrane current noise sources: thermal noise, noise due to the random opening and closing of sodium and potassium channels, and noise due to the presence of “spontaneous” synaptic input.
We consider two different scenarios. In the signal estimation paradigm, the time course of the membrane potential at a location on the cable is used to reconstruct the detailed time course of a random, band-limited current injected some distance away. Estimation performance is characterized in terms of the coding fraction and the mutual information. In the signal detection paradigm, the membrane potential is used to determine whether a distant synaptic event occurred within a given observation interval. In light of our analytical results, we speculate that the length of weakly active apical dendrites might be limited by the information loss due to the accumulated noise between distal synaptic input sites and the soma, and that the presence of dendritic nonlinearities probably serves to increase dendritic information transfer.
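The coding fraction used in the estimation paradigm can be written down directly: it is the fraction of signal variance captured by the reconstruction. A minimal sketch with a scalar (Wiener-gain) estimate of a signal in additive Gaussian noise; the variances and the linear estimator are illustrative assumptions, not the cable-model quantities:

```python
import random
import statistics

def coding_fraction(signal, estimate):
    """Coding fraction xi = 1 - E[(s - s_hat)^2] / Var(s);
    1 means perfect reconstruction, 0 means chance level."""
    mse = statistics.fmean((s - e) ** 2 for s, e in zip(signal, estimate))
    return 1.0 - mse / statistics.pvariance(signal)

rng = random.Random(1)
var_s, var_n = 1.0, 1.0
s = [rng.gauss(0.0, var_s ** 0.5) for _ in range(5000)]
y = [si + rng.gauss(0.0, var_n ** 0.5) for si in s]  # noisy observation
a = var_s / (var_s + var_n)       # optimal linear (Wiener) gain
xi = coding_fraction(s, [a * yi for yi in y])
# xi approaches var_s / (var_s + var_n) = 0.5 for long signals.
```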
Information Transmission in Cercal Giant Interneurons Is Unaffected by Axonal Conduction Noise
What are the fundamental constraints on the precision and accuracy with which nervous systems can process information? One constraint must reflect the intrinsic “noisiness” of the mechanisms that transmit information between nerve cells. Most neurons transmit information through the probabilistic generation and propagation of spikes along axons, and recent modeling studies suggest that noise from spike propagation might pose a significant constraint on the rate at which information could be transmitted between neurons. However, the magnitude and functional significance of this noise source in actual cells remain poorly understood. We measured variability in conduction time along the axons of identified neurons in the cercal sensory system of the cricket Acheta domesticus, and used information theory to calculate the effects of this variability on sensory coding. We found that the variability in spike propagation speed is not large enough to constrain the accuracy of neural encoding in this system.
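The conclusion that conduction jitter barely constrains coding can be illustrated with a Gaussian back-of-envelope bound: adding independent jitter of variance sigma_j^2 to a spike time already uncertain by sigma_e^2 costs roughly 0.5 * log2(1 + sigma_j^2 / sigma_e^2) bits per spike. This heuristic is an assumption for illustration, not the paper's information-theoretic calculation:

```python
import math

def timing_info_loss_bits(sigma_encoding_ms, sigma_jitter_ms):
    """Gaussian-approximation estimate of information lost per spike when
    independent conduction jitter (SD sigma_jitter_ms) is added to a
    spike time with intrinsic uncertainty sigma_encoding_ms."""
    return 0.5 * math.log2(1.0 + (sigma_jitter_ms / sigma_encoding_ms) ** 2)

# Jitter an order of magnitude below the encoding precision is negligible.
loss_small = timing_info_loss_bits(1.0, 0.1)
loss_equal = timing_info_loss_bits(1.0, 1.0)   # 0.5 bits
```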
Simulation of networks of spiking neurons: A review of tools and strategies
We review different aspects of the simulation of spiking neural networks. We
start by reviewing the different types of simulation strategies and algorithms
that are currently implemented. We next review the precision of those
simulation strategies, in particular in cases where plasticity depends on the
exact timing of the spikes. We overview different simulators and simulation
environments presently available (restricted to those freely available, open
source and documented). For each simulation tool, its advantages and pitfalls
are reviewed, with an aim to allow the reader to identify which simulator is
appropriate for a given task. Finally, we provide a series of benchmark
simulations of different types of networks of spiking neurons, including
Hodgkin-Huxley type, integrate-and-fire models, interacting with current-based
or conductance-based synapses, using clock-driven or event-driven integration
strategies. The same set of models are implemented on the different simulators,
and the codes are made available. The ultimate goal of this review is to
provide a resource to facilitate identifying the appropriate integration
strategy and simulation tool to use for a given modeling problem related to
spiking neural networks.
Comment: 49 pages, 24 figures, 1 table; review article, Journal of Computational Neuroscience, in press (2007)
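The simplest model class benchmarked in such reviews, a clock-driven leaky integrate-and-fire neuron, fits in a few lines; the parameter values here are conventional textbook choices, not the review's benchmark settings:

```python
def simulate_lif(input_current, dt=0.1, tau_m=10.0, v_rest=-70.0,
                 v_thresh=-54.0, v_reset=-70.0, r_m=10.0):
    """Clock-driven (fixed-step Euler) leaky integrate-and-fire neuron.
    input_current: external drive per time step (nA). Returns spike
    times in ms. Update: dv = (dt/tau_m) * (-(v - v_rest) + r_m * I)."""
    v = v_rest
    spikes = []
    for step, i_ext in enumerate(input_current):
        v += (dt / tau_m) * (-(v - v_rest) + r_m * i_ext)
        if v >= v_thresh:
            spikes.append(step * dt)  # record spike time, then reset
            v = v_reset
    return spikes

spikes = simulate_lif([2.0] * 1000)   # 100 ms of constant 2 nA drive
```

An event-driven simulator would instead solve for the exact threshold-crossing times between input events, trading per-step work for analytic updates; the review's benchmarks compare both strategies across simulators.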