Efficiency characterization of a large neuronal network: a causal information approach
When inhibitory neurons constitute about 40% of the neurons, they could have an
important antinociceptive role, as they would easily regulate the activity level
of the other neurons. We consider a simple network of cortical spiking neurons
with axonal conduction delays and spike-timing-dependent plasticity,
representative of a cortical column or hypercolumn with a large proportion of
inhibitory neurons. Each neuron fires following Hodgkin–Huxley-like dynamics
and is randomly interconnected with other neurons. The network dynamics are
investigated by estimating the Bandt–Pompe probability distribution function
associated with the interspike intervals, for different degrees of
inter-connectivity across neurons. More specifically, we take into account the
fine temporal "structure" of the complex neuronal signals not just through the
probability distributions associated with the interspike intervals, but through
much more subtle measures accounting for their causal information: the Shannon
permutation entropy, the Fisher permutation information, and the permutation
statistical complexity. This allows us to investigate how the information of
the system might saturate to a finite value as the degree of
inter-connectivity across neurons grows, inferring the emergent dynamical
properties of the system.

Comment: 26 pages, 3 figures; Physica A, in press
Noise Suppression and Surplus Synchrony by Coincidence Detection
The functional significance of correlations between the action potentials of
neurons is still a matter of vivid debate. In particular, it is presently
unclear how much synchrony is caused by afferent synchronized events and how
much is intrinsic, due to the connectivity structure of cortex. The available
analytical approaches based on the diffusion approximation do not allow one to
model spike synchrony, preventing a thorough analysis. Here we theoretically
investigate to what extent common synaptic afferents and synchronized inputs
each contribute to the closely time-locked spiking activity of pairs of
neurons. We employ direct simulation and extend earlier analytical methods
based on the diffusion approximation to pulse coupling, allowing us to
introduce precisely timed correlations in the spiking activity of the synaptic
afferents. We investigate the transmission of correlated synaptic input
currents by pairs of integrate-and-fire model neurons, so that the same input
covariance can be realized by common inputs or by spiking synchrony. We
identify two distinct regimes. In the limit of low correlation, linear
perturbation theory accurately determines the correlation transmission
coefficient, which is typically smaller than unity but increases sensitively
even for weakly synchronous inputs. In the limit of high afferent correlation,
a qualitatively new picture arises in the presence of synchrony: as the
non-linear neuronal response becomes dominant, the output correlation becomes
higher than the total correlation in the input. This transmission coefficient
larger than unity is a direct consequence of non-linear neural processing in
the presence of noise, elucidating how synchrony-coded signals benefit from
these generic properties present in cortical networks.
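The common-input regime described above can be illustrated with a toy simulation: two leaky integrate-and-fire neurons driven by Gaussian currents that share a tunable common component, with the output correlation measured on binned spike counts. This is only a sketch of the setup, not the paper's model; all parameters (membrane time constant, threshold, drive statistics, bin size) are hypothetical, and it realizes input correlation via shared current rather than the precisely timed afferent synchrony the paper analyzes.

```python
import numpy as np

def lif_spikes(current, dt=0.1, tau=10.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire: binary spike train for a given input current."""
    v = 0.0
    spikes = np.zeros(len(current), dtype=bool)
    for i, inp in enumerate(current):
        v += dt / tau * (-v + inp)
        if v >= v_th:
            spikes[i] = True
            v = v_reset
    return spikes

def correlation_transmission(c_in, n_steps=200_000, mu=0.95, sigma=2.5,
                             binsize=50, seed=0):
    """Spike-count correlation of two LIF neurons whose input currents have
    correlation c_in, realized through a shared Gaussian component."""
    rng = np.random.default_rng(seed)
    shared = rng.standard_normal(n_steps)
    mix = lambda own: mu + sigma * (np.sqrt(c_in) * shared
                                    + np.sqrt(1.0 - c_in) * own)
    s1 = lif_spikes(mix(rng.standard_normal(n_steps)))
    s2 = lif_spikes(mix(rng.standard_normal(n_steps)))
    # Correlate spike counts in coarse bins (here 50 steps = 5 ms at dt=0.1 ms).
    c1 = s1.reshape(-1, binsize).sum(axis=1)
    c2 = s2.reshape(-1, binsize).sum(axis=1)
    return np.corrcoef(c1, c2)[0, 1]

for c in (0.1, 0.5, 0.9):
    print(c, correlation_transmission(c))
```

In this common-input setting the transmission coefficient stays below unity; reproducing the paper's regime with transmission above unity would additionally require injecting synchronized afferent spikes rather than shared diffusive current.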
Correlation codes in neuronal populations
Population codes often rely on the tuning of the mean responses to the stimulus parameters. However, this information can be greatly suppressed by long-range correlations. Here we study the efficiency of coding information in the second-order statistics of the population responses. We show that the Fisher information of this system grows linearly with the size of the system. We propose a bilinear readout model for extracting information from correlation codes and evaluate its performance in discrimination and estimation tasks. We show that the main source of information in this system is the stimulus dependence of the variances of the single-neuron responses.
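The claim that variance coding yields Fisher information growing linearly with population size can be checked in a minimal Gaussian model: each neuron's response has zero mean and a stimulus-dependent standard deviation sigma(theta), for which the per-neuron Fisher information is 2 (sigma'/sigma)^2, so N independent neurons give N times that. This is a hypothetical illustration with an arbitrary tuning function, not the paper's population model.

```python
import numpy as np

def sigma(theta):
    """Hypothetical stimulus-dependent standard deviation of one neuron."""
    return 1.0 + 0.5 * np.sin(theta)

def dsigma(theta):
    """Derivative of sigma with respect to the stimulus theta."""
    return 0.5 * np.cos(theta)

def fisher_per_neuron(theta):
    """Fisher information of a zero-mean Gaussian whose variance carries the
    stimulus: J(theta) = 2 * (sigma'(theta) / sigma(theta))**2."""
    return 2.0 * (dsigma(theta) / sigma(theta)) ** 2

def fisher_monte_carlo(theta, n_neurons, n_samples=200_000, seed=0):
    """Estimate Fisher information as the variance of the score function
    d/dtheta log p(r | theta) for n_neurons independent neurons."""
    rng = np.random.default_rng(seed)
    s, ds = sigma(theta), dsigma(theta)
    r = rng.normal(0.0, s, size=(n_samples, n_neurons))
    # For N(0, s^2): d/dtheta log p = (r^2 / s^3 - 1 / s) * s'(theta).
    score = ((r ** 2 / s ** 3 - 1.0 / s) * ds).sum(axis=1)
    return score.var()

theta = 1.0
for n in (1, 10, 100):
    print(n, n * fisher_per_neuron(theta), fisher_monte_carlo(theta, n))
```

The Monte Carlo estimate tracks N times the per-neuron value, which is the linear scaling the abstract refers to; correlations between neurons would modify the prefactor but are omitted here for simplicity.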