12,386 research outputs found
Neural population coding: combining insights from microscopic and mass signals
Behavior relies on the distributed and coordinated activity of neural populations. Population activity can be measured using multi-neuron recordings and neuroimaging. Neural recordings reveal how the heterogeneity, sparseness, timing, and correlation of population activity shape information processing in local networks, whereas neuroimaging shows how long-range coupling and brain states impact local activity and perception. To obtain an integrated perspective on neural information processing, we need to combine knowledge from both levels of investigation. We review recent progress in how neural recordings, neuroimaging, and computational approaches are beginning to elucidate how interactions between local neural population activity and large-scale dynamics shape the structure and coding capacity of local information representations, make them state-dependent, and control distributed populations that collectively shape behavior.
Information flow through a model of the C. elegans klinotaxis circuit
Understanding how information about external stimuli is transformed into
behavior is one of the central goals of neuroscience. Here we characterize the
information flow through a complete sensorimotor circuit: from stimulus, to
sensory neurons, to interneurons, to motor neurons, to muscles, to motion.
Specifically, we apply a recently developed framework for quantifying
information flow to a previously published ensemble of models of salt
klinotaxis in the nematode worm C. elegans. The models are grounded in the
neuroanatomy and currently known neurophysiology of the worm. The unknown model
parameters were optimized to reproduce the worm's behavior. Information flow
analysis reveals several key principles underlying how the models operate: (1)
Interneuron class AIY is responsible for integrating information about positive
and negative changes in concentration, and exhibits a strong left/right
information asymmetry. (2) Gap junctions play a crucial role in the transfer of
information responsible for the information symmetry observed in interneuron
class AIZ. (3) Neck motor neuron class SMB implements an information gating
mechanism that underlies the circuit's state-dependent response. (4) The neck
carries a non-uniform distribution of information about changes in concentration. Thus, not all
directions of movement are equally informative. Each of these findings
corresponds to an experimental prediction that could be tested in the worm to
greatly refine our understanding of the neural circuit underlying klinotaxis.
Information flow analysis also allows us to explore how information flow
relates to underlying electrophysiology. Despite large variations in the neural
parameters of individual circuits, the overall information flow architecture of the
circuit is remarkably consistent across the ensemble, suggesting that
information flow analysis captures general principles of operation for the
klinotaxis circuit.
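The analysis above rests on directed information measures between neural signals. A minimal sketch of one such measure, transfer entropy, with a simple plug-in estimator over discrete symbols (illustrative only; the framework applied in the paper is more elaborate, and the function name and one-step histories are my assumptions):

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target):
    """Plug-in estimate of TE(source -> target) in bits, 1-step histories.

    Inputs are sequences of discrete symbols. TE measures how much the
    source's past reduces uncertainty about the target's next value
    beyond what the target's own past already provides.
    """
    tf, tp, sp = target[1:], target[:-1], source[:-1]
    triples = Counter(zip(tf, tp, sp))
    pairs_tt = Counter(zip(tf, tp))
    pairs_ts = Counter(zip(tp, sp))
    singles = Counter(tp)
    n = len(tp)
    te = 0.0
    for (f, p, s), c in triples.items():
        # p(f,p,s) * log2[ p(f|p,s) / p(f|p) ]
        te += (c / n) * np.log2(c * singles[p] / (pairs_tt[(f, p)] * pairs_ts[(p, s)]))
    return te
```

When the target simply copies the source with a one-step lag, this returns close to 1 bit; for independent random sequences it is close to 0.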
Neuronal assembly dynamics in supervised and unsupervised learning scenarios
The dynamic formation of groups of neurons, known as neuronal assemblies, is believed to mediate cognitive phenomena at many levels, but their detailed operation and mechanisms of interaction are still to be uncovered. One hypothesis suggests that synchronized oscillations underpin their formation and functioning, with a focus on the temporal structure of neuronal signals. In this context, we investigate neuronal assembly dynamics in two complementary scenarios: the first, a supervised spike pattern classification task, in which noisy variations of a collection of spikes have to be correctly labeled; the second, an unsupervised, minimally cognitive evolutionary robotics task, in which an evolved agent has to cope with multiple, possibly conflicting, objectives. In both cases, the more traditional dynamical analysis of the system's variables is paired with information-theoretic techniques in order to get a broader picture of the ongoing interactions with and within the network. The neural network model is inspired by the Kuramoto model of coupled phase oscillators and allows one to fine-tune the network synchronization dynamics and assembly configuration. The experiments explore the computational power, redundancy, and generalization capability of neuronal circuits, demonstrating that performance depends nonlinearly on the number of assemblies and neurons in the network and showing that the framework can be exploited to generate minimally cognitive behaviors, with dynamic assembly formation accounting for varying degrees of stimulus modulation of the sensorimotor interactions.
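The network model above is inspired by the Kuramoto model of coupled phase oscillators. A minimal sketch of the classic mean-field Kuramoto dynamics (not the paper's specific network; all parameter values here are illustrative): each oscillator obeys dθ_i/dt = ω_i + (K/N) Σ_j sin(θ_j − θ_i), and the order parameter r = |⟨e^{iθ}⟩| quantifies synchronization.

```python
import numpy as np

def kuramoto_order_parameter(n=200, coupling=4.0, dt=0.02, steps=3000, seed=0):
    """Integrate the mean-field Kuramoto model; return the final order parameter r."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 1.0, n)          # natural frequencies
    theta = rng.uniform(0.0, 2 * np.pi, n)   # initial phases
    for _ in range(steps):
        # Mean-field identity: (K/N) sum_j sin(theta_j - theta_i)
        #                    = K * r * sin(psi - theta_i), where z = r e^{i psi}
        z = np.exp(1j * theta).mean()
        theta += dt * (omega + coupling * np.abs(z) * np.sin(np.angle(z) - theta))
    return float(np.abs(np.exp(1j * theta).mean()))
```

Above the critical coupling (about 1.6 for unit-variance normal frequencies) r approaches 1, signaling a synchronized assembly; below it r stays near 0.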
Homunculus strides again: why "information transmitted" in neuroscience tells us nothing
Purpose – For half a century, neuroscientists have used Shannon Information Theory to calculate "information transmitted," a hypothetical measure of how well neurons "discriminate" amongst stimuli. Neuroscientists' computations, however, fail to meet even the technical requirements for credibility. Ultimately, the reasons must be conceptual. That conclusion is confirmed here, with crucial implications for neuroscience. The paper aims to discuss these issues.
Design/methodology/approach – Shannon Information Theory depends upon a physical model, Shannon's "general communication system." Neuroscientists' interpretation of that model is scrutinized here.
Findings – In Shannon's system, a recipient receives a message composed of symbols. The symbols received, the symbols sent, and their hypothetical occurrence probabilities altogether allow calculation of "information transmitted." Significantly, Shannon's system's "reception" (decoding) side physically mirrors its "transmission" (encoding) side. However, neurons lack the "reception" side; neuroscientists nonetheless insisted that decoding must happen. They turned to Homunculus, an internal humanoid who infers stimuli from neuronal firing. However, Homunculus must contain a Homunculus, and so on ad infinitum – unless it is super-human. But any need for Homunculi, as in "theories of consciousness," is obviated if consciousness proves to be "emergent."
Research limitations/implications – Neuroscientists' "information transmitted" indicates, at best, how well neuroscientists themselves can use neuronal firing to discriminate amongst the stimuli given to the research animal.
Originality/value – A long-overdue examination unmasks a hidden element in neuroscientists' use of Shannon Information Theory, namely, Homunculus. Almost 50 years' worth of computations are recognized as irrelevant, mandating fresh approaches to understanding "discriminability."
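The "information transmitted" at issue is the Shannon mutual information computed from a stimulus-response confusion matrix. A minimal sketch of that standard calculation, with hypothetical counts (the function name is mine):

```python
import numpy as np

def information_transmitted(counts):
    """Mutual information (bits) between stimulus (rows) and response (columns).

    counts[i, j] is how often stimulus i evoked response j.
    """
    p = counts / counts.sum()
    ps = p.sum(axis=1, keepdims=True)  # stimulus marginal
    pr = p.sum(axis=0, keepdims=True)  # response marginal
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / (ps @ pr)[mask])).sum())

# Four stimuli, each always evoking its own response: log2(4) = 2 bits
print(information_transmitted(np.eye(4) * 25))         # 2.0
# Responses unrelated to stimuli: 0 bits
print(information_transmitted(np.full((4, 4), 25.0)))  # 0.0
```

The paper's argument is that this number reflects the analyst's own ability to decode the firing, not any decoding performed by the nervous system.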
Predictability, complexity and learning
We define {\em predictive information} $I_{\rm pred}(T)$ as the mutual
information between the past and the future of a time series. Three
qualitatively different behaviors are found in the limit of large observation
times $T$: $I_{\rm pred}(T)$ can remain finite, grow logarithmically, or grow
as a fractional power law. If the time series allows us to learn a model with a
finite number of parameters, then $I_{\rm pred}(T)$ grows logarithmically with
a coefficient that counts the dimensionality of the model space. In contrast,
power-law growth is associated, for example, with the learning of infinite
parameter (or nonparametric) models such as continuous functions with
smoothness constraints. There are connections between the predictive
information and measures of complexity that have been defined both in learning
theory and in the analysis of physical systems through statistical mechanics
and dynamical systems theory. Further, in the same way that entropy provides
the unique measure of available information consistent with some simple and
plausible conditions, we argue that the divergent part of $I_{\rm pred}(T)$
provides the unique measure for the complexity of dynamics underlying a time
series. Finally, we discuss how these ideas may be useful in different problems
in physics, statistics, and biology.
Comment: 53 pages, 3 figures, 98 references, LaTeX2e
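A first-order Markov chain illustrates the "remain finite" case: the future depends on the past only through the current state, so the predictive information saturates at $I(x_t; x_{t+1})$. A minimal sketch for a symmetric binary chain with flip probability eps, whose analytic value is $1 - H_2({\rm eps})$ bits (an illustrative example of mine, not taken from the paper):

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def simulate_chain(n, eps, seed=0):
    """Symmetric binary Markov chain: flip the previous state with probability eps."""
    rng = np.random.default_rng(seed)
    flips = (rng.random(n) < eps).astype(int)
    return np.cumsum(flips) % 2   # cumulative XOR of flips, starting from 0

def mi_consecutive(x):
    """Plug-in estimate of I(x_t; x_{t+1}) in bits for a binary sequence."""
    joint = np.zeros((2, 2))
    np.add.at(joint, (x[:-1], x[1:]), 1)   # 2x2 table of consecutive pairs
    p = joint / joint.sum()
    ps = p.sum(axis=1, keepdims=True)
    pr = p.sum(axis=0, keepdims=True)
    m = p > 0
    return float((p[m] * np.log2(p[m] / (ps @ pr)[m])).sum())

eps = 0.1
x = simulate_chain(200_000, eps)
print(mi_consecutive(x), 1 - h2(eps))   # estimate vs analytic, roughly 0.53 bits
```

Logarithmic or power-law growth, by contrast, requires models whose effective parameter count grows with the observation window.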
Information transmission in oscillatory neural activity
Periodic neural activity not locked to the stimulus or to motor responses is
usually ignored. Here, we present new tools for modeling and quantifying the
information transmission based on periodic neural activity that occurs with
quasi-random phase relative to the stimulus. We propose a model to reproduce
characteristic features of oscillatory spike trains, such as histograms of
inter-spike intervals and phase locking of spikes to an oscillatory influence.
The proposed model is based on an inhomogeneous Gamma process governed by a
density function that is a product of the usual stimulus-dependent rate and a
quasi-periodic function. Further, we present an analysis method generalizing
the direct method (Rieke et al., 1999; Brenner et al., 2000) to assess the
information content in such data. We demonstrate these tools on recordings from
relay cells in the lateral geniculate nucleus of the cat.
Comment: 18 pages, 8 figures, to appear in Biological Cybernetics
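A standard way to sample from such an inhomogeneous Gamma process is time rescaling: in operational time $\Lambda(t) = \int_0^t \lambda(s)\,ds$, an order-$k$ gamma process has i.i.d. Gamma($k$, $1/k$) intervals with unit mean. A minimal sketch along those lines (the rate function, its parameters, and the function name are illustrative assumptions, not the paper's):

```python
import numpy as np

def inhomogeneous_gamma_spikes(rate_fn, order, t_max, dt=1e-4, seed=0):
    """Sample spike times by time-rescaling an order-`order` gamma process."""
    rng = np.random.default_rng(seed)
    t_grid = np.arange(0.0, t_max, dt)
    cum = np.cumsum(rate_fn(t_grid)) * dt        # operational time Lambda(t)
    total = cum[-1]
    # Unit-mean gamma intervals in operational time; draw a safe surplus
    intervals = rng.gamma(order, 1.0 / order, size=int(total) + 200)
    targets = np.cumsum(intervals)
    targets = targets[targets < total]
    # Map each operational-time target back to real time
    return t_grid[np.searchsorted(cum, targets)]

# Rate: stimulus-dependent envelope times a quasi-periodic ~40 Hz factor
rate = lambda t: 30.0 * (1.0 + 0.8 * np.sin(2 * np.pi * 40.0 * t))
spikes = inhomogeneous_gamma_spikes(rate, order=4, t_max=20.0)
print(len(spikes) / 20.0)   # mean firing rate, close to 30 Hz
```

The order parameter controls the regularity of the inter-spike-interval histogram, while the periodic factor in the rate produces phase locking of spikes to the oscillation.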
- …