Perspective: network-guided pattern formation of neural dynamics
The understanding of neural activity patterns is fundamentally linked to an
understanding of how the brain's network architecture shapes dynamical
processes. Established approaches rely mostly on deviations of a given network
from certain classes of random graphs. Hypotheses about the supposed role of
prominent topological features (for instance, the roles of modularity, network
motifs, or hierarchical network organization) are derived from these
deviations. An alternative strategy could be to study deviations of network
architectures from regular graphs (rings, lattices) and consider the
implications of such deviations for self-organized dynamic patterns on the
network. Following this strategy, we draw on the theory of spatiotemporal
pattern formation and propose a novel perspective for analyzing dynamics on
networks, by evaluating how the self-organized dynamics are confined by network
architecture to a small set of permissible collective states. In particular, we
discuss the role of prominent topological features of brain connectivity, such
as hubs, modules and hierarchy, in shaping activity patterns. We illustrate the
notion of network-guided pattern formation with numerical simulations and
outline how it can facilitate the understanding of neural dynamics.
Homeostatic plasticity and external input shape neural network dynamics
In vitro and in vivo spiking activity clearly differ. Whereas networks in
vitro develop strong bursts separated by periods of very little spiking
activity, in vivo cortical networks show continuous activity. This is puzzling
considering that both networks presumably share similar single-neuron dynamics
and plasticity rules. We propose that the defining difference between in vitro
and in vivo dynamics is the strength of external input. In vitro, networks are
virtually isolated, whereas in vivo every brain area receives continuous input.
We analyze a model of spiking neurons in which the input strength, mediated by
spike rate homeostasis, determines the characteristics of the dynamical state.
In more detail, our analytical and numerical results on various network
topologies consistently show that under increasing input, homeostatic
plasticity generates distinct dynamic states, from bursting to
close-to-critical, reverberating, and irregular states. This implies that the
dynamic state of a neural network is not fixed but can readily adapt to the
input strength. Indeed, our results match experimental spike recordings in
vitro and in vivo: the in vitro bursting behavior is consistent with a state
generated by very low network input (< 0.1%), whereas in vivo activity suggests
that on the order of 1% recorded spikes are input-driven, resulting in
reverberating dynamics. Importantly, this predicts that one can abolish the
ubiquitous bursts of in vitro preparations, and instead impose dynamics
comparable to in vivo activity by exposing the system to weak long-term
stimulation, thereby opening new paths to establish an in vivo-like assay in
vitro for basic as well as neurological studies.
Comment: 14 pages, 8 figures, accepted at Phys. Rev.
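The input dependence described in this abstract can be caricatured in a few lines. The sketch below is not the authors' spiking model; it assumes a single linear recurrent unit with stationary rate r = h / (1 - w) and an illustrative homeostatic rule on the recurrent gain w, so that weak external input pushes the unit toward criticality (w near 1, burst-prone) while strong input settles at low recurrence.

```python
def adapt_recurrence(h_ext, target_rate=1.0, eta=0.01, steps=5000):
    """Minimal rate-based sketch of input-dependent homeostasis.

    A single linear recurrent unit has stationary rate r = h_ext / (1 - w).
    A homeostatic rule nudges the recurrent gain w until r matches
    target_rate; the fixed point is w = 1 - h_ext / target_rate.
    """
    w = 0.0
    for _ in range(steps):
        r = h_ext / max(1e-6, 1.0 - w)   # stationary rate of the unit
        w += eta * (target_rate - r)     # homeostasis toward the target rate
        w = min(max(w, 0.0), 0.999)      # keep the unit stable
    return w

# Weak input (in vitro-like) drives the gain near criticality;
# strong input (in vivo-like) leaves the unit weakly recurrent.
w_weak = adapt_recurrence(h_ext=0.01)
w_strong = adapt_recurrence(h_ext=0.9)
```

The point of the toy rule is only that the same homeostatic mechanism lands in qualitatively different dynamical regimes depending on input strength, mirroring the abstract's in vitro versus in vivo contrast.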
Neuronal Synchronization Can Control the Energy Efficiency of Inter-Spike Interval Coding
The role of synchronous firing in sensory coding and cognition remains
controversial. While studies, focusing on its mechanistic consequences in
attentional tasks, suggest that synchronization dynamically boosts sensory
processing, others failed to find significant synchronization levels in such
tasks. We attempt to understand both lines of evidence within a coherent
theoretical framework. We conceptualize synchronization as an independent
control parameter to study how the postsynaptic neuron transmits the average
firing activity of a presynaptic population, in the presence of
synchronization. We apply the Berger-Levy theory of energy efficient
information transmission to interpret simulations of a Hodgkin-Huxley-type
postsynaptic neuron model, where we varied the firing rate and synchronization
level in the presynaptic population independently. We find that for a fixed
presynaptic firing rate the simulated postsynaptic interspike interval
distribution depends on the synchronization level and is well-described by a
generalized extreme value distribution. Across synchronization levels of 15%
to 50%, the mutual information per unit cost, evaluated under the optimal
distribution of presynaptic firing rates, peaks at a synchronization level of
~30%. These results suggest that the statistics and energy efficiency of
neuronal communication channels, through which the input rate is communicated,
can be dynamically adapted by the synchronization level.
Comment: 47 pages, 14 figures, 2 tables
An Interneuron Circuit Reproducing Essential Spectral Features of Field Potentials
This document is the Accepted Manuscript version of the following article: Reinoud Maex, 'An Interneuron Circuit Reproducing Essential Spectral Features of Field Potentials', Neural Computation, March 2018. The final, definitive version of this paper is available online at doi: https://doi.org/10.1162/NECO_a_01068. © 2018 Massachusetts Institute of Technology.
Recent advances in engineering and signal processing have renewed interest in invasive and surface brain recordings, yet many features of cortical field potentials remain incompletely understood. In the present computational study, we show that a model circuit of interneurons, coupled via both GABA(A) receptor synapses and electrical synapses, reproduces many essential features of the power spectrum of local field potential (LFP) recordings, such as 1/f power scaling at low frequency (< 10 Hz), power accumulation in the γ-frequency band (30–100 Hz), and a robust α rhythm in the absence of stimulation. The low-frequency 1/f power scaling depends on strong reciprocal inhibition, whereas the α rhythm is generated by electrical coupling of intrinsically active neurons. As in previous studies, the γ power arises through the amplification of single-neuron spectral properties, owing to the refractory period, by parameters that favour neuronal synchrony, such as delayed inhibition. The present study also confirms that both synaptic and voltage-gated membrane currents substantially contribute to the LFP, and that high-frequency signals such as action potentials quickly taper off with distance.
Given the ubiquity of electrically coupled interneuron circuits in the mammalian brain, they may be major determinants of the recorded potentials.
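The 1/f low-frequency scaling this abstract refers to can be illustrated with a quick synthetic check. This is a generic signal-processing sketch, unrelated to the interneuron model itself: synthesize random-phase noise whose power falls off as 1/f, then recover the scaling exponent as the slope of log power versus log frequency.

```python
import numpy as np

def one_over_f_signal(n, rng):
    """Synthesize a signal with power spectrum ~ 1/f via random-phase inverse FFT."""
    f = np.fft.rfftfreq(n)
    amp = np.zeros_like(f)
    amp[1:] = f[1:] ** -0.5                  # amplitude ~ f^-1/2, so power ~ f^-1
    phase = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, size=f.size))
    return np.fft.irfft(amp * phase, n)

def spectral_slope(x):
    """Least-squares slope of log power vs log frequency (1/f scaling gives ~ -1)."""
    f = np.fft.rfftfreq(x.size)
    p = np.abs(np.fft.rfft(x)) ** 2
    keep = (f > 0) & (f < f.max())           # drop DC and the Nyquist bin
    return np.polyfit(np.log(f[keep]), np.log(p[keep]), 1)[0]

slope = spectral_slope(one_over_f_signal(2**16, np.random.default_rng(0)))
```

For recorded LFPs one would instead estimate the spectrum from data (e.g. with an averaged periodogram) and fit the slope only over the low-frequency band where the scaling holds.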
Songbird organotypic culture as an in vitro model for interrogating sparse sequencing networks
Sparse sequences of neuronal activity are fundamental features of neural circuit computation; however, the underlying homeostatic mechanisms remain poorly understood. To address these questions, we have developed a method for cellular-resolution imaging in organotypic cultures of the adult zebra finch brain, including portions of the intact song circuit. These in vitro networks can survive for weeks, and display mature neuron morphologies. Neurons within the organotypic slices exhibit a diversity of spontaneous and pharmacologically induced activity that can be easily monitored using the genetically encoded calcium indicator GCaMP6. In this study, we primarily focus on the classic song sequence generator HVC and the surrounding areas. We describe proof-of-concept experiments including physiological, optical, and pharmacological manipulation of these exposed networks. This method may allow the cellular rules underlying sparse, stereotyped neural sequencing to be examined with new degrees of experimental control.
Structure of Spontaneous UP and DOWN Transitions Self-Organizing in a Cortical Network Model
Synaptic plasticity is considered to play a crucial role in the experience-dependent self-organization of local cortical networks. In the absence of sensory stimuli, cerebral cortex exhibits spontaneous membrane potential transitions between an UP and a DOWN state. To reveal how cortical networks develop spontaneous activity, or conversely, how spontaneous activity structures cortical networks, we analyze the self-organization, under spike-timing-dependent plasticity (STDP), of a recurrent network model of excitatory and inhibitory neurons that is realistic enough to replicate UP–DOWN states. The individual neurons in the self-organized network exhibit a variety of temporal patterns in the two-state transitions. In addition, the model develops a feed-forward network-like structure that produces a diverse repertoire of precise sequences of the UP state. Our model shows that the self-organized activity closely resembles the spontaneous activity of cortical networks if STDP is accompanied by the pruning of weak synapses. These results suggest that the two-state membrane potential transitions play an active role in structuring local cortical circuits.
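For readers unfamiliar with the plasticity rule, a standard pair-based STDP kernel plus the weak-synapse pruning step the abstract emphasizes can be sketched as follows (the parameter values are illustrative defaults, not those of the model):

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
    """Pair-based STDP kernel: dt = t_post - t_pre in ms.

    Pre-before-post (dt >= 0) potentiates; post-before-pre (dt < 0)
    depresses, each with an exponential dependence on the spike-time lag.
    """
    if dt >= 0:
        return a_plus * math.exp(-dt / tau_ms)
    return -a_minus * math.exp(dt / tau_ms)

def prune_weak(weights, threshold=0.05):
    """Drop synapses below threshold: the pruning step the abstract highlights."""
    return {syn: w for syn, w in weights.items() if w >= threshold}
```

Making depression slightly stronger than potentiation (a_minus > a_plus), as here, is a common way to keep such rules from driving all weights to their maximum; pruning then removes the synapses that lose the competition.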