Jensen's force and the statistical mechanics of cortical asynchronous states
Cortical networks are shaped by the combined action of excitatory and inhibitory interactions. Among other important functions, inhibition solves the problem of the all-or-none type of response that arises in purely excitatory networks, allowing the network to operate in regimes of moderate or low activity, between quiescent and saturated regimes. Here, we elucidate a noise-induced effect that we call "Jensen's force", stemming from the combined effect of excitation/inhibition balance and network sparsity, which is responsible for generating a phase of self-sustained low activity in excitation-inhibition networks. The uncovered phase reproduces the main empirically observed features of cortical networks in the so-called asynchronous state, characterized by low, uncorrelated and highly irregular activity. The parsimonious model analyzed here allows us to resolve a number of long-standing issues, such as proving that activity can be self-sustained even in the complete absence of external stimuli or driving. The simplicity of our approach allows for a deep understanding of asynchronous states and of the phase transitions to other standard phases the model exhibits, opening the door to reconciling the asynchronous-state and critical-state hypotheses within a unified framework. We argue that Jensen's forces are measurable experimentally and might be relevant in contexts beyond neuroscience.
The study is supported by Fondazione Cariparma, under the TeachInParma Project. MAM thanks the Spanish Ministry of Science and the Agencia Española de Investigación (AEI) for financial support under grant FIS2017-84256-P (European Regional Development Fund (ERDF)), as well as the Consejería de Conocimiento, Investigación y Universidad, Junta de Andalucía and European Regional Development Fund (ERDF), ref. SOMM17/6105/UGR. V.B. and R.B. acknowledge funding from the INFN BIOPHYS project.
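The name presumably refers to Jensen's inequality: for a nonlinear transfer function, the mean output under fluctuating input differs from the output of the mean input, so noise exerts a net "force" on the average activity. A minimal numerical sketch of that gap (the tanh transfer function, mean, and noise level below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # sigmoidal transfer function, concave for x > 0
    return np.tanh(x)

mu, sigma = 1.0, 0.8            # mean input and fluctuation size (illustrative)
x = rng.normal(mu, sigma, 100_000)

mean_of_f = f(x).mean()         # E[f(x)]: mean response under noisy input
f_of_mean = f(mu)               # f(E[x]): response to the mean input

# On the concave side of tanh, Jensen's inequality gives E[f(x)] < f(E[x]):
# input fluctuations systematically lower the effective response.
print(mean_of_f, f_of_mean)
```

The sign of the gap, and hence the direction of the effective force, depends on the local curvature of the transfer function at the operating point.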
Principles of Information Processing in Neuronal Avalanches
How the brain processes information is poorly understood. It has been suggested that the imbalance of excitation and inhibition (E/I) can significantly affect information processing in the brain. Neuronal avalanches, a recently discovered type of spontaneous activity, have been ubiquitously observed in vitro and in vivo when the cortical network is in the E/I balanced state. In this dissertation, I experimentally demonstrate that several properties related to information processing in the cortex, i.e. the entropy of spontaneous activity, the information transmission between stimulus and response, the diversity of synchronized states, and the discrimination of external stimuli, are optimized when the cortical network is in the E/I balanced state, exhibiting neuronal avalanche dynamics. These experimental studies not only support the hypothesis that the cortex operates in the critical state, but also suggest that criticality is a potential principle of information processing in the cortex. Further, we study the interaction structure in population neuronal dynamics and discover a special structure of higher-order interactions that is inherent in the neuronal dynamics.
Time-Warp-Invariant Neuronal Processing
A biophysical mechanism acting in auditory neurons allows the brain to process the high variability of speaking rates in natural speech in a time-warp-invariant manner
The computational role of short-term plasticity and the balance of excitation and inhibition in neural microcircuits: experimental and theoretical analysis
The computations performed by the brain ultimately rely on the
functional connectivity between neurons embedded in complex networks. It is
well known that the neuronal connections, the synapses, are plastic, i.e. the
contribution of each presynaptic neuron to the firing of a postsynaptic neuron
can be independently adjusted. The modulation of effective synaptic strength
can occur on time scales that range from tens or hundreds of milliseconds, to
tens of minutes or hours, to days, and may involve pre- and/or post-synaptic
modifications. The collection of these mechanisms is generally believed to
underlie learning and memory and, hence, it is fundamental to understand their
consequences for the behavior of neurons. (...)
Functional consequences of inhibitory plasticity: homeostasis, the excitation-inhibition balance and beyond
Computational neuroscience has a long-standing tradition of investigating the consequences of excitatory synaptic plasticity. In contrast, the functions of inhibitory plasticity are still largely nebulous, particularly given the bewildering diversity of interneurons in the brain. Here, we review recent computational advances that provide first suggestions for the functional roles of inhibitory plasticity, such as a maintenance of the excitation-inhibition balance, a stabilization of recurrent network dynamics and a decorrelation of sensory responses. The field is still in its infancy, but given the existing body of theory for excitatory plasticity, it is likely to mature quickly and deliver important insights into the self-organization of inhibitory circuits in the brain
Impact of Excitation-Inhibition Balance/Imbalance on Dynamics of Cortical Neural Networks
The purpose of this research is to study the implications of Excitation/Inhibition balance and imbalance on the dynamics of ongoing (spontaneous) neural activity in the cerebral cortex region of the brain.
The first research work addresses the question of why, among the continuum of excitation-inhibition balance configurations, a particular configuration should be favored. We calculate the entropy of neural network dynamics by studying an analytically tractable network of binary neurons. Our main result is that the entropy is maximized in a regime that is neither excitation-dominant nor inhibition-dominant, but at the boundary between the two. Along this boundary there is a trade-off between high and robust entropy: weak synapse strengths yield an entropy that is high but drops rapidly under parameter changes, whereas strong synapse strengths yield a lower, but more robust, network entropy.
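The kind of entropy comparison described here can be illustrated with a small stochastic simulation. The network size, weight statistics, and update rule below are illustrative assumptions, not the dissertation's exact model; the sketch estimates the Shannon entropy of the visited-state distribution for a balanced versus an excitation-dominant weight configuration:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)
N = 8                                     # binary neurons (illustrative size)

def run_entropy(g_exc, g_inh, steps=20_000):
    """Estimate Shannon entropy (bits) of the network's state distribution."""
    # random weights with excitatory and inhibitory parts (illustrative)
    W = g_exc * rng.random((N, N)) - g_inh * rng.random((N, N))
    np.fill_diagonal(W, 0.0)
    s = rng.integers(0, 2, N)
    counts = Counter()
    for _ in range(steps):
        i = rng.integers(N)               # asynchronous single-neuron update
        h = W[i] @ s                      # net input to neuron i
        s[i] = int(rng.random() < 1.0 / (1.0 + np.exp(-h)))
        counts[s.tobytes()] += 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -(p * np.log2(p)).sum()

H_balanced = run_entropy(1.0, 1.0)        # near the E/I boundary
H_exc = run_entropy(2.0, 0.2)             # excitation-dominant: saturates "on"
print(H_balanced, H_exc)
```

In the excitation-dominant case the network settles into a nearly all-on state and visits few configurations, so its entropy is far below that of the balanced configuration.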
The second research work is motivated by experiments suggesting that the cerebral cortex can also operate near a critical phase transition. In many physical systems, the governing physical laws obey a fractal symmetry near a critical phase transition; this symmetry exists irrespective of the observational length scale. We therefore hypothesize that the laws governing cortical dynamics may obey scale-change symmetry. We test and confirm this hypothesis using two different computational models. Further, we extend the transformational scheme to show that, as a mouse awakens from anesthesia, scale-change symmetry emerges.
The third research project is motivated by experimental observations from the motor cortex under modulation of inhibitory inputs. We found that a low-intensity increase (decrease) in overall inhibition in the cortex causes an increase (decrease) in spiking activity for some neurons, even though the population-level activity remains largely unchanged. This behavior is paradoxical compared to the status quo, which holds that increased (decreased) inhibition should lead to decreased (increased) neural spiking activity. We simulated a similar modulation of inhibitory signals in a neural network model and found that this paradoxical behavior arises from sparse connectivity and inhomogeneity in the inhibitory weights.
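How inhibition of inhibition can flip the sign of a single neuron's response while leaving the population mean flat can be seen in a hand-sized toy circuit. This is a minimal sketch under assumed weights and drives, not the dissertation's sparse-network model: two inhibitory cells, one of which inhibits the other, each targeting a different excitatory cell.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def steady_rates(d):
    """Feed-forward toy circuit: inhibitory cells I1, I2 and excitatory
    cells E1, E2; d is an extra drive delivered to both I cells."""
    I2 = relu(1.0 + d)             # I2 receives only the shared drive
    I1 = relu(3.0 + d - 2.0 * I2)  # I1 is itself inhibited by I2
    E1 = relu(5.0 - I1)            # E1 is inhibited by I1 only
    E2 = relu(5.0 - I2)            # E2 is inhibited by I2 only
    return np.array([E1, E2])

base = steady_rates(0.0)           # -> [4.0, 4.0]
more_inh = steady_rates(0.5)       # -> [4.5, 3.5]

# Raising inhibitory drive suppresses I1 (via I2), so E1 *increases*
# while E2 decreases; the population mean stays exactly 4.0.
print(base, more_inh, base.mean(), more_inh.mean())
```

Heterogeneous inhibitory weights play the role of the asymmetry between I1 and I2 here: neurons whose dominant inhibitory inputs are themselves suppressed respond paradoxically.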
Characteristics of sequential activity in networks with temporally asymmetric Hebbian learning
Sequential activity has been observed in multiple neuronal circuits across species, neural structures, and behaviors. It has been hypothesized that sequences could arise from learning processes. However, it is still unclear whether biologically plausible synaptic plasticity rules can organize neuronal activity to form sequences whose statistics match experimental observations. Here, we investigate temporally asymmetric Hebbian rules in sparsely connected recurrent rate networks and develop a theory of the transient sequential activity observed after learning. These rules transform a sequence of random input patterns into synaptic weight updates. After learning, recalled sequential activity is reflected in the transient correlation of network activity with each of the stored input patterns. Using mean-field theory, we derive a low-dimensional description of the network dynamics and compute the storage capacity of these networks. Multiple temporal characteristics of the recalled sequential activity are consistent with experimental observations. We find that the degree of sparseness of the recalled sequences can be controlled by nonlinearities in the learning rule. Furthermore, sequences maintain robust decoding, but display highly labile dynamics, when synaptic connectivity is continuously modified due to noise or storage of other patterns, similar to recent observations in hippocampus and parietal cortex. Finally, we demonstrate that our results also hold in recurrent networks of spiking neurons with separate excitatory and inhibitory populations. © 2020 National Academy of Sciences. All rights reserved
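A common form of temporally asymmetric Hebbian storage, used below as an illustrative assumption rather than the paper's exact rule, couples each stored random pattern to its successor, W ∝ Σ_μ ξ^{μ+1}(ξ^μ)ᵀ, so that activity correlated with pattern μ drives the network toward pattern μ+1:

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 200, 5                         # neurons, stored patterns (illustrative)

xi = rng.choice([-1.0, 1.0], size=(P, N))      # random binary input patterns

# temporally asymmetric Hebbian weights: pattern mu points to pattern mu+1
W = sum(np.outer(xi[mu + 1], xi[mu]) for mu in range(P - 1)) / N

r = xi[0] + 0.1 * rng.normal(size=N)           # start near the first pattern
overlaps = []
for _ in range(P):                             # a few synchronous rate updates
    overlaps.append([float(r @ xi[mu]) / N for mu in range(P)])
    r = np.tanh(W @ r)                         # rate dynamics step

# the overlap peak advances along the stored sequence, one pattern per step
print(np.argmax(overlaps[0]), np.argmax(overlaps[1]), np.argmax(overlaps[2]))
```

Recall here appears exactly as the abstract describes: as a transient correlation (overlap) of the network state with each stored pattern in turn, rather than as fixed-point attractors.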