The Emergence of Up and Down States in Cortical Networks
The cerebral cortex is continuously active in the absence of external stimuli. An example of this spontaneous activity is the voltage transition between an Up and a Down state, observed simultaneously in individual neurons. Since this phenomenon could be of critical importance for working memory and attention, its explanation could reveal some fundamental properties of cortical organization. To identify a possible scenario for the dynamics of Up–Down states, we analyze a reduced stochastic dynamical system that models an interconnected network of excitatory neurons with activity-dependent synaptic depression. The model reveals that when the total synaptic connection strength exceeds a certain threshold, the phase space of the dynamical system contains two attractors, interpreted as Up and Down states. In that case, synaptic noise causes transitions between the states. Moreover, external stimulation that produces a depolarization increases the time spent in the Up state, as observed experimentally. We therefore propose that the existence of Up–Down states is a fundamental and inherent property of a noisy neural ensemble with sufficiently strong synaptic connections.
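To make the proposed scenario concrete, here is a minimal simulation sketch of a generic model of this kind: a single excitatory population with activity-dependent synaptic depression and additive noise. The equations and parameter values (J, U, tau, tau_r, theta, sigma) are illustrative assumptions, not the paper's exact reduced system.

```python
import numpy as np

def simulate(J=6.0, U=0.5, tau=0.01, tau_r=0.8, theta=1.0,
             sigma=2.0, dt=1e-3, T=20.0, seed=0):
    """Rate model with synaptic depression: E is the population rate,
    x the fraction of available synaptic resources."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    E, x = np.zeros(n), np.ones(n)
    for t in range(1, n):
        drive = J * U * x[t-1] * E[t-1] - theta          # depressed recurrent input
        dE = (-E[t-1] + max(drive, 0.0)) / tau
        dx = (1.0 - x[t-1]) / tau_r - U * x[t-1] * E[t-1]
        noise = sigma * np.sqrt(dt) * rng.standard_normal()
        E[t] = max(E[t-1] + dt * dE + noise, 0.0)
        x[t] = np.clip(x[t-1] + dt * dx, 0.0, 1.0)
    return E, x

E, x = simulate()
# For J above a critical value the noise-free system is bistable: a silent
# Down state (E = 0) and an active Up state coexist. The noise term can then
# knock the rate back and forth between them, producing Up-Down transitions.
```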
Correction: Persistent Activity in Neural Networks with Dynamic Synapses
Persistent activity states (attractors), observed in several neocortical areas after the removal of a sensory stimulus, are believed to be the neuronal basis of working memory. One of the possible mechanisms that can underlie persistent activity is recurrent excitation mediated by intracortical synaptic connections. A recent experimental study revealed that connections between pyramidal cells in prefrontal cortex exhibit various degrees of synaptic depression and facilitation. Here we analyze the effect of synaptic dynamics on the emergence and persistence of attractor states in interconnected neural networks. We show that different combinations of synaptic depression and facilitation result in qualitatively different network dynamics with respect to the emergence of the attractor states. This analysis raises the possibility that the framework of attractor neural networks can be extended to represent time-dependent stimuli.
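The depression and facilitation referred to here are commonly described by two synaptic variables, a resource fraction x and a utilization u, updated at each presynaptic spike. The sketch below is a minimal implementation of that standard description; the parameter values (U, tau_d, tau_f) are illustrative assumptions, not the values measured in prefrontal cortex.

```python
import numpy as np

def tm_synapse(spike_times, U=0.05, tau_d=0.2, tau_f=1.0, dt=1e-3, T=2.0):
    """x: available resources (depression); u: utilization (facilitation).
    Returns the transmitted efficacy u*x at each spike time."""
    n = int(T / dt)
    spikes = np.zeros(n, dtype=bool)
    spikes[np.round(np.asarray(spike_times) / dt).astype(int)] = True
    x, u = 1.0, U
    efficacy = np.zeros(n)
    for t in range(n):
        x += dt * (1.0 - x) / tau_d        # resources recover between spikes
        u += dt * (U - u) / tau_f          # utilization relaxes back to baseline
        if spikes[t]:
            u += U * (1.0 - u)             # facilitation: utilization jumps
            efficacy[t] = u * x            # fraction of resources transmitted
            x -= u * x                     # depression: resources consumed
    return efficacy

# Regular 20 Hz train: with a small U and tau_f > tau_d the efficacy grows over
# the first spikes (facilitation dominates); a larger U or longer tau_d makes
# the same train depress instead.
eff = tm_synapse(spike_times=np.arange(0.1, 1.1, 0.05))
```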
Continuous Attractor Network Model for Conjunctive Position-by-Velocity Tuning of Grid Cells
The spatial responses of many of the cells recorded in layer II of rodent medial entorhinal cortex (MEC) show a triangular grid pattern, which appears to provide an accurate population code for animal spatial position. In layers III, V, and VI of the rat MEC, grid cells are also selective to head direction and are modulated by the speed of the animal. Several putative mechanisms for grid-like maps have been proposed, including attractor network dynamics, interactions with theta oscillations, or single-unit mechanisms such as firing rate adaptation. In this paper, we present a new attractor network model that accounts for the conjunctive position-by-velocity selectivity of grid cells. Our network model is able to perform robust path integration even when the recurrent connections are subject to random perturbations.
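As a rough illustration of the attractor-based path-integration idea (a generic ring model, not the paper's conjunctive position-by-velocity architecture), the sketch below rotates the recurrent kernel in proportion to a velocity input so that the activity bump advances at a velocity-controlled rate; all names and parameter values are illustrative.

```python
import numpy as np

N, sigma, gain = 128, 0.3, 0.02   # neurons, kernel width (rad), rad per step per unit speed
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)

def kernel(shift):
    # translation-invariant Gaussian kernel, rotated by `shift` radians
    d = np.angle(np.exp(1j * (theta[:, None] - theta[None, :] - shift)))
    return np.exp(-d**2 / (2 * sigma**2))

def step(r, velocity):
    h = (kernel(gain * velocity) @ r) ** 2   # shifted recurrence + sharpening
    return h / h.sum()                       # normalization keeps a stable bump

r = kernel(0.0)[:, 0]
r /= r.sum()                                 # initial bump at angle 0
decoded = []
for _ in range(200):
    r = step(r, velocity=1.0)
    decoded.append(np.angle(np.sum(r * np.exp(1j * theta))))  # population vector

# The decoded angle advances by roughly gain * velocity radians per step
# (modulo 2*pi): the network integrates the velocity signal into a position.
```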
Processing of Sounds by Population Spikes in a Model of Primary Auditory Cortex
We propose a model of the primary auditory cortex (A1), in which each iso-frequency column is represented by a recurrent neural network with short-term synaptic depression. Such networks can emit Population Spikes, in which most of the neurons fire synchronously for a short time period. Different columns are interconnected in a way that reflects the tonotopic map in A1, and population spikes can propagate along the map from one column to the next, in a temporally precise manner that depends on the specific input presented to the network. The network, therefore, processes incoming sounds by precise sequences of population spikes that are embedded in a continuous asynchronous activity, with both of these response components carrying information about the inputs and interacting with each other. With these basic characteristics, the model can account for a wide range of experimental findings. We reproduce neuronal frequency tuning curves, whose width depends on the strength of the intracortical inhibitory and excitatory connections. Non-simultaneous two-tone stimuli show forward masking depending on their temporal separation, as well as on the duration of the first stimulus. The model also exhibits non-linear suppressive interactions between sub-threshold tones and broad-band noise inputs, similar to the hypersensitive locking suppression recently demonstrated in auditory cortex. We derive several predictions from the model. In particular, we predict that spontaneous activity in primary auditory cortex gates the temporally locked responses of A1 neurons to auditory stimuli. Spontaneous activity could, therefore, be a mechanism for rapid and reversible modulation of cortical processing.
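A minimal sketch of the propagation mechanism described above, under assumed parameters (the rate units and coupling values are illustrative, not the paper's): a chain of recurrently connected excitatory "columns" with short-term depression, in which a brief input to the first column launches a population spike that travels along the chain.

```python
import numpy as np

n_cols, dt, T = 5, 1e-4, 0.5
tau, tau_rec, U = 0.01, 0.8, 0.5          # rate and depression time constants
J_rec, J_ff, theta = 6.0, 1.5, 1.0        # recurrent and feed-forward coupling; threshold

E = np.zeros(n_cols)                      # column firing rates (arbitrary units)
x = np.ones(n_cols)                       # available synaptic resources per column
rates = []
for t in np.arange(0.0, T, dt):
    ext = np.zeros(n_cols)
    ext[0] = 5.0 if 0.05 <= t < 0.06 else 0.0        # brief tone-like input to column 0
    ff = np.roll(J_ff * U * x * E, 1)                 # drive from the preceding column
    ff[0] = 0.0
    drive = J_rec * U * x * E + ff + ext - theta
    dE = (-E + np.maximum(drive, 0.0)) / tau
    dx = (1.0 - x) / tau_rec - U * x * E
    E = np.maximum(E + dt * dE, 0.0)
    x = np.clip(x + dt * dx, 0.0, 1.0)
    rates.append(E.copy())
rates = np.array(rates)
# Each column in turn emits a brief, high-rate transient (a population spike);
# depletion of x then terminates it and leaves the column refractory until its
# resources recover, so the wave passes along the tonotopic chain once.
```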
Using large language models to study human memory for meaningful narratives
One of the most impressive achievements of the AI revolution is the development of large language models that can generate meaningful text and respond to instructions in plain English with no additional training necessary. Here we show that language models can be used as a scientific instrument for studying human memory for meaningful material. We developed a pipeline for designing large-scale memory experiments and analyzing the obtained results. We performed online memory experiments with a large number of participants and collected recognition and recall data for narratives of different lengths. We found that both recall and recognition performance scale linearly with narrative length. Furthermore, in order to investigate the role of narrative comprehension in memory, we repeated these experiments using scrambled versions of the presented stories. We found that even though recall performance declined significantly, recognition remained largely unaffected. Interestingly, recalls in this condition seem to follow the original narrative order rather than the scrambled presentation, pointing to a contextual reconstruction of the story in memory.
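As a purely hypothetical illustration of one building block such a pipeline could contain (the paper's actual implementation is not reproduced here), the sketch below scores a free-recall transcript by asking a language model, clause by clause, whether each narrative event is mentioned; `call_llm` is a placeholder, not a real API.

```python
def call_llm(prompt: str) -> str:
    # Placeholder: wire this to your LLM API of choice.
    raise NotImplementedError

def recall_score(narrative_clauses: list[str], recall_text: str) -> float:
    """Fraction of narrative clauses the model judges to be present in the recall."""
    hits = 0
    for clause in narrative_clauses:
        prompt = (
            "Does the recall below mention the event? Answer yes or no.\n"
            f"Recall: {recall_text}\n"
            f"Event: {clause}\n"
            "Answer:"
        )
        if call_llm(prompt).strip().lower().startswith("yes"):
            hits += 1
    return hits / len(narrative_clauses)
```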
Slow oscillations in neural networks with facilitating synapses
The synchronous oscillatory activity characterizing many neurons in a network is often considered to be a mechanism for representing, binding, conveying, and organizing information. A number of models have been proposed to explain high-frequency oscillations, but the mechanisms that underlie slow oscillations are still unclear. Here, we show by means of analytical solutions and simulations that facilitating excitatory (Ef) synapses onto interneurons in a neural network play a fundamental role, not only in shaping the frequency of slow oscillations, but also in determining the form of the up and down states observed in electrophysiological measurements. Short time constants and strong Ef synapse connectivity were found to induce rapid alternations between up and down states, whereas long time constants and weak Ef synapse connectivity prolonged the time between up states and increased the up state duration. These results suggest a novel role for facilitating excitatory synapses onto interneurons in controlling the form and frequency of slow oscillations in neuronal circuits.
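The sketch below does not reproduce the full excitatory-inhibitory oscillator; it only isolates the slow variable the mechanism relies on, a rate-driven facilitation variable u on E-to-I synapses, driven here by an imposed Up/Down rate trace with illustrative parameters, to show how the facilitation time constant shapes the build-up and decay of the E-to-I efficacy.

```python
import numpy as np

def facilitation_trace(E, U=0.1, tau_f=4.0, dt=1e-3):
    """Utilization u of E-to-I synapses driven by the excitatory rate E(t)."""
    u = np.full(E.shape, U)
    for t in range(1, len(E)):
        du = (U - u[t-1]) / tau_f + U * (1.0 - u[t-1]) * E[t-1]
        u[t] = u[t-1] + dt * du
    return u

dt = 1e-3
time = np.arange(0.0, 20.0, dt)
E = np.where((time % 5.0) < 1.0, 20.0, 0.0)     # imposed 1 s Up states every 5 s
u_slow = facilitation_trace(E, tau_f=4.0)
u_fast = facilitation_trace(E, tau_f=0.5)
# With a long tau_f, u decays only partially between Up states, so the E-to-I
# efficacy stays elevated across the cycle; with a short tau_f it resets almost
# completely, mirroring the dependence of Up/Down timing on the facilitation
# time constant described above.
```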