3,933 research outputs found

    Task-Related Synaptic Changes Localized to Small Neuronal Population in Recurrent Neural Network Cortical Models

    Humans have flexible control over cognitive functions depending on the context. Several studies suggest that the prefrontal cortex (PFC) controls this cognitive flexibility, but the detailed underlying mechanisms remain unclear. Recent developments in machine learning allow simple PFC models, written as recurrent neural networks, to perform various behavioral tasks much as humans and animals do. Computational modeling allows the estimation of neuronal parameters that are crucial for performing the tasks but cannot be observed in biological experiments. To identify salient neural-network features for flexible cognitive tasks, we compared four PFC models on a context-dependent integration task. After training the neural networks on the task, we observed highly plastic synapses localized to a small neuronal population in all models. In three of the models, the neuronal units containing these highly plastic synapses contributed most to task performance. No common tendencies were observed in the distribution of synaptic strengths among the four models. These results suggest that task-dependent plastic synaptic changes are more important for accomplishing flexible cognitive tasks than the structures of the constructed synaptic networks.
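
    As a concrete illustration of the analysis described above, the sketch below (Python, using hypothetical weight matrices rather than the paper's trained models) shows one way to quantify per-unit synaptic change between pre- and post-training recurrent weights and to pick out a small, highly plastic population.

```python
# Minimal sketch (not the authors' code): quantify task-related synaptic change per
# unit by comparing recurrent weights before and after training.
import numpy as np

rng = np.random.default_rng(0)
n_units = 100

# Hypothetical weight matrices; in practice w_before and w_after would come from an
# RNN trained on the context-dependent integration task.
w_before = rng.normal(scale=1.0 / np.sqrt(n_units), size=(n_units, n_units))
w_after = w_before.copy()
plastic_units = rng.choice(n_units, size=8, replace=False)   # assumed "small population"
w_after[plastic_units, :] += rng.normal(scale=0.3, size=(8, n_units))

# Per-unit plasticity: total absolute weight change over incoming and outgoing synapses.
delta = np.abs(w_after - w_before)
per_unit_change = delta.sum(axis=1) + delta.sum(axis=0)

top = np.argsort(per_unit_change)[::-1][:8]
print("units with the largest synaptic change:", sorted(int(i) for i in top))
print("planted plastic units:                 ", sorted(int(i) for i in plastic_units))
```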

    Correlation-based model of artificially induced plasticity in motor cortex by a bidirectional brain-computer interface

    Experiments show that spike-triggered stimulation performed with bidirectional brain-computer interfaces (BBCI) can artificially strengthen connections between separate neural sites in motor cortex (MC). What are the neuronal mechanisms responsible for these changes, and how does targeted stimulation by a BBCI shape population-level synaptic connectivity? The present work describes a recurrent neural network model with probabilistic spiking mechanisms and plastic synapses capable of capturing both neural and synaptic activity statistics relevant to BBCI conditioning protocols. When spikes from a neuron recorded at one MC site trigger stimuli at a second target site after a fixed delay, the connections between sites are strengthened for spike-stimulus delays consistent with experimentally derived spike-timing-dependent plasticity (STDP) rules. However, the relationship between STDP mechanisms at the level of networks and their modification with neural implants remains poorly understood. Using our model, we successfully reproduce key experimental results and use analytical derivations, along with novel experimental data, to derive optimal operational regimes for BBCIs and to formulate predictions concerning the efficacy of spike-triggered stimulation in different regimes of cortical activity. Comment: 35 pages, 9 figures
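
    The plasticity rule invoked above can be made concrete with a standard double-exponential STDP window. In the sketch below the amplitudes and time constants are assumed values, not the paper's fitted parameters; it simply shows why only certain spike-stimulus delays are expected to strengthen the stimulated connection.

```python
# Minimal sketch (assumed parameters): a standard double-exponential STDP window
# applied to the spike-stimulus delay used in spike-triggered stimulation.
import numpy as np

A_plus, A_minus = 0.01, 0.012      # potentiation / depression amplitudes (assumed)
tau_plus, tau_minus = 20.0, 20.0   # time constants in ms (assumed)

def stdp_dw(dt_ms):
    """Weight change for a pre->post spike time difference dt = t_post - t_pre."""
    if dt_ms > 0:
        return A_plus * np.exp(-dt_ms / tau_plus)    # pre before post: potentiation
    return -A_minus * np.exp(dt_ms / tau_minus)      # post before pre: depression

# Spike-triggered stimulation with delay d: the recorded spike (pre) precedes the
# stimulus-evoked spike at the target site (post) by roughly d milliseconds.
for delay in (5, 20, 50, 100):
    print(f"delay {delay:>3} ms -> expected dw per pairing {stdp_dw(delay):+.4f}")
```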

    Short-term plasticity explains irregular persistent activity in working memory tasks

    Persistent activity in cortex is the neural correlate of working memory (WM). In persistent activity, spike trains are highly irregular, even more than in baseline. This seemingly innocuous feature challenges our current understanding of the synaptic mechanisms underlying WM. Here we argue that in WM the prefrontal cortex (PFC) operates in a regime of balanced excitation and inhibition and that the observed temporal irregularity reflects this regime. We show that this requires that the nonlinearities underlying the persistent activity are primarily in the interactions between PFC neurons. We also show that short-term synaptic facilitation can be the physiological substrate of these nonlinearities and that the resulting mechanism of balanced persistent activity is robust, in particular with respect to changes in the connectivity. As an example, we put forward a computational model of the PFC circuit involved in the oculomotor delayed response task. The novelty of this model is that the recurrent excitatory synapses are facilitating. We demonstrate that this model displays direction-selective persistent activity. We find that, even though the memory eventually degrades because of the heterogeneities, it can be stored for several seconds for plausible network size and connectivity. This model accounts for a large number of experimental findings, such as the findings that firing is more irregular during the persistent state than during baseline, that the neuronal responses are very diverse, and that the preferred directions during cue and delay periods are strongly correlated but the tuning widths are not.
    Fil: Hansel, David. University Paris Descartes. Laboratory of Neurophysics and Physiology; France. University Paris Descartes. Institute of Neuroscience and Cognition; France.
    Fil: Mato, German. Comision Nacional de Energia Atomica. Gerencia del Area de Investigaciones y Aplicaciones no Nucleares. Gerencia de Fisica (CAB); Argentina. Comisión Nacional de Energía Atómica. Gerencia del Area de Energía Nuclear. Instituto Balseiro; Argentina. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina.
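
    The facilitating recurrent synapses described above can be caricatured with Tsodyks-Markram-style dynamics. In the sketch below (illustrative parameters, not the model's fitted values), the utilization variable u keeps a slowly decaying trace of a presynaptic burst, which is the kind of synaptic nonlinearity the abstract appeals to.

```python
# Minimal sketch of Tsodyks-Markram-style short-term facilitation (parameters are
# illustrative): utilization u facilitates with each presynaptic spike, resources x
# deplete, and the effective synaptic efficacy follows u * x.
import numpy as np

dt = 1.0                               # ms
U, tau_f, tau_d = 0.1, 1500.0, 200.0   # baseline release, facilitation and recovery time constants
u, x = U, 1.0

spikes = np.zeros(3000)                # 3 s of presynaptic activity
spikes[200:1200:25] = 1                # 40 Hz burst during the "cue" period

efficacy = []
for s in spikes:
    u += dt * (U - u) / tau_f          # utilization relaxes back to baseline
    x += dt * (1.0 - x) / tau_d        # resources recover
    if s:
        u += U * (1.0 - u)             # facilitation jump on a spike
        x -= u * x                     # resource depletion on a spike
    efficacy.append(u * x)

print("efficacy before the burst:        ", round(efficacy[150], 3))
print("efficacy 1 s after the burst ends:", round(efficacy[2200], 3))
```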

    NMDA Receptors Subserve Persistent Neuronal Firing during Working Memory in Dorsolateral Prefrontal Cortex

    Neurons in the primate dorsolateral prefrontal cortex (dlPFC) generate persistent firing in the absence of sensory stimulation, the foundation of mental representation. Persistent firing arises from recurrent excitation within a network of pyramidal Delay cells. Here, we examined glutamate receptor influences underlying persistent firing in primate dlPFC during a spatial working memory task. Computational models predicted dependence on NMDA receptor (NMDAR) NR2B stimulation, and Delay cell persistent firing was abolished by local NR2B NMDAR blockade or by systemic ketamine administration. AMPA receptors (AMPARs) contributed background depolarization to sustain network firing. In contrast, many Response cells were sensitive to AMPAR blockade and increased firing after systemic ketamine, indicating that models of ketamine actions should be refined to reflect neuronal heterogeneity. The reliance of Delay cells on NMDAR may explain why insults to NMDARs in schizophrenia or Alzheimer’s disease profoundly impair cognition.
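
    A highly simplified way to see why slow NMDA-like recurrent excitation supports persistent firing: in the single-population rate model sketched below (illustrative parameters, not the authors' circuit model), a brief cue switches the unit into a self-sustained active state, and removing the slow recurrent drive, a crude stand-in for NMDAR blockade, abolishes the persistent firing.

```python
# Minimal sketch (illustrative parameters): persistent firing maintained by slow
# NMDA-like recurrent excitation in a single-population rate model, and its collapse
# when that recurrent drive is removed.
import numpy as np

def f(x, theta=0.5, beta=0.05):
    return 1.0 / (1.0 + np.exp(-(x - theta) / beta))   # steep firing-rate nonlinearity

dt, tau_nmda = 1.0, 100.0     # ms; slow NMDA-like synaptic time constant
w_nmda = 1.0                  # recurrent NMDA-like coupling (assumed)
s, r = 0.0, 0.0

for t in range(3000):
    i_cue = 1.0 if 200 <= t < 400 else 0.0   # transient spatial cue
    w = 0.0 if t >= 2000 else w_nmda         # crude "NMDAR blockade" at t = 2 s
    r = f(w * s + i_cue)
    s += dt * (-s + r) / tau_nmda            # slow recurrent synaptic gating variable
    if t in (100, 1000, 1900, 2500):
        print(f"t = {t:4d} ms  rate = {r:.2f}")
```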

    Dynamical principles in neuroscience

    Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics, including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail, and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of two stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience? This work was supported by NSF Grant No. NSF/EIA-0130708 and Grant No. PHY 0414174; NIH Grant No. 1 R01 NS50945 and Grant No. NS40110; MEC BFI2003-07276; and Fundación BBVA.

    The Dynamic Brain: From Spiking Neurons to Neural Masses and Cortical Fields

    The cortex is a complex system, characterized by its dynamics and architecture, which underlie many functions such as action, perception, learning, language, and cognition. Its structural architecture has been studied for more than a hundred years; however, its dynamics have been addressed much less thoroughly. In this paper, we review and integrate, in a unifying framework, a variety of computational approaches that have been used to characterize the dynamics of the cortex, as evidenced at different levels of measurement. Computational models at different space–time scales help us understand the fundamental mechanisms that underpin neural processes and relate these processes to neuroscience data. Modeling at the single-neuron level is necessary because this is the level at which information is exchanged between the computing elements of the brain: the neurons. Mesoscopic models tell us how neural elements interact to yield emergent behavior at the level of microcolumns and cortical columns. Macroscopic models can inform us about whole-brain dynamics and interactions between large-scale neural systems such as cortical regions, the thalamus, and brain stem. Each level of description relates uniquely to neuroscience data, from single-unit recordings, through local field potentials, to functional magnetic resonance imaging (fMRI), electroencephalogram (EEG), and magnetoencephalogram (MEG). Models of the cortex can establish which types of large-scale neuronal networks can perform computations and characterize their emergent properties. Mean-field and related formulations of dynamics also play an essential and complementary role as forward models that can be inverted given empirical data. This makes dynamic models critical in integrating theory and experiments. We argue that elaborating principled and informed models is a prerequisite for grounding empirical neuroscience in a cogent theoretical framework, commensurate with the achievements in the physical sciences.
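
    As a concrete example of the mesoscopic level of description discussed above, the sketch below implements a Wilson-Cowan-type neural mass model with one excitatory and one inhibitory population (parameter values are illustrative). Units of this kind are the typical building blocks of the whole-brain and cortical field models reviewed in the paper.

```python
# Minimal sketch of a Wilson-Cowan-type neural mass model (illustrative parameters):
# coupled excitatory and inhibitory population activities evolving in time.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

dt, tau_e, tau_i = 0.1, 10.0, 20.0              # ms
w_ee, w_ei, w_ie, w_ii = 16.0, 12.0, 15.0, 3.0  # population coupling strengths (assumed)
theta_e, theta_i = 4.0, 3.7                     # activation thresholds (assumed)
p_e = 1.25                                      # external drive to the excitatory pool

e, i = 0.1, 0.1
trace = []
for _ in range(20000):                          # 2 s of activity
    de = (-e + sigmoid(w_ee * e - w_ei * i - theta_e + p_e)) / tau_e
    di = (-i + sigmoid(w_ie * e - w_ii * i - theta_i)) / tau_i
    e += dt * de
    i += dt * di
    trace.append(e)

print("mean excitatory activity after transient:", round(float(np.mean(trace[5000:])), 3))
```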

    Dynamical Synapses Enhance Neural Information Processing: Gracefulness, Accuracy and Mobility

    Experimental data have revealed that neuronal connection efficacy exhibits two forms of short-term plasticity, namely short-term depression (STD) and short-term facilitation (STF). They have time constants residing between fast neural signaling and rapid learning, and may serve as substrates for neural systems manipulating temporal information on the relevant time scales. The present study investigates the impact of STD and STF on the dynamics of continuous attractor neural networks (CANNs) and their potential roles in neural information processing. We find that STD endows the network with slow-decaying plateau behaviors: a network that is initially stimulated to an active state decays to a silent state very slowly, on the time scale of STD rather than on the time scale of neural signaling. This provides a mechanism for neural systems to hold sensory memory easily and shut off persistent activities gracefully. With STF, we find that the network can hold a memory trace of external inputs in the facilitated neuronal interactions, which provides a way to stabilize the network response to noisy inputs, leading to improved accuracy in population decoding. Furthermore, we find that STD increases the mobility of the network states. The increased mobility enhances the tracking performance of the network in response to time-varying stimuli, leading to anticipative neural responses. In general, we find that STD and STF tend to have opposite effects on network dynamics and complementary computational advantages, suggesting that the brain may employ a strategy of weighting them differentially depending on the computational purpose. Comment: 40 pages, 17 figures
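
    The slow-decaying plateau behavior attributed to STD can be illustrated with a single-population caricature (this is not the CANN model studied in the paper, and all parameters are assumed): a rate unit with depressing recurrent synapses stays active for a while after a transient stimulus and shuts off only once the synaptic resources have run down, i.e., on the slow STD time scale rather than the fast neural time scale.

```python
# Single-population caricature of STD-mediated plateau activity (assumed parameters,
# not the paper's CANN model).
import numpy as np

def f(h):
    return 1.0 / (1.0 + np.exp(-(h - 0.5) / 0.01))   # steep firing-rate nonlinearity

dt, tau_r, tau_d = 1.0, 10.0, 300.0   # ms; fast neural vs. slow depression time scales
w, U = 2.0, 0.0105                    # recurrent strength and release fraction (assumed)
r, x = 0.0, 1.0                       # firing rate and available synaptic resources

for t in range(800):
    stim = 0.8 if 100 <= t < 250 else 0.0        # transient external stimulus
    drive = w * x * r + stim                     # depressing recurrent drive
    r += dt * (-r + f(drive)) / tau_r
    x += dt * ((1.0 - x) / tau_d - U * x * r)    # resource depletion and recovery
    if t in (50, 200, 300, 700):
        print(f"t = {t:3d} ms  rate = {r:.2f}  resources x = {x:.2f}")
```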