33 research outputs found

    Controlled generation of switching dynamics among metastable states in pulse-coupled oscillator networks

    This research was supported by the Aihara Project, the FIRST program from JSPS, initiated by CSTP, and CREST, JST. Y.C.L. was supported by ARO under Grant No. W911NF-14-1-0504. Z.C.D. was supported by the National Natural Science Foundation of China (No. 11432010). H.L.Z. was supported by “The Fundamental Research Funds for the Central Universities” (No. 3102014JCQ01036), and by the National Natural Science Foundation of China (No. 11502200). We also thank anonymous reviewers for their insightful and useful comments. Peer reviewed. Publisher PDF.

    Biological neurons act as generalization filters in reservoir computing

    Reservoir computing is a machine learning paradigm that exploits the transient dynamics of high-dimensional nonlinear systems to process time-series data. Although reservoir computing was initially proposed to model information processing in the mammalian cortex, it remains unclear how the non-random network architecture of the cortex, such as its modular architecture, integrates with the biophysics of living neurons to characterize the function of biological neuronal networks (BNNs). Here, we used optogenetics and fluorescent calcium imaging to record the multicellular responses of cultured BNNs and employed the reservoir computing framework to decode their computational capabilities. Micropatterned substrates were used to embed the modular architecture in the BNNs. We first show that modular BNNs can be used to classify static input patterns with a linear decoder and that the modularity of the BNNs positively correlates with classification accuracy. We then used a timer task to verify that BNNs possess a short-term memory of ~1 s and finally show that this property can be exploited for spoken digit classification. Interestingly, BNN-based reservoirs allow transfer learning, wherein a network trained on one dataset can be used to classify separate datasets of the same category. Such classification was not possible when the input patterns were decoded directly by a linear decoder, suggesting that BNNs act as a generalization filter that improves reservoir computing performance. Our findings pave the way toward a mechanistic understanding of information processing within BNNs and, at the same time, raise expectations for the realization of physical reservoir computing systems based on BNNs. Comment: 31 pages, 5 figures, 3 supplementary figures.
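
    For reference, the reservoir computing framework described above (a fixed high-dimensional dynamical system whose transient states are read out by a trained linear decoder) can be sketched in a few lines of Python. The code below is a minimal echo-state-style illustration under assumed sizes and parameters; the function names run_reservoir and train_readout and all numerical values are hypothetical, and it is not the authors' BNN-based setup.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical sizes: 50-dimensional input, 500 reservoir units.
        n_in, n_res = 50, 500

        # Fixed random input and recurrent weights; the recurrent matrix is
        # rescaled so its spectral radius stays below 1 (echo state property).
        W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.normal(0.0, 1.0, (n_res, n_res))
        W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

        def run_reservoir(inputs):
            """Drive the fixed reservoir with an input sequence and collect its states."""
            x = np.zeros(n_res)
            states = []
            for u in inputs:
                x = np.tanh(W @ x + W_in @ u)
                states.append(x.copy())
            return np.array(states)

        def train_readout(states, targets, ridge=1e-3):
            """Ridge-regression readout: the only trained part of the system."""
            return np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                                   states.T @ targets)

        # Hypothetical usage: 200 time steps of random input and scalar targets.
        states = run_reservoir(rng.normal(size=(200, n_in)))
        W_out = train_readout(states, rng.normal(size=(200, 1)))
        predictions = states @ W_out

    Only the readout weights are trained; the reservoir itself stays fixed, which is the property that lets a physical system such as a cultured BNN take its place.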

    Brain-inspired neural network navigation system with hippocampus, prefrontal cortex, and amygdala functions

    We propose a brain-inspired neural network model consisting of hippocampus, prefrontal cortex, and amygdala models for a navigation system that acquires specific knowledge of home environments from only a few experiences. The proposed model was evaluated in a home environment using a robot simulator. In the experiment, the robot determines a navigation path based on the knowledge acquired by the brain-inspired model. 2021 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS 2021), November 16-19, 2021, Hualien, Taiwan.

    Representational Switching by Dynamical Reorganization of Attractor Structure in a Network Model of the Prefrontal Cortex

    The prefrontal cortex (PFC) plays a crucial role in flexible cognitive behavior by representing task-relevant information in its working memory. Working memory with sustained neural activity is described as a neural dynamical system composed of multiple attractors, each of which corresponds to an active state of a cell assembly representing a fragment of information. Recent studies have revealed that the PFC not only represents multiple sets of information but also switches between representations and transforms one set of information into another depending on the task context. This representational switching between different sets of information is possibly generated endogenously by flexible network dynamics, but the details of the underlying mechanisms are unclear. Here we propose a dynamically reorganizable attractor network model based on internal changes in synaptic connectivity, i.e., short-term plasticity. We construct a network model of spiking neurons with dynamical synapses that can qualitatively reproduce the representational switching demonstrated experimentally in the PFC while a monkey performed a goal-oriented action-planning task. The model holds the multiple sets of information required for action planning before and after representational switching through the reconfiguration of functional cell assemblies. Furthermore, we analyzed the population dynamics of this model with a mean-field model and showed that the changes in cell-assembly configuration correspond to changes in the attractor structure, which can be viewed as a bifurcation process of the dynamical system. This dynamical reorganization of a neural network could be a key to uncovering the mechanism of flexible information processing in the PFC.
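
    For context, the dynamical synapses referenced in such spiking network models are commonly described by Tsodyks-Markram-style short-term plasticity, in which each presynaptic spike consumes a fraction of a slowly recovering synaptic resource. The following is a minimal single-synapse sketch with depression and facilitation; the time constants, release probability, and spike times are assumed for illustration and are not taken from the paper.

        # Tsodyks-Markram-style short-term plasticity for a single synapse
        # (illustrative parameters, not taken from the paper).
        tau_rec, tau_fac, U = 0.8, 0.05, 0.2   # recovery time (s), facilitation time (s), baseline release prob.
        dt = 1e-3                              # 1 ms integration step
        spike_times = [0.10, 0.15, 0.20, 0.25, 0.90]   # hypothetical presynaptic spikes (s)

        x, u = 1.0, U          # available resources and current release probability
        efficacies = []
        for step in range(1000):
            t = step * dt
            x += dt * (1.0 - x) / tau_rec      # resources recover toward 1
            u += dt * (U - u) / tau_fac        # facilitation decays toward baseline
            if any(abs(t - ts) < dt / 2 for ts in spike_times):
                u += U * (1.0 - u)             # facilitation step at a spike
                efficacies.append(u * x)       # effective strength of this spike
                x -= u * x                     # resources consumed by release

        print(efficacies)  # closely spaced spikes yield progressively weaker efficacies

    With closely spaced spikes the resource variable depletes faster than it recovers, so successive efficacies shrink and then rebound after a pause; this activity-dependent weakening and recovery of connections is the kind of short-term plasticity the model above relies on to reconfigure its cell assemblies.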

    Memory association dynamics on neural network with dynamic synapses

    No full text