
    A dynamic field model of ordinal and timing properties of sequential events

    Recent evidence suggests that the neural mechanisms underlying memory for serial order and interval timing of sequential events are closely linked. We present a dynamic neural field model that exploits the existence and stability of multi-bump solutions with a gradient of activation to store serial order. The activation gradient is achieved by applying a state-dependent threshold accommodation process to the firing rate function. A field dynamics of the lateral-inhibition type is used, in combination with a dynamics for the baseline activity, to recall the sequence from memory. We show that, depending on the time scale of the baseline dynamics, either the precise temporal structure of the original sequence may be retrieved or a proactive timing of events may be achieved.
    Fundação para a Ciência e a Tecnologia (FCT) - Bolsa SFRH/BD/41179/200
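
    The field equation behind such models is the Amari equation with a lateral-inhibition interaction kernel. As a rough illustration of the multi-bump memory mechanism only, the following sketch simulates a 1D field storing three sequence items as self-sustained bumps. All parameters are illustrative, and the paper's state-dependent threshold accommodation, which produces the ordinal activation gradient, is omitted for brevity.

    import numpy as np

    N, dt, tau, h = 200, 1.0, 10.0, -2.0
    x = np.linspace(-50.0, 50.0, N)
    dx = x[1] - x[0]
    u = np.full(N, h)                          # field activation at resting level

    def f(u, beta=4.0):
        """Sigmoidal firing-rate function."""
        return 1.0 / (1.0 + np.exp(-beta * u))

    # Interaction kernel: local excitation minus weak global inhibition.
    D = x[:, None] - x[None, :]
    W = (2.0 * np.exp(-D**2 / (2.0 * 4.0**2)) - 0.2) * dx

    def step(u, inp):
        """One Euler step of  tau * du/dt = -u + h + W f(u) + input."""
        return u + (dt / tau) * (-u + h + W @ f(u) + inp)

    centers = [-30.0, 0.0, 30.0]               # one field site per sequence item
    for c in centers:                          # present the items one at a time
        inp = 5.0 * np.exp(-(x - c)**2 / 8.0)
        for _ in range(300):
            u = step(u, inp)

    for _ in range(500):                       # input off: the bumps self-sustain
        u = step(u, np.zeros(N))

    peaks = [round(float(u[np.argmin(np.abs(x - c))]), 2) for c in centers]
    print("sustained bump activations:", peaks)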

    Dynamic Graph Generation Network: Generating Relational Knowledge from Diagrams

    In this work, we introduce a new algorithm for analyzing a diagram, which contains visual and textual information in an abstract and integrated way. Whereas diagrams contain richer information than individual image-based or language-based data, proper solutions for automatically understanding them have not been proposed, owing to their innate multi-modality and the arbitrariness of their layouts. To tackle this problem, we propose a unified diagram-parsing network for generating knowledge from diagrams, based on an object detector and a recurrent neural network designed for a graphical structure. Specifically, we propose a dynamic graph-generation network based on dynamic memory and graph theory. We explore the dynamics of information in a diagram through the activation of gates in gated recurrent unit (GRU) cells. On publicly available diagram datasets, our model demonstrates state-of-the-art results, outperforming other baselines. Moreover, further experiments on question answering show the potential of the proposed method for various applications.
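
    As a schematic of this kind of pipeline (hypothetical module and parameter names; not the paper's actual network), the sketch below takes detector features for the diagram elements, updates a GRU-cell-based dynamic memory over a few steps, and scores every ordered pair of elements as a candidate relation edge:

    import torch
    import torch.nn as nn

    class DynamicGraphSketch(nn.Module):
        def __init__(self, feat_dim=256, hid_dim=128):
            super().__init__()
            self.encode = nn.Linear(feat_dim, hid_dim)   # project detector features
            self.cell = nn.GRUCell(hid_dim, hid_dim)     # dynamic memory update
            self.edge_score = nn.Bilinear(hid_dim, hid_dim, 1)

        def forward(self, node_feats, steps=3):
            # node_feats: (num_nodes, feat_dim) from an object detector (assumed given)
            h = torch.zeros(node_feats.size(0), self.cell.hidden_size)
            z = self.encode(node_feats)
            for _ in range(steps):
                # message: mean of all node states, a crude stand-in for the
                # gated information routing the paper analyzes via GRU gates
                msg = h.mean(dim=0, keepdim=True).expand_as(h)
                h = self.cell(z + msg, h)
            # score every ordered pair of nodes as a candidate relation edge
            n = h.size(0)
            src = h.repeat_interleave(n, dim=0)
            dst = h.repeat(n, 1)
            logits = self.edge_score(src, dst).view(n, n)
            return torch.sigmoid(logits)                  # edge probabilities

    dets = torch.randn(5, 256)                            # 5 detected diagram elements
    adj = DynamicGraphSketch()(dets)
    print(adj.shape)                                      # torch.Size([5, 5])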

    How do neural processes give rise to cognition? Simultaneously predicting brain and behavior with a dynamic model of visual working memory

    There is consensus that activation within distributed functional brain networks underlies human thought. The impact of this consensus is limited, however, by a gap that exists between data-driven correlational analyses that specify where functional brain activity is localized using functional magnetic resonance imaging (fMRI), and neural process accounts that specify how neural activity unfolds through time to give rise to behavior. Here, we show how an integrative cognitive neuroscience approach may bridge this gap. In an exemplary study of visual working memory, we use multilevel Bayesian statistics to demonstrate that a neural dynamic model simultaneously explains behavioral data and predicts localized patterns of brain activity, outperforming standard analytic approaches to fMRI. The model explains performance on both correct trials and incorrect trials where errors in change detection emerge from neural fluctuations amplified by neural interaction. Critically, predictions of the model run counter to cognitive theories of the origin of errors in change detection. Results reveal neural patterns predicted by the model within regions of the dorsal attention network that have been the focus of much debate. The model-based analysis suggests that key areas in the dorsal attention network such as the intraparietal sulcus play a central role in change detection rather than working memory maintenance, counter to previous interpretations of fMRI studies. More generally, the integrative cognitive neuroscience approach used here establishes a framework for directly testing theories of cognitive and brain function using the combined power of behavioral and fMRI data.
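
    In model-based fMRI of this kind, the bridge from model dynamics to brain data is typically made by convolving the model's activation time courses with a hemodynamic response function and comparing the result to measured BOLD signals. A minimal sketch of that generic step (illustrative numbers, not the study's actual pipeline):

    import numpy as np
    from scipy.stats import gamma

    dt, T = 0.1, 60.0                       # seconds
    t = np.arange(0.0, T, dt)

    def hrf(t):
        """Canonical double-gamma hemodynamic response (peak ~5 s, late undershoot)."""
        return gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 12)

    # Hypothetical model output: summed suprathreshold activity in one field,
    # active while an item is held in working memory (5-25 s here).
    neural = ((t > 5) & (t < 25)).astype(float)

    bold = np.convolve(neural, hrf(t))[: len(t)] * dt   # predicted BOLD regressor
    print("peak predicted BOLD at t =", t[np.argmax(bold)], "s")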

    Autonomous Reinforcement of Behavioral Sequences in Neural Dynamics

    We introduce a dynamic neural algorithm called Dynamic Neural (DN) SARSA(λ) for learning a behavioral sequence from delayed reward. DN-SARSA(λ) combines Dynamic Field Theory models of behavioral sequence representation, classical reinforcement learning, and a computational neuroscience model of working memory, called Item and Order working memory, which serves as an eligibility trace. DN-SARSA(λ) is implemented on both a simulated and a real robot that must learn a specific rewarding sequence of elementary behaviors from exploration. Results show that DN-SARSA(λ) performs at the level of discrete SARSA(λ), validating the feasibility of general reinforcement learning without compromising neural dynamics.
    Comment: Sohrob Kazerounian and Matthew Luciw are joint first authors.
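
    For reference, here is a minimal sketch of the discrete, tabular SARSA(λ) baseline with eligibility traces; the environment below is a made-up toy chain whose reward requires the right action sequence, not the robot task from the paper.

    import numpy as np

    n_states, n_actions = 5, 2
    alpha, gamma_, lam, eps = 0.1, 0.95, 0.8, 0.1
    Q = np.zeros((n_states, n_actions))
    rng = np.random.default_rng(0)

    def policy(s):
        """Epsilon-greedy action selection."""
        if rng.random() < eps:
            return int(rng.integers(n_actions))
        return int(np.argmax(Q[s]))

    def env_step(s, a):
        """Toy chain: action 0 advances, action 1 resets; reward only at the end."""
        if a == 0:
            s2 = s + 1
            return (s2, 1.0, True) if s2 == n_states - 1 else (s2, 0.0, False)
        return 0, 0.0, False

    for episode in range(200):
        e = np.zeros_like(Q)                  # eligibility traces
        s, a = 0, policy(0)
        done = False
        while not done:
            s2, r, done = env_step(s, a)
            a2 = policy(s2)
            delta = r + gamma_ * Q[s2, a2] * (not done) - Q[s, a]
            e[s, a] += 1.0                    # accumulating trace
            Q += alpha * delta * e            # credit all recently visited pairs
            e *= gamma_ * lam                 # traces decay every step
            s, a = s2, a2

    print("greedy action per state:", np.argmax(Q, axis=1))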

    A neural blackboard architecture of sentence structure

    We present a neural architecture for sentence representation. Sentences are represented in terms of word representations as constituents. A word representation consists of a neural assembly distributed over the brain. Sentence representation does not result from associations between neural word assemblies. Instead, word assemblies are embedded in a neural architecture in which the structural (thematic) relations between words can be represented. Arbitrary thematic relations between arguments and verbs can be represented. Arguments can consist of nouns and phrases, as in sentences with relative clauses. A number of sentences can be stored simultaneously in this architecture. We simulate how probe questions about thematic relations can be answered. We discuss how differences in sentence complexity, such as the difference between subject-extracted versus object-extracted relative clauses and the difference between right-branching versus center-embedded structures, can be related to the underlying neural dynamics of the model. Finally, we illustrate how memory capacity for sentence representation can be related to the nature of reverberating neural activity, which is used to store information temporarily in this architecture.
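
    A toy illustration of the binding principle (random vectors standing in for neural assemblies; nothing here reproduces the architecture's gating circuits): word assemblies are bound to thematic-role structure assemblies through a temporary connection matrix, and a probe question is answered by cueing the role and reading out the most active word assembly.

    import numpy as np

    rng = np.random.default_rng(1)
    words = {w: rng.standard_normal(64) for w in ["cat", "chases", "mouse"]}
    roles = {r: rng.standard_normal(64) for r in ["agent", "verb", "theme"]}

    # Encoding: transient Hebbian binding of each word assembly to its role's
    # structure assembly (outer products summed into one connection matrix).
    B = sum(np.outer(roles[r], words[w])
            for r, w in [("agent", "cat"), ("verb", "chases"), ("theme", "mouse")])

    def probe(role):
        """Answer 'which word fills this role?' by cueing the structure assembly."""
        retrieved = B.T @ roles[role]         # activity flowing back to word assemblies
        scores = {w: float(v @ retrieved) for w, v in words.items()}
        return max(scores, key=scores.get)

    print("agent of the sentence:", probe("agent"))   # expected: cat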

    Neural Network Models of Learning and Memory: Leading Questions and an Emerging Framework

    Office of Naval Research and the Defense Advanced Research Projects Agency (N00014-95-1-0409, N00014-1-95-0657); National Institutes of Health (NIH 20-316-4304-5)