7 research outputs found

    Learning in a Unitary Coherent Hippocampus

    Machines Learning - Towards a New Synthetic Autobiographical Memory

    Autobiographical memory is the organisation of episodes and contextual information from an individual’s experiences into a coherent narrative, which is key to our sense of self. Formation and recall of autobiographical memories are essential for effective, adaptive behaviour in the world, providing contextual information necessary for planning actions and memory functions such as event reconstruction. A synthetic autobiographical memory system would endow intelligent robotic agents with many essential components of cognition through active compression and storage of historical sensorimotor data in an easily addressable manner. Current approaches neither fulfil these functional requirements nor build upon recent understanding of predictive coding, deep learning, or the neurobiology of memory. This position paper highlights desiderata for a modern implementation of synthetic autobiographical memory based on human episodic memory, and proposes that a recently developed model of hippocampal memory could be extended as a generalised model of autobiographical memory. Initial implementation will be targeted at social interaction, where current synthetic autobiographical memory systems have had success.
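
    To make the "active compression and storage of historical sensorimotor data in an easily addressable manner" concrete, here is a minimal Python sketch of an addressable episodic store. It is an illustration only, not the paper's proposed model: the class and method names (EpisodicStore, store, recall) are invented for this example, and a random projection stands in for the learned predictive-coding or deep-learning encoder the paper envisages.

        import numpy as np

        class EpisodicStore:
            def __init__(self, dim, code_dim, seed=0):
                rng = np.random.default_rng(seed)
                # Random projection as a crude stand-in for a learned compressor.
                self.proj = rng.standard_normal((code_dim, dim)) / np.sqrt(dim)
                self.codes, self.episodes = [], []

            def store(self, frame, episode):
                self.codes.append(self.proj @ frame)   # compressed, addressable code
                self.episodes.append(episode)          # contextual payload

            def recall(self, cue_frame):
                # Return the episode whose stored code best matches the cue.
                cue = self.proj @ cue_frame
                sims = [c @ cue / (np.linalg.norm(c) * np.linalg.norm(cue) + 1e-9)
                        for c in self.codes]
                return self.episodes[int(np.argmax(sims))]

        store = EpisodicStore(dim=64, code_dim=16)
        store.store(np.ones(64), "greeted visitor at the door")
        print(store.recall(np.ones(64) + 0.1))  # -> "greeted visitor at the door"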

    Extending a Hippocampal Model for Navigation Around a Maze Generated from Real-World Data

    An essential component in the formation of understanding is the ability to use past experience to comprehend the here and now, and to aid selection of future action. Past experience is stored as memories which are then available for recall at very short notice, allowing for understanding of short- and long-term action. Autobiographical memory (ABM) is a form of temporally organised memory and is the organisation of episodes and contextual information from an individual’s experience into a coherent narrative, which is key to a sense of self. Formation and recall of memories are essential for effective and adaptive behaviour in the world, providing contextual information necessary for planning actions and memory functions, such as event reconstruction. Here we tested and developed a previously defined computational memory model, based on hippocampal structure and function, as a first step towards developing a synthetic model of human ABM (SAM). The hippocampal model chosen has functions analogous to those of human ABM. We trained the model on real-world sensory data and demonstrated successful, biologically plausible memory formation and recall in a navigational task. The hippocampal model will later be extended for application in a biologically inspired system for human-robot interaction.
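
    The following toy Python sketch illustrates the memory-formation-and-recall idea in a navigational setting: observations are associated with maze locations, and a noisy observation is completed to the best-matching stored memory, loosely mirroring hippocampal pattern completion. The grid size, observation dimensionality, and nearest-neighbour recall rule are assumptions for illustration; the paper's actual model is a biologically constrained hippocampal network.

        import numpy as np

        rng = np.random.default_rng(1)
        n_places, obs_dim = 25, 32                       # 5x5 maze, 32-D sensor vector
        observations = rng.standard_normal((n_places, obs_dim))
        places = [(i // 5, i % 5) for i in range(n_places)]  # grid coordinates

        def recall_place(noisy_obs):
            # Nearest stored observation wins: a crude stand-in for attractor
            # dynamics completing a partial cue to a full memory.
            d = np.linalg.norm(observations - noisy_obs, axis=1)
            return places[int(np.argmin(d))]

        cue = observations[7] + 0.2 * rng.standard_normal(obs_dim)  # corrupted cue
        print(recall_place(cue))  # -> (1, 2), where observation 7 was stored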

    Simultaneous localisation and mapping on a multi-degree of freedom biomimetic whiskered robot

    A biomimetic mobile robot called “Shrewbot” has been built as part of a neuroethological study of the mammalian facial whisker sensory system. This platform has been used to further evaluate the problem space of whisker-based tactile Simultaneous Localisation And Mapping (tSLAM). Shrewbot uses a biomorphic 3-dimensional array of active whiskers and a model of action selection based on tactile sensory attention to explore a circular walled arena sparsely populated with simple geometric shapes. Datasets taken during this exploration have been used to parameterise an approach to localisation and mapping based on probabilistic occupancy grids. We present the results of this work and conclude that simultaneous localisation and mapping is possible given only noisy odometry and tactile information from a 3-dimensional array of active biomimetic whiskers, with no prior information about features in the environment.
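
    A minimal Python sketch of the probabilistic occupancy-grid idea underlying tSLAM is given below. Each whisker contact raises the log-odds of occupancy at the contact cell, and cells the whisker swept through without contact are marked more likely free. This is generic occupancy-grid mapping under assumed log-odds increments, not the paper's exact parameterisation.

        import numpy as np

        L_OCC, L_FREE = 0.9, -0.4          # assumed log-odds increments
        grid = np.zeros((100, 100))         # log-odds map, 0 = unknown (p = 0.5)

        def update_cell(grid, x, y, contact):
            grid[x, y] += L_OCC if contact else L_FREE

        def occupancy_prob(grid):
            return 1.0 / (1.0 + np.exp(-grid))   # log-odds -> probability

        # One whisker sweep: swept-through cells were free, the tip hit a wall.
        for cell in [(50, 40), (50, 41), (50, 42)]:
            update_cell(grid, *cell, contact=False)
        update_cell(grid, 50, 43, contact=True)
        print(occupancy_prob(grid)[50, 40:44].round(2))  # [0.4 0.4 0.4 0.71]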

    Scaling a hippocampus model with GPU parallelisation and test-driven refactoring

    The hippocampus is the brain area used for localisation, mapping and episodic memory. Humans and animals can outperform robotic systems in these tasks, so functional models of hippocampus may be useful to improve robotic navigation, such as for self-driving cars. Previous work developed a biologically plausible model of hippocampus based on the Unitary Coherent Particle Filter (UCPF) and a Temporal Restricted Boltzmann Machine, which was able to learn to navigate around small test environments. However, it was implemented in serial software, which becomes very slow as the environments and numbers of neurons scale up. Modern GPUs can parallelise execution of neural networks. The present Neural Software Engineering study develops a GPU-accelerated version of the UCPF hippocampus software, using the formal Software Engineering techniques of profiling, optimisation and test-driven refactoring. Results show that the model can greatly benefit from parallel execution, which may enable it to scale from toy environments and applications to real-world ones such as self-driving car navigation. The refactored parallel code is released to the community as open source software as part of this publication.
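
    As a rough illustration of why such models benefit from GPUs, the Python sketch below expresses the hidden-unit update of a Restricted Boltzmann Machine as a single batched matrix multiply instead of per-neuron serial loops; swapping the numpy import for cupy would run the same code on a GPU. Shapes and names are assumptions for the example, not the paper's UCPF/Temporal RBM implementation.

        import numpy as np  # `import cupy as np` would target a GPU instead

        rng = np.random.default_rng(0)
        W = rng.standard_normal((512, 1024)) * 0.01   # visible -> hidden weights
        b = np.zeros(1024)                            # hidden biases
        v = rng.random((64, 512))                     # batch of 64 visible vectors

        def hidden_probs(v, W, b):
            # One fused matmul replaces 64 * 1024 independent neuron updates,
            # which is what a serial implementation computes one at a time.
            return 1.0 / (1.0 + np.exp(-(v @ W + b)))

        h = hidden_probs(v, W, b)
        print(h.shape)  # (64, 1024)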

    A unified neural model explaining optimal multi-guidance coordination in insect navigation

    The robust navigation of insects arises from the coordinated action of concurrently functioning and interacting guidance systems. Computational models of specific brain regions can account for isolated behaviours such as path integration or route following, but the neural mechanisms by which their outputs are coordinated remain unknown. In this work, a functional modelling approach was taken to identify and model the elemental guidance subsystems required by homing insects. We then produced realistic adaptive behaviours by integrating the outputs of the different guidance systems in a biologically constrained unified model mapped onto identified neural circuits. Homing paths are quantitatively and qualitatively compared with real ant data in a series of simulation studies replicating key in-field experiments. Our analysis reveals that insects require independent visual homing and route following capabilities, which we show can be realised by encoding panoramic skylines in the frequency domain, using image processing circuits in the optic lobe and learning pathways through the Mushroom Bodies (MB) and Anterior Optic Tubercle (AOTU) to Bulb (BU) respectively, before converging in the Central Complex (CX) steering circuit. Further, we demonstrate that a ring attractor network inspired by firing patterns recorded in the CX can optimally integrate the outputs of path integration and visual homing systems, guiding simulated ants back to their familiar route, and that a simple non-linear weighting function driven by the output of the MB provides a context-dependent switch, allowing route following strategies to dominate and the learned route to be retraced back to the nest when familiar terrain is encountered. The resultant unified model of insect navigation reproduces behavioural data from a series of cue conflict experiments in realistic animal environments and offers testable hypotheses of where and how insects process visual cues, utilise the different information that they provide and coordinate their outputs to achieve the adaptive behaviours observed in the wild. These results advance the case for a distributed architecture of the insect navigational toolkit. This unified model was then further validated by modelling the olfactory navigation of flies and ants. With simple adaptations of the sensory inputs, the model reproduces the main characteristics of the observed behavioural data, further demonstrating the useful role played by the sensory-processing-to-CX-to-motor pathway in generating context-dependent coordination behaviours. In addition, this model helps to complete the unified model of insect navigation by adding olfactory cues, which are among the most crucial cues for insects.
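
    The coordination principle described above can be sketched in a few lines of Python: directional cues are combined as certainty-weighted vectors (the computation a ring attractor effectively performs), and an MB-driven familiarity signal gates route following in through a non-linear switch. The weights, gain and threshold below are illustrative assumptions, not values from the paper's biologically constrained model.

        import numpy as np

        def integrate(headings, weights):
            # Weighted circular mean: each cue votes with a vector whose length
            # encodes its certainty; the resultant angle is the steering command.
            vec = sum(w * np.exp(1j * h) for h, w in zip(headings, weights))
            return np.angle(vec)

        def mb_gate(familiarity, k=10.0, theta=0.5):
            # Non-linear switch: on familiar terrain, route following dominates.
            return 1.0 / (1.0 + np.exp(-k * (familiarity - theta)))

        pi_heading, vh_heading, rf_heading = 0.0, 0.6, 1.2   # radians
        g = mb_gate(familiarity=0.9)                          # ~1 on a familiar route
        steer = integrate([pi_heading, vh_heading, rf_heading],
                          [(1 - g) * 0.7, (1 - g) * 0.3, g])
        print(round(steer, 2))  # close to rf_heading when terrain is familiar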