
    Geometry and Topology in Memory and Navigation

    Okinawa Institute of Science and Technology Graduate University, Doctor of Philosophy. Geometry and topology offer rich mathematical worlds and perspectives with which to study and improve our understanding of cognitive function. Here I present the following examples: (1) a functional role for inhibitory diversity in associative memories with graphical relationships; (2) improved memory capacity in an associative memory model with setwise connectivity, with implications for glial and dendritic function; (3) safe and efficient group navigation among conspecifics using purely local geometric information; and (4) enhanced geometric and topological methods to probe the relations between neural activity and behaviour. In each work, tools and insights from geometry and topology are used in essential ways to gain improved insights or performance. This thesis contributes to our knowledge of the potential computational affordances of biological mechanisms (such as inhibition and setwise connectivity), while also demonstrating new geometric and topological methods and perspectives with which to deepen our understanding of cognitive tasks and their neural representations. Doctoral thesis.

    Coherence and recurrency: maintenance, control and integration in working memory

    Working memory (WM), including a ‘central executive’, is used to guide behavior by internal goals or intentions. We suggest that WM is best described as a set of three interdependent functions which are implemented in the prefrontal cortex (PFC). These functions are maintenance, control of attention and integration. A model for the maintenance function is presented, and we will argue that this model can be extended to incorporate the other functions as well. Maintenance is the capacity to briefly maintain information in the absence of corresponding input, and even in the face of distracting information. We will argue that maintenance is based on recurrent loops between PFC and posterior parts of the brain, and probably within PFC as well. In these loops information can be held temporarily in an active form. We show that a model based on these structural ideas is capable of maintaining a limited number of neural patterns. Not the size, but the coherence of patterns (i.e., a chunking principle based on synchronous firing of interconnected cell assemblies) determines the maintenance capacity. A mechanism that optimizes coherent pattern segregation also poses a limit to the number of assemblies (about four) that can concurrently reverberate. Top-down attentional control (in perception, action and memory retrieval) can be modelled by the modulation and re-entry of top-down information to posterior parts of the brain. Hierarchically organized modules in PFC create the possibility for information integration. We argue that large-scale multimodal integration of information creates an ‘episodic buffer’, and may even suffice for implementing a central executive.
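
    As a concrete illustration of the maintenance idea, the sketch below (a hypothetical toy in Python, not the authors' model) hand-wires a small recurrent network in which a mutually excitatory assembly keeps reverberating after its external input is withdrawn, holding the pattern in an active form while background units stay silent.

```python
# Minimal sketch, not the authors' model: persistent activity in a hand-wired
# recurrent assembly. All sizes and weights are hypothetical choices.
import numpy as np

N = 20
assembly = np.arange(5)                    # a hypothetical cell assembly of 5 units
W = -0.05 * np.ones((N, N))                # weak global inhibition between all units
W[np.ix_(assembly, assembly)] = 0.5        # mutual excitation within the assembly
np.fill_diagonal(W, 0.0)                   # no self-connections

rate = np.zeros(N)
for t in range(60):
    external = np.zeros(N)
    if t < 10:
        external[assembly] = 1.0           # transient external input, then silence
    rate = np.clip(0.8 * rate + W @ rate + external, 0.0, 1.0)

print("assembly activity after input offset:", round(float(rate[assembly].mean()), 2))
print("background activity:", round(float(np.delete(rate, assembly).mean()), 2))
```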

    Electrophysiological evidence for memory schemas in the rat hippocampus

    According to Piaget and Bartlett, learning involves both assimilation of new memories into networks of preexisting knowledge and alteration of existing networks to accommodate new information into existing schemas. Recent evidence suggests that the hippocampus integrates related memories into schemas that link representations of separately acquired experiences. In this thesis, I first review models for how memories of individual experiences become consolidated into the structure of world knowledge. Disruption of consolidated memories can occur during related learning, which suggests that consolidation of new information is the reconsolidation of related memories. The accepted role of the hippocampus during memory consolidation and reconsolidation suggests that it is also involved in modifying appropriate schemas during learning. To study schema development, I trained rats to retrieve rewards at different loci on a maze while recording hippocampal cells. About a quarter of cells were active at multiple goal sites, though the ensemble as a whole distinguished goal loci from one another. When new goals were introduced, cells that had been active at old goal locations began firing at the new locations. This initial generalization decreased in the days after learning. Learning also caused changes in firing patterns at well-learned goal locations. These results suggest that learning was supported by modification of an active schema of spatially related reward loci. In another experiment, I extended these findings to explore a schema of object and place associations. Ensemble activity was influenced by a hierarchy of task dimensions, including the experimental context, the rat's spatial location, the reward potential, and the identity of sampled objects. As rats learned about new objects, the cells that had previously fired for particular object-place conjunctions generalized their firing patterns to new conjunctions that similarly predicted reward. In both experiments, I observed highly structured representations for a set of related experiences. This organization of hippocampal activity counters key assumptions in standard models of hippocampal function that predict relative independence between memory traces. Instead, these findings reveal neural mechanisms for how the hippocampus develops a relational organization of memories that could support novel, inferential judgments between indirectly related events.

    Investigating the storage capacity of a network with cell assemblies

    Cell assemblies are co-operating groups of neurons believed to exist in the brain. Their existence was proposed by the neuropsychologist D.O. Hebb, who also formulated a mechanism by which they could form, now known as Hebbian learning. Evidence for the existence of Hebbian learning and cell assemblies in the brain is accumulating as investigation tools improve. Researchers have also simulated cell assemblies as neural networks in computers. This thesis describes simulations of networks of cell assemblies. The feasibility of simulated cell assemblies that possess all the predicted properties of biological cell assemblies is established. Cell assemblies can be coupled together with weighted connections to form hierarchies in which a group of basic assemblies, termed primitives, are connected in such a way that they form a compound cell assembly. The component assemblies of these hierarchies can be ignited independently, i.e. they are activated due to signals being passed entirely within the network, but if a sufficient number of them are activated, they co-operate to ignite the remaining primitives in the compound assembly. Various experiments are described in which networks of simulated cell assemblies are subject to external activation, involving cells in those assemblies being stimulated artificially to a high level. These cells then fire, i.e. produce a spike of activity analogous to the spiking of biological neurons, and in this way pass their activity to other cells. Connections are established, by learning in some experiments and set artificially in others, between cells within primitives and in different ones, and these connections allow activity to pass from one primitive to another. In this way, activating one or more primitives may cause others to ignite. Experiments are described in which spontaneous activation of cells aids recruitment of uncommitted cells to a neighbouring assembly. The strong relationship between cell assemblies and Hopfield nets is described. A network of simulated cells can support different numbers of assemblies depending on the complexity of those assemblies. Assemblies are classified in terms of how many primitives are present in each compound assembly and the minimum number needed to complete it. A 2-3 assembly contains 3 primitives, any 2 of which will complete it. A network of N cells can hold on the order of N 2-3 assemblies, and an architecture is proposed that contains O(N²) 3-4 assemblies. Experiments are described that show the number of connections emanating from each cell must be scaled up linearly as the number of primitives in any network increases in order to maintain the same mean number of connections between each primitive. Restricting each cell to a maximum number of connections leads to severe loss of performance as the size of the network increases. It is shown that the architecture can be duplicated with Hopfield nets, but that there are severe restrictions on the carrying capacity of either a hierarchy of cell assemblies or a Hopfield net storing 3-4 patterns, and that the promise of N² patterns is largely illusory. When the number of connections from each cell is fixed as the number of primitives is increased, only O(N) cell assemblies can be stored.
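
    To illustrate the Hopfield-net connection mentioned above, the sketch below (a minimal toy, not the thesis simulations) stores a handful of random binary patterns with a Hebbian outer-product rule and completes a corrupted cue, the same pattern-completion behaviour attributed to cell assemblies; network size and pattern count are hypothetical.

```python
# Minimal sketch: Hebbian storage and recall in a small Hopfield network.
import numpy as np

rng = np.random.default_rng(0)
N = 100                      # number of cells (hypothetical)
P = 5                        # number of stored patterns (well below ~0.14 * N)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product learning; zero the diagonal (no self-connections).
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(cue, steps=20):
    """Synchronously update states until convergence or the step limit."""
    s = cue.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Corrupt 10% of one stored pattern and check that recall completes it.
cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1
overlap = recall(cue) @ patterns[0] / N
print(f"overlap with stored pattern after recall: {overlap:.2f}")
```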

    Searching basic units in memory traces: associative memory cells [version 1; peer review: 2 approved, 1 approved with reservations]

    The acquisition of associated signals is commonly seen in life. The integrative storage of these exogenous and endogenous signals is essential for cognition, emotion and behaviors. In terms of basic units of memory traces or engrams, associative memory cells are recruited in the brain during learning, cognition and emotional reactions. The recruitment and refinement of associative memory cells facilitate the retrieval of memory-relevant events and the learning of reorganized unitary signals that have been acquired. The recruitment of associative memory cells is fulfilled by generating mutual synapse innervations among them in coactivated brain regions. Their axons innervate downstream neurons convergently and divergently to recruit secondary associative memory cells. Mutual synapse innervations among associative memory cells confer the integrative storage and reciprocal retrieval of associated signals. Their convergent synapse innervations to secondary associative memory cells endorse integrative cognition. Their divergent innervations to secondary associative memory cells grant multiple applications of associated signals. Associative memory cells in memory traces are defined to be nerve cells that are able to encode multiple learned signals and receive synapse innervations carrying these signals. An impairment in the recruitment and refinement of associative memory cells will lead to memory deficits associated with neurological diseases and psychological disorders. This review presents a comprehensive diagram for the recruitment and refinement of associative memory cells for memory-relevant events in a lifetime.
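
    As a loose computational analogue of the reciprocal retrieval described here (not the biological model in the review), the sketch below uses a bidirectional associative memory: paired signals are stored with a Hebbian outer product, and either member of a pair can then be recovered from the other; all dimensions and pattern counts are hypothetical.

```python
# Minimal sketch: bidirectional associative memory for paired signals.
import numpy as np

rng = np.random.default_rng(2)
n_a, n_b, n_pairs = 32, 24, 3
A = rng.choice([-1, 1], size=(n_pairs, n_a))   # patterns for one learned signal
B = rng.choice([-1, 1], size=(n_pairs, n_b))   # patterns for the paired signal

W = A.T @ B                                    # Hebbian pairing of co-activated signals
sign = lambda x: np.where(x >= 0, 1, -1)

recalled_B = sign(A[0] @ W)                    # forward retrieval: signal A -> signal B
recalled_A = sign(W @ B[0])                    # reciprocal retrieval: signal B -> signal A

print("overlap of recalled B with stored B:", (recalled_B @ B[0]) / n_b)
print("overlap of recalled A with stored A:", (recalled_A @ A[0]) / n_a)
```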

    The hearing hippocampus

    The hippocampus has a well-established role in spatial and episodic memory but a broader function has been proposed including aspects of perception and relational processing. Neural bases of sound analysis have been described in the pathway to auditory cortex, but wider networks supporting auditory cognition are still being established. We review what is known about the role of the hippocampus in processing auditory information, and how the hippocampus itself is shaped by sound. In examining imaging, recording, and lesion studies in species from rodents to humans, we uncover a hierarchy of hippocampal responses to sound including during passive exposure, active listening, and the learning of associations between sounds and other stimuli. We describe how the hippocampus' connectivity and computational architecture allow it to track and manipulate auditory information – whether in the form of speech, music, or environmental, emotional, or phantom sounds. Functional and structural correlates of auditory experience are also identified. The extent of auditory-hippocampal interactions is consistent with the view that the hippocampus makes broad contributions to perception and cognition, beyond spatial and episodic memory. More deeply understanding these interactions may unlock applications including entraining hippocampal rhythms to support cognition, and intervening in links between hearing loss and dementia.

    Neural processes underpinning episodic memory

    Episodic memory is the memory for our personal past experiences. Although numerous functional magnetic resonance imaging (fMRI) studies investigating its neural basis have revealed a consistent and distributed network of associated brain regions, surprisingly little is known about the contributions individual brain areas make to the recollective experience. In this thesis I address this fundamental issue by employing a range of different experimental techniques including neuropsychological testing, virtual reality environments, whole brain and high spatial resolution fMRI, and multivariate pattern analysis. Episodic memory recall is widely agreed to be a reconstructive process, one that is known to be critically reliant on the hippocampus. I therefore hypothesised that the same neural machinery responsible for reconstruction might also support ‘constructive’ cognitive functions such as imagination. To test this proposal, patients with focal damage to the hippocampus bilaterally were asked to imagine new experiences and were found to be impaired relative to matched control participants. Moreover, driving this deficit was a lack of spatial coherence in their imagined experiences, pointing to a role for the hippocampus in binding together the disparate elements of a scene. A subsequent fMRI study involving healthy participants compared the recall of real memories with the construction of imaginary memories. This revealed a fronto-temporo-parietal network in common to both tasks that included the hippocampus, ventromedial prefrontal, retrosplenial and parietal cortices. Based on these results I advanced the notion that this network might support the process of ‘scene construction’, defined as the generation and maintenance of a complex and coherent spatial context. Furthermore, I argued that this scene construction network might underpin other important cognitive functions besides episodic memory and imagination, such as navigation and thinking about the future. It has been proposed that spatial context may act as the scaffold around which episodic memories are built. Given that the hippocampus appears to play a critical role in imagination by supporting the creation of a rich coherent spatial scene, I sought to explore the nature of this hippocampal spatial code in a novel way. By combining high spatial resolution fMRI with multivariate pattern analysis techniques it proved possible to accurately determine where a subject was located in a virtual reality environment based solely on the pattern of activity across hippocampal voxels. For this to have been possible, the hippocampal population code must be large and non-uniform. I then extended these techniques to the domain of episodic memory by showing that individual memories could be accurately decoded from the pattern of activity across hippocampal voxels, thus identifying individual memory traces. I consider these findings together with other recent advances in the episodic memory field, and present a new perspective on the role of the hippocampus in episodic recollection. I discuss how this new (and preliminary) framework compares with current prevailing theories of hippocampal function, and suggest how it might account for some previously contradictory data.
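
    The decoding logic can be sketched as follows; this toy uses synthetic 'voxel' patterns and a generic linear classifier from scikit-learn rather than the thesis's actual fMRI data or analysis pipeline, and all sizes and noise levels are hypothetical.

```python
# Minimal sketch: decoding location labels from (synthetic) voxel patterns.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_voxels, n_locations, n_trials = 200, 4, 40

# Each location is assumed to evoke a weak, distributed voxel pattern plus noise.
location_patterns = rng.normal(0, 1, size=(n_locations, n_voxels))
labels = np.repeat(np.arange(n_locations), n_trials)
data = location_patterns[labels] + rng.normal(0, 3, size=(labels.size, n_voxels))

# Cross-validated classification accuracy, compared against chance level.
clf = LinearSVC(C=1.0, max_iter=10000)
scores = cross_val_score(clf, data, labels, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f} (chance = {1 / n_locations:.2f})")
```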

    Synesthetic Sensor Fusion via a Cross-Wired Artificial Neural Network.

    The purpose of this interdisciplinary study was to examine the behavior of two artificial neural networks cross-wired based on the synesthesia cross-wiring hypothesis. Motivation for the study was derived from the study of psychology, robotics, and artificial neural networks, with perceivable application in the domain of mobile autonomous robotics, where sensor fusion is a current research topic. This model of synesthetic sensor fusion does not exhibit synesthetic responses. However, it was observed that cross-wiring two independent networks does not change the functionality of the individual networks, but allows the inputs to one network to partially determine the outputs of the other network in some cases. Specifically, there are measurable influences of network A on network B, and yet network B retains its ability to respond independently.
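
    A minimal sketch of the cross-wiring idea, with hypothetical layer sizes and random weights rather than the study's trained networks: a weak cross-connection from network A's hidden layer feeds network B's output, so A's input can bias B's response while B still operates on its own.

```python
# Minimal sketch: two independent feedforward nets with a weak cross-wire.
import numpy as np

rng = np.random.default_rng(3)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Independent random weights for networks A and B (4 inputs -> 6 hidden -> 2 outputs).
W_A_in, W_A_out = rng.normal(size=(6, 4)), rng.normal(size=(2, 6))
W_B_in, W_B_out = rng.normal(size=(6, 4)), rng.normal(size=(2, 6))
W_cross = 0.3 * rng.normal(size=(2, 6))          # weak A-hidden -> B-output cross-wiring

def forward(x_a, x_b):
    h_a = sigmoid(W_A_in @ x_a)
    h_b = sigmoid(W_B_in @ x_b)
    out_a = sigmoid(W_A_out @ h_a)
    out_b = sigmoid(W_B_out @ h_b + W_cross @ h_a)   # B's output partly shaped by A
    return out_a, out_b

x_b = rng.random(4)
_, out_b_1 = forward(rng.random(4), x_b)
_, out_b_2 = forward(rng.random(4), x_b)             # same B input, different A input
print("shift in B's output caused by A's input:", np.abs(out_b_1 - out_b_2))
```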

    Content-based retrieval of melodies using artificial neural networks

    Human listeners are capable of spontaneously organizing and remembering a continuous stream of musical notes. A listener automatically segments a melody into phrases, from which an entire melody may be learnt and later recognized. This ability makes human listeners ideal for the task of retrieving melodies by content. This research introduces two neural networks, known as SONNET-MAP and ReTREEve, which attempt to model this behaviour. SONNET-MAP functions as a melody segmenter, whereas ReTREEve is specialized towards content-based retrieval (CBR). Typically, CBR systems represent melodies as strings of symbols drawn from a finite alphabet, thereby reducing the retrieval process to the task of approximate string matching. SONNET-MAP and ReTREEve, which are derived from Nigrin’s SONNET architecture, offer a novel approach to these traditional systems, and indeed CBR in general. Based on melodic grouping cues, SONNET-MAP segments a melody into phrases. Parallel SONNET modules form independent, sub-symbolic representations of the pitch and rhythm dimensions of each phrase. These representations are then bound using associative maps, forming a two-dimensional representation of each phrase. This organizational scheme enables SONNET-MAP to segment melodies into phrases using both the pitch and rhythm features of each melody. The boundary points formed by these melodic phrase segments are then utilized to populate the ReTREEve network. ReTREEve is organized in the same parallel fashion as SONNET-MAP. However, in addition, melodic phrases are aggregated by an additional layer, thus forming a two-dimensional, hierarchical memory structure of each entire melody. Melody retrieval is accomplished by matching input queries, whether perfect (for example, a fragment from the original melody) or imperfect (for example, a fragment derived from humming), against learned phrases and phrase sequence templates. Using a sample of fifty melodies composed by The Beatles, results show that the use of both pitch and rhythm during the retrieval process significantly improves retrieval results over networks that use only pitch or only rhythm. Additionally, queries that are aligned along phrase boundaries are retrieved using significantly fewer notes than those that are not, thus indicating the importance of a human-based approach to melody segmentation. Moreover, depending on query degradation, different melodic features prove more adept at retrieval than others. The experiments presented in this thesis represent the largest empirical test of SONNET-based networks ever performed. As far as we are aware, the combined SONNET-MAP and ReTREEve networks constitute the first self-organizing CBR system capable of automatic segmentation and retrieval of melodies using various features of pitch and rhythm.
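
    For contrast with the string-based CBR systems described above (and not as an implementation of SONNET-MAP or ReTREEve), here is a minimal sketch of the traditional approach: melodies are encoded as pitch-interval strings and ranked by edit distance to a query fragment; the corpus and query below are hypothetical toy data.

```python
# Minimal sketch: approximate string matching over pitch-interval encodings.
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def intervals(midi_pitches):
    """Encode a melody as successive pitch intervals (transposition-invariant)."""
    return tuple(b - a for a, b in zip(midi_pitches, midi_pitches[1:]))

# Hypothetical toy corpus: opening pitches of two melodies (MIDI note numbers).
corpus = {
    "melody_A": intervals([60, 62, 64, 65, 67, 67, 69, 67]),
    "melody_B": intervals([67, 64, 64, 65, 62, 62, 60, 62]),
}
query = intervals([62, 64, 65, 67, 67])   # an imperfect fragment of melody_A

ranked = sorted(corpus, key=lambda name: edit_distance(query, corpus[name]))
print("best match:", ranked[0])
```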