
    Improving Associative Memory in a Network of Spiking Neurons

    In this thesis we use computational neural network models to examine the dynamics and functionality of the CA3 region of the mammalian hippocampus. The emphasis of the project is to investigate how the dynamic control structures provided by inhibitory circuitry and cellular modification may affect the CA3 region during the recall of previously stored information. The CA3 region is commonly thought to operate as a recurrent auto-associative neural network because of its neurophysiological characteristics, such as recurrent collaterals, strong and sparse synapses from external inputs, and plasticity between coactive cells. Associative memory models have been developed using various configurations of mathematical artificial neural networks, first developed over 40 years ago. Within these models, information is stored via changes in the strength of connections between simplified two-state model neurons, and memories can be recalled when a noisy or partial cue is instantiated upon the net. The type of information they can store is quite limited by the simplicity of the hard-limiting nodes, which are commonly associated with a binary activation threshold. We build a much more biologically plausible model with complex spiking cell models and realistic synaptic properties between cells, based on some of the many details we now know of the neuronal circuitry of the CA3 region. We implemented the model in computer software using NEURON and MATLAB and tested it by running simulations of storage and recall in the network. By building this model we gain new insights into how different types of neurons, and the complex circuits they form, actually work.

    The mammalian brain consists of complex resistive-capacitive electrical circuitry formed by the interconnection of large numbers of neurons. A principal cell type within the cortex is the pyramidal cell, the main information processor in these neural networks. Pyramidal cells are surrounded by diverse, proportionally smaller populations of interneurons, which form connections with pyramidal cells and with other inhibitory cells. By building detailed computational models of recurrent neural circuitry we explore how these microcircuits of interneurons control the flow of information through pyramidal cells and regulate the efficacy of the network. We also explore the effect of cellular modification due to neuronal activity, and the effect of incorporating spatially dependent connectivity, on the network during recall of previously stored information. In particular, we implement a spiking neural network proposed by Sommer and Wennekers (2001), and we consider methods for improving associative memory recall inspired by the work of Graham and Willshaw (1995), who applied mathematical transforms to an artificial neural network to improve recall quality. The networks tested contain either 100 or 1000 pyramidal cells, with 10% connectivity applied, a partial cue instantiated, and global pseudo-inhibition.

    We investigate three methods. First, we apply localised disynaptic inhibition, which proportionally scales the excitatory postsynaptic potentials and provides a fast-acting reversal potential; this should reduce the variability in signal propagation between cells and provide further inhibition to help synchronise network activity. Second, we add a persistent sodium channel to the cell body, which makes the activation threshold non-linear: beyond a given membrane potential the amplitude of the excitatory postsynaptic potential (EPSP) is boosted, pushing cells that receive slightly more excitation (most likely high units) over the firing threshold. Finally, we incorporate spatial characteristics of the dendritic tree, which increases the probability that a modified synapse exists after 10% random connectivity has been applied throughout the network. We apply these spatial characteristics by scaling the conductance weights of excitatory synapses to simulate the loss of potential at synapses in the outer dendritic regions due to increased resistance.

    To further increase the biological plausibility of the network we remove the pseudo-inhibition and apply realistic basket cell models, in differing configurations, as a global inhibitory circuit. The networks are configured with: a single basket cell providing feedback inhibition; 10% basket cells providing feedback inhibition, where 10 pyramidal cells connect to each basket cell; and 100% basket cells providing feedback inhibition. These configurations are compared and contrasted for recall quality and their effect on network behaviour. We have found promising results from applying biologically plausible recall strategies and network configurations, suggesting that inhibition and cellular dynamics play a pivotal role in learning and memory.
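A minimal sketch of the classical two-state associative memory this thesis builds on is given below: sparse binary patterns are stored with a clipped Hebbian (Willshaw-style) rule over 10% random connectivity, and one pattern is recalled from a partial cue by letting only the most strongly driven units fire. The network size, pattern statistics and the top-k firing rule are illustrative assumptions, not the parameters or recall strategies evaluated in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000   # two-state (binary) units, standing in for pyramidal cells
M = 20     # number of stored patterns
K = 100    # active units per pattern (sparse coding)

# Hypothetical 10% random connectivity mask, loosely mirroring the
# connectivity level mentioned in the abstract.
connect = (rng.random((N, N)) < 0.10).astype(np.uint8)
np.fill_diagonal(connect, 0)

# Clipped Hebbian (Willshaw-style) storage: an existing synapse is switched
# on if its pre- and post-units are coactive in any stored pattern.
patterns = np.zeros((M, N), dtype=np.uint8)
for p in patterns:
    p[rng.choice(N, K, replace=False)] = 1

W = np.zeros((N, N), dtype=np.uint8)
for p in patterns:
    W |= np.outer(p, p)
W &= connect

def recall(cue, k_active=K):
    """One-step recall: sum the weighted input to every unit and let the
    k_active most strongly driven units fire (a crude stand-in for the
    global inhibition described in the abstract)."""
    drive = W.astype(np.int32) @ cue.astype(np.int32)
    out = np.zeros(N, dtype=np.uint8)
    out[np.argsort(drive)[-k_active:]] = 1
    return out

# Partial cue: keep only half of the active units of one stored pattern.
target = patterns[0]
cue = target.copy()
cue[rng.choice(np.flatnonzero(target), K // 2, replace=False)] = 0

recalled = recall(cue)
print("fraction of target units recovered:", (recalled & target).sum() / K)
```

The top-k rule here simply stands in for a global inhibition that limits how many units fire at once; the thesis replaces this kind of abstraction with explicit disynaptic inhibition, persistent sodium channels and basket cell circuits.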

    Network analysis of the cellular circuits of memory

    Intuitively, memory is conceived as a collection of static images that we accumulate as we experience the world. But in fact memories change constantly throughout our lives, shaped by our ongoing experiences. Assimilating new knowledge without corrupting pre-existing memories is therefore a critical brain function. However, learning and memory interact: prior knowledge can proactively influence learning, and new information can retroactively modify memories of past events. The hippocampus is a brain region essential for learning and memory, but the network-level operations that underlie the continuous integration of new experiences into memory, segregating them as discrete traces while enabling their interaction, are unknown. Here I show a network mechanism by which two distinct memories interact. Hippocampal CA1 neuron ensembles were monitored in mice as they explored a familiar environment before and after forming a new place-reward memory in a different environment. By employing a network science representation of the co-firing relationships among principal cells, I first found that new associative learning modifies the topology of the cells’ co-firing patterns representing the unrelated familiar environment. I further observed that these neuronal co-firing graphs evolved along three functional axes: the first segregated novelty; the second distinguished individual novel behavioural experiences; while the third revealed cross-memory interaction. Finally, I found that during this process, high activity principal cells rapidly formed the core representation of each memory, whereas low activity principal cells gradually joined co-activation motifs throughout individual experiences, enabling cross-memory interactions. These findings reveal an organizational principle of brain networks whereby high and low activity cells are differentially recruited into coactivity motifs as building blocks for the flexible integration and interaction of memories. Finally, I employ a set of manifold learning and related approaches to explore and characterise the complex neural population dynamics within CA1 that underlie simple exploration.
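A minimal sketch of how such a co-firing graph could be constructed from binned spike counts is shown below: pairwise Pearson correlations are thresholded into a networkx graph whose topology can then be summarised. The synthetic data, the planted assemblies, the bin width, the correlation measure and the edge threshold are all assumptions made for illustration and need not match the analysis used in the thesis.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)

# Toy binned spike counts: n_neurons x n_bins (e.g. 25 ms bins).
n_neurons, n_bins = 60, 2000
counts = rng.poisson(0.2, size=(n_neurons, n_bins)).astype(float)

# Two toy cell "assemblies" whose members tend to fire in the same bins,
# so there is some co-firing structure to detect.
events_a = rng.random(n_bins) < 0.05
events_b = rng.random(n_bins) < 0.05
counts[:15] += rng.poisson(2.0, size=(15, n_bins)) * events_a
counts[15:30] += rng.poisson(2.0, size=(15, n_bins)) * events_b

# Pairwise co-firing: Pearson correlation of the binned counts.
corr = np.corrcoef(counts)

# Keep only edges above an (arbitrary) correlation threshold.
G = nx.Graph()
G.add_nodes_from(range(n_neurons))
for i in range(n_neurons):
    for j in range(i + 1, n_neurons):
        if corr[i, j] > 0.1:
            G.add_edge(i, j, weight=corr[i, j])

# Simple summaries of the resulting co-firing topology.
print("edges:", G.number_of_edges())
print("mean clustering:", nx.average_clustering(G))
top_degree = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:5]
print("highest-degree cells:", top_degree)
```

Comparing summaries such as edge count, clustering or node degree across recording sessions is one simple way to quantify the kind of topological reorganisation described above.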

    The malleable brain: plasticity of neural circuits and behavior: A review from students to students

    One of the most intriguing features of the brain is its ability to be malleable, allowing it to adapt continually to changes in the environment. Specific neuronal activity patterns drive long-lasting increases or decreases in the strength of synaptic connections, referred to as long-term potentiation (LTP) and long-term depression (LTD), respectively. Such phenomena have been described in a variety of model organisms, which are used to study molecular, structural, and functional aspects of synaptic plasticity. This review originated from the first International Society for Neurochemistry (ISN) and Journal of Neurochemistry (JNC) Flagship School held in Alpbach, Austria (September 2016), and uses its curriculum and discussions as a framework to review some of the current knowledge in the field of synaptic plasticity. First, we describe the role of plasticity during development and the persistent changes of neural circuitry occurring when sensory input is altered during critical developmental stages. We then outline the signaling cascades resulting in the synthesis of new plasticity-related proteins, which ultimately enable sustained changes in synaptic strength. Going beyond the traditional understanding of synaptic plasticity conceptualized by LTP and LTD, we discuss system-wide modifications and recently unveiled homeostatic mechanisms, such as synaptic scaling. Finally, we describe the neural circuits and synaptic plasticity mechanisms driving associative memory and motor learning. Evidence summarized in this review provides a current view of synaptic plasticity in its various forms, offers new insights into the underlying mechanisms and behavioral relevance, and provides directions for future research in the field of synaptic plasticity.
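As a toy illustration of how Hebbian plasticity and homeostatic synaptic scaling can coexist in one cell, the sketch below lets a Hebbian term strengthen coactive synapses while a slow multiplicative scaling term pulls the output rate back toward a set point. Every rule and parameter here is a deliberately simplified assumption, not a model taken from the review.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy rate-based unit combining a Hebbian term (LTP/LTD-like) with
# multiplicative synaptic scaling toward a target output rate.
n_inputs = 100
w = rng.uniform(0.1, 0.5, n_inputs)   # synaptic weights
eta = 1e-3                            # Hebbian learning rate
target_rate = 5.0                     # homeostatic set point
tau_scale = 200.0                     # scaling time constant (in steps)

for step in range(5000):
    x = rng.poisson(2.0, n_inputs)    # presynaptic rates for this step
    y = w @ x / n_inputs              # postsynaptic rate (linear unit)

    # Hebbian term: coactivity strengthens synapses (LTP-like), while the
    # activity-dependent decay weakens them (LTD-like).
    w += eta * y * (x - 0.5 * w)

    # Synaptic scaling: all weights move up or down together so the
    # output rate drifts back toward the homeostatic set point.
    w *= 1.0 + (target_rate - y) / (target_rate * tau_scale)
    w = np.clip(w, 0.0, None)         # keep weights non-negative

print(f"final mean weight: {w.mean():.2f}, final output rate: {y:.2f}")
```

The key property of multiplicative scaling, preserved in this toy, is that the relative synaptic strengths set by Hebbian learning are retained while the cell's overall excitability is renormalised.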

    Computation in the high-conductance state


    Parvalbumin interneuron subpopulations: genetic characterization and hippocampal connectivity

    In this thesis, I focus on the heterogeneity of hippocampal CA1, uncovering molecular diversity in developmentally defined groups of parvalbumin-expressing basket cells. This molecular diversity is then integrated, through findings on differential connectivity, with previously described populations of CA1 principal cells and with cell-type-specific manipulations during behavioral testing.

    Functional Brain Oscillations: How Oscillations Facilitate Information Representation and Code Memories

    The overall aim of the modelling work within this thesis is to lend theoretical evidence to empirical findings from the brain oscillations literature. We therefore hope to solidify and expand the notion that precise spike timing through oscillatory mechanisms facilitates communication, learning, information processing and information representation within the brain. The primary hypothesis of this thesis is that it can be shown computationally that neural de-synchronisations can allow information content to emerge. We do this using two neural network models, the first of which shows how differential rates of neuronal firing can indicate when a single item is being actively represented. The second model expands this notion by creating a complementary timing mechanism, thus enabling the emergence of qualitative temporal information when a pattern of items is being actively represented. The secondary hypothesis of this thesis is that it can also be shown computationally that oscillations might play a functional role in learning. Both of the models presented within this thesis propose a sparsely coded and fast-learning hippocampal region that engages in the binding of novel episodic information. The first model demonstrates how active cortical representations enable learning to occur in their hippocampal counterparts via a phase-dependent learning rule. The second model expands this notion, creating hierarchical temporal sequences to encode the relative temporal position of cortical representations. We demonstrate in both of these models how cortical brain oscillations might provide a gating function to the representation of information, whilst complementary hippocampal oscillations might provide distinct phasic reference points for learning.
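The sketch below illustrates the flavour of a phase-dependent learning rule such as the one described above: a Hebbian coactivity term is gated by the phase of a theta-band oscillation, so coactivity near the oscillation peak potentiates while coactivity near the trough would depress. The firing rates, gating function and degree of phase locking are illustrative assumptions rather than the specific rules implemented in the thesis models.

```python
import numpy as np

rng = np.random.default_rng(3)

# Phase-gated Hebbian learning: weight changes are modulated by the phase
# of an ongoing theta oscillation (all parameters are illustrative).
n_pre, n_post = 50, 20
W = np.zeros((n_post, n_pre))
dt = 0.001                    # 1 ms time step
theta_freq = 8.0              # Hz
eta = 0.01

for step in range(2000):
    phase = 2 * np.pi * theta_freq * step * dt
    gate = np.cos(phase)      # +1 near the oscillation peak, -1 near the trough

    # Toy spiking: firing probability is itself phase-locked to theta, so
    # coactivity clusters near the peak, where the gate is positive.
    p_fire = 0.05 * (1 + 0.8 * np.cos(phase))
    pre = (rng.random(n_pre) < p_fire).astype(float)
    post = (rng.random(n_post) < p_fire).astype(float)

    # Hebbian coactivity term, gated by the oscillation phase.
    W += eta * gate * np.outer(post, pre)

print(f"mean weight after phase-gated learning: {W.mean():.4f}")
```

Because firing here is phase-locked, coactivity accumulates at phases where the gate is positive, producing net potentiation; this is one simple way an oscillation can act as a gate on when learning occurs.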