715 research outputs found

    Fast and robust learning by reinforcement signals: explorations in the insect brain

    Get PDF
    We propose a model for pattern recognition in the insect brain. Starting from the well-established body of knowledge about the insect brain, we investigate which of its features may be useful for learning input patterns rapidly and stably. The plasticity underlying pattern recognition is situated in the insect mushroom bodies and requires an error signal to associate the stimulus with a proper response. As a proof of concept, we used our model insect brain to classify the well-known MNIST database of handwritten digits, a popular benchmark for classifiers. We show that the structural organization of the insect brain appears to be suitable for both fast learning of new stimuli and reasonable performance in stationary conditions. Furthermore, it is extremely robust to damage to the brain structures involved in sensory processing. Finally, we suggest that spatiotemporal dynamics can improve the level of confidence in a classification decision. The proposed approach allows testing the effect of hypothesized mechanisms rather than speculating on their benefit for system performance or confidence in its responses.
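    As a concrete illustration of the mechanism this abstract describes, the following minimal sketch (not the authors' code; the layer sizes, sparsity, and delta-rule form are assumptions) shows error-signal-gated learning on top of a fixed sparse expansion, in the spirit of the mushroom-body architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_kc, n_out = 784, 2000, 10            # pixels, "Kenyon cells", classes (assumed sizes)

W_in = (rng.random((n_kc, n_in)) < 0.05).astype(float)   # fixed sparse random wiring
W_out = np.zeros((n_out, n_kc))                          # plastic output synapses

def kenyon_response(x, sparsity=0.05):
    """Sparse code: keep only the most strongly driven Kenyon cells."""
    h = W_in @ x
    k = int(sparsity * n_kc)
    thresh = np.partition(h, -k)[-k]
    return (h >= thresh).astype(float)

def train_step(x, label, lr=0.05):
    """Error-signal-gated plasticity at the mushroom-body output synapses."""
    global W_out
    kc = kenyon_response(x)
    y = W_out @ kc
    error = np.eye(n_out)[label] - y          # the reinforcement/error signal
    W_out += lr * np.outer(error, kc)
    return int(y.argmax())

pred = train_step(rng.random(n_in), label=3)  # one training step on a dummy input
```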

    Network self-organization explains the statistics and dynamics of synaptic connection strengths in cortex

    Get PDF
    The information processing abilities of neural circuits arise from their synaptic connection patterns. Understanding the laws governing these connectivity patterns is essential for understanding brain function. The overall distribution of synaptic strengths of local excitatory connections in cortex and hippocampus is long-tailed, exhibiting a small number of synaptic connections of very large efficacy. At the same time, new synaptic connections are constantly being created and individual synaptic connection strengths show substantial fluctuations across time. It remains unclear through what mechanisms these properties of neural circuits arise and how they contribute to learning and memory. In this study we show that fundamental characteristics of excitatory synaptic connections in cortex and hippocampus can be explained as a consequence of self-organization in a recurrent network combining spike-timing-dependent plasticity (STDP), structural plasticity and different forms of homeostatic plasticity. In the network, associative synaptic plasticity in the form of STDP induces rich-get-richer dynamics among synapses, while homeostatic mechanisms induce competition. Under distinctly different initial conditions, the ensuing self-organization produces long-tailed synaptic strength distributions matching experimental findings. We show that this self-organization can take place with a purely additive STDP mechanism and that multiplicative weight dynamics emerge as a consequence of network interactions. The observed patterns of fluctuation of synaptic strengths, including elimination and generation of synaptic connections and long-term persistence of strong connections, are consistent with the dynamics of dendritic spines found in rat hippocampus. Beyond this, the model predicts an approximately power-law scaling of the lifetimes of newly established synaptic connection strengths during development. Our results suggest that the combined action of multiple forms of neuronal plasticity plays an essential role in the formation and maintenance of cortical circuits.
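    A toy sketch of the interplay the abstract describes; the activity proxy, constants, and update schedule are illustrative assumptions, not the paper's model. Additive Hebbian growth creates rich-get-richer dynamics, per-neuron normalization supplies homeostatic competition, and weak synapses are pruned while new ones are created.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
mask = rng.random((n, n)) < 0.1                 # which synapses exist
np.fill_diagonal(mask, False)
w = np.where(mask, rng.exponential(0.01, (n, n)), 0.0)

for step in range(1000):
    # Additive STDP stand-in: potentiation scales with correlated activity;
    # total input weight serves as a crude activity proxy.
    rate = w.sum(axis=1)
    rate = rate / (rate.mean() + 1e-12)
    w += 1e-4 * np.outer(rate, rate) * mask
    # Homeostatic competition: keep each neuron's total input weight fixed.
    w *= 0.5 / (w.sum(axis=1, keepdims=True) + 1e-12)
    # Structural plasticity: eliminate weak synapses, create new ones.
    dead = mask & (w < 1e-5)
    mask &= ~dead
    w[dead] = 0.0
    new = ~mask & (rng.random((n, n)) < dead.sum() / n**2)
    np.fill_diagonal(new, False)
    mask |= new
    w[new] = 1e-4
```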

    Neuromorphic Computing Applications in Robotics

    Get PDF
    Deep learning achieves remarkable success through training on massive labeled datasets. However, this heavy demand for data makes deep learning impractical in edge-computing scenarios, which also suffer from data scarcity. Rather than relying on labeled data, animals learn by interacting with their surroundings and memorizing the relationships between events and objects. This learning paradigm is referred to as associative learning. Successfully implementing associative learning imitates the self-learning schemes of animals and thereby resolves these challenges of deep learning. Current state-of-the-art implementations of associative memory are limited to small-scale, offline simulations. This work therefore implements associative memory on an Unmanned Ground Vehicle (UGV) with neuromorphic hardware, specifically Intel's Loihi, in an online learning scenario. The system emulates classic associative learning in rats, with the UGV in place of the rats. Specifically, it successfully reproduces fear conditioning with no pretraining procedure or labeled datasets. The UGV autonomously learns the cause-and-effect relationship between a light stimulus and a vibration stimulus and exhibits a movement response to demonstrate the memorization. Hebbian learning dynamics are used to update the synaptic weights during the associative learning process. The Intel Loihi chip is integrated into this online learning system to process visual signals with a specialized neural assembly. While processing, Loihi's average power usage is 30 mW for computing logic and 29 mW for memory.
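    The conditioning loop described above reduces to a few lines. In this hedged sketch (the stimulus encoding, constants, and response rule are illustrative, not the paper's implementation), a fixed innate pathway from the vibration stimulus drives the movement response, and a Hebbian update transfers that association to the light stimulus.

```python
w_light = 0.0          # plastic CS -> response weight (learned)
w_vibration = 1.0      # fixed US -> response weight (innate)
lr, threshold = 0.2, 0.5

def trial(light, vibration):
    """One conditioning trial; returns True if the agent moves."""
    global w_light
    drive = w_light * light + w_vibration * vibration
    response = drive > threshold
    # Hebbian update: a co-active CS and response strengthen the association.
    w_light += lr * light * response
    return response

for _ in range(5):
    trial(light=1.0, vibration=1.0)          # paired trials (conditioning)
print(trial(light=1.0, vibration=0.0))       # light alone now triggers movement
```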

    The malleable brain: plasticity of neural circuits and behavior: A review from students to students

    Get PDF
    One of the most intriguing features of the brain is its ability to be malleable, allowing it to adapt continually to changes in the environment. Specific neuronal activity patterns drive long-lasting increases or decreases in the strength of synaptic connections, referred to as long-term potentiation (LTP) and long-term depression (LTD), respectively. Such phenomena have been described in a variety of model organisms, which are used to study molecular, structural, and functional aspects of synaptic plasticity. This review originated from the first International Society for Neurochemistry (ISN) and Journal of Neurochemistry (JNC) Flagship School held in Alpbach, Austria (September 2016), and will use its curriculum and discussions as a framework to review some of the current knowledge in the field of synaptic plasticity. First, we describe the role of plasticity during development and the persistent changes of neural circuitry occurring when sensory input is altered during critical developmental stages. We then outline the signaling cascades resulting in the synthesis of new plasticity-related proteins, which ultimately enable sustained changes in synaptic strength. Going beyond the traditional understanding of synaptic plasticity conceptualized by LTP and LTD, we discuss system-wide modifications and recently unveiled homeostatic mechanisms, such as synaptic scaling. Finally, we describe the neural circuits and synaptic plasticity mechanisms driving associative memory and motor learning. Evidence summarized in this review provides a current view of synaptic plasticity in its various forms, offers new insights into the underlying mechanisms and behavioral relevance, and provides directions for future research in the field of synaptic plasticity.
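    Among the homeostatic mechanisms the review discusses, synaptic scaling has a particularly compact form. The sketch below is a generic illustration (the target rate and time constant are assumptions, not values from the review): all of a neuron's input weights are scaled multiplicatively, preserving their relative strengths.

```python
import numpy as np

weights = np.array([0.2, 0.8, 0.4, 0.1])   # one neuron's input weights
target_rate, tau = 5.0, 100.0              # target firing rate (Hz), time constant

def scale(weights, observed_rate):
    """Multiplicative scaling: relative synaptic strengths are preserved."""
    factor = 1.0 + (target_rate - observed_rate) / (tau * target_rate)
    return weights * factor

print(scale(weights, observed_rate=2.0))   # prolonged low activity -> scale up
```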

    Analog Spiking Neuromorphic Circuits and Systems for Brain- and Nanotechnology-Inspired Cognitive Computing

    Get PDF
    Human society now faces grand challenges in satisfying the growing demand for computing power while containing energy consumption. With the end of CMOS technology scaling, innovations are required to tackle these challenges in a radically different way. Inspired by the emerging understanding of computation in the brain and by nanotechnology-enabled, biologically plausible synaptic plasticity, neuromorphic computing architectures are being investigated. A neuromorphic chip that combines CMOS analog spiking neurons with nanoscale resistive random-access memory (RRAM) used as electronic synapses can provide massive neural-network parallelism, high density, and online learning capability, and hence paves the path toward a promising solution for future energy-efficient real-time computing systems. However, existing silicon neuron approaches are designed to faithfully reproduce biological neuron dynamics, and hence are either incompatible with RRAM synapses or require extensive peripheral circuitry to modulate a synapse, leaving them deficient in learning capability. As a result, they eliminate most of the density advantage gained by adopting nanoscale devices and fail to realize a functional computing system. This dissertation describes novel hardware architectures and neuron circuit designs that synergistically assemble the fundamental and significant elements for brain-inspired computing. Versatile CMOS spiking neurons are presented that combine, in compact integrated-circuit modules, integrate-and-fire dynamics, drive capability for dense passive RRAM synapses, dynamic biasing for adaptive power consumption, in situ spike-timing-dependent plasticity (STDP), and competitive learning. Real-world pattern learning and recognition tasks using the proposed architecture were demonstrated with circuit-level simulations. A test chip was implemented and fabricated to verify the proposed CMOS neuron and hardware architecture, and the subsequent chip measurements successfully validated the idea. The work described in this dissertation realizes a key building block for large-scale integration of spiking neural network hardware and thus serves as a stepping stone toward next-generation energy-efficient brain-inspired cognitive computing systems.
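    A behavioral sketch of pair-based in situ STDP acting on a bounded RRAM conductance, as motivated above; the bounds, amplitudes, and time constant are illustrative assumptions rather than the dissertation's circuit parameters.

```python
import math

G_MIN, G_MAX = 1e-6, 1e-4            # conductance bounds in siemens (illustrative)
A_PLUS, A_MINUS, TAU = 5e-6, 5e-6, 20e-3

def stdp_update(g, t_pre, t_post):
    """Return the new RRAM conductance after one pre/post spike pairing."""
    dt = t_post - t_pre
    if dt >= 0:                      # pre before post: potentiate
        g += A_PLUS * math.exp(-dt / TAU)
    else:                            # post before pre: depress
        g -= A_MINUS * math.exp(dt / TAU)
    return min(max(g, G_MIN), G_MAX)

print(stdp_update(5e-5, t_pre=0.000, t_post=0.005))  # causal pairing: potentiation
```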

    More than synaptic plasticity: role of nonsynaptic plasticity in learning and memory.

    Get PDF
    Decades of research on the cellular mechanisms of memory have led to the widely held view that memories are stored as modifications of synaptic strength. These changes involve presynaptic processes, such as direct modulation of the release machinery, or postsynaptic processes, such as modulation of receptor properties. Parallel studies have revealed that memories might also be stored by nonsynaptic processes, such as modulation of voltage-dependent membrane conductances, which are expressed as changes in neuronal excitability. Although in some cases nonsynaptic changes can function as part of the engram itself, they might also serve as mechanisms through which a neural circuit is set to a permissive state to facilitate synaptic modifications that are necessary for memory storage.
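    A toy model makes the distinction concrete. In the sketch below (the constants and adaptation rule are illustrative assumptions), learning is expressed nonsynaptically, as a lowered spike threshold in an integrate-and-fire neuron, rather than as a change in any synaptic weight.

```python
class AdaptiveNeuron:
    """Integrate-and-fire neuron whose excitability, not its synapses, adapts."""

    def __init__(self, threshold=1.0):
        self.v = 0.0
        self.threshold = threshold          # intrinsic excitability variable

    def step(self, current, leak=0.9, eta=0.01):
        self.v = leak * self.v + current
        spike = self.v >= self.threshold
        if spike:
            self.v = 0.0
            # The memory trace: repeated firing lowers the threshold,
            # leaving the cell more excitable while synapses stay unchanged.
            self.threshold = max(self.threshold - eta, 0.5)
        return spike

neuron = AdaptiveNeuron()
for _ in range(50):
    neuron.step(current=0.2)
print(neuron.threshold)                     # lower than 1.0 after repeated firing
```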

    Long Sequence Hopfield Memory

    Full text link
    Sequence memory is an essential attribute of natural and artificial intelligence that enables agents to encode, store, and retrieve complex sequences of stimuli and actions. Computational models of sequence memory have been proposed in which recurrent Hopfield-like neural networks are trained with temporally asymmetric Hebbian rules. However, these networks suffer from limited sequence capacity (maximal length of the stored sequence) due to interference between the memories. Inspired by recent work on Dense Associative Memories, we expand the sequence capacity of these models by introducing a nonlinear interaction term that enhances separation between the patterns. We derive novel scaling laws for sequence capacity with respect to network size, significantly outperforming existing scaling laws for models based on traditional Hopfield networks, and verify these theoretical results with numerical simulations. Moreover, we introduce a generalized pseudoinverse rule to recall sequences of highly correlated patterns. Finally, we extend this model to store sequences with variable timing between state transitions and describe a biologically plausible implementation, with connections to motor neuroscience. Comment: NeurIPS 2023 Camera-Ready, 41 pages.
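    The core idea, sharpening pattern overlaps with a nonlinearity before driving the transition to the next stored pattern, can be sketched in a few lines. The form below (an odd power as the separation function, and all sizes) is an assumption in the spirit of Dense Associative Memories, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, n = 400, 50, 3                           # neurons, sequence length, power
xi = rng.choice([-1.0, 1.0], size=(T, N))      # stored sequence of binary patterns

def next_state(x):
    """One recall step x(t) -> x(t+1) with a power-law separation function."""
    overlaps = xi[:-1] @ x / N                 # overlap with each stored pattern
    drive = (overlaps ** n) @ xi[1:]           # nonlinearity suppresses crosstalk
    return np.sign(drive + 1e-12)              # tiny offset avoids sign(0)

x = xi[0].copy()
for t in range(1, 4):
    x = next_state(x)
    print(t, (x == xi[t]).mean())              # fraction of bits recalled correctly
```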

    Hebbian Wiring Plasticity Generates Efficient Network Structures for Robust Inference with Synaptic Weight Plasticity

    Get PDF
    In the adult mammalian cortex, a small fraction of spines are created and eliminated every day, and the resultant synaptic connection structure is highly nonrandom, even in local circuits. However, it remains unknown whether a particular synaptic connection structure is functionally advantageous in local circuits, and why the creation and elimination of synaptic connections are necessary in addition to rich synaptic weight plasticity. To answer these questions, we studied an inference task model through theoretical and numerical analyses. We demonstrate that a robustly beneficial network structure naturally emerges by combining Hebbian-type synaptic weight plasticity and wiring plasticity. Especially in a sparsely connected network, wiring plasticity achieves reliable computation by enabling efficient information transmission. Furthermore, the proposed rule reproduces the experimentally observed correlation between spine dynamics and task performance.
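    A schematic sketch of combining the two forms of plasticity the abstract contrasts (the Hebbian rule, thresholds, and rewiring scheme are illustrative assumptions, not the paper's model): weights evolve by a Hebbian update, while connections are eliminated when weak and recreated at random to hold the circuit at a fixed sparsity.

```python
import numpy as np

rng = np.random.default_rng(3)
n_pre, n_post, w_min = 100, 20, 1e-3
conn = rng.random((n_post, n_pre)) < 0.2       # existing connections
w = conn * rng.random((n_post, n_pre)) * 0.1   # their synaptic weights

def plasticity_step(x, y, lr=0.01):
    """Hebbian weight plasticity plus spine elimination and creation."""
    global conn, w
    w = np.clip(w + lr * np.outer(y, x) * conn, 0.0, 1.0)  # weight plasticity
    dead = conn & (w < w_min)                  # eliminate weak connections
    conn &= ~dead
    w[dead] = 0.0
    free = np.flatnonzero(~conn)               # recreate the same number at random
    k = min(int(dead.sum()), free.size)
    if k:
        idx = rng.choice(free, size=k, replace=False)
        conn.flat[idx] = True
        w.flat[idx] = w_min

plasticity_step(rng.random(n_pre), rng.random(n_post))
```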