
    Action selection in the rhythmic brain: The role of the basal ganglia and tremor.

    Low-frequency oscillatory activity has been the target of extensive research both in cortical structures and in the basal ganglia (BG), due to numerous reports of associations with brain disorders and with the normal functioning of the brain. Additionally, a wealth of evidence and theoretical work indicates that the BG might be the locus where conflicts between prospective actions are resolved. Although a number of computational models of the BG investigate these phenomena, they tend to focus on intrinsic oscillatory mechanisms, neglecting evidence that points to the cortex as the origin of this oscillatory behaviour. In this thesis, we construct a detailed neural model of the complete BG circuit based on fine-tuned spiking neurons, with both electrical and chemical synapses as well as short-term plasticity between structures. To do so, we build a complete suite of computational tools for the design, optimization and simulation of spiking neural networks. Our model successfully reproduces the firing and oscillatory behaviour found in both the healthy and the parkinsonian BG, and we used it to make a number of biologically plausible predictions. First, we investigate the influence of various cortical frequency bands on the intrinsic effective connectivity of the BG, as well as the role of the latter in regulating cortical behaviour. We found that effective connectivity indeed changes dramatically for different cortical frequency bands and phase offsets, which are able to modulate (or even block) information flow in the three major BG pathways. Our results indicate the existence of a multimodal gating mechanism at the level of the BG that can be entirely controlled by cortical oscillations, and provide evidence for the hypothesis of cortically entrained but locally generated subthalamic beta activity.
Next, we explore the relationship between the wave properties of entrained cortical inputs, dopamine and the transient effectiveness of the BG when viewed as an action selection device. We found that cortical frequency, phase, dopamine and the examined time scale all have a very important impact on the ability of our model to select. Our simulations resulted in a canonical profile of selectivity, which we termed selectivity portraits. Taken together, our results suggest that the cortex is the structure that determines whether action selection will be performed and what strategy will be utilized, while the role of the BG is to carry out this selection. Some frequency ranges promote the exploitation of actions whose outcome is known, others promote the exploration of new actions with high uncertainty, while the remaining frequencies simply deactivate selection. Based on this behaviour, we propose a metaphor according to which the basal ganglia can be viewed as the "gearbox" of the cortex. Coalitions of rhythmic cortical areas are able to switch between a repertoire of available BG modes which, in turn, change the course of information flow back to and within the cortex. In the same context, dopamine can be likened to the "control pedals" of action selection that either stop or initiate a decision. The frequency of active cortical areas that project to the BG then acts as a gear lever which, instead of controlling the type and direction of thrust that the throttle provides to an automobile, dictates the extent to which dopamine can trigger a decision, as well as what type of decision this will be. Finally, we identify a selection cycle with a period of around 200 ms, which we used to assess the biological plausibility of the most popular architectures in cognitive science.
Using extensions of the BG model, we further propose novel mechanisms that provide explanations for (1) the two distinctive dynamical behaviours of neurons in the external globus pallidus, and (2) the generation of resting tremor in Parkinson's disease. Our findings agree well with experimental observations, suggest new insights into the pathophysiology of specific BG disorders, provide new justifications for oscillatory phenomena related to decision making and reaffirm the role of the BG as the selection centre of the brain.
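
As a rough illustration of the spiking-neuron building block such a model is assembled from, here is a minimal leaky integrate-and-fire neuron in Python. The thesis uses fine-tuned neuron models with electrical and chemical synapses, so this is a generic textbook sketch and every parameter value below is an illustrative assumption:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a generic stand-in for the
# fine-tuned spiking neuron models described above. All parameter values are
# illustrative assumptions, not values from the thesis.

def simulate_lif(i_ext, dt=0.1, tau_m=10.0, v_rest=-65.0, v_thresh=-50.0,
                 v_reset=-65.0, r_m=10.0):
    """Euler-integrate dV/dt = (-(V - v_rest) + r_m * I) / tau_m.

    Returns the list of spike times (ms); the membrane potential is reset
    to v_reset after each threshold crossing.
    """
    v = v_rest
    spikes = []
    for step, i in enumerate(i_ext):
        v += dt * (-(v - v_rest) + r_m * i) / tau_m
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset
    return spikes

# A constant 2 nA drive pushes the steady state above threshold,
# so the neuron fires periodically.
spike_times = simulate_lif([2.0] * 1000)  # 100 ms at 0.1 ms resolution
```

With constant input the model produces a regular spike train, which is the kind of baseline firing behaviour against which oscillatory entrainment effects are measured.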

    Integration of Spiking Neural Networks for Understanding Interval Timing

    The ability to perceive the passage of time in the seconds-to-minutes range is a vital and ubiquitous characteristic of life. This ability allows organisms to make behavioral changes based on the temporal contingencies between stimuli and the potential rewards they predict. While the psychophysical manifestations of time perception have been well characterized, many aspects of its underlying biology are still poorly understood. A major contributor to this is the limitations of current in vivo techniques, which do not allow for proper assessment of signaling over micro-, meso- and macroscopic spatial scales. Alternatively, the integration of biologically inspired artificial neural networks (ANNs) based on the dynamics and cyto-architecture of brain regions associated with time perception can help mitigate these limitations and, in conjunction, provide a powerful tool for progressing research in the field. To this end, this chapter aims to: (1) provide insight into the biological complexity of interval timing, (2) outline limitations in our ability to accurately assess these neural mechanisms in vivo, and (3) demonstrate potential applications of ANNs for better understanding the biological underpinnings of temporal processing.
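
As a toy illustration of the kind of timing mechanism such networks are built to probe, the classic pacemaker-accumulator model of interval timing can be sketched as follows. This is a generic textbook model, not the chapter's architecture, and all rates and noise levels are assumptions:

```python
import random

# Toy pacemaker-accumulator model of interval timing: a noisy pacemaker emits
# pulses, an accumulator counts them, and the count is decoded back into an
# estimate of elapsed time. A generic illustration, not the chapter's network;
# pulse rate and noise level are assumptions.

def estimate_duration(true_duration_s, pulse_rate_hz=50.0, noise_sd=0.1,
                      seed=0):
    rng = random.Random(seed)
    t, count = 0.0, 0
    while t < true_duration_s:
        # Each inter-pulse interval is jittered; accumulated jitter is one
        # classic account of the scalar variability seen in timing behavior.
        interval = max(1e-6, rng.gauss(1.0 / pulse_rate_hz,
                                       noise_sd / pulse_rate_hz))
        t += interval
        count += 1
    return count / pulse_rate_hz  # decode the pulse count into seconds

estimate = estimate_duration(2.0)  # lands close to the true 2.0 s interval
```

The jitter on each pulse is what makes repeated estimates of the same interval spread out, mirroring the behavioral variability that in vivo techniques struggle to trace back to specific circuits.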

    An efficient automated parameter tuning framework for spiking neural networks

    As the desire for biologically realistic spiking neural networks (SNNs) increases, tuning the enormous number of open parameters in these models becomes a difficult challenge. SNNs have been used to successfully model complex neural circuits that explore various neural phenomena such as neural plasticity, vision systems, auditory systems, neural oscillations, and many other important topics of neural function. Additionally, SNNs are particularly well-adapted to run on neuromorphic hardware that will support biological brain-scale architectures. Although the inclusion of realistic plasticity equations, neural dynamics, and recurrent topologies has increased the descriptive power of SNNs, it has also made the task of tuning these biologically realistic SNNs difficult. To meet this challenge, we present an automated parameter tuning framework capable of tuning SNNs quickly and efficiently using evolutionary algorithms (EAs) and inexpensive, readily accessible graphics processing units (GPUs). A sample SNN with 4104 neurons was tuned to give V1 simple cell-like tuning curve responses and produce self-organizing receptive fields (SORFs) when presented with a random sequence of counterphase sinusoidal grating stimuli. A performance analysis comparing the GPU-accelerated implementation to a single-threaded central processing unit (CPU) implementation showed a 65× speedup of the GPU implementation over the CPU implementation, or 0.35 h per generation for the GPU vs. 23.5 h per generation for the CPU. Additionally, the parameter value solutions found in the tuned SNN were studied and found to be stable and repeatable. The automated parameter tuning framework presented here will be of use to both the computational neuroscience and neuromorphic engineering communities, making the process of constructing and tuning large-scale SNNs much quicker and easier.
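
The tuning loop that such a framework automates can be sketched as a simple evolutionary algorithm. The toy below evolves a single neuron parameter toward a target firing rate; the stand-in fitness model, population size and mutation scale are all assumptions, whereas the real framework evaluates full SNNs on GPUs:

```python
import random

# Toy evolutionary parameter search: evolve a single "excitability" parameter
# so that a stand-in firing-rate model hits a target rate. The linear rate
# model is a placeholder for the GPU-accelerated SNN evaluations the
# framework actually performs.

def firing_rate(excitability):
    # Hypothetical stand-in for running a full SNN simulation and
    # measuring the resulting mean firing rate.
    return max(0.0, 20.0 * excitability - 5.0)

def evolve(target_rate, pop_size=20, generations=50, sigma=0.1, seed=42):
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 2.0) for _ in range(pop_size)]
    for _ in range(generations):
        # Fitness: absolute error against the target rate (lower is better).
        pop.sort(key=lambda p: abs(firing_rate(p) - target_rate))
        parents = pop[:pop_size // 4]                       # truncation selection
        pop = [rng.choice(parents) + rng.gauss(0.0, sigma)  # mutate offspring
               for _ in range(pop_size)]
        pop[:len(parents)] = parents                        # elitism
    return min(pop, key=lambda p: abs(firing_rate(p) - target_rate))

best = evolve(target_rate=10.0)
```

Each generation is embarrassingly parallel (every candidate is an independent simulation), which is what makes the GPU-per-candidate evaluation strategy pay off at scale.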

    Deploying and Optimizing Embodied Simulations of Large-Scale Spiking Neural Networks on HPC Infrastructure

    Simulating the brain-body-environment trinity in closed loop is an attractive proposal to investigate how perception, motor activity and interactions with the environment shape brain activity, and vice versa. The relevance of this embodied approach, however, hinges entirely on the modeled complexity of the various simulated phenomena. In this article, we introduce a software framework that is capable of simulating large-scale, biologically realistic networks of spiking neurons embodied in a biomechanically accurate musculoskeletal system that interacts with a physically realistic virtual environment. We deploy this framework on the high performance computing resources of the EBRAINS research infrastructure and we investigate the scaling performance by distributing computation across an increasing number of interconnected compute nodes. Our architecture is based on requested compute nodes as well as persistent virtual machines; this provides a high-performance simulation environment that is accessible to multi-domain users without expert knowledge, with a view to enabling users to instantiate and control simulations at custom scale via a web-based graphical user interface. Our simulation environment, entirely open source, is based on the Neurorobotics Platform developed in the context of the Human Brain Project, and the NEST simulator. We characterize the capabilities of our parallelized architecture for large-scale embodied brain simulations through two benchmark experiments, by investigating the effects of scaling compute resources on performance defined in terms of experiment runtime, brain instantiation and simulation time. The first benchmark is based on a large-scale balanced network, while the second one is a multi-region embodied brain simulation consisting of more than a million neurons and a billion synapses. Both benchmarks clearly show how scaling compute resources improves the aforementioned performance metrics in a near-linear fashion.
The second benchmark in particular is indicative of both the potential and limitations of a highly distributed simulation in terms of a trade-off between computation speed and resource cost. Our simulation architecture is being prepared to be accessible for everyone as an EBRAINS service, thereby offering a community-wide tool with a unique workflow that should provide momentum to the investigation of closed-loop embodiment within the computational neuroscience community. Funding: European Union's Horizon 2020 Framework Programme (785907, 945539); European Union's Horizon 2020 (800858); MEXT (hp200139, hp210169); MEXT KAKENHI grant no. 17H06310.
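
The near-linear scaling claim can be made concrete with the standard speedup and parallel-efficiency bookkeeping used to read such benchmarks. The runtimes below are invented for illustration, not the paper's measurements:

```python
# Speedup/efficiency bookkeeping of the kind used to interpret scaling
# benchmarks. The runtimes are made-up illustrative numbers, not the
# paper's measurements.

def speedup(t_serial, t_parallel):
    """How many times faster the parallel run is than the serial one."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_nodes):
    """Speedup per node; near-linear scaling keeps this close to 1.0."""
    return speedup(t_serial, t_parallel) / n_nodes

# Hypothetical experiment runtimes (seconds) for 1, 2, 4 and 8 compute nodes.
runtimes = {1: 800.0, 2: 420.0, 4: 230.0, 8: 130.0}
effs = {n: efficiency(runtimes[1], t, n) for n, t in runtimes.items()}
```

Efficiency below 1.0 at higher node counts captures exactly the speed-versus-resource-cost trade-off the second benchmark exposes: each added node still helps, but by progressively less.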

    Amygdala Modeling with Context and Motivation Using Spiking Neural Networks for Robotics Applications

    Cognitive capabilities for robotic applications are furthered by developing an artificial amygdala that mimics biology. The amygdala portion of the brain is commonly understood to control mood and behavior based upon sensory inputs, motivation, and context. This research builds upon prior work in creating artificial intelligence for robotics, which focused on mood-generated actions. However, recent amygdala research suggests a void in greater functionality. This work developed a computational model of an amygdala, integrated this model into a robot model, and developed a comprehensive integration of the robot for simulated and live embodiment. The developed amygdala, instantiated in the Nengo Brain Maker environment, leveraged spiking neural networks and the semantic pointer architecture to allow the abstraction of neuron ensembles into high-level concept vocabularies. Testing and validation were performed on a TurtleBot in both simulated (Gazebo) and live settings, with results compared to a baseline that used a simplistic, amygdala-like model. Metrics of nearest distance and nearest time were used for assessment. The amygdala model is shown to outperform the baseline in simulation, with a 70.8% improvement in nearest distance and a 4% improvement in nearest time, and in real applications, with a 62.4% improvement in nearest distance. Notably, this performance occurred despite a five-fold increase in architecture size and complexity.
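
The semantic pointer architecture mentioned above represents concepts as high-dimensional vectors that can be bound together and approximately unbound again; a common binding operator is circular convolution, sketched here in plain Python. The dimension and the concept names are illustrative assumptions, and the actual model implements this with Nengo's spiking neuron ensembles rather than explicit vector arithmetic:

```python
import math
import random

# Sketch of semantic-pointer-style binding via circular convolution (as in
# holographic reduced representations). Dimension and "vocabulary" entries
# are illustrative; Nengo realizes these operations in spiking ensembles.

def bind(a, b):
    """Circular convolution: c[k] = sum_j a[j] * b[(k - j) mod n]."""
    n = len(a)
    return [sum(a[j] * b[(k - j) % n] for j in range(n)) for k in range(n)]

def involution(a):
    """Index-reversed vector; binding with it approximately unbinds."""
    return [a[0]] + a[:0:-1]

def cosine(u, v):
    dot = sum(x * y for x, y in zip(u, v))
    return dot / (math.sqrt(sum(x * x for x in u)) *
                  math.sqrt(sum(y * y for y in v)))

rng = random.Random(1)
dim = 128
threat = [rng.gauss(0.0, 1.0 / math.sqrt(dim)) for _ in range(dim)]  # "THREAT"
near = [rng.gauss(0.0, 1.0 / math.sqrt(dim)) for _ in range(dim)]    # "NEAR"

bound = bind(threat, near)                   # e.g. "a threat that is near"
recovered = bind(bound, involution(threat))  # approximately recovers `near`
similarity = cosine(recovered, near)         # well above chance level
```

Because the bound vector is nearly orthogonal to its constituents, many such role-filler pairs can be superimposed in one vector, which is what lets an ensemble-level vocabulary encode context and motivation jointly.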

    Technical Integration of Hippocampus, Basal Ganglia and Physical Models for Spatial Navigation

    Computational neuroscience is increasingly moving beyond modeling individual neurons or neural systems to consider the integration of multiple models, often constructed by different research groups. We report on our preliminary technical integration of recent hippocampal formation, basal ganglia and physical environment models, together with visualisation tools, as a case study in the use of Python across the modelling tool-chain. We do not present new modeling results here. The architecture incorporates leaky-integrator and rate-coded neurons, a 3D environment with collision detection and tactile sensors, 3D graphics and 2D plots. We found Python to be a flexible platform, offering a significant reduction in development time, without a corresponding significant increase in execution time. We illustrate this by implementing a part of the model in various alternative languages and coding styles, and comparing their execution times. For very large-scale system integration, communication with other languages and parallel execution may be required, which we demonstrate using the BRAHMS framework's Python bindings.
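
The leaky-integrator rate neurons mentioned above follow a first-order dynamic that is straightforward to express in Python, which is part of why the language suits such a tool-chain. The time constant, step size and drive below are illustrative assumptions, not values from the integrated model:

```python
# Minimal leaky-integrator rate neuron: dy/dt = (-y + input) / tau, with a
# rectifying output nonlinearity giving a non-negative firing rate. Time
# constant, time step and drive are illustrative assumptions.

def simulate_leaky_integrator(inputs, tau=20.0, dt=1.0, y0=0.0):
    """Euler-integrate the activation for each input sample and return the
    rectified (rate-coded) output trace."""
    y = y0
    trace = []
    for u in inputs:
        y += dt * (-y + u) / tau
        trace.append(max(0.0, y))  # rate-coded output: rectified activation
    return trace

# A constant drive of 1.0 makes the activation relax exponentially toward 1.0.
trace = simulate_leaky_integrator([1.0] * 200)
```

The inner loop is exactly the sort of hot spot the paper's timing comparisons target: trivial to write in pure Python, and easy to re-express in a faster language or vectorized style when the integrated system demands it.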