136 research outputs found
1st INCF Workshop on Global Portal Services for Neuroscience
The goal of this meeting was to map out existing portal services for neuroscience, identify their features and future plans, and outline opportunities for synergistic developments. The workshop discussed alternative formats of future global and integrated portal services
Neuron Depot: keeping your colleagues in sync by combining modern cloud storage services, the local file system, and simple web applications
Neuroscience today deals with a "data deluge" derived from the availability of high-throughput sensors of brain structure and brain activity, and from increased computational resources for detailed simulations with complex output. We report here (1) a novel approach to data sharing between collaborating scientists that brings together file system tools and cloud technologies, (2) a service implementing this approach, called NeuronDepot, and (3) an example application of the service to a complex use case in the neurosciences. The main driver for our approach is to facilitate collaborations with a transparent, automated data flow that shields scientists from having to learn new tools or data structuring paradigms. Using NeuronDepot is simple: one-time data assignment from the originator and cloud-based syncing make experimental and modeling data available across the collaboration with minimum overhead. Since data sharing is cloud based, our approach opens up the possibility of using the new software developments and hardware scalability associated with elastic cloud computing. We provide an implementation that relies on existing synchronization services and is usable from all devices via a reactive web interface. We motivate our solution by solving the practical problems of the GinJang project, a collaboration of three universities across eight time zones with a complex workflow encompassing data from electrophysiological recordings, imaging, morphological reconstructions, and simulations.
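As an illustration of the one-time-assignment-plus-sync idea described in this abstract, the following is a minimal, hypothetical sketch (not NeuronDepot's actual code; all names are invented) that detects changed files by content hash and copies them into a shared folder standing in for a cloud-synced directory:

```python
import hashlib
import shutil
from pathlib import Path


def file_digest(path: Path) -> str:
    """Content hash used to decide whether a file needs re-syncing."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def sync_once(source: Path, shared: Path) -> list:
    """Copy files from `source` into `shared` only when their content differs.

    `shared` stands in for a cloud-synced folder (e.g. one watched by an
    existing synchronization client); returns the names of files copied.
    """
    shared.mkdir(parents=True, exist_ok=True)
    copied = []
    for src in sorted(source.iterdir()):
        if not src.is_file():
            continue
        dst = shared / src.name
        if not dst.exists() or file_digest(dst) != file_digest(src):
            shutil.copy2(src, dst)  # preserves timestamps and metadata
            copied.append(src.name)
    return copied
```

In this sketch the actual replication across sites is delegated to whatever service watches the shared folder, mirroring the abstract's reliance on existing synchronization services rather than a custom transport.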
HD Physiology Project-Japanese efforts to promote multilevel integrative systems biology and physiome research.
The HD Physiology Project is a Japanese research consortium that aimed to develop methods and a computational platform in which physiological and pathological information can be described in high-level definitions across multiple scales of time and size. During the 5 years of this project, an appropriate software platform for multilevel functional simulation was developed and a whole-heart model including pharmacokinetics for the assessment of the proarrhythmic risk of drugs was developed. In this article, we outline the description and scientific strategy of this project and present the achievements and influence on multilevel integrative systems biology and physiome research
Deploying and Optimizing Embodied Simulations of Large-Scale Spiking Neural Networks on HPC Infrastructure
Simulating the brain-body-environment trinity in closed loop is an attractive proposal to investigate how perception, motor activity and interactions with the environment shape brain activity, and vice versa. The relevance of this embodied approach, however, hinges entirely on the modeled complexity of the various simulated phenomena. In this article, we introduce a software framework that is capable of simulating large-scale, biologically realistic networks of spiking neurons embodied in a biomechanically accurate musculoskeletal system that interacts with a physically realistic virtual environment. We deploy this framework on the high performance computing resources of the EBRAINS research infrastructure and we investigate the scaling performance by distributing computation across an increasing number of interconnected compute nodes. Our architecture is based on requested compute nodes as well as persistent virtual machines; this provides a high-performance simulation environment that is accessible to multi-domain users without expert knowledge, with a view to enable users to instantiate and control simulations at custom scale via a web-based graphical user interface. Our simulation environment, entirely open source, is based on the Neurorobotics Platform developed in the context of the Human Brain Project, and the NEST simulator. We characterize the capabilities of our parallelized architecture for large-scale embodied brain simulations through two benchmark experiments, by investigating the effects of scaling compute resources on performance defined in terms of experiment runtime, brain instantiation and simulation time. The first benchmark is based on a large-scale balanced network, while the second one is a multi-region embodied brain simulation consisting of more than a million neurons and a billion synapses. Both benchmarks clearly show how scaling compute resources improves the aforementioned performance metrics in a near-linear fashion. 
The second benchmark in particular is indicative of both the potential and limitations of a highly distributed simulation in terms of a trade-off between computation speed and resource cost. Our simulation architecture is being prepared to be accessible for everyone as an EBRAINS service, thereby offering a community-wide tool with a unique workflow that should provide momentum to the investigation of closed-loop embodiment within the computational neuroscience community.
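The near-linear scaling reported above can be summarized by a standard parallel-efficiency calculation. The sketch below is an illustration of that metric only (it is not the paper's analysis code, and the runtimes in the usage are hypothetical):

```python
def speedup_and_efficiency(nodes, runtimes):
    """Compute speedup and parallel efficiency relative to the smallest run.

    Speedup S(n) = T(base) / T(n); efficiency E(n) = S(n) / (n / base).
    Near-linear scaling means E(n) stays close to 1.0 as n grows, while a
    falling E(n) quantifies the speed-versus-resource-cost trade-off.
    """
    base_nodes, base_time = nodes[0], runtimes[0]
    results = []
    for n, t in zip(nodes, runtimes):
        s = base_time / t
        e = s / (n / base_nodes)
        results.append((n, s, e))
    return results
```

For example, hypothetical runtimes of 100 s, 52 s, and 27 s on 1, 2, and 4 nodes yield efficiencies near 0.96 and 0.93, i.e. near-linear scaling with a mild overhead cost at larger node counts.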
Funding: European Union's Horizon 2020 Framework Programme (785907, 945539); European Union's Horizon 2020 (800858); MEXT (hp200139, hp210169); MEXT KAKENHI grant no. 17H06310.
Technical and Organizational Considerations for the Long-Term Maintenance and Development of Digital Brain Atlases and Web-Based Databases
A digital brain atlas is a kind of image database that specifically provides information about neurons and glial cells in the brain. It has various advantages that are unmatched by conventional paper-based atlases. Such advantages, however, may become disadvantages if appropriate care is not taken. Because digital atlases can provide an unlimited amount of data, they should be designed to minimize redundancy and keep the records consistent as they are added incrementally by different staff members. The fact that digital atlases can easily be revised necessitates a system that assures users can access previous versions that may have been cited in papers at a particular time. To pass our knowledge on to our descendants, such databases should be maintained for a very long period, well over 100 years, like printed books and papers. Technical and organizational measures to enable long-term archiving should be considered seriously. Compared to the initial development of the database, subsequent efforts to increase the quality and quantity of its contents are not highly regarded, because such tasks do not materialize in the form of publications. This fact strongly discourages continuous expansion of, and external contributions to, digital atlases after their initial launch. To solve these problems, the role of biocurators is vital. Appreciation of the scientific achievements of people who do not write papers, and the establishment of a secure academic career path for them, are indispensable for recruiting talent for this very important job.
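One concrete way to meet the requirement that previously cited versions remain accessible is an append-only, content-addressed version store. The following hypothetical sketch (not tied to any particular atlas system; all names are invented) derives an immutable identifier from each revision's content, so a version cited in a paper can always be retrieved unchanged:

```python
import hashlib
import json


class AtlasVersionStore:
    """Append-only store: each committed revision gets an immutable,
    content-derived version ID, and older revisions are never overwritten."""

    def __init__(self):
        self._versions = {}  # version_id -> revision record
        self._history = []   # version_ids in commit order

    def commit(self, record: dict) -> str:
        """Store a revision and return its content-derived version ID."""
        payload = json.dumps(record, sort_keys=True).encode()
        version_id = hashlib.sha256(payload).hexdigest()[:12]
        if version_id not in self._versions:
            self._versions[version_id] = record
            self._history.append(version_id)
        return version_id

    def get(self, version_id: str) -> dict:
        """Retrieve exactly the revision a paper cited, even after updates."""
        return self._versions[version_id]

    def latest(self) -> str:
        return self._history[-1]
```

Because the identifier is a hash of the content, a revised record necessarily gets a new ID, which is one simple mechanism for the version-consistency guarantee the abstract calls for; real systems would add provenance metadata and long-term storage concerns on top.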
Time to consider animal data governance: perspectives from neuroscience
Introduction: Scientific research relies mainly on multimodal, multidimensional big data generated from both animal and human organisms, as well as technical data. However, unlike human data, which is increasingly regulated at national, regional and international levels, regulatory frameworks that can govern the sharing and reuse of non-human animal data are yet to be established. Whereas the legal and ethical principles that shape animal data generation differ across countries and regions, the generated data are shared beyond boundaries without any governance mechanism. This paper, through perspectives from neuroscience, shows conceptually and empirically that there is a need for animal data governance that is informed by ethical concerns. There is a plurality of ethical views on the use of animals in scientific research that data governance mechanisms need to consider. Methods: Semi-structured interviews were used for data collection. Overall, 13 interviews with 12 participants (10 males and 2 females) were conducted. The interviews were transcribed and stored in NVivo 12, where they were thematically analyzed. Results: The participants shared the view that it is time to consider animal data governance, due to factors such as differences in regulations, differences in ethical principles, values and beliefs, and data quality concerns. They also provided insights on possible approaches to governance. Discussion: We therefore conclude that a procedural approach to data governance is needed: an approach that does not prescribe a particular ethical position but allows for a quick understanding of ethical concerns and debate about how different positions differ, to facilitate cross-cultural and international collaboration.