461 research outputs found

    Web-Based Interfaces for Virtual C. elegans Neuron Model Definition, Network Configuration, Behavioral Experiment Definition and Experiment Results Visualization

    The Si elegans platform targets the complete virtualization of the nematode Caenorhabditis elegans and its environment. This paper presents a suite of unified web-based Graphical User Interfaces (GUIs) as the main user interaction point, and discusses their underlying technologies and methods. The user-friendly features of this tool suite enable users to graphically create neuron models, network configurations, and behavioral experiments without requiring knowledge of domain-specific computer-science tools. The framework furthermore allows graphical visualization of all simulation results through a worm locomotion and neural activity viewer. Models, experiment definitions, and results can be exported in a machine-readable format, thereby facilitating reproducible, cross-platform execution of in silico C. elegans experiments in other simulation environments. This is made possible by a novel XML-based behavioral experiment definition encoding format, a NeuroML XML-based model generation and network configuration description language, and their associated GUIs. User survey data confirm the platform's usability and functionality, and provide insights into future directions for web-based simulation GUIs of C. elegans and other living organisms. The tool suite is available online to the scientific community, and its source code has been made available.
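
The machine-readable export described above can be illustrated with a short sketch. The tag and attribute names below are hypothetical stand-ins invented for illustration, not the actual Si elegans or NeuroML schema; the point is only the general shape of an XML-encoded behavioral experiment definition:

```python
# Hypothetical sketch of an XML-encoded behavioral experiment
# definition. Element and attribute names are invented for
# illustration; they are not the actual Si elegans or NeuroML schema.
import xml.etree.ElementTree as ET

def build_experiment(name, stimulus, duration_s):
    """Serialize a minimal experiment definition to an XML string."""
    root = ET.Element("experiment", attrib={"name": name})
    ET.SubElement(root, "stimulus", attrib={"type": stimulus})
    ET.SubElement(root, "duration", attrib={"seconds": str(duration_s)})
    return ET.tostring(root, encoding="unicode")

xml_doc = build_experiment("chemotaxis_assay", "NaCl_gradient", 600)
```

Because the result is plain XML, a consumer on another platform can parse it with any standard XML parser, which is what makes such experiment definitions portable across simulation environments.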

    Plasticity in gustatory and nociceptive neurons controls decision making in C. elegans salt navigation

    A conventional understanding of perception assigns sensory organs the role of capturing the environment: better sensors yield a more accurate encoding of stimuli, allowing cognitive processing downstream. Here we show that plasticity in sensory neurons mediates a behavioral switch in C. elegans, called gustatory plasticity, between attraction to NaCl in naïve animals and avoidance of NaCl in preconditioned animals. Ca2+ imaging in the NaCl-sensing neurons ASE and ASH reveals multiple cell-autonomous and distributed circuit adaptation mechanisms. A computational model quantitatively accounts for the observed behaviors and reveals roles for sensory neurons in the control and modulation of motor behaviors, decision making, and navigational strategy. Sensory adaptation dynamically alters the encoding of the environment. We therefore propose that, rather than encoding the stimulus directly, these C. elegans sensors dynamically encode a context-dependent value of the stimulus. Our results demonstrate how adaptive sensory computation can directly control an animal’s behavioral state.
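
The idea that these sensors encode a context-dependent value of the stimulus, rather than the stimulus itself, can be caricatured in a few lines. The toy model below is our own illustration, not the paper's computational model: a single signed weight stands in for the adaptive state of the NaCl-sensing circuit, and aversive preconditioning drives it negative, flipping attraction into avoidance:

```python
# Toy illustration (not the paper's model) of a context-dependent
# stimulus value: a signed sensory weight w stands in for the adaptive
# state of the NaCl-sensing circuit. Naive animals start at w = +1
# (attraction); aversive preconditioning drives w toward -1 (avoidance).

def respond(nacl_conc, w):
    """Signed behavioral drive: positive = toward NaCl, negative = away."""
    return w * nacl_conc

def precondition(w, aversive_trials, rate=0.5):
    """Each aversive pairing relaxes the stimulus value toward -1."""
    for _ in range(aversive_trials):
        w += rate * (-1.0 - w)
    return w

w_naive = 1.0
w_cond = precondition(w_naive, aversive_trials=5)
```

The same concentration of NaCl now produces a drive of opposite sign in naive versus preconditioned animals, even though the "sensor" input is unchanged.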

    Regulation of two motor patterns enables the gradual adjustment of locomotion strategy in Caenorhabditis elegans

    In animal locomotion, a tradeoff exists between stereotypy and flexibility: fast long-distance travelling (LDT) requires coherent, regular motions, while local sampling and area-restricted search (ARS) rely on flexible movements. We report here on a posture control system in C. elegans that coordinates these needs. Using quantitative posture analysis, we explain worm locomotion as a composite of two modes: regular undulations versus flexible turning. Graded reciprocal regulation of both modes allows animals to flexibly adapt their locomotion strategy under sensory stimulation along a spectrum ranging from LDT to ARS. Using genetics and functional imaging of neural activity, we characterize the counteracting interneurons AVK and DVA, which utilize the neuropeptides FLP-1 and NLP-12 to control both motor modes. Gradual regulation of behaviors via this system is required for spatial navigation during chemotaxis. This work shows how a nervous system controls simple elementary features of posture to generate complex movements for goal-directed locomotion strategies.
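
The two-mode decomposition can be sketched as a graded mix of an undulatory carrier and a turning bias. The mixing rule below is our own illustration under that assumption, not the paper's posture model:

```python
# Toy illustration (not the paper's posture model) of locomotion as a
# graded, reciprocally regulated mix of two modes: regular undulation
# and a turning bias.
import math

def curvature(x, t, turn_drive):
    """Body curvature at body coordinate x and time t.

    turn_drive in [0, 1]: 0 = pure undulation (long-distance travel),
    1 = pure turning (area-restricted search). Strengthening one mode
    weakens the other, mirroring the graded reciprocal regulation.
    """
    undulation = (1.0 - turn_drive) * math.sin(2 * math.pi * (x - t))
    turning = turn_drive * 0.8  # constant turning bias, arbitrary units
    return undulation + turning

ldt = [curvature(x / 10, 0.0, 0.0) for x in range(10)]  # travel mode
ars = [curvature(x / 10, 0.0, 1.0) for x in range(10)]  # search mode
```

Sliding `turn_drive` between 0 and 1 traces out, in caricature, the spectrum from long-distance travel to area-restricted search.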

    Insects have the capacity for subjective experience

    To what degree are non-human animals conscious? We propose that the most meaningful way to approach this question is from the perspective of functional neurobiology. Here we focus on subjective experience: a basic awareness of the world without further reflection on that awareness, considered the most basic form of consciousness. Tellingly, this capacity is supported by the integrated midbrain and basal ganglia structures, which are among the oldest and most highly conserved brain systems in vertebrates. A reasonable inference is that the capacity for subjective experience is both widespread and evolutionarily old within the vertebrate lineage. We argue that the insect brain supports functions analogous to those of the vertebrate midbrain, and hence that insects may also have a capacity for subjective experience. We discuss the features of neural systems that can and cannot be expected to support this capacity, as well as the relationship between our arguments based on neurobiological mechanism and our approach to the “hard problem” of conscious experience.

    Robot-Assisted Full Automation Interface: Touch-Response On Zebrafish Larvae


    Sensor Fusion in the Perception of Self-Motion

    This dissertation has been written at the Max Planck Institute for Biological Cybernetics (Max-Planck-Institut für Biologische Kybernetik) in Tübingen, in the department of Prof. Dr. Heinrich H. Bülthoff. The work received university support from Prof. Dr. Günther Palm (University of Ulm, Abteilung Neuroinformatik). The main evaluators were Prof. Dr. Günther Palm, Prof. Dr. Wolfgang Becker (University of Ulm, Sektion Neurophysiologie), and Prof. Dr. Heinrich Bülthoff.

    The goal of this thesis was to investigate the integration of different sensory modalities in the perception of self-motion using psychophysical methods. Experiments with healthy human participants were to be designed for and performed in the Motion Lab, which is equipped with a simulator platform and projection screen. Results from the psychophysical experiments were to be used to refine models of the multisensory integration process, with an emphasis on Bayesian (maximum likelihood) integration mechanisms.

    To put the psychophysical experiments into the larger framework of research on multisensory integration in the brain, results of neuroanatomical and neurophysiological experiments on multisensory integration are also reviewed.
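
The Bayesian (maximum likelihood) integration mechanism mentioned above has a standard closed form for two independent Gaussian cues: the combined estimate is a reliability-weighted average, and its variance is smaller than either cue's alone. A minimal sketch:

```python
# Minimal sketch of maximum-likelihood integration of two independent
# Gaussian cues (e.g. visual and vestibular self-motion estimates).
# Each cue's weight is inversely proportional to its variance.

def integrate(mu_a, var_a, mu_b, var_b):
    """Return the ML-combined estimate and its (reduced) variance."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    w_b = 1.0 - w_a
    mu = w_a * mu_a + w_b * mu_b
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return mu, var

mu, var = integrate(10.0, 4.0, 14.0, 4.0)  # two equally reliable cues
```

With equally reliable cues the combined estimate falls halfway between them; an unreliable cue is simply down-weighted rather than discarded.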

    A Modular and Open-Source Framework for Virtual Reality Visualisation and Interaction in Bioimaging

    Life science today involves computational analysis of a large amount and variety of data, such as volumetric data acquired by state-of-the-art microscopes, or mesh data from analysis of such data or simulations. The advent of new imaging technologies, such as lightsheet microscopy, has resulted in the users being confronted with an ever-growing amount of data, with even terabytes of imaging data created within a day. With the possibility of gentler and more high-performance imaging, the spatiotemporal complexity of the model systems or processes of interest is increasing as well. Visualisation is often the first step in making sense of this data, and a crucial part of building and debugging analysis pipelines. It is therefore important that visualisations can be quickly prototyped, as well as developed or embedded into full applications. In order to better judge spatiotemporal relationships, immersive hardware, such as Virtual or Augmented Reality (VR/AR) headsets and associated controllers are becoming invaluable tools. In this work we present scenery, a modular and extensible visualisation framework for the Java VM that can handle mesh and large volumetric data, containing multiple views, timepoints, and color channels. scenery is free and open-source software, works on all major platforms, and uses the Vulkan or OpenGL rendering APIs. We introduce scenery's main features, and discuss its use with VR/AR hardware and in distributed rendering. In addition to the visualisation framework, we present a series of case studies, where scenery can provide tangible benefit in developmental and systems biology: With Bionic Tracking, we demonstrate a new technique for tracking cells in 4D volumetric datasets via tracking eye gaze in a virtual reality headset, with the potential to speed up manual tracking tasks by an order of magnitude. 
We further introduce ideas for moving towards virtual reality-based laser ablation, and perform a user study to gain insight into performance, acceptance, and issues when performing ablation tasks with virtual reality hardware in fast-developing specimens. To tame the amount of data originating from state-of-the-art volumetric microscopes, we present ideas on how to render the highly efficient Adaptive Particle Representation. Finally, we present sciview, an ImageJ2/Fiji plugin making the features of scenery available to a wider audience.

    Contents:
    Abstract; Foreword and Acknowledgements; Overview and Contributions
    Part I - Introduction: 1 Fluorescence Microscopy; 2 Introduction to Visual Processing; 3 A Short Introduction to Cross Reality; 4 Eye Tracking and Gaze-based Interaction
    Part II - VR and AR for Systems Biology: 5 scenery — VR/AR for Systems Biology; 6 Rendering; 7 Input Handling and Integration of External Hardware; 8 Distributed Rendering; 9 Miscellaneous Subsystems; 10 Future Development Directions
    Part III - Case Studies: 11 Bionic Tracking: Using Eye Tracking for Cell Tracking; 12 Towards Interactive Virtual Reality Laser Ablation; 13 Rendering the Adaptive Particle Representation; 14 sciview — Integrating scenery into ImageJ2 & Fiji
    Part IV - Conclusion: 15 Conclusions and Outlook
    Backmatter & Appendices: A Questionnaire for VR Ablation User Study; B Full Correlations in VR Ablation Questionnaire; C Questionnaire for Bionic Tracking User Study; List of Tables; List of Figures; Bibliography; Selbstständigkeitserklärung