16 research outputs found

    A Statistically Representative Atlas for Mapping Neuronal Circuits in the Drosophila Adult Brain

    Get PDF
    Published: 23 March 2018

    The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fninf.2018.00013/full#supplementary-material

    Supplementary Figure 1. 3D renderings of the 14 regions used for quantitative evaluation of atlas performance in segmentation and registration tasks. The 14 regions shown here were extracted from the atlas of Ito et al. (2014) that has been registered onto the group-wise inter-sex atlas (available from http://fruitfly.tefor.net).

    Supplementary Figure 2. Selected lines from the Janelia Farm collection showing an overlap value with the search pattern ranking among the first 50 for at least three of the five PDF profiles. (Left) GAL4-driven GFP profile registered on the standard brain. (Right) Overlap between the first PDF profile and the GAL4-driven GFP profile. Numbers refer to Janelia Farm lines with associated gene names. Scale bar: 20 μm.

    Supplementary Table 1. Results of the 3D space query for each of the five PDF profiles. Overlap values are indicated for each Janelia Farm line, and the corresponding gene name (FlyBase nomenclature) is given for overlap values ranking among the first 50 for at least three of the five PDF profiles (blue). Bold names correspond to the three lines shown in Figure 10.

    Supplementary Movie 1. Animated rendering of the group-wise inter-sex atlas. Successively: nc82 template image (2D sections then 3D volume rendering, opaque then transparent); label image (3D surface rendering of anatomical regions, defined following Ito et al. 2014); six registered patterns of GAL4-GFP expression (3D surface rendering of intensity-thresholded pattern images); the same patterns (left half of the brain) with the anatomical regions (right half of the brain).

    Imaging the expression patterns of reporter constructs is a powerful tool for dissecting the neuronal circuits of perception and behavior in the adult brain of Drosophila, one of the major models for studying brain functions. To date, several Drosophila brain templates and digital atlases have been built to automatically analyze and compare collections of expression pattern images. However, there has been no systematic comparison of performance between alternative atlasing strategies and registration algorithms. Here, we objectively evaluated the performance of different strategies for building adult Drosophila brain templates and atlases. In addition, we used state-of-the-art registration algorithms to generate a new group-wise inter-sex atlas. Our results highlight the benefit of statistical atlases over individual ones and show that the newly proposed inter-sex atlas outperformed existing solutions for automated registration and annotation of expression patterns. Over 3,000 images from the Janelia Farm FlyLight collection were registered using the proposed strategy. These registered expression patterns can be searched and compared with a new version of the BrainBaseWeb system and BrainGazer software. We illustrate the validity of our methodology and brain atlas with registration-based predictions of expression patterns in a subset of clock neurons. The described registration framework should benefit brain studies in Drosophila and other insect species.

    IA-C, TM, NM, FS, and AJ were funded by the Tefor Infrastructure under the Investments for the Future program of the French National Research Agency (Grant #ANR-11-INBS-0014). FR was supported by INSERM. Work at Institut des Neurosciences Paris-Saclay was supported by ANR Infrastructure Tefor and by ANR ClockEye (#ANR-14-CE13-0034-01). JI was supported by the Spanish Ministry of Economy and Competitiveness (TEC2014-51882-P), the European Union's Horizon 2020 research and innovation programme (Marie Sklodowska-Curie grant 654911, project THALAMODEL), and the European Research Council (ERC Starting Grant no. 677697 BUNGEE-TOOLS). VRVis (KB, FS) is funded by BMVIT, BMWFW, Styria, SFG and Vienna Business Agency in the scope of COMET - Competence Centers for Excellent Technologies (854174), which is managed by FFG. The Institut Jean-Pierre Bourgin benefits from the support of the LabEx Saclay Plant Sciences-SPS (#ANR-10-LABX-0040-SPS).
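    The 3D space query described above ranks GAL4 lines by how strongly their registered expression patterns overlap a search pattern on the common template grid. The abstract does not spell out the scoring rule, so the sketch below is a minimal, hypothetical version: it binarises both volumes and scores each line by the fraction of the search pattern it covers. The function names (overlap_score, rank_lines) and the threshold are illustrative assumptions, not the implementation used by BrainBaseWeb or BrainGazer.

    import numpy as np

    def overlap_score(search_pattern, gfp_pattern, threshold=0.5):
        """Fraction of the binarised search pattern covered by a registered GFP pattern.

        Both inputs are assumed to be 3D arrays already registered to the same
        template grid; the scoring rule is an illustrative choice.
        """
        search = search_pattern > threshold
        gfp = gfp_pattern > threshold
        return np.logical_and(search, gfp).sum() / max(search.sum(), 1)

    def rank_lines(search_profile, lines, top_k=50):
        """Rank registered GAL4 lines against one search profile.

        `lines` maps a line name to its registered, intensity-normalised pattern image.
        Returns the top_k (name, score) pairs, highest overlap first.
        """
        scores = {name: overlap_score(search_profile, img) for name, img in lines.items()}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

    Lines whose score ranks among the first 50 for several search profiles would then be retained, in the spirit of the selection criterion used for Supplementary Figure 2.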

    Towards Advanced Interactive Visualization for Virtual Atlases

    Get PDF
    Under embargo until: 2020-07-24

    An atlas is generally defined as a bound collection of tables, charts or illustrations describing a phenomenon. In an anatomical atlas, for example, a collection of representative illustrations and text describes anatomy for the purpose of communicating anatomical knowledge. The atlas serves as a reference frame for comparing and integrating data from different sources by spatially or semantically relating collections of drawings, imaging data, and/or text. In the field of medical image processing, atlas information is often constructed from a collection of regions of interest, which are based on medical images annotated by domain experts. Such an atlas may be employed, for example, for automatic segmentation of medical imaging data. The combination of interactive visualization techniques with atlas information opens up new possibilities for content creation, curation, and navigation in virtual atlases. With interactive visualization of atlas information, students are able to inspect and explore anatomical atlases in ways that were not possible with the traditional book format, such as viewing the illustrations from other viewpoints. With advanced interaction techniques, it becomes possible to query the data that forms the basis for the atlas, thus empowering researchers to access a wealth of information in new ways. So far, atlas-based visualization has been employed mainly for medical education, as well as biological research. In this survey, we provide an overview of current digital biomedical atlas tasks and applications and summarize relevant visualization techniques. We discuss recent approaches for providing next-generation visual interfaces to navigate atlas data that go beyond common text-based search and hierarchical lists. Finally, we reflect on open challenges and opportunities for the next steps in interactive atlas visualization.
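    The atlas-based segmentation mentioned above typically works by registering the atlas template image to a subject scan and then propagating the atlas labels through the resulting transform. The following is a minimal sketch of that idea using SimpleITK; the file names, affine-only transform, and registration settings are illustrative assumptions rather than details from any of the surveyed systems.

    import SimpleITK as sitk

    # Hypothetical inputs: a subject scan plus an atlas template with matching label map.
    subject = sitk.ReadImage("subject_scan.nii.gz", sitk.sitkFloat32)
    atlas_template = sitk.ReadImage("atlas_template.nii.gz", sitk.sitkFloat32)
    atlas_labels = sitk.ReadImage("atlas_labels.nii.gz")  # integer region labels

    # Initialize with a geometry-based affine alignment of the two volumes.
    initial_tx = sitk.CenteredTransformInitializer(
        subject, atlas_template, sitk.AffineTransform(3),
        sitk.CenteredTransformInitializerFilter.GEOMETRY)

    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
    reg.SetOptimizerScalesFromPhysicalShift()
    reg.SetInterpolator(sitk.sitkLinear)
    reg.SetInitialTransform(initial_tx, inPlace=False)
    final_tx = reg.Execute(subject, atlas_template)

    # Propagate atlas labels onto the subject grid; nearest-neighbour keeps labels integral.
    segmentation = sitk.Resample(atlas_labels, subject, final_tx,
                                 sitk.sitkNearestNeighbor, 0, atlas_labels.GetPixelID())
    sitk.WriteImage(segmentation, "subject_segmentation.nii.gz")

    In practice a deformable registration stage usually follows the affine step, and multiple atlases may be propagated and fused, but the label-propagation principle is the same.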

    Towards the Holodeck: fully immersive virtual reality visualisation of scientific and engineering data

    Get PDF
    In this paper, we describe the development and operating principles of an immersive virtual reality (VR) visualisation environment that is designed around the use of consumer VR headsets in an existing wide area motion capture suite. We present two case studies in the application areas of visualisation of scientific and engineering data. Each of these case studies utilises a different render engine, namely a custom engine for one case and a commercial game engine for the other. The advantages and appropriateness of each approach are discussed, along with suggestions for future work.

    Assembling models of embryo development: Image analysis and the construction of digital atlases

    Get PDF
    Digital atlases of animal development provide a quantitative description of morphogenesis, opening the path toward process modeling. Prototypic atlases offer a data integration framework in which to gather information from cohorts of individuals with phenotypic variability. Relevant information for further theoretical reconstruction includes measurements in time and space for cell behaviors and gene expression. Obtaining these measurements, as well as integrating data into a prototypic model, relies on image processing strategies. Developing tools to integrate and analyze multidimensional biological data is highly relevant for assessing chemical toxicity or performing preclinical drug testing. This article surveys some of the most prominent efforts to assemble these prototypes, categorizes them according to salient criteria, and discusses the key questions in the field and the future challenges toward the reconstruction of multiscale dynamics in model organisms.

    WYSIWYP: What You See Is What You Pick

    Full text link

    Tadpole VR: virtual reality visualization of a simulated tadpole spinal cord

    Get PDF
    Recent advances in the “developmental” approach to neural networks (combining experimental study with computational modelling) produce data sets of increasing complexity and size. This poses a significant challenge in analyzing, visualizing and understanding not only the spatial structure but also the behavior of such networks. This paper describes a Virtual Reality application for visualizing two biologically accurate computational models of the anatomical structure of a neural network comprising 1,500 neurons and over 80,000 connections. The visualization enables a user to observe the complex spatio-temporal interplay between seven unique types of neurons, culminating in an observable swimming pattern. We present a detailed description of the design approach for the virtual environment, based on a set of initial requirements, followed by the implementation and optimization steps. Lastly, we present the results of a pilot usability study on how confident participants are in their ability to understand how the alternating firing pattern between the two sides of the tadpole’s body generates swimming motion.
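    The swimming pattern described above emerges from alternating left/right firing that the visualization plays back over time. A minimal, hypothetical sketch of that playback idea is shown below: it bins recorded spike times into animation frames and produces a per-neuron activity value per frame that a renderer could map to color or glow. The data layout, frame rate, and decay constant are illustrative assumptions, not details of the Tadpole VR implementation.

    import numpy as np

    def activity_frames(spike_times, n_neurons, duration_s, fps=60, decay=0.9):
        """Bin spike times into per-frame activity values with exponential decay.

        spike_times: list of (neuron_id, time_s) pairs from the simulation (assumed layout).
        Returns an array of shape (n_frames, n_neurons) in [0, 1]; a spike sets a
        neuron's activity to 1, and it fades by `decay` on each subsequent frame.
        """
        n_frames = int(duration_s * fps) + 1
        frames = np.zeros((n_frames, n_neurons))
        for neuron_id, t in spike_times:
            frames[int(t * fps), neuron_id] = 1.0
        # Let activity decay over time so recent spikes glow and older ones fade.
        for f in range(1, n_frames):
            frames[f] = np.maximum(frames[f], frames[f - 1] * decay)
        return frames

    # Example: two neurons on opposite sides firing in alternation.
    spikes = [(0, 0.00), (1, 0.05), (0, 0.10), (1, 0.15)]
    frames = activity_frames(spikes, n_neurons=2, duration_s=0.2)
    print(frames.shape)  # (13, 2); each row could drive per-neuron colors in a renderer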