    High resolution diffusion imaging in the unfixed post-mortem infant brain at 7T

    Diffusion MRI of the infant brain allows investigation of the organizational structure of maturing fibers during brain development. Post-mortem imaging has the potential to achieve high resolution by using long scan times, enabling precise assessment of small structures. Technical development for post-mortem diffusion MRI has primarily focused on scanning of fixed tissue, which is robust to effects like temperature drift that can cause unfixed tissue to degrade. The ability to scan unfixed tissue in the intact body would enable post-mortem studies without organ donation, but poses new technical challenges. This paper describes our approach to scan setup, protocol optimization, and tissue protection in the context of the Developing Human Connectome Project (dHCP) of neonates. A major consideration was the need to preserve the integrity of unfixed tissue during scanning in light of energy deposition at ultra-high magnetic field strength. We present results from one of the first two subjects recruited to the study, who died on postnatal day 46 at 29+6 weeks postmenstrual age, demonstrating high-quality diffusion MRI data. We find altered diffusion properties consistent with post-mortem changes reported previously. Preliminary voxel-wise and tractography analyses are presented with comparison to age-matched in vivo dHCP data. These results show that high-quality, high-resolution post-mortem data of unfixed tissue can be acquired to explore the developing human brain.

    Tensor image registration library: Deformable registration of stand‐alone histology images to whole‐brain post‐mortem MRI data

    Background: Accurate registration between microscopy and MRI data is necessary for validating imaging biomarkers against neuropathology, and to disentangle complex signal dependencies in microstructural MRI. Existing registration methods often rely on serial histological sampling or significant manual input, providing limited scope to work with a large number of stand-alone histology sections. Here we present a customisable pipeline to assist the registration of stand-alone histology sections to whole-brain MRI data. Methods: Our pipeline registers stained histology sections to whole-brain post-mortem MRI in 4 stages, with the help of two photographic intermediaries: a block face image (to undistort histology sections) and coronal brain slab photographs (to insert them into MRI space). Each registration stage is implemented as a configurable stand-alone Python script using our novel platform, Tensor Image Registration Library (TIRL), which provides flexibility for wider adaptation. We report our experience of registering 87 PLP-stained histology sections from 14 subjects and perform various experiments to assess the accuracy and robustness of each stage of the pipeline. Results: All 87 histology sections were successfully registered to MRI. Histology-to-block registration (Stage 1) achieved 0.2–0.4 mm accuracy, better than commonly used existing methods. Block-to-slice matching (Stage 2) showed great robustness in automatically identifying and inserting small tissue blocks into whole brain slices with 0.2 mm accuracy. Simulations demonstrated sub-voxel level accuracy (0.13 mm) of the slice-to-volume registration (Stage 3) algorithm, which was observed in over 200 actual brain slice registrations, compensating 3D slice deformations up to 6.5 mm. Stage 4 combined the previous stages and generated refined pixelwise aligned multi-modal histology-MRI stacks. Conclusions: Our open-source pipeline provides robust automation tools for registering stand-alone histology sections to MRI data with sub-voxel level precision, and the underlying framework makes it readily adaptable to a diverse range of microscopy-MRI studies.
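    The staged design described above (histology → block face → brain slab photograph → whole-brain MRI) amounts to estimating one spatial transform per stage and composing them so histology pixels can be mapped into MRI space. The sketch below is purely illustrative: it uses toy 2D homogeneous affine matrices in plain Python, not TIRL's actual data structures or API, and all names and numbers are hypothetical.

    ```python
    # Illustrative composition of per-stage registration transforms,
    # loosely mirroring the 4-stage pipeline described in the abstract.
    # Matrices are 3x3 homogeneous 2D affines given as nested lists.

    def matmul3(a, b):
        """Multiply two 3x3 matrices (nested lists)."""
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    def apply_affine(m, x, y):
        """Map a 2D point through a 3x3 homogeneous affine matrix."""
        return (m[0][0] * x + m[0][1] * y + m[0][2],
                m[1][0] * x + m[1][1] * y + m[1][2])

    # One transform per stage, each estimated independently in practice:
    hist_to_block = [[0.5, 0.0, 2.0],   # Stage 1: undistort histology to block face
                     [0.0, 0.5, 1.0],
                     [0.0, 0.0, 1.0]]
    block_to_slice = [[1.0, 0.0, 10.0],  # Stage 2: place block within brain slab photo
                      [0.0, 1.0, 20.0],
                      [0.0, 0.0, 1.0]]
    slice_to_mri = [[1.0, 0.0, -5.0],    # Stage 3: slice-to-volume alignment
                    [0.0, 1.0, 0.0],
                    [0.0, 0.0, 1.0]]

    # Stage 4 composes the chain so histology coordinates land in MRI space.
    hist_to_mri = matmul3(slice_to_mri, matmul3(block_to_slice, hist_to_block))

    print(apply_affine(hist_to_mri, 4.0, 6.0))  # -> (9.0, 24.0)
    ```

    The point of the chain is that each stage can be estimated, inspected, and re-run in isolation, while the final mapping is a single composed transform; the real pipeline additionally handles nonlinear deformations, which this affine-only sketch omits.
    
    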

    The Digital Brain Bank, an open access platform for post-mortem datasets

    Post-mortem MRI provides the opportunity to acquire high-resolution datasets to investigate neuroanatomy, and validate the origins of image contrast through microscopy comparisons. We introduce the Digital Brain Bank (open.win.ox.ac.uk/DigitalBrainBank), a data release platform providing open access to curated, multimodal post-mortem neuroimaging datasets. Datasets span three themes: the Digital Neuroanatomist (datasets for detailed neuroanatomical investigations), the Digital Brain Zoo (datasets for comparative neuroanatomy), and the Digital Pathologist (datasets for neuropathology investigations). The first Digital Brain Bank release includes twenty-one distinctive whole-brain diffusion MRI datasets for structural connectivity investigations, alongside microscopy and complementary MRI modalities. This includes one of the highest-resolution whole-brain human diffusion MRI datasets ever acquired, whole-brain diffusion MRI in fourteen non-human primate species, and one of the largest post-mortem whole-brain cohort imaging studies in neurodegeneration. The Digital Brain Bank is the culmination of our lab’s investment into post-mortem MRI methodology and MRI-microscopy analysis techniques. This manuscript provides a detailed overview of our work with post-mortem imaging to date, including the development of diffusion MRI methods to image large post-mortem samples, including whole, human brains. Taken together, the Digital Brain Bank provides cross-scale, cross-species datasets facilitating the incorporation of post-mortem data into neuroimaging studies.

    DUNE Offline Computing Conceptual Design Report

    This document describes Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate, and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves and to provide computing that achieves the physics goals of the DUNE experiment.
