
    Informatics and data mining tools and strategies for the Human Connectome Project

    The Human Connectome Project (HCP) is a major endeavor that will acquire and analyze connectivity data plus other neuroimaging, behavioral, and genetic data from 1,200 healthy adults. It will serve as a key resource for the neuroscience research community, enabling discoveries of how the brain is wired and how it functions in different individuals. To fulfill its potential, the HCP consortium is developing an informatics platform that will handle: 1) storage of primary and processed data, 2) systematic processing and analysis of the data, 3) open-access data sharing, and 4) mining and exploration of the data. This informatics platform will include two primary components. ConnectomeDB will provide database services for storing and distributing the data, as well as data analysis pipelines. Connectome Workbench will provide visualization and exploration capabilities. The platform will be based on standard data formats and provide an open set of application programming interfaces (APIs) that will facilitate broad utilization of the data and integration of HCP services into a variety of external applications. Primary and processed data generated by the HCP will be openly shared with the scientific community, and the informatics platform will be available under an open source license. This paper describes the HCP informatics platform as currently envisioned and places it into the context of the overall HCP vision and agenda.
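
    The open APIs mentioned above can be made concrete: ConnectomeDB builds on the XNAT imaging informatics platform, so a minimal client can reach it through XNAT-style REST endpoints. The sketch below is an illustration under that assumption, with placeholder credentials; the exact routes and authentication flow on db.humanconnectome.org may differ.

```python
# Minimal sketch of an XNAT-style REST query, the kind of API ConnectomeDB
# builds on. Credentials are placeholders; actual routes may differ.
import requests

BASE_URL = "https://db.humanconnectome.org"  # ConnectomeDB host

session = requests.Session()
session.auth = ("my_username", "my_password")  # registered account (placeholder)

# XNAT exposes projects, subjects, and experiments as nested REST resources.
resp = session.get(f"{BASE_URL}/data/projects", params={"format": "json"})
resp.raise_for_status()

# XNAT wraps JSON results in a ResultSet envelope.
for project in resp.json()["ResultSet"]["Result"]:
    print(project["ID"], "-", project.get("name", ""))
```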

    The Human Connectome Project: A retrospective

    The Human Connectome Project (HCP) was launched in 2010 as an ambitious effort to accelerate advances in human neuroimaging, particularly for measures of brain connectivity; apply these advances to study a large number of healthy young adults; and freely share the data and tools with the scientific community. NIH awarded grants to two consortia; this retrospective focuses on the “WU-Minn-Ox” HCP consortium centered at Washington University, the University of Minnesota, and the University of Oxford. In just over 6 years, the WU-Minn-Ox consortium succeeded in its core objectives by: 1) improving MR scanner hardware, pulse sequence design, and image reconstruction methods, 2) acquiring and analyzing multimodal MRI and MEG data of unprecedented quality together with behavioral measures from more than 1100 HCP participants, and 3) freely sharing the data (via the ConnectomeDB database) and associated analysis and visualization tools. To date, more than 27 petabytes of data have been shared, and 1538 papers acknowledging HCP data use have been published. The “HCP-style” neuroimaging paradigm has emerged as a set of best-practice strategies for optimizing data acquisition and analysis. This article reviews the history of the HCP, including comments on key events and decisions associated with major project components. We discuss several scientific advances using HCP data, including improved cortical parcellations, analyses of connectivity based on functional and diffusion MRI, and analyses of brain-behavior relationships. We also touch upon our efforts to develop and share a variety of associated data processing and analysis tools along with detailed documentation, tutorials, and an educational course to train the next generation of neuroimagers. We conclude with a look forward at opportunities and challenges facing the human neuroimaging field from the perspective of the HCP consortium.

    The Human Connectome Project's neuroimaging approach

    Noninvasive human neuroimaging has yielded many discoveries about the brain. Numerous methodological advances have also occurred, though inertia has slowed their adoption. This paper presents an integrated approach to data acquisition, analysis and sharing that builds upon recent advances, particularly from the Human Connectome Project (HCP). The 'HCP-style' paradigm has seven core tenets: (i) collect multimodal imaging data from many subjects; (ii) acquire data at high spatial and temporal resolution; (iii) preprocess data to minimize distortions, blurring and temporal artifacts; (iv) represent data using the natural geometry of cortical and subcortical structures; (v) accurately align corresponding brain areas across subjects and studies; (vi) analyze data using neurobiologically accurate brain parcellations; and (vii) share published data via user-friendly databases. We illustrate the HCP-style paradigm using existing HCP data sets and provide guidance for future research. Widespread adoption of this paradigm should accelerate progress in understanding the brain in health and disease.
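
    Tenet (iv), representing data in the natural geometry of the brain, is embodied in the CIFTI-2 'grayordinate' format produced by HCP pipelines (cortical surface vertices plus subcortical voxels in one file). The sketch below shows what reading such a file looks like with nibabel; the filename is a placeholder, and this is an illustration rather than an official HCP snippet.

```python
# Sketch: read a hypothetical HCP-style dense scalar CIFTI-2 file and walk
# its brain-model axis, which maps data columns onto surface vertices and
# subcortical voxels ("grayordinates").
import nibabel as nib

img = nib.load("example.dscalar.nii")   # placeholder filename
data = img.get_fdata()                  # shape: (n_maps, n_grayordinates)

brain_axis = img.header.get_axis(1)     # axis 1 indexes the grayordinates
for name, data_slice, model in brain_axis.iter_structures():
    # e.g. CIFTI_STRUCTURE_CORTEX_LEFT, CIFTI_STRUCTURE_THALAMUS_LEFT, ...
    print(name, data[:, data_slice].shape)
```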

    The heritability of multi-modal connectivity in human brain activity

    Patterns of intrinsic human brain activity exhibit a profile of functional connectivity that is associated with behaviour and cognitive performance, and deteriorates with disease. This paper investigates the relative importance of genetic factors and the common environment between twins in determining this functional connectivity profile. Using functional magnetic resonance imaging (fMRI) data from 820 subjects in the Human Connectome Project, and magnetoencephalographic (MEG) recordings from a subset of them, the heritability of connectivity between 39 cortical regions was estimated. On average over all connections, genes account for about 15% of the observed variance in fMRI connectivity (and about 10% in alpha-band and 20% in beta-band oscillatory power synchronisation), which substantially exceeds the contribution from the environment shared between twins. Therefore, insofar as twins share a common upbringing, it appears that genes, rather than the developmental environment, play a dominant role in determining the coupling of neuronal activity.
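
    The intuition behind such twin comparisons can be stated compactly: if monozygotic (MZ) twins resemble each other more than dizygotic (DZ) twins on a trait, genes contribute to its variance. The sketch below implements Falconer's classic estimator, h^2 = 2 * (r_MZ - r_DZ), on synthetic data; the study itself presumably fit a more formal variance-component model, so treat this as the textbook approximation only.

```python
# Falconer's formula on synthetic twin data: a textbook approximation to
# the variance-component modelling a study like this would actually use.
import numpy as np

def falconer(mz_pairs, dz_pairs):
    """Return (h2, c2) from arrays of shape (n_pairs, 2), one row per pair."""
    r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
    r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]
    h2 = 2.0 * (r_mz - r_dz)   # additive genetic share of variance
    c2 = 2.0 * r_dz - r_mz     # shared-environment share
    return h2, c2

# Toy usage: synthetic "connectivity strengths" with higher MZ correlation.
rng = np.random.default_rng(0)
mz = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=100)
dz = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], size=100)
print(falconer(mz, dz))
```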

    Assessing Preprocessing Methods and Software Tools for Functional Connectivity Analysis

    This thesis presents an introductory exploration of the neuroimaging field, focusing on volume-based functional connectivity analysis. By studying resting-state data with the Power et al. atlas, the research aims to uncover patterns and relationships between brain regions, using a qualitative, single-subject study design. Two tools were used to conduct volume-based functional connectivity analyses: the DPABI Toolbox, which preprocessed raw Human Connectome Project (HCP) data and performed resting-state analysis to produce a functional connectivity matrix, and Connectome Workbench, with which a custom pipeline for generating a resting-state functional connectivity matrix was built. The resting-state analysis was performed both on the raw data preprocessed by DPABI and on the preprocessed data provided by the HCP. Analyses run on the same DPABI-preprocessed data yielded visually indistinguishable matrices, and the similarity metrics corroborated this observation, indicating no significant dissimilarity. When DPABI-based preprocessing and analysis was compared with the self-made pipeline operating on HCP-provided preprocessed data, the two resulting matrices shared a similar structure but exhibited noticeable differences in overall intensity; the similarity metrics confirmed this distinction, recording lower values that indicate dissimilarity between the matrices.
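
    The abstract does not name the similarity metrics used, so the sketch below stands in with one common choice: build region-by-region connectivity matrices from time series and correlate their upper triangles. The 264-region count matches the Power et al. atlas mentioned above; the data here are synthetic and purely illustrative.

```python
# Sketch: compute two functional connectivity matrices and quantify their
# similarity by correlating upper triangles (one common metric; the thesis
# may have used others).
import numpy as np

def fc_matrix(timeseries):
    """(n_timepoints, n_regions) time series -> (n_regions, n_regions) FC."""
    return np.corrcoef(timeseries, rowvar=False)

def matrix_similarity(a, b):
    """Pearson correlation of the strictly-upper triangles of two matrices."""
    iu = np.triu_indices_from(a, k=1)
    return np.corrcoef(a[iu], b[iu])[0, 1]

# Toy usage: synthetic data stand in for the two pipelines' outputs.
rng = np.random.default_rng(0)
ts = rng.standard_normal((1200, 264))       # 264 Power-atlas regions
fc_a = fc_matrix(ts)
fc_b = fc_matrix(ts + 0.1 * rng.standard_normal(ts.shape))
print(f"upper-triangle correlation: {matrix_similarity(fc_a, fc_b):.3f}")
```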

    Multi-Modal Neuroimaging Analysis and Visualization Tool (MMVT)

    Sophisticated visualization tools are essential for the presentation and exploration of human neuroimaging data. While two-dimensional orthogonal views of neuroimaging data are conventionally used to display activity and statistical analysis, three-dimensional (3D) representation is useful for showing the spatial distribution of a functional network, as well as its temporal evolution. For these purposes, there is currently no open-source, 3D neuroimaging tool that can simultaneously visualize desired combinations of MRI, CT, EEG, MEG, fMRI, PET, and intracranial EEG (i.e., ECoG, depth electrodes, and DBS). Here we present the Multi-Modal Visualization Tool (MMVT), which is designed for researchers to interact with their neuroimaging functional and anatomical data through simultaneous visualization of these existing imaging modalities. MMVT contains two separate modules: the first is an add-on to the open-source 3D-rendering program Blender. It is an interactive graphical interface that enables users to simultaneously visualize multi-modality functional and statistical data on cortical and subcortical surfaces as well as MEG/EEG sensors and intracranial electrodes. This tool also enables highly accurate 3D visualization of neuroanatomy, including the location of invasive electrodes relative to brain structures. The second module includes complete stand-alone pre-processing pipelines, from raw data to statistical maps. Each of the modules and module features can be integrated, separately from the tool, into existing data pipelines. This gives the tool a distinct advantage in both clinical and research domains, as each has highly specialized visual and processing needs. MMVT leverages open-source software to build a comprehensive tool for data visualization and exploration.
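
    MMVT's own Blender interface does not lend itself to a short snippet, but the core idea of its first module, painting per-vertex functional values onto a 3D cortical surface, can be sketched with nilearn's surface plotting. This is a generic stand-in under that assumption, not MMVT's API.

```python
# Generic surface-overlay sketch (nilearn, not MMVT's API): render per-vertex
# values on an inflated cortical mesh, the concept MMVT makes interactive.
import numpy as np
from nilearn import datasets, plotting, surface

fsaverage = datasets.fetch_surf_fsaverage()          # fsaverage5 meshes
mesh = surface.load_surf_mesh(fsaverage.infl_left)   # left-hemisphere mesh

# Random per-vertex values stand in for a functional or statistical map.
stat = np.random.default_rng(0).standard_normal(mesh.coordinates.shape[0])

plotting.plot_surf_stat_map(
    fsaverage.infl_left, stat,
    bg_map=fsaverage.sulc_left,   # sulcal depth as anatomical background
    hemi="left", view="lateral", threshold=1.5,
)
plotting.show()
```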

    Novel Brain Complexity Measures Based on Information Theory

    Brain networks are widely used models to understand the topology and organization of the brain. These networks can be represented by a graph, where nodes correspond to brain regions and edges to structural or functional connections. Several measures have been proposed to describe the topological features of these networks, but unfortunately, it is still unclear which measures give the best representation of the brain. In this paper, we propose a new set of measures based on information theory. Our approach interprets the brain network as a stochastic process where impulses are modeled as a random walk on the graph nodes. This new interpretation provides a solid theoretical framework from which several global and local measures are derived. Global measures provide quantitative values for whole-brain network characterization and include entropy, mutual information, and erasure mutual information; the latter is a new measure based on mutual information and erasure entropy. Local measures, in turn, are based on different decompositions of the global measures and capture different properties of the nodes; they include entropic surprise, mutual surprise, mutual predictability, and erasure surprise. The proposed approach is evaluated using synthetic model networks and structural and functional human networks at different scales. Results demonstrate that the global measures characterize new properties of brain network topology and that, for a given number of nodes, an optimal number of edges can be found for small-world networks. Local measures reveal different properties of the nodes, such as the uncertainty associated with a node or the uniqueness of the paths to which it belongs. Finally, the consistency of the results across healthy subjects demonstrates the robustness of the proposed measures.
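
    The random-walk construction is concrete enough to compute directly: place a walker on the graph, take its degree-proportional stationary distribution, and read entropy-type quantities off the transition matrix. The sketch below computes the walk's entropy rate for a toy undirected network; the paper's full measure set (mutual information, erasure variants, local surprises) builds on these same ingredients.

```python
# Entropy rate of a random walk on an undirected graph: the basic quantity
# from which global and local information-theoretic measures are derived.
import numpy as np

def entropy_rate(adjacency):
    """H(X_{t+1} | X_t) in bits for a walk on an undirected graph."""
    degrees = adjacency.sum(axis=1)
    P = adjacency / degrees[:, None]   # row-stochastic transition matrix
    pi = degrees / degrees.sum()       # stationary distribution: pi_i ~ d_i
    with np.errstate(divide="ignore"):
        logP = np.where(P > 0, np.log2(P), 0.0)
    return -np.sum(pi[:, None] * P * logP)

# Toy usage: a 6-node ring, where every node has degree 2, so the walk
# carries exactly log2(2) = 1 bit of uncertainty per step.
A = np.roll(np.eye(6), 1, axis=1) + np.roll(np.eye(6), -1, axis=1)
print(entropy_rate(A))   # -> 1.0
```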