
    Deep learning cardiac motion analysis for human survival prediction

    Motion analysis is used in computer vision to understand the behaviour of moving objects in sequences of images. Optimising the interpretation of dynamic biological systems requires accurate and precise motion tracking, as well as efficient representations of high-dimensional motion trajectories, so that these can be used for prediction tasks. Here we use image sequences of the heart, acquired using cardiac magnetic resonance imaging, to create time-resolved three-dimensional segmentations with a fully convolutional network trained on anatomical shape priors. This dense motion model formed the input to a supervised denoising autoencoder (4Dsurvival), a hybrid network whose latent code is trained on observed outcome data, yielding a representation optimised for survival prediction. To handle right-censored survival outcomes, our network used a Cox partial likelihood loss function. In a study of 302 patients, the predictive accuracy (quantified by Harrell's C-index) was significantly higher (p < .0001) for our model, C=0.73 (95% CI: 0.68-0.78), than for the human benchmark of C=0.59 (95% CI: 0.53-0.65). This work demonstrates how a complex computer vision task using high-dimensional medical image data can efficiently predict human survival.
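    The loss function mentioned above has a compact form for right-censored data. Below is a minimal sketch of a Cox partial-likelihood loss in PyTorch; it is not the paper's own implementation, and the function name and tensor layout are assumptions:

```python
import torch

def cox_partial_likelihood_loss(risk_scores, times, events):
    """Negative Cox partial log-likelihood for right-censored outcomes.

    risk_scores: (N,) model outputs (log hazard ratios)
    times:       (N,) observed event or censoring times
    events:      (N,) 1.0 if the event occurred, 0.0 if censored
    """
    # Sort by descending time so each subject's risk set is a prefix.
    order = torch.argsort(times, descending=True)
    scores, observed = risk_scores[order], events[order]
    # Cumulative log-sum-exp gives the log of each risk-set denominator.
    log_risk = torch.logcumsumexp(scores, dim=0)
    # Only subjects who experienced the event contribute terms.
    uncensored_ll = (scores - log_risk) * observed
    return -uncensored_ll.sum() / observed.sum().clamp(min=1.0)
```

    Sorting by descending time makes each subject's risk set a prefix of the array, so the partial-likelihood denominator reduces to a single cumulative log-sum-exp (ties handled in the Breslow style).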

    Using Networks To Understand Medical Data: The Case of Class III Malocclusions

    A system of elements that interact with or regulate each other can be represented by a mathematical object called a network. While network analysis has been successfully applied to high-throughput biological systems, less has been done regarding its application in more applied fields of medicine; here we show an application based on standard medical diagnostic data. We apply network analysis to Class III malocclusion, one of the most difficult orofacial anomalies to understand and treat. We hypothesize that different interactions of the skeletal components can contribute to pathological disequilibrium; to test this hypothesis, we apply network analysis to 532 Class III young female patients. The topology of Class III malocclusion obtained by network analysis shows a strong co-occurrence of abnormal skeletal features. The pattern of these occurrences influences the vertical and horizontal balance of disharmony in skeletal form and position. Patients with more unbalanced orthodontic phenotypes show a preponderance of pathological skeletal nodes and a minor relevance of adaptive dentoalveolar equilibrating nodes. Furthermore, by applying Power Graph analysis we identify functional modules among orthodontic nodes. These modules correspond to groups of tightly inter-related features and presumably constitute the key regulators of plasticity and the sites of imbalance of the growing dentofacial Class III system. The data of the present study show that, at their most basic abstraction level, orofacial characteristics can be represented as graphs, using nodes to represent orthodontic characteristics and edges to represent their various types of interactions. This mathematical model could improve the interpretation of quantitative, patient-specific information and help to better target therapy. Last but not least, the methodology we have applied in analyzing orthodontic features can easily be applied to other fields of medical science.
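    To illustrate the graph abstraction the study describes, here is a minimal sketch that builds a feature co-occurrence network with networkx; the patient records and feature names below are hypothetical stand-ins, not data from the study:

```python
import itertools
import networkx as nx

# Hypothetical patients, each a set of abnormal skeletal/dental findings.
patients = [
    {"maxillary_retrusion", "mandibular_protrusion"},
    {"mandibular_protrusion", "anterior_crossbite"},
    {"maxillary_retrusion", "mandibular_protrusion", "anterior_crossbite"},
]

G = nx.Graph()
for features in patients:
    for a, b in itertools.combinations(sorted(features), 2):
        # Edge weight counts how often two abnormal features co-occur.
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

# Heavily weighted edges hint at the tightly inter-related modules
# that Power Graph analysis would compress.
for a, b, d in sorted(G.edges(data=True), key=lambda e: -e[2]["weight"]):
    print(f"{a} -- {b}: co-occurs in {d['weight']} patients")
```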

    Current advances in systems and integrative biology

    Systems biology has gained a tremendous amount of interest in the last few years. This is partly due to the realization that traditional approaches focusing on only a few molecules at a time cannot describe the impact of aberrant or modulated molecular environments across a whole system. Furthermore, a hypothesis-driven study aims to prove or disprove its postulations, whereas a hypothesis-free systems approach can yield an unbiased and novel testable hypothesis as an end result. This latter approach foregoes assumptions about how a biological system should react to an altered microenvironment within a cellular context, across a tissue, or impacting on distant organs. Additionally, the re-use of existing data by systematic data mining and re-stratification, one of the cornerstones of integrative systems biology, is gaining attention. While tremendous efforts using a systems methodology have already yielded excellent results, it is apparent that a lack of suitable analytic tools and purpose-built databases poses a major bottleneck in applying a systematic workflow. This review addresses the current approaches used in systems analysis and the obstacles often encountered in large-scale data analysis and integration, which tend to go unnoticed but have a direct impact on the final outcome of a systems approach. Its wide applicability, ranging from basic research and disease descriptors to pharmacological studies and personalized medicine, makes this emerging approach well suited to address biological and medical questions where conventional methods are not ideal.

    A precision medicine initiative for Alzheimer's disease: the road ahead to biomarker-guided integrative disease modeling

    After intense scientific exploration and more than a decade of failed trials, Alzheimer's disease (AD) remains a fatal global epidemic. A traditional research and drug development paradigm continues to target heterogeneous late-stage clinically phenotyped patients with single 'magic bullet' drugs. Here, we propose that it is time for a paradigm shift towards the implementation of precision medicine (PM) for enhanced risk screening, detection, treatment, and prevention of AD. The overarching structure of how PM for AD can be achieved will be provided through the convergence of breakthrough technological advances, including big data science, systems biology, genomic sequencing, blood-based biomarkers, integrated disease modeling, and P4 medicine. We hypothesize that deconstructing AD into the multiple genetic and biological subsets existing within this heterogeneous target population will provide an effective PM strategy for treating individual patients with the specific agent(s) likely to work best given each individual's biological make-up. The Alzheimer's Precision Medicine Initiative (APMI) is an international collaboration of leading interdisciplinary clinicians and scientists devoted to the implementation of PM in Neurology, Psychiatry and Neuroscience. We hypothesize that the successful realization of PM in AD and other neurodegenerative diseases will result in breakthrough therapies, as in oncology, with optimized safety profiles, better responder rates, and better treatment responses, particularly through biomarker-guided early preclinical disease-stage clinical trials.

    A Knowledge Graph Framework for Dementia Research Data

    Dementia research encompasses diverse data modalities, including advanced imaging, deep phenotyping, and multi-omics analysis. However, integrating these disparate data sources has historically posed a significant challenge, obstructing the unification and comprehensive analysis of the collected information. In recent years, knowledge graphs have emerged as a powerful tool for addressing such integration issues by enabling the consolidation of heterogeneous data sources into a structured, interconnected network of knowledge. In this context, we introduce DemKG, an open-source framework designed to facilitate the construction of a knowledge graph integrating dementia research data, comprising three core components: a KG-builder that integrates diverse domain ontologies and data annotations, an extensions ontology providing terms tailored for dementia research, and a versatile transformation module for incorporating study data. In contrast to other current solutions, our framework provides a stable foundation by leveraging established ontologies and community standards, and it simplifies study data integration while delivering solid ontology design patterns, broadening its usability. Furthermore, the modular approach of its components enhances flexibility and scalability. We showcase how DemKG might aid and improve multi-modal data investigations through a series of proof-of-concept scenarios focused on relevant Alzheimer's disease biomarkers.
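    To give a concrete sense of the kind of consolidation a KG-builder performs, here is a minimal sketch that maps a single study observation to RDF triples with rdflib; the namespace and term names are hypothetical placeholders, not DemKG's actual schema:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

# Hypothetical namespace standing in for DemKG's extensions ontology.
DEM = Namespace("https://example.org/demkg/")

g = Graph()
g.bind("dem", DEM)

# One study observation: a participant with a CSF amyloid-beta measurement.
participant = DEM["participant/p001"]
measurement = DEM["measurement/m001"]
g.add((participant, RDF.type, DEM.Participant))
g.add((measurement, RDF.type, DEM.CSFAmyloidBeta42))
g.add((measurement, DEM.belongsTo, participant))
g.add((measurement, DEM.hasValue, Literal(512.3, datatype=XSD.double)))

print(g.serialize(format="turtle"))
```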

    Artificial intelligence and model checking methods for in silico clinical trials

    Model-based approaches to the safety and efficacy assessment of pharmacological treatments (In Silico Clinical Trials, ISCT) hold the promise to decrease the time and cost of the needed experimentation, reduce the need for animal and human testing, and enable personalised medicine, where treatments tailored to each single patient can be designed before being actually administered. Research in the Virtual Physiological Human (VPH) is harvesting this promise by developing quantitative mechanistic models of patient physiology and drugs. Depending on many parameters, such models define physiological differences among individuals and different reactions to drug administration; value assignments to model parameters can thus be regarded as Virtual Patients (VPs). Just as in vivo clinical trials test relevant drugs on suitable candidate patients, ISCT simulate the effect of relevant drugs on VPs covering the behaviours that might occur in vivo. Having a population of VPs representative of the whole spectrum of human patient behaviours is a key enabler of ISCT. However, VPH models of practical relevance are typically too complex to be solved analytically or formally analysed, so they are usually solved numerically within simulators. In this setting, Artificial Intelligence and Model Checking methods are typically employed. Indeed, a VP coupled with a pharmacological treatment represents a closed-loop model in which the VP plays the role of the physical subsystem and the treatment strategy plays the role of the control software. Systems with this structure are known as Cyber-Physical Systems (CPSs); simulation-based methodologies for CPSs can therefore be employed within personalised medicine to compute representative VP populations and to conduct ISCT.

    In this thesis, we advance the state of the art of simulation-based Artificial Intelligence and Model Checking methods for ISCT in the following directions. First, we present a Statistical Model Checking (SMC) methodology based on hypothesis testing that, given a VPH model as input, computes a population of VPs which is representative (i.e., large enough to represent all relevant phenotypes, with a given degree of statistical confidence) and stratified (i.e., organised as a multi-layer hierarchy of homogeneous sub-groups). Stratification allows ISCT to adaptively focus on specific phenotypes, and also supports prioritisation of patient sub-groups in follow-up in vivo clinical trials. Second, building on a representative VP population, we design an ISCT aimed at optimising a complex treatment for a patient digital twin, that is, the virtual counterpart of that patient's physiology, defined by means of a set of VPs. Our ISCT employs an intelligent search that drives a VPH model simulator to seek the lightest still-effective treatment for the input patient digital twin. Third, to enable interoperability among VPH models defined in different modelling and simulation environments, and to increase the efficiency of our ISCT, we also design an optimised simulator driver to speed up backtracking-based search algorithms that drive simulators. Finally, we evaluate the effectiveness of the presented methodologies on state-of-the-art use cases and validate our results on retrospective clinical data.
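    To convey the hypothesis-testing flavour of statistical model checking, here is a minimal Monte Carlo sketch whose sample size comes from the Chernoff-Hoeffding bound; it is not the thesis's stratification algorithm, and the virtual-patient model and property below are toy stand-ins:

```python
import math
import random

def smc_sample_size(eps, delta):
    """Chernoff-Hoeffding bound: samples needed so the Monte Carlo
    estimate is within eps of the true probability w.p. >= 1 - delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

def estimate_property_probability(sample_vp, holds, eps=0.05, delta=0.01):
    """Estimate P[property holds for a randomly drawn virtual patient]."""
    n = smc_sample_size(eps, delta)
    hits = sum(holds(sample_vp()) for _ in range(n))
    return hits / n

# Toy stand-ins: a VP is one sampled parameter; the "property" is a
# threshold check that a simulator would normally evaluate.
random.seed(0)
p_hat = estimate_property_probability(
    sample_vp=lambda: {"clearance_rate": random.gauss(1.0, 0.3)},
    holds=lambda vp: vp["clearance_rate"] > 0.5,
)
print(f"estimated probability: {p_hat:.3f}")
```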

    Principal component gene set enrichment (PCGSE)

    Motivation: Although principal component analysis (PCA) is widely used for the dimensional reduction of biomedical data, interpretation of PCA results remains daunting. Most existing methods attempt to explain each principal component (PC) in terms of a small number of variables by generating approximate PCs with few non-zero loadings. Although useful when just a few variables dominate the population PCs, these methods are often inadequate for characterizing the PCs of high-dimensional genomic data. For genomic data, reproducible and biologically meaningful PC interpretation requires methods based on the combined signal of functionally related sets of genes. While gene set testing methods have been widely used in supervised settings to quantify the association of groups of genes with clinical outcomes, these methods have seen only limited application for testing the enrichment of gene sets relative to sample PCs.
    Results: We describe a novel approach, principal component gene set enrichment (PCGSE), for computing the statistical association between gene sets and the PCs of genomic data. The PCGSE method performs a two-stage competitive gene set test using the correlation between each gene and each PC as the gene-level test statistic, with flexible choice of both the gene set test statistic and the method used to compute the null distribution of the gene set statistic. Using simulated data with simulated gene sets and real gene expression data with curated gene sets, we demonstrate that biologically meaningful and computationally efficient results can be obtained from a simple parametric version of the PCGSE method that performs a correlation-adjusted two-sample t-test between the gene-level test statistics for gene set members and genes not in the set.
    Availability: http://cran.r-project.org/web/packages/PCGSE/index.html
    Contact: [email protected] or [email protected]
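    The two-stage idea can be sketched in a few lines. The simplified version below uses each gene's correlation with a PC as the gene-level statistic and a plain two-sample t-test as the set-level test; the published method additionally adjusts for inter-gene correlation, which this sketch omits, and the sketch is in Python rather than the package's R:

```python
import numpy as np
from scipy import stats

def pcgse_simple(X, gene_sets, n_pcs=2):
    """Simplified PCGSE sketch: the gene-level statistic is each gene's
    correlation with a PC; the set-level test is a two-sample t-test
    between genes in the set and genes outside it. (The published
    method adjusts for inter-gene correlation, omitted here.)"""
    Xc = X - X.mean(axis=0)                       # samples x genes, centred
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    pcs = U[:, :n_pcs] * s[:n_pcs]                # sample-level PC scores
    # Correlation of every gene with every PC (genes x PCs).
    gene_corr = np.array([[np.corrcoef(Xc[:, j], pcs[:, k])[0, 1]
                           for k in range(n_pcs)]
                          for j in range(Xc.shape[1])])
    results = {}
    for name, members in gene_sets.items():
        in_set = np.zeros(X.shape[1], dtype=bool)
        in_set[members] = True
        # One competitive test per PC: set members vs. non-members.
        results[name] = [stats.ttest_ind(gene_corr[in_set, k],
                                         gene_corr[~in_set, k]).pvalue
                         for k in range(n_pcs)]
    return results

# Toy usage: 30 samples, 200 genes, one candidate gene set.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 200))
print(pcgse_simple(X, {"setA": list(range(20))}))
```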

    An Object-Oriented Approach to Knowledge Representation in a Biomedical Domain

    An object-oriented approach has been applied to the different stages involved in developing a knowledge base about insulin metabolism. At an early stage, a separation was made between terminological and assertional knowledge. The terminological component was developed by medical experts and represented in CORE. An object-oriented knowledge acquisition process was applied to the assertional knowledge. A frame description is proposed which includes features such as states and events, inheritance, and collaboration. States and events are formalized with a qualitative calculus. The terminological knowledge was very useful in the development of the assertional component: it assisted in understanding the problem domain and, at the implementation stage, in building good inheritance hierarchies.
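    A minimal object-oriented sketch of the frame idea, with states, events, and inheritance, might look like the following; the class and slot names are hypothetical and are not taken from the insulin-metabolism knowledge base:

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    """A frame: named slots plus a qualitative state; inheritance is
    supplied by the ordinary class hierarchy."""
    name: str
    state: dict = field(default_factory=dict)

    def handle_event(self, event: str) -> None:
        raise NotImplementedError

@dataclass
class Hormone(Frame):
    # Terminological knowledge: what kind of entity this frame denotes.
    secreted_by: str = ""

@dataclass
class Insulin(Hormone):
    def handle_event(self, event: str) -> None:
        # Assertional knowledge: qualitative effect of an event on state.
        if event == "glucose_rise":
            self.state["secretion"] = "increasing"
        elif event == "glucose_fall":
            self.state["secretion"] = "decreasing"

insulin = Insulin(name="insulin", secreted_by="pancreatic beta cells")
insulin.handle_event("glucose_rise")
print(insulin.state)  # {'secretion': 'increasing'}
```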