
    Distributed learning on 20 000+ lung cancer patients - The Personal Health Train

    Background and purpose: Access to healthcare data is indispensable for scientific progress and innovation, yet sharing healthcare data is time-consuming and notoriously difficult due to privacy and regulatory concerns. The Personal Health Train (PHT) provides a privacy-by-design infrastructure connecting FAIR (Findable, Accessible, Interoperable, Reusable) data sources and allows distributed data analysis and machine learning. Patient data never leaves a healthcare institute.
    Materials and methods: Lung cancer patient-specific databases (tumor staging and post-treatment survival information) of oncology departments were translated according to a FAIR data model and stored locally in a graph database. Software was installed locally to enable deployment of distributed machine learning algorithms via a central server. The algorithms (MATLAB; code and documentation publicly available) preserve patient privacy, as only summary statistics and regression coefficients are exchanged with the central server. A logistic regression model predicting post-treatment two-year survival was trained and evaluated using receiver operating characteristic (ROC) curves, root mean square prediction error (RMSE) and calibration plots.
    Results: In 4 months, we connected databases with 23 203 patient cases across 8 healthcare institutes in 5 countries (Amsterdam, Cardiff, Maastricht, Manchester, Nijmegen, Rome, Rotterdam, Shanghai) using the PHT. Summary statistics were computed across databases. A distributed logistic regression model predicting post-treatment two-year survival was trained on 14 810 patients treated between 1978 and 2011 and validated on 8 393 patients treated between 2012 and 2015.
    Conclusion: The PHT infrastructure demonstrably overcomes patient privacy barriers to healthcare data sharing and enables fast data analyses across multiple institutes from different countries with different regulatory regimes. This infrastructure promotes global evidence-based medicine while prioritizing patient privacy.
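
    The analysis pattern described above, in which each institute shares only aggregate summary statistics and coefficient updates with a central server, can be illustrated with a short sketch. The abstract notes that the published algorithms are in MATLAB; the Python/NumPy sketch below is an illustrative approximation of a distributed logistic regression of this kind, not the authors' code, and all names (`Site`, `central_newton_raphson`) and the synthetic data are hypothetical.

```python
# Illustrative sketch (not the published MATLAB code) of a privacy-preserving
# distributed logistic regression: each site shares only aggregate gradients
# and Hessians with a central server; patient-level rows never leave the site.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


class Site:
    """One healthcare institute holding its own patient-level data locally."""

    def __init__(self, X, y):
        # X: (n_patients, n_features) local design matrix (intercept included)
        # y: (n_patients,) binary outcome, e.g. two-year post-treatment survival
        self.X, self.y = X, y

    def local_summaries(self, beta):
        """Return only aggregates: gradient and Hessian of the local
        log-likelihood at the current coefficients."""
        p = sigmoid(self.X @ beta)
        grad = self.X.T @ (self.y - p)
        W = p * (1.0 - p)
        hess = -(self.X * W[:, None]).T @ self.X
        return grad, hess


def central_newton_raphson(sites, n_features, n_iter=25, ridge=1e-8):
    """Central server: pools the summary statistics from every site and
    updates the shared coefficient vector by Newton-Raphson."""
    beta = np.zeros(n_features)
    for _ in range(n_iter):
        grad = np.zeros(n_features)
        hess = np.zeros((n_features, n_features))
        for site in sites:                      # only aggregates are exchanged
            g, h = site.local_summaries(beta)
            grad += g
            hess += h
        beta -= np.linalg.solve(hess - ridge * np.eye(n_features), grad)
    return beta


if __name__ == "__main__":
    # Synthetic demonstration with 8 simulated "institutes".
    rng = np.random.default_rng(0)
    true_beta = np.array([-0.5, 1.2, -0.8])
    sites = []
    for _ in range(8):
        X = np.column_stack([np.ones(500), rng.normal(size=(500, 2))])
        y = rng.binomial(1, sigmoid(X @ true_beta))
        sites.append(Site(X, y))
    beta_hat = central_newton_raphson(sites, n_features=3)
    print("pooled coefficients:", np.round(beta_hat, 3))
```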

    Proteolytic Processing of the Cryptosporidium Glycoprotein gp40/15 by Human Furin and by a Parasite-Derived Furin-Like Protease Activity

    The apicomplexan parasite Cryptosporidium causes diarrheal disease worldwide. Proteolytic processing of proteins plays a significant role in host cell invasion by apicomplexan parasites. In previous studies, we described gp40/15, a Cryptosporidium sp. glycoprotein that is proteolytically cleaved to yield two surface glycopeptides (gp40 and gp15), which are implicated in mediating infection of host cells. In the present study, we showed that biosynthetically labeled gp40/15 is processed in Cryptosporidium parvum-infected HCT-8 cells. We identified a putative furin cleavage site RSRR↓ in the deduced amino acid sequence of gp40/15 from C. parvum and from all Cryptosporidium hominis subtypes except subtype 1e. Both human furin and a protease activity present in a C. parvum lysate cleaved recombinant C. parvum gp40/15 protein into two peptides, identified as gp40 and gp15 by size and by immunoreactivity with specific antibodies. C. hominis gp40/15 subtype 1e, in which the RSRR sequence is replaced by ISKR, has an alternative furin cleavage site (KSISKR↓) and was also cleaved by both furin and the C. parvum lysate. Site-directed mutagenesis of the C. parvum RSRR sequence to ASRR resulted in inhibition of cleavage by furin and the C. parvum lysate. Cleavage of recombinant gp40/15 and a synthetic furin substrate by the C. parvum lysate was inhibited by serine protease inhibitors, by the specific furin inhibitor decanoyl-Arg-Val-Lys-Arg-chloromethylketone (Dec-RVKR-cmk), and by calcium chelators, suggesting that the parasite expresses a Ca(2+)-dependent, furin-like protease activity. The furin inhibitor Dec-RVKR-cmk decreased C. parvum infection of HCT-8 cells, suggesting that a furin-like protease activity may be involved in mediating host-parasite interactions.
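
    The identification of candidate furin cleavage sites in a deduced amino acid sequence can be illustrated with a short, hypothetical sketch. The motif patterns and sequence fragments below are assumptions made for illustration (the preferred furin consensus R-X-[K/R]-R↓ for RSRR-type sites, plus a looser [K/R]-X-X-X-[K/R]-R↓ pattern that would also flag the KSISKR-type site described for C. hominis subtype 1e); they are not taken from the study's gp40/15 sequences.

```python
# Minimal sketch: scan an amino acid sequence for candidate furin cleavage
# sites. Patterns and fragments are illustrative assumptions, not the actual
# gp40/15 sequences from the study.
import re

MOTIFS = {
    "R-X-[K/R]-R": re.compile(r"R.[KR]R"),          # preferred furin consensus
    "[K/R]-X-X-X-[K/R]-R": re.compile(r"[KR]...[KR]R"),  # looser pattern (assumption)
}


def find_furin_sites(sequence):
    """Yield (pattern name, matched motif, predicted cleavage position)."""
    for name, pattern in MOTIFS.items():
        for m in pattern.finditer(sequence):
            # Cleavage is predicted immediately after the final arginine.
            yield name, m.group(), m.end()


if __name__ == "__main__":
    fragments = {
        "RSRR-type fragment": "AATTPRSRRSAEE",    # invented fragment
        "ISKR-type fragment": "AATTPKSISKRSAEE",  # invented fragment
    }
    for label, seq in fragments.items():
        for name, motif, pos in find_furin_sites(seq):
            print(f"{label}: {motif} ({name}) -> predicted cleavage after residue {pos}")
```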