
    Efficient Bayesian inference for mechanistic modelling with high-throughput data

    Bayesian methods are routinely used to combine experimental data with detailed mathematical models to obtain insights into physical phenomena. However, the computational cost of Bayesian computation with detailed models has been a notorious problem. Moreover, while high-throughput data presents opportunities to calibrate sophisticated models, comparing large amounts of data with model simulations quickly becomes computationally prohibitive. Inspired by the method of Stochastic Gradient Descent, we propose a minibatch approach to approximate Bayesian computation. Through a case study of a high-throughput imaging scratch assay experiment, we show that reliable inference can be performed at a fraction of the computational cost of a traditional Bayesian inference scheme. By applying a detailed mathematical model of single-cell motility, proliferation and death to a data set of 118 gene knockdowns, we characterise functional subgroups of gene knockdowns, each displaying its own typical combination of local cell density-dependent and -independent motility and proliferation patterns. By comparing these patterns to experimental measurements of cell counts and wound closure, we find that density-dependent interactions play a crucial role in the process of wound healing.
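The minibatch idea can be illustrated with a toy rejection-ABC scheme. This is a minimal sketch, not the authors' algorithm: the forward model, prior range, summary statistic and tolerance below are all hypothetical stand-ins for the expensive mechanistic simulation. The key point is that each proposed parameter is compared against a random minibatch of observations rather than the full dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "high-throughput" data: many observations from a model with unknown mean.
true_mu = 2.0
data = rng.normal(true_mu, 1.0, size=10_000)

def simulate(mu, size, rng):
    """Hypothetical forward model; in practice this would be an expensive
    mechanistic simulation (e.g. of a scratch assay)."""
    return rng.normal(mu, 1.0, size=size)

def minibatch_abc(data, n_samples=500, batch_size=100, eps=0.2, rng=rng):
    """Rejection ABC where each proposal is scored against a random
    minibatch of the data instead of the whole dataset."""
    accepted = []
    while len(accepted) < n_samples:
        mu = rng.uniform(-5, 5)                    # draw from a flat prior
        batch = rng.choice(data, size=batch_size)  # random minibatch
        sim = simulate(mu, batch_size, rng)
        # distance between summary statistics (here: the sample means)
        if abs(sim.mean() - batch.mean()) < eps:
            accepted.append(mu)
    return np.array(accepted)

posterior = minibatch_abc(data)
print(posterior.mean())  # close to the true mean of 2.0
```

Each accept/reject step touches only `batch_size` observations, which is where the computational saving over comparing against all of `data` comes from.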

    Evolution and Impact of High Content Imaging

    Abstract/outline: The field of high content imaging has steadily evolved and expanded substantially across many industry and academic research institutions since it was first described in the early 1990s. High content imaging refers to the automated acquisition and analysis of microscopic images from a variety of biological sample types. Integration of high content imaging microscopes with multiwell plate handling robotics enables high content imaging to be performed at scale and supports medium- to high-throughput screening of pharmacological, genetic and diverse environmental perturbations upon complex biological systems ranging from 2D cell cultures to 3D tissue organoids to small model organisms. In this perspective article the authors provide a collective view on the following key discussion points relevant to the evolution of high content imaging:
    • Evolution and impact of high content imaging: an academic perspective
    • Evolution and impact of high content imaging: an industry perspective
    • Evolution of high content image analysis
    • Evolution of high content data analysis pipelines towards multiparametric and phenotypic profiling applications
    • The role of data integration and multiomics
    • The role and evolution of image data repositories and sharing standards
    • Future perspective of high content imaging hardware and software

    A Graph Based Neural Network Approach to Immune Profiling of Multiplexed Tissue Samples

    Multiplexed immunofluorescence provides an unprecedented opportunity for studying specific cell-to-cell and cell-microenvironment interactions. We employ graph neural networks to combine features obtained from tissue morphology with measurements of protein expression to profile the tumour microenvironment associated with different tumour stages. Our framework presents a new approach to processing these complex multi-dimensional datasets that overcomes some of the key challenges in their analysis and opens up the opportunity to abstract biologically meaningful interactions.
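The core operation of a graph neural network on a tissue sample can be sketched as a single message-passing step over a cell graph. This is an illustrative toy, not the paper's architecture: the cells, edges and feature columns (e.g. area, eccentricity, marker intensity) are all hypothetical.

```python
import numpy as np

# Toy cell graph: nodes are cells, edges connect spatial neighbours.
# Node features combine morphology with protein-expression measurements
# (hypothetical columns: [area, eccentricity, marker intensity]).
features = np.array([
    [1.0, 0.2, 5.0],
    [1.1, 0.3, 4.8],
    [0.4, 0.9, 0.5],
])
adjacency = np.array([   # cell 0 -- cell 1, cell 1 -- cell 2
    [0, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
])

def message_pass(features, adjacency):
    """One graph-convolution step: average each cell's features with its
    neighbours' (self-loops added, degree-normalised)."""
    a_hat = adjacency + np.eye(len(adjacency))
    degree = a_hat.sum(axis=1, keepdims=True)
    return (a_hat @ features) / degree

embedded = message_pass(features, adjacency)
print(embedded.round(2))
```

Stacking such steps (with learned weights and nonlinearities) lets each cell's representation absorb information from its spatial neighbourhood, which is what allows microenvironment context to be profiled.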

    Proteomics profiling of interactome dynamics by colocalisation analysis (COLA)

    Localisation and protein function are intimately linked in eukaryotes, as proteins are localised to specific compartments where they come into proximity of other functionally relevant proteins. Significant co-localisation of two proteins can therefore be indicative of their functional association. Here we present COLA, a proteomics-based strategy coupled with a bioinformatics framework to detect protein–protein co-localisations on a global scale. COLA reveals functional interactions by matching proteins with significant similarity in their subcellular localisation signatures. The rapid nature of COLA allows mapping of interactome dynamics across different conditions or treatments with high precision. Funding: Cancer Research UK; BBSRC.
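Matching proteins by similarity of localisation signature can be illustrated with a toy example. This is a minimal sketch, not the COLA pipeline: the protein names, fraction profiles and similarity threshold are all hypothetical, and cosine similarity merely stands in for whatever statistic the method actually uses.

```python
import numpy as np

# Hypothetical localisation signatures: each protein's profile across
# subcellular fractions (rows: proteins, columns: fractions).
proteins = ["A", "B", "C"]
signatures = np.array([
    [0.7, 0.2, 0.1],   # protein A: mostly fraction 1
    [0.6, 0.3, 0.1],   # protein B: profile similar to A's
    [0.1, 0.1, 0.8],   # protein C: a different compartment
])

def cosine(u, v):
    """Cosine similarity between two signature vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Call two proteins co-localised when their signatures are highly similar.
threshold = 0.95
pairs = [
    (proteins[i], proteins[j])
    for i in range(len(proteins))
    for j in range(i + 1, len(proteins))
    if cosine(signatures[i], signatures[j]) > threshold
]
print(pairs)  # A and B share a signature; C does not
```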

    KCML: a machine‐learning framework for inference of multi‐scale gene functions from genetic perturbation screens

    Abstract: Characterising context‐dependent gene functions is crucial for understanding the genetic bases of health and disease. To date, inference of gene functions from large‐scale genetic perturbation screens is based on ad hoc analysis pipelines involving unsupervised clustering and functional enrichment. We present Knowledge‐ and Context‐driven Machine Learning (KCML), a framework that systematically predicts multiple context‐specific functions for a given gene based on the similarity of its perturbation phenotype to those of genes with known function. As a proof of concept, we test KCML on three datasets describing phenotypes at the molecular, cellular and population levels and show that it outperforms traditional analysis pipelines. In particular, KCML identified an abnormal multicellular organisation phenotype associated with the depletion of olfactory receptors, and TGFβ and WNT signalling genes in colorectal cancer cells. We validate these predictions in colorectal cancer patients and show that olfactory receptor expression is predictive of worse patient outcomes. These results highlight KCML as a systematic framework for discovering novel scale‐crossing and context‐dependent gene functions. KCML is highly generalisable and applicable to various large‐scale genetic perturbation screens.
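The "predict by phenotype similarity" idea can be sketched as a nearest-neighbour classifier over phenotype vectors. This is a toy illustration, not KCML itself: the gene names, feature vectors, function labels and choice of Euclidean distance are all hypothetical.

```python
import numpy as np

# Hypothetical perturbation phenotypes: one feature vector per annotated
# gene knockdown (e.g. cell shape, intensity and texture features).
known = {
    "GENE1": (np.array([1.0, 0.1, 0.0]), "motility"),
    "GENE2": (np.array([0.9, 0.2, 0.1]), "motility"),
    "GENE3": (np.array([0.0, 1.0, 0.9]), "proliferation"),
}

def predict_function(phenotype, known, k=2):
    """Assign the majority function label among the k annotated genes
    whose perturbation phenotypes are closest to the query."""
    ranked = sorted(known.values(),
                    key=lambda item: np.linalg.norm(item[0] - phenotype))
    labels = [label for _, label in ranked[:k]]
    return max(set(labels), key=labels.count)

query = np.array([0.95, 0.15, 0.05])   # an unannotated knockdown
print(predict_function(query, known))  # predicted label: "motility"
```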

    HCC1954

    Files are 16-bit TIFFs (which means they will appear black if opened in Preview, Photoshop, or a similar program). Each file is a multi-plane TIFF containing three fluorescence channels:
    Channel 1 = DAPI (Sigma)
    Channel 2 = NF-kappaB (anti-p65; Abcam ab16502 / Alexa-488 anti-rabbit; Invitrogen)
    Channel 3 = DHE (dihydroethidium, hydroethidine; Sigma)
    File names refer to [row][column]-[field].

    ZR75.1

    Files are 16-bit TIFFs (which means they will appear black if opened in Preview, Photoshop, or a similar program). Each file is a multi-plane TIFF containing three fluorescence channels:
    Channel 1 = DAPI (Sigma)
    Channel 2 = NF-kappaB (anti-p65; Abcam ab16502 / Alexa-488 anti-rabbit; Invitrogen)
    Channel 3 = DHE (dihydroethidium, hydroethidine; Sigma)
    File names refer to [row][column]-[field]. See ReadMe file for culture and staining information.
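The reason such files look black in generic viewers is that microscopy intensities occupy a small slice of the 16-bit range, which viewers map linearly. A percentile stretch into the 8-bit display range fixes this. The sketch below fabricates a dim 16-bit plane so it is self-contained; in practice a plane would be read from disk (e.g. with the `tifffile` package's `imread`), and the exact file names follow the [row][column]-[field] convention described above.

```python
import numpy as np

# Fabricated dim 16-bit channel standing in for one plane of a file;
# real values would come from e.g. tifffile.imread(...).
rng = np.random.default_rng(0)
plane = rng.integers(100, 2000, size=(64, 64), dtype=np.uint16)

def to_display(img, low_pct=1, high_pct=99):
    """Percentile-stretch a 16-bit plane into the 8-bit display range."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    scaled = np.clip((img.astype(float) - lo) / (hi - lo), 0, 1)
    return (scaled * 255).astype(np.uint8)

disp = to_display(plane)
print(disp.min(), disp.max())  # now spans the full 0..255 display range
```

Clipping at the 1st/99th percentiles rather than the raw min/max keeps a few hot or dead pixels from compressing the contrast of the rest of the image.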