9 research outputs found

    Erratum to: 36th International Symposium on Intensive Care and Emergency Medicine

    Get PDF
    [This corrects the article DOI: 10.1186/s13054-016-1208-6.]

    Integrative modeling of inhibitor response in breast cancer cells

    No full text
    Cancer patients often respond very differently to any given drug. Some patients respond very well, while others do not respond at all, leaving the cancer to grow unimpeded. If we have a good understanding of how this variability in response arises, we will be better able to choose the optimal treatment strategy for each patient. The variability in drug response observed in patients is also seen in cancer cell lines when they are cultured in vitro. Detailed cell-biological studies have revealed many different mechanisms that affect the response of cancer cells to anticancer drugs. Certain mutations can render cells sensitive to a certain drug, while other mutations, or changes in gene expression, can cause resistance. However, since any combination of these drug sensitivity mechanisms can be operating in a particular cell line, it is difficult to predict whether it will be sensitive or resistant to a particular drug. Computational modeling can be used to better understand this complexity.
    In this dissertation, we developed a novel method, which we call Inference of Signaling Activity, that can be used to infer the contributions of different drug sensitivity and resistance mechanisms. We used the available knowledge of signal transduction in cells and integrated multiple data types, including mutations, gene amplifications and deletions, gene expression levels, protein phosphorylation, growth rates and drug response data, to infer the signaling activities in each cell line. After an extensive characterization of thirty different breast cancer cell lines, we developed a model that can explain a large part of the variability in the response of these cell lines to seven different kinase inhibitors. At the same time, the response of some cell lines was not recapitulated exactly. Using further data-driven analysis, we found a novel determinant of mTOR inhibitor sensitivity: overexpression of 4EBP1 in breast cancer cells renders them more sensitive to these inhibitors. This modeling approach can now be further developed to determine whether it can also be used to explain and predict the response of cancer patients.
    Initially, this modeling framework did not permit the inclusion of feedback signaling mechanisms, even though we know feedback control to be an important feature of cellular signaling networks. We therefore subsequently extended our framework such that feedback could be included, and with this extension we were able to delineate signaling activities in regulatory networks with multiple, interrelated feedback loops, again taking into account different datasets. An important consideration in this dissertation was the quantification of uncertainty in model parameters, for which we used Bayesian statistics. If the uncertainty in parameter estimates is not taken into account, we can be lulled into a false sense of security and misinterpret which elements of the model are important. We developed a software package with efficient, multi-threaded implementations of various Monte Carlo sampling algorithms, which allowed the inference to be done in a workable amount of time. We further showed in a different biological system – cell cycle regulation in yeast – that the integration of different types of measurements can increase the identifiability of parameters. Finally, we investigated whether Bayesian inference with multiple datasets can be done sequentially using intermediate posterior approximations.
    Each of these contributions to Bayesian inference with multiple datasets may be used more broadly in modeling different biological systems. Although further development and validation of the drug response models is needed, the use of integrative computational modeling appears to be a promising approach for enabling precision medicine for cancer patients in the future.
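
    The dissertation's own models and data are not reproduced here, but the general approach of quantifying parameter uncertainty with Monte Carlo sampling can be illustrated in a few lines. The following Python sketch fits a toy dose-response curve with a random-walk Metropolis sampler; the model, data and parameter names are hypothetical and stand in for the far richer Inference of Signaling Activity models described above.

    # Minimal sketch (hypothetical example): Bayesian uncertainty quantification for a
    # toy dose-response model with a random-walk Metropolis sampler. This is NOT the
    # dissertation's Inference of Signaling Activity model, only an illustration of
    # sampling a posterior instead of reporting a single best fit.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "viability vs. drug concentration" data (hypothetical).
    conc = np.logspace(-3, 2, 12)                      # drug concentration (uM)
    true_ic50, true_hill, noise_sd = 0.5, 1.2, 0.05
    viability = 1.0 / (1.0 + (conc / true_ic50) ** true_hill)
    y = viability + rng.normal(0.0, noise_sd, size=conc.size)

    def log_posterior(theta):
        log_ic50, log_hill = theta
        ic50, hill = np.exp(log_ic50), np.exp(log_hill)
        pred = 1.0 / (1.0 + (conc / ic50) ** hill)
        log_lik = -0.5 * np.sum(((y - pred) / noise_sd) ** 2)
        log_prior = -0.5 * (log_ic50 ** 2 + log_hill ** 2)  # weak normal priors on the logs
        return log_lik + log_prior

    # Random-walk Metropolis over (log IC50, log Hill slope).
    n_steps, step_sd = 20000, 0.1
    theta = np.zeros(2)
    logp = log_posterior(theta)
    samples = np.empty((n_steps, 2))
    for i in range(n_steps):
        proposal = theta + rng.normal(0.0, step_sd, size=2)
        logp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
        samples[i] = theta

    ic50_samples = np.exp(samples[n_steps // 2:, 0])     # discard burn-in
    print("IC50 posterior mean:", ic50_samples.mean(), "sd:", ic50_samples.std())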

    Approximating multivariate posterior distribution functions from Monte Carlo samples for sequential Bayesian inference

    No full text
    An important feature of Bayesian statistics is the opportunity to do sequential inference: the posterior distribution obtained after seeing a dataset can be used as the prior for a second inference. However, when Monte Carlo sampling methods are used for inference, we only have a set of samples from the posterior distribution. To do sequential inference, we then either have to evaluate the second posterior at only these locations and reweight the samples accordingly, or we can estimate a functional description of the posterior probability distribution from the samples and use that as the prior for the second inference. Here, we investigated to what extent we can obtain an accurate joint posterior from two datasets if the inference is done sequentially rather than jointly, under the condition that each inference step is done using Monte Carlo sampling. To test this, we evaluated the accuracy of kernel density estimates, Gaussian mixtures, mixtures of factor analyzers, vine copulas and Gaussian processes in approximating posterior distributions, and then tested whether these approximations can be used in sequential inference. In low dimensions, Gaussian processes are more accurate, whereas in higher dimensions Gaussian mixtures, mixtures of factor analyzers or vine copulas perform better. In our test cases of sequential inference, using posterior approximations gives more accurate results than direct sample reweighting, but joint inference is still preferable over sequential inference whenever possible. Since performance is case-specific, we provide an R package, mvdens, with a unified interface to the density approximation methods.
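
    As a rough illustration of the sequential scheme described above (though not of the mvdens R package itself), the following Python sketch runs a first Monte Carlo inference, approximates the resulting posterior samples with a kernel density estimate, and reuses that approximation as the prior for a second dataset. The toy model and all names are hypothetical.

    # Minimal sketch (hypothetical example): sequential Bayesian inference where the
    # posterior from dataset 1 is available only as Monte Carlo samples, is approximated
    # with a kernel density estimate, and is then reused as the prior for dataset 2.
    import numpy as np
    from scipy.stats import gaussian_kde, norm

    rng = np.random.default_rng(1)
    sigma = 1.0                                   # known observation noise
    data1 = rng.normal(2.0, sigma, size=20)       # first dataset (hypothetical)
    data2 = rng.normal(2.0, sigma, size=20)       # second dataset (hypothetical)

    def metropolis(log_post, n_steps=20000, step=0.3):
        x, lp = 0.0, log_post(0.0)
        out = np.empty(n_steps)
        for i in range(n_steps):
            prop = x + rng.normal(0.0, step)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                x, lp = prop, lp_prop
            out[i] = x
        return out[n_steps // 2:]                 # discard burn-in

    # Step 1: posterior for the mean given dataset 1 (flat prior), as samples only.
    log_post1 = lambda mu: norm.logpdf(data1, mu, sigma).sum()
    samples1 = metropolis(log_post1)

    # Step 2: approximate that posterior and use it as the prior for dataset 2.
    prior_approx = gaussian_kde(samples1[::10])   # thin the chain to keep the KDE cheap
    log_post2 = lambda mu: prior_approx.logpdf(mu)[0] + norm.logpdf(data2, mu, sigma).sum()
    samples2 = metropolis(log_post2)

    # Reference: joint inference on both datasets at once.
    log_post_joint = lambda mu: norm.logpdf(np.concatenate([data1, data2]), mu, sigma).sum()
    samples_joint = metropolis(log_post_joint)

    print("sequential:", samples2.mean(), samples2.std())
    print("joint:     ", samples_joint.mean(), samples_joint.std())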

    CREAtive ways: The art of looking sideways

    No full text
    A design for a cultural center and a swimming pool at the Binnengasthuisterrein in Amsterdam, a former cloister and hospital site. The design is a (happily naive) exploration of the possibility of reconstructing and elaborating on the delicate and unusual local urban condition.

    BCM: Toolkit for Bayesian analysis of Computational Models using samplers

    Get PDF
    Background: Computational models in biology are characterized by a large degree of uncertainty. This uncertainty can be analyzed with Bayesian statistics; however, the sampling algorithms that are frequently used for calculating Bayesian statistical estimates are computationally demanding, and each algorithm has unique advantages and disadvantages. It is typically unclear, before starting an analysis, which algorithm will perform well on a given computational model.
    Results: We present BCM, a toolkit for the Bayesian analysis of Computational Models using samplers. It provides efficient, multithreaded implementations of eleven algorithms for sampling from posterior probability distributions and for calculating marginal likelihoods. BCM includes tools to simplify the process of model specification and scripts for visualizing the results. The flexible architecture allows it to be used on diverse types of biological computational models. In an example inference task using a model of the cell cycle based on ordinary differential equations, BCM is significantly more efficient than existing software packages, allowing more challenging inference problems to be solved.
    Conclusions: BCM represents an efficient one-stop shop for computational modelers wishing to use sampler-based Bayesian statistics.
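
    BCM itself is a C++ toolkit and its API is not shown here; as a small illustration of one of the sampler-based marginal-likelihood techniques such a toolkit typically offers, the following Python sketch estimates a marginal likelihood by thermodynamic integration on a conjugate toy model, where the exact answer is available for comparison. The model and all names are hypothetical.

    # Minimal sketch (hypothetical example): estimating a marginal likelihood with
    # thermodynamic integration. This is not BCM's API, only a generic illustration on a
    # conjugate normal model with known noise sd (sigma) and a N(0, tau^2) prior on the mean.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    sigma, tau = 1.0, 2.0
    data = rng.normal(1.0, sigma, size=15)

    log_prior = lambda mu: norm.logpdf(mu, 0.0, tau)
    log_lik = lambda mu: norm.logpdf(data, mu, sigma).sum()

    def mean_loglik_at(beta, n_steps=10000, step=0.5):
        """Metropolis chain targeting prior(mu) * likelihood(mu)**beta; returns E_beta[log L]."""
        x = 0.0
        lp = log_prior(x) + beta * log_lik(x)
        trace = np.empty(n_steps)
        for i in range(n_steps):
            prop = x + rng.normal(0.0, step)
            lp_prop = log_prior(prop) + beta * log_lik(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                x, lp = prop, lp_prop
            trace[i] = log_lik(x)
        return trace[n_steps // 2:].mean()        # discard burn-in

    # Temperature ladder from prior (beta=0) to posterior (beta=1), denser near 0.
    betas = np.linspace(0.0, 1.0, 11) ** 3
    expectations = np.array([mean_loglik_at(b) for b in betas])
    # log Z = integral over beta of E_beta[log L], here by the trapezoid rule.
    log_marginal = np.sum(np.diff(betas) * (expectations[1:] + expectations[:-1]) / 2.0)

    # Exact log marginal likelihood of the conjugate normal-normal model, for comparison.
    n = data.size
    post_var = 1.0 / (n / sigma**2 + 1.0 / tau**2)
    exact = (norm.logpdf(data, 0.0, sigma).sum()
             + 0.5 * np.log(post_var / tau**2)
             + 0.5 * (data.sum() / sigma**2) ** 2 * post_var)
    print("thermodynamic integration:", log_marginal, " exact:", exact)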

    Contactless interfacial rheology: Probing shear at liquid-liquid interfaces without an interfacial geometry via fluorescence microscopy

    No full text
    Interfacial rheology is important for understanding properties such as Pickering emulsion or foam stability. Currently, the response is measured using a probe directly attached to the interface. This can disturb the interface, and the measurement is coupled to flow in the bulk phase, limiting its sensitivity. We have developed a contactless interfacial method to perform interfacial shear rheology on liquid/liquid interfaces with no tool attached directly to the interface. This is achieved by shearing one of the liquid phases and measuring the interfacial response via confocal microscopy. Using this method, we have measured steady shear material parameters such as interfacial elastic moduli for interfaces with solidlike behavior and interfacial viscosities for fluidlike interfaces. The accuracy of this method has been verified relative to a double-wall ring geometry. Moreover, using our contactless method, we are able to measure lower interfacial viscosities than those that have previously been reported using a double-wall ring geometry. A further advantage is the simultaneous combination of macroscopic rheological analysis with microscopic structural analysis. Our analysis directly visualizes how the interfacial response is strongly correlated with the particle surface coverage and interfacial assembly. Furthermore, we capture the evolution and irreversible changes in the particle assembly that correspond to the rheological response to steady shear.

    GPGPU Linear Complexity t-SNE Optimization

    No full text
    In recent years the t-distributed Stochastic Neighbor Embedding (t-SNE) algorithm has become one of the most used and insightful techniques for exploratory data analysis of high-dimensional data. It reveals clusters of high-dimensional data points at different scales while only requiring minimal tuning of its parameters. However, the computational complexity of the algorithm limits its application to relatively small datasets. To address this problem, several evolutions of t-SNE have been developed in recent years, mainly focusing on the scalability of the similarity computations between data points. However, these contributions are insufficient to achieve interactive rates when visualizing the evolution of the t-SNE embedding for large datasets. In this work, we present a novel approach to the minimization of the t-SNE objective function that heavily relies on graphics hardware and has linear computational complexity. Our technique decreases the computational cost of running t-SNE on datasets by orders of magnitude and retains or improves on the accuracy of past approximate techniques. We propose to approximate the repulsive forces between data points by splatting kernel textures for each data point. This approximation allows us to reformulate the t-SNE minimization problem as a series of tensor operations that can be efficiently executed on the graphics card. An efficient implementation of our technique is integrated into the widely used Google TensorFlow.js library and is also available as an open-source C++ library.
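
    The actual implementation runs on the GPU via TensorFlow.js and C++ and is not reproduced here; the following NumPy sketch only illustrates the underlying idea of the approximation: the repulsive part of the t-SNE gradient is read back from fields accumulated ("splatted") on a coarse grid rather than computed from an explicit sum over all point pairs. The grid resolution, point set and function names are hypothetical.

    # Minimal sketch (hypothetical example): field-based approximation of the repulsive
    # part of the t-SNE gradient. The real method rasterizes per-point kernel textures on
    # the GPU; this NumPy version evaluates the same fields on grid cells and reads them
    # back with a crude grid-cell lookup instead of texture interpolation.
    import numpy as np

    rng = np.random.default_rng(3)
    Y = rng.normal(0.0, 1.0, size=(500, 2))          # current 2-D embedding (hypothetical)

    def repulsive_gradient_field(Y, resolution=64):
        lo, hi = Y.min(0) - 0.5, Y.max(0) + 0.5
        gx = np.linspace(lo[0], hi[0], resolution)
        gy = np.linspace(lo[1], hi[1], resolution)
        grid = np.stack(np.meshgrid(gx, gy, indexing="ij"), axis=-1)   # (R, R, 2)

        # "Splat" every point's Student-t kernel onto the grid:
        #   S(x) = sum_j 1 / (1 + |x - y_j|^2)
        #   V(x) = sum_j (x - y_j) / (1 + |x - y_j|^2)^2
        diff = grid[:, :, None, :] - Y[None, None, :, :]               # (R, R, N, 2)
        w = 1.0 / (1.0 + np.sum(diff**2, axis=-1))                     # (R, R, N)
        S = w.sum(axis=-1)
        V = np.sum(diff * (w**2)[..., None], axis=2)                   # (R, R, 2)

        # Read the fields back at each embedding point.
        ix = np.clip(np.searchsorted(gx, Y[:, 0]), 0, resolution - 1)
        iy = np.clip(np.searchsorted(gy, Y[:, 1]), 0, resolution - 1)
        Z = S[ix, iy].sum() - Y.shape[0]             # normalization term, minus self terms
        return -4.0 * V[ix, iy] / Z                  # repulsive contribution to dC/dY

    rep = repulsive_gradient_field(Y)
    print(rep.shape)                                 # (500, 2): one gradient term per point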

    GeoCon Bridge geopolymer concrete mixture for structural applications

    Get PDF
    The sustainability of infrastructure projects is becoming an increasingly important issue in engineering practice. This means that in the future, construction materials will be selected on the basis of the contribution they can make to meeting sustainability requirements. Geopolymers are materials based on industrial by-products. By using geopolymer concrete technology it is possible to reduce waste and to produce concrete in an environmentally friendly way. An 80% or greater reduction of greenhouse gases compared with Ordinary Portland Cement (OPC) can be achieved through geopolymer technology. However, practical applications and experience are limited. For a broad, large-scale industrial application of geopolymer concrete, challenges still exist in the technological and engineering aspects. The main goal of the GeoCon Bridge project was to develop a geopolymer concrete mixture and to upscale it to structural applications. The outputs of the project provide input for the development of recommendations for the structural design of geopolymer-based reinforced concrete elements. Through a combination of laboratory experiments on material and structural elements, structural design and finite element simulations, and based on previous experience with OPC concrete, the knowledge generated in this project provides an important step towards "cement free" construction. The project was performed jointly by three team members: the Microlab and the Group of Concrete Structures from the Technical University of Delft, and the Technical University of Eindhoven.

    Erratum to: 36th International Symposium on Intensive Care and Emergency Medicine

    No full text