
    HIV modelling - parallel implementation strategies

    We report on the development of a model to understand why the range of experience with respect to HIV infection is so diverse, especially with respect to the latency period. To investigate this, an agent-based approach is used to extract high-level behaviour, which cannot be described analytically, from the set of interaction rules at the cellular level. A network of independent matrices mimics the chain of lymph nodes. Dealing with massively multi-agent systems requires major computational effort. However, parallelisation methods are a natural consequence and advantage of the multi-agent approach, and are implemented, tested and optimised here using the MPI library. Our current focus is on the various implementations of the data transfer across the network. Three communication strategies are proposed and tested, showing that the most efficient approach is communication based on the natural lymph-network connectivity.
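
    As an illustration of the communication pattern described above, the sketch below shows how migrating agents might be exchanged between lymph-node processes using mpi4py, restricting data transfer to directly connected nodes. The chain topology, the exchange_agents helper and the agent representation are assumptions made for this sketch, not the authors' implementation.

    # Minimal sketch (not the authors' code): exchange of migrating agents
    # between lymph-node processes along the natural network connectivity.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Hypothetical connectivity: each rank owns one lymph node, linked in a chain.
    NEIGHBOURS = [r for r in (rank - 1, rank + 1) if 0 <= r < size]

    def exchange_agents(outgoing):
        """Send agents leaving this node to each connected node and collect arrivals.

        `outgoing` maps a neighbour rank to a list of picklable agent records.
        Communication is restricted to directly connected nodes, mirroring the
        lymph-network-connectivity strategy reported as most efficient.
        """
        arrivals = []
        for nbr in NEIGHBOURS:
            received = comm.sendrecv(outgoing.get(nbr, []), dest=nbr, source=nbr)
            arrivals.extend(received)
        return arrivals

    # One exchange step; an empty dictionary still synchronises with neighbours.
    incoming = exchange_agents({})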

    An agent-based approach to immune modelling

    This study focuses on understanding why the range of experience with respect to HIV infection is so diverse, especially as regards the latency period. The challenge is to determine what assumptions can be made about the nature of the experience of antigenic invasion and diversity that can be modelled, tested and argued plausibly. To investigate this, an agent-based approach is used to extract high-level behaviour, which cannot be described analytically, from the set of interaction rules at the cellular level. A prototype model encompasses local variation in baseline properties contributing to the individual disease experience and is embedded in a network which mimics the chain of lymph nodes. Dealing with massively multi-agent systems requires major computational effort. However, parallelisation methods are a natural consequence and advantage of the multi-agent approach. These are implemented using the MPI library.
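
    To make the cellular-level rule set concrete, the sketch below gives a minimal lattice of cells with toy infection and death rules, from which population-level dynamics emerge. The states, probabilities and neighbourhood are placeholders chosen for illustration and are not the rules used in the study.

    # Minimal sketch (illustrative assumptions, not the study's actual rules):
    # cell-level interactions on a lattice, the kind of local rule set from
    # which high-level behaviour emerges in an agent-based model.
    import random

    HEALTHY, INFECTED, DEAD = 0, 1, 2
    P_INFECT, P_DIE = 0.05, 0.01           # assumed per-step probabilities
    SIZE = 100                             # lattice side length

    grid = [[HEALTHY] * SIZE for _ in range(SIZE)]
    grid[SIZE // 2][SIZE // 2] = INFECTED  # seed a single infected cell

    def step(grid):
        """Apply local rules: infected cells may die or infect their neighbours."""
        new = [row[:] for row in grid]
        for i in range(SIZE):
            for j in range(SIZE):
                if grid[i][j] == INFECTED:
                    if random.random() < P_DIE:
                        new[i][j] = DEAD
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = (i + di) % SIZE, (j + dj) % SIZE
                        if grid[ni][nj] == HEALTHY and random.random() < P_INFECT:
                            new[ni][nj] = INFECTED
        return new

    for _ in range(10):
        grid = step(grid)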

    Predicting the outcomes of HIV treatment interruptions using computational modelling

    In the past 30 years, HIV infection has made the transition from a fatal to a chronic disease, owing to the emergence of potent treatment that largely suppresses viral replication. However, this medication must be administered regularly and life-long to maintain viral suppression, and it is not always well tolerated. Any interruption of treatment causes residual virus to be reactivated and the infection to progress, and the underlying processes and their consequences for the immune system are still poorly understood. Nonetheless, treatment interruptions are common due to adherence issues or limited access to antiretroviral drugs. Early clinical studies aiming to apply treatment interruptions in a structured way gave contradictory results concerning patient safety, discouraging further trials. In-silico models can potentially add to this knowledge, but a review of the literature indicates that most current (equation-based) models used for studying treatment interruptions neglect recent clinical findings of collagen formation in lymphatic tissue due to HIV and its crucial role in the stability and efficacy of the immune system. The aim of this research is the construction and application of so-called ‘bottom-up’ models to allow improved assessment of these processes in relation to HIV treatment interruptions. To this end, a novel computational model based on 2D Cellular Automata was developed for lymphatic tissue depletion and the associated damage to the immune system. Hence, (i) using this model, the influence of the spatial distribution of collagen formation on the speed of HIV infection progression was evaluated, while discussing aspects of computational performance; (ii) direct Monte Carlo simulations were employed to explore the accumulation of tissue impairment due to repeated treatment interruptions and the consequences for long-term prognosis; and (iii) an inverse Monte Carlo approach was used to reconstruct as yet unknown characteristics of patient groups, based on sparse data from past clinical studies on treatment interruptions, with the aim of explaining their contradictory results.
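
    The direct Monte Carlo idea in point (ii) can be illustrated with a deliberately simple sketch: tissue impairment accumulates by a random increment at each interruption, and many simulated patients give an estimate of how often impairment exceeds a level taken to indicate poor long-term prognosis. The increments, threshold and number of interruptions are assumptions for illustration only, not values from the thesis.

    # Minimal sketch (a toy model, not the thesis's): direct Monte Carlo over
    # repeated treatment interruptions, accumulating lymphatic-tissue impairment.
    import random

    N_RUNS = 10_000            # Monte Carlo realisations
    N_INTERRUPTIONS = 8        # interruptions per simulated patient
    DAMAGE_MEAN = 0.04         # assumed mean impairment added per interruption
    FAILURE_THRESHOLD = 0.30   # assumed level taken to indicate poor prognosis

    def simulate_patient():
        """Accumulate random tissue damage over successive interruptions."""
        impairment = 0.0
        for _ in range(N_INTERRUPTIONS):
            impairment += random.expovariate(1.0 / DAMAGE_MEAN)
        return impairment

    outcomes = [simulate_patient() for _ in range(N_RUNS)]
    poor = sum(x > FAILURE_THRESHOLD for x in outcomes) / N_RUNS
    print(f"Estimated fraction with poor long-term prognosis: {poor:.3f}")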

    Multi-layered model of individual HIV infection progression and mechanisms of phenotypical expression

    Cite as: Perrin, Dimitri (2008) Multi-layered model of individual HIV infection progression and mechanisms of phenotypical expression. PhD thesis, Dublin City University.

    Choices and trade-offs in inference with infectious disease models.

    Inference using mathematical models of infectious disease dynamics can be an invaluable tool for the interpretation and analysis of epidemiological data. However, researchers wishing to use this tool are faced with a choice of models and model types, simulation methods, inference methods and software packages. Given the multitude of options, it can be challenging to decide on the best approach. Here, we delineate the choices and trade-offs involved in deciding on an approach for inference, and discuss aspects that might inform this decision. We provide examples of inference with a dataset of influenza cases using the R packages pomp and rbi.
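
    The simulation-based likelihood idea that underlies packages such as pomp and rbi can be sketched directly: a bootstrap particle filter propagates a stochastic SIR model and weights particles against observed weekly counts, yielding a likelihood estimate that an inference routine can then explore. The model, reporting distribution and parameter values below are illustrative assumptions, not the packages' interfaces or the paper's worked example.

    # Minimal sketch of a bootstrap particle filter for a stochastic SIR model
    # observed through weekly case counts (illustrative, not pomp/rbi code).
    import numpy as np
    from scipy.special import logsumexp
    from scipy.stats import poisson

    def simulate_week(state, beta, gamma, rng):
        """Advance (S, I, R) one week in daily stochastic steps."""
        S, I, R = state
        N = S + I + R
        for _ in range(7):
            infections = rng.binomial(S, 1 - np.exp(-beta * I / N))
            recoveries = rng.binomial(I, 1 - np.exp(-gamma))
            S, I, R = S - infections, I + infections - recoveries, R + recoveries
        return np.array([S, I, R])

    def particle_filter_loglik(cases, beta, gamma, n_particles=300, pop=100_000, seed=1):
        """Estimate the log-likelihood of weekly counts by weighting and resampling."""
        rng = np.random.default_rng(seed)
        particles = np.tile([pop - 10, 10, 0], (n_particles, 1))
        loglik = 0.0
        for y in cases:
            particles = np.array([simulate_week(p, beta, gamma, rng) for p in particles])
            logw = poisson.logpmf(y, mu=particles[:, 1] + 1e-9)  # assumed reporting model
            loglik += logsumexp(logw) - np.log(n_particles)
            idx = rng.choice(n_particles, n_particles, p=np.exp(logw - logsumexp(logw)))
            particles = particles[idx]
        return loglik

    # Compare candidate transmission rates on toy data.
    toy_cases = [12, 30, 55, 80, 60, 35, 20]
    print(particle_filter_loglik(toy_cases, beta=0.4, gamma=0.2))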

    Stochastic computational modelling of complex drug delivery systems

    As modern drug formulations become more advanced, pharmaceutical companies need adequate tools that permit them to model complex requirements and to reduce unnecessary adsorption rates while increasing the dosage administered. The aim of the research presented here is the development and application of a general stochastic framework with agent-based elements for building drug dissolution models, with a particular focus on controlled-release systems. The use of three-dimensional Cellular Automata and Monte Carlo methods to describe structural compositions and the main physico-chemical mechanisms is shown to have several key advantages: (i) the bottom-up approach simplifies the definition of complex interactions between underlying phenomena such as diffusion, polymer degradation and hydration, and the dissolution media; (ii) it permits straightforward extensibility to drug formulation variations, supporting various geometries and exploring the effects of polymer composition and layering; and (iii) it facilitates visualisation, affording insight into the structural evolution of the system over time by capturing successive stages of dissolution. The framework has been used to build models simulating several distinct release scenarios from coated spheres, covering single-coated erosion- and swelling-dominated spheres as well as the influence of multiple heterogeneous coatings. High-performance computational optimisation enables precise simulation of the very thin coatings used and allows fast realisation of model state changes. Furthermore, a theoretical analysis of the comparative impact of synchronous and asynchronous Cellular Automata and the suitability of their application to pharmaceutical systems is performed. Likely parameter distributions are reconstructed from noisy in vitro data using Inverse Monte Carlo methods, and outcomes are reported.
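
    A much-reduced version of such a model is sketched below: a drug core wrapped in a coating on a 3D lattice, where coating cells in contact with the dissolution medium erode with a fixed Monte Carlo probability and exposed drug cells dissolve. The geometry, states and erosion probability are assumptions for illustration and do not reproduce the framework described above.

    # Minimal sketch (illustrative, not the framework above): a 3D Cellular
    # Automaton for erosion of a coated sphere with Monte Carlo coating erosion.
    import numpy as np

    N = 41                      # lattice side length (odd so the sphere is centred)
    DRUG, COATING, MEDIUM = 1, 2, 0
    P_ERODE = 0.1               # assumed per-step erosion probability for wet coating

    # Drug core of radius 12 wrapped in a coating shell up to radius 15.
    x, y, z = np.indices((N, N, N)) - N // 2
    r = np.sqrt(x**2 + y**2 + z**2)
    grid = np.full((N, N, N), MEDIUM, dtype=np.int8)
    grid[r <= 15] = COATING
    grid[r <= 12] = DRUG

    def wet(grid, state):
        """Cells of `state` with at least one face-neighbour containing medium."""
        medium = grid == MEDIUM
        contact = np.zeros_like(medium)
        for axis in range(3):
            for shift in (1, -1):
                contact |= np.roll(medium, shift, axis=axis)
        return (grid == state) & contact

    rng = np.random.default_rng(0)
    released = []
    for _ in range(200):
        eroding = wet(grid, COATING) & (rng.random(grid.shape) < P_ERODE)
        dissolving = wet(grid, DRUG)             # exposed drug dissolves immediately
        grid[eroding] = MEDIUM
        grid[dissolving] = MEDIUM
        released.append(int(dissolving.sum()))   # release profile over time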

    Control and surveillance of partially observed stochastic epidemics in a Bayesian framework

    This thesis comprises a number of inter-related parts. For most of the thesis we are concerned with developing a new statistical technique that enables the identification of the optimal control by comparing competing control strategies for stochastic epidemic models in real time. In the second part, we develop a novel approach for modelling the spread of Peste des Petits Ruminants (PPR) virus within a given country and the risk of introduction to other countries. The control of highly infectious diseases of agricultural crops, animals and humans is considered one of the key challenges in epidemiological and ecological modelling. Previous methods for the analysis of epidemics, in which different controls are compared, do not make full use of the trajectory of the epidemic. Most methods use the information provided by the model parameters, which may capture only partial information on the epidemic trajectory, so that, for example, the same control strategy may lead to different outcomes when the experiment is repeated. Relying on partial information also tends to require more simulated realisations when comparing two different controls. We introduce a statistical technique that makes full use of the available information in estimating the effect of competing control strategies on real-time epidemic outbreaks. The key to this approach lies in identifying a suitable mechanism to couple epidemics which is unaffected by the controls. To that end, we use the Sellke construction as a latent process to link epidemics with different control strategies. The method is initially applied to non-spatial processes, including SIR and SIS models, assuming that no observation data are available, before moving on to more complex models that explicitly represent the spatial nature of epidemic spread. In the latter case, the analysis is conditioned on some observed data and inference on the model parameters is performed in a Bayesian framework using Markov Chain Monte Carlo (MCMC) techniques coupled with data augmentation methods. The methodology is applied to various simulated data sets and to citrus canker data from Florida. Results suggest that the approach leads to highly positively correlated outcomes of the different controls, thus reducing the variability between the effects of different control strategies and hence providing a more efficient estimator of their expected differences. A reduction in the number of realisations required to compare competing strategies in terms of their expected outcomes is therefore obtained. The main purpose of the final part of this thesis is to develop a novel approach to modelling the spread of Peste des Petits Ruminants (PPR) within a given country and to understand the risk of subsequent spread to other countries. We are interested in constructing models that can be fitted using information on the occurrence of outbreaks, as information on the susceptible population is not available, and in using these models to estimate the speed of spatial spread of the virus. However, there was little prior modelling on which the models developed here could be built. We start by establishing a spatio-temporal stochastic formulation for the spread of PPR. This model is then used to estimate spatial transmission and the speed of spread. To account for uncertainty arising from the lack of information on the susceptible population, we apply ideas from Bayesian modelling and data augmentation by treating the transmission network as a missing quantity. Lastly, we establish a network model to address questions regarding the risk of spread in the large-scale network of countries and introduce the notion of 'first-passage time' using techniques from graph theory and operational research, such as the Bellman-Ford algorithm. The methodology is first applied to PPR data from Tunisia and to simulated data. We also use simulated models to investigate the dynamics of spread through a network of countries.
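
    The coupling idea at the heart of the first part can be illustrated with the Sellke construction for a simple SIR final-size calculation: individual resistance thresholds and infectious periods are drawn once and reused under each control, so the difference in outcomes reflects the controls rather than simulation noise. Representing the two controls as different transmission rates, and all numerical values, are assumptions made for this sketch, not the thesis's implementation.

    # Minimal sketch (illustrative): coupling two control strategies through
    # shared Sellke thresholds and infectious periods.
    import numpy as np

    def final_size(beta, thresholds, periods_susc, periods_init):
        """Final size of a simple SIR epidemic under the Sellke construction."""
        n = len(thresholds) + len(periods_init)       # total population size
        order = np.argsort(thresholds)
        q = thresholds[order]                         # sorted resistance thresholds
        t = periods_susc[order]                       # matching infectious periods
        pressure = (beta / n) * periods_init.sum()    # pressure from initial cases
        k = 0
        while k < len(q) and q[k] <= pressure:        # next threshold exceeded?
            pressure += (beta / n) * t[k]
            k += 1
        return k                                      # susceptibles ever infected

    rng = np.random.default_rng(42)
    thresholds = rng.exponential(1.0, 999)            # Sellke thresholds, drawn once
    periods_susc = rng.exponential(3.0, 999)          # infectious periods, drawn once
    periods_init = rng.exponential(3.0, 1)

    baseline, with_control = 0.6, 0.4                 # controls as transmission rates
    averted = (final_size(baseline, thresholds, periods_susc, periods_init)
               - final_size(with_control, thresholds, periods_susc, periods_init))
    print(f"Coupled estimate of infections averted by the control: {averted}")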

    Bayesian inference for indirectly observed stochastic processes, applications to epidemic modelling

    Stochastic processes are mathematical objects that offer a probabilistic representation of how some quantities evolve in time. In this thesis we focus on estimating the trajectory and parameters of dynamical systems in cases where only indirect observations of the driving stochastic process are available. We first explored means of using weekly recorded numbers of influenza cases to capture how the frequency and nature of contacts made with infected individuals evolved in time. The latter was modelled with diffusions and can be used to quantify the impact of varying drivers of epidemics such as holidays, climate, or prevention interventions. Following this idea, we estimated how the frequency of condom use evolved during the intervention of the Gates Foundation against HIV in India. In this setting, the available estimates of the proportion of individuals infected with HIV were not only indirect but also very scarce observations, leading to specific difficulties. Finally, we developed a methodology for fractional Brownian motions (fBM), here a fractional stochastic volatility model, indirectly observed through market prices. The intractability of the likelihood function, requiring augmentation of the parameter space with the diffusion path, is ubiquitous in this thesis. We aimed for inference methods robust to refinements of the time discretisation, made necessary to enforce the accuracy of Euler schemes. The particle marginal Metropolis-Hastings (PMMH) algorithm exhibits this mesh-free property. We propose the use of fast approximate filters as a pre-exploration tool to estimate the shape of the target density, for a quicker and more robust adaptation phase of the asymptotically exact algorithm. The fBM problem could not be treated with the PMMH and required an alternative methodology based on reparameterisation and advanced Hamiltonian Monte Carlo techniques on the diffusion path space, which would also be applicable in the Markovian setting.
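
    The PMMH idea referred to above can be shown in compact form: an unbiased particle-filter estimate of the likelihood replaces the exact likelihood inside a random-walk Metropolis update on the parameters. The latent AR(1) model, the flat prior and all tuning values below are illustrative assumptions, not those used in the thesis.

    # Minimal sketch of particle marginal Metropolis-Hastings (PMMH) for a toy
    # latent AR(1) model observed with Gaussian noise (illustrative only).
    import numpy as np
    from scipy.special import logsumexp
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    def loglik_hat(phi, y, n_particles=200):
        """Bootstrap particle filter estimate of log p(y | phi)."""
        x = rng.normal(0.0, 1.0, n_particles)
        total = 0.0
        for obs in y:
            x = phi * x + rng.normal(0.0, 0.5, n_particles)   # propagate particles
            logw = norm.logpdf(obs, loc=x, scale=1.0)         # weight against data
            total += logsumexp(logw) - np.log(n_particles)
            x = x[rng.choice(n_particles, n_particles, p=np.exp(logw - logsumexp(logw)))]
        return total

    # Toy data generated with phi = 0.8.
    true_x, y = 0.0, []
    for _ in range(50):
        true_x = 0.8 * true_x + rng.normal(0.0, 0.5)
        y.append(true_x + rng.normal())

    # PMMH: random-walk proposals on phi, accepted using the estimated likelihood
    # ratio (a flat prior on phi is assumed, so the prior ratio cancels).
    phi, ll = 0.5, loglik_hat(0.5, y)
    samples = []
    for _ in range(500):
        prop = phi + rng.normal(0.0, 0.05)
        ll_prop = loglik_hat(prop, y)
        if np.log(rng.random()) < ll_prop - ll:
            phi, ll = prop, ll_prop
        samples.append(phi)
    print("Toy posterior mean of phi:", np.mean(samples[100:]))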

    Biomolecular simulations: From dynamics and mechanisms to computational assays of biological activity

    Biomolecular simulation is increasingly central to understanding and designing biological molecules and their interactions. Detailed, physics-based simulation methods are demonstrating rapidly growing impact in areas as diverse as biocatalysis, drug delivery, biomaterials, biotechnology, and drug design. Simulations offer the potential of uniquely detailed, atomic-level insight into mechanisms, dynamics, and processes, as well as increasingly accurate predictions of molecular properties. Simulations can now be used as computational assays of biological activity, for example, in predictions of drug resistance. Methodological and algorithmic developments, combined with advances in computational hardware, are transforming the scope and range of calculations. Different types of methods are required for different types of problem. Accurate methods and extensive simulations promise quantitative comparison with experiments across biochemistry. Atomistic simulations can now access experimentally relevant timescales for large systems, leading to a fertile interplay of experiment and theory and offering unprecedented opportunities for validating and developing models. Coarse-grained methods allow studies on larger length- and timescales, and theoretical developments are bringing electronic structure calculations into new regimes. Multiscale methods are another key focus for development, combining different levels of theory to increase accuracy, aiming to connect chemical and molecular changes to macroscopic observables. In this review, we outline biomolecular simulation methods and highlight examples of their application to investigate questions in biology. This article is categorized under: Molecular and Statistical Mechanics > Molecular Dynamics and Monte-Carlo Methods; Structure and Mechanism > Computational Biochemistry and Biophysics; Molecular and Statistical Mechanics > Free Energy Methods.