
    Design and Development of Software Tools for Bio-PEPA

    This paper surveys the design of software tools for the Bio-PEPA process algebra. Bio-PEPA is a high-level language for modelling biological systems such as metabolic pathways and other biochemical reaction networks. By providing tools for this modelling language we hope to make it easier to use a range of simulators and model-checkers, thereby freeing the modeller from the responsibility of developing a custom simulator for the problem of interest. Further, by providing mappings to a range of different analysis tools, the Bio-PEPA language allows modellers to compare analysis results computed by independent numerical analysers, which enhances the reliability and robustness of those results.

    Modelling chemistry in the nocturnal boundary layer above tropical rainforest and a generalised effective nocturnal ozone deposition velocity for sub-ppbv NOx conditions

    Measurements of atmospheric composition have been made over a remote rainforest landscape. A box model has previously been demonstrated to model the observed daytime chemistry well. However, the box model is unable to explain the nocturnal measurements of relatively high [NO] and [O3], but relatively low observed [NO2]. It is shown that a one-dimensional (1-D) column model with simple O3-NOx chemistry and a simple representation of vertical transport is able to explain the observed nocturnal concentrations and predict the likely vertical profiles of these species in the nocturnal boundary layer (NBL). Concentrations of tracers carried over from the end of the night can affect the atmospheric chemistry of the following day. To ascertain the anomaly introduced by using the box model to represent the NBL, vertically-averaged NBL concentrations at the end of the night are compared between the 1-D model and the box model. It is found that, under low to medium [NOx] conditions (NOx < 1 ppbv), a simple parametrisation can be used to modify the box model deposition velocity of ozone, in order to achieve good agreement between the box and 1-D models for these end-of-night concentrations of NOx and O3. This parametrisation could also be used in global climate-chemistry models with limited vertical resolution near the surface. Box-model results for the following day differ significantly if this effective nocturnal deposition velocity for ozone is implemented; for instance, there is a 9% increase in the following day's peak ozone concentration. However, under medium to high [NOx] conditions (NOx > 1 ppbv), the effect on the chemistry due to the vertical distribution of the species means no box model can adequately represent chemistry in the NBL without modifying reaction rate constants.
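    The box-model treatment discussed above can be pictured as a single well-mixed layer losing ozone by dry deposition. The toy sketch below (all values are illustrative, not the paper's fitted parametrisation; `v_d`, the NBL depth and initial [O3] are made up) shows the mechanism by which an enhanced "effective" nocturnal deposition velocity lowers the end-of-night O3 carried into the following day.

```python
def box_model_o3(o3_0, v_d, h, t_end, dt=60.0):
    """Integrate dry-deposition loss of O3 in a well-mixed box:
        d[O3]/dt = -(v_d / h) * [O3]
    o3_0 : initial concentration (ppbv)
    v_d  : deposition velocity (m s-1)
    h    : box (NBL) depth (m)
    t_end, dt : integration length and step (s)
    """
    o3 = o3_0
    for _ in range(int(t_end / dt)):
        o3 += -(v_d / h) * o3 * dt  # forward-Euler step
    return o3

# A 12 h night over a 100 m deep NBL (illustrative numbers only)
night = 12 * 3600.0
standard = box_model_o3(20.0, 0.001, 100.0, night)   # nominal v_d
effective = box_model_o3(20.0, 0.002, 100.0, night)  # enhanced "effective" v_d
```

    Doubling the deposition velocity roughly squares the fractional overnight loss (exponential decay), which is why the following day's chemistry is sensitive to how the NBL sink is represented.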

    NIRT: Developing a Nanoscale Sensing Device for Measuring the Supply of Iron to Phytoplankton in Marine Systems

    There is increasing evidence that Fe has a unique role in marine ecosystems, both regulating total phytoplankton production in high nitrate, low chlorophyll regions of the world, and influencing the predominant composition of the phytoplankton assemblages found in others. It is remarkable, then, that there is no agreement about how to define biologically available Fe, in contrast to the macronutrients nitrogen, phosphorus or silicon. Current attempts to attain predictive insight into how ocean ecosystems will influence the magnitude of climate change are blocked in large part by this question, along with an extreme shortage of data on Fe distributions in the oceans. There is recent evidence that Fe availability can be regulated in bulk seawater incubations by small additions of the fungal siderophore desferrioximine B (DFB). The Fe-DFB complex is not readily available to eukaryotic phytoplankton, so that if the quantity of Fe complexed by DFB were measured and calibrated to Fe uptake by phytoplankton it could yield a novel first-order measure of Fe availability. Building from our current research, we have developed liposomes that specifically acquire DFB-bound Fe from solution. These devices, 100 nm in diameter, open the way to applying nanotechnology to create a new breed of Fe biosensors in marine waters. The project goals are to 1) optimize these nanodevices by improving their physical robustness, identifying the size/functionality relationship, and examining the efficacy of other DFB-Fe transporter molecules, 2) develop self-reporting capabilities for quantifying Fe uptake by these nanodevices, and 3) calibrate the capture of Fe by these nanodevices to the Fe uptake by various phytoplankton species. The anticipated final product will be a calibrated nanoscale biosensor for laboratory-scale use that could then be adapted for deployment on remote vehicles.
Broader Impacts Resulting from the Proposed Activity: The two institutions involved in this project (U. Maine and Colby College) have a strong track record of involving undergraduate and graduate students in cutting-edge research in marine science and chemistry, and this project will continue this process.

    Assessing the Predictions of Biogenic Volatile Organic Compound Emissions from Multiple Chemical Transport Models Within the Greater Metropolitan Region NSW

    Within the Greater Metropolitan Region NSW, consideration of the accuracy of the predicted biogenic emissions input to chemical transport models is important. These biogenic emissions react with anthropogenic compounds, producing organic aerosol and ground-level ozone, which negatively impact the wider environment. Despite this, there have been few studies in the area regarding these compounds, and large uncertainty exists. To address this issue, the predictions of biogenic emissions from MEGAN and the CSIRO-CTM, within the Greater Metropolitan Region, were assessed using computational and statistical methods. This involved a model intercomparison between three different model implementations run for February 2011, an assessment of the seasonal variability of predicted emissions using a complete 2013 dataset, and a comparison between the outputs of one model using February 2011 and 2013 data. Predicted emissions from these models revealed that photosynthetically active radiation and temperature explain the majority of the temporal variation in the predicted emissions, resulting in a diurnal distribution. However, the majority of the spatial variation is explained by leaf area index and broadleaf vegetation cover within each of the models. It was also found that implementations of MEGAN predict higher total emissions than the CSIRO-CTM, with higher emissions of isoprene and lower emissions of monoterpenes. Each model also predicts high levels of emissions over national parks. Emissions were found to be seasonally variable, highest during summer and lowest during winter, while the spatial distribution remained nearly unchanged throughout the year. The emission predictions for February 2013 were found to be significantly higher than those for February 2011, owing to the increased temperatures predicted for 2013.
This research highlights the importance of using up-to-date and accurate model inputs, and the need for further biogenic flux measurements in the area.
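    The dependence of predicted emissions on photosynthetically active radiation and temperature described above can be sketched with Guenther-style activity factors. This is a simplified illustration using the classic constants from Guenther et al. (1993); the actual MEGAN and CSIRO-CTM implementations are considerably more elaborate, and the basal rate and LAI values here are placeholders.

```python
import math

R = 8.314     # gas constant, J mol-1 K-1
T_S = 303.0   # standard temperature (K)
T_M = 314.0   # temperature of maximum emission (K)

def gamma_light(ppfd, alpha=0.0027, c_l1=1.066):
    """Light activity factor; ppfd in umol m-2 s-1 (saturates near 1)."""
    return alpha * c_l1 * ppfd / math.sqrt(1.0 + alpha**2 * ppfd**2)

def gamma_temp(t, c_t1=95000.0, c_t2=230000.0):
    """Temperature activity factor; rises with t up to ~T_M, then falls."""
    num = math.exp(c_t1 * (t - T_S) / (R * T_S * t))
    den = 1.0 + math.exp(c_t2 * (t - T_M) / (R * T_S * t))
    return num / den

def isoprene_emission(eps, lai, ppfd, t):
    """Emission = basal rate * leaf area index * light and temperature factors."""
    return eps * lai * gamma_light(ppfd) * gamma_temp(t)
```

    The light factor reproduces the diurnal cycle (zero at night, saturating at high irradiance), while the exponential temperature factor is what drives the summer-high, winter-low seasonality and the higher February 2013 predictions noted above.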

    Uncertainty quantification in classical molecular dynamics

    Molecular dynamics simulation is now a widespread approach for understanding complex systems on the atomistic scale. It finds applications from physics and chemistry to engineering and the life and medical sciences. In the last decade, the approach has begun to advance from being a computer-based means of rationalizing experimental observations to producing apparently credible predictions for a number of real-world applications within industrial sectors such as advanced materials and drug discovery. However, key aspects concerning the reproducibility of the method have not kept pace with the speed of its uptake in the scientific community. Here, we present a discussion of uncertainty quantification for molecular dynamics simulation designed to endow the method with better error estimates that will enable it to be used to report actionable results. The approach adopted is a standard one in the field of uncertainty quantification, namely using ensemble methods, in which a sufficiently large number of replicas are run concurrently, from which reliable statistics can be extracted. Indeed, because molecular dynamics is intrinsically chaotic, the need to use ensemble methods is fundamental and holds regardless of the duration of the simulations performed. We discuss the approach and illustrate it in a range of applications from materials science to ligand-protein binding free energy estimation. This article is part of the theme issue 'Reliability and reproducibility in computational science: implementing verification, validation and uncertainty quantification in silico'.
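    The ensemble idea described above reduces, at the analysis stage, to treating each replica's observable as one independent sample and attaching an uncertainty to their mean. A minimal sketch, with synthetic toy data standing in for per-replica binding free energies (the values and replica count are invented, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

def ensemble_estimate(replica_values, n_boot=10000):
    """Mean and bootstrap 95% confidence interval over independent
    replica averages (one value per concurrently-run replica)."""
    vals = np.asarray(replica_values, dtype=float)
    # Resample replicas with replacement and average each resample
    boots = rng.choice(vals, size=(n_boot, vals.size)).mean(axis=1)
    lo, hi = np.percentile(boots, [2.5, 97.5])
    return vals.mean(), (lo, hi)

# Toy stand-in: 25 replica estimates of a binding free energy (kcal/mol)
replicas = rng.normal(-10.0, 0.8, size=25)
mean, (lo, hi) = ensemble_estimate(replicas)
```

    Because the chaotic divergence of trajectories makes individual runs unreliable, the actionable quantity is the ensemble mean with its interval, not any single replica's value.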

    AMPERE Newsletter. Issue 102


    Novel in vitro and mathematical models for the prediction of chemical toxicity

    The focus of much scientific and medical research is directed towards understanding the disease process and defining therapeutic intervention strategies. In contrast, the scientific basis of drug safety has received relatively little attention, despite the fact that adverse drug reactions (ADRs) are a major health concern and a serious impediment to the development of new medicines. Toxicity issues account for ~21% of drug attrition during drug development, and safety testing strategies require considerable animal use. Mechanistic relationships between drug plasma levels and the molecular/cellular events that culminate in whole-organ toxicity underpin the development of novel safety assessment strategies. Current in vitro test systems are poorly predictive of the toxicity of chemicals entering the systemic circulation, particularly to the liver. Such systems fall short because of 1) the physiological gap between the cells currently used and human hepatocytes existing in their native state, 2) the lack of physiological integration with other cells/systems within organs, required to amplify the initial toxicological lesion into overt toxicity, 3) the inability to assess how low-level cell damage induced by chemicals may develop into overt organ toxicity in a minority of patients, and 4) the lack of consideration of systemic effects. Reproduction of centrilobular and periportal hepatocyte phenotypes in in vitro culture is crucial for sensitive detection of cellular stress. Hepatocyte metabolism/phenotype is dependent on cell position along the liver lobule, with corresponding differences in exposure to substrate, oxygen and hormone gradients. Application of bioartificial liver (BAL) technology can encompass in vitro predictive toxicity testing with enhanced sensitivity and improved mechanistic understanding.
Combining this technology with mechanistic mathematical models describing intracellular metabolism, fluid flow, substrate, hormone and nutrient distribution provides the opportunity to design the BAL specifically to mimic the in vivo scenario. Such mathematical models enable theoretical hypothesis testing, will inform the design of in vitro experiments, and will enable both refinement and reduction of in vivo animal trials. In this way, development of novel mathematical modelling tools will help to focus and direct in vitro and in vivo research, and can be used as a framework for other areas of drug safety science.
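    The periportal-to-centrilobular gradients mentioned above can be sketched as a steady-state plug-flow balance along a sinusoid with Michaelis-Menten hepatocyte uptake. This is a toy illustration, not the thesis's model: the parameter values and units below are invented for demonstration only.

```python
import numpy as np

def sinusoid_oxygen_profile(c_in, v, vmax, km, length, n=200):
    """Steady-state oxygen along a sinusoid under plug flow with
    Michaelis-Menten uptake by the lining hepatocytes:
        v * dC/dx = -vmax * C / (km + C)
    Solved by simple explicit stepping in x (n steps over `length`)."""
    dx = length / n
    c = np.empty(n + 1)
    c[0] = c_in  # periportal (inlet) concentration
    for i in range(n):
        uptake = vmax * c[i] / (km + c[i])
        c[i + 1] = max(c[i] - uptake / v * dx, 0.0)
    return c

# Illustrative values only: inlet O2 0.2 mM, flow 2e-4 m/s, 500 um sinusoid
profile = sinusoid_oxygen_profile(c_in=0.2, v=2e-4, vmax=1e-2,
                                  km=0.02, length=5e-4)
```

    The monotonic decline from inlet to outlet is the qualitative point: cells at the downstream (centrilobular) end see systematically lower oxygen and substrate, which is why position-dependent phenotype matters for a BAL designed to mimic the lobule.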

    From qualitative data to quantitative models: analysis of the phage shock protein stress response in Escherichia coli

    Background: Bacteria have evolved a rich set of mechanisms for sensing and adapting to adverse conditions in their environment. These are crucial for their survival, which requires them to react to extracellular stresses such as heat shock, ethanol treatment or phage infection. Here we focus on studying the phage shock protein (Psp) stress response in Escherichia coli, induced by a phage infection or other damage to the bacterial membrane. This system has not yet been theoretically modelled or analysed in silico. Results: We develop a model of the Psp response system, and illustrate how such models can be constructed and analysed in light of available sparse and qualitative information in order to generate novel biological hypotheses about their dynamical behaviour. We analyse this model using tools from Petri-net theory and study the dynamical range that is consistent with currently available knowledge by conditioning model parameters on the available data in an approximate Bayesian computation (ABC) framework. Within this ABC approach we analyse stochastic and deterministic dynamics. This analysis allows us to identify different types of behaviour, and these mechanistic insights can in turn be used to design new, more detailed and time-resolved experiments. Conclusions: We have developed the first mechanistic model of the Psp response in E. coli. This model allows us to predict the possible qualitative stochastic and deterministic dynamic behaviours of key molecular players in the stress response. Our inferential approach can be applied to stress response and signalling systems more generally: in the ABC framework we can condition mathematical models on qualitative data in order to delimit, for example, parameter ranges or the qualitative system dynamics in light of available end-point or qualitative information.
    Funding: Medical Research Council (Great Britain); Biotechnology and Biological Sciences Research Council (Great Britain); Wellcome Trust (London, England); Royal Society (Great Britain) Wolfson Research Merit Award.
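    The ABC conditioning step described above can be sketched with the simplest variant, rejection ABC: draw parameters from the prior, simulate, and keep only draws whose output falls within a tolerance of the observation. The toy system below (exponential decay constrained by a single end-point measurement) is an invented stand-in for the Psp model, chosen to mirror how end-point data delimit a parameter range.

```python
import numpy as np

rng = np.random.default_rng(1)

def abc_rejection(simulate, distance, observed, prior_sample, n_accept, eps):
    """Rejection ABC: accept theta ~ prior whenever the simulated
    data lie within eps of the observed data."""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample()
        if distance(simulate(theta), observed) < eps:
            accepted.append(theta)
    return np.array(accepted)

# Toy: infer decay rate k from one end-point, x(1) = exp(-k) = 0.37
observed = 0.37
posterior = abc_rejection(
    simulate=lambda k: np.exp(-k),
    distance=lambda a, b: abs(a - b),
    observed=observed,
    prior_sample=lambda: rng.uniform(0.0, 5.0),  # broad prior on k
    n_accept=200,
    eps=0.05,
)
```

    The accepted draws concentrate around k = -ln(0.37) ≈ 1.0; the width of that accepted set is exactly the "delimited parameter range" consistent with the qualitative/end-point data.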

    The role of expectations and visions of the future in the development of target-based environmental policies: the case of the UK Air Quality Strategy

    Increasingly, policy-makers rely on forecasts to set targets for environmental and health protection. I examine the UK Air Quality Strategies (AQS) for particulate matter (1997-2007). Here policy-makers select and articulate visions for technological and policy developments in order to set targets and policies to achieve them. Despite growing evidence for adverse health effects of particulates, challenging targets in 1997 were followed by two revisions of Objectives without the introduction of measures for reducing pollution. In 2007 more challenging targets were resumed. This thesis is a study of the formation and evolution of a policy framework: of the interactions and contrasting roles of scientific expertise, wider political discourse, and the 'futures' presented by actors involved in the policy process. Sociology of Expectations has previously examined the roles of visions in innovation processes. I extended this framework to examine the dynamics of visions in the policy-making process. My findings were based on analysis of visions and discourses identified in texts, model data, and interviews. Whilst none of the explanatory factors alone accounted for the developments in the AQS, together they provide an explanation of change which highlights the role of learning by policy-makers. Visions for technological development articulated in each version of the AQS were in line with the dominant visions articulated in central government, but over time policy-makers responsible for the Strategy used them to present options for taking action on pollution. Co-construction of the AQS and modelled forecasts enabled policy-makers responsible for the Strategy to articulate visions for technologies and policies to promote taking action to reduce pollutants, and this led to the more action-oriented Strategy in 2007. This thesis proposes that visions can change more quickly than wider political discourses, and as such can provide opportunities for the introduction of new discourses.

    Numerical modelling of two HMX-based plastic-bonded explosives at the mesoscale

    Mesoscale models are needed to predict the effect of changes to the microstructure of plastic-bonded explosives on their shock initiation and detonation behaviour. This thesis describes the considerable progress that has been made towards a mesoscale model for two HMX-based explosives PBX9501 and EDC37. In common with previous work in the literature, the model is implemented in hydrocodes that have been designed for shock physics and detonation modelling. Two relevant physics effects, heat conduction and Arrhenius chemistry, are added to a one-dimensional Lagrangian hydrocode and correction factors are identified to improve total energy conservation. Material models are constructed for the HMX crystals and polymer binders in the explosives, and are validated by comparison to Hugoniot data, Pop-plot data and detonation wave profiles. One and two-dimensional simulations of PBX9501 and EDC37 microstructures are used to investigate the response of the bulk explosive to shock loading. The sensitivity of calculated temperature distributions to uncertainties in the material properties data is determined, and a thermodynamic explanation is given for time-independent features in temperature profiles. Hotspots are widely accepted as being responsible for shock initiation in plastic-bonded explosives. It is demonstrated that, although shock heating of crystals and binder is responsible for temperature localisation, it is not a feasible hotspot mechanism in PBX9501 and EDC37 because the temperatures generated are too low to cause significant chemical reaction in the required timescales. Critical hotspot criteria derived for HMX and the binders compare favourably to earlier studies. The speed of reaction propagation from hotspots into the surrounding explosive is validated by comparison to flame propagation data, and the temperature of the gaseous reaction products is identified as being responsible for negative pressure dependence. 
Hotspot size, separation and temperature requirements are identified which can be used to eliminate candidate mechanisms in future work.
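    The timescale argument above, that shock heating alone leaves bulk temperatures too low for significant reaction, follows directly from the exponential sensitivity of Arrhenius kinetics. The sketch below uses representative single-step Arrhenius constants of the kind quoted for HMX in the literature (A ≈ 5e19 s-1, Ea ≈ 220 kJ/mol); these are illustrative, not the fitted values from this thesis.

```python
import math

R = 8.314  # gas constant, J mol-1 K-1

def arrhenius_rate(t, a=5e19, ea=2.2e5):
    """First-order Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return a * math.exp(-ea / (R * t))

def induction_time(t, a=5e19, ea=2.2e5):
    """Crude induction-time proxy: t_ind ~ 1 / k(T)."""
    return 1.0 / arrhenius_rate(t, a, ea)

# Shock-heated bulk material vs. a hot spot (illustrative temperatures)
t_bulk = induction_time(700.0)     # ~ sub-millisecond scale
t_hotspot = induction_time(1000.0) # ~ nanosecond scale
```

    A few hundred kelvin of extra localisation shortens the reaction timescale by around five orders of magnitude, which is why temperatures reached by shock heating alone cannot drive significant reaction on shock-initiation timescales, while genuine hotspots can.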