
    Modelling the cAMP pathway using BioNessie, and the use of BVP techniques for solving ODEs (Poster Presentation)

    Copyright © 2007 Gu et al; licensee BioMed Central Ltd. Biochemists often conduct experiments in vivo in order to explore observable behaviours and understand the dynamics of many intercellular and intracellular processes. However, an intuitive understanding of these dynamics is hard to obtain because most pathways of interest involve components connected via interlocking loops. Formal methods for modelling and analysing biochemical pathways are therefore indispensable. To this end, ODEs (ordinary differential equations) have been widely adopted as a method to model biochemical pathways because they have an unambiguous mathematical format and are amenable to rigorous quantitative analysis. BioNessie (http://www.bionessie.com) is a workbench for the composition, simulation and analysis of biochemical networks being developed by the Systems Biology team at the Bioinformatics Research Centre as part of a large DTI-funded project, 'BPS: A Software Tool for the Simulation and Analysis of Biochemical Networks' (http://www.brc.dcs.gla.ac.uk/projects/dti_beacon). BioNessie is written in Java using NetBeans Platform libraries, which makes it platform independent. The software employs specialised differential equation solvers for stiff and non-stiff systems to produce model simulation traces. BioNessie provides a user-friendly interface with an intuitive tree-based graphical layout, an editor for SBML-compatible models, and data output features.
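
    As an illustration of the ODE-based modelling approach described above (not BioNessie's own code or API), the following minimal Python sketch simulates a hypothetical two-species pathway with a solver that handles both stiff and non-stiff regimes; the species names and rate constants are assumptions made for the example.

```python
# Minimal sketch of ODE-based pathway simulation (illustrative only; not BioNessie code).
# Hypothetical two-species model: a substrate S is converted to cAMP, which is then degraded.
import numpy as np
from scipy.integrate import solve_ivp

k_syn, k_deg = 0.8, 0.3          # hypothetical rate constants

def pathway(t, y):
    s, camp = y                  # substrate and cAMP concentrations
    ds = -k_syn * s
    dcamp = k_syn * s - k_deg * camp
    return [ds, dcamp]

# LSODA switches automatically between stiff and non-stiff integration,
# mirroring the stiff/non-stiff solver choice mentioned in the abstract.
sol = solve_ivp(pathway, (0.0, 20.0), [1.0, 0.0], method="LSODA",
                t_eval=np.linspace(0.0, 20.0, 201))

for t, c in zip(sol.t[::50], sol.y[1][::50]):
    print(f"t={t:5.1f}  [cAMP]={c:.4f}")
```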

    Publishing and sharing multi-dimensional image data with OMERO

    Imaging data are used in the life and biomedical sciences to measure the molecular and structural composition and dynamics of cells, tissues, and organisms. Datasets range in size from megabytes to terabytes and usually contain a combination of binary pixel data and metadata that describe the acquisition process and any derived results. The OMERO image data management platform allows users to securely share image datasets according to specific permission levels: data can be held privately, shared with a set of colleagues, or made available via a public URL. Users control access by assigning data to specific Groups with defined membership and access rights. OMERO's permission system supports simple data sharing in a lab, collaborative data analysis, and even teaching environments. OMERO software is open source and released by the OME Consortium at www.openmicroscopy.org.
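
    The sharing levels described above (private, group-restricted, public) can be sketched conceptually as follows; this is an illustrative Python model of that permission logic, not OMERO's actual API or data model.

```python
# Conceptual sketch of the group-based sharing model described above
# (illustration only; not OMERO's API or internal implementation).
from enum import Enum

class Visibility(Enum):
    PRIVATE = "private"        # only the owner can read
    GROUP = "group"            # members of the owning group can read
    PUBLIC = "public"          # anyone, e.g. via a public URL, can read

def can_read(dataset_owner, dataset_group, visibility, user, user_groups):
    """Return True if `user` may read a dataset with the given visibility."""
    if user == dataset_owner:
        return True
    if visibility is Visibility.PUBLIC:
        return True
    if visibility is Visibility.GROUP:
        return dataset_group in user_groups
    return False

# Example with a hypothetical "light-sheet-lab" group:
print(can_read("alice", "light-sheet-lab", Visibility.GROUP,
               "bob", {"light-sheet-lab", "teaching"}))   # True
print(can_read("alice", "light-sheet-lab", Visibility.PRIVATE,
               "bob", {"light-sheet-lab"}))               # False
```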

    Empirical study of the influence of social groups in evacuation scenarios

    The effects of social groups on pedestrian dynamics, especially in evacuation scenarios, have attracted some interest recently. However, due to the lack of reliable empirical data, most studies have focussed on modelling aspects. It was shown that social groups can have a considerable effect, e.g. on evacuation times. In order to test the model predictions we have performed laboratory evacuation experiments with different types and sizes of social groups. The experiments were performed with pupils of different ages. The parameters considered are (1) group size, (2) strength of intra-group interactions, and (3) composition of the groups (young adults, children, and mixtures). For all experiments, high-quality trajectories of all participants were obtained using the PeTrack software, which allows for a detailed analysis of the group effects. One surprising observation is a decrease of the evacuation time with increasing group size. Comment: 8 pages, 4 figures, to be published in Traffic and Granular Flow '15 (Springer, 2016).
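
    As a purely hypothetical illustration of the kind of analysis such trajectories enable, the sketch below computes an evacuation time per group-size condition from recorded exit times; the file layout and column names are assumptions, not the PeTrack output format.

```python
# Hypothetical analysis sketch: evacuation time as a function of group size.
# The CSV layout (columns: person_id, group_size, exit_time_s) is assumed for
# illustration and does not reflect the actual PeTrack output format.
import csv
from collections import defaultdict

def evacuation_times_by_group_size(path):
    last_exit = defaultdict(float)          # group size -> latest exit time
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            size = int(row["group_size"])
            t_exit = float(row["exit_time_s"])
            last_exit[size] = max(last_exit[size], t_exit)
    return dict(sorted(last_exit.items()))

# Example usage with a hypothetical file name:
# for size, t in evacuation_times_by_group_size("run01_exits.csv").items():
#     print(f"group size {size}: evacuation time {t:.1f} s")
```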

    Exploring the limits of knowledge on boreal peatland development using a new model: the Holocene Peatland Model

    The Holocene Peatland Model (HPM) (Frolking et al. 2009, Frolking et al. in prep.) is a recently developed tool that integrates up-to-date knowledge on peatland dynamics and explores peatland development and carbon dynamics on a millennial timescale. HPM combines the water and carbon cycles with net primary production and peat decomposition and takes the multiple feedbacks between them into account. The model remains simple and requires few site-specific inputs. HPM simulates the transient development of the peatland and delivers peat age, peat depth, peat composition, carbon accumulation and water table depth for each simulated year. The ability of the model to reproduce peatland development can be evaluated in several ways. Commonly, one would compare simulation results with observations from field data. However, we argue that the overall response of the model does not say much about the value of the model design. Modelling peatland dynamics requires extensive information on the behaviour of a peatland system within its environment, including allogenic changes in climate, hydrological conditions and nutrient availability, as well as autogenic processes such as microtopographical effects. The current state of knowledge does not cover all processes, interactions or feedbacks, and many peatland properties are neither well defined nor yet measured, so that estimates were needed to build the model. The work presented here aims to analyse the role of the model parameterization in the simulation results. To do so, a sensitivity analysis is performed with a Monte Carlo approach and the GUI-HDMR software (Ziehn and Tomlin, 2009). This method ranks the parameters, and combinations of them, according to their influence on the simulation results. The results will emphasize how sensitive the simulation is to the parameter values. First, the distribution of outputs gives insight into the possible responses of the simulation to HPM's assemblage of current knowledge. Second, the importance of some parameters for the simulation results points out certain gaps in the current understanding of peatland dynamics. Thus, this study helps determine avenues that should be explored in the future to improve our understanding of peatland dynamics.
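
    The ranking step can be illustrated with a generic Monte Carlo sensitivity sketch; the toy model, parameter names and ranges below are stand-ins for illustration, not HPM or the GUI-HDMR method itself.

```python
# Generic Monte Carlo sensitivity sketch (toy stand-in model, not HPM or GUI-HDMR).
# Parameters are sampled uniformly; each parameter's influence is estimated from
# the squared correlation between its samples and the model output.
import numpy as np

rng = np.random.default_rng(0)

def toy_peat_model(productivity, decay_rate, water_table):
    """Hypothetical stand-in for a peat accumulation model."""
    return productivity * np.exp(-decay_rate * 5000.0) + 0.1 * water_table

n = 10_000
params = {
    "productivity": rng.uniform(0.1, 0.5, n),    # hypothetical ranges
    "decay_rate": rng.uniform(1e-5, 1e-4, n),
    "water_table": rng.uniform(0.0, 0.4, n),
}
output = toy_peat_model(params["productivity"],
                        params["decay_rate"],
                        params["water_table"])

# Rank parameters by the fraction of output variance they explain (first-order proxy).
for name, values in params.items():
    r = np.corrcoef(values, output)[0, 1]
    print(f"{name:>12}: R^2 = {r**2:.3f}")
```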

    Validation and Verification of LADEE Models and Software

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the Moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a model-based software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight software requirements are prototyped and refined using the simulated models. After a model is shown to work as desired in this simulation framework, C code is automatically generated from the models. The generated software is then tested in real-time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational science models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses is used to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

    Dynamic Composition of Cyber-Physical Systems

    Future cyber-physical systems must fulfill strong demands on timeliness and reliability, so that the safety of their operational environment is never violated. At the same time, such systems are networked computers with the typical demand for reconfigurability and software modification. The combination of both expectations makes established modeling and analysis techniques difficult to apply, since they cannot scale with the number of possible operational constellations resulting from the dynamics. The problem increases when components with different non-functional demands are combined into one cyber-physical system and updated independently of each other. We propose a new approach for the design and development of composable, dynamic and dependable software architectures, with a focus on networked embedded systems. Our key concept is the specification of software components and their non-functional composition constraints in the formal language TLA+. We discuss how this technique can be embedded in an overall software design workflow, and show its practical applicability with a detailed resource scheduling example.
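
    As a rough, code-level illustration of one kind of non-functional composition constraint (a plain Python sketch, not the paper's TLA+ specification), the following admits a composition of components only while a CPU-utilization bound holds; the component names and numbers are hypothetical.

```python
# Illustrative sketch of a non-functional composition constraint
# (plain Python, not the TLA+ specification described in the abstract).
# Each component declares a worst-case execution time and a period; a
# composition is admitted only if the total CPU utilization stays below a
# chosen bound, so timeliness is not silently violated by later updates.
from dataclasses import dataclass

@dataclass(frozen=True)
class Component:
    name: str
    wcet_ms: float     # worst-case execution time per activation
    period_ms: float   # activation period

def utilization(components):
    return sum(c.wcet_ms / c.period_ms for c in components)

def admit(components, bound=0.7):
    """Admit the composition only if the utilization bound is respected."""
    return utilization(components) <= bound

# Hypothetical components of a networked embedded node:
system = [Component("sensor_fusion", 2.0, 10.0),
          Component("control_loop", 1.0, 5.0),
          Component("logger", 4.0, 100.0)]
print(admit(system))                                    # True: 0.44 <= 0.7
print(admit(system + [Component("video", 8.0, 20.0)]))  # False: 0.84 > 0.7
```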

    Experimental analysis and simulation of catalytic converters

    The purpose of this chapter is to present the results of an experimental study of the performance of ceramic monolith three-way catalytic converters (TWCC) employed in automotive exhaust lines for the reduction of gasoline emissions. Two ceramic converters of different cell density, substrate length, hydraulic channel diameter and wall thickness were investigated. After the tests were completed, the converters were cut to extract the substrate, or 'honeycomb', from the housing, which was then analyzed for microstructure and material composition using Scanning Electron Microscopy (SEM) and Energy Dispersive X-ray Analysis (EDX). A simulation using the commercial computational fluid dynamics (CFD) software packages GAMBIT and FLUENT 6.1 was used to verify the experimental results.