
    A mass action model of a fibroblast growth factor signaling pathway and its simplification

    We consider a kinetic law of mass action model for Fibroblast Growth Factor (FGF) signaling, focusing on the induction of the RAS-MAP kinase pathway via GRB2 binding. Our biologically simple model suffers a combinatorial explosion in the number of differential equations required to simulate the system. In addition to numerically solving the full model, we show that it can be accurately simplified. This requires combining matched asymptotics, the quasi-steady state hypothesis, and the fact that subsets of the equations decouple asymptotically. Both the full and simplified models reproduce the qualitative dynamics observed experimentally and in previous stochastic models. The simplified model also elucidates both the qualitative features of GRB2 binding and the complex relationship between SHP2 levels, the rate at which SHP2 induces dephosphorylation, and levels of bound GRB2. In addition to providing insight into the important and redundant features of FGF signaling, such work further highlights the usefulness of numerous simplification techniques in the study of mass action models of signal transduction, as also illustrated recently by Borisov and co-workers (Borisov et al. in Biophys. J. 89, 951–66, 2005, and Biosystems 83, 152–66, 2006; Kiyatkin et al. in J. Biol. Chem. 281, 19925–19938, 2006). These developments will facilitate the construction of tractable models of FGF signaling, incorporating further biological realism, such as spatial effects or realistic binding stoichiometries, despite the more severe combinatorial explosion associated with the latter.
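
    As a rough illustration of the modelling approach described in this abstract (not the paper's FGF model; the reaction scheme, rate constants and concentrations below are purely illustrative), a mass-action system can be integrated in full and compared against a quasi-steady-state reduction:

```python
# Illustrative mass-action scheme E + S <-> ES -> E + P, integrated in full
# and compared with its quasi-steady-state (Michaelis-Menten) reduction.
# All species names and rate constants are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

k_on, k_off, k_cat = 1.0, 0.5, 0.1   # illustrative rate constants
E0, S0 = 0.1, 10.0                   # total enzyme and initial substrate

def full_model(t, y):
    E, S, ES, P = y
    v_bind = k_on * E * S - k_off * ES   # net complex formation
    v_cat = k_cat * ES                   # product formation
    return [-v_bind + v_cat, -v_bind, v_bind - v_cat, v_cat]

def qssa_model(t, y):
    S, P = y
    Km = (k_off + k_cat) / k_on
    v = k_cat * E0 * S / (Km + S)        # quasi-steady-state rate law
    return [-v, v]

t_eval = np.linspace(0.0, 200.0, 400)
full = solve_ivp(full_model, (0.0, 200.0), [E0, S0, 0.0, 0.0], t_eval=t_eval)
red = solve_ivp(qssa_model, (0.0, 200.0), [S0, 0.0], t_eval=t_eval)
print("max |P_full - P_qssa| =", np.max(np.abs(full.y[3] - red.y[1])))
```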

    On Quantitative Software Verification


    Modelling DNA Origami Self-Assembly at the Domain Level

    We present a modelling framework, and basic model parameterization, for the study of DNA origami folding at the level of DNA domains. Our approach is explicitly kinetic and does not assume a specific folding pathway. The binding of each staple is associated with a free-energy change that depends on staple sequence, the possibility of coaxial stacking with neighbouring domains, and the entropic cost of constraining the scaffold by inserting staple crossovers. A rigorous thermodynamic model is difficult to implement as a result of the complex, multiply connected geometry of the scaffold: we present a solution to this problem for planar origami. Coaxial stacking and entropic terms, particularly when loop closure exponents are taken to be larger than those for ideal chains, introduce interactions between staples. These cooperative interactions lead to the prediction of sharp assembly transitions with notable hysteresis that are consistent with experimental observations. We show that the model reproduces the experimentally observed consequences of reducing staple concentration, accelerated cooling and absent staples. We also present a simpler methodology that gives consistent results and can be used to study a wider range of systems including non-planar origami.
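
    To make the free-energy decomposition concrete, the sketch below estimates the occupancy of a single staple from a two-state model; the numerical values and the simple loop penalty are hypothetical placeholders, not the paper's parameterization, and cooperative interactions between staples are ignored:

```python
# Illustrative two-state estimate of a single staple's occupancy from a
# free-energy decomposition dG = dG_sequence + dG_stack + dG_loop, with an
# entropic loop-closure penalty governed by an exponent gamma.
# All numbers are hypothetical.
import math

R = 1.987e-3            # gas constant, kcal/(mol*K)
T = 330.0               # K, illustrative annealing temperature
staple_conc = 100e-9    # M, free staple concentration

dG_sequence = -16.0     # kcal/mol, hypothetical duplex free energy
dG_stack = -2.0         # kcal/mol, hypothetical coaxial-stacking bonus
gamma = 2.0             # loop-closure exponent (larger than the ideal-chain value)
loop_length = 300       # scaffold bases constrained by the staple crossover

dG_loop = gamma * R * T * math.log(loop_length)    # entropic cost of loop closure
dG_total = dG_sequence + dG_stack + dG_loop

K = math.exp(-dG_total / (R * T))                  # binding constant
p_bound = K * staple_conc / (1.0 + K * staple_conc)
print(f"estimated staple occupancy at {T:.0f} K: {p_bound:.3f}")
```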

    Tableaux for Policy Synthesis for MDPs with PCTL* Constraints

    Markov decision processes (MDPs) are the standard formalism for modelling sequential decision making in stochastic environments. Policy synthesis addresses the problem of how to control or limit the decisions an agent makes so that a given specification is met. In this paper we consider PCTL*, the probabilistic counterpart of CTL*, as the specification language. Because in general the policy synthesis problem for PCTL* is undecidable, we restrict to policies whose execution history memory is finitely bounded a priori. Surprisingly, no algorithm for policy synthesis for this natural and expressive framework has been developed so far. We close this gap and describe a tableau-based algorithm that, given an MDP and a PCTL* specification, derives in a non-deterministic way a system of (possibly nonlinear) equalities and inequalities. The solutions of this system, if any, describe the desired (stochastic) policies. Our main result in this paper is the correctness of our method, i.e., soundness, completeness and termination. Comment: This is a long version of a conference paper published at TABLEAUX 2017. It contains proofs of the main results and fixes a bug. See the footnote on page 1 for details.
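
    The reduction to (in)equalities can be illustrated on a toy MDP (this example is not from the paper; the transition probabilities, threshold and single-parameter policy are invented for illustration). A memoryless stochastic policy that plays action a with probability x induces a reachability probability p0(x) satisfying a fixpoint equation that is nonlinear in x, and synthesis against P>=0.8 [F goal] amounts to solving the resulting inequality:

```python
# Toy MDP: in state s0, action a reaches goal with prob. 0.5 and loops back
# to s0 with prob. 0.5; action b reaches goal with prob. 0.2 and a sink with
# prob. 0.8. Playing a with probability x gives the fixpoint equation
#     p0 = x*(0.5 + 0.5*p0) + (1 - x)*0.2 ,
# whose solution is checked against the threshold 0.8 below.
import numpy as np

def reach_prob(x):
    # closed-form solution of the fixpoint equation above
    return (0.2 + 0.3 * x) / (1.0 - 0.5 * x)

xs = np.linspace(0.0, 1.0, 1001)
feasible = xs[np.array([reach_prob(x) >= 0.8 for x in xs])]
print("policy satisfies P>=0.8 [F goal] for x >=", round(feasible.min(), 3))
```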

    Multiple verification in computational modeling of bone pathologies

    We introduce a model checking approach to diagnose the emergence of bone pathologies. The implementation of a new model of bone remodeling in PRISM has led to an interesting characterization of osteoporosis as a defective bone remodeling dynamics with respect to other bone pathologies. Our approach allows us to derive three types of model checking-based diagnostic estimators. The first diagnostic measure focuses on the level of bone mineral density, which is currently used in medical practice. In addition, we have introduced a novel diagnostic estimator which uses the full patient clinical record, here simulated using the modeling framework. This estimator detects rapid (months) negative changes in bone mineral density. Independently of the actual bone mineral density, when the decrease occurs rapidly it is important to alert the patient and monitor him/her more closely to detect the onset of other bone co-morbidities. A third estimator takes into account the variance of the bone density, which could support the investigation of metabolic syndromes, diabetes and cancer. Our implementation could make use of different logical combinations of these statistical estimators and could incorporate other biomarkers for other systemic co-morbidities (for example diabetes and thalassemia). We are delighted to report that the combination of stochastic modeling with formal methods motivates a new diagnostic framework for complex pathologies. In particular, our approach takes into consideration important properties of biosystems such as multiscale organization and self-adaptiveness. The multi-diagnosis could be further expanded, inching towards the complexity of human diseases. Finally, we briefly introduce self-adaptiveness in formal methods, which is a key property in the regulative mechanisms of biological systems and well known in other mathematical and engineering areas. Comment: In Proceedings CompMod 2011, arXiv:1109.104
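
    A minimal sketch of the three kinds of estimator described above, applied to a simulated bone mineral density record sampled monthly; the thresholds and the example record are hypothetical placeholders, not values from the paper:

```python
# Three illustrative estimators over a (simulated) bone mineral density
# time series: absolute level, rapid recent loss, and variability.
# Thresholds and data are hypothetical.
import numpy as np

def diagnose(bmd, level_thr=0.8, drop_thr=-0.05, window=3, var_thr=0.002):
    bmd = np.asarray(bmd, dtype=float)
    low_level = bmd[-1] < level_thr                        # estimator 1: density level
    rapid_loss = (bmd[-1] - bmd[-1 - window]) < drop_thr   # estimator 2: fast recent drop
    high_variance = bmd.var() > var_thr                    # estimator 3: variability
    return {"low_density": bool(low_level),
            "rapid_loss": bool(rapid_loss),
            "high_variance": bool(high_variance)}

# Simulated patient record: slow decline followed by a sharp recent drop.
record = [0.95, 0.94, 0.94, 0.93, 0.93, 0.92, 0.88, 0.84]
print(diagnose(record))   # rapid_loss triggers even though the level is still acceptable
```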

    Equilibria-based Probabilistic Model Checking for Concurrent Stochastic Games

    Probabilistic model checking for stochastic games enables formal verification of systems that comprise competing or collaborating entities operating in a stochastic environment. Despite good progress in the area, existing approaches focus on zero-sum goals and cannot reason about scenarios where entities are endowed with different objectives. In this paper, we propose probabilistic model checking techniques for concurrent stochastic games based on Nash equilibria. We extend the temporal logic rPATL (probabilistic alternating-time temporal logic with rewards) to allow reasoning about players with distinct quantitative goals, which capture either the probability of an event occurring or a reward measure. We present algorithms to synthesise strategies that are subgame-perfect social welfare optimal Nash equilibria, i.e., where there is no incentive for any player to unilaterally change their strategy in any state of the game, whilst the combined probabilities or rewards are maximised. We implement our techniques in the PRISM-games tool and apply them to several case studies, including network protocols and robot navigation, showing the benefits compared to existing approaches.
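
    The notion of a social welfare optimal equilibrium can be illustrated on a one-shot normal-form game (a much simpler setting than the concurrent stochastic games handled by the paper; the game and payoffs below are invented for illustration):

```python
# Toy illustration: enumerate pure-strategy profiles of a 2-player game,
# keep the Nash equilibria (no player gains by deviating unilaterally) and
# select the one maximizing social welfare (the sum of payoffs).
import itertools

actions = ["stag", "hare"]
# payoffs[(a1, a2)] = (payoff to player 1, payoff to player 2)
payoffs = {("stag", "stag"): (4, 4),
           ("stag", "hare"): (0, 3),
           ("hare", "stag"): (3, 0),
           ("hare", "hare"): (2, 2)}

def is_nash(a1, a2):
    u1, u2 = payoffs[(a1, a2)]
    no_dev_1 = all(payoffs[(d, a2)][0] <= u1 for d in actions)
    no_dev_2 = all(payoffs[(a1, d)][1] <= u2 for d in actions)
    return no_dev_1 and no_dev_2

equilibria = [p for p in itertools.product(actions, actions) if is_nash(*p)]
best = max(equilibria, key=lambda p: sum(payoffs[p]))
print("Nash equilibria:", equilibria)
print("social welfare optimal equilibrium:", best, payoffs[best])
```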

    Sense of Control and School Readiness of Preschool Children (Poczucie kontroli a gotowość szkolna dzieci przedszkolnych)

    The aim of the study was to determine the relationship between locus of control and school readiness, understood as the child’s interest in learning. This relationship, as the existing literature indicates, is important given the specific psychophysical functioning of the young child. The sense of control is a psychological concept denoting a dimension of personality that determines how a person perceives the relationship between their behavior and its effects. The main research problem was formulated as follows: What is the relationship between locus of control and the school readiness of preschool children? To address it, 80 six-year-old children and their parents were examined. Locus of control was measured using the Sense of Control Scale for Preschool Children (SPK-DP), and school readiness using the Parental School Readiness Scale (RSGS). The results clearly show that children with an external locus of control are rated higher on all the skills that make up school readiness. This applies to cognitive activity, knowledge of social skills, and mathematical abilities.

    When images work faster than words: The integration of content-based image retrieval with the Northumbria Watermark Archive

    Information on the manufacture, history, provenance, identification, care and conservation of paper-based artwork/objects is disparate and not always readily available. The Northumbria Watermark Archive will incorporate such material into a database, which will be made freely available on the Internet, providing an invaluable resource for conservation, research and education. The efficiency of a database is highly dependent on its search mechanism. Text-based mechanisms are frequently ineffective when a range of descriptive terminologies might be used, for example when describing images or translating from foreign languages. In such cases a Content Based Image Retrieval (CBIR) system can be more effective. Watermarks provide paper with unique visual identification characteristics and have been used to provide a point of entry to the archive that is more efficient and effective than a text-based search mechanism. The research carried out has the potential to be applied to any numerically large collection of images with distinctive features of colour, shape or texture, such as coins, architectural features, picture frame profiles, hallmarks, Japanese artists' stamps, etc. Although the establishment of an electronic archive incorporating a CBIR system can undoubtedly improve access to large collections of images and related data, the development is rarely trouble free. This paper discusses some of the issues that must be considered: collaboration between disciplines; project management; copying and digitising objects; content based image retrieval; the Northumbria Watermark Archive; the use of standardised terminology within a database; as well as copyright issues.
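
    A minimal sketch of the content-based retrieval idea (illustrative only; the archive's actual feature descriptors and matching method are not described here): each digitised image is reduced to a grey-level histogram and a query is matched to its nearest neighbours by histogram intersection:

```python
# Illustrative content-based image retrieval: grey-level histogram features
# and histogram-intersection matching over a small synthetic collection.
import numpy as np

def histogram_feature(image, bins=32):
    # image: 2D array of grey levels in [0, 255]
    hist, _ = np.histogram(image, bins=bins, range=(0, 255))
    return hist / hist.sum()

def retrieve(query, collection, top_k=3):
    q = histogram_feature(query)
    scores = [(name, float(np.minimum(q, histogram_feature(img)).sum()))
              for name, img in collection.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]

# Synthetic stand-ins for digitised watermark images, each with a
# different intensity distribution.
rng = np.random.default_rng(0)
collection = {f"watermark_{i}": rng.normal(25 * i + 30, 20, (64, 64)).clip(0, 255)
              for i in range(9)}
query = collection["watermark_3"] + rng.normal(0, 5, (64, 64))  # noisy copy
print(retrieve(query, collection))
```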

    Design and analysis of DNA strand displacement devices using probabilistic model checking

    Designing correct, robust DNA devices is difficult because of the many possibilities for unwanted interference between molecules in the system. DNA strand displacement has been proposed as a design paradigm for DNA devices, and the DNA strand displacement (DSD) programming language has been developed as a means of formally programming and analysing these devices to check for unwanted interference. We demonstrate, for the first time, the use of probabilistic verification techniques to analyse the correctness, reliability and performance of DNA devices during the design phase. We use the probabilistic model checker PRISM, in combination with the DSD language, to design and debug DNA strand displacement components and to investigate their kinetics. We show how our techniques can be used to identify design flaws and to evaluate the merits of contrasting design decisions, even on devices comprising relatively few inputs. We then demonstrate the use of these components to construct a DNA strand displacement device for approximate majority voting. Finally, we discuss some of the challenges and possible directions for applying these methods to more complex designs.
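
    The flavour of the quantitative properties involved can be conveyed with a toy continuous-time Markov chain of a single displacement reaction (not one of the paper's DSD models; the states and rates below are invented). A model checker such as PRISM would compute the time-bounded probability exactly; here it is estimated by stochastic simulation:

```python
# Toy CTMC for one strand-displacement event: free -> toehold-bound, with
# unbinding back to free or completion of branch migration (displaced).
# Gillespie simulation estimates P(displaced within the time horizon).
# All rates are hypothetical.
import random

K_BIND, K_UNBIND, K_DISPLACE = 1.0e-3, 0.5, 0.05   # 1/s, illustrative
T_HORIZON, N_RUNS = 3600.0, 10000

TRANSITIONS = {"free": [("bound", K_BIND)],
               "bound": [("free", K_UNBIND), ("displaced", K_DISPLACE)],
               "displaced": []}

def run_once(rng):
    t, state = 0.0, "free"
    while True:
        rates = TRANSITIONS[state]
        total = sum(r for _, r in rates)
        if total == 0.0:
            return True                      # absorbed in the success state
        t += rng.expovariate(total)
        if t > T_HORIZON:
            return state == "displaced"      # horizon reached before next event
        u, acc = rng.random() * total, 0.0
        for nxt, r in rates:
            acc += r
            if u <= acc:
                state = nxt
                break

rng = random.Random(1)
p_hat = sum(run_once(rng) for _ in range(N_RUNS)) / N_RUNS
print(f"estimated P(displaced within {T_HORIZON:.0f} s) = {p_hat:.3f}")
```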

    A Study of the PDGF Signaling Pathway with PRISM

    In this paper, we apply the probabilistic model checker PRISM to the analysis of a biological system -- the Platelet-Derived Growth Factor (PDGF) signaling pathway, demonstrating in detail how this pathway can be analyzed in PRISM. We show that quantitative verification can yield a better understanding of the PDGF signaling pathway. Comment: In Proceedings CompMod 2011, arXiv:1109.104