
    Multiple verification in computational modeling of bone pathologies

    We introduce a model checking approach to diagnose the emergence of bone pathologies. The implementation of a new model of bone remodeling in PRISM has led to an interesting characterization of osteoporosis as a defective bone remodeling dynamics with respect to other bone pathologies. Our approach allows us to derive three types of model checking-based diagnostic estimators. The first diagnostic measure focuses on the level of bone mineral density, which is currently used in medical practice. In addition, we have introduced a novel diagnostic estimator which uses the full patient clinical record, here simulated using the modeling framework. This estimator detects rapid (months) negative changes in bone mineral density. Independently of the actual bone mineral density, when the decrease occurs rapidly it is important to alert the patient and monitor him/her more closely to detect the onset of other bone co-morbidities. A third estimator takes into account the variance of the bone density, which could support the investigation of metabolic syndromes, diabetes and cancer. Our implementation could make use of different logical combinations of these statistical estimators and could incorporate other biomarkers for other systemic co-morbidities (for example diabetes and thalassemia). We are delighted to report that the combination of stochastic modeling with formal methods motivates a new diagnostic framework for complex pathologies. In particular, our approach takes into consideration important properties of biosystems such as multiscale organization and self-adaptiveness. The multi-diagnosis could be further expanded, inching towards the complexity of human diseases. Finally, we briefly introduce self-adaptiveness in formal methods, which is a key property in the regulative mechanisms of biological systems and well known in other mathematical and engineering areas. Comment: In Proceedings CompMod 2011, arXiv:1109.104
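The three estimators described in this abstract (density level, rapid month-scale decrease, and variance) can be sketched as simple checks over a bone-mineral-density time series. The function name, thresholds, and units below are illustrative assumptions, not the paper's PRISM implementation:

```python
import statistics

def diagnose_bmd(series, low=0.8, drop=0.05, var_hi=0.002):
    """Hypothetical sketch of the three diagnostic estimators over a
    monthly BMD time series (g/cm^2). All thresholds are illustrative,
    not clinically calibrated values."""
    level_alarm = series[-1] < low                    # estimator 1: BMD level too low
    # estimator 2: rapid (month-scale) negative change, i.e. the
    # largest month-to-month drop exceeds the alert threshold
    max_drop = max(a - b for a, b in zip(series, series[1:]))
    trend_alarm = max_drop > drop
    # estimator 3: variance of the density over the record
    var_alarm = statistics.variance(series) > var_hi
    return level_alarm, trend_alarm, var_alarm
```

A real diagnostic rule would then take a logical combination of the three alarms, as the abstract suggests.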

    Robust Online Monitoring of Signal Temporal Logic

    Signal Temporal Logic (STL) is a formalism used to rigorously specify requirements of cyber-physical systems (CPS), i.e., systems mixing digital or discrete components in interaction with a continuous environment or analog components. STL is naturally equipped with a quantitative semantics which can be used for various purposes: from assessing the robustness of a specification to guiding searches over the input and parameter space with the goal of falsifying the given property over system behaviors. Algorithms have been proposed and implemented for offline computation of such quantitative semantics, but only a few methods exist for an online setting, where one would want to monitor the satisfaction of a formula during simulation. In this paper, we formalize a semantics for robust online monitoring of partial traces, i.e., traces for which there might not be enough data to decide the Boolean satisfaction (and to compute its quantitative counterpart). We propose an efficient algorithm to compute it and demonstrate its usage on two large scale real-world case studies coming from the automotive domain and from CPS education in a Massively Open Online Course (MOOC) setting. We show that savings in computationally expensive simulations far outweigh any overheads incurred by an online approach.
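The core idea of robust monitoring over a partial trace can be illustrated for a single formula: the robustness of any completion of the trace lies in an interval. The function below is a minimal sketch for "eventually (x > c)" under assumed signal bounds; it is not the paper's algorithm, and the bounds and formula are our assumptions:

```python
def online_rob_eventually(prefix, c, xmin=-1.0, xmax=1.0):
    """Robustness interval for 'eventually (x > c)' over a partial
    trace: the true robustness of any completion lies in [lo, hi],
    assuming the signal stays within [xmin, xmax] (illustrative)."""
    seen = max(v - c for v in prefix)  # best satisfaction margin observed so far
    lo = seen                          # worst case: no better value ever arrives
    hi = max(seen, xmax - c)           # best case: the future reaches xmax
    return lo, hi
```

Once lo > 0 the property is settled as satisfied, and once hi < 0 as violated, so simulation can stop early; this is the source of the simulation savings the abstract reports.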

    Mirror symmetry on K3 surfaces via Fourier-Mukai transform

    We use a relative Fourier-Mukai transform on elliptic K3 surfaces X to describe mirror symmetry. The action of this Fourier-Mukai transform on the cohomology ring of X reproduces relative T-duality and provides an infinitesimal isometry of the moduli space of algebraic structures on X which, in view of the triviality of the quantum cohomology of K3 surfaces, can be interpreted as mirror symmetry. Comment: 15 pages, AMS-LaTeX v1.2. Final version to appear in Commun. Math. Phys

    Efficient Large-scale Trace Checking Using MapReduce

    The problem of checking a logged event trace against a temporal logic specification arises in many practical cases. Unfortunately, known algorithms for an expressive logic like MTL (Metric Temporal Logic) do not scale with respect to two crucial dimensions: the length of the trace and the size of the time interval for which logged events must be buffered to check satisfaction of the specification. The former issue can be addressed by distributed and parallel trace checking algorithms that can take advantage of modern cloud computing and programming frameworks like MapReduce. Still, the latter issue remains open with current state-of-the-art approaches. In this paper we address this memory scalability issue by proposing a new semantics for MTL, called lazy semantics. This semantics can evaluate temporal formulae and Boolean combinations of temporal-only formulae at any arbitrary time instant. We prove that lazy semantics is more expressive than standard point-based semantics and that it can be used as a basis for a correct parametric decomposition of any MTL formula into an equivalent one with smaller, bounded time intervals. We use lazy semantics to extend our previous distributed trace checking algorithm for MTL. We evaluate the proposed algorithm in terms of memory scalability and time/memory tradeoffs. Comment: 13 pages, 8 figures
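For orientation, the standard point-based semantics the lazy semantics generalizes can be sketched for a bounded "eventually" operator over a timed trace. This is a textbook-style illustration, not the paper's lazy semantics or its MapReduce algorithm:

```python
def eventually(trace, a, b, t, pred):
    """Point-based semantics of F_[a,b] p at time t over a timed trace
    [(timestamp, state), ...]: true iff some logged event with timestamp
    in [t+a, t+b] satisfies pred. Note the monitor must buffer every
    event in the window [t+a, t+b] - the memory bottleneck for large
    intervals that motivates the lazy semantics."""
    return any(pred(s) for (ts, s) in trace if t + a <= ts <= t + b)
```

The interval [a, b] directly determines how much of the trace must be kept in memory, which is why decomposing formulae into ones with smaller, bounded intervals helps scalability.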

    A Quantitative Methodology to Measure Injector Fouling Through Image Analysis

    Abstract The use of vegetable oils in a compression ignited internal combustion engine presents some critical issues, such as the large amount of carbon deposits on the tip of the injectors, which significantly influence emissions and engine performance. A previous draft methodology was developed by the authors, based on image capture and post-processing. The carbon deposit was correlated with the number of pixels in the gray scale, so it was possible to determine a Fouling Index. First results showed promising perspectives and some limits: the aim of the present work is the optimization of the test bench and methodology. First, image acquisition is improved by increasing sampling frequency and image resolution, replacing the old camera with a digital microscope, and refining both injector and microscope positioning. The test bench prototype has been realized with the aid of 3D printing, which produced fundamental mechanical components. An alternative methodology is also proposed to evaluate carbon deposit volume through a Volumetric Index. The new methodology was validated using images sampled with the previous test bench. The performance of the Fouling Index and of the new Volumetric Index were compared, and fouling was examined in the real case of a diesel engine fed with diesel and sunflower oil. Results show greater reliability of the new Volumetric Index.
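The pixel-counting idea behind the Fouling Index can be sketched as a grayscale threshold over the injector-tip image: the fraction of dark pixels is taken as the index. The threshold and normalization below are illustrative assumptions, not the authors' calibrated values:

```python
import numpy as np

def fouling_index(gray, threshold=80):
    """Sketch of a pixel-counting fouling index: the fraction of pixels
    in a grayscale injector-tip image darker than a threshold (carbon
    deposits appear dark). The threshold value is illustrative."""
    gray = np.asarray(gray, dtype=np.uint8)
    return float((gray < threshold).mean())
```

A volumetric variant would additionally need depth information per pixel (e.g. from focus stacking on the digital microscope), which is presumably what the new Volumetric Index adds.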

    Pyrolysis of Olive Stone for Energy Purposes

    Abstract Pyrolysis of biomass is a promising technology for the production of distributed and renewable energy on the small and micro scale, since it produces a gas with relatively high calorific value, which can be burned in an internal combustion engine or in a microturbine; pyrolysis also generates by-products (char and tar) which can be used to provide energy to the process or for cogeneration purposes. This research is aimed at the exploitation of waste from agricultural production processes, in particular olive mill wastes, whose management has critical environmental and disposal costs; the yields of pyrogas, tar and char obtained from the pyrolysis of olive stone in a batch reactor were measured. The pyrogas produced is sampled through a line for the sampling of condensable substances in accordance with existing regulations, CEN/TS 15439, and, once purified of water vapor and tars, is analyzed with a micro-GC. The collected data are used to perform mass and energy balances and to determine the content of tars and the Lower Heating Value (LHV) of the gas produced.
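The LHV of a dry gas mixture is commonly computed as the volume-weighted sum of the component heating values. The sketch below uses textbook per-component values; the example composition is illustrative, not measured data from this paper:

```python
# Indicative lower heating values in MJ/Nm^3 (approximate textbook values)
LHV = {"H2": 10.8, "CO": 12.6, "CH4": 35.8, "CO2": 0.0, "N2": 0.0}

def gas_lhv(vol_frac):
    """LHV of a dry gas mixture as the volume-fraction-weighted sum of
    the component heating values (inert species contribute zero)."""
    return sum(LHV[sp] * x for sp, x in vol_frac.items())

# Illustrative composition, not the paper's measured pyrogas:
# gas_lhv({"H2": 0.20, "CO": 0.25, "CH4": 0.10, "CO2": 0.45})
```

The same per-component bookkeeping extends naturally to the mass and energy balances mentioned in the abstract.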

    MoonLight: a lightweight tool for monitoring spatio-temporal properties

    We present MoonLight, a tool for monitoring temporal and spatio-temporal properties of mobile, spatially distributed, and interacting entities such as biological and cyber-physical systems. In MoonLight the space is represented as a weighted graph describing the topological configuration in which the single entities are arranged. Both nodes and edges have attributes modeling physical quantities and logical states of the system evolving in time. MoonLight is implemented in Java and supports the monitoring of Spatio-Temporal Reach and Escape Logic (STREL). MoonLight can be used as a standalone command line tool or as a Java API, or via MATLAB™ and Python interfaces. We provide here a description of the tool, its interfaces, and its scripting language using a sensor network and a bike sharing example. We evaluate the tool's performance both by comparing it with other tools specialized in monitoring only temporal properties and by monitoring spatio-temporal requirements over dynamical and spatial graphs of different sizes.
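To make the weighted-graph view of space concrete, the sketch below evaluates a "somewhere within weighted distance d" spatial operator on such a graph, in the spirit of STREL's reach modality. This is a generic illustration, not MoonLight's Java or Python API:

```python
import heapq

def somewhere(adj, sat, d):
    """For each node, decide whether some node within weighted graph
    distance d (itself included) satisfies the property. A sketch of a
    STREL-style spatial operator on a static snapshot, using Dijkstra
    expansion truncated at distance d.
    adj: {node: [(neighbor, weight), ...]}, sat: {node: bool}."""
    out = {}
    for src in adj:
        dist, heap, ok = {src: 0.0}, [(0.0, src)], sat[src]
        while heap and not ok:
            dcur, n = heapq.heappop(heap)
            if dcur > dist.get(n, float("inf")):
                continue  # stale queue entry
            for m, w in adj[n]:
                nd = dcur + w
                if nd <= d and nd < dist.get(m, float("inf")):
                    dist[m] = nd
                    ok = ok or sat[m]
                    heapq.heappush(heap, (nd, m))
        out[src] = ok
    return out
```

A full monitor would re-evaluate this as node/edge attributes evolve in time and combine it with temporal operators, which is what the tool automates.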

    Čech and de Rham Cohomology of Integral Forms

    We present a study on integral forms and their Čech/de Rham cohomology. We analyze the problem from the general perspective of sheaf theory and we explore examples in superprojective manifolds. Integral forms are fundamental in the theory of integration on supermanifolds. One can define integral forms by introducing a new sheaf containing, among other objects, the new basic forms delta(dtheta), where the symbol delta has the usual formal properties of Dirac's delta distribution and acts on functions and forms as a Dirac measure. They satisfy, in addition, some new relations on the sheaf. It turns out that the enlarged sheaf of integral and "ordinary" superforms also contains forms of "negative degree" and, moreover, due to the additional relations introduced, its cohomology differs, in a non-trivial way, from the usual superform cohomology. Comment: 20 pages, LaTeX; we expanded the introduction, added a complete analysis of the cohomology, and derived a new duality between cohomology groups
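The "usual formal properties of Dirac's delta" carried over to the generator delta(dtheta) can be made explicit by analogy with the distributional identities x·delta(x) = 0 and x·delta'(x) = -delta(x). The relations below are the standard ones from the integral-form literature, stated here from that analogy rather than quoted from the paper:

```latex
% Dirac-delta-like relations satisfied by the new basic forms
\mathrm{d}\theta\,\delta(\mathrm{d}\theta) = 0, \qquad
\mathrm{d}\theta\,\delta'(\mathrm{d}\theta) = -\,\delta(\mathrm{d}\theta),
\qquad\text{and in general}\qquad
\mathrm{d}\theta\,\delta^{(p)}(\mathrm{d}\theta) = -\,p\,\delta^{(p-1)}(\mathrm{d}\theta).
```

The derivatives delta^(p)(dtheta) are exactly the forms of "negative degree" the abstract mentions, since each derivative lowers the degree by one.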

    Clinical usefulness of splanchnic oxygenation in predicting necrotizing enterocolitis in extremely preterm infants: a cohort study

    Background: Impaired intestinal microcirculation seems to play an important role in the pathogenesis of necrotizing enterocolitis (NEC). A previous study showed that a SrSO2 < 30% is associated with an increased risk of developing NEC. We aimed to determine the clinical usefulness of the cutoff < 30% for SrSO2 in predicting NEC in extremely preterm neonates. Methods: This is a combined cohort observational study. We added a second cohort from another university hospital to the previous cohort of extremely preterm infants. SrSO2 was measured for 1–2 h at days 2–6 after birth. To determine clinical usefulness we assessed sensitivity, specificity, and positive and negative predictive values for mean SrSO2 < 30%. The odds ratio for developing NEC was assessed with generalized linear model analysis, adjusting for center. Results: We included 86 extremely preterm infants, median gestational age 26.3 weeks (range 23.0–27.9). Seventeen infants developed NEC. A mean SrSO2 < 30% was found in 70.5% of infants who developed NEC compared to 33.3% of those who did not (p = 0.01). Positive and negative predictive values were 0.33 (CI 0.24–0.44) and 0.90 (CI 0.83–0.96), respectively. The odds of developing NEC were 4.5 (95% CI 1.4–14.3) times higher in infants with SrSO2 < 30% compared to those with SrSO2 ≥ 30%. Conclusions: A mean SrSO2 cutoff of ≥ 30% in extremely preterm infants between days 2 and 6 after birth may be useful in identifying infants who will not develop NEC.
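The predictive values follow directly from a 2x2 table of test result against outcome. The counts below are reconstructed approximately from the reported percentages (12/17 NEC infants and 23/69 non-NEC infants with SrSO2 < 30%), so the crude odds ratio (~4.8) differs slightly from the reported center-adjusted 4.5 and the PPV rounds to 0.34 rather than the reported 0.33:

```python
def two_by_two(tp, fp, fn, tn):
    """PPV, NPV and crude odds ratio from a 2x2 table:
    tp = low SrSO2 & NEC, fp = low SrSO2 & no NEC,
    fn = normal SrSO2 & NEC, tn = normal SrSO2 & no NEC."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    odds_ratio = (tp * tn) / (fp * fn)
    return ppv, npv, odds_ratio

# Approximate reconstruction of the study's table (our assumption):
# two_by_two(tp=12, fp=23, fn=5, tn=46)
```

The high NPV and modest PPV are why the conclusion emphasizes ruling out NEC rather than predicting it.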