Integrating Simulink Models into the Model Checker Cosmos
We present an implementation for executing Simulink models in the statistical model checker Cosmos. We take advantage of this implementation for hybrid modelling and simulation combining Petri nets and Simulink models.
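The combination described above, a discrete Petri net coupled with a continuous Simulink model, can be illustrated with a minimal co-simulation loop. This is a hedged sketch, not the Cosmos implementation: the fixed-step scheme, the stand-in continuous block, and all names are illustrative assumptions.

```python
# Hypothetical sketch of a hybrid co-simulation loop: a discrete Petri net
# whose transition guard reads a continuous signal produced by an external
# block (standing in for a Simulink model). Fixed-step scheme assumed.

def continuous_block(t):
    """Stand-in for a Simulink model output: a simple ramp signal."""
    return 0.5 * t

def step_petri_net(marking, level):
    """Fire the 'overflow' transition once the continuous level reaches 1.0."""
    if marking["idle"] > 0 and level >= 1.0:
        marking["idle"] -= 1
        marking["alarm"] += 1
    return marking

def cosimulate(horizon=5.0, dt=0.1):
    marking = {"idle": 1, "alarm": 0}
    t = 0.0
    while t < horizon:
        level = continuous_block(t)               # advance the continuous part
        marking = step_petri_net(marking, level)  # then the discrete part
        t += dt
    return marking

print(cosimulate())  # the alarm token appears once the ramp reaches 1.0
```

The point of the sketch is the alternation: at each step the continuous state is advanced first, and the discrete transitions then fire against the updated signal.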
Statistical Model Checking for Stochastic Hybrid Systems
This paper presents novel extensions and applications of the UPPAAL-SMC model checker. The extensions allow for statistical model checking of stochastic hybrid systems. We show how our race-based stochastic semantics extends to networks of hybrid systems, and indicate the integration technique applied for implementing this semantics in the UPPAAL-SMC simulation engine. We report on two applications of the resulting tool-set coming from systems biology and energy-aware buildings. Comment: In Proceedings HSB 2012, arXiv:1208.315
SpaceCube: A NASA Family of Reconfigurable Hybrid On-Board Science Data Processors
SpaceCube is a family of Field Programmable Gate Array (FPGA)-based on-board science-data processing systems developed at NASA Goddard Space Flight Center. This presentation provides an overview for the Future In-Space Operations Telecon Working Group.
Automated Validation of State-Based Client-Centric Isolation with TLA+
Clear consistency guarantees on data are paramount for the design and implementation of distributed systems. When implementing distributed applications, developers require approaches to verify the data consistency guarantees of an implementation choice. Crooks et al. define a state-based and client-centric model of database isolation. This paper formalizes this state-based model in TLA+, reproduces their examples, and shows how to model check runtime traces and algorithms with this formalization. The formalized model in TLA+ enables semi-automatic model checking of different implementation alternatives for transactional operations and allows checking conformance to isolation levels. We reproduce examples from the original paper and confirm the isolation guarantees of the combination of the well-known 2-phase locking and 2-phase commit algorithms. Using model checking, this formalization can also help find bugs in incorrect specifications. This improves the feasibility of automated checking of isolation guarantees in synthesized synchronization implementations, and it provides an environment for experimenting with new designs.
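The client-centric intuition behind the state-based model can be shown with a toy brute-force checker (in Python rather than TLA+): an execution is serializable iff some ordering of the committed transactions lets every read observe the state produced by the writes of the transactions before it. The transaction encoding below is an assumption made for illustration, not the paper's formalization.

```python
from itertools import permutations

def serializable(transactions, initial_state):
    """Toy state-based serializability check (after Crooks et al.'s idea).

    Each transaction is a pair (reads, writes), both {key: value} dicts.
    We search for a serial order in which every transaction's reads match
    the state built from the initial state plus all preceding writes.
    """
    for order in permutations(transactions):
        state = dict(initial_state)
        ok = True
        for reads, writes in order:
            if any(state.get(k) != v for k, v in reads.items()):
                ok = False
                break
            state.update(writes)
        if ok:
            return True
    return False

t1 = ({"x": 0}, {"x": 1})  # reads x=0, writes x=1
t2 = ({"x": 1}, {"y": 1})  # reads x=1, so it must follow t1
print(serializable([t2, t1], {"x": 0, "y": 0}))  # True: order t1, t2 works

# A lost-update anomaly: both transactions read the same initial value of x.
t3 = ({"x": 0}, {"x": 1})
t4 = ({"x": 0}, {"x": 1})
print(serializable([t3, t4], {"x": 0}))  # False: no serial order explains both reads
```

A real model checker such as TLC explores these candidate orders symbolically over a specification rather than enumerating permutations of a concrete trace, but the acceptance condition is the same.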
Contributions to Statistical Model Checking
Statistical Model Checking (SMC) is a powerful and widely used approach that consists in estimating the probability that a system satisfies a temporal property. This is done by monitoring a finite number of executions of the system and then extrapolating the result using statistics. The answer is correct up to some confidence level that can be parameterized by the user. SMC is known to mitigate the state-space explosion problem and to handle requirements that cannot be expressed in classical temporal logics. The approach has been implemented in several toolsets and successfully applied in a wide range of areas such as systems biology, robotics, and automotive systems. Unfortunately, SMC is not a panacea, and many important classes of systems and properties are still out of its scope. In addition, SMC suffers from an explosion in the number of simulations needed to converge when estimating small probabilities. Finally, the approach has not yet been lifted to a professional toolset directly usable by industry.
In this thesis we propose several contributions to increase the efficiency of SMC and to widen its applicability to a larger class of systems. We show how to extend the applicability of SMC to estimate the probability of rare events. The probability of such events is so small that classical estimators such as Monte Carlo would almost always estimate it to be null. We then show how to apply SMC to systems that combine both non-deterministic and stochastic aspects. Contrary to existing work, we do not use a learning-based approach for the non-deterministic aspects, but rather exploit a smart sampling strategy. We then show that SMC can be extended to a new class of problems: more precisely, we consider the problem of detecting probability changes at runtime, which we solve by exploiting an algorithm from the signal-processing area.
We also propose an extension of SMC to real-time stochastic systems. We provide a stochastic semantics for such systems and show how to exploit it in a simulation-based approach. Finally, we consider an extension of the approach to systems of systems. Our results have been implemented in Plasma Lab, a powerful yet flexible toolset. The thesis illustrates the efficiency of this tool on several case studies, ranging from classical verification to more exotic applications such as robotics.
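The core SMC loop described above, sampling a finite number of executions and extrapolating with user-chosen confidence, can be sketched with the classical Chernoff-Hoeffding sample bound: N >= ln(2/delta) / (2*eps^2) runs guarantee the estimate is within +/-eps of the true probability with confidence 1-delta. The monitored "system" below is a stand-in biased coin, an assumption made so the sketch is self-contained.

```python
import math
import random

def required_samples(eps, delta):
    """Chernoff-Hoeffding bound: runs needed for |p_hat - p| <= eps
    with probability at least 1 - delta."""
    return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

def run_satisfies_property(rng):
    """Stand-in executable model: the property holds with true probability 0.3."""
    return rng.random() < 0.3

def smc_estimate(eps=0.01, delta=0.05, seed=0):
    rng = random.Random(seed)
    n = required_samples(eps, delta)
    hits = sum(run_satisfies_property(rng) for _ in range(n))
    return hits / n

print(required_samples(0.01, 0.05))  # 18445 runs for eps=0.01, delta=0.05
print(smc_estimate())                # close to the true probability 0.3
```

The bound also makes the small-probability explosion mentioned in the abstract concrete: halving eps quadruples the number of required simulations, which is why rare-event techniques are needed beyond plain Monte Carlo.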
Computational model validation using a novel multiscale multidimensional spatio-temporal meta model checking approach
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. Computational models of complex biological systems can provide a better understanding of how living systems function, but need to be validated before they are employed for real-life (e.g. clinical) applications. One of the most frequently employed in silico approaches for validating such models is model checking. Traditional model checking approaches are limited to uniscale, non-spatial computational models because they do not explicitly distinguish between different scales and do not take properties of (emergent) spatial structures (e.g. the density of a multicellular population) into account. This thesis defines a novel multiscale multidimensional spatio-temporal meta model checking methodology which enables validating multiscale (spatial) computational models of biological systems relative to how both numeric (e.g. concentrations) and spatial system properties are expected to change over time and across multiple scales. The methodology has two important advantages. First, it supports computational models encoded using various high-level modelling formalisms because it is defined relative to time-series data and not the models used to produce them. Secondly, the methodology is generic because it can be automatically reconfigured according to case-study-specific types of spatial structures and properties using the meta model checking approach. In addition, the methodology could be employed for multiple domains of science, but we illustrate its applicability here only against biological case studies. To automate the computational model validation process, the approach was implemented in software tools, which are made freely available online. Their efficacy is illustrated against two uniscale quantitative computational models, encoding phase variation in bacterial colonies and the chemotactic aggregation of cells, and four multiscale models, encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle, and the acute inflammation of the gut and lung. This novel model checking approach will enable the efficient construction of reliable multiscale computational models of complex systems.
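Because the methodology above is defined over time-series data rather than the generating model, its essence can be sketched as temporal checks over a trace that mixes numeric and spatial observables. The trace format and the two property helpers below are illustrative assumptions, not the thesis's actual query language.

```python
def eventually(trace, key, predicate):
    """F p: some timepoint in the trace satisfies the predicate."""
    return any(predicate(state[key]) for state in trace)

def always(trace, key, predicate):
    """G p: every timepoint in the trace satisfies the predicate."""
    return all(predicate(state[key]) for state in trace)

# One record per timepoint, mixing a numeric observable (a concentration)
# and a spatial one (density of a multicellular cluster), as might be
# produced by a multiscale simulation.
trace = [
    {"concentration": 0.2, "density": 0.10},
    {"concentration": 0.5, "density": 0.35},
    {"concentration": 0.4, "density": 0.60},
]

# "Density eventually exceeds 0.5, while concentration never exceeds 1.0."
print(eventually(trace, "density", lambda d: d > 0.5))     # True
print(always(trace, "concentration", lambda c: c <= 1.0))  # True
```

The "meta" reconfiguration described in the abstract would amount to swapping in case-study-specific spatial measures (density, area, clusteredness, ...) without changing the temporal checking machinery.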
Closed-Loop Quantitative Verification of Rate-Adaptive Pacemakers
Rate-adaptive pacemakers are cardiac devices able to automatically adjust the pacing rate in patients with chronotropic incompetence, i.e. whose heart is unable to provide an adequate rate at increasing levels of physical, mental or emotional activity. These devices work by processing data from physiological sensors in order to detect the patient's activity and update the pacing rate accordingly. Rate-adaptation parameters depend on many patient-specific factors, and effective personalisation of such treatments can only be achieved through extensive exercise testing, which is normally intolerable for a cardiac patient. In this work, we introduce a data-driven and model-based approach for the automated verification of rate-adaptive pacemakers and formal analysis of personalised treatments. To this purpose, we develop a novel dual-sensor pacemaker model where the adaptive rate is computed by blending information from an accelerometer and a metabolic sensor based on the QT interval. Our approach enables personalisation through the estimation of heart model parameters from patient data (electrocardiogram), and closed-loop analysis through the online generation of synthetic, model-based QT intervals and acceleration signals. In addition to personalisation, we also support the derivation of models able to account for the varied characteristics of a virtual patient population, thus enabling safety verification of the device. To capture the probabilistic and non-linear dynamics of the heart, we define a probabilistic extension of timed I/O automata with data and employ statistical model checking for quantitative verification of rate modulation. We evaluate our rate-adaptive pacemaker design on three subjects and a pool of virtual patients, demonstrating the potential of our approach to provide rigorous, quantitative insights into the closed-loop behaviour of the device under different exercise levels and heart conditions.
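The dual-sensor blending idea can be sketched as follows. This is a hedged illustration, not the paper's actual model: the two linear sensor maps, the blending weight, and the rate limits are hypothetical parameters chosen only to show the structure (accelerometer-driven and QT-driven rate estimates combined, then clamped to the programmed range).

```python
def accel_rate(activity, lrl=60.0, gain=40.0):
    """Map a normalised activity level in [0, 1] to a rate above the
    lower rate limit (LRL). Linear map assumed for illustration."""
    return lrl + gain * activity

def qt_rate(qt_ms, lrl=60.0, slope=0.5, qt_rest=400.0):
    """The QT interval shortens under exertion, so a shorter interval
    maps to a higher metabolic rate estimate. Linear map assumed."""
    return lrl + slope * max(0.0, qt_rest - qt_ms)

def blended_rate(activity, qt_ms, weight=0.5, lrl=60.0, msr=120.0):
    """Blend the two sensor estimates, then clamp the target pacing rate
    to [LRL, MSR] (maximum sensor rate)."""
    rate = weight * accel_rate(activity, lrl) + (1 - weight) * qt_rate(qt_ms, lrl)
    return min(max(rate, lrl), msr)

print(blended_rate(0.0, 400.0))  # 60.0 bpm at rest
print(blended_rate(0.8, 360.0))  # 86.0 bpm under exercise
```

In a closed-loop analysis of the kind the paper describes, the `activity` and `qt_ms` inputs would come from synthetic, model-generated acceleration and QT signals rather than fixed values.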
User-Friendly Modelling with Hybrid Systems for Symbolic Simulation in CLP
The dissertation includes the languages MODEL-HS and VYSMO for the modular, declarative description of hybrid systems, which serve to prove time- and safety-critical properties through symbolic simulation in CLP. To arrive at language-theoretical conclusions such as decidability, hybrid systems are newly defined under formally provable acceptance conditions, which are supported by practical examples. Further results are a new classification of hybrid systems, a tool ROSSY, query descriptions and their transformation into temporal-logic expressions, query forms, and applications to study systems and parallel programs.