
    Combining SysML and AADL for the design, validation and implementation of critical systems

    The realization of critical systems goes through multiple phases of specification, design, integration, validation, and testing, proceeding from high-level sketches down to the final product. Model-Based Design has been acknowledged as a good vehicle to capture these steps, yet there is no universal solution to represent all activities. Two candidates are the OMG-based SysML for high-level modeling tasks and the SAE AADL for lower-level ones, down to the implementation. This paper shares an experience on the seamless use of SysML and AADL to model, validate/verify and implement a flight management system.
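
    As a purely illustrative sketch of that refinement flow (all names, values and types below are invented; no SysML or AADL tooling is involved), a high-level function captured at the SysML level can be traced to an architecture-level task that carries the timing and deployment detail AADL makes explicit:

    # Purely illustrative: trace a high-level function (SysML level) to an
    # architecture-level task (AADL level) that adds timing/deployment detail.
    # All names and values are invented for the example.
    from dataclasses import dataclass

    @dataclass
    class HighLevelFunction:      # roughly what a SysML block/activity captures
        name: str
        description: str

    @dataclass
    class ArchitectureTask:       # roughly what an AADL thread makes explicit
        name: str
        period_ms: int
        deadline_ms: int
        bound_processor: str
        realizes: HighLevelFunction

    guidance = HighLevelFunction("Compute_Guidance", "update the active flight plan leg")
    guidance_task = ArchitectureTask("guidance_thread", period_ms=100, deadline_ms=100,
                                     bound_processor="cpu_main", realizes=guidance)

    print(f"{guidance_task.name} realizes {guidance_task.realizes.name} "
          f"every {guidance_task.period_ms} ms on {guidance_task.bound_processor}")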

    Model-based dependability analysis : state-of-the-art, challenges and future outlook

    Over the past two decades, the study of model-based dependability analysis has gathered significant research interest. Different approaches have been developed to automate and address various limitations of classical dependability techniques and to contend with the increasing complexity and challenges of modern safety-critical systems. Two leading paradigms have emerged: one constructs predictive system failure models compositionally from component failure models, using the topology of the system; the other uses design models - typically state automata - to explore system behaviour through fault injection. This paper reviews a number of prominent techniques under these two paradigms and provides insight into their working mechanisms, applicability, strengths and challenges, as well as recent developments within these fields. We also discuss emerging trends towards integrated approaches and advanced analysis capabilities. Lastly, we outline the future outlook for model-based dependability analysis.
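
    As a rough illustration of the first (compositional) paradigm, the sketch below derives a system-level failure estimate from per-component failure models by following the connection topology. The components, probabilities and independence assumption are invented for the example and are not taken from the paper.

    # Illustrative sketch: compose a system-level failure estimate from
    # per-component failure models, following the connection topology.
    # Component names and probabilities are invented; failures are assumed independent.
    from typing import Dict, List

    # Per-component probability of producing an erroneous output on its own.
    component_failure: Dict[str, float] = {
        "sensor": 1e-4,
        "filter": 1e-5,
        "controller": 1e-5,
    }

    # Topology: each component lists the upstream components it reads from.
    depends_on: Dict[str, List[str]] = {
        "sensor": [],
        "filter": ["sensor"],
        "controller": ["filter"],
    }

    def output_failure_probability(component: str) -> float:
        """Output is erroneous if the component fails itself or any input is erroneous."""
        p_ok = 1.0 - component_failure[component]
        for upstream in depends_on[component]:
            p_ok *= 1.0 - output_failure_probability(upstream)
        return 1.0 - p_ok

    print(f"controller output failure probability: {output_failure_probability('controller'):.2e}")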

    Expressing and enforcing user-defined constraints of AADL models

    The Architecture Analysis and Design Language (AADL) allows one not only to model complete systems, but also to define specific extensions through property sets and libraries of models. Yet, it does not define an explicit mechanism to enforce semantics or consistency checks ensuring that property sets are used correctly. In this paper, we present REAL (Requirements and Enforcements Analysis Language) as an integrated solution to this issue. REAL is defined as an AADL annex language; it makes it possible to express constraints as theorems based on set theory, enforcing the implicit semantics of property sets or AADL models. We illustrate the use of the language on case studies developed with industrial partners.
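
    The constraints REAL captures are predicates quantified over sets of model elements. As a loose analogy only (this is not REAL syntax, and the rule and model below are made up for illustration), the sketch checks that every periodic thread in a toy in-memory model declares a Period property:

    # Loose analogy to a REAL-style theorem: quantify over a set of model elements
    # and check a predicate on each. The "model" is a toy list of dictionaries,
    # not an actual AADL instance tree; names and properties are invented.
    threads = [
        {"name": "gnc_thread", "Dispatch_Protocol": "Periodic", "Period": "100 ms"},
        {"name": "io_thread", "Dispatch_Protocol": "Periodic"},  # missing Period
    ]

    def periodic_threads_have_period(thread_set):
        """foreach t in thread_set: Periodic(t) => Period is defined for t"""
        return [t["name"] for t in thread_set
                if t.get("Dispatch_Protocol") == "Periodic" and "Period" not in t]

    violations = periodic_threads_have_period(threads)
    print("theorem holds" if not violations else "violated by: " + ", ".join(violations))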

    Validate implementation correctness using simulation: the TASTE approach

    High-integrity systems operate in hostile environments and must guarantee continuous operation, even when unexpected events happen. In addition, these systems have stringent requirements that must be validated and correctly translated from high-level specifications down to code. All these constraints make the overall development process more time-consuming, and this becomes especially complex as the number of system functions keeps increasing over the years. As a result, engineers must validate the system implementation and check that its execution conforms to the specifications. A traditional approach consists in manually instrumenting the implementation code to trace system activity while it operates. However, this is error-prone because the modifications are made by hand, and such modifications may affect the actual behavior of the system. In this paper, we present an approach to validate a system implementation by comparing its execution against simulation. For that purpose, we adapt TASTE, a set of tools that eases system development by automating each step as much as possible. In particular, TASTE automates system implementation from functional models (descriptions of system functions with their properties - period, deadline, priority, etc.) and deployment models (processors, buses, devices to be used). We tailored this tool-chain to create traces during system execution. The generated output shows the activation time of each task, the usage of communication ports (queue sizes, instants at which events are pushed or pulled, etc.) and other relevant execution metrics to be monitored. As a consequence, system engineers can check implementation correctness by comparing simulation and execution metrics.
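
    To make the comparison step concrete, the sketch below diffs task activation instants observed at execution time against those predicted by simulation, within a tolerance. The trace structure (task name mapped to activation times in milliseconds) and the tolerance are assumptions made for illustration, not TASTE's actual trace format.

    # Illustrative sketch: flag tasks whose observed activation instants deviate
    # from the simulated ones by more than a tolerance. Trace layout and values
    # are invented for the example.
    simulated = {"navigation": [0, 100, 200, 300], "guidance": [0, 50, 100, 150]}
    executed  = {"navigation": [0, 101, 200, 302], "guidance": [0, 50, 103, 150]}

    TOLERANCE_MS = 1.0

    def compare_traces(sim, exe, tol):
        mismatches = []
        for task, sim_times in sim.items():
            for i, (s, e) in enumerate(zip(sim_times, exe.get(task, []))):
                if abs(s - e) > tol:
                    mismatches.append((task, i, s, e))
        return mismatches

    for task, idx, s, e in compare_traces(simulated, executed, TOLERANCE_MS):
        print(f"{task}: activation {idx} expected ~{s} ms, observed {e} ms")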