14 research outputs found

    Functional Validation of AADL Models via Model Transformation to SystemC with ATL

    In this paper, we put an ATL model transformation into action in order to automatically generate SystemC models from AADL models. The AADL models represent electronic systems to be embedded into FPGAs. Our contribution allows an early analytical estimation of energy needs and a rapid SystemC simulation before implementation. The transformation has been tested by simulating an existing video image processing system embedded in a Xilinx Virtex5 FPGA.
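The mapping the abstract describes can be pictured as a small model-to-text step. The sketch below is purely illustrative and assumes a toy in-memory representation of one AADL process component; the component name, port names, and the emitted SystemC skeleton are invented for the example and are not the paper's actual ATL rules.

```python
# Toy sketch of an AADL-to-SystemC mapping step. The dictionary below is a
# hypothetical stand-in for a parsed AADL process; the emitted text is a
# SystemC module skeleton, not the transformation's real output.

AADL_PROCESSES = [
    {"name": "img_filter", "ports": [("pix_in", "in"), ("pix_out", "out")]},
]

def to_systemc(proc):
    """Emit a SystemC module skeleton for one AADL process component."""
    lines = [f"SC_MODULE({proc['name']}) {{"]
    for pname, direction in proc["ports"]:
        # Map AADL data ports to SystemC FIFO ports by direction.
        kind = "sc_fifo_in<int>" if direction == "in" else "sc_fifo_out<int>"
        lines.append(f"    {kind} {pname};")
    lines.append(f"    SC_CTOR({proc['name']});")
    lines.append("};")
    return "\n".join(lines)

print(to_systemc(AADL_PROCESSES[0]))
```

A real transformation would of course also generate behavior and binding code; the point here is only the component-to-module, port-to-channel correspondence.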

    Design Space Exploration: Bridging the Gap Between High-Level Models and Virtual Execution Platforms

    This paper presents a novel embedded systems modeling framework that fills the gap between high-level AADL models and low-level hardware virtual execution platforms. This approach allows refinement and improvement of system performance through exploration of architectures at different levels of abstraction. The aim of the proposed approach is to achieve virtual prototyping of the complete system so that validation can begin early in the design flow, thereby accelerating development while improving system performance.

    Exploring AADL verification tool through model transformation

    The Architecture Analysis and Design Language (AADL) is often used to model safety-critical real-time systems. Model transformation is widely used to extract a formal specification so that AADL models can be verified and analyzed by existing tools. The Timed Abstract State Machine (TASM) is a formalism able to specify not only behavior and communication but also the timing and resource aspects of a system. To verify functional and non-functional properties of AADL models, this paper presents a methodology for translating AADL to TASM. Our main contribution is to formally define the translation rules from an adequate subset of AADL (including thread components, port communication, the behavior annex and mode changes) into TASM. Based on these rules, a tool called AADL2TASM is implemented using the Atlas Transformation Language (ATL). Finally, a case study from an actual data processing unit of a satellite is provided to validate the transformation and illustrate the practicality of the approach.
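One translation rule of the kind the abstract mentions can be sketched schematically. The snippet below is a hypothetical illustration only: the rule syntax it prints is simplified pseudo-TASM rather than the exact output of AADL2TASM, and the thread name and attributes are invented.

```python
# Illustrative sketch: render a simplified TASM-style machine for one
# periodic AADL thread. Pseudo-TASM syntax, invented attribute names;
# not the actual AADL2TASM translation rules.

def thread_to_tasm(name, period_ms, wcet_ms):
    """Render a simplified TASM-style rule for a periodic AADL thread."""
    return (
        f"MACHINE {name}\n"
        f"RULE dispatch:\n"
        f"  if state = ready then\n"
        f"    t := {wcet_ms};        // consume the thread's WCET time units\n"
        f"    state := completed;    // re-dispatched every {period_ms} ms\n"
    )

print(thread_to_tasm("data_proc", period_ms=50, wcet_ms=10))
```

The essential idea carried over from the paper is that an AADL thread's dispatch protocol and compute execution time become a timed rule in the TASM machine.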

    System-level Co-simulation of Integrated Avionics Using Polychrony

    The design of embedded systems from multiple views and heterogeneous models is ubiquitous in avionics as, in particular, different high-level modeling standards are adopted for specifying the structure, hardware and software components of a system. The system-level simulation of such composite models is a necessary but difficult task, allowing global design choices to be validated as early as possible in the system design flow. This paper presents an approach to the issue of composing, integrating and simulating heterogeneous models in a system co-design flow. First, the functional behavior of an application is modeled with synchronous data-flow and statechart diagrams using Simulink/Gene-Auto. The system architecture is modeled in the AADL standard. These high-level synchronous and asynchronous models are then translated into a common model, based on a polychronous model of computation, allowing for a Globally Asynchronous Locally Synchronous (GALS) interpretation of the composed models. This translation is implemented as an automatic model transformation within Polychrony, a toolkit for embedded systems design. Simulation, including profiling and value change dump demonstration, has been carried out on the common model within Polychrony. An avionic case study, consisting of a simplified doors and slides control system, is presented to illustrate our approach.

    Development of an Embedded Systems Design Methodology Based on the Transformation of High-Level Functional Models into Virtual Prototypes

    The rapid pace of technological progress, combined with demanding industry requirements, leads to an increase in the complexity of embedded systems. This complexity imposes several constraints and criteria that must be met to produce competitive and robust systems. Design methodologies have therefore evolved considerably in recent years to frame the development of these complex systems and ensure their conformity with the initial requirements. New model-based approaches have thus appeared to overcome these difficulties and master this level of complexity. However, these model-based approaches often address the functional and software aspects of a system without taking into account execution on real hardware platforms. The work developed in this research project aims to implement a new design methodology for embedded systems. This methodology establishes a link between the functional level of the models and the hardware execution platform of the application in question. The approach is based on the AADL modeling language to describe the software behavior of the embedded system at a high level of abstraction. An automatic transformation chain then converts the AADL model into a SystemC model. Finally, the Space Studio environment is used to build a virtual prototype of the platform. This environment allows the functional aspects of the system to be executed on hardware resources. System performance can thus be validated and refined through architectural exploration of the hardware platform. An imaging application, an MJPEG (Motion JPEG) video decoder, was used as a case study to experiment with this flow.
During the experiment, an AADL model of the MJPEG application was developed to describe its functional behavior. The transformation chain then automatically translated the AADL model into a SystemC model. The SystemC model served as the base element representing the software aspect in the Space Studio virtual prototyping and co-design environment. The Space Studio tool proved useful by allowing the rapid creation of a hardware execution platform prototype, the partitioning of software functions onto hardware resources, and the validation and refinement of system performance. The experimental results were conclusive. Execution speed increased visibly, and the time taken to complete the system simulation was reduced by 81.86%. Processor occupancy decreased considerably, which may in turn reduce the power consumed by the hardware resources. Data throughput per unit of time improved 12-fold after refining the assignment of software functions to the hardware platform. As part of this project, a scientific paper was published (Benyoussef et al., February 2014) at the ERTS 2014 (Embedded Real Time Software and Systems) conference. That work presents the context and problems of model-based methodologies, the new modeling approach developed, and a proof of concept with an MJPEG decoding application.

    Traceability of Requirements and Software Architecture for Change Management

    Present-day software systems are becoming more and more complex. The requirements of software systems change continuously, and new requirements emerge frequently. New and/or modified requirements are integrated with the existing ones, and adaptations are made to the architecture and source code of the system. The process of integrating new or modified requirements and adapting the software system is called change management. The size and complexity of software systems make change management costly and time consuming. To reduce the cost of changes, it is important to apply change management as early as possible in the software development cycle. Requirements traceability is considered crucial in change management for establishing and maintaining consistency between software development artifacts. It is the ability to link requirements back to stakeholders’ rationales and forward to corresponding design artifacts, code, and test cases. When changes to the requirements of the software system are proposed, the impact of these changes on other requirements, design elements and source code should be traced in order to determine which parts of the software system must be changed. Determining the impact of changes on development artifacts is called change impact analysis. It is applicable to many development artifacts, such as requirements documents, detailed design, source code and test cases. Our focus is change impact analysis in requirements and software architecture, where the need for it is observed on both sides. When a change is introduced to a requirement, the requirements engineer needs to find out whether any other requirement related to the changed requirement is impacted. After determining the impacted requirements, the software architect needs to identify the impacted architectural elements by tracing the changed requirements to the software architecture.
It is hard, expensive and error prone to manually trace impacted requirements and architectural elements from the changed requirements. There are tools and approaches that automate change impact analysis, such as IBM Rational RequisitePro and DOORS. In most of these tools, traces are just simple relations and their semantics is not considered. Due to this lack of semantics, all requirements and architectural elements directly or indirectly traced from the changed requirement are candidates for impact. The requirements engineer has to inspect all these candidate requirements and architectural elements to identify changes, if there are any. In this thesis we address the following problems which arise in performing change impact analysis for requirements and software architecture. Explosion of impacts in requirements after a change in requirements: in practice, requirements documents are often textual artifacts with an implicit structure, and most of the relations among requirements are not given explicitly. Most tools and approaches lack a precise definition of the relations among requirements. Due to this lack of semantics of requirements relations, change impact analysis may produce a high number of false positive and false negative impacted requirements. A requirements engineer may have to analyze all requirements in the requirements document for a single change, which may result in neglecting the actual impact of the change. Manual, expensive and error prone trace establishment: considerable research has been devoted to relating requirements and design artifacts with source code, but less attention has been paid to relating Requirements (R) with Architecture (A) using well-defined trace semantics. Designing an architecture based on requirements is a problem-solving process that relies on human experience and creativity, and is mainly manual. The software architect may need to manually assign traces between R&A.
Manual trace assignment is time-consuming, expensive and error prone, and the assigned traces might be incomplete or invalid. Explosion of impacts in software architecture after a change in requirements: due to the lack of semantics of traces between R&A, change impact analysis may produce a high number of false positive and false negative impacted architectural elements. A software architect may have to analyze all architectural elements in the architecture for a single requirements change. In this thesis we propose an approach that reduces the explosion of impacts in R&A. The approach employs the semantic information of traces and is supported by tools. We consider that every relation between software development artifacts, or between elements in these artifacts, can play the role of a trace for a certain traceability purpose such as change impact analysis. We choose Model Driven Engineering (MDE) as the solution platform for our approach. MDE provides a uniform treatment of software artifacts (e.g. requirements documents, software design and test documents) as models. It also enables the use of different formalisms to reason about development artifacts described as models. To give an explicit structure to requirements documents and to treat requirements, architecture and traces in a uniform way, we use metamodels and models with formally defined semantics. The thesis provides the following contributions. A modeling language for the definition of requirements models with formal semantics: the language is defined according to MDE principles by defining a metamodel, and is based on a survey of the most commonly found requirements types and relation types. With this language, the requirements engineer can explicitly specify requirements and the relations among them. The semantics of these entities is given in First Order Logic (FOL) and enables two activities. First, new relations among requirements can be inferred from the initial set of relations.
Second, requirements models can be automatically checked for consistency of the relations. The Tool for Requirements Inferencing and Consistency Checking (TRIC) was developed to support both activities. The defined semantics is used in a technique for change impact analysis in requirements models. A change impact analysis technique for requirements using the semantics of requirements relations and requirements change types: the technique aims at solving the problem of the explosion of impacts in requirements when the semantics of requirements relations is missing. It uses the formal semantics of requirements relations and requirements change types. A classification of requirements changes based on the structure of a textual requirement is given and formalized, and the semantics of requirements change types is based on FOL. We support three activities for impact analysis. First, the requirements engineer proposes changes according to the change classification before implementing the actual changes. Second, the requirements engineer identifies the propagation of the changes to related requirements; the change alternatives in the propagation are determined based on the semantics of change types and requirements relations. Third, possible contradicting changes are identified. We extend TRIC to support these activities. The tool automatically determines the change propagation paths, checks the consistency of the changes, and suggests alternatives for implementing the change. A technique that provides trace establishment between R&A by using architecture verification and the semantics of traces: it is hard, expensive and error prone to manually establish traces between R&A. We present an approach that establishes traces by using architecture verification together with the semantics of requirements relations and traces. We use a trace metamodel with commonly used trace types. The semantics of traces is formalized in FOL.
Software architectures are expressed in the Architecture Analysis and Design Language (AADL). AADL is provided with a formal semantics expressed in Maude, and the Maude tool set allows simulation and verification of architectures. The first way to establish traces is to use architecture verification techniques. A given requirement is reformulated as a property in terms of the architecture. The architecture is executed and a state space is produced; this execution simulates the behavior of the system at the architectural level. The property derived from the requirement is checked by the Maude model checker, and traces are generated between the requirement and the architectural components used in the verification of the property. The second way to establish traces is to use the requirements relations together with the semantics of traces. Requirements relations are reflected in the connections among the traced architectural elements based on the semantics of traces; therefore, new traces can be inferred from existing traces by using requirements relations. We use the semantics of requirements relations and traces both to generate/validate traces and to generate/validate requirements relations. Our approach is supported by a tool that provides: (1) generation/validation of traces by using requirements relations and/or verification of the architecture, and (2) generation/validation of requirements relations by using traces. A change impact analysis technique for software architecture using architecture verification and the semantics of traces between R&A: the software architect needs to identify the impacted architectural elements after a requirements change. We present a change impact analysis technique for software architecture using architecture verification and the semantics of traces. The technique is semi-automatic and requires the participation of the software architect. It has two parts.
The first part identifies the architectural elements that implement the system properties to which the proposed requirements changes are introduced. Using the formal semantics of requirements relations and traces, we identify which parts of the software architecture are impacted by a proposed change in requirements. We have extended TRIC to determine candidate impacted architectural elements. The second part proposes possible changes to the software architecture when it does not satisfy the new and/or changed requirements. This part is based on architecture verification: if the requirements are not satisfied, the verification produces a counterexample, which is used together with a classification of architectural changes in order to propose changes to the software architecture. These changes produce a new version of the architecture that possibly satisfies the new or changed requirements.
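The impact propagation idea above can be reduced to a graph reachability problem. The following sketch is a toy illustration under stated assumptions: the requirement IDs are invented, only a single "requires" relation type is modeled, and the direction of propagation is simplified (the thesis distinguishes several relation types with FOL semantics).

```python
# Toy change impact propagation: starting from a changed requirement,
# collect every requirement reachable through "requires" edges. The IDs
# and the single relation type are hypothetical simplifications.

requires = {("R1", "R2"), ("R2", "R3"), ("R4", "R2")}

def impacted(changed, relation):
    """All requirements reachable from `changed` via `relation` edges."""
    seen, frontier = set(), {changed}
    while frontier:
        r = frontier.pop()
        for src, dst in relation:
            if src == r and dst not in seen:
                seen.add(dst)
                frontier.add(dst)
    return seen

print(sorted(impacted("R1", requires)))  # candidate impacted set for R1
```

Without relation semantics, every transitively connected requirement ends up in this candidate set; richer trace semantics is exactly what lets a tool like TRIC prune it.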

    A Catalog of Reusable Design Decisions for Developing UML/MOF-based Domain-specific Modeling Languages

    In model-driven development (MDD), domain-specific modeling languages (DSMLs) act as a communication vehicle for aligning the requirements of domain experts with the needs of software engineers. With the rise of the UML as a de facto standard, UML/MOF-based DSMLs are now widely used for MDD. This paper documents design decisions collected from 90 UML/MOF-based DSML projects. These recurring design decisions were gathered, on the one hand, by performing a systematic literature review (SLR) on the development of UML/MOF-based DSMLs, through which we retrieved 80 related DSML projects for review; on the other hand, we collected decisions from ten DSML projects we developed ourselves. The design decisions are presented in the form of reusable decision records, with each decision record corresponding to a decision point in DSML development processes. Furthermore, we report on frequently observed (combinations of) decision options, as well as on associations between options which may occur within a single decision point or between two decision points. This collection of decision-record documents targets decision makers in DSML development (e.g., DSML engineers, software architects, domain experts). Series: Technical Reports / Institute for Information Systems and New Media

    Survey of Template-Based Code Generation

    The automatic synthesis of textual artifacts from models is a critical step in model-driven engineering (MDE). It is a very useful model transformation for generating application code, serializing models into persistent storage, and generating documentation or reports. Among the various model-to-text transformation paradigms, template-based code generation (TBCG) is the most popular in MDE. TBCG is a synthesis technique that produces code from high-level specifications called templates. It is a popular technique in MDE given that both emphasize abstraction and automation. Given the diversity of tools and approaches, it is necessary to classify and compare existing TBCG techniques in order to provide appropriate support to developers. The goal of this thesis is to better understand the characteristics of TBCG techniques, identify research trends, and assess the importance of the role of MDE in this code synthesis approach. We also evaluate the expressiveness, performance and scalability of the associated tools on a range of models that implement critical patterns. To this end, we conduct a systematic mapping study of the literature that paints an interesting overview of TBCG, and a comparative study of TBCG tools to better guide developers in their choices. This study shows that model-based tools offer more expressiveness, whereas code-based tools perform much faster. Xtend2 offers the best compromise between expressiveness and performance.
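The core mechanism surveyed here, instantiating a template from model data, can be shown in a few lines. This is a minimal sketch using Python's standard-library `string.Template`; the getter template and field names are invented, and real TBCG tools (Acceleo, Xtend2, and the like) add model traversal, escaping, and incremental generation on top.

```python
# Minimal template-based code generation: a template with placeholders
# is instantiated from model data (here, a field name and type).

from string import Template

# Hypothetical template producing a Java-style getter.
GETTER = Template(
    "public ${type} get${Name}() {\n"
    "    return this.${name};\n"
    "}"
)

def generate_getter(field_name, field_type):
    """Fill the getter template for one model attribute."""
    return GETTER.substitute(
        type=field_type, name=field_name, Name=field_name.capitalize()
    )

print(generate_getter("width", "int"))
# -> public int getWidth() {
#        return this.width;
#    }
```

The template fixes the static text while the placeholders carry the model-dependent parts, which is exactly the separation the thesis classifies tools by.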

    Property driven verification framework: application to real time property for UML MARTE software design

    Automatic formal verification techniques such as model checking face the combinatorial explosion issue, which limits their application in industrial projects. The issue is caused by the explosion of the number of states during a system's execution, which may easily exceed the available computing and storage resources; the state count for realistic industrial models regularly does so. This thesis defends the idea that this explosion can be reduced by specializing the tools for families of properties. It proposes, and experimentally validates, a set of methods for developing such scalable verification tools following a property-driven approach applied to the real-time context: the aim is to build efficient analysis tools for real-time properties that remain usable on industrial models of realistic size. We rely on UML extended with the MARTE profile as the end-user modeling language, and on Time Petri Nets (TPN) as the verification language. The main contribution of this thesis is the design and implementation of a property-driven verification prototype toolset dedicated to the verification of real-time properties for UML-MARTE software designs. We validate this toolset on a realistic industrial avionic use case: the study of latency and data freshness in an alarm management system built on Integrated Modular Avionics technologies. The prototype toolset integrates five contributions: the definition of real-time-property-specific execution semantics for UML-MARTE architecture and behavior models; the specification of real-time requirements relying on a set of dedicated atomic real-time property verification patterns; an iterative, observer-based model checking approach for TPN; real-time-property-specific state space reduction techniques for TPN; and an approach for analyzing the errors detected by model checking, drawing on ideas inspired by data mining.
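The observer-based checking mentioned above boils down to exploring a state space while a dedicated observer watches for property violations. The sketch below is a deliberately tiny stand-in, not TPN semantics: the two-field state, the transition rules, and the deadline bound are all invented to show the shape of the search.

```python
# Schematic observer-based reachability check on a toy transition system.
# A state is (clock, alarm_raised); the observer flags states where the
# alarm deadline is missed. All numbers and rules are illustrative only.

from collections import deque

def successors(state):
    clock, alarm = state
    if clock < 5:
        yield (clock + 1, alarm)      # time elapses
    if not alarm and clock >= 2:
        yield (clock, True)           # alarm may be raised from t=2 on

def violates(state, deadline=4):
    clock, alarm = state
    return clock > deadline and not alarm  # observer: alarm raised too late

def check(initial=(0, False)):
    """Breadth-first search; return a violating state or None."""
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if violates(s):
            return s                  # counterexample state found
        for n in successors(s):
            if n not in seen:
                seen.add(n)
                queue.append(n)
    return None

print(check())
```

Because raising the alarm is optional in this toy model, a run where time simply elapses misses the deadline, and the observer exposes that state; the thesis's property-specific reductions aim to keep exactly such searches tractable on industrial models.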