34 research outputs found

    Dynamic Validation of OCL Constraints with mOdCL

    This paper presents mOdCL, a Maude-based evaluator of OCL expressions and validator of OCL constraints. Building on this OCL expression evaluator and on Maude execution strategies, mOdCL not only validates invariant constraints on concrete system states, but also dynamically validates invariants and pre- and postconditions on the successive states obtained during system execution.
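    A minimal sketch of the kind of check this enables, written in plain Maude rather than mOdCL's actual notation, with all module and operator names invented for illustration: an OCL-like invariant is encoded as a Boolean predicate over system states and can then be evaluated on every state reached during execution.

        *** Hypothetical sketch, not mOdCL syntax: an OCL-like invariant as a
        *** Boolean predicate over the states of a toy counter system.
        mod COUNTER-CHECK is
          protecting INT .
          sort State .
          op st : Int -> State [ctor] .       *** a state with one attribute
          op inv : State -> Bool .            *** invariant: the value stays >= 0
          var N : Int .
          eq inv(st(N)) = N >= 0 .
          rl [dec] : st(N) => st(N - 1) .     *** a system transition
        endm

    Issued after loading the module, the command

        search [1] st(3) =>* S:State such that not inv(S:State) .

    reports the first reachable state that violates the invariant; pre- and postconditions can be checked analogously by evaluating predicates on the states before and after a transition.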

    Tracing Properties of UML and OCL Models with Maude

    The starting point of this paper is a system described in the form of a UML class diagram, where system states are characterized by OCL invariants and system transitions are defined by OCL pre- and postconditions. The aim of our approach is to assist the developer in learning about the consequences of the described system states and transitions and about the formal implications of the properties that are explicitly given. We propose to draw conclusions about the stated constraints by translating the UML and OCL model into the algebraic specification language and system Maude, which is based on rewriting logic. In this paper we concentrate on employing Maude's capabilities for state search. Maude's state search offers the possibility to describe a start configuration of the system and then explore all configurations reachable by rewriting. The search can be adjusted by formulating requirements for the allowed states and the allowed transitions. Comment: In Proceedings AMMSE 2011, arXiv:1106.596
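    To illustrate the kind of state search meant here (a hypothetical toy model, not the authors' actual UML-to-Maude encoding; names are invented), a start configuration of objects can be explored exhaustively, with the allowed states constrained by a such-that condition:

        *** Hypothetical toy model: accounts as a multiset configuration whose
        *** transitions are money transfers guarded by a sufficient balance.
        mod ACCOUNTS is
          protecting INT .
          sorts Account Conf .
          subsort Account < Conf .
          op acc : Int Int -> Account [ctor] .   *** acc(id, balance)
          op empty : -> Conf [ctor] .
          op __ : Conf Conf -> Conf [ctor assoc comm id: empty] .
          vars I J B C : Int .
          crl [transfer] : acc(I, B) acc(J, C) => acc(I, B - 100) acc(J, C + 100)
            if B >= 100 .
        endm

    Starting from the configuration acc(1, 250) acc(2, 0), the command

        search acc(1, 250) acc(2, 0) =>* CF:Conf acc(I:Int, B:Int) such that B:Int < 0 .

    explores all reachable configurations and answers "No solution", i.e. no reachable state contains an overdrawn account; removing the guard makes the same search exhibit a counterexample trace.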

    Generating Effective Test Suites for Model Transformations Using Classifying Terms

    Generating sample models for testing a model transformation is no easy task. This paper explores the use of classifying terms and stratified sampling for developing richer test cases for model transformations. Classifying terms are used to define the equivalence classes that characterize the relevant subgroups for the test cases. From each equivalence class of object models, several representative models are chosen depending on the required sample size. We compare our results with test suites developed using random sampling, and conclude that by using an ordered and stratified approach the coverage and effectiveness of the test suite can be significantly improved.
    Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
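    A rough sketch of the idea: classifying terms are normally written as OCL expressions over object models, but the same notion can be mimicked in Maude purely for illustration (all names below are invented). Each classifying term acts as a Boolean observer over object configurations, and configurations agreeing on all observers fall into the same equivalence class, from which representatives are then sampled.

        *** Hypothetical illustration: two classifying terms as Boolean observers
        *** over object configurations (here, multisets of persons with an age).
        fmod CLASSIFY is
          protecting NAT .
          sorts Person Conf .
          subsort Person < Conf .
          op person : Nat -> Person [ctor] .               *** person(age)
          op none : -> Conf [ctor] .
          op __ : Conf Conf -> Conf [ctor assoc comm id: none] .
          op size : Conf -> Nat .
          op hasAdult : Conf -> Bool .                      *** classifying term 1
          op small : Conf -> Bool .                         *** classifying term 2
          var A : Nat . var P : Person . var C : Conf .
          eq size(none) = 0 .
          eq size(P C) = 1 + size(C) .
          eq hasAdult(none) = false .
          eq hasAdult(person(A) C) = A >= 18 or hasAdult(C) .
          eq small(C) = size(C) <= 2 .
        endfm

    For instance, person(20) person(30) person(5) and person(40) person(2) person(7) both evaluate to (hasAdult, small) = (true, false) and therefore land in the same equivalence class; a stratified test suite then draws a fixed number of models per observed value combination instead of sampling the whole model space at random.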

    Towards the Formal Verification of Model Transformations: An Application to Kermeta

    Model-Driven Engineering (MDE) is becoming a popular engineering methodology for developing large-scale software applications, using models and transformations as primary principles. MDE is now being successfully applied to domain-specific languages (DSLs), which target a narrow subject domain such as process management, telecommunications, product lines, or smartphone applications, among others, providing experts with high-level and intuitive notations very close to their problem domain. More recently, MDE has been applied to safety-critical applications, where failure may have dramatic consequences in terms of economic, ecological, or human losses. These recent application domains call for more robust and more practical approaches for ensuring the correctness of models and model transformations.
    Testing is the most common technique used in MDE for ensuring the correctness of model transformations, a recurrent, yet unsolved problem in MDE. But testing suffers from the so-called coverage problem, which is unacceptable when safety is at stake. Rather, exhaustive coverage is required in this application domain, which means that transformation designers need to use formal analysis methods and tools to meet this requirement. Unfortunately, two factors seem to limit the use of such methods in an engineer's daily life. First, a methodological factor: MDE engineers rarely possess the knowledge needed to deploy formal analysis techniques in their daily developments. Second, a practical factor: DSLs do not necessarily have an explicit formal semantics, which is a necessary enabler for exhaustive analysis.
    In this thesis, we contribute to the problem of formal analysis of model transformations from both perspectives. On the conceptual side, we propose a methodological framework for engineering verified model transformations based on current best practices. For that purpose, we identify three important dimensions: (i) the transformation being built; (ii) the properties of interest ensuring the transformation's correctness; and (iii) the verification technique that allows proving these properties with minimal effort. Finding which techniques are better suited for which kinds of properties is the concern of the Computer-Aided Verification community. Consequently, in this thesis we focus on studying the relationship between transformations and properties. Our methodological framework introduces two novel notions. A transformation intent gathers all transformations sharing the same purpose, abstracting from the way the transformation is expressed. A property class captures under the same denomination all properties sharing the same form, abstracting away from their underlying property languages. The framework consists of mapping each intent to its characteristic set of property classes, meaning that to prove the correctness of a particular transformation obeying this intent, one has to prove properties of these specific classes. We illustrate the use and utility of our framework through the detailed description of five common intents in MDE, and their application to a case study drawn from the automotive software domain, consisting of a chain of more than thirty transformations.
    On the more practical side, we study the problem of verifying DSLs whose behaviour is expressed with Kermeta. Kermeta is an object-oriented transformation framework aligned with the Object Management Group's standard MOF (Meta-Object Facility) specification. It can be used for defining metamodels and models, as well as their behaviour. Kermeta lacks a formal semantics: we first specify such a semantics, and then choose an appropriate verification domain for handling the analysis one is interested in. Since the semantics is defined at the level of Kermeta's transformation language itself, our work presents two interesting features: first, any DSL whose behaviour is defined using Kermeta (more precisely, any transformation defined with Kermeta) enjoys a de facto formal grounding for free; second, it is easier to define appropriate abstractions targeting specific analyses for this full-fledged semantics than to define a specific semantics for each possible kind of analysis. To illustrate this point, we have selected Maude, a powerful rewriting system based on algebraic specifications and equipped with model-checking and theorem-proving capabilities. Maude was chosen because its underlying formalism is close to the mathematical tools we use for specifying the formal semantics, reducing the implementation gap and consequently limiting possible implementation mistakes. We validate our approach by illustrating behavioural properties of small, yet representative DSLs from the literature.
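    To give a flavour of the target formalism (a generic, hypothetical example with invented names, not the thesis's actual encoding of Kermeta), one execution step of a small behavioural DSL can be captured as a single Maude rewrite rule over configurations:

        *** Hypothetical sketch: one step of a tiny state-machine DSL, encoded as a
        *** rewrite rule over configurations cfg(current, pending events, transitions).
        mod FSM-STEP is
          protecting QID .
          sorts Trans TransSet EvList Conf .
          subsort Trans < TransSet .
          op t : Qid Qid Qid -> Trans [ctor] .             *** t(source, event, target)
          op mt : -> TransSet [ctor] .
          op __ : TransSet TransSet -> TransSet [ctor assoc comm id: mt] .
          op nil : -> EvList [ctor] .
          op _;_ : Qid EvList -> EvList [ctor] .
          op cfg : Qid EvList TransSet -> Conf [ctor] .
          vars S E T : Qid . var ES : EvList . var TS : TransSet .
          rl [step] : cfg(S, (E ; ES), t(S, E, T) TS) => cfg(T, ES, t(S, E, T) TS) .
        endm

    For example, rew cfg('idle, 'start ; ('stop ; nil), t('idle, 'start, 'busy) t('busy, 'stop, 'idle)) . rewrites in two steps back to the 'idle control state with an empty event list. Once a DSL's semantics is expressed in this style, Maude's search and model checking apply to it directly, which is what makes Maude attractive as a verification back-end here.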

    On Formalizing UML and OCL Features and Their Employment to Runtime Verification

    Model-driven development (MDD) has been identified as a promising approach for developing software. By using abstract models of a system and by generating parts of the system out of these models, one tries to improve the efficiency of the overall development process and the quality of the resulting software. In the context of MDD, the Unified Modeling Language (UML) and its related textual Object Constraint Language (OCL) have gained high recognition. To be able to generate systems of high quality and to allow for interoperability between modeling tools, a well-defined semantics for these languages is required. This thesis summarizes published work in this context that employs an endogenous metamodeling approach to define the semantics of newer elements of the UML. While the covered elements are exhaustively used to define relations between elements of the UML metamodel, the UML specification leaves out a precise definition of their semantics. Our proposed approach uses models not only to define the abstract syntax, but also to define the semantics of UML. By using UML and OCL for this, existing modeling tools can be used to validate the definition. The second part of this thesis covers work on the usage of UML and OCL models for runtime verification. It is shown how models can still be used at the end of a software development process, i.e., after an implementation has manually been added to generated parts, even though they are not used as central parts of the development process. This work also influenced the integration of protocol state machines into a modeling tool, which led to publications about the runtime semantics of state machines and the capabilities to declaratively specify behavior using state machines.

    Towards a K Semantics for OCL

    We give a formal definition to a significant subset of the Object Constraint Language (OCL) in the K framework. The chosen subset includes the usual arithmetic, Boolean (including quantifiers), and string expressions; collection expressions (including iterators and navigation); and pre/post conditions for methods. Being executable, our definition provides us, for free, with an interpreter for the chosen subset of OCL, which can likewise be used for free in K definitions of languages having OCL as a component. We illustrate some of the advantages of K by comparing our semantic definition of OCL with the official semantics from the language's standard. We also report on a tool implementing our definition that users can try online.

    Extremely high data-rate, reliable network systems research

    Significant progress was made over the year in the four focus areas of this research group: gigabit protocols, extensions of metropolitan protocols, parallel protocols, and distributed simulations. Two activities, a network management tool and the Carrier Sense Multiple Access with Collision Detection (CSMA/CD) protocol, have developed to the point that a patent is being applied for in the next year; a tool set for distributed simulation using the language SIMSCRIPT also has commercial potential and is to be further refined. The year's results for each of these areas are summarized and next year's activities are described.

    Debugging Maude programs via runtime assertion checking and trace slicing

    In this paper we propose a dynamic analysis methodology for improving the diagnosis of erroneous Maude programs. The key idea is to combine runtime checking and dynamic trace slicing for automatically catching errors at runtime while reducing the size and complexity of the erroneous traces to be analyzed (i.e., those leading to states failing to satisfy some of the assertions). First, we formalize a technique that is aimed at automatically detecting deviations of the program behavior (symptoms) with respect to two types of user-defined assertions: functional assertions and system assertions. The proposed dynamic checking is provably sound in the sense that all errors flagged are definitely violations of the specifications. Then, upon eventual assertion violations, we generate accurate trace slices that help identify the cause of the error. Our methodology is based on (i) a logical notation for specifying assertions that are imposed on execution runs; (ii) a runtime checking technique that dynamically tests the assertions; and (iii) a mechanism based on (equational) least general generalization that automatically derives accurate criteria for slicing from falsified assertions. Finally, we report on an implementation of the proposed technique in the assertion-based dynamic analyzer ABETS and show how the forward and backward tracking of asserted program properties leads to a thorough trace analysis algorithm that can be used for program diagnosis and debugging.
    This work has been partially supported by the EU (FEDER) and the Spanish MINECO under grants TIN2015-69175-C4-1-R and TIN2013-45732-C4-1-P, and by Generalitat Valenciana Ref. PROMETEOII/2015/013. F. Frechina was supported by FPU-ME grant AP2010-5681, and J. Sapiña was supported by FPI-UPV grant SP2013-0083 and mobility grant VIIT-3946.
    Alpuente Frasnedo, M.; Ballis, D.; Frechina, F.; Sapiña-Sanchis, J. (2016). Debugging Maude programs via runtime assertion checking and trace slicing. Journal of Logical and Algebraic Methods in Programming, 85(5), 707-736. https://doi.org/10.1016/j.jlamp.2016.03.001
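    As a flavour of what such runtime checks catch (an illustrative toy module with invented names; the concrete assertion language of ABETS is not reproduced here), consider a functional bug that an input/output assertion on the function would flag during execution:

        *** Illustrative only: a buggy equational definition that a functional
        *** assertion ("the result is greater than or equal to every argument")
        *** would falsify at runtime.
        fmod BUGGY-MAX is
          protecting INT .
          op max3 : Int Int Int -> Int .
          vars X Y Z : Int .
          eq max3(X, Y, Z) = if X > Y then X else Z fi .   *** bug: Y is never considered
        endfm

    Reducing max3(1, 5, 2) yields 2, which violates the intended contract; in the methodology above, the falsified assertion both reports the symptom and, via least general generalization, supplies the slicing criterion used to isolate the relevant fragment of the execution trace.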

    Représentation et vérification d'un environnement intelligent à partir de spécifications utilisateur en langage naturel (Representation and verification of a smart environment from natural-language user specifications)

    Nowadays, sensors and actuators associated with control devices can be installed anywhere, including in our homes, creating smart environments. Our goal is to allow a user to configure her own smart environment by describing her needs, i.e. the environment's behavioral rules, in natural language (NL). We explore the possibilities offered by a formal ontology to transform NL specifications into formal specifications. Analysis of the user requirements allows automatic instantiation of the ontology so that it represents the behavior described by the user. The represented behavioral rules are then translated into Maude specifications to complement the verifications realized in OWL. We show that throughout this formalization process, it is possible to check the completeness, the consistency, and the conformity of the specified requirements, and to maintain traceability between NL requirements and formal specifications, allowing precise feedback to the user.
    Keywords: smart environment, ontology, specifications, formal verification
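    A hypothetical sketch of what such a generated Maude specification could contain (device and rule names invented for illustration): a user rule such as "when motion is detected and the light is off, switch the light on" becomes a rewrite rule over a configuration of device states, which Maude can then execute and analyze.

        *** Hypothetical sketch: one user-stated behavioural rule as a rewrite rule
        *** over a soup of device states.
        mod SMART-HOME is
          sorts Device Conf .
          subsort Device < Conf .
          ops motionOn motionOff lightOn lightOff : -> Device [ctor] .
          op none : -> Conf [ctor] .
          op __ : Conf Conf -> Conf [ctor assoc comm id: none] .
          rl [motion-light] : motionOn lightOff => motionOn lightOn .
        endm

    Completeness and consistency checks of the user's rules then amount to standard Maude analyses, for example searching for reachable configurations in which two rules prescribe contradictory device states.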

    Investigations into the model driven design of distribution patterns for web service compositions

    Increasingly, distributed systems are being used to provide enterprise-level solutions with high scalability and fault tolerance. These solutions are often built using Web services that are composed to perform useful business functions. Acceptance of these composed systems is often constrained by a number of non-functional properties of the system, such as availability, scalability and performance. There are a number of distribution patterns that each exhibit different non-functional characteristics. These patterns are recurring distribution schemes that express how a system is to be assembled and subsequently deployed. Traditional approaches to the development of Web service compositions exhibit a number of issues. Firstly, Web service composition development is often ad hoc and requires considerable low-level coding effort for realisation. Such systems often exhibit fixed architectures, making maintenance difficult and error prone. Additionally, a number of the non-functional requirements cannot be easily assessed by examining low-level code. In this thesis we explicitly model the compositional aspects of Web service compositions using UML activity diagrams. This approach uses a modelling and transformation framework, based on Model Driven Software Development (MDSD), going from high-level models to an executable system. The framework is guided by a methodological framework whose primary artifact is a distribution pattern model, chosen from the supplied catalog. Our modelling and transformation framework improves the development process of Web service compositions, with respect to a number of criteria, when compared to the traditional handcrafted approach. Specifically, we remove the coding effort traditionally associated with Web service composition development. Maintenance overheads of the solution are also significantly reduced, while improved mutability is achieved through a flexible architecture when compared with existing tools. We also improve the product output of the development process by exposing the non-functional runtime properties of Web service compositions using distribution patterns.