    ASM refinement preserving invariants

    Formal semantics for refinement verification of enterprise models

    In this dissertation we investigate how Business/IT alignment in enterprise models can be enhanced by using the stepwise refinement paradigm from software engineering. To have an IT system that supports an enterprise and meets its business needs, management seeks to align business systems with IT systems. Enterprise Architecture (EA) is the discipline that addresses the design of aligned business and IT systems. SEAM is an Enterprise Architecture method developed in the Laboratory of Systemic Modeling (LAMS) at EPFL. SEAM defines a visual language for building an enterprise model of an organization. In this work, we develop a theory and propose a technique to validate alignment between system specifications expressed in the SEAM language.

    We base our reasoning on the idea that each system (an organization, a business system, or an IT system) can be modeled using a set of hierarchical specifications, explicitly related to each other. Considering these relations as refinement relations, we transform the problem of alignment validation into the problem of refinement verification for system specifications: we consider two system specifications to be aligned if one correctly refines the other. Model-driven engineering (MDE) defines refinement as a transformation between two visual (or program) specifications, where a specification is gradually refined into an implementation; MDE, however, does not formalize refinement verification. Software engineering (SE) formalizes refinement for program specifications and provides a theory and techniques for refinement verification.

    To benefit from the formal theories and refinement verification techniques defined in SE, we extend the SEAM language with additional concepts (e.g. preconditions, postconditions, and invariants). This extension increases the precision of SEAM visual specifications. We then define a formal semantics for the extended SEAM modeling language. This semantics is based on first-order logic and set theory; it allows us to reduce the problem of refinement verification to the validation of a first-order logic formula. Software engineering also provides tools for the automated analysis of program specifications; to use these tools for refinement verification, we define a translation from SEAM visual specifications to formal specification languages.

    Using case studies, we apply our theory and technique in several problem areas to verify (1) whether a business process design or re-design corresponds to a high-level business process specification, and (2) whether a service implementation corresponds to its specification. These case studies have been presented to a group of domain experts who practice business/IT alignment; this inquiry has shown that our research has potential practical value.
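
    To illustrate the kind of reduction described above, the sketch below encodes the classic refinement proof obligations as first-order validity checks with the Z3 SMT solver (the z3-solver Python package). The specifications are hypothetical stand-ins; this is not the dissertation's actual translation from SEAM.

```python
# A minimal sketch of refinement checking reduced to first-order validity,
# using the Z3 SMT solver (pip install z3-solver). The specifications here
# are hypothetical stand-ins, not the SEAM translation from the dissertation.
from z3 import Int, Implies, And, prove

x, x_new = Int('x'), Int('x_new')          # pre- and post-state of one variable

# Abstract specification: may increase x by 1 or 2.
pre_abs = x >= 0
post_abs = And(x_new > x, x_new <= x + 2)

# Concrete specification: always increases x by exactly 1 (a refinement).
pre_conc = x >= 0
post_conc = x_new == x + 1

# Proof obligation 1: the abstract precondition implies the concrete one.
prove(Implies(pre_abs, pre_conc))

# Proof obligation 2: under the abstract precondition, every concrete
# behaviour is also an abstract behaviour.
prove(Implies(And(pre_abs, post_conc), post_abs))
```

    If either call prints a counterexample instead of "proved", the refinement claim fails, which mirrors how a validity check on the generated first-order formula would report an alignment violation.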

    Formal design of data warehouse and OLAP systems: a dissertation presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Information Systems at Massey University, Palmerston North, New Zealand

    A data warehouse is a single data store where data from multiple data sources is integrated for online analytical processing (OLAP) across an entire organisation. The rationale for being single and integrated is to ensure a consistent view of organisational business performance, independent of the different angles of business perspectives. Due to its wide coverage of subjects, data warehouse design is a highly complex, lengthy and error-prone process. Furthermore, the business analytical tasks change over time, which results in changes in the requirements for the OLAP systems. Thus, data warehouse and OLAP systems are rather dynamic, and the design process is continuous.

    In this thesis, we propose a method that is integrated, formal and application-tailored, in order to overcome the complexity problem, deal with the system dynamics, and improve the quality of the system and the chance of success. Our method comprises three parts: the general ASM method with types, an application-tailored design framework for data warehouse and OLAP, and a schema integration method with a set of provably correct refinement rules.

    By using the ASM method, we are able to model both data and operations in a uniform conceptual framework, which enables an integrated approach to data warehouse and OLAP design. The freedom given by the ASM method allows us to model the system at an abstract level that is easy to understand for both users and designers. More specifically, the language allows us to use terms from the user domain, not biased by the terms used in computer systems. The pseudo-code-like transition rules, which give the simplest form of operational semantics in ASMs, are close to programming languages and therefore easy for designers to understand; furthermore, these rules are rooted in mathematics, which helps improve the quality of the system design. By extending the ASMs with types, the modelling language is tailored for data warehousing with terms that are well developed for data-intensive applications, which makes it easy to model schema evolution as refinements in dynamic data warehouse design.

    By providing the application-tailored design framework, we break down the design complexity by business processes (also called subjects in data warehousing) and by design concerns. In designing the data warehouse by subjects, our method resembles Kimball's "bottom-up" approach; with the schema integration method, however, our method resolves the stovepipe issue of that approach. By building up a data warehouse iteratively in an integrated framework, our method not only results in an integrated data warehouse, but also resolves the issues of complexity and delayed ROI (Return On Investment) in Inmon's "top-down" approach. By dealing with user change requests in the same way as new subjects, and by modelling data and operations explicitly in a three-tier architecture (the data sources, the data warehouse and the OLAP system), our method facilitates dynamic design with system integrity.

    By introducing a notion of refinement specific to schema evolution, namely schema refinement, which captures the notion of schema dominance in schema integration, we are able to build a set of refinement rules that are proven correct. This set of rules simplifies the designers' work on design correctness verification. Nevertheless, we do not aim for a complete set, since there are many different ways to perform schema integration, nor do we prescribe a single way of integrating, so as to allow designer-favoured designs. Furthermore, given its flexibility, our method can easily be extended to address newly emerging design issues.
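
    To give a flavour of the pseudo-code-like transition rules mentioned above, the following sketch models ASM-style semantics in Python: every rule reads the current state and emits updates, the updates of one step are collected into an update set (checked for consistency), and the set is applied atomically. This is a generic illustration of the ASM idea with invented warehouse-flavoured names, not the typed ASM language developed in the thesis.

```python
# A toy illustration of ASM-style semantics: rules read the current state and
# emit updates; all updates of one step are applied simultaneously.
# Generic sketch with invented names, not the typed ASM language of the thesis.

def load_rule(state):
    """Stage new source rows into the warehouse (hypothetical subject)."""
    if state["staged"] is None:
        return {"staged": list(state["source"])}
    return {}

def aggregate_rule(state):
    """Refresh an OLAP aggregate from staged data (hypothetical subject)."""
    if state["staged"] is not None:
        return {"total_sales": sum(state["staged"]), "staged": None}
    return {}

RULES = [load_rule, aggregate_rule]

def step(state):
    updates = {}
    for rule in RULES:                      # collect updates from all rules...
        for loc, val in rule(state).items():
            if loc in updates and updates[loc] != val:
                raise RuntimeError(f"inconsistent update set at {loc!r}")
            updates[loc] = val
    new_state = dict(state)
    new_state.update(updates)               # ...then apply them atomically
    return new_state

state = {"source": [10, 20, 5], "staged": None, "total_sales": 0}
state = step(state)                          # stages the source rows
state = step(state)                          # computes total_sales = 35
print(state["total_sales"])
```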

    Rigorous development process of a safety-critical system: from ASM models to Java code

    The paper presents an approach for the rigorous development of safety-critical systems based on the Abstract State Machine (ASM) formal method. The development process starts from a high-level formal view of the system and, through refinement, derives more detailed models until the desired level of specification is reached. Along the process, different validation and verification activities are available, such as simulation, model review, and model checking. Moreover, each refinement step can be proved correct using an SMT-based approach. As the last step of the refinement process, a Java implementation can be developed and linked to the formal specification. The correctness of the implementation w.r.t. its formal specification can be proved by means of model-based testing and runtime verification. The process is exemplified using a Landing Gear System as a case study.
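
    As a rough illustration of that final step, linking an implementation to its formal model, the sketch below runs an implementation in lock-step with an executable abstract model and checks after each step that the observable states agree, in the spirit of runtime verification. The paper's actual tool chain links Java code to ASM models; everything named here is hypothetical.

```python
# Hypothetical sketch of runtime conformance checking: execute the
# implementation and an executable abstract model in lock-step and compare
# their observable states. The paper's tooling links Java code to ASM models;
# all names here are invented for illustration.

class GearModel:
    """Abstract model: the gear toggles between UP and DOWN on commands."""
    def __init__(self):
        self.position = "UP"
    def step(self, command):
        if command == "extend":
            self.position = "DOWN"
        elif command == "retract":
            self.position = "UP"

class GearImpl:
    """Implementation under test (could contain bugs)."""
    def __init__(self):
        self.pos = "UP"
    def handle(self, command):
        self.pos = "DOWN" if command == "extend" else "UP"

def run_monitored(commands):
    model, impl = GearModel(), GearImpl()
    for cmd in commands:
        model.step(cmd)
        impl.handle(cmd)
        # Conformance check: implementation state must match the model.
        assert impl.pos == model.position, f"divergence after {cmd!r}"
    print("trace conforms to the model")

run_monitored(["extend", "retract", "extend"])
```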

    Software & system verification with KIV

    A formal framework for specification-based embedded real-time system engineering

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Aeronautics and Astronautics, 2008. Includes bibliographical references (v. 2, p. 517-545).

    The increasing size and complexity of modern software-intensive systems present novel challenges when engineering high-integrity artifacts within aggressive budgetary constraints. Among these challenges, ensuring confidence in the engineered system, through validation and verification activities, represents the high-cost item on many projects. The expensive nature of engineering high-integrity systems using traditional approaches can be partly attributed to the lack of analysis facilities during the early phases of the lifecycle, causing validation and verification activities to begin too late in the engineering lifecycle. Other challenges include the management of complexity, opportunities for reuse without compromising confidence, and the ability to trace system features across lifecycle phases. The use of models as a specification mechanism provides an approach to mitigating complexity through abstraction. Furthermore, if the specification approach has formal underpinnings, the use of models can be leveraged to automate engineering activities such as formal analysis and test case generation.

    The research presented in this thesis proposes an engineering framework which addresses the high cost of validation and verification activities through specification-based system engineering. More specifically, the framework provides an integrated approach to embedded real-time system engineering which incorporates specification, simulation, formal verification, and test case generation. The framework aggregates the state of the art in individual software engineering disciplines to provide an end-to-end approach to embedded real-time system engineering. The key aspects of the framework include:

    * A novel specification language, the Timed Abstract State Machine (TASM) language, which extends the theory of Abstract State Machines (ASM). The TASM language is a literate formal specification language which can be applied at multiple levels of abstraction and which can express the three key aspects of embedded real-time systems: function, time, and resources.

    * Automated verification capabilities achieved through the integration of mature analysis engines, namely the UPPAAL tool suite and the SAT4J SAT solver. The verification capabilities provided by the framework include completeness and consistency verification, model checking, execution time analysis, and resource consumption analysis.

    * Bi-directional traceability of model features across levels of abstraction and lifecycle phases. Traceability is achieved syntactically through archetypical refinement types; each refinement type provides correctness criteria which, if met, guarantee semantic integrity through the refinement.

    * Automated test case generation capabilities for unit testing, integration testing, and regression testing. Unit test cases are generated to achieve TASM specification coverage through the rule coverage criterion. Integration test case generation is achieved through the hierarchical composition of unit test cases. Regression test case generation is achieved by leveraging the bi-directional traceability of model features.

    The framework is implemented in an integrated tool suite, the TASM toolset, which incorporates the UPPAAL tool suite and the SAT4J SAT solver. The toolset and framework are evaluated through experimentation on three industrial case studies: an automated manufacturing system, a "drive-by-wire" system used at a major automotive manufacturer, and a scripting environment used on the International Space Station.

    by Martin Ouimet, Ph.D.
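
    The following sketch conveys, in Python, the flavour of a timed, resource-annotated rule in the spirit of TASM: each rule carries a duration and a resource claim alongside its state updates, so a simulator can accumulate execution time and check resource bounds. The rule format, the capacity bound, and all names are invented for illustration; the actual TASM language and toolset are far richer.

```python
# Flavour of a TASM-style rule: state updates annotated with a duration and
# a resource claim. Hypothetical sketch, not the actual TASM language.

RULES = [
    # (name, guard, updates, duration, resources)
    ("start",  lambda s: s["mode"] == "idle",
               {"mode": "busy"}, 2, {"cpu": 40}),
    ("finish", lambda s: s["mode"] == "busy",
               {"mode": "idle"}, 3, {"cpu": 10}),
]

CPU_CAPACITY = 100    # assumed resource bound for this toy example

def simulate(state, steps):
    clock = 0
    for _ in range(steps):
        for name, guard, updates, duration, resources in RULES:
            if guard(state):
                assert resources["cpu"] <= CPU_CAPACITY, "resource overrun"
                state.update(updates)
                clock += duration          # time advances by the rule's duration
                print(f"t={clock:2d}  fired {name}")
                break
    return clock

total = simulate({"mode": "idle"}, steps=4)
print("total execution time:", total)      # 2+3+2+3 = 10
```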

    Weak Progressive Forward Simulation Is Necessary and Sufficient for Strong Observational Refinement

    Hyperproperties are correctness conditions for labelled transition systems that are more expressive than traditional trace properties, with particular relevance to security. Recently, Attiya and Enea studied a notion of strong observational refinement that preserves all hyperproperties. They analysed the correspondence between forward simulation and strong observational refinement in a setting with only finite traces. We study this correspondence in a setting with both finite and infinite traces. In particular, we show that forward simulation does not preserve hyperliveness properties in this setting. We extend the forward simulation proof obligation with a (weak) progress condition, and prove that this weak progressive forward simulation is equivalent to strong observational refinement.
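
    As a rough formalisation, here is one standard presentation of forward simulation (the paper's exact definitions may differ): a relation R between concrete and abstract states must relate initial states and match every concrete step.

```latex
% One standard formulation of forward simulation from a concrete system C to
% an abstract system A; notation here is assumed, not taken from the paper.
\begin{align*}
  &\text{(init)} && \forall c_0 \in \mathit{Init}_C \;\exists a_0 \in \mathit{Init}_A :\; (c_0, a_0) \in R \\
  &\text{(step)} && \forall (c, a) \in R :\; c \xrightarrow{\;\ell\;}_C c' \implies
      \exists a' :\; a \xrightarrow{\;\ell\;}_A a' \;\wedge\; (c', a') \in R
\end{align*}
```

    In a weak setting the matching abstract move may consist of a sequence of steps, possibly empty. The (weak) progress condition then, roughly, rules out matching an infinite concrete execution with only finitely many abstract steps, so infinite concrete traces are matched by infinite abstract traces; this is what restores preservation of hyperliveness.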

    Optimal market making under partial information and numerical methods for impulse control games with applications

    The topics treated in this thesis are two-fold. The first part considers the problem of a market maker who wants to optimally set bid/ask quotes over a finite time horizon so as to maximize her expected utility. The intensities of the orders she receives depend not only on the spreads she quotes, but also on unobservable factors modelled by a hidden Markov chain. This stochastic control problem under partial information is solved by means of stochastic filtering, stochastic control, and the theory of piecewise-deterministic Markov processes. The value function is characterized as the unique continuous viscosity solution of its dynamic programming equation. Afterwards, the analogous full-information problem is solved and the results are compared numerically through a concrete example. The optimal full-information spreads are shown to be biased when the exact market regime is unknown, as the market maker needs to adjust for additional regime uncertainty in terms of P&L sensitivity and observable order flow volatility.

    The second part deals with numerically solving nonzero-sum stochastic differential games with impulse controls. These offer a realistic and far-reaching modelling framework for applications within finance, energy markets and other areas, but the difficulty of solving such problems has hindered their proliferation. Semi-analytical approaches make strong assumptions pertaining to very particular cases; to the author's best knowledge, there are no numerical methods available in the literature. A policy-iteration-type solver is proposed to solve an underlying system of quasi-variational inequalities, and it is validated numerically, with reassuring results. In particular, it is observed that the algorithm does not enjoy global convergence, and a heuristic methodology is proposed to construct initial guesses. Eventually, the focus is put on games with a symmetric structure, and a substantially improved version of the former algorithm is put forward. A rigorous convergence analysis is undertaken with natural assumptions on the players' strategies, which admit graph-theoretic interpretations in the context of weakly chained diagonally dominant matrices. A provably convergent single-player impulse control solver, often outperforming classical policy iteration, is also provided. The main algorithm is used to compute, with high precision, equilibrium payoffs and Nash equilibria of otherwise too-challenging problems, including some for which results go beyond the scope of all currently available theory.
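
    To give a concrete, if drastically simplified, sense of the fixed point involved, the sketch below solves a discrete-state toy analogue of an impulse-control quasi-variational inequality: at each state the controller either continues (paying a discounted running cost) or pays a fixed cost K to jump to another state. It uses plain monotone fixed-point iteration rather than the policy-iteration schemes developed in the thesis, and none of the modelling details are taken from it.

```python
# Discrete toy analogue of an impulse-control quasi-variational inequality:
#   v(x) = min( f(x) + gamma * (P v)(x),  K + min_{y != x} v(y) )
# solved by monotone fixed-point iteration from a constant supersolution.
# Hypothetical sketch only: the thesis treats continuous-state games and
# develops policy-iteration-type solvers; nothing here is taken from it.
import numpy as np

rng = np.random.default_rng(0)
n, gamma, K = 5, 0.9, 1.5                  # states, discount, fixed impulse cost
f = rng.uniform(1.0, 4.0, size=n)          # running costs
P = rng.uniform(size=(n, n))
P /= P.sum(axis=1, keepdims=True)          # row-stochastic transition matrix

# Constant supersolution: the iteration below then decreases monotonically.
v = np.full(n, f.max() / (1 - gamma))
for _ in range(500):
    continue_val = f + gamma * (P @ v)     # keep paying the running cost
    jump_val = np.array([K + np.delete(v, x).min() for x in range(n)])
    v_new = np.minimum(continue_val, jump_val)
    if np.max(np.abs(v_new - v)) < 1e-10:  # fixed point reached
        break
    v = v_new

print("value function:", np.round(v, 3))
print("impulse region:", np.where(jump_val < continue_val)[0])
```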

    Proving the Fidelity of Simulations of Event-B Models

    A major hindrance to the use of formal methods is the difficulty of validating the models, particularly at the early stages of development. We propose to build simulations: programs automatically generated from the specifications, but with user-provided implementations of the non-executable traits of the models. We present such a simulation. Of course, the question of the fidelity of the simulation to the model arises in such a setting. We provide a formal definition of fidelity, along with the proof obligations that can be attached to each hand-coded element so that fidelity can be proven.
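
    Very loosely, the idea of fidelity obligations can be sketched in Python: a non-executable trait of the model (here, "choose any element satisfying a predicate") gets a hand-coded implementation, and a check standing in for the paper's proof obligation confirms that the hand-coded choice stays within the behaviour the model allows. All names and structure are invented for this sketch and do not reflect the paper's Event-B tooling, which discharges such obligations by proof rather than at runtime.

```python
# Loose illustration of "fidelity" obligations: a hand-coded implementation of
# a non-executable model trait must be shown (here: checked at runtime) to
# stay within the model's specification. Hypothetical sketch, not the
# paper's Event-B tooling.

def spec_predicate(x, pool):
    """Model-level spec: the chosen element must be an even member of pool."""
    return x in pool and x % 2 == 0

def hand_coded_choice(pool):
    """User-provided implementation of the non-executable 'choose any even
    element' trait: deterministically picks the smallest even element."""
    return min(x for x in pool if x % 2 == 0)

def simulate_event(pool):
    x = hand_coded_choice(pool)
    # Fidelity obligation (checked dynamically here; proven once and for all
    # in the paper's approach): the implementation refines the specification.
    assert spec_predicate(x, pool), "simulation diverges from the model"
    return x

print(simulate_event({3, 8, 4, 7}))        # prints 4
```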