
    Adaptable transition systems

    We present an essential model of adaptable transition systems inspired by white-box approaches to adaptation and based on foundational models of component-based systems. The key feature of adaptable transition systems is control propositions, which impose a clear separation between ordinary, functional behaviours and adaptive ones. We instantiate our approach on interface automata, yielding adaptable interface automata, but it may be instantiated on other foundational models of component-based systems as well. We discuss how control propositions can be exploited in the specification and analysis of adaptive systems, focusing on various notions proposed in the literature, such as adaptability, control loops, and control synthesis.
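
    To make the separation concrete, here is a minimal Python sketch of an adaptable transition system, written under the simplifying assumption that control propositions are atomic propositions on states and that a transition counts as adaptive whenever a control proposition holds at its source or target. This is an illustration of the idea only; the paper's definitions, and the instantiation on interface automata, are more refined.

```python
# Minimal illustrative sketch of an adaptable transition system: states carry
# atomic propositions, and a designated subset of propositions is "control".
# Assumption for this sketch: a transition is adaptive when a control
# proposition holds at its source or target state.
from dataclasses import dataclass


@dataclass
class AdaptableTS:
    states: set
    transitions: set      # (source, action, target) triples
    labels: dict          # state -> set of atomic propositions holding there
    control_props: set    # propositions designated as control propositions

    def is_adaptive(self, transition):
        src, _action, tgt = transition
        touched = self.labels.get(src, set()) | self.labels.get(tgt, set())
        return bool(touched & self.control_props)

    def functional_transitions(self):
        """Ordinary behaviour: transitions not touching any control proposition."""
        return {t for t in self.transitions if not self.is_adaptive(t)}

    def adaptive_transitions(self):
        """Adaptation logic: transitions involving a control proposition."""
        return {t for t in self.transitions if self.is_adaptive(t)}


if __name__ == "__main__":
    # Toy component: a normal send/recv loop plus a reconfiguration phase
    # flagged by the (invented) control proposition "adapt".
    ts = AdaptableTS(
        states={"idle", "busy", "reconf"},
        transitions={("idle", "send", "busy"),
                     ("busy", "recv", "idle"),
                     ("busy", "degrade", "reconf"),
                     ("reconf", "restore", "idle")},
        labels={"reconf": {"adapt"}},
        control_props={"adapt"},
    )
    print("functional:", sorted(ts.functional_transitions()))
    print("adaptive:  ", sorted(ts.adaptive_transitions()))
```

    Analysis questions about adaptability can then be phrased over the two transition relations, for instance whether every adaptive phase eventually returns control to purely functional behaviour.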

    A comparative reliability analysis of ETCS train radio communications

    StoCharts have been proposed as a UML statechart extension for performance and dependability evaluation, and were applied in the context of train radio reliability assessment to show the principal tractability of realistic cases with this approach. In this paper, we build on this bare feasibility result in two important directions. First, we sketch the cornerstones of a mechanizable translation of StoCharts to MoDeST. The latter is a process algebra-based formalism supported by the Motor/Möbius tool tandem. Second, we exploit this translation for a detailed analysis of the train radio case study.

    From StoCharts to MoDeST: a comparative reliability analysis of train radio communications

    StoCharts have been proposed as a UML statechart extension for performance and dependability evaluation, and have been applied in the context of train radio reliability assessment to show the principal tractability of realistic cases with this approach. In this paper, we build on this bare feasibility result in two important directions. First, we sketch the cornerstones of a mechanizable translation of StoCharts to MoDeST. The latter is a process algebra-based formalism supported by the Motor/Möbius tool tandem. Second, we exploit this translation for a detailed analysis of the train radio case study.
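
    To give a concrete feel for the kind of dependability measure such an analysis targets, the toy Monte-Carlo sketch below estimates the probability that a radio connection is established before a deadline, allowing retries after failed setup attempts. It is not the StoCharts/MoDeST model from these two papers, and all parameters (setup rate, failure probability, deadline) are invented for illustration.

```python
# Toy Monte-Carlo estimate of a train-radio-style dependability measure:
# P(connection established within a deadline, with retries).  All parameters
# are invented; the actual study uses StoCharts translated to MoDeST and
# analysed with the Motor/Moebius tool tandem.
import random

SETUP_RATE = 1 / 5.0   # assumed mean connection-setup time of 5 s
FAIL_PROB = 0.05       # assumed probability that a setup attempt fails
DEADLINE = 15.0        # assumed time budget in seconds


def one_run(rng: random.Random) -> bool:
    """Simulate one connection establishment, retrying until the deadline."""
    t = 0.0
    while t < DEADLINE:
        t += rng.expovariate(SETUP_RATE)   # time consumed by this attempt
        if t >= DEADLINE:
            return False
        if rng.random() >= FAIL_PROB:      # this attempt succeeded in time
            return True
    return False


def estimate(n: int = 100_000, seed: int = 0) -> float:
    rng = random.Random(seed)
    return sum(one_run(rng) for _ in range(n)) / n


if __name__ == "__main__":
    print(f"P(connection within {DEADLINE:.0f}s) ~ {estimate():.4f}")
```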

    Synthesis from Recursive-Components Libraries

    Synthesis is the automatic construction of a system from its specification. In classical synthesis algorithms it is always assumed that the system is "constructed from scratch" rather than composed from reusable components. This, of course, rarely happens in real life. In real life, almost every non-trivial commercial software system relies heavily on using libraries of reusable components. Furthermore, other contexts, such as web-service orchestration, can be modeled as synthesis of a system from a library of components. In 2009 we introduced LTL synthesis from libraries of reusable components. Here, we extend the work and study synthesis from component libraries with "call and return" control flow structure. Such control-flow structure is very common in software systems. We define the problem of Nested-Words Temporal Logic (NWTL) synthesis from recursive component libraries, where NWTL is a specification formalism, richer than LTL, that is suitable for "call and return" computations. We solve the problem, providing a synthesis algorithm, and show the problem is 2EXPTIME-complete, as standard synthesis. Comment: In Proceedings GandALF 2011, arXiv:1106.081
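
    The "call and return" structure that nested words, and hence NWTL, make explicit can be pictured with a small sketch: each position of a word is a call, a return, or an internal action, and calls are matched to returns like balanced parentheses. The Python below only illustrates this data model (using an assumed encoding of runs as lists of tagged actions); it does not sketch the synthesis algorithm itself.

```python
# Sketch of the call/return structure underlying nested words: match each
# call position to its return position, as for balanced parentheses.  The
# run encoding (lists of (kind, label) pairs) is an assumption of this sketch.
CALL, RET, INT = "call", "ret", "int"


def matching(word):
    """Map each call position to its matching return position, or return None
    if the word is not well-matched (pending call or unmatched return)."""
    stack, match = [], {}
    for i, (kind, _label) in enumerate(word):
        if kind == CALL:
            stack.append(i)
        elif kind == RET:
            if not stack:
                return None          # a return with no pending call
            match[stack.pop()] = i
    return None if stack else match  # pending calls => not well-matched


if __name__ == "__main__":
    # A toy run: the main component calls a library component and returns.
    run = [(CALL, "main"), (INT, "init"), (CALL, "lib.f"),
           (INT, "work"), (RET, "lib.f"), (RET, "main")]
    print(matching(run))   # {2: 4, 0: 5}
```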

    Probabilistic Model Checking for Energy Analysis in Software Product Lines

    In a software product line (SPL), a collection of software products is defined by their commonalities in terms of features rather than explicitly specifying all products one-by-one. Several verification techniques have been adapted to establish temporal properties of SPLs. Symbolic and family-based model checking have proven successful for tackling the combinatorial blow-up arising when reasoning about several feature combinations. However, most formal verification approaches for SPLs presented in the literature focus on static SPLs, where the features of a product are fixed and cannot be changed during runtime. This is in contrast to dynamic SPLs, which allow the feature combinations of a product to be adapted dynamically after deployment. The main contribution of the paper is a compositional modeling framework for dynamic SPLs, which supports probabilistic and nondeterministic choices and allows for quantitative analysis. We specify the feature changes during runtime within an automata-based coordination component, enabling reasoning about strategies for triggering dynamic feature changes that optimize various quantitative objectives, e.g., energy or monetary costs and reliability. For our framework there is a natural and conceptually simple translation into the input language of the prominent probabilistic model checker PRISM. This facilitates the application of PRISM's powerful symbolic engine to the operational behavior of dynamic SPLs and their family-based analysis against various quantitative queries. We demonstrate the feasibility of our approach by a case study of an energy-aware bonding network device. Comment: 14 pages, 11 figures
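
    To illustrate (not reproduce) the kind of quantitative query involved, the sketch below hand-codes a tiny Markov decision process in which a runtime coordinator may toggle a single feature that trades energy per attempt against success probability, and value iteration computes the minimal expected energy to finish a fixed workload. The paper encodes such models in PRISM's input language and analyses them symbolically; every feature name, probability, and cost below is invented.

```python
# Toy dynamic-feature MDP: a coordinator may switch a "turbo" feature on/off
# at runtime; value iteration computes minimal expected energy to finish all
# tasks.  All numbers are invented; the paper uses PRISM for such analyses.
TASKS = 4                            # assumed number of work items
P_OK = {False: 0.7, True: 0.95}      # success probability per attempt
E_WORK = {False: 1.0, True: 3.0}     # energy per attempt
E_TOGGLE = 0.5                       # energy to switch the feature


def expected_energy():
    """Value iteration for minimal expected remaining energy per state."""
    V = {(k, f): 0.0 for k in range(TASKS + 1) for f in (False, True)}
    for _ in range(10_000):
        delta = 0.0
        for k in range(1, TASKS + 1):
            for f in (False, True):
                work = (E_WORK[f] + P_OK[f] * V[(k - 1, f)]
                        + (1 - P_OK[f]) * V[(k, f)])
                toggle = E_TOGGLE + V[(k, not f)]
                new = min(work, toggle)
                delta = max(delta, abs(new - V[(k, f)]))
                V[(k, f)] = new
        if delta < 1e-9:
            break
    return V


if __name__ == "__main__":
    V = expected_energy()
    print("min expected energy, feature initially off:", round(V[(TASKS, False)], 3))
    print("min expected energy, feature initially on: ", round(V[(TASKS, True)], 3))
```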

    Using Inhabitation in Bounded Combinatory Logic with Intersection Types for Composition Synthesis

    We describe ongoing work on a framework for automatic composition synthesis from a repository of software components. This work is based on combinatory logic with intersection types. The idea is that components are modeled as typed combinators, and an algorithm for inhabitation (is there a combinatory term e with type tau relative to an environment Gamma?) can be used to synthesize compositions. Here, Gamma represents the repository in the form of typed combinators, tau specifies the synthesis goal, and e is the synthesized program. We illustrate our approach by examples, including an application to synthesis from GUI-components. Comment: In Proceedings ITRS 2012, arXiv:1307.784
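
    A drastically simplified sketch of this idea follows: a repository Gamma of named components with plain arrow types (instead of intersection types) and a naive backward search (instead of the bounded inhabitation algorithm) that assembles an applicative term of a requested goal type. The repository entries and type names are invented for illustration.

```python
# Naive type-directed composition synthesis: GAMMA is the repository of typed
# combinators, the goal type plays the role of tau, and a found inhabitant is
# the synthesised program.  Simple arrow types only; the actual framework uses
# bounded combinatory logic with intersection types.

# A string is a base type; a tuple (a, b) encodes the arrow type a -> b.
GAMMA = {
    "readCsv":  ("Path", "Table"),              # Path -> Table
    "plot":     ("Table", "Chart"),             # Table -> Chart
    "render":   ("Chart", ("Size", "Image")),   # Chart -> Size -> Image
    "thumb":    "Size",
    "dataFile": "Path",
}


def inhabit(goal, depth=4):
    """Return an applicative term of type `goal` built from GAMMA, or None."""
    if depth == 0:
        return None
    for name, ty in GAMMA.items():
        args, cur = [], ty
        while cur != goal and isinstance(cur, tuple):
            arg_ty, cur = cur                 # peel one arrow, remember the argument
            args.append(arg_ty)
        if cur != goal:
            continue                          # this combinator cannot produce `goal`
        arg_terms = [inhabit(a, depth - 1) for a in args]
        if all(t is not None for t in arg_terms):
            return name if not arg_terms else f"{name}({', '.join(arg_terms)})"
    return None


if __name__ == "__main__":
    print(inhabit("Image"))   # e.g. render(plot(readCsv(dataFile)), thumb)
```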

    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, so that every new need required a new protocol built from scratch. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work in the space of clean-slate Internet design---especially the software defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and present a survey of its applications to networking. Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials
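
    As a small taste of the kind of data-plane check that a centralised SDN view makes easy to pose, the sketch below traces packets through static per-destination forwarding tables and reports delivery, black holes, or forwarding loops. It does not correspond to any particular tool covered in the survey; the topology and rules are invented, and real checkers additionally handle wildcard rules, header rewrites, and very large rule sets.

```python
# Toy data-plane reachability check over invented forwarding tables:
# forwarding[switch][destination] = next hop (another switch or a host).
FORWARDING = {
    "s1": {"h2": "s2", "h1": "h1"},
    "s2": {"h2": "s3", "h1": "s1"},
    "s3": {"h2": "h2", "h1": "s2"},
}


def trace(ingress, dst, max_hops=16):
    """Follow next hops from `ingress` toward `dst`; detect black holes/loops."""
    node, path = ingress, [ingress]
    for _ in range(max_hops):
        if node == dst:
            return True, path                # delivered to the destination
        hop = FORWARDING.get(node, {}).get(dst)
        if hop is None:
            return False, path               # black hole: no matching rule
        if hop in path:
            return False, path + [hop]       # forwarding loop detected
        path.append(hop)
        node = hop
    return False, path                       # hop limit exceeded


if __name__ == "__main__":
    ok, path = trace("s1", "h2")
    print("delivered:" if ok else "violation:", " -> ".join(path))
```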

    The earlier the better: a theory of timed actor interfaces

    Programming embedded and cyber-physical systems requires attention not only to functional behavior and correctness, but also to non-functional aspects, specifically timing and performance constraints. A structured, compositional, model-based approach based on stepwise refinement and abstraction techniques can support the development process, increase its quality and reduce development time through automation of synthesis, analysis or verification. For this purpose, we introduce in this paper a general theory of timed actor interfaces. Our theory supports a notion of refinement that is based on the principle of worst-case design that permeates the world of performance-critical systems. This is in contrast with the classical behavioral and functional refinements based on restricting or enlarging sets of behaviors. An important feature of our refinement is that it allows time-deterministic abstractions to be made of time-non-deterministic systems, improving efficiency and reducing complexity of formal analysis. We also show how our theory relates to, and can be used to reconcile, a number of existing time and performance models and how their established theories can be exploited to represent and analyze interface specifications and refinement steps.
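
    The worst-case flavour of this refinement can be hinted at with a toy sketch: model an actor interface by the latest time at which it can produce each output for a given input stream, and let an implementation refine a specification when, on every stream, each output is produced no later than the specification allows ("the earlier the better"). The fixed-latency actors below are invented for illustration and are far simpler than the interfaces treated in the paper.

```python
# Toy "earlier is better" refinement check: an interface is modelled by the
# worst-case (latest) time of each output for a given list of input arrival
# times; impl refines spec if its outputs are never later.  Fixed-latency
# actors are an assumption of this sketch only.
def actor(latency):
    """Worst-case output times of a one-in/one-out actor with fixed latency."""
    def worst_case(inputs):
        out, ready = [], 0.0
        for t in inputs:
            ready = max(ready, t) + latency   # start when input and actor are both ready
            out.append(ready)
        return out
    return worst_case


def refines(impl, spec, sample_inputs):
    """Check the worst-case refinement on a sample of input streams."""
    return all(
        all(ti <= ts for ti, ts in zip(impl(inp), spec(inp)))
        for inp in sample_inputs
    )


if __name__ == "__main__":
    spec = actor(latency=3.0)   # the specification tolerates 3 time units per token
    fast = actor(latency=2.0)   # a faster implementation: refines the spec
    slow = actor(latency=4.0)   # a slower one: does not
    streams = [[0, 1, 2], [0, 5, 10], [2, 2, 2, 9]]
    print("fast refines spec:", refines(fast, spec, streams))   # True
    print("slow refines spec:", refines(slow, spec, streams))   # False
```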