11 research outputs found

    Expressiveness and Completeness in Abstraction

    We study two notions of expressiveness that have appeared in abstraction theory for model checking, and find them incomparable in general. In particular, we show that, according to the most widely used notion, the class of Kripke Modal Transition Systems is strictly less expressive than the class of Generalised Kripke Modal Transition Systems (a generalised variant of Kripke Modal Transition Systems equipped with hypertransitions). Furthermore, we investigate the ability of an abstraction framework to prove a formula with a finite abstract model, a property known as completeness. We address the issue of completeness from a general perspective: the way it depends on certain abstraction parameters, as well as its relationship with expressiveness. Comment: In Proceedings EXPRESS/SOS 2012, arXiv:1208.244
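The may/must distinction underlying the abstract can be illustrated with a small sketch. This is an invented toy encoding, not the paper's formalism: must-transitions are required behaviour, may-transitions are allowed behaviour, and a GKMTS hypertransition commits to reaching *some* state in a target set without fixing which one.

```python
# Toy encoding (illustrative names, not from the paper).
# A KMTS over integer states: transitions are (source, target) pairs.
may = {(0, 1), (0, 2), (1, 2)}
must = {(0, 1)}

def is_consistent_kmts(must, may):
    """Every must-transition should also be allowed as a may-transition."""
    return must <= may

assert is_consistent_kmts(must, may)

# A GKMTS must-hypertransition from 0 commits to reaching some state in
# {1, 2} without choosing which - plain KMTSs cannot express this.
hyper_must = {(0, frozenset({1, 2}))}

def hyper_consistent(hyper_must, may):
    """Each hypertransition's target set must contain a may-successor."""
    return all(any((s, t) in may for t in targets)
               for s, targets in hyper_must)

assert hyper_consistent(hyper_must, may)
```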

    Hierarchical hidden Markov structure for dynamic correlations: the hierarchical RSDC model.

    This paper presents a new multivariate GARCH model with a time-varying conditional correlation structure that generalizes the Regime Switching Dynamic Correlation (RSDC) model of Pelletier (2006). This model, which we name Hierarchical RSDC, is built on the hierarchical generalization of the hidden Markov model introduced by Fine et al. (1998). It can be viewed graphically as a tree structure with different types of states. The first type, called production states, can emit observations, as in the classical Markov-Switching approach. The second type, called abstract states, cannot emit observations but establish the vertical and horizontal probabilities that define the dynamics of the hidden hierarchical structure. The main gain of this approach over the classical Markov-Switching model is to increase the granularity of the regimes. Our model is also compared to the new Double Smooth Transition Conditional Correlation GARCH model (DSTCC), a STAR approach to dynamic correlations proposed by Silvennoinen and Teräsvirta (2007), because, under certain assumptions, the DSTCC and our model represent two classical competing approaches to modeling regime switching. We also perform Monte-Carlo simulations and apply the model to two empirical applications studying the conditional correlations of selected stock returns. Results show that the Hierarchical RSDC provides a good measure of the correlations and also has interesting explanatory power. Keywords: Multivariate GARCH; Dynamic correlations; Regime switching; Markov chain; Hidden Markov models; Hierarchical Hidden Markov models
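The tree structure of abstract and production states can be made concrete with a sketch. This is a generic two-level hierarchical Markov chain with invented numbers, not the authors' estimated model: a root chain over two regimes (abstract states), a production-state chain inside each regime, and a fixed per-step exit probability standing in for the vertical dynamics; the hierarchy flattens to one chain over (regime, production-state) pairs.

```python
import numpy as np

# Invented toy parameters for a two-level hierarchical hidden Markov chain.
root = np.array([[0.9, 0.1],          # abstract-level (regime) transitions
                 [0.2, 0.8]])
inner = [np.array([[0.7, 0.3], [0.4, 0.6]]),   # production states, regime 0
         np.array([[0.5, 0.5], [0.1, 0.9]])]   # production states, regime 1
entry = [np.array([0.6, 0.4]),        # vertical entry distributions
         np.array([0.5, 0.5])]
exit_p = 0.1                          # assumed per-step branch-exit prob.

# Flatten to a 4-state chain over (regime, production-state) pairs.
P = np.zeros((4, 4))
for b in range(2):
    for i in range(2):
        row = 2 * b + i
        # stay in regime b and move horizontally among production states
        P[row, 2 * b:2 * b + 2] += (1 - exit_p) * inner[b][i]
        # exit, pick the next regime at the root, re-enter vertically
        for b2 in range(2):
            P[row, 2 * b2:2 * b2 + 2] += exit_p * root[b, b2] * entry[b2]

assert np.allclose(P.sum(axis=1), 1.0)   # each row is a distribution
```

The flattened chain makes explicit how the hierarchy increases regime granularity: four effective regimes arise from two abstract states.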

    Refinement sensitive formal semantics of state machines with persistent choice

    Modeling languages usually support two kinds of nondeterminism: an external one for interactions of a system with its environment, and one that stems from under-specification, as familiar in models of behavioral requirements. Both forms of nondeterminism are resolvable, by composing a system with an environment model and by refining under-specified behavior, respectively. Modeling languages usually don't support nondeterminism that is persistent, in that neither composition with an environment nor refinement of under-specification will resolve it. Persistent nondeterminism is used, e.g., for modeling faulty systems. We present a formal semantics for UML state machines enriched with an operator persistent choice that models persistent nondeterminism. This semantics is based on abstract models - μ-automata with a novel refinement relation - and a sound three-valued satisfaction relation for properties expressed in the μ-calculus. © 2009 Elsevier B.V. All rights reserved

    Improvement and Analysis of behavioural models with variability

    Product Lines or Families represent a new paradigm, widely used to describe company products with similar functionality and requirements, aimed at improving the efficiency and productivity of a company. In this context, many studies focus on finding the behavioural model best suited to describing a product family and to reasoning about properties of the family itself. In addition, the model must make it possible to describe, in a simple way, the different types of variability needed to characterize the several products of the family. One of the most important of these models is the Modal Transition System (MTS), an extension of the Labelled Transition System (LTS) that introduces two types of transitions to describe the necessary and allowed requirements. These models have been broadly studied and several extensions of them have been described; these extensions follow different approaches, which entail the introduction of ever more complex and expressive requirements. Furthermore, MTS and its extensions define a concept of refinement, which represents a step of the design process, namely a step where some allowed requirements are discarded and others become necessary. In this thesis we introduce a new model, the Constrained Modal Transition System (CMTS), a particularly expressive extension of MTS, and we study several useful properties of CMTSs. We also use CMTSs as a tool to determine and define a hierarchy of expressiveness over the known variability-oriented extensions of LTSs and MTSs. In order to check different properties of a product family, we introduce a new deontic-temporal logic, based on CTL* and interpreted over CMTSs, able to express classical safety and liveness properties as well as concepts like obligation, permission and prohibition. Finally, some useful optimizations are introduced to make verification less expensive from a complexity point of view.
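The refinement step described above has a standard formal core: modal refinement between MTSs. The sketch below implements the textbook greatest-fixpoint check (not the thesis' CMTS machinery): every required (must) step of the abstract state has to be matched by the concrete state, and every possible (may) step of the concrete state has to be allowed by the abstract one.

```python
# States are ints; transition maps are state -> action -> set of successors.
def modal_refinement(s0, t0, s_must, s_may, t_must, t_may):
    """Greatest-fixpoint check that concrete state s0 refines abstract t0."""
    states_s = set(s_must) | set(s_may)
    states_t = set(t_must) | set(t_may)
    R = {(s, t) for s in states_s for t in states_t}  # full relation
    changed = True
    while changed:            # prune pairs violating either step condition
        changed = False
        for (s, t) in list(R):
            ok = True
            # every must-step of the abstract state is matched by a must-step
            for a, succs in t_must.get(t, {}).items():
                for t2 in succs:
                    if not any((s2, t2) in R
                               for s2 in s_must.get(s, {}).get(a, ())):
                        ok = False
            # every may-step of the concrete state is allowed by a may-step
            for a, succs in s_may.get(s, {}).items():
                for s2 in succs:
                    if not any((s2, t2) in R
                               for t2 in t_may.get(t, {}).get(a, ())):
                        ok = False
            if not ok:
                R.discard((s, t))
                changed = True
    return (s0, t0) in R

# Abstract model: 'b' is required, 'a' merely allowed (must implies may).
t_must = {0: {'b': {0}}}
t_may = {0: {'a': {0}, 'b': {0}}}
# A concrete model that drops the optional 'a' still refines it...
assert modal_refinement(0, 0, {0: {'b': {0}}}, {0: {'b': {0}}}, t_must, t_may)
# ...but one that omits the required 'b' does not.
assert not modal_refinement(0, 0, {0: {}}, {0: {'a': {0}}}, t_must, t_may)
```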

    On the Role of Nondeterminism and Refinement in the Model-Driven Top-Down Development of Software Systems

    Large-scale software systems need to be accurately planned and designed. This includes the determination of requirements, the definition of specifications, and the development of models conforming to specifications. These models are expressed in modeling languages like process algebras, the Unified Modeling Language (UML), or variants of state diagrams (e.g. UML state machines or Harel's statecharts). Such modeling languages are usually underspecified, since they only express certain aspects of the system to be designed, leaving out implementation details. The process of refining such abstract descriptions in a stepwise fashion, until finally the concrete, executable implementation is reached, is called (model-driven) top-down development. Finding bugs as early as possible in this process often saves considerable development costs. This thesis considers methods for proven-to-be-correct top-down development, with specification conformance being guaranteed at all levels of abstraction, either by applying model checking techniques or by employing pre-defined refinement patterns that are already proven to be sound. In order to apply formal proof methods, models on all levels of abstraction, e.g. presented as process algebra terms or state diagrams, need to be given a precise semantics in some semantic domain, usually based on (extensions of) labeled transition systems. We call semantic domains that support underspecification refinement settings. One of the contributions of this thesis is a new kind of comparison of a dozen such settings proposed in the literature with respect to their expressible sets of implementations. This comparison is done by providing transformations that not only establish the implementation-based expressiveness hierarchy of the most commonly used refinement settings, but can also be employed to convert models between the settings, thus enabling tool reuse. 
Some kinds of abstract models require a setting as semantic domain that not only features resolvable nondeterminism expressing underspecification, but also persistent nondeterminism that is not to be resolved in refinements, as characterized by bisimulation equivalence on labeled transition systems. We show that such a setting is needed for process algebras if they specify concurrent systems, because concurrency may introduce resolvable nondeterminism, which is resolved by the scheduler of the operating system, while the choice operator common to process algebras may correspond to persistent nondeterminism. This is the first work in the literature making this observation. A simple process algebra of this kind is given an operational semantics, using the refinement setting of mu-automata, as well as a sound and complete axiomatic semantics. Sometimes state diagrams, such as UML state machines or Harel's statecharts, also require a refinement setting with both kinds of nondeterminism, because (i) they are underspecified and (ii) the underlying action language may contain operators exhibiting persistently nondeterministic behavior. This thesis is the first publication presenting a state diagram semantics with both kinds of nondeterminism. In this context, existing refinement settings like mu-automata lead to unnecessarily complex semantic models. Therefore, we develop a new refinement setting that is more succinct in this context, called nu-automata, and give a semantic mapping for a simple state diagram variant, as well as a general transformation that can be applied when extending existing semantics by persistent nondeterminism. Thus, we make state diagrams accessible to persistent nondeterminism. Support for both kinds of nondeterminism, however, does not necessarily imply the practical feasibility of top-down development in state diagrams.
In existing state diagram variants, expressing resolvable nondeterminism is only possible to a certain degree, because the notations for underspecification (i) often have no precise semantics, and (ii) are not expressive enough to reflect the requirements of the top-down development process, such as starting with interface definitions and subsequent parallel development of mostly independent modules. Therefore, we develop a new variant of state diagrams that allows more explicit and more expressive modeling of underspecification than existing variants. This variant is given a semantics in a newly developed semantic setting that distinguishes between input and output events. A set of refinement patterns is then provided that enables proven-to-be-correct stepwise refinement without the need to re-check correctness after each refinement step. Consequently, we deliver the formal foundations for the development of a state-diagram-based design tool that ensures correctness at all stages of the development process.
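Bisimulation equivalence, which the abstract uses to characterize persistent nondeterminism, can be checked with a naive partition-refinement sketch. The classic example below, a.(b + c) versus a.b + a.c, is trace-equivalent but not bisimilar, which is exactly the distinction between nondeterminism a scheduler may resolve and nondeterminism a refinement must preserve. The code is a generic textbook algorithm, not from the thesis.

```python
# Strong bisimilarity on a labelled transition system via naive partition
# refinement. trans: state -> list of (action, successor) pairs.
def bisimilar(trans, s, t):
    states = set(trans)
    for succs in trans.values():
        for _, q in succs:
            states.add(q)
    blocks = [set(states)]                    # start with one block
    def block_of(q):
        return next(i for i, b in enumerate(blocks) if q in b)
    changed = True
    while changed:                            # split until stable
        changed = False
        for b in list(blocks):
            sig = {}                          # signature -> states in b
            for q in b:
                key = frozenset((a, block_of(q2))
                                for a, q2 in trans.get(q, []))
                sig.setdefault(key, set()).add(q)
            if len(sig) > 1:
                blocks.remove(b)
                blocks.extend(sig.values())
                changed = True
    return block_of(s) == block_of(t)

# p = a.(b + c)   versus   q = a.b + a.c
lts = {
    "p": [("a", "p1")], "p1": [("b", "done"), ("c", "done")],
    "q": [("a", "q1"), ("a", "q2")],
    "q1": [("b", "done")], "q2": [("c", "done")],
    "r": [("b", "done")],
}
assert not bisimilar(lts, "p", "q")   # trace-equivalent, not bisimilar
assert bisimilar(lts, "r", "q1")      # genuinely equivalent states
```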

    Principles of Markov automata

    A substantial amount of today's engineering problems revolve around systems that are concurrent and stochastic by their nature. Solution approaches attacking these problems often rely on the availability of formal mathematical models that reflect such systems as comprehensively as possible. In this thesis, we develop a compositional model, Markov automata, that integrates concurrency, and probabilistic and timed stochastic behaviour. This is achieved by blending two well-studied constituent models, probabilistic automata and interactive Markov chains. A range of strong and weak bisimilarity notions are introduced and evaluated as candidate relations for a natural behavioural equivalence between systems. Among them, weak distribution bisimilarity stands out as a natural notion being more oblivious to the probabilistic branching structure than prior notions. We discuss compositionality, axiomatizations, decision and minimization algorithms, state-based characterizations and normal forms for weak distribution bisimilarity. In addition, we detail how Markov automata and weak distribution bisimilarity can be employed as a semantic basis for generalized stochastic Petri nets, in such a way that known shortcomings of their classical semantics are ironed out in their entirety.
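The blend of the two constituent models can be sketched as a simulator for a closed Markov automaton: immediate probabilistic action transitions (from probabilistic automata) and exponentially timed transitions (from interactive Markov chains), with action transitions taking priority under maximal progress. The state space and numbers are invented for illustration; this is not code from the thesis.

```python
import random

# Invented example automaton.
prob_trans = {                 # state -> action -> {successor: probability}
    "s0": {"tau": {"s1": 0.3, "s2": 0.7}},
}
rate_trans = {                 # state -> {successor: exponential rate}
    "s1": {"s3": 2.0},
    "s2": {"s3": 0.5, "s1": 0.5},
}

def step(state, rng):
    """One simulation step; returns (next_state, sojourn_time)."""
    if state in prob_trans:    # maximal progress: actions pre-empt delays
        dist = next(iter(prob_trans[state].values()))  # single action here
        succs, probs = zip(*dist.items())
        return rng.choices(succs, probs)[0], 0.0
    rates = rate_trans[state]
    total = sum(rates.values())              # exit rate of the state
    delay = rng.expovariate(total)           # exponential sojourn time
    succs, weights = zip(*rates.items())     # race won proportionally
    return rng.choices(succs, weights)[0], delay

rng = random.Random(7)
s, t = step("s0", rng)
assert s in {"s1", "s2"} and t == 0.0        # immediate probabilistic step
```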

    Hierarchical and compositional verification of cryptographic protocols

Two important approaches to the verification of security protocols are known under the general names of symbolic and computational, respectively. In the symbolic approach messages are terms of an algebra and the cryptographic primitives are ideally secure; in the computational approach messages are bitstrings and the cryptographic primitives are secure with overwhelming probability. This means, for example, that in the symbolic approach only those who know the decryption key can decrypt a ciphertext, while in the computational approach the probability of decrypting a ciphertext without knowing the decryption key is negligible. Usually, cryptographic protocols are the outcome of the interaction of several components: some of them are based on cryptographic primitives, other components on other principles. In general, the result is a complex system that we would like to analyse in a modular way instead of studying it as a single system. A similar situation can be found in the context of distributed systems, where there are several probabilistic components that interact with each other implementing a distributed algorithm.
In this context, the analysis of the correctness of a complex system is very rigorous and is based on tools from information theory, such as the simulation method, which allows us to decompose large problems into smaller ones and to verify systems hierarchically and compositionally. The simulation method consists of establishing relations between the states of two automata, called simulation relations, and of verifying that such relations satisfy appropriate step conditions: each transition of the simulated system can be matched by the simulating system up to the given relation. Using a compositional approach we can study the properties of each small problem independently of the others, and then derive the properties of the overall system. Furthermore, hierarchical verification allows us to build several intermediate refinements between specification and implementation. Often hierarchical and compositional verification is simpler and cleaner than direct one-step verification, since each refinement may focus on specific homogeneous aspects of the implementation. In this thesis we introduce a new simulation relation, which we call polynomially accurate simulation, or approximated simulation, that is compositional and allows us to adopt the hierarchical approach in our analyses. Polynomially accurate simulations extend the simulation relations of the distributed-systems context, in both the strong and the weak case, taking into account the lengths of the computations and the computational properties of the cryptographic primitives. Besides polynomially accurate simulations, we provide other tools that can simplify the analysis of cryptographic protocols: the first is the concept of conditional automaton, which permits the safe removal of events that occur with negligible probability.
Starting from a machine that is attackable with negligible probability, if we build an automaton that is conditional on the absence of these attacks, then there exists a simulation between the two. This allows us to work with simulation relations all the time, and in particular we can also prove, in a compositional way, that the elimination of negligible events from an automaton is safe. This property is justified by the conditional automaton theorem, which states that events are negligible if and only if the identity relation is an approximated simulation from the automaton to its conditional counterpart. Another tool is the execution correspondence theorem, which extends the one from the distributed-systems context and justifies the hierarchical approach. In fact, the theorem states that if we have several automata and a chain of simulations between them, then with overwhelming probability each execution of the first automaton is related to an execution of the last automaton. In other words, the probability that the last automaton is not able to simulate an execution of the first one is negligible. Finally, we use the polynomially accurate simulation framework to provide families of automata that implement commonly used cryptographic primitives and to prove that the symbolic approach is sound with respect to the computational approach
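The conditioning construction can be illustrated with a minimal sketch, under invented data structures rather than the thesis' formal definitions: given a probabilistic automaton whose branches may carry a negligible "attack" event, the conditional automaton drops those branches and renormalises the surviving probabilities, i.e. it conditions each step on the absence of attacks.

```python
# trans: state -> list of (probability, successor, is_attack) branches.
def condition_on_no_attack(trans):
    """Build the conditional automaton: remove attack branches and
    renormalise the remaining probability mass in every state."""
    out = {}
    for state, branches in trans.items():
        good = [(p, q) for p, q, attack in branches if not attack]
        mass = sum(p for p, _ in good)          # probability of "no attack"
        out[state] = [(p / mass, q) for p, q in good]
    return out

# An invented machine that is attackable with small probability.
trans = {"s0": [(0.999, "s1", False), (0.001, "broken", True)]}
cond = condition_on_no_attack(trans)
assert cond["s0"] == [(1.0, "s1")]              # attack branch conditioned away
```

When the removed mass is negligible, the original and conditional automata stay related step by step, which is the intuition behind the conditional automaton theorem quoted above.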

    A Model-driven Approach for the Automatic Generation of System-Level Test Cases

    Systems at the basis of modern society, such as homeland security, environmental protection, public and private transportation, healthcare or the energy supply, depend on the correct functioning of one or more embedded systems. In several cases, such systems shall be considered critical, since the consequences of their failures may result in economic losses, damage to the environment or even injuries to human life. The possible disastrous consequences of embedded critical systems suggest that discovering flaws during system development, and avoiding their propagation to the running system, is a crucial task. In fact, most of the failures found during the usage of embedded critical systems are due to errors introduced during the early stages of system development. Thus, it is desirable to start Verification and Validation (V&V) activities during the early stages of a system's life cycle. However, such V&V activities can account for over 50% of the time and cost of a system's life cycle, so there is a need for techniques able to reduce the required resources without loss of efficiency. Among the methodologies found in the scientific and industrial literature, there is large interest in V&V automation. In particular, automatic verification can be performed during different stages of a system's development life cycle and can assume different meanings. In this thesis, the focus is on the automation of the test case generation phase, performed at the system level starting from SUT and test specifications. A related recent research trend is to support such a process by providing a flexible tool chain enabling effective Model Driven Engineering (MDE) approaches. The adoption of model-driven techniques requires modelling the SUT to drive the generation process, using suitable domain-specific modelling languages and model transformations.
Thus, a successful application of the MDE principles depends on the choice of the high-level language for SUT specification and on the tools and techniques provided to support the V&V processes. Accordingly, the model-driven approach defined in this thesis relies on three key factors: (1) the definition of new domain-specific modelling languages (DSMLs) for the SUT and the test specifications, (2) the adoption of model checking techniques to realize the generation of the test cases, and (3) the implementation of a concrete framework providing a complete tool chain supporting the automation process. This work is partially carried out within the ARTEMIS European project CRYSTAL (CRitical sYSTem engineering AcceLeration). CRYSTAL is strongly industry-oriented and aims at achieving technical innovation through a user-driven approach based on the idea of applying engineering methods to industrially relevant use cases from the automotive, aerospace, rail and healthcare sectors. The DSML that will be presented in this thesis emerged as an attempt to address the modelling requirements and the design practices of the industrial partners of the project within a rigorous and well-founded formal specification and verification approach. In fact, the main requirement for a modelling language suitable for industry is to be small and as simple as possible. Thus, the modelling language should provide an adequate set of primitive constructs to allow for natural modelling of the system of interest. Furthermore, the larger the gap between the design specification and the actual implementation, the less useful the results of the design analysis would be. The test case generation is supported by model checking techniques: the SUT and test models are translated into specifications expressed in the language adopted by a model checker.
The thesis discusses all the issues addressed in the mapping process and provides their implementations by means of model transformations. A class of test specifications is addressed to exemplify the generation process over a common class of reachability requirements. The model-driven approach discussed in the thesis is applied in the context of railway control systems, and in particular to some of the key functionalities of the Radio Block Centre, the main component of the ERTMS/ETCS standards for the interoperability of railway control systems in the European Community. The thesis is organized as follows. Chapter 1 introduces embedded critical systems and outlines the main research trends related to their V&V process. Chapter 2 outlines the state of the art in testing automation, with a particular focus on model-driven approaches for automatic test generation, and also provides the technical background needed to understand the development process of the supporting framework. Chapter 3 describes the context of the CRYSTAL project and the proposed model-driven approach partially involved in its activities. Chapter 4 describes the domain-specific modelling languages defined for the modelling of the SUT specifications and of the test generation outcomes; moreover, the guidelines defined for modelling test specifications are discussed. Chapter 5 focuses on the mapping process that enables the translation of the high-level language for the modelling of the SUT specification into the language adopted by the chosen model checker. The implementation of the overall framework is addressed in Chapter 6, where the model transformations realizing the defined mappings and the architecture of the Test Case Generator (TCG) framework are described and discussed. Chapter 7 shows the results of the application of the approach in the context of railway control systems, and in particular to the Radio Block Centre system, a key component of the ERTMS/ETCS standard. Chapter 8 ends the thesis with some concluding remarks
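The core generation idea, phrasing a test specification as a reachability goal and extracting the witness path (what a model checker returns as a counterexample to "the goal is never reached") as a system-level test case, can be sketched with a breadth-first search. The model and state names are invented placeholders, not the thesis' RBC models or its model checker.

```python
from collections import deque

def generate_test(model, init, goal):
    """BFS for a path init -> goal; returns the action sequence
    (a test case) or None when the goal is unreachable."""
    queue = deque([(init, [])])
    seen = {init}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path                      # shortest witness = test case
        for action, succ in model.get(state, []):
            if succ not in seen:
                seen.add(succ)
                queue.append((succ, path + [action]))
    return None

# Invented toy SUT model: state -> list of (action, successor) pairs.
model = {
    "idle":    [("request", "waiting")],
    "waiting": [("grant", "moving"), ("deny", "idle")],
    "moving":  [("release", "idle")],
}
assert generate_test(model, "idle", "moving") == ["request", "grant"]
```

A real model checker additionally handles data, temporal-logic goals and huge state spaces symbolically; the sketch only shows how a reachability witness doubles as a test case.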