4,479 research outputs found
A methodology for the requirements analysis of critical real-time systems
PhD Thesis
This thesis describes a methodology for the requirements analysis of critical real-time
systems. The methodology is based on formal methods, and provides a systematic way
in which requirements can be analysed and specifications produced. The proposed
methodology consists of a framework with distinct phases of analysis, a set of techniques
appropriate for the issues to be analysed at each phase of the framework, a hierarchical
structure of the specifications obtained from the process of analysis, and techniques to
perform quality assessment of the specifications.
The phases of the framework, which are abstraction levels for the analysis of the
requirements, follow directly from a general structure adopted for critical real-time
systems. The intention is to define abstraction levels, or domains, in which the analysis
of requirements can be performed in terms of specific properties of the system, thus
reducing the inherent complexity of the analysis.
Depending on the issues to be analysed in each domain, the choice of the appropriate
formalism is determined by the set of features, related to that domain, that a formalism
should possess. In this work, instead of proposing new formalisms we concentrate on
identifying and enumerating those features that a formalism should have.
The specifications produced at each phase of the framework are organised by means of
a specification hierarchy, which facilitates our assessment of the quality of the
requirements specifications, and their traceability. Such an assessment should be
performed by qualitative and quantitative means in order to obtain high confidence
(assurance) that the level of safety is acceptable.
In order to exemplify the proposed methodology for the requirements analysis of critical
real-time systems we discuss a case study based on a crossing of two rail tracks (in a
model railway), which raises safety issues that are similar to those found at a traditional
level crossing (i.e. rail-road).
CAPES/Ministry of Education (Brazil)
Using Inhabitation in Bounded Combinatory Logic with Intersection Types for Composition Synthesis
We describe ongoing work on a framework for automatic composition synthesis
from a repository of software components. This work is based on combinatory
logic with intersection types. The idea is that components are modeled as typed
combinators, and an algorithm for inhabitation (is there a
combinatory term e with type tau relative to an environment Gamma?)
can be used to synthesize compositions. Here, Gamma represents
the repository in the form of typed combinators, tau specifies the synthesis
goal, and e is the synthesized program. We illustrate our approach by examples,
including an application to synthesis from GUI-components.
Comment: In Proceedings ITRS 2012, arXiv:1307.784
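The inhabitation question above can be sketched concretely. The following is a minimal illustration, not the paper's algorithm: it handles only simple arrow types rather than bounded combinatory logic with intersection types, and the repository entries (readFile, parse, render, input) are hypothetical names invented for the example.

```python
# Minimal sketch of inhabitation-driven composition synthesis.
# Components are typed combinators in an environment Gamma; we search for a
# term e of a goal type tau by trying each combinator whose result type
# matches tau and recursively inhabiting its argument types.
# A type is either a string (base type) or a tuple ("->", arg, result).

def result_type(t):
    return result_type(t[2]) if isinstance(t, tuple) else t

def arg_types(t):
    return [t[1]] + arg_types(t[2]) if isinstance(t, tuple) else []

def inhabit(gamma, tau, depth=5):
    """Return a term (nested s-expression) of type tau, or None."""
    if depth == 0:
        return None
    for name, ty in gamma.items():
        if result_type(ty) == tau:
            args = []
            for a in arg_types(ty):
                sub = inhabit(gamma, a, depth - 1)
                if sub is None:
                    break
                args.append(sub)
            else:
                return [name] + args if args else name
    return None

# Hypothetical repository Gamma and goal tau = "Html".
gamma = {
    "readFile": ("->", "Path", "Text"),
    "parse":    ("->", "Text", "Doc"),
    "render":   ("->", "Doc", "Html"),
    "input":    "Path",
}
print(inhabit(gamma, "Html"))
# -> ['render', ['parse', ['readFile', 'input']]]
```

The synthesized term is the composition render(parse(readFile(input))); the depth bound plays the role of the "bounded" in bounded combinatory logic, keeping the search finite.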
Metamodel-based model conformance and multiview consistency checking
Model-driven development, using languages such as UML and BON, often makes use of multiple diagrams (e.g., class and sequence diagrams) when modeling systems. These diagrams, presenting different views of a system of interest, may be inconsistent. A metamodel provides a unifying framework in which to ensure and check consistency, while at the same time providing the means to distinguish between valid and invalid models, that is, conformance. Two formal specifications of the metamodel for an object-oriented modeling language are presented, and it is shown how to use these specifications for model conformance and multiview consistency checking. Comparisons are made in terms of completeness and the level of automation each provides for checking multiview consistency and model conformance. The lessons learned from applying formal techniques to the problems of metamodeling, model conformance, and multiview consistency checking are summarized
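One multiview consistency rule of the kind the abstract describes can be sketched as follows. This is an illustration only, assuming a simple dict-based encoding of the two views; the class names, operations, and the single rule checked here are invented for the example and are not the paper's formal metamodel specifications.

```python
# Sketch of one multiview consistency check: every message sent in a
# sequence diagram must name an operation declared on the receiving class
# in the class diagram.

def inconsistencies(class_diagram, sequence_diagram):
    """Return messages whose receiver's class lacks the named operation."""
    errors = []
    for sender, receiver, operation in sequence_diagram["messages"]:
        cls = sequence_diagram["lifelines"][receiver]   # lifeline -> class
        declared = class_diagram.get(cls, {}).get("operations", [])
        if operation not in declared:
            errors.append((receiver, cls, operation))
    return errors

# Hypothetical views of a small banking model.
class_diagram = {
    "Account": {"operations": ["deposit", "withdraw"]},
    "Bank":    {"operations": ["open_account"]},
}
sequence_diagram = {
    "lifelines": {"a": "Account", "b": "Bank"},
    "messages": [("b", "a", "deposit"),
                 ("b", "a", "close")],   # 'close' is not declared
}
print(inconsistencies(class_diagram, sequence_diagram))
# -> [('a', 'Account', 'close')]
```

A metamodel-based checker generalises this: the rule becomes a well-formedness constraint stated once over the metamodel, rather than an ad-hoc function per pair of diagram types.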
Generating a Performance Stochastic Model from UML Specifications
Since its initiation by Connie Smith, Software Performance Engineering (SPE)
has become a growing concern. The idea is to bring performance evaluation into
the software design process, allowing software designers to determine the
performance of software during design. Several approaches have been proposed
to provide such techniques. Some of them derive from a UML (Unified Modeling
Language) model a performance model such as a Stochastic Petri Net (SPN) or
Stochastic Process Algebra (SPA) model. Our work belongs to the same category:
we derive from a UML model a Stochastic Automata Network (SAN) in order to
obtain performance predictions. Our approach is more flexible due to the
modularity of SANs and their close resemblance to UML state-chart diagrams
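The pipeline the abstract outlines can be illustrated in a much-reduced form: a single state-chart read as a continuous-time Markov chain and solved for its steady-state distribution, from which measures such as utilisation follow. A real SAN composes several such automata with synchronising events; the states and rates below are invented for the example.

```python
# Much-reduced sketch: one state-chart as a CTMC, solved for steady state.

states = ["idle", "busy", "done"]
rates = {("idle", "busy"): 2.0,   # requests arrive
         ("busy", "done"): 1.0,   # service completes
         ("done", "idle"): 4.0}   # result is consumed

n = len(states)
idx = {s: i for i, s in enumerate(states)}

# Generator matrix Q: off-diagonal entries are rates, diagonal = -row sum.
Q = [[0.0] * n for _ in range(n)]
for (src, dst), r in rates.items():
    Q[idx[src]][idx[dst]] = r
    Q[idx[src]][idx[src]] -= r

# Uniformisation: P = I + Q/Lam is a stochastic matrix with the same
# stationary vector; iterate pi <- pi * P until it converges.
Lam = max(-Q[i][i] for i in range(n)) + 1.0
pi = [1.0 / n] * n
for _ in range(10000):
    pi = [sum(pi[i] * ((1.0 if i == j else 0.0) + Q[i][j] / Lam)
              for i in range(n)) for j in range(n)]

utilisation = pi[idx["busy"]]
print({s: round(pi[idx[s]], 3) for s in states})
# -> {'idle': 0.286, 'busy': 0.571, 'done': 0.143}
```

Here the performance prediction is the server utilisation 4/7; in a SAN the same computation is carried out on a product state space built compositionally from the per-automaton descriptors.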
Mechanizing a Process Algebra for Network Protocols
This paper presents the mechanization of a process algebra for Mobile Ad hoc
Networks and Wireless Mesh Networks, and the development of a compositional
framework for proving invariant properties. Mechanizing the core process
algebra in Isabelle/HOL is relatively standard, but its layered structure
necessitates special treatment. The control states of reactive processes, such
as nodes in a network, are modelled by terms of the process algebra. We propose
a technique based on these terms to streamline proofs of inductive invariance.
This is not sufficient, however, to state and prove invariants that relate
states across multiple processes (entire networks). To this end, we propose a
novel compositional technique for lifting global invariants stated at the level
of individual nodes to networks of nodes.
Comment: This paper is an extended version of arXiv:1407.3519. The
Isabelle/HOL source files, and a full proof document, are available in the
Archive of Formal Proofs, at http://afp.sourceforge.net/entries/AWN.shtm
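The notion of inductive invariance used above can be made concrete on a toy example. This is a sketch only, far simpler than the mechanised AWN development: a finite transition system checked exhaustively rather than a process algebra proved in Isabelle/HOL, and the toy protocol (a node whose sequence number only grows) is invented for the illustration.

```python
# An invariant I is inductive if it holds in every initial state and is
# preserved by every transition out of a state satisfying I.

def is_inductive(initials, step, invariant, states):
    """Exhaustively check inductiveness on a finite state space."""
    if not all(invariant(s) for s in initials):
        return False
    return all(invariant(t)
               for s in states if invariant(s)
               for t in step(s))

# State: (mode, sequence_number); broadcasting bumps the number.
states = [(m, n) for m in ("wait", "send") for n in range(5)]

def step(state):
    mode, n = state
    if mode == "wait" and n < 4:
        return [("send", n + 1)]
    if mode == "send":
        return [("wait", n)]
    return []

initials = [("wait", 0)]
invariant = lambda s: s[1] >= 0   # sequence numbers never go negative

print(is_inductive(initials, step, invariant, states))  # -> True
```

The paper's contribution is what this sketch cannot do: stating an invariant once per node and lifting it compositionally to an entire network, instead of enumerating the (unbounded) global state space.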
Leveraging Semantic Web Service Descriptions for Validation by Automated Functional Testing
Recent years have seen the utilisation of Semantic Web Service descriptions for automating a wide range of service-related activities, with a primary focus on service discovery, composition, execution and mediation. An important area which so far has received less attention is service validation, whereby advertised services are proven to conform to required behavioural specifications. This paper proposes a method for validation of service-oriented systems through automated functional testing. The method leverages ontology-based and rule-based descriptions of service inputs, outputs, preconditions and effects (IOPE) for constructing a stateful EFSM specification. The specification is subsequently utilised for functional testing and validation using the proven Stream X-machine (SXM) testing methodology. Complete functional test sets are generated automatically at an abstract level and are then applied to concrete Web services, using test drivers created from the Web service descriptions. The testing method comes with completeness guarantees and provides a strong method for validating the behaviour of Web services
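One ingredient of FSM-based test generation of the kind used above can be sketched: a transition cover, i.e. for every transition, a shortest input sequence that reaches its source state and then fires it. The full Stream X-machine method goes further, combining such covers with characterisation sequences to obtain its completeness guarantees; the vending-machine FSM below is invented for the example.

```python
# Transition cover for a deterministic FSM via breadth-first search.
from collections import deque

def shortest_paths(fsm, start):
    """BFS: shortest input sequence reaching each state from `start`."""
    paths, queue = {start: ()}, deque([start])
    while queue:
        s = queue.popleft()
        for (src, inp), dst in fsm.items():
            if src == s and dst not in paths:
                paths[dst] = paths[s] + (inp,)
                queue.append(dst)
    return paths

def transition_cover(fsm, start):
    reach = shortest_paths(fsm, start)
    return sorted(reach[src] + (inp,) for (src, inp) in fsm)

fsm = {  # (state, input) -> next state
    ("idle", "coin"):   "paid",
    ("paid", "choose"): "vend",
    ("paid", "refund"): "idle",
    ("vend", "take"):   "idle",
}
for test in transition_cover(fsm, "idle"):
    print(test)
```

Each printed tuple is an abstract test sequence; in the method described above, a test driver generated from the Web service description would translate these abstract inputs into concrete service invocations.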
S+Net: extending functional coordination with extra-functional semantics
This technical report introduces S+Net, a compositional coordination language
for streaming networks with extra-functional semantics. Compositionality
simplifies the specification of complex parallel and distributed applications;
extra-functional semantics allow the application designer to reason about and
control resource usage, performance and fault handling. The key feature of
S+Net is that functional and extra-functional semantics are defined
orthogonally to each other. S+Net can be seen as a simultaneous
simplification and extension of the existing coordination language S-Net, that
gives control of extra-functional behavior to the S-Net programmer. S+Net can
also be seen as a transitional research step between S-Net and AstraKahn,
another coordination language currently being designed at the University of
Hertfordshire. In contrast with AstraKahn, which constitutes a re-design from
the ground up, S+Net preserves the basic operational semantics of S-Net and
thus provides an incremental introduction of extra-functional control in an
existing language.
Comment: 34 pages, 11 figures, 3 tables
Qualitative networks: a symbolic approach to analyze biological signaling networks
BACKGROUND: A central goal of Systems Biology is to model and analyze biological signaling pathways that interact with one another to form complex networks. Here we introduce Qualitative networks, an extension of Boolean networks. With this framework, we use formal verification methods to check whether a model is consistent with the laboratory experimental observations on which it is based. If the model does not conform to the data, we suggest a revised model and the new hypotheses are tested in-silico. RESULTS: We consider networks in which elements range over a small finite domain allowing more flexibility than Boolean values, and add target functions that allow us to model a rich set of behaviors. We propose a symbolic algorithm for analyzing the steady state of these networks, allowing us to scale up to a system consisting of 144 elements and state spaces of approximately 10^86 states. We illustrate the usefulness of this approach through a model of the interaction between the Notch and the Wnt signaling pathways in mammalian skin, and its extensive analysis. CONCLUSION: We introduce an approach for constructing computational models of biological systems that extends the framework of Boolean networks and uses formal verification methods for the analysis of the model. This approach can scale to multicellular models of complex pathways, and is therefore a useful tool for the analysis of complex biological systems. The hypotheses formulated during in-silico testing suggest new avenues to explore experimentally. Hence, this approach has the potential to efficiently complement experimental studies in biology
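The qualitative-network semantics described above can be sketched in miniature: each element ranges over a small finite domain {0..N} and, at each step, moves one unit toward the value of its target function; a steady state is a state mapped to itself. The paper analyses such networks symbolically to reach 144 elements; the brute-force enumeration below only works for tiny examples, and the two-element mutual-inhibition pair is invented for the illustration.

```python
# Steady states of a tiny qualitative network by exhaustive enumeration.
from itertools import product

N = 2  # each element ranges over the domain {0, 1, 2}

def sign(x):
    return (x > 0) - (x < 0)

def step(state, targets):
    """Each element moves one unit toward its target function's value."""
    return tuple(v + sign(targets[i](state) - v) for i, v in enumerate(state))

# Toy mutual inhibition: each element's target is N minus the other's level.
targets = [lambda s: N - s[1],
           lambda s: N - s[0]]

steady = [s for s in product(range(N + 1), repeat=2)
          if step(s, targets) == s]
print(steady)
# -> [(0, 2), (1, 1), (2, 0)]
```

The three fixed points correspond to the two mutually exclusive expression patterns plus an intermediate level, a distinction Boolean networks cannot make; the symbolic algorithm in the paper finds such steady states without enumerating the state space.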
- …