On Extensibility of Software Systems
This report contains the progress report written as part of the author's PhD qualifying exam. It describes initial work carried out in analyzing and improving the extensibility of software systems, including a detailed case study analyzing the extensibility of the Proof Obligation Generator (POG) of the Overture tool. Additional extension work includes improving the output format of the POG and support for additional logic systems. Future work for the remaining half of the PhD is also discussed, including ways to combine formal modelling and extensibility analysis, as well as techniques for multi-paradigm extensibility.
On the Extensibility of Formal Methods Tools
Modern software systems often have long lifespans over which they must continually evolve to meet new, and sometimes unforeseen, requirements. One way to effectively deal with this is by developing the system as a series of extensions. As requirements change, the system evolves through the addition of new extensions and, potentially, the removal of existing extensions. In order for this kind of development process to thrive, it is necessary that the system have a high level of extensibility. Extensibility is the capability of a system to support the gradual addition of new, unplanned functionalities. This dissertation investigates extensibility of software systems and focuses on a particular class of software: formal methods tools. The approach is broad in scope. Extensibility of systems is addressed in terms of design, analysis and improvement, which are carried out in terms of source code and software architecture. For additional perspective, extensibility is also considered in the context of formal modelling. The work carried out in this dissertation led to the development of various extensions to the Overture tool supporting the Vienna Development Method, including a new proof obligation generator and integration with theorem provers. Additionally, the extensibility of Overture itself was also improved and it now better supports the development and integration of various kinds of extensions. Finally, extensibility techniques have been applied to formal modelling, leading to an extensible architectural style for formal models.
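The extension-based evolution described above can be sketched as a minimal extension-point mechanism. This is an illustrative sketch only: the names (ExtensionRegistry, run_all) are assumptions for this example and are not Overture's actual plug-in API.

```python
# Minimal sketch of an extension registry: the core delegates work to
# whatever extensions are currently installed, so extensions can be
# added and removed over the system's lifetime without changing the core.
# All names here are illustrative assumptions, not Overture's API.

class ExtensionRegistry:
    def __init__(self):
        self._extensions = {}

    def register(self, name, handler):
        """Install an extension under a unique name."""
        self._extensions[name] = handler

    def unregister(self, name):
        """Remove an extension; the core keeps working without it."""
        self._extensions.pop(name, None)

    def run_all(self, model):
        """Apply every currently installed extension to the model."""
        return {name: handler(model)
                for name, handler in self._extensions.items()}
```

A hypothetical proof-obligation extension would then be one handler among many, registered and unregistered without touching the core.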
On the mechanisation of the logic of partial functions
PhD Thesis
It is well known that partial functions arise frequently in formal reasoning
about programs. A partial function may not yield a value for every member
of its domain. Terms that apply partial functions thus may not denote, and
coping with such terms is problematic in two-valued classical logic. A question
is raised: how can reasoning about logical formulae that can contain references
to terms that may fail to denote (partial terms) be conducted formally? Over
the years a number of approaches to coping with partial terms have been
documented. Some of these approaches attempt to stay within the realm
of two-valued classical logic, while others are based on non-classical logics.
However, as yet there is no consensus on which approach is the best one to
use. A comparison of numerous approaches to coping with partial terms is
presented based upon formal semantic definitions.
One approach to coping with partial terms that has received attention over
the years is the Logic of Partial Functions (LPF), which is the logic underlying
the Vienna Development Method. LPF is a non-classical three-valued logic
designed to cope with partial terms, where both terms and propositions may
fail to denote. As opposed to using concrete undefined values, undefinedness
is treated as a "gap", that is, the absence of a defined value. LPF is based
upon Strong Kleene logic, where the interpretations of the logical operators
are extended to cope with truth value "gaps".
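The Strong Kleene treatment of truth value gaps can be sketched in a few lines of code. This is an illustrative encoding only, using Python's None for a gap; it is an assumption made for this example, not the thesis's own formalisation of LPF.

```python
# Strong Kleene three-valued connectives, with None modelling a
# truth value "gap" (the absence of a defined value).

def k_not(a):
    # negation of a gap is still a gap
    return None if a is None else not a

def k_and(a, b):
    if a is False or b is False:   # one False decides the conjunction
        return False
    if a is None or b is None:     # otherwise any gap propagates
        return None
    return True

def k_or(a, b):
    if a is True or b is True:     # one True decides the disjunction
        return True
    if a is None or b is None:     # otherwise any gap propagates
        return None
    return False
```

Note that the classical law of the excluded middle is no longer a tautology here: k_or(a, k_not(a)) is itself a gap when a is a gap, which illustrates why classical proof machinery cannot simply be carried over.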
Over the years a large body of research and engineering has gone into the
development of proof based tool support for two-valued classical logic. This
has created a major obstacle that affects the adoption of LPF, since such proof
support cannot be carried over directly to LPF. Presently, there is a lack of
direct proof support for LPF.
An aim of this work is to investigate the applicability of mechanised (automated)
proof support for reasoning about logical formulae that can contain
references to partial terms in LPF. The focus of the investigation is on the basic
but fundamental two-valued classical logic proof procedure: resolution and
the associated technique of proof by contradiction. Advanced proof techniques
are built on the foundation provided by these basic
techniques, so examining their impact in LPF
is the essential and obvious starting point for investigating proof
support for LPF. The work highlights the issues that arise when applying
these basic techniques in LPF, and investigates the extent of the modifications needed to carry them over to LPF. This work provides the essential foundation
on which to facilitate research into the modification of advanced proof
techniques for LPF.
Reasoning about correctness properties of a coordination programming language
Safety-critical systems place additional requirements on the programming
language used to implement them, beyond those of traditional environments.
Examples of features that influence the suitability of a programming language
in such environments include complexity of definitions, expressive
power, bounded space and time, and verifiability. Hume is a novel programming
language whose design targets the first three of these features, which are in
some ways contradictory: fully expressive languages cannot guarantee
bounds on time and space, and low-level languages which can guarantee
space and time bounds are often complex and thus error-prone. In Hume,
this contradiction is resolved by a two-layered architecture: a high-level, fully
expressive language is built on top of a low-level coordination language
which can guarantee space and time bounds.
This thesis explores the verification of Hume programs. It targets safety
properties, which are the most important type of correctness properties,
of the low-level coordination language, which is believed to be the most
error-prone. Deductive verification in Lamport's temporal logic of actions
(TLA) is utilised, in turn validated through algorithmic experiments. This
deductive verification is mechanised by first embedding TLA in the Isabelle
theorem prover, and then embedding Hume on top of this. Verification of
temporal invariants is explored in this setting.
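The kind of algorithmic experiment used to validate an invariant can be sketched as an explicit-state reachability check over a transition system. The encoding below, and the counter example used with it, are illustrative assumptions and not the thesis's Hume/TLA embedding.

```python
# Worklist search of reachable states: an invariant holds if it is
# true of every state reachable from the initial states under the
# next-state relation. Returns a violating state, or None if none found.

def check_invariant(init_states, next_states, invariant):
    seen = set(init_states)
    frontier = list(init_states)
    while frontier:
        s = frontier.pop()
        if not invariant(s):
            return s                  # counterexample state
        for t in next_states(s):      # explore successor states
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return None                       # invariant holds everywhere reachable
```

For a counter stepping modulo 5 from state 0, the invariant "state < 5" holds in every reachable state, while "state < 4" is refuted by the reachable state 4.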
In Hume, program transformation is a key feature, often required to guarantee
space and time bounds of high-level constructs. Verification of transformations
is thus an integral part of this thesis. The work on both invariant
verification and, in particular, transformation verification has pinpointed
several weaknesses of the Hume language. Motivated and influenced by
this, an extension to Hume, called Hierarchical Hume, is developed and
embedded in TLA. Several case studies of transformation and invariant verification
of Hierarchical Hume in Isabelle are conducted, and an approach
towards a calculus for transformations is examined.
Reasoning about programs using operational semantics and the role of a proof support tool
A computer program is a text in a defined programming language; each such program can be thought of as defining state transitions: that is, the execution of a program in an initial state will result in a (or possibly one of a set of) final state(s). A program specification defines properties that relate initial and final states. For example, a specification might state that some property will hold in the final state after the program has been executed, as long as it was executed in an initial state where some other property held. Defining the semantics of a programming language fixes the behaviour of the language and gives meaning to programs that are written in the language. One straightforward way of giving the semantics of a programming language is using operational semantics, which describes a language in terms of the effect execution has on the state: a program still defines state transitions, but for an abstract state. This thesis investigates the possibility of using the operational semantic description of a programming language to reason about programs that are written in that language. Programs are shown to be correct with respect to a specification, which consists of a precondition and a postcondition. Natural deduction proofs about a program are written to show that if it is executed in a state that satisfies the precondition, then execution will result in a state that relates to the initial state such that the postcondition is true of the two states. The rules of an operational semantic description are used in the proof to show the steps a program will take to reach a final state, and the effect execution has on the state. This is contrasted with the use of axiomatic semantics, observing that using operational semantics allows us to handle a wider class of language features. The acceptability of this approach will almost certainly depend on appropriate tool support.
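The operational view described above, in which a program transforms an abstract state, can be sketched as a small interpreter for a tiny while-language. The tuple encoding and the language itself are illustrative assumptions made for this example, not the language studied in the thesis.

```python
# A toy big-step operational semantics: the abstract state is a dict of
# variables, and executing a statement maps an initial state to a final state.

def eval_expr(e, state):
    if isinstance(e, int):
        return e
    if isinstance(e, str):                 # variable lookup
        return state[e]
    op, l, r = e                           # e.g. ("+", "s", "i")
    lv, rv = eval_expr(l, state), eval_expr(r, state)
    return {"+": lv + rv, "-": lv - rv, "<": lv < rv}[op]

def exec_stmt(s, state):
    kind = s[0]
    if kind == "skip":                     # skip leaves the state unchanged
        return state
    if kind == "assign":                   # ("assign", x, e)
        _, x, e = s
        return {**state, x: eval_expr(e, state)}
    if kind == "seq":                      # ("seq", s1, s2): run s1, then s2
        _, s1, s2 = s
        return exec_stmt(s2, exec_stmt(s1, state))
    if kind == "if":                       # ("if", b, s_true, s_false)
        _, b, st, sf = s
        return exec_stmt(st if eval_expr(b, state) else sf, state)
    if kind == "while":                    # unfold the loop one step at a time
        _, b, body = s
        if eval_expr(b, state):
            return exec_stmt(s, exec_stmt(body, state))
        return state
    raise ValueError(f"unknown statement kind: {kind}")
```

For example, a loop summing i over 1..3, executed from a state satisfying the precondition s = 0 and i = 1, ends in a state where the postcondition s = 6 holds, mirroring the precondition/postcondition style of specification described above.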
A prototype proof support tool is therefore developed as a `proof of concept', to assist a user in creating these kinds of proof. The tool manages inference rules and semantic descriptions, and the layout of the proof, but does not automate the proof process.
Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems
This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.