Formal Methods Specification and Analysis Guidebook for the Verification of Software and Computer Systems
This guidebook, the second of a two-volume series, is intended to facilitate the transfer of formal methods to the avionics and aerospace community. The first volume concentrates on administrative and planning issues [NASA-95a], and the second volume focuses on the technical issues involved in applying formal methods to avionics and aerospace software systems. Hereafter, the term "guidebook" refers exclusively to the second volume of the series. The title of this second volume, A Practitioner's Companion, conveys its intent. The guidebook is written primarily for the nonexpert and requires little or no prior experience with formal methods techniques and tools. However, it does attempt to distill some of the more subtle ingredients in the productive application of formal methods. To the extent that it succeeds, those conversant with formal methods will also find the guidebook useful. The discussion is illustrated through the development of a realistic example, relevant fragments of which appear in each chapter. The guidebook focuses primarily on the use of formal methods for analysis of requirements and high-level design, the stages at which formal methods have been most productively applied. Although much of the discussion applies to low-level design and implementation, the guidebook does not discuss issues involved in the later life cycle application of formal methods.
On the Syntax of Logic and Set Theory
We introduce an extension of the propositional calculus to include abstracts
of predicates and quantifiers, employing a single rule along with a novel
comprehension schema and a principle of extensionality, which are substituted
for the Bernays postulates for quantifiers and the comprehension schemata of ZF
and other set theories. We prove that it is consistent in any finite Boolean
subset lattice. We investigate the antinomies of Russell, Cantor, Burali-Forti,
and others, and discuss the relationship of the system to other set theoretic
systems ZF, NBG, and NF. We discuss two methods of axiomatizing higher order
quantification and abstraction, and then very briefly discuss the application
of one of these methods to areas of mathematics outside of logic.

Comment: 34 pages, accepted, to appear in the Review of Symbolic Logic.
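For orientation, the naive comprehension schema that motivates such restricted systems, and the Russell instantiation that renders it inconsistent, can be stated in standard textbook notation (this is not the paper's own formalism):

```latex
% Naive comprehension: every predicate \varphi determines a set y.
\exists y \,\forall x \,\bigl( x \in y \leftrightarrow \varphi(x) \bigr)

% Russell's antinomy: instantiating \varphi(x) \equiv x \notin x
% yields, for the witnessing y, the contradiction
y \in y \leftrightarrow y \notin y
```

A replacement comprehension schema, such as the one the paper proposes, must block this instantiation while retaining enough set-formation power for ordinary mathematics.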
Special Libraries, December 1945
Volume 36, Issue 10
https://scholarworks.sjsu.edu/sla_sl_1945/1009/thumbnail.jp
Verifying data- and control-oriented properties combining static and runtime verification : theory and tools
Static verification techniques are used to analyse and prove properties about programs before they are executed. Many of these techniques work directly on the source code
and are used to verify data-oriented properties over all possible executions. The analysis is
necessarily an over-approximation as the real executions of the program are not available
at analysis time. In contrast, runtime verification techniques have been extensively used for
control-oriented properties, analysing the current execution path of the program in a fully
automatic manner. In this article, we present a novel approach in which data-oriented and
control-oriented properties may be stated in a single formalism amenable to both static and
dynamic verification techniques. The specification language we present to achieve this is that
of ppDATEs, which enhances the control-oriented property language of DATEs with data-oriented
pre/postconditions. For runtime verification of ppDATE specifications, the language
is translated into a DATE. We give a formal semantics to ppDATEs, which we use to prove
the correctness of our translation from ppDATEs to DATEs. We show how ppDATE specifi-
cations can be analysed using a combination of the deductive theorem prover KeY and the
runtime verification tool LARVA. Verification is performed in two steps: KeY first partially
proves the data-oriented part of the specification, simplifying the specification which is then
passed on to LARVA to check at runtime for the remaining parts of the specification including
the control-oriented aspects. We show the applicability of our approach on two case studies.
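The two-step workflow can be pictured with a small sketch (Python is used here purely for illustration; the actual approach targets Java programs, with KeY as the prover and LARVA as the monitor, and all names in this example are hypothetical): clauses a static prover has already discharged are skipped, and only the residue of the specification is checked at runtime.

```python
# Illustrative sketch only: the real approach uses ppDATE specifications,
# the KeY theorem prover, and the LARVA runtime monitor on Java programs.

def monitored(pre_clauses, post_clauses):
    """Attach data-oriented pre/postconditions to a function.

    Each clause is (name, predicate, statically_proved): clauses that the
    static prover has already discharged are skipped at runtime, mirroring
    how partial static proofs simplify the monitored specification."""
    def decorate(fn):
        def wrapper(*args):
            for name, pred, proved in pre_clauses:
                if not proved and not pred(*args):
                    raise AssertionError(f"precondition violated: {name}")
            result = fn(*args)
            for name, pred, proved in post_clauses:
                if not proved and not pred(result, *args):
                    raise AssertionError(f"postcondition violated: {name}")
            return result
        return wrapper
    return decorate

# Hypothetical example: integer square root, with the postcondition
# marked as statically proved (True) and the precondition left to the
# runtime monitor (False).
@monitored(
    pre_clauses=[("nonnegative input", lambda n: n >= 0, False)],
    post_clauses=[("root bounds n", lambda r, n: r * r <= n < (r + 1) ** 2, True)],
)
def isqrt(n):
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
    return r
```

Here only the unproved precondition costs anything at call time; the statically discharged postcondition is never re-checked, which is the intended payoff of combining KeY with LARVA.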
Discovering ePassport Vulnerabilities using Bisimilarity
We uncover privacy vulnerabilities in the ICAO 9303 standard implemented by
ePassports worldwide. These vulnerabilities, confirmed by ICAO, enable an
ePassport holder who recently passed through a checkpoint to be reidentified
without opening their ePassport. This paper explains how bisimilarity was used
to discover these vulnerabilities, which exploit the BAC protocol - the
original ICAO 9303 standard ePassport authentication protocol - and remain
valid for the PACE protocol, which improves on the security of BAC in the
latest ICAO 9303 standards. In order to tackle such bisimilarity problems, we
develop here a chain of methods for the applied π-calculus including a
symbolic under-approximation of bisimilarity, called open bisimilarity, and a
modal logic, called classical FM, for describing and certifying attacks.
Evidence is provided to argue for a new scheme for specifying such
unlinkability problems that more accurately reflects the capabilities of an
attacker.
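The reidentification idea can be sketched abstractly (this is an illustrative model, not the ICAO 9303 message formats or the paper's bisimilarity analysis): a tag that answers a replayed authentication message with distinguishable errors leaks whether it is the passport from the earlier session.

```python
# Illustrative sketch of the linkability idea: distinguishable error
# responses to a replayed message let an attacker reidentify a passport.
# Keys, nonces, and message shapes here stand in for the real BAC details.

import hmac, hashlib, os

def mac(key, msg):
    return hmac.new(key, msg, hashlib.sha256).digest()

class Passport:
    def __init__(self):
        self.key = os.urandom(16)   # per-passport secret (stands in for BAC keys)
        self.nonce = None

    def challenge(self):
        self.nonce = os.urandom(8)
        return self.nonce

    def respond(self, nonce, tag):
        if not hmac.compare_digest(tag, mac(self.key, nonce)):
            return "mac_error"      # wrong passport: MAC check fails first
        if nonce != self.nonce:
            return "nonce_error"    # right passport, but a replayed session
        return "ok"

def attacker_links(observed_nonce, observed_tag, passport):
    """Replay an eavesdropped (nonce, tag) pair at a later checkpoint.

    Which error comes back reveals whether this is the same passport
    whose earlier session the attacker observed."""
    passport.challenge()            # fresh session, fresh nonce
    return passport.respond(observed_nonce, observed_tag) == "nonce_error"
```

A bisimilarity proof of unlinkability would have to show the two error behaviours indistinguishable; the sketch makes visible why distinguishable responses break that equivalence.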
AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments
This report considers the application of Artificial Intelligence (AI) techniques to
the problem of misuse detection and misuse localisation within telecommunications
environments. A broad survey of techniques is provided, covering inter alia
rule-based systems, model-based systems, case-based reasoning, pattern matching,
clustering and feature extraction, artificial neural networks, genetic algorithms,
artificial immune systems, agent-based systems, data mining, and a variety of hybrid
approaches. The report then considers the central issue of event correlation, that
is at the heart of many misuse detection and localisation systems. The notion of
being able to infer misuse by the correlation of individual temporally distributed
events within a multiple data stream environment is explored, along with a range of techniques
covering model-based approaches, `programmed' AI, and machine learning
paradigms. It is found that, in general, correlation is best achieved via rule-based approaches,
but that these suffer from a number of drawbacks, such as the difficulty of
developing and maintaining an appropriate knowledge base, and the lack of ability
to generalise from known misuses to new unseen misuses. Two distinct approaches
are evident. One attempts to encode knowledge of known misuses, typically within
rules, and use this to screen events. This approach cannot generally detect misuses
for which it has not been programmed, i.e. it is prone to issuing false negatives.
The other attempts to `learn' the features of event patterns that constitute normal
behaviour, and, by observing patterns that do not match expected behaviour, detect
when a misuse has occurred. This approach is prone to issuing false positives,
i.e. inferring misuse from innocent patterns of behaviour that the system was not
trained to recognise. Contemporary approaches are seen to favour hybridisation,
often combining detection or localisation mechanisms for both abnormal and normal
behaviour, the former to capture known cases of misuse, the latter to capture
unknown cases. In some systems, these mechanisms even work together to update
each other to increase detection rates and lower false positive rates. It is concluded
that hybridisation offers the most promising future direction, but that a rule or state
based component is likely to remain, being the most natural approach to the correlation
of complex events. The challenge, then, is to mitigate the weaknesses of
canonical programmed systems such that learning, generalisation and adaptation
are more readily facilitated.
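The two detection approaches and their hybridisation can be illustrated with a toy sketch (event names, the rule, and the threshold are invented for illustration and are not taken from the report): a rule-based component correlates temporally ordered events against known misuse patterns, while a frequency baseline flags events that deviate from learned normal behaviour.

```python
# Toy hybrid misuse-detection sketch: rules catch known misuse patterns
# (prone to false negatives on unseen misuses), while an anomaly baseline
# catches deviations from normal behaviour (prone to false positives).

from collections import Counter

KNOWN_MISUSE_RULES = [
    # (name, ordered event pattern that must occur as a subsequence)
    ("toll_fraud_probe", ["login_fail", "login_fail", "login_ok", "intl_call"]),
]

def rule_alerts(events):
    """Correlate a temporally ordered event stream against known-misuse rules."""
    alerts = []
    for name, pattern in KNOWN_MISUSE_RULES:
        it = iter(events)  # shared iterator: checks pattern as a subsequence
        if all(any(e == p for e in it) for p in pattern):
            alerts.append(name)
    return alerts

def anomaly_alerts(events, baseline, threshold=0.05):
    """Flag event types rarer in the training data than the threshold frequency."""
    total = sum(baseline.values())
    return sorted({e for e in events if baseline.get(e, 0) / total < threshold})

# The anomaly component is trained on "normal" traffic only.
normal = ["login_ok", "local_call"] * 50 + ["intl_call"] * 2
baseline = Counter(normal)
```

In a hybrid system the two alert streams would feed each other, e.g. confirmed rule alerts retraining the baseline, which is the mutual-update arrangement the report singles out as promising.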
Graphical representation of canonical proof: two case studies
An interesting problem in proof theory is to find representations of proof that do
not distinguish between proofs that are ‘morally’ the same. For many logics, the presentation
of proofs in a traditional formalism, such as Gentzen’s sequent calculus, introduces
artificial syntactic structure called ‘bureaucracy’; e.g., an arbitrary ordering
of freely permutable inferences. A proof system that is free of bureaucracy is called
canonical for a logic. In this dissertation two canonical proof systems are presented,
for two logics: a notion of proof nets for additive linear logic with units, and ‘classical
proof forests’, a graphical formalism for first-order classical logic.
Additive linear logic (or sum–product logic) is the fragment of linear logic consisting
of linear implication between formulae constructed only from atomic formulae and
the additive connectives and units. Up to an equational theory over proofs, the logic
describes categories in which finite products and coproducts occur freely. A notion of
proof nets for additive linear logic is presented, providing canonical graphical representations
of the categorical morphisms and constituting a tractable decision procedure
for this equational theory. From existing proof nets for additive linear logic without
units by Hughes and Van Glabbeek (modified to include the units naively), canonical
proof nets are obtained by a simple graph rewriting algorithm called saturation. Main
technical contributions are the substantial correctness proof of the saturation algorithm,
and a correctness criterion for saturated nets.
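The saturation algorithm itself is specific to additive proof nets, but its general shape, applying rewrite rules until a fixpoint is reached, can be sketched generically (an illustrative skeleton only, not the dissertation's algorithm):

```python
# Generic fixpoint-saturation skeleton. The actual saturation algorithm
# rewrites additive proof nets; here we close a plain set of facts under
# rules to show the control structure.

def saturate(facts, rules):
    """Close `facts` under `rules` by repeated application until no rule
    contributes anything new. Each rule maps the current set to a set of
    derived facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            new = rule(facts) - facts
            if new:
                facts |= new
                changed = True
    return facts

# Toy instance: transitive closure of a relation, obtained by saturating
# under a single transitivity rule.
def transitivity(edges):
    return {(a, d) for (a, b) in edges for (c, d) in edges if b == c}

closure = saturate({(1, 2), (2, 3), (3, 4)}, [transitivity])
```

As in the proof-net setting, correctness of such a procedure splits into termination (the rewrite adds only finitely many new elements) and a criterion characterising the saturated objects.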
Classical proof forests are a canonical, graphical proof formalism for first-order
classical logic. Related to Herbrand’s Theorem and backtracking games in the style
of Coquand, the forests assign witnessing information to quantifiers in a structurally
minimal way, reducing a first-order sentence to a decidable propositional one. A similar
formalism ‘expansion tree proofs’ was presented by Miller, but not given a method
of composition. The present treatment adds a notion of cut, and investigates the possibility
of composing forests via cut-elimination. Cut-reduction steps take the form
of a rewrite relation that arises from the structure of the forests in a natural way.
Yet reductions are intricate, and initially not well-behaved: from perfectly ordinary
cuts, reduction may reach unnaturally configured cuts that may not be reduced. Cut-elimination
is shown using a modified version of the rewrite relation, inspired by the
game-theoretic interpretation of the forests, for which weak normalisation is shown,
and strong normalisation is conjectured. In addition, by a more intricate argument,
weak normalisation is also shown for the original reduction relation.