128 research outputs found
Verifying the Mondex Case Study - The KeY Approach
The Mondex Case study is still the most substantial contribution
to the Grand Challenge repository. It has been the target of a
number of formal verification efforts. Those efforts
concentrated on correctness proofs for refinement steps of the
specification in various specification formalisms using
different verification tools. Here, the results of the full
functional verification of a Java Card implementation of the case
study are reported. The functional behavior of the application,
as well as the security properties to be proven, were formalized
in JML and verified using the KeY tool, a tool for
deductively verifying Java Card code. The
implementation developed followed, as closely as possible, the
concrete layer of the case study's original Z specification.
The result demonstrates that, with an appropriate specification
language and verification tool, it is possible to bridge the gap
between specification and implementation, ensuring a fully
verified result. The complete material - source code, proofs
and binaries of the verification system - is available at
http://www.key-project.org/case_studies/mondex.htm
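As a rough illustration of the kind of JML annotation the abstract refers to, the hypothetical method below carries a pre/postcondition contract in JML's comment syntax. The class and field names are invented for this sketch and are not taken from the verified Mondex sources:

```java
// Hypothetical sketch of a JML-specified purse operation (illustrative
// names; not the actual verified Mondex implementation).
public class Purse {
    private int balance;              // current purse balance, in cents

    public Purse(int initial) { balance = initial; }

    /*@ requires amount > 0 && amount <= balance;
      @ ensures balance == \old(balance) - amount;
      @*/
    public void decrease(int amount) {
        balance -= amount;
    }

    public int getBalance() { return balance; }
}
```

The `requires` clause rules out overdrafts before the method runs; the `ensures` clause relates the post-state to the pre-state via `\old(...)`, which is exactly the shape of obligation a deductive verifier like KeY discharges.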
JCML: A specification language for the runtime verification of Java Card programs
Java Card is a version of Java developed to run on devices with severe storage and processing restrictions. The applets that run on these devices are frequently intended for use in critical, highly distributed, mobile conditions, and they are required to be portable and safe. Often, the requirements of the application impose the use of dynamic, on-card verification, but most of the research developed to improve the safety of Java Card applets concentrates on static verification methods. This work presents a runtime verification approach based on Design by Contract to improve the safety of Java Card applications. To this end, we propose JCML (Java Card Modelling Language), a specification language derived from JML (Java Modelling Language), and its implementation: a compiler that generates runtime verification code. We also present some experiments and quality indicators. This paper extends previously published work by the authors with a more complete and precise definition of the JCML language and new experiments and results.
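To give a sense of what Design-by-Contract runtime verification can look like, here is a hand-written Java sketch of the kind of wrapper a JCML-style compiler might generate around a specified method. The class, method names, and exception choices are assumptions for illustration, not JCML's actual output:

```java
// Illustrative sketch (not JCML-generated code) of runtime contract checks
// woven around an applet method. Names are hypothetical.
public class CreditApplet {
    private short balance;

    public CreditApplet(short initial) { balance = initial; }

    // Original method body, as the developer wrote it.
    private short creditImpl(short amount) {
        return (short) (balance + amount);
    }

    // Generated wrapper: check the precondition, snapshot the pre-state,
    // run the body, then check the postcondition.
    public void credit(short amount) {
        if (!(amount > 0)) {                        // precondition: amount > 0
            throw new IllegalArgumentException("precondition violated: amount > 0");
        }
        short old = balance;                        // snapshot for \old(balance)
        balance = creditImpl(amount);
        if (!(balance == old + amount)) {           // postcondition check
            throw new IllegalStateException("postcondition violated");
        }
    }

    public short getBalance() { return balance; }
}
```

On-card resource limits are why such generated checks must stay small; the snapshot-and-compare pattern above is the minimal machinery needed to evaluate an `\old(...)`-style postcondition at runtime.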
Formal Verification of Security Protocol Implementations: A Survey
Automated formal verification of security protocols has mostly focused on analyzing high-level abstract models which, however, are significantly different from real protocol implementations written in programming languages. Recently, some researchers have started investigating techniques that bring automated formal proofs closer to real implementations. This paper surveys these attempts, focusing on approaches that target the application code that implements protocol logic, rather than the libraries that implement cryptography. In these approaches, the libraries are assumed to correctly implement some models; the aim is to derive formal proofs that, under this assumption, give assurance about the application code that implements the protocol logic. The two main approaches of model extraction and code generation are presented, along with the main techniques adopted for each approach.
Verifying data- and control-oriented properties combining static and runtime verification: theory and tools
Static verification techniques are used to analyse and prove properties about programs before they are executed. Many of these techniques work directly on the source code
and are used to verify data-oriented properties over all possible executions. The analysis is
necessarily an over-approximation as the real executions of the program are not available
at analysis time. In contrast, runtime verification techniques have been extensively used for
control-oriented properties, analysing the current execution path of the program in a fully
automatic manner. In this article, we present a novel approach in which data-oriented and
control-oriented properties may be stated in a single formalism amenable to both static and
dynamic verification techniques. The specification language we present to achieve this is that
of ppDATEs, which enhances the control-oriented property language of DATEs with data-oriented
pre/postconditions. For runtime verification of ppDATE specifications, the language
is translated into a DATE. We give a formal semantics to ppDATEs, which we use to prove
the correctness of our translation from ppDATEs to DATEs. We show how ppDATE specifications
can be analysed using a combination of the deductive theorem prover KeY and the
runtime verification tool LARVA. Verification is performed in two steps: KeY first partially
proves the data-oriented part of the specification, simplifying the specification, which is then
passed on to LARVA to check the remaining parts of the specification at runtime, including
the control-oriented aspects. We show the applicability of our approach on two case studies.
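Since DATEs are automaton-based, the control-oriented side of a ppDATE can be pictured as a small state machine driven by program events. The following plain-Java monitor is a deliberately simplified stand-in (not LARVA-generated code) for such an automaton, checking that a resource is opened before it is read:

```java
// Simplified sketch of an automaton-style (DATE-like) runtime monitor.
// The property and event names are hypothetical.
public class FileProtocolMonitor {
    enum State { CLOSED, OPEN, ERROR }

    private State state = State.CLOSED;

    // Feed one program event ("open", "read", "write", "close") to the monitor.
    public void event(String name) {
        switch (state) {
            case CLOSED:
                // Only "open" is legal in the closed state.
                state = name.equals("open") ? State.OPEN : State.ERROR;
                break;
            case OPEN:
                // "read"/"write" keep us open; "close" returns to CLOSED.
                if (name.equals("close")) state = State.CLOSED;
                break;
            default:
                break; // ERROR is absorbing: once violated, stay violated.
        }
    }

    public boolean violated() { return state == State.ERROR; }
}
```

In the combined approach, pre/postconditions (the ppDATE's data-oriented part) would additionally guard the transitions; the parts KeY proves statically can be stripped from the monitor before deployment.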
The Mondex case study: from specifications to code
In this paper we introduce three different implementations of the Mondex electronic purse verification challenge [Woo06] [SCW00]. In previous work ([SGHR06], [SGH+07] and [HSGR06]) we verified security and correctness properties of the Mondex money transfer protocol. Here we present a way to translate the formal specifications into running Java Card code. We introduce three different ways to implement the protocol: one using symmetric cryptography, one using asymmetric cryptography, and one using special datatypes for cryptographic protocols together with symmetric cryptography. All implementations presented in this paper are able to run on a Gemplus GemxpressoRAD ProR3 SmartCard.
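For the symmetric-cryptography variant, one common way to protect a transfer message is to append a MAC computed under a key shared by the two purses. The sketch below, written against the standard javax.crypto API, is only meant to convey the idea; the message layout, field names, and algorithm choice (HMAC-SHA256) are assumptions, not details taken from the paper:

```java
import java.nio.ByteBuffer;
import java.util.Arrays;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

// Hedged sketch of authenticating a purse-to-purse transfer message with a
// shared-key MAC. Layout and names are illustrative only.
public class TransferMessage {
    static byte[] mac(byte[] key, byte[] msg) {
        try {
            Mac hmac = Mac.getInstance("HmacSHA256");
            hmac.init(new SecretKeySpec(key, "HmacSHA256"));
            return hmac.doFinal(msg);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // Encode (from, to, amount, sequence number) and append the MAC tag.
    static byte[] sealed(byte[] key, int from, int to, int amount, int seq) {
        byte[] body = ByteBuffer.allocate(16)
                .putInt(from).putInt(to).putInt(amount).putInt(seq).array();
        byte[] tag = mac(key, body);
        byte[] out = Arrays.copyOf(body, body.length + tag.length);
        System.arraycopy(tag, 0, out, body.length, tag.length);
        return out;
    }

    // Receiver recomputes the tag; any tampering with the body is detected.
    static boolean verify(byte[] key, byte[] msg) {
        byte[] body = Arrays.copyOf(msg, 16);
        byte[] tag = Arrays.copyOfRange(msg, 16, msg.length);
        return Arrays.equals(tag, mac(key, body));
    }
}
```

The sequence number in the body is what lets the protocol reject replays; the asymmetric variant would replace the shared-key MAC with a signature.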
Event analytics
The Process Analysis Toolkit (PAT) integrates the expressiveness of state-, event-, time-, and probability-based languages with the power of model checking. PAT is a self-contained reasoning system for system specification, simulation, and verification. It currently supports a wide range of 12 expressive modeling languages covering many application domains and has attracted thousands of registered users from hundreds of organizations. In this invited talk, we will present the PAT system and its vision of “Event Analytics” (EA), which goes beyond “Data Analytics”. EA research is based on applying model checking to event planning, scheduling, prediction, strategy analysis and decision making. Various new EA research directions will be discussed.
KeY + LARVA = STARVOORS
Over the past decades, various forms of automated verification techniques have been proposed and explored in the literature, mostly falling into one of two categories — static and dynamic verification. On the one hand, static verification techniques have the ability to verify properties across all possible executions of a system, but fully automated verification is typically not easy to perform. On the other hand, dynamic verification techniques, such as runtime verification, can only give feedback on single executions of the system, but are easy to automate. Despite this ease of automation, the adoption of monitoring in industry is limited — primarily due to the overheads that such monitoring typically induces in the system's execution. In this paper, we explore a novel approach in which we combine the two — using static analysis to prune parts of the specification, thus reducing the overheads of the dynamic verification process. We present our framework STARVOORS, which embodies this approach by combining the static analysis tool KeY and the dynamic verification tool LARVA, and discuss how it has been applied to the Mondex case study — an electronic purse implementation. The results presented here have been accepted for publication and build upon our previous results.
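The division of labour described above can be caricatured in a few lines of Java: checks that the static analysis discharges are dropped from the monitor, while whatever could not be proven stays as a runtime assertion. Everything in this sketch (the class, the names, and the choice of which condition counts as "proven") is hypothetical:

```java
// Illustrative sketch of the static/runtime split: the precondition is
// assumed to have been discharged statically, so only the unproven
// postcondition survives as a runtime check. Names are hypothetical.
public class MonitoredStack {
    private final int[] data = new int[16];
    private int size;

    public void push(int x) {
        // Precondition size < data.length: assumed proven by the static
        // analysis for all call sites, so no runtime check is emitted here.
        int oldSize = size;
        data[size++] = x;
        // Postcondition left to the runtime monitor (not proven statically):
        if (size != oldSize + 1) {
            throw new IllegalStateException("postcondition violated: size not incremented");
        }
    }

    public int size() { return size; }
}
```

The overhead saving is exactly the cost of the checks that never make it into the deployed monitor.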
Invariant discovery and refinement plans for formal modelling in Event-B
The continuous growth of complex systems makes the development of correct software
increasingly challenging. In order to address this challenge, formal methods offer rigorous
mathematical techniques to model and verify the correctness of systems. Refinement
is one of these techniques. By allowing a developer to incrementally introduce design
details, refinement provides a powerful mechanism for mastering the complexities that
arise when formally modelling systems. Here the focus is on a posit-and-prove style of
refinement, where a design is developed as a series of abstract models introduced via
refinement steps. Each refinement step generates proof obligations which must be discharged
in order to verify its correctness – typically requiring a user to understand the
relationship between modelling and reasoning.
This thesis focuses on techniques to aid refinement-based formal modelling, specifically,
when a user requires guidance in order to overcome a failed refinement step. An integrated
approach has been followed: combining the complementary strengths of bottom-up
theory formation, in which theories about domains are built based on basic background
information; and top-down planning, in which meta-level reasoning is used to guide the
search for correct models.
From the theory-formation perspective, we developed a technique for the automatic discovery
of invariants. Refinement requires the definition of properties, called invariants,
which relate to the design. Formulating correct and meaningful invariants can be a tedious
and challenging task. A heuristic approach to the automatic discovery of invariants has
been developed building upon simulation, proof-failure analysis and automated theory
formation. This approach exploits the close interplay between modelling and reasoning
in order to provide systematic guidance in tailoring the search for invariants for a given
model.
From the planning perspective, we propose a new technique called refinement plans.
Refinement plans provide a basis for automatically generating modelling guidance when
a step fails but is close to a known pattern of refinement. This technique combines both
modelling and reasoning knowledge and, unlike traditional pattern techniques, allows
the analysis of failure and partial matching. Moreover, when the guidance is only partially
instantiated and it is suitable to do so, refinement plans provide specialised knowledge to further
tailor the theory formation process in an attempt to fully instantiate the guidance.
We also report on a series of experiments undertaken to evaluate the approaches,
and on the implementation of both techniques in prototype tools. We believe
the techniques presented here allow the developer to focus on design decisions rather than
on analysing low-level proof failures.