The Complexity of Reasoning with FODD and GFODD
Recent work introduced Generalized First Order Decision Diagrams (GFODD) as a
knowledge representation that is useful in mechanizing decision theoretic
planning in relational domains. GFODDs generalize function-free first order
logic and include numerical values and numerical generalizations of existential
and universal quantification. Previous work presented heuristic inference
algorithms for GFODDs and implemented these heuristics in systems for decision
theoretic planning. In this paper, we study the complexity of the computational
problems addressed by such implementations. In particular, we study the
evaluation problem, the satisfiability problem, and the equivalence problem for
GFODDs under the assumption that the size of the intended model is given with
the problem, a restriction that guarantees decidability. Our results provide a
complete characterization placing these problems within the polynomial
hierarchy. The same characterization applies to the corresponding restriction
of problems in first order logic, giving an interesting new avenue for
efficient inference when the number of objects is bounded. Our results show
that for $\Sigma_k$ formulas, and for corresponding GFODDs, evaluation and
satisfiability are $\Sigma_k^p$ complete, and equivalence is $\Pi_{k+1}^p$ complete.
For $\Pi_k$ formulas evaluation is $\Pi_k^p$ complete, satisfiability is one level
higher and is $\Sigma_{k+1}^p$ complete, and equivalence is $\Pi_{k+1}^p$ complete.
Comment: A short version of this paper appears in AAAI 2014. Version 2 includes a reorganization and some expanded proofs.
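As a rough illustration of the numerical generalization of quantification involved (the concrete expressions below are assumed examples in the usual GFODD style, not taken from the paper): where first order logic evaluates a sentence such as $\exists x.\,\forall y.\,\varphi(x,y)$ to true or false over a model, a GFODD-style expression aggregates numerical values over the model's objects, for instance
\[
\max_{x} \min_{y} f(x,y)
\qquad\text{or}\qquad
\sum_{x} \max_{y} \bigl( 10\cdot[\![p(x,y)]\!] + 2\cdot[\![q(x)]\!] \bigr),
\]
so that $\max$ and $\min$ recover $\exists$ and $\forall$ when the aggregated function is $0/1$ valued. The evaluation problem studied above asks for the value of such an expression over a given finite model, and the alternation depth of the aggregation prefix plays the role that quantifier alternation plays for $\Sigma_k$ and $\Pi_k$ formulas.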
Formal Verification of Security Protocol Implementations: A Survey
Automated formal verification of security protocols has been mostly focused on analyzing high-level abstract models which, however, are significantly different from real protocol implementations written in programming languages. Recently, some researchers have started investigating techniques that bring automated formal proofs closer to real implementations. This paper surveys these attempts, focusing on approaches that target the application code that implements protocol logic, rather than the libraries that implement cryptography. In these approaches, the libraries are assumed to correctly implement some models. The aim is to derive formal proofs that, under this assumption, give assurance about the application code that implements the protocol logic. The two main approaches of model extraction and code generation are presented, along with the main techniques adopted for each approach.
On Verifying Complex Properties using Symbolic Shape Analysis
One of the main challenges in the verification of software systems is the
analysis of unbounded data structures with dynamic memory allocation, such as
linked data structures and arrays. We describe Bohne, a new analysis for
verifying data structures. Bohne verifies data structure operations and shows
that 1) the operations preserve data structure invariants and 2) the operations
satisfy their specifications expressed in terms of changes to the set of
objects stored in the data structure. During the analysis, Bohne infers loop
invariants in the form of disjunctions of universally quantified Boolean
combinations of formulas. To synthesize loop invariants of this form, Bohne
uses a combination of decision procedures for Monadic Second-Order Logic over
trees, SMT-LIB decision procedures (currently CVC Lite), and an automated
reasoner within the Isabelle interactive theorem prover. This architecture
shows that synthesized loop invariants can serve as a useful communication
mechanism between different decision procedures. Using Bohne, we have verified
operations on data structures such as linked lists with iterators and back
pointers, trees with and without parent pointers, two-level skip lists, array
data structures, and sorted lists. We have deployed Bohne in the Hob and Jahob
data structure analysis systems, enabling us to combine Bohne with analyses of
data structure clients and apply it in the context of larger programs. This
report describes the Bohne algorithm as well as techniques that Bohne uses to
reduce the number of annotations and the running time of the analysis.
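To make the invariant shape concrete, the following is a hypothetical invariant of the stated form for a loop that walks a singly linked list with a cursor $\mathit{curr}$; the predicates $\mathit{content}$, $\mathit{visited}$, and $\mathit{reach}$ are invented for this illustration and are not taken from the Bohne system:
\[
\bigl( \mathit{curr} = \mathit{null} \,\wedge\, \forall x.\; x \in \mathit{visited} \leftrightarrow x \in \mathit{content} \bigr)
\;\vee\;
\bigl( \mathit{curr} \neq \mathit{null} \,\wedge\, \forall x.\; x \in \mathit{visited} \leftrightarrow (\mathit{reach}(\mathit{first},x) \wedge \neg\,\mathit{reach}(\mathit{curr},x)) \bigr).
\]
Each disjunct describes one abstract shape of the loop state, and the universally quantified Boolean combinations inside the disjuncts are what the combination of decision procedures has to infer and discharge.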
Optimizing Abstract Abstract Machines
The technique of abstracting abstract machines (AAM) provides a systematic
approach for deriving computable approximations of evaluators that are easily
proved sound. This article contributes a complementary step-by-step process for
subsequently going from a naive analyzer derived under the AAM approach, to an
efficient and correct implementation. The end result of the process is a two to
three order-of-magnitude improvement over the systematically derived analyzer,
making it competitive with hand-optimized implementations that compute
fundamentally less precise results.
Comment: Proceedings of the International Conference on Functional Programming 2013 (ICFP 2013). Boston, Massachusetts. September, 2013.
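The abstract does not enumerate the individual optimization steps, so the Java sketch below illustrates just one optimization commonly associated with the AAM line of work, store widening: all abstract states share a single global store that only grows, and exploration is repeated until that store stabilizes. Every name in the sketch (State, Step, Store, explore) is invented for this illustration and is not claimed to match the paper's artifacts.

    import java.util.*;

    // A minimal sketch of store widening for an AAM-style analyzer. The
    // "machine" is deliberately trivial; only the worklist/global-store
    // structure is the point of the example.
    public class WidenedExploration {

        // Global abstract store: addresses mapped to sets of abstract values.
        static final class Store {
            final Map<String, Set<String>> table = new HashMap<>();
            boolean grewThisPass = false;

            void join(String addr, String value) {
                if (table.computeIfAbsent(addr, a -> new HashSet<>()).add(value)) {
                    grewThisPass = true;
                }
            }

            Set<String> lookup(String addr) {
                return table.getOrDefault(addr, Set.of());
            }
        }

        // Abstract state without a store component (the store has been widened out).
        record State(String control, String env) {}

        // Transition relation: successors of a state, reading/writing the global store.
        interface Step {
            Set<State> apply(State s, Store store);
        }

        // Explore the abstract state space with a single widened store. Because the
        // store only grows, re-running the inner worklist until the store stops
        // changing yields a sound fixed point over (seen states, global store).
        static Set<State> explore(State init, Step step) {
            Store global = new Store();
            Set<State> seen = new LinkedHashSet<>();
            do {
                global.grewThisPass = false;
                Deque<State> work = new ArrayDeque<>();
                Set<State> visited = new HashSet<>();
                work.push(init);
                while (!work.isEmpty()) {
                    State s = work.pop();
                    if (!visited.add(s)) continue;   // already handled in this pass
                    seen.add(s);
                    work.addAll(step.apply(s, global));
                }
            } while (global.grewThisPass);           // store grew: results may change
            return seen;
        }

        public static void main(String[] args) {
            // Toy "program": control point "a" binds x, "b" steps to halt.
            Step toy = (s, store) -> switch (s.control()) {
                case "a" -> { store.join("x", "Int"); yield Set.of(new State("b", s.env())); }
                case "b" -> Set.of(new State("halt", s.env()));
                default  -> Set.of();
            };
            System.out.println(explore(new State("a", "env0"), toy));
        }
    }

A standard motivation for widening in this setting is that per-state stores multiply the abstract state space, which is one source of the blow-up a naively derived analyzer suffers from.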
Bounded Refinement Types
We present a notion of bounded quantification for refinement types and show
how it expands the expressiveness of refinement typing by using it to develop
typed combinators for: (1) relational algebra and safe database access, (2)
Floyd-Hoare logic within a state transformer monad equipped with combinators
for branching and looping, and (3) using the above to implement a refined IO
monad that tracks capabilities and resource usage. This leap in expressiveness
comes via a translation to "ghost" functions, which lets us retain the
automated and decidable SMT based checking and inference that makes refinement
typing effective in practice.
Comment: 14 pages, International Conference on Functional Programming, ICFP 2015.
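To give a feel for what a bound is, the constraint below follows the usual function-composition example for bounded refinements; the paper's actual surface syntax is not reproduced here. A bound is a Horn-style constraint relating abstract refinements, e.g.
\[
\mathit{Chain}\;p\;q\;r \;\triangleq\; \forall x\,y\,z.\; q\,x\,y \Rightarrow p\,y\,z \Rightarrow r\,x\,z,
\]
and a combinator such as compose can then be given a type quantified over all $p$, $q$, $r$ satisfying $\mathit{Chain}\;p\;q\;r$. The "ghost function" translation materializes the bound as an extra, compile-time-only function argument whose refinement type states exactly this implication, which is what keeps the generated verification conditions inside decidable SMT theories.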
Resource Usage Protocols for Iterators
We discuss usage protocols for iterator objects that prevent concurrent modifications of the underlying collection while iterators are in progress. We formalize these protocols in Java-like object interfaces, enriched with separation logic contracts. We present examples of iterator clients and proofs that they adhere to the iterator protocol, as well as examples of iterator implementations and proofs that they implement the iterator interface.
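As an informal companion to the abstract, the Java sketch below shows a client that follows the usage protocol and one that violates it. The contract-style comments are an invented stand-in for the separation logic contracts described in the paper, and the run-time ConcurrentModificationException is only the dynamic analogue of the static guarantee the paper is after.

    import java.util.*;

    // Invented sketch of the iterator usage protocol: while an iterator is
    // "in progress" it holds read access to the collection, so the collection
    // must not be structurally modified until iteration finishes.
    public class IteratorProtocolDemo {

        // A client that respects the protocol: it only reads the collection
        // while the iterator is live, and modifies it after iteration ends.
        static int sumThenAppend(List<Integer> xs) {
            int sum = 0;
            // { xs.collection * (iterator borrows read access to xs) }
            Iterator<Integer> it = xs.iterator();
            while (it.hasNext()) {
                sum += it.next();      // allowed: read-only use during iteration
            }
            // { iteration finished: full permission on xs returns to the caller }
            xs.add(sum);               // allowed: structural modification afterwards
            return sum;
        }

        // A client that violates the protocol: modifying xs while the iterator
        // is live breaks the contract; java.util collections detect it at run time.
        static void violateProtocol(List<Integer> xs) {
            for (Integer x : xs) {
                if (x == 0) {
                    xs.add(42);        // structural modification during iteration
                }
            }
        }

        public static void main(String[] args) {
            List<Integer> xs = new ArrayList<>(List.of(1, 2, 3));
            System.out.println(sumThenAppend(xs));   // prints 6; xs is now [1, 2, 3, 6]
            try {
                violateProtocol(new ArrayList<>(List.of(0, 1)));
            } catch (ConcurrentModificationException e) {
                System.out.println("protocol violation detected: " + e);
            }
        }
    }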
What's Decidable About Sequences?
We present a first-order theory of sequences with integer elements,
Presburger arithmetic, and regular constraints, which can model significant
properties of data structures such as arrays and lists. We give a decision
procedure for the quantifier-free fragment, based on an encoding into the
first-order theory of concatenation; the procedure has PSPACE complexity. The
quantifier-free fragment of the theory of sequences can express properties such
as sortedness and injectivity, as well as Boolean combinations of periodic and
arithmetic facts relating the elements of the sequence and their positions
(e.g., "for all even i's, the element at position i has value i+3 or 2i"). The
resulting expressive power is orthogonal to that of the most expressive
decidable logics for arrays. Some examples demonstrate that the fragment is
also suitable to reason about sequence-manipulating programs within the
standard framework of axiomatic semantics.
Comment: Fixed a few lapses in the Mergesort example.
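As an illustration of the properties mentioned above, written in an assumed notation where $\sigma$ is the sequence, $|\sigma|$ its length, and $\sigma[i]$ the element at position $i$ (the paper's own syntax may differ): sortedness can be stated as
\[
\forall i,j.\; 0 \le i \le j < |\sigma| \;\Rightarrow\; \sigma[i] \le \sigma[j],
\]
and the quoted positional fact as
\[
\forall i.\; 0 \le i < |\sigma| \,\wedge\, 2 \mid i \;\Rightarrow\; \bigl( \sigma[i] = i + 3 \,\vee\, \sigma[i] = 2i \bigr).
\]
Capturing the meaning of such universally quantified facts about all positions without explicit quantifiers is exactly the role of the regular and Presburger-arithmetic constraints in the quantifier-free fragment.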