Proof-Producing Symbolic Execution for Binary Code Verification
We propose a proof-producing symbolic execution for verification of
machine-level programs. The analysis is based on a set of core inference rules
that are designed to give control over the tradeoff between preservation of
precision and the introduction of overapproximation to make the application to
real world code useful and tractable. We integrate our symbolic execution in a
binary analysis platform that features a low-level intermediate language
enabling the application of analyses to many different processor architectures.
The overall framework is implemented in the theorem prover HOL4 to be able to
obtain highly trustworthy verification results. We demonstrate our approach to
establish sound execution time bounds for a control loop program implemented
for an ARM Cortex-M0 processor.
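The core idea of path-sensitive symbolic execution (running a program on symbolic inputs while collecting the branch conditions that hold on each path) can be illustrated with a minimal sketch. The toy IR, instruction names, and state model below are illustrative assumptions, not the paper's HOL4 formalization:

```python
def sym_exec(prog, pc, store, path_cond, leaves):
    """Explore every path of a toy IR program, recording a
    (path condition, final symbolic store) pair per path."""
    while pc < len(prog):
        op = prog[pc]
        if op[0] == "assign":              # ("assign", dst, expr)
            _, dst, expr = op
            store = dict(store, **{dst: expr(store)})
            pc += 1
        elif op[0] == "branch":            # ("branch", cond, target)
            _, cond, target = op
            c = cond(store)
            # Fork: one path assumes the condition and jumps...
            sym_exec(prog, target, dict(store), path_cond + [c], leaves)
            # ...the other assumes its negation and falls through.
            path_cond = path_cond + [("not", c)]
            pc += 1
        else:                              # ("halt",)
            break
    leaves.append((path_cond, store))

# x := n; if x > 10 goto halt; x := x + 1; halt
prog = [
    ("assign", "x", lambda s: s["n"]),
    ("branch", lambda s: (">", s["x"], 10), 3),
    ("assign", "x", lambda s: ("+", s["x"], 1)),
    ("halt",),
]
leaves = []
sym_exec(prog, 0, {"n": "n0"}, [], leaves)
for cond, store in leaves:
    print(cond, "=>", store["x"])
```

Each leaf pairs a path condition with a symbolic final state; in a proof-producing setting, each such pair would be discharged as a theorem about the corresponding concrete executions.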
Featherweight VeriFast
VeriFast is a leading research prototype tool for the sound modular
verification of safety and correctness properties of single-threaded and
multithreaded C and Java programs. It has been used as a vehicle for
exploration and validation of novel program verification techniques and for
industrial case studies; it has served well at a number of program verification
competitions; and it has been used for teaching by multiple teachers
independent of the authors. However, until now, while VeriFast's operation has
been described informally in a number of publications, and specific
verification techniques have been formalized, a clear and precise exposition of
how VeriFast works has not yet appeared. In this article we present for the
first time a formal definition and soundness proof of a core subset of the
VeriFast program verification approach. The exposition aims to be both
accessible and rigorous: the text is based on lecture notes for a graduate
course on program verification, and it is backed by an executable
machine-readable definition and machine-checked soundness proof in Coq.
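VeriFast verifies code by symbolic execution over a separation-logic heap: precondition assertions are *produced* (added to a symbolic heap of chunks) and assertions to be established are *consumed* (matched and removed, failing if absent). A toy model of that produce/consume discipline, with an illustrative chunk representation that is an assumption of this sketch:

```python
from collections import Counter

def produce(heap, chunk):
    """Add a chunk to the symbolic heap (assume an assertion)."""
    h = Counter(heap)
    h[chunk] += 1
    return h

def consume(heap, chunk):
    """Remove a chunk from the symbolic heap (check an assertion),
    failing verification if the chunk is not present."""
    if heap[chunk] == 0:
        raise AssertionError(f"cannot consume missing chunk {chunk}")
    h = Counter(heap)
    h[chunk] -= 1
    return h

# Verify {x |-> 3} *x = 4 {x |-> 4} by symbolic execution.
heap = produce(Counter(), ("pointsto", "x", 3))  # produce precondition
heap = consume(heap, ("pointsto", "x", 3))       # store consumes old chunk...
heap = produce(heap, ("pointsto", "x", 4))       # ...and produces updated one
heap = consume(heap, ("pointsto", "x", 4))       # consume postcondition
print("verified, leaked chunks:", sum(heap.values()))
```

The multiset discipline is what makes the verification modular: a command may only touch chunks it can consume, so framing comes for free.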
A nonparametric analysis of the Cournot model
An observer makes a number of observations of an industry producing a homogeneous good. Each observation consists of the market price, the output of individual firms and perhaps information on each firm's production cost. We provide various tests (typically, linear programs) with which the observer can determine if the data set is consistent with the hypothesis that firms in this industry are playing a Cournot game at each observation. When cost information is wholly or partially unavailable, these tests could potentially be used to derive cost information on the firms. This paper is a contribution to the literature that aims to characterize (in various contexts) the restrictions that a data set must satisfy for it to be consistent with Nash outcomes in a game. It is also inspired by the seminal result of Afriat (and the subsequent literature) which addresses similar issues in the context of consumer demand, though one important technical difference from most of these results is that the objective functions of firms in a Cournot game are not necessarily quasiconcave.
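The tests in the paper are linear programs; their feasibility conditions can be illustrated by checking a candidate certificate against the Cournot first-order condition. This is a simplified sketch (interior solutions, linear local demand, candidate marginal costs supplied directly), not the paper's full test:

```python
def cournot_consistent(observations, slopes, marg_costs, tol=1e-9):
    """Check a candidate certificate for Cournot rationalizability.

    observations: list of (market price P_t, list of firm outputs q_it)
    slopes:       candidate inverse-demand slopes P'_t, one per observation
    marg_costs:   candidate marginal costs m_it, one list per observation

    At each observation the interior Cournot first-order condition
        P_t + q_it * P'_t = m_it
    must hold, with P'_t <= 0 and m_it >= 0.
    """
    for (P, qs), s, ms in zip(observations, slopes, marg_costs):
        if s > tol:
            return False                  # demand must slope down
        for q, m in zip(qs, ms):
            if m < -tol:
                return False              # marginal cost must be nonnegative
            if q > 0 and abs(P + q * s - m) > tol:
                return False              # first-order condition fails
    return True

# One observation: price 10, two firms each producing 2 units; a slope
# of -1 forces marginal cost 10 + 2*(-1) = 8 for both firms.
obs = [(10.0, [2.0, 2.0])]
print(cournot_consistent(obs, [-1.0], [[8.0, 8.0]]))   # consistent
print(cournot_consistent(obs, [-1.0], [[5.0, 8.0]]))   # inconsistent
```

The actual tests search over all such certificates at once, which is what makes them linear programs rather than point checks.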
A Computable Measure of Algorithmic Probability by Finite Approximations with an Application to Integer Sequences
Given the widespread use of lossless compression algorithms to approximate
algorithmic (Kolmogorov-Chaitin) complexity, and that lossless compression
algorithms fall short at characterizing patterns other than statistical ones,
making them no different from entropy estimations, here we explore an alternative and
complementary approach. We study formal properties of a Levin-inspired measure
calculated from the output distribution of small Turing machines. We
introduce and justify finite approximations that have been used in some
applications as an alternative to lossless compression algorithms for
approximating algorithmic (Kolmogorov-Chaitin) complexity. We provide proofs of
the relevant properties of both measures and compare them to Levin's
Universal Distribution. We provide error estimations of the finite approximations
with respect to the full measure. Finally, we present an application to integer sequences from the Online
Encyclopedia of Integer Sequences which suggests that our AP-based measures may
characterize non-statistical patterns, and we report interesting correlations
with textual, function, and program description lengths of the said sequences.
Comment: As accepted by the journal Complexity (Wiley/Hindawi).
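The measure described above can be approximated by enumerating a small machine space, collecting the output frequency distribution, and taking -log2 of a string's frequency as its complexity estimate. The sketch below uses toy 2-state, 2-symbol Turing machines, a far smaller space than the paper's, purely to illustrate the construction:

```python
from collections import Counter
from itertools import product
from math import log2

def run(machine, max_steps=20):
    """Run a 2-state, 2-symbol Turing machine from a blank tape.
    machine maps (state, read symbol) -> (write, move, next state);
    returns the visited tape contents on halting, else None."""
    tape, pos, state = {}, 0, "A"
    for _ in range(max_steps):
        write, move, state = machine[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += 1 if move == "R" else -1
        if state == "H":
            lo, hi = min(tape), max(tape)
            return "".join(str(tape[c]) for c in range(lo, hi + 1))
    return None  # treated as non-halting within the step bound

entries = list(product([0, 1], "LR", "ABH"))    # 12 possible transitions
keys = [("A", 0), ("A", 1), ("B", 0), ("B", 1)]
outputs = Counter()
for combo in product(entries, repeat=4):        # 12**4 = 20736 machines
    out = run(dict(zip(keys, combo)))
    if out is not None:
        outputs[out] += 1

total = sum(outputs.values())
D = {s: n / total for s, n in outputs.items()}  # output distribution
for s, p in sorted(D.items(), key=lambda kv: -kv[1])[:3]:
    print(repr(s), round(-log2(p), 2))          # complexity estimate -log2 D(s)
```

Strings produced by many small machines get high probability and hence low complexity estimates, which is the coding-theorem relationship the measure exploits.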
Automated Discharging Arguments for Density Problems in Grids
Discharging arguments demonstrate a connection between local structure and
global averages. This makes them an effective tool for proving lower bounds on
the density of special sets in infinite grids. However, the minimum density of
an identifying code in the hexagonal grid remains open, with a gap between the
best known upper and lower bounds. We present a new, experimental framework for producing discharging
arguments using an algorithm. This algorithm replaces the lengthy case analysis
of human-written discharging arguments with a linear program that produces the
best possible lower bound using the specified set of discharging rules. We use
this framework to present a new lower bound on
the density of an identifying code in the hexagonal grid, and also find several
sharp lower bounds for variations on identifying codes in the hexagonal,
square, and triangular grids.
Comment: This is an extended abstract, with 10 pages, 2 appendices, 5 tables,
and 2 figures.
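The object being bounded, an identifying code, requires every vertex's closed neighborhood to intersect the code in a nonempty set that is distinct from every other vertex's. On a small finite graph the minimum density can be found outright by brute force; the sketch below does this for a cycle as a toy stand-in for the infinite grids in the abstract:

```python
from itertools import combinations

def closed_nbhd(v, n):
    """Closed neighborhood of vertex v in the cycle C_n."""
    return {(v - 1) % n, v, (v + 1) % n}

def is_identifying_code(code, n):
    """code identifies C_n if every closed neighborhood meets the
    code in a nonempty set, and all such sets are pairwise distinct."""
    ids = []
    for v in range(n):
        s = frozenset(closed_nbhd(v, n) & code)
        if not s:
            return False
        ids.append(s)
    return len(set(ids)) == n

def min_identifying_code(n):
    """Smallest identifying code of C_n, by exhaustive search."""
    for k in range(1, n + 1):
        for cand in combinations(range(n), k):
            if is_identifying_code(set(cand), n):
                return set(cand)
    return None

code = min_identifying_code(8)
print(len(code), len(code) / 8)   # minimum size and density on C_8
```

In an infinite grid no such exhaustive search is possible, which is exactly why discharging arguments (and the LP framework that automates them) are needed to bound the density from below.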
Formal Verification of Differential Privacy for Interactive Systems
Differential privacy is a promising approach to privacy preserving data
analysis with a well-developed theory for functions. Despite recent work on
implementing systems that aim to provide differential privacy, the problem of
formally verifying that these systems have differential privacy has not been
adequately addressed. This paper presents the first results towards automated
verification of source code for differentially private interactive systems. We
develop a formal probabilistic automaton model of differential privacy for
systems by adapting prior work on differential privacy for functions. The main
technical result of the paper is a sound proof technique based on a form of
probabilistic bisimulation relation for proving that a system modeled as a
probabilistic automaton satisfies differential privacy. The novelty lies in the
way we track quantitative privacy leakage bounds using a relation family
instead of a single relation. We illustrate the proof technique on a
representative automaton motivated by PINQ, an implemented system that is
intended to provide differential privacy. To make our proof technique easier to
apply to realistic systems, we prove a form of refinement theorem and apply it
to show that a refinement of the abstract PINQ automaton also satisfies our
differential privacy definition. Finally, we begin the process of automating
our proof technique by providing an algorithm for mechanically checking a
restricted class of relations from the proof technique.
Comment: 65 pages with 1 figure.
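For a mechanism with finitely many inputs and outputs, the differential privacy condition itself is directly checkable: the log-ratio of output probabilities on neighboring inputs must never exceed epsilon. A minimal sketch of that check, using randomized response as the example mechanism (the function names are illustrative):

```python
from math import exp, log

def max_leakage(mech, neighbor_pairs, outputs):
    """Worst-case |log(P[a -> o] / P[b -> o])| over neighboring inputs;
    the mechanism is eps-differentially private iff this is <= eps."""
    worst = 0.0
    for a, b in neighbor_pairs:
        for o in outputs:
            p, q = mech(a, o), mech(b, o)
            if p > 0 and q > 0:
                worst = max(worst, abs(log(p / q)))
            elif p != q:
                return float("inf")   # one-sided zero: unbounded leakage
    return worst

# Randomized response: report the true bit with probability
# e^eps / (1 + e^eps), otherwise flip it.
eps = log(3.0)                        # truth probability 3/4
p_true = exp(eps) / (1 + exp(eps))

def rr(bit, out):
    return p_true if out == bit else 1 - p_true

leak = max_leakage(rr, [(0, 1), (1, 0)], [0, 1])
print(leak <= eps + 1e-9)             # randomized response meets its budget
```

The relation-family technique in the paper generalizes this per-output ratio bound to interactive systems, where the leakage budget must be tracked across an unbounded sequence of queries rather than a single release.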