Effective Interactive Proofs for Higher-Order Imperative Programs
We present a new approach for constructing and verifying higher-order, imperative programs using the Coq proof assistant. We build on past work on the Ynot system, which is based on Hoare Type Theory. That original system was a proof of concept, where every program verification was accomplished via laborious manual proofs, with much code devoted to uninteresting low-level details. In this paper, we present a re-implementation of Ynot that makes it possible to implement fully verified, higher-order imperative programs with a reasonable proof burden. At the same time, our new system is implemented entirely in Coq source files, showcasing the versatility of that proof assistant as a platform for research on language design and verification. Both versions of the system have been evaluated with case studies in the verification of imperative data structures, such as hash tables with higher-order iterators. The verification burden in our new system is reduced by at least an order of magnitude compared to the old system, by replacing manual proof with automation. The core of the automation is a simplification procedure for implications in higher-order separation logic, with hooks that allow programmers to add domain-specific simplification rules.
We argue for the effectiveness of our infrastructure by verifying a number of data structures and a packrat parser, and we compare with similar efforts within other projects. Compared with competing approaches to data structure verification, our system includes much less code that must be trusted; namely, about a hundred lines of Coq code defining a program logic. All of our theorems and decision procedures have or build machine-checkable correctness proofs from first principles, removing opportunities for tool bugs to create faulty verifications.
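To convey the flavor of such a simplification procedure, here is a toy sketch in Haskell (an illustration under our own assumptions, not Ynot's actual Coq/Ltac machinery; all names are hypothetical): it flattens separating conjunctions, applies user-supplied rewrite rules through a hook, and cancels conjuncts common to both sides of an entailment, leaving a residual obligation.

    import Data.List ((\\))

    -- Atomic separation-logic assertions; Star is separating conjunction.
    data Assn = Emp | PtsTo String String | Pure String | Star Assn Assn
      deriving (Eq, Show)

    -- Flatten a *-conjunction into its atomic conjuncts (Emp is the unit).
    conjuncts :: Assn -> [Assn]
    conjuncts Emp        = []
    conjuncts (Star p q) = conjuncts p ++ conjuncts q
    conjuncts a          = [a]

    -- Hook for domain-specific rules: a rule may rewrite one conjunct
    -- into several (e.g. unfolding a list-segment predicate).
    type Rule = Assn -> Maybe [Assn]

    applyRules :: [Rule] -> [Assn] -> [Assn]
    applyRules rules = concatMap step
      where
        step a = case [as | r <- rules, Just as <- [r a]] of
          (as : _) -> as
          []       -> [a]

    -- Simplify an entailment P ==> Q by cancelling common conjuncts;
    -- the residual pair is what remains to be proved.
    simplify :: [Rule] -> Assn -> Assn -> ([Assn], [Assn])
    simplify rules p q = (lhs \\ rhs, rhs \\ lhs)
      where
        lhs = applyRules rules (conjuncts p)
        rhs = applyRules rules (conjuncts q)

    -- Example: x |-> 1 * emp * y |-> 2 ==> y |-> 2 * x |-> 1
    -- cancels completely, printing ([],[]).
    main :: IO ()
    main = print (simplify []
      (Star (PtsTo "x" "1") (Star Emp (PtsTo "y" "2")))
      (Star (PtsTo "y" "2") (PtsTo "x" "1")))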
Facticity as the amount of self-descriptive information in a data set
Using the theory of Kolmogorov complexity, the notion of the facticity φ(x) of a string is defined as the amount of self-descriptive information it contains. It is proved that (under reasonable assumptions: the existence of an empty machine and the availability of a faithful index) facticity is definite, i.e., random strings have facticity 0, and for compressible strings 0 < φ(x) < |x|/2 + O(1). Consequently, facticity objectively measures the tension in a data set between structural and ad hoc information. For binary strings there is a so-called facticity threshold that depends on their entropy. Strings with facticity above this threshold have no optimal stochastic model and are essentially computational. The shape of the facticity-versus-entropy plot coincides with the well-known sawtooth curves observed in complex systems. The notion of factic processes is discussed. This approach overcomes problems with earlier proposals to use two-part codes to define the meaningfulness or usefulness of a data set.
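Schematically (our notation, not necessarily the paper's precise definition), facticity is the model part of an optimal two-part description: a string x is described by a model M together with the data given the model,

    \[
      K(x) \;\le\; \underbrace{K(M)}_{\text{structural}}
             \;+\; \underbrace{K(x \mid M)}_{\text{ad hoc}} \;+\; O(1),
    \]

with φ(x) taken as the structural part K(M) of a split achieving the minimum. The definiteness result quoted above then reads

    \[
      \varphi(x) = 0 \ \text{for random } x,
      \qquad
      0 < \varphi(x) < \tfrac{1}{2}\,|x| + O(1) \ \text{for compressible } x.
    \]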
Quantum Kolmogorov Complexity Based on Classical Descriptions
We develop a theory of the algorithmic information, in bits, contained in an individual pure quantum state. This extends classical Kolmogorov complexity to the quantum domain while retaining classical descriptions. Quantum Kolmogorov complexity coincides with classical Kolmogorov complexity on the classical domain. Quantum Kolmogorov complexity is upper bounded and can be effectively approximated from above under certain conditions. With high probability a quantum object is incompressible. Upper and lower bounds on the quantum complexity of multiple copies of individual pure quantum states are derived and may shed some light on the no-cloning properties of quantum states. In the quantum situation complexity is not subadditive. We discuss some relations with "no-cloning" and "approximate cloning" properties.
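For orientation, a schematic form of the central definition (our rendering; the published version fixes the details): the complexity of a pure state |x⟩ is the length of a shortest classical program producing a description of an approximating state, penalized by the fidelity of the approximation,

    \[
      K(|x\rangle) \;=\; \min_{p}\,\bigl\{\, \ell(p)
        + \lceil -\log |\langle z \mid x \rangle|^{2} \rceil
        \;:\; U(p) = |z\rangle \,\bigr\},
    \]

where U is a fixed universal machine and ℓ(p) is the length of the program p; for a classical x the penalty term vanishes and the classical definition is recovered.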
Strong Turing Degrees for Additive BSS RAM's
For additive real BSS machines that use only the constants 0 and 1 and order tests, we consider the corresponding Turing reducibility and characterize some semi-decidable decision problems over the reals. In order to refine, step by step, a linear hierarchy of Turing degrees with respect to this model, we define several halting problems for classes of additive machines with different abilities and construct further suitable decision problems. In the construction we use methods of classical recursion theory as well as techniques for proving bounds that result from algebraic properties. In this way we extend a known hierarchy of problems below the halting problem for additive machines using only equality tests, and we present a further subhierarchy of semi-decidable problems between the halting problems for additive machines using only equality tests and using order tests, respectively.
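In standard notation (ours, not necessarily the paper's), the objects being compared are halting problems of the form

    \[
      \mathbb{H}_{\mathcal{C}} \;=\;
        \{\, (\mathrm{code}(\mathcal{M}), x) \;:\;
          \mathcal{M} \in \mathcal{C} \text{ halts on input } x \,\},
    \]

for classes 𝒞 of additive machines, ordered by the Turing reducibility A ≤_T B that holds when some additive oracle machine decides A using B as its oracle; the hierarchy refines the degrees between the equality-test and order-test halting problems.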
Be My Guest: Normalizing and Compiling Programs using a Host Language
In programming language research, normalization is a process of fundamental importance to the theory of computing and reasoning about programs. In practice, on the other hand, compilation is a process that transforms programs in a language to machine code, and thus makes the programming language a usable one. In this thesis, we investigate means of normalizing and compiling programs in a language using another language as the "host". Leveraging a host to work with programs of a "guest" language enables reuse of the host's features that would otherwise be strenuous to develop. The specific tools of interest are Normalization by Evaluation and Embedded Domain-Specific Languages, both of which rely on a host language for their purposes. These tools are applied to solve problems in three different domains: to show that exponentials (or closures) can be eliminated from a categorical combinatory calculus, to propose a new proof technique based on normalization for showing noninterference, and to enable the programming of resource-constrained IoT devices from Haskell.
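As a minimal sketch of the first of these tools, assuming the untyped lambda calculus as the guest (the thesis treats richer settings, and all names here are hypothetical), Normalization by Evaluation evaluates guest terms into the host's semantic domain, where guest functions are represented by Haskell functions, and then reads values back as beta-normal terms:

    {-# LANGUAGE LambdaCase #-}

    -- Guest terms with de Bruijn indices.
    data Term = Var Int | Lam Term | App Term Term
      deriving Show

    -- Semantic domain: guest functions become host functions;
    -- neutrals are stuck terms headed by a free variable.
    data Value   = VLam (Value -> Value) | VNeutral Neutral
    data Neutral = NVar Int | NApp Neutral Value

    -- Evaluate a guest term into the host's semantic domain.
    eval :: [Value] -> Term -> Value
    eval env = \case
      Var i   -> env !! i
      Lam b   -> VLam (\v -> eval (v : env) b)
      App f a -> case eval env f of
        VLam g     -> g (eval env a)
        VNeutral n -> VNeutral (NApp n (eval env a))

    -- Read a value back into a beta-normal term; the Int counts
    -- binders passed so far, supplying fresh de Bruijn levels.
    quote :: Int -> Value -> Term
    quote d (VLam f)     = Lam (quote (d + 1) (f (VNeutral (NVar d))))
    quote d (VNeutral n) = quoteN d n

    quoteN :: Int -> Neutral -> Term
    quoteN d (NVar k)   = Var (d - k - 1)  -- level -> index
    quoteN d (NApp n v) = App (quoteN d n) (quote d v)

    normalize :: Term -> Term
    normalize t = quote 0 (eval [] t)

    -- Example: (\x. x) (\x. \y. x) normalizes to \x. \y. x.
    main :: IO ()
    main = print (normalize (App (Lam (Var 0)) (Lam (Lam (Var 1)))))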