Data Definitions in the ACL2 Sedan
We present a data definition framework that enables the convenient
specification of data types in ACL2s, the ACL2 Sedan. Our primary motivation
for developing the data definition framework was pedagogical. We were teaching
undergraduate students how to reason about programs using ACL2s and wanted to
provide them with an effective method for defining, testing, and reasoning
about data types in the context of an untyped theorem prover. Our framework is
now routinely used not only for pedagogical purposes, but also by advanced
users.
Our framework concisely supports common data definition patterns, e.g., list
types, map types, and record types. It also provides support for polymorphic
functions. A distinguishing feature of our approach is that we maintain both a
predicative and an enumerative characterization of data definitions.
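The dual predicative/enumerative view can be loosely illustrated outside ACL2. The following Python sketch is hypothetical (it is not the ACL2s defdata API): a data definition pairs a recognizer predicate with an enumerator, and the invariant that every enumerated value satisfies the predicate is what makes the same definition usable for both reasoning and counterexample generation.

```python
def nat_list_p(x):
    """Predicative view: recognize lists of natural numbers."""
    return isinstance(x, list) and all(isinstance(e, int) and e >= 0 for e in x)

def enum_nat_list(n):
    """Enumerative view: map an index n to some list of naturals.

    The base-4 digit decomposition is illustrative only; any total
    surjection from indices to values would do.
    """
    result = []
    while n > 0:
        result.append(n % 4)
        n //= 4
    return result

# Key invariant linking the two characterizations: everything the
# enumerator produces is accepted by the recognizer.
assert all(nat_list_p(enum_nat_list(i)) for i in range(100))
```

A counterexample generator can then draw candidate values from the enumerator while a prover reasons with the predicate, without the two views drifting apart.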
In this paper we present our data definition framework via a sequence of
examples. We give a complete characterization in terms of tau rules of the
inclusion/exclusion relations a data definition induces, under suitable
restrictions. The data definition framework is a key component of
counterexample generation support in ACL2s, but can be independently used in
ACL2, and is available as a community book.
Comment: In Proceedings ACL2 2014, arXiv:1406.123
Polymorphic Types in ACL2
This paper describes a tool suite for the ACL2 programming language which
incorporates certain ideas from the Hindley-Milner paradigm of functional
programming (as exemplified in popular languages like ML and Haskell),
including a "typed" style of programming with the ability to define polymorphic
types. These ideas are introduced via macros into the language of ACL2, taking
advantage of ACL2's guard-checking mechanism to perform type checking on both
function definitions and theorems. Finally, we discuss how these macros were
used to implement features of Specware, a software specification and
implementation system.
Comment: In Proceedings ACL2 2014, arXiv:1406.123
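The guard-checking style of type enforcement can be loosely illustrated in Python. This is a hypothetical sketch, not the paper's macros or ACL2 syntax: a "guard" predicate is checked at each call, while the function body itself stays polymorphic over element types.

```python
def guarded(pre):
    """Attach a guard predicate that is checked before each call,
    in the spirit of ACL2's guard-checking mechanism."""
    def wrap(f):
        def checked(*args):
            assert pre(*args), "guard violation"
            return f(*args)
        return checked
    return wrap

# A polymorphic operation: it works for any element type, constrained
# only by a structural guard on its argument.
@guarded(lambda xs: isinstance(xs, list) and len(xs) > 0)
def head(xs):
    return xs[0]
```

Here `head` is usable at any element type, e.g. `head([1, 2])` and `head(["a", "b"])`, while a call like `head([])` fails its guard; the Hindley-Milner flavor comes from stating the constraint once rather than per instantiation.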
Extracting Imperative Programs from Proofs: In-place Quicksort
The process of program extraction is primarily associated with functional
programs, with far less attention paid to imperative program extraction. In
this paper we consider a standard problem of imperative programming: in-place
Quicksort. We formalize a proof that every array of natural numbers can be
sorted and apply a realizability interpretation to extract a program from the
proof. Using monads we are able to exhibit the inherently imperative nature of
the extracted program. We see this as a first step towards the automated
extraction of imperative programs. The case study is carried out in the
interactive proof assistant Minlog.
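For reference, the target program of the case study is the familiar in-place Quicksort, sketched here in Python (the extraction itself is carried out in Minlog with a monadic presentation; this sketch only shows the imperative algorithm, using Lomuto partitioning as one standard choice).

```python
def quicksort(a, lo=0, hi=None):
    """Sort a[lo..hi] in place using Lomuto partitioning."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        pivot = a[hi]
        i = lo
        # Move every element <= pivot to the front of the segment.
        for j in range(lo, hi):
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]  # put the pivot in its final slot
        quicksort(a, lo, i - 1)
        quicksort(a, i + 1, hi)

xs = [3, 1, 4, 1, 5, 9, 2, 6]
quicksort(xs)
# xs == [1, 1, 2, 3, 4, 5, 6, 9]
```

The array is mutated in place with no auxiliary lists, which is exactly the imperative character that the monadic presentation of the extracted program makes explicit.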
From LCF to Isabelle/HOL
Interactive theorem provers have developed dramatically over the past four
decades, from primitive beginnings to today's powerful systems. Here, we focus
on Isabelle/HOL and its distinctive strengths. They include automatic proof
search, borrowing techniques from the world of first-order theorem proving, but
also the automatic search for counterexamples. They include a highly readable
structured language of proofs and a unique interactive development environment
for editing live proof documents. Everything rests on the foundation conceived
by Robin Milner for Edinburgh LCF: a proof kernel, using abstract types to
ensure soundness and eliminate the need to store proofs. Compared with the
research prototypes of the 1970s, Isabelle is a practical and versatile tool.
It is used by system designers, mathematicians and many others.
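The LCF kernel idea mentioned above can be sketched in a few lines. This is a toy illustration (real kernels enforce abstraction with ML module types, and Python cannot truly hide the capability): theorems form an abstract type whose constructor only trusted inference rules can invoke, so soundness rests on the kernel alone and no proof objects need to be stored.

```python
class Theorem:
    """Abstract theorem type: only kernel rules may construct values."""
    _key = object()  # capability held only by the trusted kernel

    def __init__(self, prop, key):
        if key is not Theorem._key:
            raise ValueError("theorems can only be made by kernel rules")
        self.prop = prop

# Trusted kernel rules are the sole producers of Theorem values.
def axiom_refl(t):
    """|- t = t"""
    return Theorem(("eq", t, t), Theorem._key)

def rule_sym(thm):
    """From |- a = b derive |- b = a."""
    op, a, b = thm.prop
    assert op == "eq"
    return Theorem(("eq", b, a), Theorem._key)

th = rule_sym(axiom_refl("x"))
```

Any derived tool, however complex, can only ever produce a `Theorem` by composing kernel rules, which is why the approach eliminates the need to store and re-check proofs.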
Mechanized Metatheory for the Masses: The POPLmark Challenge
How close are we to a world where every paper on programming languages is accompanied by an electronic appendix with machine-checked proofs?
We propose an initial set of benchmarks for measuring progress in this area. Based on the metatheory of System F, a typed lambda-calculus with second-order polymorphism, subtyping, and records, these benchmarks embody many aspects of programming languages that are challenging to formalize: variable binding at both the term and type levels, syntactic forms with variable numbers of components (including binders), and proofs demanding complex induction principles. We hope that these benchmarks will help clarify the current state of the art, provide a basis for comparing competing technologies, and motivate further research.
Applying Formal Methods to Networking: Theory, Techniques and Applications
Despite its great importance, modern network infrastructure is remarkable for
the lack of rigor in its engineering. The Internet which began as a research
experiment was never designed to handle the users and applications it hosts
today. The lack of formalization of the Internet architecture meant limited
abstractions and modularity, especially for the control and management planes,
so that every new need required a new protocol built from scratch. This led
to an unwieldy, ossified Internet architecture resistant to any attempt at
formal verification, and an Internet culture where expediency and pragmatism
are favored over formal correctness. Fortunately, recent work in the space of
clean slate Internet design---especially, the software defined networking (SDN)
paradigm---offers the Internet community another chance to develop the right
kind of architecture and abstractions. This has also led to a great resurgence
of interest in applying formal methods to the specification, verification, and
synthesis of networking protocols and applications. In this paper, we present a
self-contained tutorial of the formidable amount of work that has been done in
formal methods, and present a survey of their applications to networking.
Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorial
Theorem Provers as Libraries -- An Approach to Formally Verifying Functional Programs
Property-directed verification of functional programs tends to take one of two paths. The first is the traditional testing approach, where properties are expressed in the original programming language and checked with a collection of test data. Alternatively, for those desiring a more rigorous approach, properties can be written and checked with a formal tool; typically, an external proof system. This dissertation details a hybrid approach that captures the best of both worlds: the formality of a proof system paired with the native integration of an embedded domain-specific language (EDSL) for testing. At the heart of this hybridization is the titular concept -- a theorem prover as a library. The verification capabilities of this prover, HaskHOL, are introduced to a Haskell development environment as a GHC compiler plugin. Operating at the compiler level provides for a comparatively simpler integration and allows verification to co-exist with the numerous other passes that stand between source code and program.