Set Theory or Higher Order Logic to Represent Auction Concepts in Isabelle?
When faced with the question of how to represent properties in a formal proof
system, any user has to make design decisions. We have proved three of the
theorems from Maskin's 2004 survey article on Auction Theory using the
Isabelle/HOL system, and we have produced verified code for combinatorial
Vickrey auctions. A fundamental question in this was how to represent some
basic concepts: since set theory is available inside Isabelle/HOL, when
introducing new definitions there is often a trade-off between
set-theoretical objects and objects expressed in terms more typical of
higher-order logic, such as functions or lists. Likewise, a user often has to
decide whether to use a constructive or a non-constructive definition. Such
decisions have consequences for the proof
development and the usability of the formalization. For instance, sets are
usually closer to the representation that economists would use and recognize,
while the other objects are closer to the extraction of computational content.
In this paper we give examples of the advantages and disadvantages for these
approaches and their relationships. In addition, we present the corresponding
Isabelle library of definitions and theorems, most prominently those dealing
with relations and quotients.
Comment: Preprint of a paper accepted for the forthcoming CICM 2014 conference (cicm-conference.org/2014): S.M. Watt et al. (Eds.): CICM 2014, LNAI 8543, Springer International Publishing Switzerland 2014. 16 pages, 1 figure
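As a loose illustration of the representational trade-off this abstract describes (a hypothetical Python sketch, not the authors' Isabelle formalization), a relation can be kept as a set of pairs, which is closer to the notation economists would recognize, or as a function, which is closer to extractable computational content:

```python
# Hypothetical illustration: two representations of the same bid data.
# Set-style: a relation as a set of (bidder, bid) pairs.
bids_rel = {("alice", 10), ("bob", 30), ("carol", 20)}

# Function-style: the same data as a mapping, closer to executable code.
bids_fun = {"alice": 10, "bob": 30, "carol": 20}

def winner_from_relation(rel):
    """Pick the bidder with the maximal bid from a set of pairs."""
    return max(rel, key=lambda pair: pair[1])[0]

def winner_from_function(fun):
    """Pick the bidder with the maximal bid from a mapping."""
    return max(fun, key=fun.get)

assert winner_from_relation(bids_rel) == winner_from_function(bids_fun) == "bob"
```

Either view determines the other here; the cost of the set-style version is paid when extracting code, the cost of the function-style version when matching textbook definitions.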
On Constructive Axiomatic Method
In this last version of the paper one may find a critical overview of some
recent philosophical literature on Axiomatic Method and Genetic Method.
Comment: 25 pages, no figures
Doing and Showing
The persistent gap between formal and informal mathematics is due to an
inadequate notion of mathematical theory behind current formalization
techniques. I mean the (informal) notion of axiomatic theory according to which
a mathematical theory consists of a set of axioms and further theorems deduced
from these axioms according to certain rules of logical inference. Thus the
usual notion of axiomatic method is inadequate and needs a replacement.
Comment: 54 pages, 2 figures
Z2SAL: a translation-based model checker for Z
Despite being widely known and accepted in industry, the Z formal specification language has not so far been well supported by automated verification tools, mostly because of the challenges in handling the abstraction of the language. In this paper we discuss a novel approach to building a model-checker for Z, which involves implementing a translation from Z into SAL, the input language for the Symbolic Analysis Laboratory, a toolset which includes a number of model-checkers and a simulator. The Z2SAL translation deals with a number of important issues, including: mapping unbounded, abstract specifications into bounded, finite models amenable to a BDD-based symbolic checker; converting a non-constructive and piecemeal style of functional specification into a deterministic, automaton-based style of specification; and supporting the rich set-based vocabulary of the Z mathematical toolkit. This paper discusses progress made towards implementing as complete and faithful a translation as possible, while highlighting certain assumptions, respecting certain limitations and making use of available optimisations. The translation is illustrated throughout with examples; and a complete working example is presented, together with performance data.
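The bounding step can be pictured with a toy analogy (a hypothetical Python sketch; Z2SAL itself emits SAL input for a BDD-based symbolic checker): an unbounded Z-style state space is cut down to a finite range so that an exhaustive checker can explore it.

```python
# Hypothetical analogy, not Z2SAL itself: bound an unbounded Z-style
# state space to a finite range for exhaustive exploration.
BOUND = 5  # finite stand-in for the unbounded natural numbers

# A Z-like operation "x' = x + 1" becomes a finite transition relation
# over 0..BOUND, wrapping at the bound to stay closed.
transitions = {(x, (x + 1) % (BOUND + 1)) for x in range(BOUND + 1)}

def reachable(initial, trans):
    """Explicit-state reachability: a crude analogue of symbolic search."""
    seen, frontier = {initial}, [initial]
    while frontier:
        s = frontier.pop()
        for (a, b) in trans:
            if a == s and b not in seen:
                seen.add(b)
                frontier.append(b)
    return seen

# Every bounded state is reachable from 0 under the wrapped increment.
assert reachable(0, transitions) == set(range(BOUND + 1))
```

A real translation must also justify that properties checked on the bounded model say something about the unbounded original, which is where the paper's stated assumptions and limitations come in.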
Mechanized semantics
The goal of this lecture is to show how modern theorem provers---in this
case, the Coq proof assistant---can be used to mechanize the specification of
programming languages and their semantics, and to reason over individual
programs and over generic program transformations, as typically found in
compilers. The topics covered include: operational semantics (small-step,
big-step, definitional interpreters); a simple form of denotational semantics;
axiomatic semantics and Hoare logic; generation of verification conditions,
with application to program proof; compilation to virtual machine code and its
proof of correctness; an example of an optimizing program transformation (dead
code elimination) and its proof of correctness.
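Of the semantic styles listed, a definitional interpreter is the easiest to sketch outside Coq; the following hypothetical Python fragment (not taken from the lecture) shows the idea for a tiny expression language:

```python
# Hypothetical sketch: a definitional interpreter for a tiny language,
# one of the styles of semantics the lecture covers.
def eval_expr(e, env):
    """Evaluate an expression tree given an environment of variable bindings."""
    tag = e[0]
    if tag == "const":            # ("const", n)
        return e[1]
    if tag == "var":              # ("var", name)
        return env[e[1]]
    if tag == "add":              # ("add", e1, e2)
        return eval_expr(e[1], env) + eval_expr(e[2], env)
    if tag == "let":              # ("let", name, bound, body)
        return eval_expr(e[3], {**env, e[1]: eval_expr(e[2], env)})
    raise ValueError(f"unknown expression: {tag}")

# let x = 2 + 3 in x + x  evaluates to 10
program = ("let", "x", ("add", ("const", 2), ("const", 3)),
           ("add", ("var", "x"), ("var", "x")))
assert eval_expr(program, {}) == 10
```

In a proof assistant the same function would be written as a total recursive definition, so that properties such as agreement with a small-step semantics can be proved about it.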
Did Lobachevsky Have A Model Of His "imaginary Geometry"?
The invention of non-Euclidean geometries is often seen through the optics of
the Hilbertian formal axiomatic method developed later in the 19th century.
However, such an anachronistic approach fails to provide a sound reading of
Lobachevsky's geometrical works. Although the modern notion of a model of a
given theory has a counterpart in Lobachevsky's writings, its role in
Lobachevsky's geometrical theory turns out to be very unusual. Lobachevsky
doesn't consider various models of Hyperbolic geometry, as the modern reader
would expect, but uses a non-standard model of the Euclidean plane (as a
particular surface in the Hyperbolic 3-space). In this paper I consider this
construction of Lobachevsky's, and show how it can be better analyzed within an
alternative non-Hilbertian foundational framework, which relates the history of
geometry of the 19th century to some recent developments in the field.
Comment: 31 pages, 8 figures
Elaboration in Dependent Type Theory
To be usable in practice, interactive theorem provers need to provide
convenient and efficient means of writing expressions, definitions, and proofs.
This involves inferring information that is often left implicit in an ordinary
mathematical text, and resolving ambiguities in mathematical expressions. We
refer to the process of passing from a quasi-formal and partially-specified
expression to a completely precise formal one as elaboration. We describe an
elaboration algorithm for dependent type theory that has been implemented in
the Lean theorem prover. Lean's elaborator supports higher-order unification,
type class inference, ad hoc overloading, insertion of coercions, the use of
tactics, and the computational reduction of terms. The interactions between
these components are subtle and complex, and the elaboration algorithm has been
carefully designed to balance efficiency and usability. We describe the central
design goals, and the means by which they are achieved.
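Several of the listed features can be seen in a small example (written here in Lean 4 syntax, which postdates the paper; the snippet is illustrative and not taken from it):

```lean
-- The implicit argument α and the [Add α] constraint are both filled in
-- by the elaborator at each call site.
def twice {α : Type} [Add α] (x : α) : α := x + x

def n : Nat := 3
def i : Int := n          -- a coercion from Nat to Int is inserted automatically

#eval twice (2 : Int)     -- the Add Int instance is found by type class inference
```

Each omission the user makes (the type argument, the instance, the coercion) is one of the gaps between quasi-formal input and a fully precise term that elaboration must close.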
Computability and analysis: the legacy of Alan Turing
We discuss the legacy of Alan Turing and his impact on computability and
analysis.
Comment: 49 pages