ENIGMA: Efficient Learning-based Inference Guiding Machine
ENIGMA is a learning-based method for guiding given clause selection in
saturation-based theorem provers. Clauses from many proof searches are
classified as positive and negative based on their participation in the proofs.
An efficient classification model is trained on this data, using fast
feature-based characterization of the clauses. The learned model is then
tightly linked with the core prover and used as the basis of a new parameterized
evaluation heuristic that provides fast ranking of all generated clauses. The
approach is evaluated on the E prover and the CASC 2016 AIM benchmark, showing
a large increase in E's performance.
Comment: Submitted to LPAR 201
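As a rough illustration of the idea (a simplified sketch with made-up clauses and features, not ENIGMA's actual feature set or learning algorithm), clauses can be mapped to fast count-based feature vectors and scored by a linear model trained on positive/negative examples from earlier proof searches:

```python
# Toy sketch of learned clause ranking: clauses (here, lists of symbol
# tokens) become symbol-count feature vectors, a perceptron is trained on
# positive (used in a proof) vs. negative (unused) examples, and newly
# generated clauses are ranked by the learned score.
from collections import Counter

def features(clause, vocab):
    """Map a clause (list of symbol tokens) to a fixed-length count vector."""
    counts = Counter(clause)
    return [counts[s] for s in vocab]

def train_perceptron(examples, vocab, epochs=20):
    """examples: list of (clause, label), label +1 (in proof) or -1 (not)."""
    w = [0.0] * len(vocab)
    for _ in range(epochs):
        for clause, y in examples:
            x = features(clause, vocab)
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
    return w

def rank(clauses, w, vocab):
    """Order generated clauses by learned score, best first."""
    def score(c):
        return sum(wi * xi for wi, xi in zip(w, features(c, vocab)))
    return sorted(clauses, key=score, reverse=True)

vocab = ["=", "f", "g", "a", "b"]
examples = [(["=", "f", "a"], +1), (["g", "g", "b"], -1),
            (["=", "a", "b"], +1), (["f", "g", "g", "b"], -1)]
w = train_perceptron(examples, vocab)
best = rank([["g", "g", "g"], ["=", "f", "b"]], w, vocab)[0]
```

In a real system the model is queried inside the prover's given-clause loop, so feature extraction and scoring must be cheap relative to inference.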
A Vernacular for Coherent Logic
We propose a simple, yet expressive proof representation from which proofs
for different proof assistants can easily be generated. The representation uses
only a few inference rules and is based on a fragment of first-order logic
called coherent logic. Coherent logic has been recognized by a number of
researchers as a suitable logic for many everyday mathematical developments.
The proposed proof representation is accompanied by a corresponding XML format
and by a suite of XSL transformations for generating formal proofs for
Isabelle/Isar and Coq, as well as proofs expressed in a natural language form
(formatted in LaTeX or in HTML). Also, our automated theorem prover for
coherent logic exports proofs in the proposed XML format. All tools are
publicly available, along with a set of sample theorems.
Comment: CICM 2014 - Conferences on Intelligent Computer Mathematics (2014)
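For reference, coherent logic (as commonly defined in the literature) restricts first-order sentences to the following implication shape:

```latex
\forall \vec{x}\, \bigl( A_1(\vec{x}) \wedge \dots \wedge A_n(\vec{x})
  \;\rightarrow\;
  \exists \vec{y}_1\, B_1(\vec{x},\vec{y}_1) \vee \dots \vee
  \exists \vec{y}_m\, B_m(\vec{x},\vec{y}_m) \bigr)
```

where the $A_i$ are atoms and each $B_j$ is a conjunction of atoms. Proofs in this fragment proceed by forward chaining with case splits on the disjunction, which is part of what makes them straightforward to render both formally and in natural language.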
A formalized general theory of syntax with bindings
We present the formalization of a theory of syntax with bindings that has been developed and refined over the last decade to support several large formalization efforts. Terms are defined for an arbitrary number of constructors of varying numbers of inputs, quotiented to alpha-equivalence and sorted according to a binding signature. The theory includes a rich collection of properties of the standard operators on terms, such as substitution and freshness. It also includes induction and recursion principles and support for semantic interpretation, all tailored for smooth interaction with the bindings and the standard operators.
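To give a concrete flavour of the alpha-equivalence quotient (a toy Python sketch, unrelated to the paper's actual formalization), closed lambda-terms can be compared by translating named binders into de Bruijn indices, under which alpha-equivalent terms become syntactically identical:

```python
# Toy illustration of quotienting by alpha-equivalence (assumed term
# encoding, not the paper's). Terms are tuples:
#   ("var", name) | ("lam", name, body) | ("app", fun, arg)
def to_debruijn(t, env=()):
    """Replace bound names by binder depth; assumes t is a closed term."""
    if t[0] == "var":
        return ("var", env.index(t[1]))
    if t[0] == "lam":
        return ("lam", to_debruijn(t[2], (t[1],) + env))
    return ("app", to_debruijn(t[1], env), to_debruijn(t[2], env))

def alpha_eq(s, t):
    """Two closed terms are alpha-equivalent iff their nameless forms agree."""
    return to_debruijn(s) == to_debruijn(t)

id_x = ("lam", "x", ("var", "x"))          # \x. x
id_y = ("lam", "y", ("var", "y"))          # \y. y  (same term up to renaming)
k = ("lam", "x", ("lam", "y", ("var", "x")))
flip = ("lam", "a", ("lam", "b", ("var", "b")))
```

A formalized theory must additionally prove that all standard operators (substitution, freshness, recursion) respect this quotient, which is where most of the effort lies.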
An NLTE model atmosphere analysis of the pulsating sdO star SDSS J1600+0748
We started a program to construct several grids of suitable model atmospheres
and synthetic spectra for hot subdwarf O stars computed, for comparative
purposes, in LTE, NLTE, with and without metals. For the moment, we use our
grids to perform fits on our spectrum of SDSS J160043.6+074802.9 (J1600+0748
for short), a unique pulsating sdO star. Our best fit is currently obtained
with NLTE model atmospheres including carbon, nitrogen and oxygen in solar
abundances, which leads to the following parameters for SDSS J1600+0748: Teff
= 69 060 +/- 2080 K, log g = 6.00 +/- 0.09 and log N(He)/N(H) = -0.61 +/- 0.06.
Improvements are needed, however, particularly for fitting the available He II
lines. It is hoped that the inclusion of Fe will help remedy the situation.
Comment: 4 pages, 4 figures, accepted in Astrophysics and Space Science
(24/02/2010), Special issue Hot subdwarf star
Sharing HOL4 and HOL Light proof knowledge
New proof assistant developments often involve concepts similar to already
formalized ones. When proving their properties, a human can often take
inspiration from the existing formalized proofs available in other provers or
libraries. In this paper we propose and evaluate a number of methods that
strengthen proof automation by learning from the proof libraries of different
provers. Certain conjectures can be proved directly from the dependencies
induced by similar proofs in the other library. Even if exact correspondences
are not found, learning-reasoning systems can make use of the association
between proved theorems and their characteristics to predict the relevant
premises. Such external help can be further combined with internal advice. We
evaluate the proposed knowledge-sharing methods by reproving the HOL Light and
HOL4 standard libraries. The learning-reasoning system HOL(y)Hammer, whose
single best strategy could automatically find proofs for 30% of the HOL Light
problems, can prove 40% with the knowledge from HOL4.
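The premise-prediction step described above can be sketched as a k-nearest-neighbour lookup over theorem features (a simplified illustration with made-up feature sets and theorem names, not HOL(y)Hammer's actual algorithm):

```python
# Sketch of k-NN premise selection: each proved theorem has a feature set
# and the list of premises its proof depended on; a new conjecture is
# matched against the most similar theorems and their premises are
# suggested, weighted by similarity.
def jaccard(a, b):
    """Similarity of two feature sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def suggest_premises(conjecture_feats, library, k=2):
    """library: list of (features, premises). Returns a ranked premise list."""
    neighbours = sorted(library,
                        key=lambda th: jaccard(conjecture_feats, th[0]),
                        reverse=True)[:k]
    scores = {}
    for feats, premises in neighbours:
        weight = jaccard(conjecture_feats, feats)
        for p in premises:
            scores[p] = scores.get(p, 0.0) + weight
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical mini-library: (feature set, proof dependencies).
library = [
    ({"list", "append", "nil"}, ["append_nil"]),
    ({"list", "rev", "append"}, ["rev_append", "append_assoc"]),
    ({"nat", "add"}, ["add_comm"]),
]
suggestions = suggest_premises({"list", "rev", "nil"}, library)
```

Cross-prover sharing, as in the paper, amounts to populating the library with theorems and dependencies exported from another system, after aligning the two libraries' features.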
Poincaré on the Foundation of Geometry in the Understanding
This paper is about Poincaré’s view of the foundations of geometry. According to the established view, which has been inherited from the logical positivists, Poincaré, like Hilbert, held that axioms in geometry are schemata that provide implicit definitions of geometric terms, a view he expresses by stating that the axioms of geometry are “definitions in disguise.” I argue that this view does not accord well with Poincaré’s core commitment in the philosophy of geometry: the view that geometry is the study of groups of operations. In place of the established view I offer a revised view, according to which Poincaré held that axioms in geometry are in fact assertions about invariants of groups. Groups, as forms of the understanding, are prior in conception to the objects of geometry and afford the proper definition of those objects, according to Poincaré. Poincaré’s view therefore contrasts sharply with Kant’s foundation of geometry in a unique form of sensibility. According to my interpretation, axioms are not definitions in disguise because they themselves implicitly define their terms, but rather because they disguise the definitions which imply them.
Harnessing Higher-Order (Meta-)Logic to Represent and Reason with Complex Ethical Theories
The computer-mechanization of an ambitious explicit ethical theory, Gewirth's
Principle of Generic Consistency, is used to showcase an approach for
representing and reasoning with ethical theories exhibiting complex logical
features like alethic and deontic modalities, indexicals, higher-order
quantification, among others. Harnessing the high expressive power of Church's
type theory as a meta-logic to semantically embed a combination of quantified
non-classical logics, our work pushes existing boundaries in knowledge
representation and reasoning. We demonstrate that intuitive encodings of
complex ethical theories and their automation on the computer are no longer
antipodes.
Comment: 14 pages
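The embedding idea can be illustrated in miniature over a finite Kripke model (a toy Python sketch rather than Church's type theory; the worlds and accessibility relation are invented for the example): modal formulas become predicates on worlds, and the modal operators quantify over accessible worlds.

```python
# Minimal finite-model sketch of a shallow semantic embedding of modal
# logic: a proposition is a predicate on worlds, box/diamond quantify
# over worlds reachable via the accessibility relation R.
WORLDS = {0, 1, 2}
R = {(0, 1), (0, 2), (1, 1), (2, 2)}  # assumed accessibility relation

def box(phi):
    """Necessity: phi holds in every accessible world."""
    return lambda w: all(phi(v) for v in WORLDS if (w, v) in R)

def dia(phi):
    """Possibility: phi holds in some accessible world."""
    return lambda w: any(phi(v) for v in WORLDS if (w, v) in R)

def valid(phi):
    """Validity: phi holds in every world of the model."""
    return all(phi(w) for w in WORLDS)

p = lambda w: w != 0   # an atomic proposition, true except at world 0
```

In the higher-order-logic setting the same move is made inside the logic itself, so the host prover's automation applies directly to the embedded modal (and deontic, indexical, etc.) statements.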
Foundational (co)datatypes and (co)recursion for higher-order logic
We describe a line of work that started in 2011 towards enriching Isabelle/HOL's language with coinductive datatypes, which allow infinite values, and with a more expressive notion of inductive datatype than previously supported by any system based on higher-order logic. These (co)datatypes are complemented by definitional principles for (co)recursive functions and reasoning principles for (co)induction. In contrast with other systems offering codatatypes, no additional axioms or logic extensions are necessary with our approach.
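As a loose analogy (Python generators, not Isabelle's codatatypes): a corecursive definition is a productive description of an infinite value, of which any consumer only ever observes finitely many pieces.

```python
# A corecursively defined infinite stream and a stream transformer,
# observed only at finitely many positions. Isabelle's codatatypes make
# such infinite values first-class objects of the logic, with
# productivity checked instead of termination; lazy generators merely
# give the flavour.
from itertools import islice

def nats(n=0):
    """The infinite stream n, n+1, n+2, ..."""
    while True:
        yield n
        n += 1

def smap(f, stream):
    """Pointwise map over an infinite stream (itself corecursive)."""
    for x in stream:
        yield f(x)

evens = smap(lambda k: 2 * k, nats())
first = list(islice(evens, 5))   # observe the first five elements
```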
Isabelle/DOF: Design and Implementation
This is the author accepted manuscript. The final version is available from Springer Verlag via the DOI in this record.
17th International Conference, SEFM 2019, Oslo, Norway, September 18–20, 2019
DOF is a novel framework for defining ontologies and enforcing them during document development and evolution. A major goal of DOF is the integrated development of formal certification documents (e.g., for Common Criteria or CENELEC 50128) that require consistency across both formal and informal arguments. To support a consistent development of formal and informal parts of a document, we provide Isabelle/DOF, an implementation of DOF on top of the formal methods framework Isabelle/HOL. A particular emphasis is put on a deep integration into Isabelle's IDE, which allows for smooth ontology development as well as immediate ontological feedback during the editing of a document. In this paper, we give an in-depth presentation of the design concepts of DOF's Ontology Definition Language (ODL) and key aspects of the technology of its implementation. Isabelle/DOF is the first ontology language supporting machine-checked links between the formal and informal parts in an LCF-style interactive theorem proving environment. Sufficiently annotated, large documents can easily be developed collaboratively, while ensuring their consistency, and the impact of changes (in the formal and the semi-formal content) is tracked automatically.
IRT SystemX, Paris-Saclay, France