17 research outputs found

    Correspondences between Classical, Intuitionistic and Uniform Provability

    Get PDF
    Based on an analysis of the inference rules used, we provide a characterization of the situations in which classical provability entails intuitionistic provability. We then examine the relationship of these derivability notions to uniform provability, a restriction of intuitionistic provability that embodies a special form of goal-directedness. We determine, first, the circumstances in which the former relations imply the latter. Using this result, we identify the richest versions of the so-called abstract logic programming languages in classical and intuitionistic logic. We then study the reduction of classical and, derivatively, intuitionistic provability to uniform provability via the addition to the assumption set of the negation of the formula to be proved. Our focus here is on understanding the situations in which this reduction is achieved. However, our discussions indicate the structure of a proof procedure based on the reduction, a matter also considered explicitly elsewhere.
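
    The central relationships can be summarised schematically (our notation, not the paper's: ⊢C, ⊢I and ⊢U stand for classical, intuitionistic and uniform provability of a goal F from assumptions Γ):

        \Gamma \vdash_U F \;\Longrightarrow\; \Gamma \vdash_I F \;\Longrightarrow\; \Gamma \vdash_C F

        \Gamma \vdash_C F \;\Longleftrightarrow\; \Gamma \cup \{\neg F\} \vdash_U \bot

    The first chain holds because uniform proofs are restricted intuitionistic proofs, and intuitionistic proofs are classical ones; the equivalence sketches the reduction via the negated goal discussed above, and holds only for the classes of Γ and F that the paper identifies.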

    Resolución SL*: Un paradigma basado en resolución lineal para la demostración automática

    Full text link
    The work in this thesis falls within the field of automated theorem proving and consists of the study, definition and development of a linear-resolution paradigm called SL* resolution. The term paradigm is used because SL* resolution is not in itself a procedure; rather, it can be understood as a form of reasoning with certain parameters whose instantiation gives rise to different procedures suited to different kinds of problems. The name SL* resolution was chosen because, as explained later, it is very close to Model Elimination and to SL resolution (hence the first part of the name). The final asterisk denotes its parametrisation: the procedures obtained as instances of SL* resolution are named with an additional letter in place of the asterisk, as will be seen later. The thesis is divided into four chapters, briefly described below. The first gives a short historical introduction to automated theorem proving, ranging from the origins of logic with the first formal mathematical notations in the sixteenth century to the appearance of the most important results in logic discovered by Herbrand, Gödel, Church and others. Particular emphasis is placed on automated deduction, tracing its development from its origins at the end of the eighteenth century to the present day, showing how the field has evolved and which discoveries and results stand as its main turning points. The second chapter presents linear resolution and some of its main refinements, since SL* resolution is a variation of SL resolution and hence of linear resolution. To this end the resolution principle is introduced, together with the problems of mechanising it, followed by two refinements of resolution: semantic resolution and linear resolution. The chapter concludes with a study of the main refinements of linear resolution: input resolution, linear resolution with merging, linear resolution with subsumption, ordered linear resolution, MTOSS and TOSS resolution, Model Elimination, SL resolution and the MESON system. The third chapter presents and studies in depth the main contributions of recent years to automated theorem proving that are close to the approach of the present work: Stickel's PTTP prover, Plaisted's sequence-based MESON system, Manthey and Bry's SATCHMO prover, the Near-Horn Prolog procedures of Loveland and other authors and, finally, the SETHEO prover of Bibel and other authors. Not every prover and procedure has been included, but those considered the most interesting and the closest to SL* resolution have been, so that comparisons can be made and the contributions of this work become apparent. The fourth chapter presents SL* resolution. Its formal definition is given and the fundamental concept of ancestor choice is introduced. Ancestor choice is the mechanism that controls the application of ancestor resolution, making it possible both to reduce the cost of applying it and to adapt SL* resolution to the kind of problem being treated.
    The main instances of SL* resolution, the SLT and SLP procedures, are then presented. This chapter places special emphasis on ancestor choice, the principal contribution of SL* resolution, analysing both the efficiency gains it brings and the way it endows SL* resolution with the ability to adapt to the problems it treats. The chapter also presents an implementation of SL* resolution, in particular of the SLT procedure, together with results on an extensive set of problems from the automated theorem proving field. The final section compares SL* resolution with the closest provers and systems, in terms of both features and results. Casamayor Rodenas, JC. (1996). Resolución SL*: Un paradigma basado en resolución lineal para la demostración automática [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/6023
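
    A minimal propositional sketch of the Model-Elimination-style linear resolution that SL* builds on may help fix ideas. The choose parameter stands in for the thesis's ancestor choice: it decides which ancestors are available for ancestor resolution. All names here are ours, and first-order unification, ordering and subsumption refinements are omitted entirely.

        # Propositional Model-Elimination-style linear resolution (sketch).
        from typing import Callable, FrozenSet, List, Tuple

        Literal = Tuple[str, bool]              # (atom, is_positive)
        Clause = FrozenSet[Literal]

        def negate(lit: Literal) -> Literal:
            atom, positive = lit
            return (atom, not positive)

        def prove(goal: Literal, clauses: List[Clause], ancestors: List[Literal],
                  choose: Callable[[Literal, List[Literal]], List[Literal]],
                  depth: int) -> bool:
            if depth == 0:
                return False
            # Reduction step: close the goal against a complementary ancestor,
            # but only among the ancestors that `choose` makes available.
            if negate(goal) in choose(goal, ancestors):
                return True
            # Extension step: backchain on an input clause containing ~goal and
            # recursively refute the remaining literals of that clause.
            for clause in clauses:
                if negate(goal) in clause:
                    rest = [lit for lit in clause if lit != negate(goal)]
                    if all(prove(lit, clauses, ancestors + [goal], choose, depth - 1)
                           for lit in rest):
                        return True
            return False

        def refute(top: Clause, clauses: List[Clause], depth: int = 12) -> bool:
            all_ancestors = lambda goal, anc: anc   # unrestricted ancestor choice
            return all(prove(lit, clauses, [], all_ancestors, depth) for lit in top)

        # The clauses {p,q}, {~p,q}, {p,~q}, {~p,~q} are jointly unsatisfiable
        # and genuinely require ancestor resolution (reduction) to refute.
        cs = [frozenset({('p', True), ('q', True)}),
              frozenset({('p', False), ('q', True)}),
              frozenset({('p', True), ('q', False)}),
              frozenset({('p', False), ('q', False)})]
        print(refute(cs[0], cs))                    # prints: True

    Instantiating choose with something more restrictive than all_ancestors is, roughly, what distinguishes the SLT and SLP instances in the thesis: a cheaper ancestor choice reduces the cost of ancestor resolution at the price of completeness on some problem classes.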

    Computational Natural Deduction

    No full text
    The formalization of the notion of a logically sound argument as a natural deduction proof offers the prospect of a computer program capable of constructing such arguments for conclusions of interest. We present a constructive definition for a new subclass of natural deduction proofs, called atomic normal form (ANF) proofs. A natural deduction proof is readily understood as an argument leading from a set of premisses, by way of simple principles of reasoning, to the conclusion of interest. ANF extends this explanatory power of natural deduction: the very detailed steps of the argument are replaced by derived rules of inference, each of which is justified by a particular input formula. ANF constitutes a proof-theoretically well motivated normal form for natural deduction. Computational techniques developed for resolution refutation based systems are directly applicable to the task of constructing ANF proofs. We analyse a range of languages in this framework, extending from the simple Horn language to the full classical calculus. This analysis is applied to provide a natural deduction based account of existing logic programming languages, and to extend current logic programming implementation techniques towards more expressive languages. We consider the visualization of proofs, failure demonstrations, search spaces and the proof search process. Such visualization can be used for explanation and to gain an understanding of the proof search process. We propose an introspection-based architecture for problem solvers based on natural deduction. The architecture offers a logic-based meta-language to overcome the combinatorial and other practical problems faced by the problem solver.
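
    To make the derived-rule idea concrete (our illustration, not an example taken from the thesis): an input formula such as (p ∧ q) → r justifies, within an ANF proof, a single derived inference rule that stands for the explicit ∧-introduction and →-elimination steps it abbreviates:

        \frac{p \qquad q}{r}\;\bigl[(p \land q) \to r\bigr]

    The detailed argument from p and q through p ∧ q to r is thus collapsed into one step whose justification is the input formula itself.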

    Integrity constraints in deductive databases

    Get PDF
    A deductive database is a logic program that generalises the concept of a relational database. Integrity constraints are properties that the data of a database are required to satisfy, and in the context of logic programming they are expressed as closed formulae. It is desirable to check the integrity of a database at the end of each transaction which changes the database. The simplest approach to checking integrity involves evaluating each constraint whenever the database is updated. However, such an approach is too inefficient, especially for large databases, and does not exploit the fact that the database satisfied the constraints prior to the update. A method, called the path finding method, is proposed for checking integrity in definite deductive databases, treating constraints as closed first order formulae. A comparative evaluation is made between previously described methods and the proposed one. Closed general formulae are used to express aggregate constraints, and Lloyd et al.'s simplification method is generalised to cope with these constraints. A new definition of constraint satisfiability is introduced for indefinite deductive databases, and the path finding method is generalised to check integrity in the presence of static constraints only. To evaluate constraints in an indefinite deductive database while taking full advantage of the query evaluation mechanism underlying the database, a query evaluator is proposed based on a semantics, called negation as possible failure, for inferring negative information from an indefinite deductive database. Transitional constraints are expressed using action relations, and it is shown that transitional constraints can be handled in definite deductive databases in the same way as static constraints if the underlying database is suitably extended. The concept of implicit update is introduced and the path finding method is extended to compute facts which are present in action relations. The extended method is capable of checking integrity in definite deductive databases in the presence of transitional constraints. Combining different generalisations of the path finding method to check integrity in deductive databases in the presence of arbitrary constraints is discussed. An extension of the data manipulation language of SQL is proposed to express a wider range of integrity constraints; this class of constraints can be maintained in a database with the tools provided in this thesis.
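
    A toy relational sketch of the efficiency point above (representation and names are ours; it covers base facts only, whereas the path finding method also handles deduction rules): after an update, re-check only the constraints that mention the updated relation, relying on the fact that the database satisfied all constraints beforehand.

        from typing import Callable, List, Set, Tuple

        Fact = Tuple[str, Tuple]                 # (relation name, argument tuple)
        # A constraint pairs the relations it mentions with a closed check.
        Constraint = Tuple[Set[str], Callable[[Set[Fact]], bool]]

        def naive_check(db: Set[Fact], constraints: List[Constraint]) -> bool:
            # Simplest approach: evaluate every constraint after every update.
            return all(check(db) for _, check in constraints)

        def incremental_check(db: Set[Fact], inserted: Fact,
                              constraints: List[Constraint]) -> bool:
            # Since the constraints held before the update, only those that
            # mention the updated relation can have become violated.
            db.add(inserted)
            return all(check(db) for rels, check in constraints
                       if inserted[0] in rels)

        # Example constraint: every employee's department must exist.
        db = {('dept', ('sales',)), ('emp', ('ana', 'sales'))}
        ref_ok = ({'emp', 'dept'},
                  lambda d: all(('dept', (args[1],)) in d
                                for rel, args in d if rel == 'emp'))
        print(incremental_check(db, ('emp', ('bo', 'hr')), [ref_ok]))  # False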

    Goal-directed proof theory

    Get PDF
    This report is the draft of a book about goal-directed proof-theoretical formulations of non-classical logics. It evolved from a response to the existence of two camps in the applied logic (computer science/artificial intelligence) community: those who believe that the new non-classical logics are the most important ones for applications and that classical logic is no longer the main workhorse of applied logic, and those who maintain that classical logic is the only logic worth considering and that, within classical logic, the Horn clause fragment is the most important one. The book presents a uniform Prolog-like formulation of the landscape of classical and non-classical logics, in such a way that the distinctions and movements from one logic to another seem simple and natural, and within it classical logic becomes just one among many. This should please the non-classical logic camp. It should also please the classical logic camp, since the goal-directed formulation makes it all look like an algorithmic extension of logic programming. The approach also appears to provide good computational complexity bounds across its landscape.
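
    A hedged sketch of what "goal directed" means in the shared Horn core (our code and clause representation, not the book's notation): computation proceeds from the goal, backchaining on clauses whose head matches it.

        # Propositional Horn backchaining: to prove an atomic goal, pick a
        # program clause (head, body) with a matching head and prove each
        # body atom in turn.
        def solve(program: set, goal: str, depth: int = 20) -> bool:
            if depth == 0:                   # crude loop guard
                return False
            return any(all(solve(program, sub, depth - 1) for sub in body)
                       for head, body in program if head == goal)

        prog = {('q', ()), ('r', ('q',)), ('p', ('q', 'r'))}
        print(solve(prog, 'p'))              # True: p follows via r :- q and q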

    Proceedings of the Workshop on the lambda-Prolog Programming Language

    Get PDF
    The expressiveness of logic programs can be greatly increased over first-order Horn clauses through a stronger emphasis on logical connectives and by admitting various forms of higher-order quantification. The logic of hereditary Harrop formulas and the notion of uniform proof have been developed to provide a foundation for more expressive logic programming languages. The λ-Prolog language is actively being developed on top of these foundational considerations. The rich logical foundations of λ-Prolog provide it with declarative approaches to modular programming, hypothetical reasoning, higher-order programming, polymorphic typing, and meta-programming. These aspects of λ-Prolog have made it valuable as a higher-level language for the specification and implementation of programs in numerous areas, including natural language, automated reasoning, program transformation, and databases.
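
    Extending the Horn sketch shown under the previous entry with the two connectives that matter here (again a toy propositional encoding of ours, not λ-Prolog syntax): in a uniform proof the goal is decomposed first, and an implicational goal D ⊃ G adds the clause D to the program for the search of G only, which is what yields scoped, modular programming and hypothetical reasoning.

        # Uniform proof search over a propositional hereditary-Harrop-like
        # fragment. Goals: ('atom', a) | ('and', g1, g2) | ('imp', clause, g).
        # Clauses: (head, tuple-of-body-goals). No loop check is performed,
        # which is fine for these examples but not for recursive programs.
        def uniform(program: frozenset, goal) -> bool:
            tag = goal[0]
            if tag == 'and':                  # goal-directed: split the goal
                return uniform(program, goal[1]) and uniform(program, goal[2])
            if tag == 'imp':                  # hypothesis scoped to this goal
                return uniform(program | {goal[1]}, goal[2])
            # atomic goal: backchain on a program clause with a matching head
            return any(all(uniform(program, g) for g in body)
                       for head, body in program if head == goal[1])

        # p => p is provable from the empty program; p alone is not.
        print(uniform(frozenset(), ('imp', ('p', ()), ('atom', 'p'))))  # True
        print(uniform(frozenset(), ('atom', 'p')))                      # False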

    Proof-theoretic investigations into integrated logical and functional programming

    Get PDF
    This thesis is a proof-theoretic investigation of logic programming based on hereditary Harrop logic (as in λProlog). After studying various proof systems for the first-order hereditary Harrop logic, we define the proof-theoretic semantics of a logic LFPL, intended as the basis of logic programming with functions, which extends higher-order hereditary Harrop logic by providing definition mechanisms for functions in such a way that the logical specification of the function, rather than the function itself, may be used in proof search. In Chap. 3, we define, for the first-order hereditary Harrop fragment of LJ, the class of uniform linear focused (ULF) proofs (suitable for goal-directed search with backchaining and unification) and show that the ULF-proofs are in 1-1 correspondence with the expanded normal deductions, in Prawitz's sense. We give a system of proof-term annotations for LJ-proofs (where proof-terms uniquely represent proofs). We define a rewriting system on proof-terms (where rules represent a subset of Kleene's permutations in LJ) and show that its irreducible proof-terms are those representing ULF-proofs and that it is weakly normalising. We also show that the composition of Prawitz's mappings between LJ and NJ, restricted to ULF-proofs, is the identity. We take the view of logic programming where: a program P is a set of formulae; a goal G is a formula; and the different means of achieving G w.r.t. P correspond to the expanded normal deductions of G from the assumptions in P (rather than the traditional view, whereby the different means of goal-achievement correspond to the different answer substitutions). LFPL is defined in Chap. 4, by means of a sequent calculus. As in LeFun, it extends logic programming with functions and provides mechanisms for defining names for functions, maintaining proof search as the computation mechanism (contrary to languages such as ALF, Babel, Curry and Escher, based on equational logic, where the computation mechanism is some form of rewriting). LFPL also allows definitions for declaring logical properties of functions, called definitions of dependent type. Such definitions are of the form (f, x) =def (A, w) : Σx:ρ.F, where f is a name for A and x is a name for w, a proof-term witnessing that the formula [A/x]F holds (i.e. A meets the specification Σx:ρ.F). When searching for proofs, it may suffice to use the formula [A/x]F rather than A itself. We present an interpretation of LFPL into NNλnorm, a natural deduction system for hereditary Harrop logic with λ-terms. The means of goal-achievement in LFPL are interpreted in NNλnorm essentially by cut-elimination, followed by an interpretation of cut-free sequent calculus proofs as normal deductions. We show that the use of definitions of dependent type may speed up proof search, because the equivalent proofs using no such definitions may be much longer and because normalisation may be done lazily, since not all parts of the proof need to be exhibited. We sketch two methods for implementing LFPL, based on goal-directed proof search, differing in the mechanism for selecting definitions of dependent type on which to backchain. We discuss techniques for handling the redundancy arising from the equivalence of each proof using such a definition to one using no such definitions.
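
    The definition format above can be displayed more readably (our typesetting, assuming the strong-sum reading of the specification; ρ is the type of A and w is a proof-term for [A/x]F):

        (f, x) \;=_{\mathrm{def}}\; (A, w) \;:\; \Sigma x{:}\rho.\,F

    During proof search the declared property [A/x]F can then be used for backchaining without unfolding the definition of A itself, which is the source of the speed-up discussed above.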