3,091 research outputs found

    Cut Elimination inside a Deep Inference System for Classical Predicate Logic

    Deep inference is a natural generalisation of the one-sided sequent calculus in which rules are allowed to apply deeply inside formulas, much like rewrite rules in term rewriting. This freedom in applying inference rules makes it possible to express logical systems that are difficult or impossible to express in the cut-free sequent calculus, and it also allows for a more fine-grained analysis of derivations than the sequent calculus offers. However, the same freedom also makes it harder to carry out this analysis; in particular, it is harder to design cut elimination procedures. In this paper we present a cut elimination procedure for a deep inference system for classical predicate logic. As a consequence we derive Herbrand's Theorem, which we express as a factorisation of derivations.
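    The idea of a rule applying "deeply inside" a formula can be illustrated by the standard switch rule of the calculus of structures; this is a hedged sketch (the paper's own system may present its rules differently):

```latex
% Switch rule s, applicable inside an arbitrary formula context S{ }
% (premise above the line, conclusion below):
\[
  \frac{S\{(A \vee B) \wedge C\}}{S\{A \vee (B \wedge C)\}}\;\mathsf{s}
\]
% Because S{ } may be any context, the rule rewrites a subformula at
% arbitrary depth, much like a rewrite rule in term rewriting.
```

    Since the context S{ } is unconstrained, the same rule instance can act at the root of a formula or nested arbitrarily far inside it, which is exactly the freedom the abstract describes.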

    Normalisation in Deep Inference

    In this thesis we present the calculus of structures, a proof-theoretic formalism using deep inference. This means that inference rules apply arbitrarily deep inside formulas. It follows that derivations are symmetric rather than tree-shaped objects. A system for classical predicate logic is introduced and compared with the corresponding sequent calculus system. Both have an admissible cut rule. However, locality can be obtained with deep inference, meaning that the effort of applying a rule is always bounded: every logical rule has constant complexity. We then investigate which normal forms of deductions have been defined. Besides cut elimination, we can adopt two other notions of normalisation that allow cuts inside a derivation, under some constraints. Finally, we highlight the commonalities and differences between normalisation in deep and shallow inference.

    Deep Inference and Symmetry in Classical Proofs

    In this thesis we present deductive systems for classical propositional and predicate logic which use deep inference, i.e. inference rules apply arbitrarily deep inside formulas, and a certain symmetry, which provides an involution on derivations. Like sequent systems, they have a cut rule which is admissible. Unlike sequent systems, they enjoy various new interesting properties. Not only the identity axiom, but also cut, weakening and even contraction are reducible to atomic form. This leads to inference rules that are local, meaning that the effort of applying them is bounded, and finitary, meaning that, given a conclusion, there is only a finite number of premises to choose from. The systems also enjoy new normal forms for derivations and, in the propositional case, a cut elimination procedure that is drastically simpler than those for sequent systems.
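    The reduction of identity, cut, weakening and contraction to atomic form can be sketched with the atomic rules of system SKS; the presentation below is an assumed illustration, not necessarily the thesis's exact notation:

```latex
% Atomic identity, atomic cut, atomic weakening, atomic contraction
% (a an atom, \bar{a} its negation, t/f the units true/false,
%  S{ } an arbitrary formula context):
\[
  \frac{S\{\mathsf{t}\}}{S\{a \vee \bar{a}\}}\;\mathsf{ai}{\downarrow}
  \qquad
  \frac{S\{a \wedge \bar{a}\}}{S\{\mathsf{f}\}}\;\mathsf{ai}{\uparrow}
  \qquad
  \frac{S\{\mathsf{f}\}}{S\{a\}}\;\mathsf{aw}{\downarrow}
  \qquad
  \frac{S\{a \vee a\}}{S\{a\}}\;\mathsf{ac}{\downarrow}
\]
% Each rule touches a single atom, so the effort of applying it is
% bounded independently of formula size: this is the locality property.
```

    General (non-atomic) instances of these rules are then derivable from the atomic ones together with the logical rules, which is what makes the whole system local.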

    De Morgan Dual Nominal Quantifiers Modelling Private Names in Non-Commutative Logic

    This paper explores the proof theory necessary for recommending an expressive but decidable first-order system, named MAV1, featuring a de Morgan dual pair of nominal quantifiers. These nominal quantifiers, called `new' and `wen', are distinct from the self-dual Gabbay-Pitts and Miller-Tiu nominal quantifiers. The novelty of these nominal quantifiers is that they are polarised, in the sense that `new' distributes over positive operators while `wen' distributes over negative operators. This greater control of bookkeeping enables private names to be modelled in processes embedded as formulae in MAV1. The technical challenge is to establish a cut elimination result, from which essential properties, including the transitivity of implication, follow. Since the system is defined using the calculus of structures, a generalisation of the sequent calculus, novel techniques are employed. The proof relies on an intricately designed multiset-based measure of the size of a proof, which is used to guide a normalisation technique called splitting. The presence of equivariance, which swaps successive quantifiers, induces complex inter-dependencies between nominal quantifiers, additive conjunction and multiplicative operators in the proof of splitting. Every rule is justified by an example demonstrating why the rule is necessary for soundly embedding processes and ensuring that cut elimination holds. Comment: Submitted for review 18/2/2016; accepted at CONCUR 2016; extended version submitted to journal 27/11/201

    Locality for Classical Logic

    In this paper we present deductive systems for classical propositional and predicate logic in the calculus of structures. Like sequent systems, they have a cut rule which is admissible. In addition, they enjoy a top-down symmetry and some normal forms for derivations that are not available in the sequent calculus. The identity axiom, cut, weakening and also contraction can be reduced to atomic form. This leads to rules that are local: they do not require the inspection of expressions of unbounded size.

    Tool support for reasoning in display calculi

    We present a tool for reasoning in and about propositional sequent calculi. A first aim is to support reasoning in calculi that contain a hundred rules or more, where even relatively small pen-and-paper derivations become tedious and error-prone; as an example, we implement the display calculus D.EAK of dynamic epistemic logic. A second aim is to provide embeddings of the calculus in the theorem prover Isabelle for formalising proofs about D.EAK; as a case study we show that the solution of the muddy children puzzle is derivable for any number of muddy children. Third, there is a set of meta-tools that allows us to adapt the tool to a wide variety of user-defined calculi.

    Normalisation Control in Deep Inference via Atomic Flows

    We introduce `atomic flows': graphs obtained from derivations by tracing atom occurrences and forgetting the logical structure. We study simple manipulations of atomic flows that correspond to complex reductions on derivations. This allows us to prove, for propositional logic, a new and very general normalisation theorem, which contains cut elimination as a special case. We operate in deep inference, which is more general than other syntactic paradigms, and where normalisation is more difficult to control. We argue that atomic flows are a significant technical advance for normalisation theory, because 1) the technique they support is largely independent of syntax; 2) indeed, it is largely independent of logical inference rules; and 3) they constitute a powerful geometric formalism, which is more intuitive than syntax.