
    Normalisation Control in Deep Inference via Atomic Flows

    We introduce 'atomic flows': graphs obtained from derivations by tracing atom occurrences and forgetting the logical structure. We study simple manipulations of atomic flows that correspond to complex reductions on derivations. This allows us to prove, for propositional logic, a new and very general normalisation theorem, which contains cut elimination as a special case. We operate in deep inference, which is more general than other syntactic paradigms and where normalisation is more difficult to control. We argue that atomic flows are a significant technical advance for normalisation theory because 1) the technique they support is largely independent of syntax; 2) indeed, it is largely independent of logical inference rules; 3) they constitute a powerful geometric formalism, which is more intuitive than syntax.
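
    As a rough illustration of what an atomic flow is as a data structure (a graph whose vertices are instances of the structural rules and whose edges trace atom occurrences), here is a minimal sketch in Python; the class and vertex names are our own hypothetical rendering, not the authors' formalism:

        from dataclasses import dataclass, field

        # The six vertex kinds of an atomic flow: the structural rules
        # that create, duplicate, or destroy atom occurrences.
        VERTEX_KINDS = {"identity", "cut", "contraction",
                        "cocontraction", "weakening", "coweakening"}

        @dataclass
        class AtomicFlow:
            """All logical structure of the derivation is forgotten;
            only the wiring of atom occurrences remains."""
            vertices: list[str] = field(default_factory=list)
            # (source vertex, target vertex, atom) triples
            edges: list[tuple[int, int, str]] = field(default_factory=list)

            def add_vertex(self, kind: str) -> int:
                assert kind in VERTEX_KINDS
                self.vertices.append(kind)
                return len(self.vertices) - 1

        # An identity introducing the dual pair (a, not-a), immediately
        # consumed by a cut: the shape a normalisation step removes.
        flow = AtomicFlow()
        i = flow.add_vertex("identity")
        c = flow.add_vertex("cut")
        flow.edges += [(i, c, "a"), (i, c, "not a")]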

    On the relative proof complexity of deep inference via atomic flows

    We consider the proof complexity of the minimal complete fragment, KS, of standard deep inference systems for propositional logic. To examine the size of proofs we employ atomic flows, diagrams that trace structural changes through a proof but ignore logical information. As results we obtain a polynomial simulation of versions of Resolution, along with some extensions. We also show that these systems, as well as bounded-depth Frege systems, cannot polynomially simulate KS, by giving polynomial-size proofs in KS of certain variants of the propositional pigeonhole principle. Comment: 27 pages, 2 figures; full version of a conference paper.
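
    For readers unfamiliar with the benchmark, the pigeonhole principle PHP asserts that n+1 pigeons cannot be injectively placed into n holes. The generator below is just the textbook clausal encoding (our illustration, not the paper's exact variants): the conjunction of the clauses is unsatisfiable, so its negation is a tautology.

        from itertools import combinations

        def php_clauses(n: int):
            """Clauses of PHP over variables x[i][j]
            ('pigeon i sits in hole j')."""
            x = [[f"x{i},{j}" for j in range(n)] for i in range(n + 1)]
            clauses = []
            # Every pigeon occupies some hole.
            for i in range(n + 1):
                clauses.append([x[i][j] for j in range(n)])
            # No two pigeons share a hole.
            for j in range(n):
                for i1, i2 in combinations(range(n + 1), 2):
                    clauses.append([f"-{x[i1][j]}", f"-{x[i2][j]}"])
            return clauses

        # 3 pigeons, 2 holes: 3 positive clauses and 2 * 3 negative ones.
        assert len(php_clauses(2)) == 3 + 6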

    Canonical Proof Nets for Classical Logic

    Proof nets provide abstract counterparts to sequent proofs modulo rule permutations; the idea is that if two proofs have the same underlying proof net, they are in essence the same proof. Providing a convincing proof-net counterpart to proofs in the classical sequent calculus is thus an important step in understanding classical sequent calculus proofs. By convincing, we mean that (a) there should be a canonical function from sequent proofs to proof nets, (b) it should be possible to check the correctness of a net in polynomial time, (c) every correct net should be obtainable from a sequent calculus proof, and (d) there should be a cut-elimination procedure which preserves correctness. Previous attempts to give proof-net-like objects for propositional classical logic have failed at least one of the above conditions. In [23], the author presented a calculus of proof nets (expansion nets) satisfying (a) and (b); the paper defined a sequent calculus corresponding to expansion nets but gave no explicit demonstration of (c). That sequent calculus, called LK* in this paper, is a novel one-sided sequent calculus with both additively and multiplicatively formulated disjunction rules. In this paper (a self-contained extended version of [23]), we give a full proof of (c) for expansion nets with respect to LK*, and in addition give a cut-elimination procedure internal to expansion nets; this makes expansion nets the first notion of proof net for classical logic satisfying all four criteria. Comment: Accepted for publication in APAL (special issue, Classical Logic and Computation).
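
    To make concrete what "both additively and multiplicatively formulated disjunction rules" means in a one-sided sequent calculus, the two standard rule shapes can be written as follows (our rendering of the usual additive/multiplicative split, not necessarily the exact rules of LK*):

        \frac{\vdash \Gamma, A, B}{\vdash \Gamma, A \vee B}\ (\vee\text{-mult})
        \qquad
        \frac{\vdash \Gamma, A_i}{\vdash \Gamma, A_1 \vee A_2}\ (\vee\text{-add},\; i \in \{1,2\})

    The multiplicative form derives the disjunction from both disjuncts in a single premiss, while the additive form keeps only one of them.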

    Combinatorial Flows and Their Normalisation

    This paper introduces combinatorial flows, which generalize combinatorial proofs by additionally admitting cut and substitution as methods of proof compression. We show a normalization procedure for combinatorial flows, and how syntactic proofs are translated into combinatorial flows and vice versa.

    A System for Deduction-based Formal Verification of Workflow-oriented Software Models

    This work concerns the formal verification of workflow-oriented software models using a deductive approach; the formal correctness of a model's behaviour is considered. Manually building logical specifications, understood as sets of temporal logic formulas, is a significant obstacle for an inexperienced user applying the deductive approach. We propose a system, and its architecture, for the deduction-based verification of workflow-oriented models. The inference process is based on the semantic tableaux method, which has some advantages over traditional deduction strategies. We propose an algorithm for the automatic generation of logical specifications. The generation procedure is based on predefined workflow patterns for BPMN, a standard and dominant notation for the modeling of business processes. The main idea of the approach is to treat patterns, defined in terms of temporal logic, as (logical) primitives which enable the transformation of models into temporal logic formulas constituting a logical specification. Automating the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully deepening our understanding of, the deduction-based formal verification of workflow-oriented models. Comment: International Journal of Applied Mathematics and Computer Science.
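
    The core idea, treating workflow patterns as logical primitives that expand into temporal-logic formulas, can be sketched as follows; the pattern names, propositions, and formula shapes are hypothetical stand-ins for illustration, not the paper's actual BPMN pattern library:

        # Hypothetical sketch: expand workflow patterns into LTL strings.
        # G = "globally", F = "finally", -> = implication.

        def seq(a: str, b: str) -> str:
            # Sequence: whenever activity a completes, b eventually starts.
            return f"G(done_{a} -> F(start_{b}))"

        def exclusive_choice(cond: str, a: str, b: str) -> str:
            # Exclusive choice: the condition selects exactly one branch.
            return (f"G(({cond} -> F(start_{a})) & "
                    f"(!{cond} -> F(start_{b})))")

        def specification(patterns: list[str]) -> str:
            # A logical specification is the conjunction of all
            # pattern instances found in the model.
            return " & ".join(patterns)

        spec = specification([
            seq("ReceiveOrder", "CheckStock"),
            exclusive_choice("in_stock", "ShipOrder", "NotifyCustomer"),
        ])
        print(spec)  # input for a semantic-tableaux prover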

    Worm Algorithm for Problems of Quantum and Classical Statistics

    This is a chapter of the multi-author book "Understanding Quantum Phase Transitions," edited by Lincoln Carr and published by Taylor and Francis. In this chapter, we give a general introduction to the worm algorithm and present important results highlighting the power of the approach. Comment: 27 pages, 15 figures; chapter in a book.
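
    To give a flavour of the method, here is a heavily simplified worm update for the high-temperature (closed-loop) representation of the 2D Ising model. This is a sketch of the general idea only: normalisation constants are omitted, the susceptibility estimator is schematic, and none of it should be read as the chapter's algorithms.

        import math
        import random

        L = 16                   # linear lattice size (periodic)
        beta = 0.4               # inverse temperature, J = 1
        t = math.tanh(beta)      # bond weight in the high-T expansion
        bonds = {}               # frozenset({site1, site2}) -> 0/1

        def neighbours(site):
            x, y = site
            return [((x + 1) % L, y), ((x - 1) % L, y),
                    (x, (y + 1) % L), (x, (y - 1) % L)]

        def worm_run(steps=100000):
            """Move one worm end (masha); the other (ira) stays fixed.
            Returns (G-sector hits, Z-sector hits), whose ratio is
            proportional to the magnetic susceptibility."""
            ira = masha = (random.randrange(L), random.randrange(L))
            g_hits = z_hits = 0
            for _ in range(steps):
                j = random.choice(neighbours(masha))
                b = frozenset((masha, j))
                occupied = bonds.get(b, 0)
                # Toggling a bond changes the weight by t or 1/t;
                # accept with the Metropolis probability.
                p = 1.0 / t if occupied else t
                if random.random() < min(1.0, p):
                    bonds[b] = 1 - occupied
                    masha = j
                g_hits += 1
                if masha == ira:
                    # Closed configuration: count it and relocate the worm.
                    z_hits += 1
                    ira = masha = (random.randrange(L), random.randrange(L))
            return g_hits, z_hits

        g, z = worm_run()
        print("susceptibility estimator ~", g / max(z, 1))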

    Programming Quantum Computers Using Design Automation

    Recent developments in quantum hardware indicate that systems featuring more than 50 physical qubits are within reach. At this scale, classical simulation will no longer be feasible, and there is a possibility that such quantum devices may outperform even classical supercomputers at certain tasks. With the rapid growth of qubit numbers and coherence times comes the increasingly difficult challenge of quantum program compilation: the translation of a high-level description of a quantum algorithm into hardware-specific low-level operations which can be carried out by the quantum device. Some parts of the calculation may still be performed manually due to the lack of efficient methods; this, in turn, may lead to a design gap which will prevent the programming of a quantum computer. In this paper, we discuss the challenges in fully automatic quantum compilation and motivate directions for future research to tackle them. Yet, with the algorithms and approaches that exist today, we demonstrate how to automatically perform the quantum programming flow from algorithm to a physical quantum computer for a simple algorithmic benchmark, namely the hidden shift problem. We present and use two tool flows which invoke RevKit: one based on ProjectQ, targeting the IBM Quantum Experience or a local simulator, and one based on Microsoft's quantum programming language Q#. Comment: 10 pages, 10 figures. To appear in: Proceedings of Design, Automation and Test in Europe (DATE 2018).
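
    As a rough illustration of the hidden shift benchmark, a hand-written ProjectQ circuit for the self-dual inner-product bent function f(x) = x0*x1 XOR x2*x3 XOR ... might look as follows. This is our sketch of the standard algorithm on the local simulator, not the generated code of the paper's tool flows:

        from projectq import MainEngine
        from projectq.ops import All, CZ, H, Measure, X

        def hidden_shift(shift):
            """Recover the hidden shift s, where g(x) = f(x XOR s)
            and f is the inner-product bent function (self-dual)."""
            n = len(shift)          # n must be even
            eng = MainEngine()      # defaults to a local simulator
            q = eng.allocate_qureg(n)

            All(H) | q
            # Phase oracle for g: conjugate the oracle for f by X^s.
            for i, bit in enumerate(shift):
                if bit:
                    X | q[i]
            for i in range(0, n, 2):
                CZ | (q[i], q[i + 1])   # phase (-1)^(x_i * x_{i+1})
            for i, bit in enumerate(shift):
                if bit:
                    X | q[i]
            All(H) | q
            # Phase oracle for the dual of f (equal to f itself here).
            for i in range(0, n, 2):
                CZ | (q[i], q[i + 1])
            All(H) | q

            All(Measure) | q
            eng.flush()
            return [int(bit) for bit in q]

        print(hidden_shift([1, 0, 1, 1]))  # expected output: [1, 0, 1, 1]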

    Relational interpretation of the wave function and a possible way around Bell's theorem

    The famous "spooky action at a distance" in the EPR scenario is shown to be a local interaction, once entanglement is interpreted as a kind of "nearest neighbor" relation among quantum systems. Furthermore, the wave function itself is interpreted as encoding the "nearest neighbor" relations between a quantum system and spatial points. This interpretation becomes natural if we view space and distance in terms of relations among spatial points; "position" thus becomes a purely relational concept. This relational picture leads to a new perspective on the quantum mechanical formalism, in which many of the "weird" aspects, like the particle-wave duality, the non-locality of entanglement, or the "mystery" of the double-slit experiment, disappear. Furthermore, this picture circumvents the restrictions set by Bell's inequalities, i.e., a possible (realistic) hidden variable theory based on these concepts can be local and at the same time reproduce the results of quantum mechanics. Comment: Accepted for publication in the International Journal of Theoretical Physics.

    On linear rewriting systems for Boolean logic and some applications to proof theory

    Linear rules have played an increasing role in structural proof theory in recent years. It has been observed that the set of all sound linear inference rules in Boolean logic is already coNP-complete, i.e. that every Boolean tautology can be written as a (left- and right-)linear rewrite rule. In this paper we study properties of systems consisting only of linear inferences. Our main result is that the length of any 'nontrivial' derivation in such a system is bounded by a polynomial. As a consequence, there is no polynomial-time decidable sound and complete system of linear inferences unless coNP = NP. We draw tools and concepts from term rewriting, Boolean function theory and graph theory in order to obtain some required intermediate results. At the same time we make several connections between these areas that, to our knowledge, have not yet been presented, and which constitute a rich theoretical framework for reasoning about linear TRSs for Boolean logic. Comment: 27 pages, 3 figures; special issue of RTA 2015.
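
    As a concrete example of a sound linear inference, the 'medial' rule rewrites (A and B) or (C and D) to (A or C) and (B or D): every variable occurs exactly once on each side, and the implication is a tautology. A brute-force soundness check is straightforward (a small illustration of ours, not tooling from the paper):

        from itertools import product

        def implies(lhs, rhs, num_vars):
            """Check the inference lhs -> rhs by truth tables
            (exponential, matching the coNP nature of the problem)."""
            return all(not lhs(*v) or rhs(*v)
                       for v in product([False, True], repeat=num_vars))

        # Medial: (A and B) or (C and D)  ->  (A or C) and (B or D).
        medial_lhs = lambda a, b, c, d: (a and b) or (c and d)
        medial_rhs = lambda a, b, c, d: (a or c) and (b or d)
        assert implies(medial_lhs, medial_rhs, 4)

        # The converse fails (e.g. a=d=True, b=c=False), so medial is a
        # genuine rewrite direction rather than an equivalence.
        assert not implies(medial_rhs, medial_lhs, 4)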