Discriminator logics (Research announcement)
A discriminator logic is the 1-assertional logic of a discriminator variety V having two constant terms 0 and 1 such that V ⊨ 0 ≈ 1 iff every member of V is trivial. Examples of such logics abound in the literature. The main result of this research announcement asserts that a certain non-Fregean deductive system SBPC, which closely resembles the classical propositional calculus, is canonical for the class of discriminator logics in the sense that any discriminator logic S can be presented (up to definitional equivalence) as an axiomatic extension of SBPC by a set of extensional logical connectives taken from the language of S. The results outlined in this research announcement are extended to several generalisations of the class of discriminator logics in the main work.
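Discriminator varieties are built around the ternary discriminator term t(x, y, z), which returns z when its first two arguments are equal and x otherwise. A minimal illustration of that defining behaviour (not from the announcement; written in Python for concreteness):

```python
def discriminator(x, y, z):
    """Ternary discriminator: t(x, y, z) = z if x == y, else x."""
    return z if x == y else x

# On the two-element algebra {0, 1}:
assert discriminator(0, 0, 1) == 1  # equal first arguments: pass z through
assert discriminator(0, 1, 1) == 0  # unequal first arguments: return x
```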
Command injection attacks, continuations, and the Lambek calculus
This paper shows connections between command injection attacks,
continuations, and the Lambek calculus: certain command injections, such as the
tautology attack on SQL, are shown to be a form of control effect that can be
typed using the Lambek calculus, generalizing the double-negation typing of
continuations. Lambek's syntactic calculus is a logic with two implicational
connectives taking their arguments from the left and right, respectively. These
connectives describe how strings interact with their left and right contexts
when building up syntactic structures. The calculus is a form of propositional
logic without structural rules, and so a forerunner of substructural logics
like Linear Logic and Separation Logic.
Comment: In Proceedings WoC 2015, arXiv:1606.0583
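The tautology attack mentioned above is the classic SQL injection in which user input closes the surrounding string literal and inserts an always-true disjunct. A minimal illustrative sketch (the function name is hypothetical, not from the paper):

```python
def build_query(user_name):
    # Vulnerable: user input is spliced directly into the SQL string.
    return "SELECT * FROM users WHERE name = '" + user_name + "'"

# The attack input closes the string literal and injects OR '1'='1':
attack = "' OR '1'='1"
query = build_query(attack)
assert query == "SELECT * FROM users WHERE name = '' OR '1'='1'"
```

The injected fragment escapes its intended context of "string between two quotes", which is exactly the control-effect reading the paper gives it.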
Double-Negation Elimination in Some Propositional Logics
This article answers two questions (posed in the literature), each concerning
the guaranteed existence of proofs free of double negation. A proof is free of
double negation if none of its deduced steps contains a term of the form
n(n(t)) for some term t, where n denotes negation. The first question asks for
conditions on the hypotheses that, if satisfied, guarantee the existence of a
double-negation-free proof when the conclusion is free of double negation. The
second question asks about the existence of an axiom system for classical
propositional calculus whose use, for theorems with a conclusion free of double
negation, guarantees the existence of a double-negation-free proof. After
giving conditions that answer the first question, we answer the second question
by focusing on the Lukasiewicz three-axiom system. We then extend our studies
to infinite-valued sentential calculus and to intuitionistic logic and
generalize the notion of being double-negation free. The double-negation proofs
of interest rely exclusively on the inference rule condensed detachment, a rule
that combines modus ponens with an appropriately general rule of substitution.
The automated reasoning program OTTER played an indispensable role in this
study.
Comment: 32 pages, no figure
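The paper's central syntactic condition, that no deduced step contains a subterm of the form n(n(t)), is easy to state as a check on term trees. A small sketch (the term representation is our own assumption):

```python
def double_negation_free(term):
    """True iff no subterm has the form n(n(t)).
    Terms: strings for variables, tuples like ('n', t) or ('i', a, b)."""
    if isinstance(term, str):
        return True
    if term[0] == 'n' and isinstance(term[1], tuple) and term[1][0] == 'n':
        return False
    return all(double_negation_free(sub) for sub in term[1:])

assert double_negation_free(('i', 'p', ('n', 'p')))             # i(p, n(p))
assert not double_negation_free(('i', ('n', ('n', 'p')), 'q'))  # contains n(n(p))
```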
Proceedings of the Workshop on Linear Logic and Logic Programming
Declarative programming languages often fail to effectively address many aspects of control and resource management. Linear logic provides a framework for increasing the strength of declarative programming languages to embrace these aspects. Linear logic has been used to provide new analyses of Prolog's operational semantics, including left-to-right/depth-first search and negation-as-failure. It has also been used to design new logic programming languages for handling concurrency and for viewing program clauses as (possibly) limited resources. Such logic programming languages have proved useful in areas such as databases, object-oriented programming, theorem proving, and natural language parsing.
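The "program clauses as limited resources" reading can be pictured with a toy prover in which each fact is consumed when used, so proving a goal twice requires two copies of it. A sketch under our own encoding (not from the proceedings):

```python
def prove(goals, facts):
    """Toy linear prover: each fact in `facts` may be used at most once."""
    if not goals:
        return True
    first, rest = goals[0], goals[1:]
    return any(
        fact == first and prove(rest, facts[:i] + facts[i + 1:])
        for i, fact in enumerate(facts)
    )

assert prove(['a', 'a'], ['a', 'a'])      # two copies of the resource: succeeds
assert not prove(['a', 'a'], ['a', 'b'])  # only one copy of 'a': fails linearly
```

In ordinary (classical) resolution a fact could be reused freely; removing the used fact from the resource list is what makes the search linear.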
This workshop is intended to bring together researchers involved in all aspects of relating linear logic and logic programming. The proceedings includes two high-level overviews of linear logic, and six contributed papers.
Workshop organizers: Jean-Yves Girard (CNRS and University of Paris VII), Dale Miller (chair, University of Pennsylvania, Philadelphia), and Remo Pareschi, (ECRC, Munich)
A Type Checker for a Logical Framework with Union and Intersection Types
We present the syntax, semantics, and typing rules of Bull, a prototype
theorem prover based on the Delta-Framework, i.e. a fully-typed lambda-calculus
decorated with union and intersection types, as described in previous papers by
the authors. Bull also implements a subtyping algorithm for the Type Theory Xi
of Barbanera-Dezani-de'Liguoro. Bull has a command-line interface where the
user can declare axioms and terms, perform computations, and use some basic
terminal-style features such as error pretty-printing, subexpression
highlighting, and file loading. Moreover, it can typecheck a proof or normalize
it. These terms can be incomplete, so the typechecking algorithm uses
unification to try to construct the missing subterms. Bull uses the syntax of
Berardi's Pure Type Systems to improve the compactness and the modularity of
the kernel. Abstract and concrete syntax are mostly aligned and similar to the
concrete syntax of Coq. Bull uses a higher-order unification algorithm for
terms, while typechecking and partial type inference are done by a
bidirectional refinement algorithm, similar to the one found in Matita and
Beluga. The refinement can be split into two parts: the essence refinement and
the typing refinement. Binders are implemented using commonly-used de Bruijn
indices. We have defined a concrete language syntax that will allow the user to
write Delta-terms. We have defined the reduction rules and an evaluator. We
have implemented from scratch a refiner which does partial typechecking and
type reconstruction. We have experimented with Bull on classical examples from
the intersection and union type literature, such as the ones formalized by
Pfenning with his Refinement Types in LF. We hope that this line of research
will prove useful for experimenting, in a proof-theoretical setting, with forms
of polymorphism alternative to Girard's parametric one.
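Since Bull represents binders with de Bruijn indices, a core operation is shifting indices when a term moves under a binder. A minimal sketch of that operation for plain lambda terms (our own encoding, far simpler than Bull's Delta-terms):

```python
def shift(term, d, cutoff=0):
    """Add d to every free de Bruijn index >= cutoff.
    Terms: int (index), ('lam', body), or ('app', f, a)."""
    if isinstance(term, int):
        return term + d if term >= cutoff else term
    if term[0] == 'lam':
        # Under a binder, index 0 becomes bound, so raise the cutoff.
        return ('lam', shift(term[1], d, cutoff + 1))
    return ('app', shift(term[1], d, cutoff), shift(term[2], d, cutoff))

# Shifting under a binder leaves the bound variable 0 alone:
assert shift(('lam', ('app', 0, 1)), 1) == ('lam', ('app', 0, 2))
```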