A Semantic Completeness Proof for TaMeD
Deduction modulo is a theoretical framework designed to introduce computational steps into deductive systems. This approach is well suited to automated theorem proving, and a tableau method for first-order classical deduction modulo has been developed. We reformulate this method and give an (almost constructive) semantic completeness proof. This new proof allows us to extend the completeness theorem to several classes of rewrite systems used for computations in deduction modulo. We are then able to build a counter-model when a proof fails for these systems.
Deduction modulo theory
This paper is a survey of deduction modulo theory.
Semantic A-translation and Super-consistency entail Classical Cut Elimination
We show that if a theory R defined by a rewrite system is super-consistent,
the classical sequent calculus modulo R enjoys the cut elimination property,
which was an open question. For such theories it was already known that proofs
strongly normalize in natural deduction modulo R, and that cut elimination
holds in the intuitionistic sequent calculus modulo R. We first define a
syntactic and a semantic version of Friedman's A-translation, showing that it
preserves the structure of pseudo-Heyting algebra, our semantic framework. Then
we relate the interpretation of a theory in the A-translated algebra and its
A-translation in the original algebra. This allows us to show the stability of the
super-consistency criterion and the cut elimination theorem.
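For context, the A-translation mentioned in the abstract is Friedman's standard construction, parameterized by a fixed formula A. The clauses below are the usual textbook definition (notation ours), not a quotation from the paper:

```latex
% Friedman's A-translation: relativize falsity and atoms to a fixed formula A.
\begin{align*}
\bot^{A} &\equiv A \\
P^{A} &\equiv P \lor A \quad \text{($P$ atomic)} \\
(B \land C)^{A} &\equiv B^{A} \land C^{A} \\
(B \lor C)^{A} &\equiv B^{A} \lor C^{A} \\
(B \Rightarrow C)^{A} &\equiv B^{A} \Rightarrow C^{A} \\
(\forall x\, B)^{A} &\equiv \forall x\, B^{A} \\
(\exists x\, B)^{A} &\equiv \exists x\, B^{A}
\end{align*}
```

Its key property is that an intuitionistic proof of B yields an intuitionistic proof of B^A; the semantic version discussed in the abstract lifts this operation from formulas to pseudo-Heyting algebras.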
Changing a semantics: opportunism or courage?
The generalized models for higher-order logics introduced by Leon Henkin, and
their multiple offspring over the years, have become a standard tool in many
areas of logic. Even so, discussion has persisted about their technical status,
and perhaps even their conceptual legitimacy. This paper gives a systematic
view of generalized model techniques, discusses what they mean in mathematical
and philosophical terms, and presents a few technical themes and results about
their role in algebraic representation, calibrating provability, lowering
complexity, understanding fixed-point logics, and achieving set-theoretic
absoluteness. We also show how thinking about Henkin's approach to semantics of
logical systems in this generality can yield new results, dispelling the
impression of adhocness. This paper is dedicated to Leon Henkin, a deep
logician who has changed the way we all work, while also being an always open,
modest, and encouraging colleague and friend. (Comment: 27 pages. To appear in: The life and work of Leon Henkin: Essays on his contributions (Studies in Universal Logic), eds. Manzano, M., Sain, I. and Alonso, E., 201)
Sequentiality vs. Concurrency in Games and Logic
Connections between the sequentiality/concurrency distinction and the
semantics of proofs are investigated, with particular reference to games and
Linear Logic. (Comment: 35 pages, appeared in Mathematical Structures in Computer Science)
Mistakes in medical ontologies: Where do they come from and how can they be detected?
We present the details of a methodology for quality assurance in large medical terminologies and describe three algorithms that can help terminology developers and users to identify potential mistakes. The methodology is based in part on linguistic criteria and in part on logical and ontological principles governing sound classifications. We conclude by outlining the results of applying the methodology, in the form of a taxonomy of the different types of errors and potential errors detected in SNOMED-CT.
A Combination Framework for Complexity
In this paper we present a combination framework for polynomial complexity
analysis of term rewrite systems. The framework covers both derivational and
runtime complexity analysis. We present generalisations of powerful complexity
techniques, notably a generalisation of complexity pairs and (weak) dependency
pairs. Finally, we also present a novel technique, called dependency graph
decomposition, that in the dependency pair setting greatly increases
modularity. We employ the framework in the automated complexity tool TCT. TCT
implements a majority of the techniques found in the literature, witnessing
that our framework is general enough to capture a very broad setting.
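To make the objects of study concrete, here is a minimal, hypothetical sketch (not TCT code, and not the paper's techniques) of a term rewrite system for addition on Peano numerals, with runtime complexity measured as the number of rewrite steps needed to reach a normal form:

```python
# A toy term rewrite system (TRS) -- illustrative only; the term
# representation and function names are assumptions for this sketch.
# Terms are nested tuples: ("0",), ("s", t), ("add", t1, t2).
# Rules:  add(0, y) -> y        add(s(x), y) -> s(add(x, y))

def zero():
    return ("0",)

def s(t):
    return ("s", t)

def add(x, y):
    return ("add", x, y)

def rewrite_step(t):
    """Try one rewrite step, at the root first, then in subterms.
    Returns (new_term, fired)."""
    if t[0] == "add":
        lhs, rhs = t[1], t[2]
        if lhs == ("0",):            # add(0, y) -> y
            return rhs, True
        if lhs[0] == "s":            # add(s(x), y) -> s(add(x, y))
            return s(add(lhs[1], rhs)), True
    for i in range(1, len(t)):       # otherwise descend into subterms
        sub, fired = rewrite_step(t[i])
        if fired:
            return t[:i] + (sub,) + t[i + 1:], True
    return t, False

def normalise(t):
    """Rewrite to normal form; the step count is the length of this
    derivation, the quantity a runtime-complexity analysis bounds."""
    steps = 0
    while True:
        t, fired = rewrite_step(t)
        if not fired:
            return t, steps
        steps += 1

def num(n):
    """Build the Peano numeral s^n(0)."""
    t = zero()
    for _ in range(n):
        t = s(t)
    return t

# add(s^3(0), s^2(0)) reaches s^5(0) in 4 steps: the runtime is linear
# in the size of the first argument.
nf, steps = normalise(add(num(3), num(2)))
# nf == num(5), steps == 4
```

A runtime-complexity analysis of this TRS would report a linear bound on derivations starting from such "basic" terms, which this step counter confirms empirically for individual inputs.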
Axiom directed Focusing
Superdeduction and deduction modulo are methods specially designed to ease the use of first-order theories in predicate logic. Superdeduction modulo, which combines both, enables the user to make a distinct use of computational and reasoning axioms. Although soundness is ensured, using superdeduction and deduction modulo to extend deduction with awkward theories can jeopardize essential properties of the extended system, such as cut-elimination or completeness with respect to predicate logic. Therefore one has to design criteria for theories which can safely be used through superdeduction and deduction modulo. In this paper we revisit the superdeduction paradigm by comparing it with the focusing approach. In particular, we prove a focalization theorem for cut-free superdeduction modulo: we show that permutations of inference rules can transform any cut-free proof in deduction modulo into a cut-free proof in superdeduction modulo, and conversely, provided that some hypotheses on the synchrony of reasoning axioms are verified. This implies that cut-elimination for deduction modulo and for superdeduction modulo are equivalent. Since several criteria have already been proposed for theories that do not break cut-elimination of the corresponding deduction modulo system, these criteria also imply cut-elimination of the superdeduction modulo system, provided our synchrony hypotheses hold. Finally, we design a tableaux method for superdeduction modulo which is sound and complete provided cut-elimination holds.
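As a hedged illustration of what superdeduction does (a standard example from the superdeduction literature, not taken from this abstract; notation ours): a reasoning axiom presented as a proposition rewrite rule is compiled into custom inference rules. For the inclusion axiom, oriented as

```latex
% Proposition rewrite rule for set inclusion:
%   A \subseteq B  ->  \forall x\, (x \in A \Rightarrow x \in B)
% superdeduction derives, among others, the right introduction rule
\[
\frac{\Gamma,\; x \in A \vdash x \in B,\; \Delta}
     {\Gamma \vdash A \subseteq B,\; \Delta}
\;\subseteq_{R}
\qquad (x \text{ fresh in } \Gamma, \Delta)
\]
```

such derived rules internalize the axiom's logical structure in one step, which is what the synchrony hypotheses in the abstract constrain.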
Making use of logic
It seems that Polish logic has always been open to considerations concerning the use of methods and results of formal logic within other disciplines. We overview a couple of such Polish contributions to what may be called the realm of applied logic. We take a closer look at the formalization of natural reasoning, at inconsistency-tolerant logics, and at the formal analysis of causal nexus.