4,768 research outputs found
The Principle Of Excluded Middle Then And Now: Aristotle And Principia Mathematica
The prevailing truth-functional logic of the twentieth century, it is argued, is incapable of expressing the subtlety and richness of Aristotle's Principle of Excluded Middle, and hence cannot but misinterpret it. Furthermore, the manner in which truth-functional logic expresses its own Principle of Excluded Middle is less than satisfactory in its application to mathematics. Finally, there are glimpses of the "realism" which is the metaphysics demanded by twentieth century logic, with the remarkable consequence that Classical logic is a particularly inept instrument for analyzing those philosophies which stand opposed to the "realism" it demands.
Inconsistent boundaries
Research on this paper was supported by a grant from the Marsden Fund, Royal Society of New Zealand. Mereotopology is a theory of connected parts. The existence of boundaries, as parts of everyday objects, is basic to any such theory; but in classical mereotopology there is a problem: if boundaries exist, then either distinct entities cannot be in contact, or else space is not topologically connected (Varzi in Noûs 31:26–58, 1997). In this paper we urge that this problem can be met with a paraconsistent mereotopology, and sketch the details of one such approach. The resulting theory focuses attention on the role of empty parts in delivering a balanced and bounded metaphysics of naive space.
Logic Programming for Describing and Solving Planning Problems
A logic programming paradigm which expresses solutions to problems as stable models has recently been promoted as a declarative approach to solving various combinatorial and search problems, including planning problems. In this paradigm, all program rules are considered as constraints and solutions are stable models of the rule set. This is a rather radical departure from the standard paradigm of logic programming. In this paper we revisit abductive logic programming and argue that it allows a programming style which is as declarative as programming based on stable models. However, within abductive logic programming one has two kinds of rules: on the one hand, predicate definitions (which may depend on the abducibles), which are nothing other than standard logic programs (with their non-monotonic semantics when they contain negation); on the other hand, rules which constrain the models for the abducibles. In this sense abductive logic programming is a smooth extension of the standard paradigm of logic programming, not a radical departure.
Comment: 8 pages, no figures, Eighth International Workshop on Nonmonotonic Reasoning, special track on Representing Actions and Planning
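The stable-model semantics underlying this paradigm can be illustrated with a minimal brute-force checker. The sketch below is not the abductive framework of the paper; it is only the standard Gelfond-Lifschitz test for stable models of a tiny hypothetical propositional program (the rules `p :- not q.` and `q :- not p.` are an illustrative example, not taken from the paper):

```python
from itertools import chain, combinations

# A propositional normal program: each rule is (head, positive_body, negative_body).
# Toy example: p :- not q.  and  q :- not p.
rules = [
    ("p", [], ["q"]),
    ("q", [], ["p"]),
]
atoms = {"p", "q"}

def least_model(positive_rules):
    """Least model of a negation-free program, by naive fixpoint iteration."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, body in positive_rules:
            if head not in model and all(a in model for a in body):
                model.add(head)
                changed = True
    return model

def is_stable(candidate):
    """Gelfond-Lifschitz test: candidate is stable iff it equals the
    least model of the program's reduct with respect to candidate."""
    reduct = [(h, pos) for h, pos, neg in rules
              if not any(a in candidate for a in neg)]
    return least_model(reduct) == candidate

def powerset(s):
    s = list(s)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

stable_models = [set(c) for c in powerset(atoms) if is_stable(set(c))]
print(stable_models)  # the two stable models {p} and {q}, in some order
```

This pair of rules has exactly two stable models, {p} and {q}, which is how such a program expresses a binary choice when solutions are read off as stable models.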
Efficient long division via Montgomery multiply
We present a novel right-to-left long division algorithm based on the Montgomery modular multiply, consisting of separate highly efficient loops with simple carry structure for computing first the remainder (x mod q) and then the quotient floor(x/q). These loops are ideally suited for the case where x occupies many more machine words than the divide modulus q, and are strictly linear time in the "bitsize ratio" lg(x)/lg(q). For the paradigmatic performance test of multiword dividend and single 64-bit-word divisor, exploitation of the inherent data-parallelism of the algorithm effectively mitigates the long latency of hardware integer MUL operations, as a result of which we are able to achieve respective costs for remainder-only and full-DIV (remainder and quotient) of 6 and 12.5 cycles per dividend word on the Intel Core 2 implementation of the x86_64 architecture, in single-threaded execution mode. We further describe a simple "bit-doubling modular inversion" scheme, which allows the entire iterative computation of the mod-inverse required by the Montgomery multiply at arbitrarily large precision to be performed with cost less than that of a single Newtonian iteration performed at the full precision of the final result. We also show how Montgomery-multiply-based powering can be efficiently used in Mersenne and Fermat-number trial factorization via direct computation of a modular inverse power of 2, without any need for explicit radix-mod scalings.
Comment: 23 pages; 8 tables. v2: Tweak formatting, pagecount -= 2. v3: Fix incorrect powers of R in formulae [7] and [11]. v4: Add Eldridge & Walter ref. v5: Clarify relation between Algos A/A',D and Hensel-div; clarify true-quotient mechanics; add Haswell timings, refs to Agner Fog timings pdf and GMP asm-timings ref-page. v6: Remove stray +bw in MULL line of Algo D listing; add note re byte-LUT for qinv_
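The "bit-doubling" inversion mentioned in the abstract rests on a well-known Newton/Hensel identity: if inv is correct modulo 2^b, then inv*(2 - q*inv) is correct modulo 2^(2b), so the number of valid low-order bits doubles per step. A minimal Python sketch of that doubling iteration (the function name and word sizes here are illustrative, not the paper's):

```python
def mont_inverse(q, k):
    """Inverse of odd q modulo 2**k via Newton/Hensel 'bit-doubling':
    each iteration doubles the number of correct low-order bits."""
    assert q % 2 == 1, "q must be odd to be invertible mod a power of 2"
    inv, bits = 1, 1   # q * 1 == 1 (mod 2), so inv is correct to 1 bit
    while bits < k:
        bits *= 2
        inv = (inv * (2 - q * inv)) % (1 << bits)
    return inv % (1 << k)

q = 0xDEADBEEF                     # any odd modulus
qinv = mont_inverse(q, 64)
assert (q * qinv) % (1 << 64) == 1  # qinv is the Montgomery mod-inverse of q
```

Reaching a 64-bit inverse this way takes only six doubling steps from the trivial 1-bit seed, which is why the cost of the whole iteration is dominated by its final step.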
Weak Assertion
We present an inferentialist account of the epistemic modal operator 'might'. Our starting point is the bilateralist programme. A bilateralist explains the operator 'not' in terms of the speech act of rejection; we explain the operator 'might' in terms of weak assertion, a speech act whose existence we argue for on the basis of linguistic evidence. We show that our account of 'might' provides a solution to certain well-known puzzles about the semantics of modal vocabulary whilst retaining classical logic. This demonstrates that an inferentialist approach to meaning can be successfully extended beyond the core logical constants.
Automatic Generation of Proof Tactics for Finite-Valued Logics
A number of flexible tactic-based logical frameworks are nowadays available that can implement a wide range of mathematical theories using a common higher-order metalanguage. Used as proof assistants, one of the advantages of such powerful systems resides in their responsiveness to extensibility of their reasoning capabilities, being designed over rule-based programming languages that allow the user to build her own `programs to construct proofs' - the so-called proof tactics.
The present contribution discusses the implementation of an algorithm that generates sound and complete tableau systems for a very inclusive class of sufficiently expressive finite-valued propositional logics, and then illustrates some of the challenges and difficulties related to the algorithmic formation of automated theorem-proving tactics for such logics. The procedure whose implementation we report on is based on a generalized notion of analyticity of proof systems that is intended to guarantee termination of the corresponding automated tactics as regards theoremhood in the targeted logics.
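In a finite-valued propositional logic, theoremhood over a fixed set of connectives is decidable by brute force: a formula is a theorem iff it takes a designated value under every valuation. The sketch below is not the paper's tableau generator; it is only a hypothetical illustration using Łukasiewicz three-valued logic (values 0, 1/2, 1, with 1 designated) as one concrete finite-valued logic:

```python
from itertools import product

# Łukasiewicz three-valued logic: an illustrative finite-valued instance.
VALUES = (0, 0.5, 1)   # false, undetermined, true (all exactly representable)
DESIGNATED = {1}

def neg(a):
    return 1 - a

def implies(a, b):
    return min(1, 1 - a + b)

def is_theorem(formula, arity):
    """Brute force: formula is a theorem iff it is designated under
    every assignment of truth values to its propositional variables."""
    return all(formula(*vals) in DESIGNATED
               for vals in product(VALUES, repeat=arity))

print(is_theorem(lambda p: implies(p, p), 1))   # True: p -> p holds
print(is_theorem(lambda p: max(p, neg(p)), 1))  # False: excluded middle fails at p = 1/2
```

This exhaustive check grows exponentially in the number of variables, which is precisely why analytic tableau systems with guaranteed termination, as generated by the algorithm the paper reports on, are of practical interest.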
Making proofs without Modus Ponens: An introduction to the combinatorics and complexity of cut elimination
This paper is intended to provide an introduction to cut elimination which is accessible to a broad mathematical audience. Gentzen's cut elimination theorem is not as well known as it deserves to be, and it is tied to a lot of interesting mathematical structure. In particular we try to indicate some dynamical and combinatorial aspects of cut elimination, as well as its connections to complexity theory. We discuss two concrete examples where one can see the structure of short proofs with cuts, one concerning feasible numbers and the other concerning "bounded mean oscillation" from real analysis.
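For readers unfamiliar with the sequent calculus, the rule at issue can be stated in standard notation (this is the usual textbook form of Gentzen's cut rule, the sequent-level analogue of modus ponens that the Hauptsatz shows to be eliminable):

```latex
\[
\frac{\Gamma \vdash \Delta, A \qquad A, \Sigma \vdash \Pi}
     {\Gamma, \Sigma \vdash \Delta, \Pi}
\;\textsc{(cut)}
\]
```

Eliminating every instance of this rule from a proof can blow up its size, which is the combinatorial and complexity-theoretic phenomenon the paper surveys.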
On Classical Decidable Logics Extended with Percentage Quantifiers and Arithmetics
During the last decades, a lot of effort was put into identifying decidable fragments of first-order logic. Such efforts gave birth, among others, to the two-variable fragment and the guarded fragment, depending on the type of restriction imposed on formulae of the language. Despite the success of the mentioned logics in areas like formal verification and knowledge representation, such first-order fragments are too weak to express even the simplest statistical constraints, required for modelling influence networks or for statistical reasoning.
In this work we investigate extensions of these classical decidable logics with percentage quantifiers, specifying how frequently a formula is satisfied in the intended model. We show, surprisingly, that all the mentioned decidable fragments become undecidable under such extension, sharpening the existing results in the literature. Our negative results are supplemented by decidability of the two-variable guarded fragment with even more expressive counting, namely Presburger constraints. Our results can be applied to infer decidability of various modal and description logics, e.g. Presburger Modal Logics with Converse or ALCI, with expressive cardinality constraints.
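Over a finite model, the semantics of a percentage quantifier is straightforward to state: "at least P% of domain elements satisfy phi". A tiny hypothetical Python check of that reading (the function name and threshold convention are illustrative, not the paper's syntax):

```python
def percent_holds(domain, phi, threshold):
    """True iff at least `threshold` percent of `domain` satisfies `phi`.
    Integer arithmetic avoids floating-point rounding at the boundary."""
    satisfied = sum(1 for d in domain if phi(d))
    return 100 * satisfied >= threshold * len(domain)

domain = range(10)
assert percent_holds(domain, lambda x: x % 2 == 0, 50)      # 5 of 10 are even: exactly 50%
assert not percent_holds(domain, lambda x: x % 3 == 0, 50)  # only 4 of 10: 40% < 50%
```

The undecidability results of the paper concern such quantifiers interpreted over arbitrary (possibly infinite) models, where no such direct counting procedure is available.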