Complexity of Nested Circumscription and Nested Abnormality Theories
The need for a circumscriptive formalism that allows for simple yet elegant
modular problem representation has led Lifschitz (AIJ, 1995) to introduce
nested abnormality theories (NATs) as a tool for modular knowledge
representation, tailored for applying circumscription to minimize exceptional
circumstances. Abstracting from this particular objective, we propose L_{CIRC},
which is an extension of generic propositional circumscription by allowing
propositional combinations and nesting of circumscriptive theories. As shown,
NATs are naturally embedded into this language, and are in fact of equal
expressive capability. We then analyze the complexity of L_{CIRC} and NATs, and
in particular the effect of nesting. The latter is found to be a source of
complexity, which climbs the Polynomial Hierarchy as the nesting depth
increases and reaches PSPACE-completeness in the general case. We also identify
meaningful syntactic fragments of NATs which have lower complexity. In
particular, we show that the generalization of Horn circumscription in the NAT
framework remains CONP-complete, and that Horn NATs without fixed letters can
be efficiently transformed into an equivalent Horn CNF, which implies
polynomial solvability of principal reasoning tasks. Finally, we also study
extensions of NATs and briefly address the complexity in the first-order case.
Our results give insight into the ``cost'' of using L_{CIRC} (resp. NATs) as a
host language for expressing other formalisms such as action theories,
narratives, or spatial theories.
Comment: A preliminary abstract of this paper appeared in Proc. Seventeenth International Joint Conference on Artificial Intelligence (IJCAI-01), pages 169--174. Morgan Kaufmann, 200
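Generic propositional circumscription, which L_{CIRC} extends, selects the models of a theory that are minimal on designated "abnormality" atoms. The following brute-force sketch illustrates that idea only; the function names and the bird example are illustrative assumptions, not the paper's L_{CIRC} machinery, and real circumscription reasoners do not enumerate all assignments:

```python
from itertools import product

def models(clauses, atoms):
    """Enumerate all truth assignments over `atoms` satisfying a CNF.
    Each clause is a list of (atom, sign) pairs; sign False means negation."""
    for bits in product([False, True], repeat=len(atoms)):
        m = dict(zip(atoms, bits))
        if all(any(m[a] == s for a, s in clause) for clause in clauses):
            yield m

def circumscribe(clauses, atoms, minimized):
    """Models minimal w.r.t. pointwise inclusion on the `minimized` atoms
    (the remaining atoms vary freely) -- circumscription by brute force."""
    ms = list(models(clauses, atoms))
    def leq(m1, m2):  # m1's minimized extent is contained in m2's
        return all(not m1[a] or m2[a] for a in minimized)
    return [m for m in ms
            if not any(leq(n, m) and not leq(m, n) for n in ms)]

# "Birds fly unless abnormal": bird; bird & ~ab -> flies; minimize ab.
clauses = [[("bird", True)],
           [("bird", False), ("ab", True), ("flies", True)]]
minimal = circumscribe(clauses, ["bird", "ab", "flies"], ["ab"])
# In every minimal model ab is false, so the bird flies.
```

Minimizing `ab` discards the models in which the bird is gratuitously abnormal, leaving only the intended conclusion that it flies; nesting such minimizations is what the paper shows drives the complexity up the Polynomial Hierarchy.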
Alternative axiomatics and complexity of deliberative STIT theories
We propose two alternatives to Xu's axiomatization of the Chellas STIT. The
first one also provides an alternative axiomatization of the deliberative STIT.
The second one starts from the idea that the historic necessity operator can be
defined as an abbreviation of operators of agency, and can thus be eliminated
from the logic of the Chellas STIT. The second axiomatization also allows us to
establish that the problem of deciding the satisfiability of a STIT formula
without temporal operators is NP-complete in the single-agent case, and is NEXPTIME-complete in the multiagent case, both for the deliberative and the Chellas STIT.
Comment: Submitted to the Journal of Philosophical Logic; 13 pages excluding anne
Bounded Arithmetic in Free Logic
One of the central open questions in bounded arithmetic is whether Buss'
hierarchy of theories of bounded arithmetic collapses or not. In this paper, we
reformulate Buss' theories using free logic and conjecture that such theories
are easier to handle. To show this, we first prove that Buss' theories prove the consistency of induction-free fragments of our theories whose formulae have bounded complexity. Next, we prove that although our theories are based on an apparently weaker logic, we can interpret the theories in Buss' hierarchy in our theories via a simple translation. Finally, we investigate finitistic Gödel sentences in our systems, in the hope of proving that a theory at a lower level of Buss' hierarchy cannot prove the consistency of induction-free fragments of our theories whose formulae have higher complexity.
A Computable Economist’s Perspective on Computational Complexity
A computable economist's view of the world of computational complexity theory is described. This means the model of computation underpinning theories of computational complexity plays a central role. The emergence of computational complexity theories from diverse traditions is emphasised. The unifications that emerged in the modern era were codified by means of the notions of efficiency of computations, non-deterministic computations, completeness, reducibility and verifiability - the latter three concepts had their origins in what may be called 'Post's Program of Research for Higher Recursion Theory'. Approximations, computations and constructions are also emphasised. The recent real model of computation, as a basis for studying computational complexity in the domain of the reals, is also presented and discussed, albeit critically. A brief sceptical section on algorithmic complexity theory is included in an appendix.
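The notion of verifiability invoked above can be made concrete: an NP problem such as satisfiability admits a polynomial-time verifier that checks a proposed certificate, even though finding one may be hard. A minimal sketch (the encoding and names are illustrative assumptions, not from the paper):

```python
def verify_sat(clauses, assignment):
    """Polynomial-time verifier for SAT: check that a certificate
    (a truth assignment) satisfies every clause of a CNF.
    Each clause is a list of (atom, sign) pairs; sign False means negation."""
    return all(any(assignment[a] == s for a, s in clause) for clause in clauses)

# (x or ~y) and (y or z)
cnf = [[("x", True), ("y", False)], [("y", True), ("z", True)]]
ok = verify_sat(cnf, {"x": True, "y": False, "z": True})    # a valid certificate
bad = verify_sat(cnf, {"x": False, "y": True, "z": False})  # fails the first clause
```

The asymmetry between checking a certificate in polynomial time and searching for one is exactly the efficiency/non-determinism distinction around which the unifications described above were built.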
Existence of optimal ultrafilters and the fundamental complexity of simple theories
In the first edition of Classification Theory, the second author
characterized the stable theories in terms of saturation of ultrapowers. Prior
to this theorem, stability had already been defined in terms of counting types,
and the unstable formula theorem was known. A contribution of the ultrapower
characterization was that it involved sorting out the global theory, and
introducing nonforking, seminal for the development of stability theory. Prior
to the present paper, there had been no such characterization of an unstable
class. In the present paper, we first establish the existence of so-called
optimal ultrafilters on Boolean algebras, which are to simple theories as
Keisler's good ultrafilters are to all theories. Then, assuming a supercompact
cardinal, we characterize the simple theories in terms of saturation of
ultrapowers. To do so, we lay the groundwork for analyzing the global structure
of simple theories, in ZFC, via complexity of certain amalgamation patterns.
This brings into focus a fundamental complexity in simple unstable theories having no real analogue in stability.
Comment: The revisions aim to separate the set-theoretic and model-theoretic aspects of the paper to make it accessible to readers interested primarily in one side. We thank the anonymous referee for many thoughtful comment