Do Hard SAT-Related Reasoning Tasks Become Easier in the Krom Fragment?
Many reasoning problems are based on the problem of satisfiability (SAT).
While SAT itself becomes easy when restricting the structure of the formulas in
a certain way, the situation is more opaque for more involved decision
problems. We consider here the CardMinSat problem which asks, given a
propositional formula φ and an atom x, whether x is true in some
cardinality-minimal model of φ. This problem is easy for the Horn
fragment, but, as we will show in this paper, remains Θ₂^P-complete (and
thus NP-hard) for the Krom fragment (which is given by formulas in
CNF where clauses have at most two literals). We will make use of this fact to
study the complexity of reasoning tasks in belief revision and logic-based
abduction and show that, while in some cases the restriction to Krom formulas
leads to a decrease of complexity, in others it does not. We thus also consider
the CardMinSat problem under additional restrictions on Krom formulas,
towards a better understanding of the tractability frontier of such problems.
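The decision problem just described can be made concrete with a small brute-force sketch (exponential in the number of variables, so purely illustrative; the function names and clause encoding are ours, not the paper's):

```python
from itertools import product

# Illustrative brute-force CardMinSat for Krom (2-CNF) formulas.
# A formula is a list of clauses; each clause is a tuple of at most
# two literals, where a literal is a (variable, polarity) pair.

def satisfies(assignment, clauses):
    """Check that every clause has at least one true literal."""
    return all(
        any(assignment[var] == polarity for var, polarity in clause)
        for clause in clauses
    )

def card_min_sat(clauses, variables, atom):
    """Is `atom` true in some cardinality-minimal model?"""
    models = [
        dict(zip(variables, bits))
        for bits in product([False, True], repeat=len(variables))
        if satisfies(dict(zip(variables, bits)), clauses)
    ]
    if not models:
        return False
    min_card = min(sum(m.values()) for m in models)
    return any(m[atom] for m in models if sum(m.values()) == min_card)

# Example Krom formula: (x or y) and (not x or y)
clauses = [(("x", True), ("y", True)), (("x", False), ("y", True))]
print(card_min_sat(clauses, ["x", "y"], "y"))  # True: y holds in the minimal model {y}
print(card_min_sat(clauses, ["x", "y"], "x"))  # False: x does not
```

The hardness result above says that, for Krom formulas, no fundamentally better approach than exploring this minimal-cardinality structure is expected.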
Dependence in Propositional Logic: Formula-Formula Dependence and Formula Forgetting -- Application to Belief Update and Conservative Extension
Dependence is an important concept for many tasks in artificial intelligence.
A task can often be executed more efficiently by discarding whatever is
independent of it. In this paper, we propose two novel notions of dependence in
propositional logic: formula-formula dependence and formula forgetting. The
first is a relation between formulas capturing whether a formula depends on
another one, while the second is an operation that returns the strongest
consequence independent of a formula. We also apply these two notions in two
well-known issues: belief update and conservative extension. Firstly, we define
a new update operator based on formula-formula dependence. Furthermore, we
reduce conservative extension to formula forgetting.
Comment: We find a mistake in this version and we need a period of time to fix it.
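The classical special case that the formula-forgetting notion above generalizes is variable forgetting: forgetting x in φ yields φ[x/⊤] ∨ φ[x/⊥], the strongest consequence of φ independent of x. A semantic sketch (our own encoding, with formulas represented as Python predicates over assignments):

```python
from itertools import product

# Classical variable forgetting: forget(phi, x) = phi[x/True] or phi[x/False],
# the strongest consequence of phi that is independent of x.

def models(phi, variables):
    """Enumerate the models of phi over the given variables."""
    return {
        bits
        for bits in product([False, True], repeat=len(variables))
        if phi(dict(zip(variables, bits)))
    }

def forget(phi, x):
    """Return a predicate equivalent to phi[x/True] or phi[x/False]."""
    def forgotten(assignment):
        return phi({**assignment, x: True}) or phi({**assignment, x: False})
    return forgotten

# phi = (x and y): forgetting x leaves y as the strongest x-independent consequence.
phi = lambda a: a["x"] and a["y"]
psi = forget(phi, "x")
print(models(psi, ["x", "y"]))  # exactly the assignments where y is True
```

Formula forgetting, as proposed in the paper, replaces the single variable x by an arbitrary formula; the variable case shows the shape of the operation.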
On Properties of Update Sequences Based on Causal Rejection
We consider an approach to update nonmonotonic knowledge bases represented as
extended logic programs under answer set semantics. New information is
incorporated into the current knowledge base subject to a causal rejection
principle enforcing that, in case of conflicts, more recent rules are preferred
and older rules are overridden. Such a rejection principle is also exploited in
other approaches to update logic programs, e.g., in dynamic logic programming
by Alferes et al. We give a thorough analysis of properties of our approach, to
get a better understanding of the causal rejection principle. We review
postulates for update and revision operators from the area of theory change and
nonmonotonic reasoning, and some new properties are considered as well. We then
consider refinements of our semantics which incorporate a notion of minimality
of change. As well, we investigate the relationship to other approaches,
showing that our approach is semantically equivalent to inheritance programs by
Buccafurri et al. and that it coincides with certain classes of dynamic logic
programs, for which we provide characterizations in terms of graph conditions.
Therefore, most of our results about properties of the causal rejection principle
apply to these approaches as well. Finally, we deal with computational
complexity of our approach, and outline how the update semantics and its
refinements can be implemented on top of existing logic programming engines.
Comment: 59 pages, 2 figures, 3 tables, to be published in "Theory and
Practice of Logic Programming".
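A drastically simplified, fact-only sketch of the causal rejection principle (real update programs handle rules with bodies under answer-set semantics; this toy version, with our own encoding of literals as strings, only illustrates the preference for newer information):

```python
# An update sequence is a list of programs P1, ..., Pn (newest last), here
# reduced to sets of literals. An older fact is rejected exactly when a more
# recent program asserts its complement.

def complement(literal):
    """Map "p" to "~p" and "~p" to "p"."""
    return literal[1:] if literal.startswith("~") else "~" + literal

def causal_rejection_update(sequence):
    accepted = set()
    # Walk from the most recent program back to the oldest.
    for program in reversed(sequence):
        for literal in program:
            if complement(literal) not in accepted:  # no newer conflict
                accepted.add(literal)
    return accepted

# P1 asserts "flies" and "bird"; the later P2 asserts "~flies": the newer wins.
print(sorted(causal_rejection_update([{"flies", "bird"}, {"~flies"}])))
# ['bird', '~flies']
```

The refinements discussed above (minimality of change, graph conditions) concern exactly which rules may be rejected and when; the toy version rejects greedily.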
Belief Revision, Minimal Change and Relaxation: A General Framework based on Satisfaction Systems, and Applications to Description Logics
Belief revision of knowledge bases represented by a set of sentences in a
given logic has been extensively studied, but mainly for specific logics:
propositional, and more recently Horn and description logics. Here, we propose
to generalize this operation from a model-theoretic point of view, by defining
revision in an abstract model theory known under the name of satisfaction
systems. In this framework, we generalize to any satisfaction systems the
characterization of the well known AGM postulates given by Katsuno and
Mendelzon for propositional logic in terms of minimal change among
interpretations. Moreover, we study how to define revision, satisfying the AGM
postulates, from relaxation notions that were first introduced in
description logics to define dissimilarity measures between concepts, and
whose effect is to relax the set of models of the old belief until it
becomes consistent with the new pieces of knowledge. We show how the proposed
general framework can be instantiated in different logics such as
propositional, first-order, description and Horn logics. In particular for
description logics, we introduce several concrete relaxation operators tailored
for the description logic ALC and its fragments EL and ELext,
discuss their properties and provide some illustrative examples.
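In the propositional instantiation, relaxation-based revision can be sketched by growing the set of models of the old belief by Hamming distance until it meets the models of the new formula, then intersecting; this is essentially Dalal's operator, one instance of the minimal-change scheme (the encoding below is ours):

```python
from itertools import product

def models(phi, variables):
    """Models of phi, with formulas as Python predicates over assignments."""
    return {
        bits
        for bits in product([False, True], repeat=len(variables))
        if phi(dict(zip(variables, bits)))
    }

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def revise(old, new, variables):
    """Relax the old belief's models radius by radius until consistent with new."""
    old_models, new_models = models(old, variables), models(new, variables)
    for radius in range(len(variables) + 1):
        relaxed = {
            v for v in product([False, True], repeat=len(variables))
            if any(hamming(v, m) <= radius for m in old_models)
        }
        selected = relaxed & new_models
        if selected:
            return selected
    return new_models  # old belief inconsistent: plain expansion

# Old belief: x and y. New information: not x. Minimal change keeps y.
old = lambda a: a["x"] and a["y"]
new = lambda a: not a["x"]
print(revise(old, new, ["x", "y"]))  # {(False, True)}
```

The description-logic relaxation operators of the paper play the role of the growing `relaxed` set here, but at the level of concepts rather than propositional models.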
Tractability and the computational mind
We overview logical and computational explanations of the notion of tractability as applied in cognitive science. We start by introducing the basics of mathematical theories of complexity: computability theory, computational complexity theory, and descriptive complexity theory. Computational philosophy of mind often identifies mental algorithms with computable functions. However, as programming practice has developed, it has become apparent that effective algorithms are hardly achievable for some computable problems: they require too much of some computational resource, e.g., time or memory, to be practically computable.
Computational complexity theory is concerned with the amount of resources required for the execution of algorithms and, hence, the inherent difficulty of computational problems. An important goal of computational complexity theory is to categorize computational problems via complexity classes, and in particular, to identify efficiently solvable problems and draw a line between tractability and intractability.
We survey how complexity can be used to study the computational plausibility of cognitive theories. We especially emphasize the methodological and mathematical assumptions behind applying complexity theory in cognitive science. We pay special attention to examples of applying the logical and computational complexity toolbox in different domains of cognitive science. We focus mostly on theoretical and experimental research in psycholinguistics and social cognition.
Space Efficiency of Propositional Knowledge Representation Formalisms
We investigate the space efficiency of a Propositional Knowledge
Representation (PKR) formalism. Intuitively, the space efficiency of a
formalism F in representing a certain piece of knowledge A is the size of the
shortest formula of F that represents A. In this paper we assume that knowledge
is either a set of propositional interpretations (models) or a set of
propositional formulae (theorems). We provide a formal way of talking about the
relative ability of PKR formalisms to compactly represent a set of models or a
set of theorems. We introduce two new compactness measures, the corresponding
classes, and show that the relative space efficiency of a PKR formalism in
representing models/theorems is directly related to such classes. In
particular, we consider formalisms for nonmonotonic reasoning, such as
circumscription and default logic, as well as belief revision operators and the
stable model semantics for logic programs with negation. One interesting result
is that formalisms with the same time complexity do not necessarily belong to
the same space efficiency class.
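A toy illustration of what compactness is about (not the paper's compactness measures or classes): a formula can be exponentially more space-efficient than an explicit listing of its models. The n-variable clause x1 ∨ ... ∨ xn has size O(n) but 2^n − 1 models.

```python
from itertools import product

def count_models(n):
    """Count models of the clause x1 or ... or xn by brute force."""
    return sum(any(bits) for bits in product([False, True], repeat=n))

# The formula grows linearly in n; the model set grows exponentially.
for n in (2, 4, 8):
    print(n, count_models(n), 2**n - 1)
```

The paper's question is the analogous comparison between whole formalisms: whether one formalism can always represent the same models or theorems with at most polynomially larger formulas than another.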
Propositional update operators based on formula/literal dependence
We present and study a general family of belief update operators in a propositional setting. Its operators are based on formula/literal dependence, which is more fine-grained than the notion of formula/variable dependence proposed in the literature: formula/variable dependence is a particular case of formula/literal dependence. Our update operators are defined according to the "forget-then-conjoin" scheme: updating a belief base by an input formula consists in first forgetting in the base every literal on which the input formula has a negative influence, and then conjoining the resulting base with the input formula. The operators of our family differ in the underlying notion of formula/literal dependence, which may be defined syntactically or semantically, and which may or may not exploit further information such as known persistent literals and pre-set dependencies. We argue that this allows us to handle the frame problem and the ramification problem in a more appropriate way. We evaluate the update operators of our family w.r.t. two important dimensions: the logical dimension, by checking the status of the Katsuno-Mendelzon postulates for update, and the computational dimension, by identifying the complexity of a number of decision problems (including model checking, consistency and inference), both in the general case and in some restricted cases, as well as by studying compactability issues. It follows that several operators of our family are interesting alternatives to previous belief update operators.
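The "forget-then-conjoin" scheme can be sketched in its coarser formula/variable variant: forget in the base every variable the input depends on, then conjoin the input (the paper's operators refine this to the literal level; names and encoding below are ours):

```python
from itertools import product

def forget_var(phi, x):
    """phi[x/True] or phi[x/False]: strongest consequence independent of x."""
    def forgotten(a):
        return phi({**a, x: True}) or phi({**a, x: False})
    return forgotten

def update(base, new, new_vars):
    """Forget the input's variables in the base, then conjoin the input."""
    for x in new_vars:
        base = forget_var(base, x)
    frozen = base
    return lambda a: frozen(a) and new(a)

def models(phi, variables):
    """Models of phi, with formulas as Python predicates over assignments."""
    return {bits for bits in product([False, True], repeat=len(variables))
            if phi(dict(zip(variables, bits)))}

# Base: x and y. Input: not x (depends on x only). y survives the update.
base = lambda a: a["x"] and a["y"]
new = lambda a: not a["x"]
updated = update(base, new, ["x"])
print(models(updated, ["x", "y"]))  # {(False, True)}
```

A literal-level operator would forget only the polarity of x that the input negatively influences, which is exactly what makes the family above more fine-grained.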