Lazy Model Expansion: Interleaving Grounding with Search
Finding satisfying assignments for the variables involved in a set of
constraints can be cast as a (bounded) model generation problem: search for
(bounded) models of a theory in some logic. The state-of-the-art approach for
bounded model generation for rich knowledge representation languages, like ASP,
FO(.) and Zinc, is ground-and-solve: reduce the theory to a ground or
propositional one and apply a search algorithm to the resulting theory.
An important bottleneck is the blowup of the size of the theory caused by the
reduction phase. Lazily grounding the theory during search is a way to overcome
this bottleneck. We present a theoretical framework and an implementation in
the context of the FO(.) knowledge representation language. Instead of
grounding all parts of a theory, justifications are derived for some parts of
it. Given a partial assignment for the grounded part of the theory and valid
justifications for the formulas of the non-grounded part, the justifications
provide a recipe to construct a complete assignment that satisfies the
non-grounded part. When a justification for a particular formula becomes
invalid during search, a new one is derived; if that fails, the formula is
split in a part to be grounded and a part that can be justified.
The theoretical framework captures existing approaches for tackling the
grounding bottleneck such as lazy clause generation and grounding-on-the-fly,
and presents a generalization of the 2-watched literal scheme. We present an
algorithm for lazy model expansion and integrate it in a model generator for
FO(ID), a language extending first-order logic with inductive definitions. The
algorithm is implemented as part of the state-of-the-art FO(ID) Knowledge-Base
System IDP. Experimental results illustrate the power and generality of the
approach.
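The 2-watched literal scheme generalized by the framework above is easiest to see on plain propositional clauses. The following is a minimal sketch (not the IDP implementation): each clause watches two of its literals, a falsified watch is moved to another non-false literal when possible, and a clause whose watches cannot be repaired becomes unit or a conflict.

```python
# Minimal 2-watched-literal unit propagation (illustrative sketch only).
# Literals are nonzero ints in DIMACS style: 3 means x3, -3 means "not x3".

def lit_value(lit, assign):
    """True/False if the literal is decided under `assign`, else None."""
    val = assign.get(abs(lit))
    if val is None:
        return None
    return val if lit > 0 else not val

def propagate(clauses, assign):
    """Unit propagation with two watched literals per clause.

    Returns (assign, conflict_clause_or_None)."""
    watches = {i: c[:2] if len(c) >= 2 else c[:] for i, c in enumerate(clauses)}
    changed = True
    while changed:
        changed = False
        for i, clause in enumerate(clauses):
            w = watches[i]
            if any(lit_value(l, assign) is True for l in w):
                continue  # clause already satisfied via a watched literal
            # Try to repair each falsified watch by moving it elsewhere.
            for j, lit in enumerate(list(w)):
                if lit_value(lit, assign) is False:
                    for cand in clause:
                        if cand not in w and lit_value(cand, assign) is not False:
                            w[j] = cand
                            break
            if any(lit_value(l, assign) is True for l in w):
                continue  # repair landed on a true literal
            free = [l for l in w if lit_value(l, assign) is None]
            false = [l for l in w if lit_value(l, assign) is False]
            if not free:               # all watches false and irreparable
                return assign, clause  # conflict
            if len(free) == 1 and (len(w) == 1 or false):
                lit = free[0]          # unit: the remaining watch must hold
                assign[abs(lit)] = lit > 0
                changed = True
    return assign, None
```

With `x1` set false, the clauses `(x1 v x2)`, `(~x1 v x2)`, `(~x2 v x3)` propagate `x2` and then `x3`; the lazy-grounding idea in the paper plays the analogous role at the level of non-ground formulas, where a justification is the recipe for "repairing" a watch.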
A Plausibility Semantics for Abstract Argumentation Frameworks
We propose and investigate a simple ranking-measure-based extension semantics
for abstract argumentation frameworks based on their generic instantiation by
default knowledge bases and the ranking construction semantics for default
reasoning. In this context, we consider the path from structured to logical to
shallow semantic instantiations. The resulting well-justified JZ-extension
semantics diverges from more traditional approaches.
Comment: Proceedings of the 15th International Workshop on Non-Monotonic Reasoning (NMR 2014). This is an improved and extended version of the author's ECSQARU 2013 paper.
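The paper proposes a ranking-based semantics, which is not sketched here; as background, the abstract argumentation setting it builds on can be illustrated by computing the standard grounded extension as the least fixpoint of the characteristic function.

```python
# Background sketch: grounded extension of an abstract argumentation framework
# (least fixpoint of the characteristic function). This only illustrates the
# AF setting; the paper above proposes a different, ranking-based semantics.

def grounded_extension(args, attacks):
    """args: iterable of arguments; attacks: set of (attacker, target) pairs."""
    def defended(a, s):
        # a is defended by s if every attacker of a is attacked by some c in s
        return all(any((c, b) in attacks for c in s)
                   for (b, t) in attacks if t == a)
    s = set()
    while True:
        nxt = {a for a in args if defended(a, s)}
        if nxt == s:
            return s
        s = nxt
```

For the chain c attacks b, b attacks a, the grounded extension is {c, a}: c is unattacked, and c reinstates a by defeating its only attacker.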
Knowledge Representation Concepts for Automated SLA Management
Outsourcing of complex IT infrastructure to IT service providers has
increased substantially during the past years. IT service providers must be
able to fulfil their service-quality commitments based upon predefined Service
Level Agreements (SLAs) with the service customer. They need to manage, execute
and maintain thousands of SLAs for different customers and different types of
services, which requires levels of flexibility and automation not available
with current technology. The complexity of contractual logic in SLAs
requires new forms of knowledge representation to automatically draw inferences
and execute contractual agreements. A logic-based approach provides several
advantages including automated rule chaining allowing for compact knowledge
representation as well as flexibility to adapt to rapidly changing business
requirements. We suggest adequate logical formalisms for representation and
enforcement of SLA rules and describe a proof-of-concept implementation. The
article describes selected formalisms of the ContractLog KR and their adequacy
for automated SLA management and presents results of experiments to demonstrate
flexibility and scalability of the approach.
Comment: Paschke, A. and Bichler, M.: Knowledge Representation Concepts for Automated SLA Management, Int. Journal of Decision Support Systems (DSS), submitted 19th March 200
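The automated rule chaining argued for above can be sketched with a tiny forward chainer. ContractLog itself is a Prolog-style KR; the chainer and all SLA fact and rule names below are hypothetical, chosen only to show how penalties follow from violation facts by chaining.

```python
# Hypothetical sketch of rule chaining over SLA facts (not ContractLog's
# actual syntax or engine). Rules are (body_atoms, head_atom) pairs; facts
# are derived until a fixpoint is reached.

def forward_chain(facts, rules):
    """rules: list of (body, head) pairs; derive new facts until fixpoint."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if set(body) <= facts and head not in facts:
                facts.add(head)
                changed = True
    return facts

# Hypothetical SLA rule base: a violation triggers a penalty; escalation
# needs the extra fact that the violation is a repeated one.
rules = [
    (["availability_below_99", "gold_customer"], "sla_violation"),
    (["sla_violation"], "pay_penalty"),
    (["sla_violation", "repeated_violation"], "escalate_to_account_manager"),
]
derived = forward_chain(["availability_below_99", "gold_customer"], rules)
```

From the two monitoring facts, the chainer derives `sla_violation` and then `pay_penalty`, while `escalate_to_account_manager` stays underivable; this compactness under chaining is the advantage the abstract claims for a logic-based representation.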
Recapture, Transparency, Negation and a Logic for the Catuṣkoṭi
The recent literature on Nāgārjuna's catuṣkoṭi centres around Jay Garfield's (2009) and Graham Priest's (2010) interpretation. It is an open discussion to what extent their interpretation is an adequate model of the logic for the catuskoti and the Mūla-madhyamaka-kārikā. Priest and Garfield try to make sense of the contradictions within the catuskoti by appeal to a series of lattices, orderings of truth-values supposed to model the path to enlightenment. They use Anderson & Belnap's (1975) framework of First Degree Entailment. Cotnoir (2015) has argued that the lattices of Priest and Garfield cannot ground the logic of the catuskoti. The concern is simple: on the one hand, FDE brings with it the failure of classical principles such as modus ponens. On the other hand, we frequently encounter Nāgārjuna using classical principles in other arguments in the MMK. There is a problem of validity. If FDE is Nāgārjuna's logic of choice, he is facing what is commonly called the classical recapture problem: how to make sense of cases where classical principles like modus ponens are valid? One cannot just add principles like modus ponens as assumptions, because in the background paraconsistent logic this does not rule out their negations. In this essay, I shall explore and critically evaluate Cotnoir's proposal. In detail, I shall reveal that his framework suffers a collapse of the kotis. Taking Cotnoir's concerns seriously, I shall suggest a formulation of the catuskoti in classical Boolean Algebra, extended by the notion of an external negation as an illocutionary act. I will focus on purely formal considerations, leaving doctrinal matters to the scholarly discourse, as far as this is possible.
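The failure of modus ponens in FDE, which drives the recapture problem above, can be checked mechanically. The sketch below encodes the four FDE values as subsets of {'T','F'} (true, false, both, neither), reads the conditional materially as ~A v B, and exhibits a counterexample with A = both, B = false.

```python
# Sketch of First Degree Entailment with truth values as subsets of
# {'T','F'}: t = {'T'}, f = {'F'}, b = {'T','F'} ("both"), n = {} ("neither").
# Designated values are those containing 'T'. The material reading of the
# conditional (A -> B as ~A v B) is an assumption of this illustration.

T, F = frozenset({'T'}), frozenset({'F'})
B, N = frozenset({'T', 'F'}), frozenset()

def neg(v):
    # Negation swaps "told true" and "told false"; b and n are fixed points.
    return frozenset({'T' for x in v if x == 'F'} | {'F' for x in v if x == 'T'})

def disj(v, w):
    told_true = {'T'} if 'T' in v or 'T' in w else set()
    told_false = {'F'} if 'F' in v and 'F' in w else set()
    return frozenset(told_true | told_false)

def designated(v):
    return 'T' in v

# Counterexample to modus ponens: A = b (both), B = f (false).
A_val, B_val = B, F
premise1 = designated(A_val)                    # A holds (designated)
premise2 = designated(disj(neg(A_val), B_val))  # A -> B holds (designated)
conclusion = designated(B_val)                  # yet B does not hold
```

Both premises come out designated while the conclusion does not, so modus ponens is invalid in FDE; this is exactly the gap a classical-recapture story has to bridge.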
Abduction in Well-Founded Semantics and Generalized Stable Models
Abductive logic programming offers a formalism to declaratively express and
solve problems in areas such as diagnosis, planning, belief revision and
hypothetical reasoning. Tabled logic programming offers a computational
mechanism that provides a level of declarativity superior to that of Prolog,
and which has supported successful applications in fields such as parsing,
program analysis, and model checking. In this paper we show how to use tabled
logic programming to evaluate queries to abductive frameworks with integrity
constraints when these frameworks contain both default and explicit negation.
The result is the ability to compute abduction over well-founded semantics with
explicit negation and answer sets. Our approach consists of a transformation
and an evaluation method. The transformation adjoins to each objective literal
O in a program an objective literal not(O), along with rules that ensure
that not(O) will be true if and only if O is false. We call the resulting
program a {\em dual} program. The evaluation method, \wfsmeth, then operates on
the dual program. \wfsmeth{} is sound and complete for evaluating queries to
abductive frameworks whose entailment method is based on either the
well-founded semantics with explicit negation, or on answer sets. Further,
\wfsmeth{} is asymptotically as efficient as any known method for either class
of problems. In addition, when abduction is not desired, \wfsmeth{} operating
on a dual program provides a novel tabling method for evaluating queries to
ground extended programs whose complexity and termination properties are
similar to those of the best tabling methods for the well-founded semantics. A
publicly available meta-interpreter has been developed for \wfsmeth{} using the
XSB system.
Comment: 48 pages; To appear in Theory and Practice of Logic Programming
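A much simplified, propositional rendering of the dual transformation described above can be sketched as follows; the tuple representation is hypothetical, and the paper's actual transformation additionally handles non-ground programs and explicit negation. The idea: not(p) holds iff every rule for p has some falsified body literal, which yields one dual rule per choice of falsified literal across p's rule bodies.

```python
# Simplified propositional sketch of a dual-program transformation.
# A rule is (head, body); body literals are atoms like 'a' or default
# negations encoded as ('not', 'a'). This is an illustration only.

from itertools import product

def negate(lit):
    return lit[1] if isinstance(lit, tuple) else ('not', lit)

def dualize(program):
    """program: list of (head, body) rules. Returns the dual rules."""
    heads = {h for h, _ in program}
    atoms = heads | {l if isinstance(l, str) else l[1]
                     for _, body in program for l in body}
    dual = []
    for p in sorted(atoms):
        bodies = [body for h, body in program if h == p]
        if not bodies:
            dual.append((('not', p), []))  # no rules for p: not(p) is a fact
        else:
            # not(p) needs one falsified literal from EACH rule body, so we
            # emit one dual rule per element of the cross product of bodies.
            for choice in product(*bodies):
                dual.append((('not', p), [negate(l) for l in choice]))
    return dual

# p :- a, not b.   p :- c.
prog = [('p', ['a', ('not', 'b')]), ('p', ['c'])]
dual = dualize(prog)
```

For the two rules for p, the dual program contains `not_p :- not a, not c` and `not_p :- b, not c`, plus facts `not_a`, `not_b`, `not_c` for the ruleless atoms, so not(p) is derivable exactly when both rules for p fail.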
Recapture, Transparency, Negation and a Logic for the Catuskoti
The recent literature on Nāgārjuna's catuṣkoṭi centres around Jay Garfield's (2009) and Graham Priest's (2010) interpretation. It is an open discussion to what extent their interpretation is an adequate model of the logic for the catuskoti and the Mūla-madhyamaka-kārikā. Priest and Garfield try to make sense of the contradictions within the catuskoti by appeal to a series of lattices, orderings of truth-values supposed to model the path to enlightenment. They use Anderson & Belnap's (1975) framework of First Degree Entailment. Cotnoir (2015) has argued that the lattices of Priest and Garfield cannot ground the logic of the catuskoti. The concern is simple: on the one hand, FDE brings with it the failure of classical principles such as modus ponens. On the other hand, we frequently encounter Nāgārjuna using classical principles in other arguments in the MMK. There is a problem of validity. If FDE is Nāgārjuna's logic of choice, he is facing what is commonly called the classical recapture problem: how to make sense of cases where classical principles like modus ponens are valid? One cannot just add principles like modus ponens as assumptions, because in the background paraconsistent logic this does not rule out their negations. In this essay, I shall explore and critically evaluate Cotnoir's proposal. In detail, I shall reveal that his framework suffers a collapse of the kotis. Furthermore, I shall argue that the Collapse Argument has been misguided from the outset. The last chapter suggests a formulation of the catuskoti in classical Boolean Algebra, extended by the notion of an external negation as an illocutionary act. I will focus on purely formal considerations, leaving doctrinal matters to the scholarly discourse, as far as this is possible.
A QBF-based Formalization of Abstract Argumentation Semantics
Supported by the National Research Fund, Luxembourg (LAAMI project) and by the Engineering and Physical Sciences Research Council (EPSRC, UK), grant ref. EP/J012084/1 (SAsSY project). Peer reviewed. Postprint.
The Combination of Paradoxical, Uncertain, and Imprecise Sources of Information based on DSmT and Neutro-Fuzzy Inference
The management and combination of uncertain, imprecise, fuzzy and even
paradoxical or high conflicting sources of information has always been, and
still remains today, of primal importance for the development of reliable
modern information systems involving artificial reasoning. In this chapter, we
present a survey of our recent theory of plausible and paradoxical reasoning,
known as Dezert-Smarandache Theory (DSmT) in the literature, developed for
dealing with imprecise, uncertain and paradoxical sources of information. We
focus our presentation here on the foundations of DSmT and on the two
important new rules of combination, rather than on surveying specific
applications of DSmT available in the literature. Several simple examples are given throughout the
presentation to show the efficiency and the generality of this new approach.
The last part of this chapter concerns the presentation of the neutrosophic
logic, the neutro-fuzzy inference and its connection with DSmT. Fuzzy logic and
neutrosophic logic are useful tools in decision making after fusing the
information using the DSm hybrid rule of combination of masses.
Comment: 20 pages
Complexity of Non-Monotonic Logics
Over the past few decades, non-monotonic reasoning has developed to be one of
the most important topics in computational logic and artificial intelligence.
Different ways to introduce non-monotonic aspects to classical logic have been
considered, e.g., extension with default rules, extension with modal belief
operators, or modification of the semantics. In this survey we consider a
logical formalism from each of the above possibilities, namely Reiter's default
logic, Moore's autoepistemic logic and McCarthy's circumscription.
Additionally, we consider abduction, where one is not interested in inferences
from a given knowledge base but in computing possible explanations for an
observation with respect to a given knowledge base.
Complexity results for different reasoning tasks for propositional variants
of these logics have been studied already in the nineties. In recent years,
however, a renewed interest in complexity issues can be observed. One current
focal approach is to consider parameterized problems and identify reasonable
parameters that allow for FPT algorithms. In another approach, the emphasis
lies on identifying fragments, i.e., restrictions of the logical language, that
allow more efficient algorithms for the most important reasoning tasks. In this
survey we focus on this second aspect. We describe complexity results for
fragments of logical languages obtained by either restricting the allowed set
of operators (for example, by forbidding negation one might consider only monotone
formulae) or by considering only formulae in conjunctive normal form but with
generalized clause types.
The algorithmic problems we consider are suitable variants of satisfiability
and implication in each of the logics, but also counting problems, where one is
not only interested in the existence of certain objects (e.g., models of a
formula) but asks for their number.
Comment: To appear in the Bulletin of the EATCS
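The flavor of the reasoning tasks whose complexity is surveyed above can be made concrete with Reiter's default logic on a tiny propositional example, the Nixon diamond. The sketch below is a naive, exponential brute-force extension checker over atoms and negated atoms (the '-p' encoding is an assumption of this illustration); its cost is exactly why identifying tractable fragments matters.

```python
# Illustrative brute-force enumeration of Reiter default extensions for
# propositional normal defaults. Literals are strings; '-p' encodes "not p"
# (a representation chosen for this sketch only). Exponential on purpose:
# it guesses a candidate extension and checks the fixpoint condition.

from itertools import combinations

def consistent(lits):
    return not any(('-' + l) in lits for l in lits if not l.startswith('-'))

def gamma(W, defaults, E):
    """Least set closed under W and the defaults applicable w.r.t. E."""
    s = set(W)
    changed = True
    while changed:
        changed = False
        for pre, just, concl in defaults:
            neg_just = just[1:] if just.startswith('-') else '-' + just
            # Applicable: prerequisite derived, justification consistent with E.
            if pre in s and neg_just not in E and concl not in s:
                s.add(concl)
                changed = True
    return s

def extensions(W, defaults):
    concls = [c for _, _, c in defaults]
    found = []
    for r in range(len(concls) + 1):
        for chosen in combinations(concls, r):
            E = set(W) | set(chosen)
            if consistent(E) and gamma(W, defaults, E) == E and E not in found:
                found.append(E)
    return found

# Nixon diamond: quakers are normally pacifists, republicans normally not.
W = ['quaker', 'republican']
D = [('quaker', 'pacifist', 'pacifist'),
     ('republican', '-pacifist', '-pacifist')]
exts = extensions(W, D)
```

The example has exactly two extensions, one containing `pacifist` and one containing `-pacifist`, so credulous and skeptical reasoning already come apart on this three-atom knowledge base.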
- ā¦