CHR Grammars
A grammar formalism based upon CHR is proposed analogously to the way
Definite Clause Grammars are defined and implemented on top of Prolog. These
grammars execute as robust bottom-up parsers with an inherent treatment of
ambiguity and a high flexibility to model various linguistic phenomena. The
formalism extends previous logic programming based grammars with a form of
context-sensitive rules and the possibility to include extra-grammatical
hypotheses in both head and body of grammar rules. Among the applications are
straightforward implementations of Assumption Grammars and abduction under
integrity constraints for language analysis. CHR grammars appear as a powerful
tool for specification and implementation of language processors and may be
proposed as a new standard for bottom-up grammars in logic programming.
Comment: 36 pp. To appear in Theory and Practice of Logic Programming (TPLP), 2005.
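As a concrete and purely illustrative taste of the idea, the sketch below uses plain CHR in SWI-Prolog to parse bottom-up over position-indexed tokens. The actual CHR Grammar formalism adds dedicated grammar-rule syntax, context-sensitive rules and extra-grammatical hypotheses on top of this; the toy grammar, constraint names and parse/3 driver here are assumptions made for the example.

    % A rough flavour of bottom-up parsing with plain CHR in SWI-Prolog.
    % (CHR Grammars provide dedicated grammar-rule syntax on top of CHR;
    %  the constraint names and toy grammar below are illustrative only.)
    :- use_module(library(chr)).
    :- chr_constraint token/3, np/2, vp/2, s/2.

    % Word-level rules: tokens spanning positions N0..N1 become phrases.
    token(N0, N1, the), token(N1, N2, dog) ==> np(N0, N2).
    token(N0, N1, barks)                   ==> vp(N0, N1).

    % Bottom-up combination: an NP followed by a VP yields a sentence.
    np(N0, N1), vp(N1, N2) ==> s(N0, N2).

    % Post the input as position-indexed token constraints.
    parse([], N, N).
    parse([W|Ws], N0, N) :-
        N1 is N0 + 1,
        token(N0, N1, W),
        parse(Ws, N1, N).

    % ?- parse([the,dog,barks], 0, _).
    % The constraint store then contains s(0,3), i.e. a full parse.

Because the rules are propagation rules, the tokens and intermediate phrases stay in the store, which is how ambiguous inputs naturally yield all alternative analyses side by side.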
Logical Algorithms meets CHR: A meta-complexity result for Constraint Handling Rules with rule priorities
This paper investigates the relationship between the Logical Algorithms
language (LA) of Ganzinger and McAllester and Constraint Handling Rules (CHR).
We present a translation schema from LA to CHR-rp: CHR with rule priorities,
and show that the meta-complexity theorem for LA can be applied to a subset of
CHR-rp via inverse translation. Inspired by the high-level implementation
proposal for Logical Algorithms by Ganzinger and McAllester, and based on a new
scheduling algorithm, we propose an alternative implementation for CHR-rp that
gives strong complexity guarantees and results in a new and accurate
meta-complexity theorem for CHR-rp. It is furthermore shown that the
translation from Logical Algorithms to CHR-rp combined with the new CHR-rp
implementation, satisfies the required complexity for the Logical Algorithms
meta-complexity result to hold.
Comment: To appear in Theory and Practice of Logic Programming (TPLP).
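To make the role of rule priorities tangible, here is a hedged sketch of the kind of program such meta-complexity results are stated for: single-source shortest path in plain CHR under SWI-Prolog. The sketch deliberately omits CHR-rp's priority annotations; the comments only indicate where a priority would matter, and the constraint names are assumptions made for the example.

    % Single-source shortest path in plain CHR (SWI-Prolog); a sketch of the
    % kind of program analysed in the Logical Algorithms / CHR-rp setting.
    :- use_module(library(chr)).
    :- chr_constraint edge/3, dist/2.

    % Keep only the smaller of two distance bounds for the same node.
    % (In CHR-rp this rule would receive the highest priority, so weaker
    %  bounds are discarded before they can be propagated any further.)
    dist(V, D1) \ dist(V, D2) <=> D1 =< D2 | true.

    % Relax an edge: a bound D on V and an edge V -C-> W give a bound on W.
    dist(V, D), edge(V, C, W) ==> D1 is D + C, dist(W, D1).

    % ?- edge(a,1,b), edge(b,2,c), edge(a,5,c), dist(a,0).
    % The store ends with dist(a,0), dist(b,1), dist(c,3) plus the edges.

Without priorities the relaxation order is left to the CHR runtime, which can cause redundant work; assigning priorities (as CHR-rp allows) is what makes Dijkstra-like complexity bounds provable for programs of this shape.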
The DLV System for Knowledge Representation and Reasoning
This paper presents the DLV system, which is widely considered the
state-of-the-art implementation of disjunctive logic programming, and addresses
several aspects. As for problem solving, we provide a formal definition of its
kernel language, function-free disjunctive logic programs (also known as
disjunctive datalog), extended by weak constraints, which are a powerful tool
to express optimization problems. We then illustrate the usage of DLV as a tool
for knowledge representation and reasoning, describing a new declarative
programming methodology which allows one to encode complex problems (up to
$\Delta^P_3$-complete problems) in a declarative fashion. On the foundational
side, we provide a detailed analysis of the computational complexity of the
language of DLV, and by deriving new complexity results we chart a complete
picture of the complexity of this language and important fragments thereof.
Furthermore, we illustrate the general architecture of the DLV system which
has been influenced by these results. As for applications, we overview
application front-ends which have been developed on top of DLV to solve
specific knowledge representation tasks, and we briefly describe the main
international projects investigating the potential of the system for industrial
exploitation. Finally, we report about thorough experimentation and
benchmarking, which has been carried out to assess the efficiency of the
system. The experimental results confirm the solidity of DLV and highlight its
potential for emerging application areas like knowledge management and
information integration.
Comment: 56 pages, 9 figures, 6 tables.
Kolmogorov Complexity in perspective. Part II: Classification, Information Processing and Duality
We survey diverse approaches to the notion of information: from Shannon
entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov
complexity are presented: randomness and classification. The survey is divided
into two parts published in the same volume. Part II is dedicated to the relation
between logic and information systems, within the scope of Kolmogorov
algorithmic information theory. We present a recent application of Kolmogorov
complexity: classification using compression, an idea with provocative
implementation by authors such as Bennett, Vitanyi and Cilibrasi. This stresses
how Kolmogorov complexity, besides being a foundation for randomness, is also
related to classification. Another approach to classification is also
considered: the so-called "Google classification". It uses another original and
attractive idea which is connected to the classification using compression and
to Kolmogorov complexity from a conceptual point of view. We present and unify
these different approaches to classification in terms of Bottom-Up versus
Top-Down operational modes, of which we point out the fundamental principles and
the underlying duality. We look at the way these two dual modes are used in
different approaches to information systems, particularly the relational model
for databases introduced by Codd in the 1970s. This allows us to point out diverse
forms of a fundamental duality. These operational modes are also reinterpreted
in the context of the comprehension schema of axiomatic set theory ZF. This
leads us to develop how Kolmogorov complexity is linked to intensionality,
abstraction, classification and information systems.
Comment: 43 pages.
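For readers who want the "classification using compression" idea pinned down: the distance underlying Cilibrasi and Vitányi's method, not spelled out in the abstract, is the normalized compression distance, obtained from the information distance by replacing Kolmogorov complexity with the length C(.) reported by a real compressor:

    $\mathrm{NCD}(x,y) \;=\; \frac{C(xy) - \min\{C(x),C(y)\}}{\max\{C(x),C(y)\}}$

where xy is the concatenation of the two objects. The "Google classification" mentioned above replaces compressed lengths by search-engine page counts in an analogous formula, which is what conceptually connects the two approaches.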
Kolmogorov Complexity in perspective. Part I: Information Theory and Randomness
We survey diverse approaches to the notion of information: from Shannon
entropy to Kolmogorov complexity. Two of the main applications of Kolmogorov
complexity are presented: randomness and classification. The survey is divided
into two parts in the same volume. Part I is dedicated to information theory and
the mathematical formalization of randomness based on Kolmogorov complexity.
This last application goes back to the 1960s and 1970s with the work of
Martin-Löf, Schnorr, Chaitin and Levin, and has gained new impetus in recent
years.
Comment: 40 pages.
Towards Intelligent Databases
This article is a presentation of the objectives and techniques
of deductive databases. The deductive approach to databases aims at extending
other database paradigms, which describe applications extensionally, with
intensional definitions. We first show how constructive specifications can
be expressed with deduction rules, and how normative conditions can be defined
using integrity constraints. We outline the principles of bottom-up and
top-down query answering procedures and present the techniques used for
integrity checking. We then argue that it is often desirable to manage with
a database system not only database applications, but also specifications of
system components. We present such meta-level specifications and discuss
their advantages over conventional approaches.
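A minimal sketch of the two ingredients named above, written in Prolog with invented relation names: deduction rules define an intensional relation over extensional facts, and an integrity constraint captures a normative condition. Prolog answers the queries top-down; a deductive database engine would typically also support bottom-up evaluation of the same rules.

    % Extensional facts (the stored database); names are illustrative.
    parent(ann, bob).
    parent(bob, carol).

    % Deduction rules: an intensional (derived) relation.
    ancestor(X, Y) :- parent(X, Y).
    ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).

    % A normative condition as an integrity constraint: nobody may be
    % their own ancestor.  violated/0 succeeds iff the database is inconsistent.
    violated :- ancestor(X, X).

    % ?- ancestor(ann, Who).   % top-down query answering
    % ?- violated.             % integrity checking (fails on this database)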
Abduction in Well-Founded Semantics and Generalized Stable Models
Abductive logic programming offers a formalism to declaratively express and
solve problems in areas such as diagnosis, planning, belief revision and
hypothetical reasoning. Tabled logic programming offers a computational
mechanism that provides a level of declarativity superior to that of Prolog,
and which has supported successful applications in fields such as parsing,
program analysis, and model checking. In this paper we show how to use tabled
logic programming to evaluate queries to abductive frameworks with integrity
constraints when these frameworks contain both default and explicit negation.
The result is the ability to compute abduction over well-founded semantics with
explicit negation and answer sets. Our approach consists of a transformation
and an evaluation method. The transformation adjoins to each objective literal
$O$ in a program, an objective literal $not(O)$ along with rules that ensure
that $not(O)$ will be true if and only if $O$ is false. We call the resulting
program a dual program. The evaluation method, ABDUAL, then operates on
the dual program. ABDUAL is sound and complete for evaluating queries to
abductive frameworks whose entailment method is based on either the
well-founded semantics with explicit negation, or on answer sets. Further,
ABDUAL is asymptotically as efficient as any known method for either class
of problems. In addition, when abduction is not desired, ABDUAL operating
on a dual program provides a novel tabling method for evaluating queries to
ground extended programs whose complexity and termination properties are
similar to those of the best tabling methods for the well-founded semantics. A
publicly available meta-interpreter has been developed for ABDUAL using the
XSB system.
Comment: 48 pages; to appear in Theory and Practice of Logic Programming.
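To convey the flavour of the dual transformation on the smallest possible example, the hand-written Prolog fragment below duals a two-clause program. The naming convention not_p/not_q/not_r and the simplifications are assumptions made for illustration; the paper's actual construction, evaluated with tabling in XSB, handles arbitrary ground extended programs systematically.

    % Flavour of the dual transformation on a tiny normal program.
    :- dynamic r/0.          % r has no clauses; declaring it makes calls to r
                             % fail instead of raising an existence error.

    % Original clauses, with "not r" replaced by the dual literal not_r:
    p :- q, not_r.
    q.

    % Dual rules: not_L should hold exactly when every clause for L fails.
    not_p :- not_q.          % p's only clause fails if q fails ...
    not_p :- r.              % ... or if r succeeds (so "not r" fails).
    not_q :- fail.           % q is a fact, so not_q can never hold.
    not_r.                   % r has no clauses, so not_r holds.

    % ?- p.       % succeeds: q and not_r both hold.
    % ?- not_p.   % fails, consistent with p being true.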