Foundations of Rule-Based Query Answering
This survey article introduces the essential concepts and methods underlying rule-based query languages. It covers four complementary areas: declarative semantics based on adaptations of mathematical logic, operational semantics, complexity and expressive power, and optimisation of query evaluation.
The treatment of these areas is foundation-oriented, the foundations having resulted from over four decades of research in the logic programming and database communities on combinations of query languages and rules. These results later formed the basis for conceiving, improving, and implementing several Web and Semantic Web technologies, in particular query languages such as XQuery or SPARQL for querying relational, XML, and RDF data, and rule languages like the "Rule Interchange Format (RIF)" currently being developed in a working group of the W3C.
Coverage of the article is deliberately limited to declarative languages in a classical setting: issues such as query answering in F-Logic or in description logics, or the relationship of query answering to reactive rules and events, are not addressed.
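The operational semantics the survey covers can be illustrated by naive bottom-up evaluation of a Datalog-style rule program, iterating the rules until the set of derivable facts reaches a least fixpoint. The sketch below is not taken from the article; the `parent`/`ancestor` predicates and the tuple encoding of facts are invented for illustration.

```python
# Sketch (not from the survey): naive bottom-up evaluation of a Datalog-style
# program. Facts are tuples (predicate, arg1, arg2); the two rules are:
#   ancestor(X, Y) :- parent(X, Y).
#   ancestor(X, Z) :- parent(X, Y), ancestor(Y, Z).

def apply_rules(known):
    """One round of rule application: return known facts plus new derivations."""
    derived = set(known)
    for (p, x, y) in known:
        if p == "parent":
            derived.add(("ancestor", x, y))          # first rule
    for (p, x, y) in known:
        for (q, y2, z) in known:
            if p == "parent" and q == "ancestor" and y == y2:
                derived.add(("ancestor", x, z))      # second rule
    return derived

def fixpoint(facts):
    """Apply the rules until no new fact is derived (the least fixpoint)."""
    while True:
        new = apply_rules(facts)
        if new == facts:
            return facts
        facts = new

facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}
model = fixpoint(facts)   # the least model of the program over these facts
```

Real engines use semi-naive evaluation and indexing to avoid re-deriving already-known facts in every round; the optimisation techniques surveyed in the article address exactly that gap.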
Topics in Programming Languages, a Philosophical Analysis through the case of Prolog
Programming languages seldom find proper anchorage in the philosophy of logic, language and science. What is more, philosophy of language seems to be restricted to natural languages and linguistics, and even philosophy of logic is rarely framed in terms of programming languages. The logic programming paradigm and Prolog are thus the most adequate paradigm and programming language for working on this subject, combining natural language processing and linguistics, logic programming, and construction methodology for both algorithms and procedures, within an overall philosophizing declarative stance. Beyond this, the dimension of the Fifth Generation Computer Systems project related to strong AI, in which Prolog took a major role, and its historical frame within the crucial dialectic between procedural and declarative paradigms, and between structuralist and empiricist biases, serves, in exemplary form, to treat the philosophy of logic, language and science in the contemporary age as well.
In recounting Prolog's philosophical, mechanical and algorithmic harbingers, the opportunity is open to various routes. We herein shall exemplify some:
- the mechanical-computational background explored by Pascal, Leibniz, Boole, Jacquard, Babbage, and Konrad Zuse, up to the ACE (Alan Turing) and the EDVAC (von Neumann), offering the backbone of computer architecture, together with the parallel work of Turing, Church, Gödel, Kleene, von Neumann, Shannon, and others on computability, thoroughly studied in detail, permits us to interpret the evolving realm of programming languages. The line from the lambda calculus to the Algol family, the declarative and procedural split between the C language and Prolog, and the ensuing branching, explosion, and further delimitation of programming languages are thereupon inspected so as to relate them to the syntax, semantics and philosophical élan of logic programming and Prolog.
An Algorithmic Interpretation of Quantum Probability
The Everett (or relative-state, or many-worlds) interpretation of quantum mechanics has come under fire for inadequately dealing with the Born rule (the formula for calculating quantum probabilities). Numerous attempts have been made to derive this rule from the perspective of observers within the quantum wavefunction. These are not really analytic proofs, but are rather attempts to derive the Born rule as a synthetic a priori necessity, given the nature of human observers (a fact not fully appreciated even by all of those who have attempted such proofs). I show why existing attempts are unsuccessful or only partly successful, and postulate that Solomonoff's algorithmic approach to the interpretation of probability theory could clarify the problems with these approaches. The Sleeping Beauty probability puzzle is used as a springboard from which to deduce an objectivist, yet synthetic a priori framework for quantum probabilities that properly frames the role of self-location and self-selection (anthropic) principles in probability theory. I call this framework "algorithmic synthetic unity" (or ASU). I offer no new formal proof of the Born rule, largely because I feel that existing proofs (particularly that of Gleason) are already adequate, and as close to being a formal proof as one should expect or want. Gleason's one unjustified assumption, known as noncontextuality, is, I will argue, completely benign when considered within the algorithmic framework that I propose. I will also argue that, to the extent the Born rule can be derived within ASU, there is no reason to suppose that we could not also derive all the other fundamental postulates of quantum theory, as well. There is nothing special here about the Born rule, and I suggest that a completely successful Born rule proof might only be possible once all the other postulates become part of the derivation.
As a start towards this end, I show how we can already derive the essential content of the fundamental postulates of quantum mechanics, at least in outline, and especially if we allow some educated and well-motivated guesswork along the way. The result is some steps towards a coherent and consistent algorithmic interpretation of quantum mechanics.
19th Brazilian Logic Conference: Book of Abstracts
This is the book of abstracts of the 19th Brazilian Logic Conference. The Brazilian Logic Conference (EBL) is one of the most traditional logic conferences in South America. Organized by the Brazilian Logic Society (SBL), its main goal is to promote the dissemination of research in logic in a broad sense. It has been held since 1979, bringing together logicians of different fields (mostly philosophy, mathematics and computer science) and with different backgrounds, from undergraduate students to senior researchers. The meeting is an important moment for the Brazilian and South American logical community to come together and discuss recent developments in the field. The areas of logic covered in the conference range over foundations and philosophy of science, analytic philosophy, philosophy and history of logic, mathematics, computer science, informatics, linguistics and artificial intelligence. Previous editions of the EBL have been a great success, attracting researchers from all over Latin America and elsewhere.
The 19th edition of EBL takes place from May 6-10, 2019, in the beautiful city of João Pessoa, on the northeast coast of Brazil. It is jointly organized by the Federal University of Paraíba (UFPB), whose main campus is located in João Pessoa, the Federal University of Campina Grande (UFCG), whose main campus is located in the nearby city of Campina Grande (the second-largest city in Paraíba state), and SBL. It is sponsored by UFPB, UFCG, the Brazilian Council for Scientific and Technological Development (CNPq) and the State Ministry of Education, Science and Technology of Paraíba. It takes place at Hotel Luxxor Nord Tambaú, conveniently located right in front of Tambaú beach, one of João Pessoa's most famous beaches.
Validating reasoning heuristics using next generation theorem provers
The specification of enterprise information systems using formal specification languages
enables the formal verification of these systems. Reasoning about the properties of a formal
specification is a tedious task that can be facilitated much through the use of an automated
reasoner. However, set theory is a cornerstone of many formal specification languages and
poses demanding challenges to automated reasoners. To this end, a number of heuristics have
been developed to aid the Otter theorem prover in finding short proofs for set-theoretic
problems. This dissertation investigates the applicability of these heuristics to next generation
theorem provers.
Computing: M.Sc. (Computer Science)
Fuzzy Logic
Fuzzy logic is becoming an essential method for solving problems in all domains. It has a tremendous impact on the design of autonomous intelligent systems. The purpose of this book is to introduce hybrid algorithms, techniques, and implementations of fuzzy logic. The book consists of thirteen chapters highlighting models and principles of fuzzy logic and issues concerning its techniques and implementations. The intended readers of this book are engineers, researchers, and graduate students interested in fuzzy logic systems.
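To give a flavour of the principles such a book covers, here is a minimal sketch of fuzzy membership functions and the standard min/max connectives; the temperature sets, breakpoints, and function names are invented for illustration, not taken from the book.

```python
# Minimal fuzzy-set sketch (illustrative; all names and thresholds invented).

def triangular(a, b, c):
    """Return a triangular membership function rising from a, peaking at b,
    falling back to zero at c."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# Two overlapping fuzzy sets over temperature in degrees Celsius.
cold = triangular(-10.0, 0.0, 15.0)
warm = triangular(10.0, 20.0, 30.0)

def fuzzy_and(u, v):
    """Conjunction via the minimum t-norm (one common choice)."""
    return min(u, v)

def fuzzy_or(u, v):
    """Disjunction via the maximum t-conorm."""
    return max(u, v)

# Degree to which 12 degrees counts as both "cold" and "warm".
mild = fuzzy_and(cold(12.0), warm(12.0))
```

Hybrid approaches of the kind the book discusses typically embed such membership functions in rule bases whose shapes and parameters are tuned by other techniques, e.g. evolutionary or neural methods.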
Graphical representation of canonical proof: two case studies
An interesting problem in proof theory is to find representations of proof that do
not distinguish between proofs that are "morally" the same. For many logics, the presentation
of proofs in a traditional formalism, such as Gentzen's sequent calculus, introduces
artificial syntactic structure called "bureaucracy"; e.g., an arbitrary ordering
of freely permutable inferences. A proof system that is free of bureaucracy is called
canonical for a logic. In this dissertation two canonical proof systems are presented,
for two logics: a notion of proof nets for additive linear logic with units, and "classical
proof forests", a graphical formalism for first-order classical logic.
Additive linear logic (or sum-product logic) is the fragment of linear logic consisting
of linear implication between formulae constructed only from atomic formulae and
the additive connectives and units. Up to an equational theory over proofs, the logic
describes categories in which finite products and coproducts occur freely. A notion of
proof nets for additive linear logic is presented, providing canonical graphical representations
of the categorical morphisms and constituting a tractable decision procedure
for this equational theory. From existing proof nets for additive linear logic without
units by Hughes and Van Glabbeek (modified to include the units naively), canonical
proof nets are obtained by a simple graph rewriting algorithm called saturation. The main
technical contributions are the substantial correctness proof of the saturation algorithm,
and a correctness criterion for saturated nets.
Classical proof forests are a canonical, graphical proof formalism for first-order
classical logic. Related to Herbrand's Theorem and backtracking games in the style
of Coquand, the forests assign witnessing information to quantifiers in a structurally
minimal way, reducing a first-order sentence to a decidable propositional one. A similar
formalism, "expansion tree proofs", was presented by Miller, but not given a method
of composition. The present treatment adds a notion of cut, and investigates the possibility
of composing forests via cut-elimination. Cut-reduction steps take the form
of a rewrite relation that arises from the structure of the forests in a natural way.
Yet reductions are intricate, and initially not well-behaved: from perfectly ordinary
cuts, reduction may reach unnaturally configured cuts that may not be reduced. Cut-elimination
is shown using a modified version of the rewrite relation, inspired by the
game-theoretic interpretation of the forests, for which weak normalisation is shown,
and strong normalisation is conjectured. In addition, by a more intricate argument,
weak normalisation is also shown for the original reduction relation.
Intuition in formal proof: a novel framework for combining mathematical tools
This doctoral thesis addresses one major difficulty in formal proof: removing obstructions
to intuition which hamper the proof endeavour. We investigate this in the context
of formally verifying geometric algorithms using the theorem prover Isabelle, by first
proving Graham's scan algorithm for finding convex hulls correct, then using the challenges
we encountered as motivations for the design of a general, modular framework
for combining mathematical tools.
We introduce our integration framework, the Prover's Palette, describing in detail
the guiding principles from software engineering and the key differentiator of our
approach: emphasising the role of the user. Two integrations are described, using
the framework to extend Eclipse Proof General so that the computer algebra systems
QEPCAD and Maple are directly available in an Isabelle proof context, capable of running
either fully automated or with user customisation. The versatility of the approach
is illustrated by showing a variety of ways that these tools can be used to streamline the
theorem proving process, enriching the user's intuition rather than disrupting it. The
usefulness of our approach is then demonstrated through the formal verification of an
algorithm for computing Delaunay triangulations in the Prover's Palette.
Saturation-based decision procedures for extensions of the guarded fragment
We apply the framework of Bachmair and Ganzinger for saturation-based theorem proving to derive a range of decision procedures for logical formalisms, starting with a simple terminological language EL, which allows for conjunction and existential restrictions only, and ending with extensions of the guarded fragment with equality, constants, functionality, number restrictions and compositional axioms of the form S ◦ T ⊆ H. Our procedures are derived in a uniform way using standard saturation-based calculi enhanced with simplification rules based on the general notion of redundancy. We argue that such decision procedures can be applied for reasoning in expressive description logics, where they have certain advantages over traditionally used tableau procedures, such as optimal worst-case complexity and direct correctness proofs.
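To give a concrete flavour of saturation (this is not the dissertation's resolution calculus, only the general idea of closing a set of consequences under rules), the sketch below saturates a tiny EL-style TBox under two completion rules until a fixpoint is reached. All concept names and axioms are invented for illustration.

```python
# Illustrative saturation sketch: derive all atomic subsumptions A ⊑ B entailed
# by a tiny EL-style TBox by exhaustively applying two completion rules.
# Axiom encoding: ("sub", A, B) means A ⊑ B; ("conj", A1, A2, B) means A1 ⊓ A2 ⊑ B.

axioms = [
    ("sub", "Parent", "Person"),
    ("conj", "Person", "HasChild", "Parent"),
    ("sub", "Mother", "Person"),
    ("sub", "Mother", "HasChild"),
]

def saturate(names):
    """For each concept name A, compute the set of names B with A ⊑ B,
    by applying the rules until nothing new is derivable (saturation)."""
    subs = {a: {a} for a in names}   # every concept subsumes itself
    changed = True
    while changed:
        changed = False
        for kind, *args in axioms:
            for a in names:
                if kind == "sub":
                    x, y = args
                    if x in subs[a] and y not in subs[a]:
                        subs[a].add(y)
                        changed = True
                else:  # conjunction on the left-hand side
                    x1, x2, y = args
                    if x1 in subs[a] and x2 in subs[a] and y not in subs[a]:
                        subs[a].add(y)
                        changed = True
    return subs

subs = saturate(["Parent", "Mother", "Person", "HasChild"])
```

The saturated set is the analogue of a closed clause set: once no rule adds anything, every entailed atomic subsumption is present, which is what gives such procedures their direct correctness proofs and predictable worst-case behaviour.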