12 research outputs found

    Uncertain Reasoning in Justification Logic

    This thesis studies the combination of two well-known formal systems for knowledge representation: probabilistic logic and justification logic. Our aim is to design a formal framework that allows the analysis of epistemic situations with incomplete information. To achieve this, we introduce two probabilistic justification logics, which are defined by adding probability operators to the minimal justification logic J. We prove soundness and completeness theorems for our logics and establish decision procedures. Both of our logics rely on an infinitary rule so that strong completeness can be achieved. One of the most interesting mathematical results for our logics is the fact that adding only one iteration of the probability operator to the justification logic J does not increase the computational complexity of the logic.
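
    As a rough illustration of the kind of language involved (a sketch of the general approach, not the exact definitions from the thesis), such a logic layers probability operators over formulas of the justification logic J:

        \[
        A ::= p \mid \neg A \mid A \wedge A \mid t{:}A
        \qquad
        F ::= P_{\ge s}\, A \mid \neg F \mid F \wedge F
        \]

    Here $t{:}A$ reads "term $t$ justifies $A$" and $P_{\ge s}\, A$ reads "$A$ holds with probability at least $s$"; keeping the probability layer non-nested, as in this grammar, corresponds to the single iteration of the probability operator mentioned above.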

    Reasoning in Many Dimensions: Uncertainty and Products of Modal Logics

    Probabilistic Description Logics (ProbDLs) are an extension of Description Logics designed to capture uncertainty. We study problems related to these logics. First, we investigate the monodic fragment of probabilistic first-order logic, show that it has many nice properties, and are thereby able to explain the complexity results obtained for ProbDLs. Second, in order to identify well-behaved, in the best case tractable, ProbDLs, we study the complexity landscape for different fragments of ProbEL; among other results, we identify a tractable fragment. We then study the reasoning problem of ontological query answering, but apply it to probabilistic data. To this end, we define the framework of ontology-based access to probabilistic data and study its computational complexity. In the final part of the thesis, we study the complexity of the satisfiability problem in the two-dimensional modal logic KxK. We are able to close a gap that had been open for more than ten years.
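
    For intuition (an illustrative axiom in the spirit of probabilistic EL-style DLs, not an example taken from the thesis), such logics typically add probability constructors to concepts, allowing axioms such as:

        \[
        \exists \mathsf{hasSymptom}.\mathsf{Fever} \sqsubseteq P_{\ge 0.6}\, \exists \mathsf{hasDisease}.\mathsf{Flu}
        \]

    which states that anyone with a fever has, with probability at least 0.6, some disease that is the flu. Restricting which probability values and which nestings of $P$ may occur is one way the better-behaved fragments referred to above are carved out.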

    Query Answering in Probabilistic Data and Knowledge Bases

    Probabilistic data and knowledge bases are becoming increasingly important in academia and industry. They are continuously extended with new data, powered by modern information extraction tools that associate probabilities with knowledge base facts. The state of the art for storing and processing such data is founded on probabilistic database systems, which are widely and successfully employed. Beyond all the success stories, however, such systems still lack the fundamental machinery to convey some of the valuable knowledge hidden in them to the end user, which limits their potential applications in practice. In particular, in their classical form, such systems are typically based on strong, unrealistic limitations, such as the closed-world assumption, the closed-domain assumption, the tuple-independence assumption, and the lack of commonsense knowledge. These limitations not only lead to unwanted consequences, but also put such systems on weak footing in important tasks, query answering being a central one. In this thesis, we enhance probabilistic data and knowledge bases with more realistic data models, thereby allowing for better means of querying them. Building on the long endeavor of unifying logic and probability, we develop different rigorous semantics for probabilistic data and knowledge bases, analyze their computational properties, identify sources of (in)tractability, and design practical, scalable query answering algorithms whenever possible. To achieve this, the current work brings together recent paradigms from logic, probabilistic inference, and database theory.
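
    To make the tuple-independence assumption mentioned above concrete, here is a minimal sketch (hypothetical relations and probabilities, not code from the thesis): in a tuple-independent probabilistic database every tuple is an independent random event, so the probability of a simple join query can be computed directly from the marginal tuple probabilities.

        # Hypothetical tuple-independent probabilistic database: each tuple
        # carries an independent marginal probability.
        R = {"a": 0.9, "b": 0.5}            # P(R(x)) for each constant x
        S = {"a": 0.8, "b": 0.4, "c": 0.7}  # P(S(x)) for each constant x

        def prob_exists_join(R, S):
            """P( exists x . R(x) AND S(x) ) under tuple independence.

            For a fixed x, R(x) and S(x) are independent events, and events
            for different constants are independent, so the query is false
            only if the conjunction fails for every shared constant.
            """
            p_fail = 1.0
            for x in R.keys() & S.keys():
                p_fail *= 1.0 - R[x] * S[x]
            return 1.0 - p_fail

        print(prob_exists_join(R, S))  # 1 - (1 - 0.72)*(1 - 0.2) = 0.776

    Relaxing assumptions of this kind, as the thesis sets out to do, generally makes query answering harder, which is why identifying sources of (in)tractability matters.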

    Runtime Monitoring for Uncertain Times

    In Runtime Verification (RV), monitors check programs for correct operation at execution time. Also called Runtime Monitoring, RV offers advantages over other approaches to program verification. Efficient monitoring is possible for programs where static checking is cost-prohibitive. Runtime monitors may test for execution faults like hardware failure, as well as logical faults. Unlike simple log checking, monitors are typically constructed using formal languages and methods that precisely define expectations and guarantees. Despite the advantages of RV, however, adoption remains low. Applying Runtime Monitoring techniques to real systems requires addressing practical concerns that have garnered little attention from researchers. System operators need monitors that provide immediate diagnostic information before and after failures, that are simple to operate over distributed systems, and that remain reliable when communication is not. These challenges are solvable, and solving them is a necessary step towards widespread RV deployment. This thesis provides solutions to these and other barriers to practical Runtime Monitoring. We address the need for reporting diagnostic information from monitored programs with nfer, a language and system for event stream abstraction. Nfer supports the automatic extraction of the structure of real-time software and includes integrations with popular programming languages. We also provide for the operation of nfer and other monitoring tools over distributed systems with Palisade, a framework built for low-latency detection of embedded system anomalies. Finally, we supply a method to ensure that program properties may be monitored despite unreliable communication channels. We classify monitorable properties over general unreliable conditions and define an algorithm for the case when more specific conditions are known.
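
    As a generic illustration of what a runtime monitor does (a minimal sketch, not the nfer or Palisade APIs), the following checks a simple property, "every request is eventually answered by a response", over a finite event stream and reports problems as they are observed.

        # Minimal runtime-monitor sketch: track outstanding requests, flag
        # responses that arrive without a matching request, and report any
        # requests still unanswered when the stream ends.
        def monitor(events):
            pending = 0  # requests still awaiting a response
            for i, event in enumerate(events):
                if event == "request":
                    pending += 1
                elif event == "response":
                    if pending == 0:
                        print(f"violation at event {i}: response without request")
                    else:
                        pending -= 1
            if pending:
                print(f"warning: {pending} request(s) never answered")

        monitor(["request", "response", "request", "request", "response"])
        # -> warning: 1 request(s) never answered

    Tools such as nfer go further by abstracting an event stream into a hierarchy of intervals, but the basic mode of operation, observing an execution and checking it against a formally specified property, is the same.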

    An ontological analysis of vague motion verbs, with an application to event recognition

    This research presents a methodology for the ontological formalisation of vague spatial concepts from natural language, with an application to the automatic recognition of event occurrences in video data. The main issue faced when defining concepts sourced from language is vagueness, related to the presence of ambiguities and borderline cases even in simple concepts such as ‘near’, ‘fast’, ‘big’, etc. Other issues specific to this semantic domain are saliency, granularity and uncertainty. In this work, the issue of vagueness in formal semantics is discussed and a methodology based on supervaluation semantics is proposed. This constitutes the basis for the formalisation of an ontology of vague spatial concepts based on classical logic, Event Calculus and supervaluation semantics. This ontology is structured in layers, where high-level concepts, corresponding to complex actions and events, are inferred through mid-level concepts, corresponding to simple processes and properties of objects, and low-level primitive concepts, representing the most essential spatio-temporal characteristics of the real world. The development of ProVision, an event recognition system based on a logic-programming implementation of the ontology, demonstrates a practical application of the methodology. ProVision grounds the ontology in data representing the content of simple video scenes, leading to the inference of event occurrences and other high-level concepts. The contribution of this research is a methodology for the semantic characterisation of vague and qualitative concepts. This methodology addresses the issue of vagueness in ontologies and demonstrates the applicability of a supervaluationist approach to the formalisation of vague concepts. It is also shown to be effective in solving a practical reasoning task, namely the event recognition on which this work focuses.
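
    To illustrate the supervaluationist treatment of a vague concept such as ‘near’ (a minimal sketch with made-up thresholds, not the ProVision implementation), a statement is taken to be supertrue when it holds under every admissible precisification, superfalse when it holds under none, and borderline otherwise:

        # Admissible precisifications of 'near': candidate distance thresholds
        # (hypothetical values, in metres).
        THRESHOLDS = [1.0, 2.0, 5.0]

        def near_supervalued(distance):
            """Classify 'the object is near' under supervaluation semantics."""
            verdicts = [distance <= t for t in THRESHOLDS]
            if all(verdicts):
                return "supertrue"    # near under every precisification
            if not any(verdicts):
                return "superfalse"   # near under no precisification
            return "borderline"       # near under some but not all

        print(near_supervalued(0.5))  # supertrue
        print(near_supervalued(3.0))  # borderline
        print(near_supervalued(9.0))  # superfalse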

    Programming Languages and Systems

    This open access book constitutes the proceedings of the 28th European Symposium on Programming, ESOP 2019, which took place in Prague, Czech Republic, in April 2019, held as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2019.

    Uncertainty in Artificial Intelligence: Proceedings of the Thirty-Fourth Conference
