
    Probabilistic inference in SWI-Prolog

    Probabilistic Logic Programming (PLP) has emerged as one of the most prominent approaches for coping with real-world domains. The distribution semantics is one of the most widely used semantics in PLP, as it is followed by many languages, such as Independent Choice Logic, PRISM, pD, Logic Programs with Annotated Disjunctions (LPADs) and ProbLog. One system that performs inference on LPADs is PITA, which transforms the input LPAD into a Prolog program containing calls to library predicates for handling Binary Decision Diagrams (BDDs). In particular, BDDs are used to compactly encode explanations for goals and to efficiently compute their probability. However, PITA needs mode-directed tabling (also called tabling with answer subsumption), which has only recently been implemented in SWI-Prolog. This paper shows how SWI-Prolog has been extended to include correct answer subsumption and how the PITA transformation has been changed to use the SWI-Prolog implementation.
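    As a rough, hedged illustration of the kind of program involved (the example clause and probability values are invented, not taken from the paper), an LPAD can be loaded and queried through the pita library of the cplint suite on SWI-Prolog:

        % Minimal sketch, assuming SWI-Prolog with the cplint/pita pack installed.
        :- use_module(library(pita)).
        :- pita.            % set up the transformation and tabling machinery
        :- begin_lpad.

        % An LPAD clause with an annotated disjunction: a tossed coin lands
        % heads or tails with probability 0.5 each.
        heads(C):0.5 ; tails(C):0.5 :- toss(C).
        toss(coin).

        :- end_lpad.

        % Example query:  ?- prob(heads(coin), P).   yields P = 0.5.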

    On continuation-passing transformations and expected cost analysis

    We define a continuation-passing style (CPS) translation for a typed λ-calculus with probabilistic choice, unbounded recursion, and a tick operator for modeling cost. The target language is a (non-probabilistic) λ-calculus, enriched with a type of extended positive reals and a fixpoint operator. We then show that applying the CPS transform of an expression M to the continuation λv. 0 yields the expected cost of M. We also introduce a formal system for higher-order logic, called EHOL, prove it sound, and show it can derive tight upper bounds on the expected cost of classic examples, including Coupon Collector and Random Walk. Moreover, we relate our translation to Kaminski et al.'s ert-calculus, showing that the latter can be recovered by applying our CPS translation to (a generalization of) the classic embedding of imperative programs into λ-calculus. Finally, we prove that the CPS transform of an expression can also be used to compute pre-expectations and to reason about almost sure termination.
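    As a simplified, hedged sketch of the idea, using illustrative notation rather than the paper's exact definitions: writing ⟦M⟧ k for the CPS transform of an expression M applied to a continuation k (which returns an extended positive real), one would expect clauses along the lines of

        \[
        \llbracket \mathtt{tick};\, M \rrbracket\, k \;=\; 1 + \llbracket M \rrbracket\, k,
        \qquad
        \llbracket M \oplus_{p} N \rrbracket\, k \;=\; p \cdot \llbracket M \rrbracket\, k \;+\; (1 - p) \cdot \llbracket N \rrbracket\, k .
        \]

    Applying the transform of, say, tick ⊕_{1/2} (tick; tick) to the continuation λv. 0 then yields (1/2)(1 + 0) + (1/2)(1 + 1 + 0) = 3/2, the expected cost of that expression.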

    SWISH: SWI-Prolog for Sharing

    Recently, a new type of interface for programmers has appeared, based on web technology; examples include JSFiddle, IPython Notebook and R-studio. Web technology enables cloud-based solutions, embedding in tutorial web pages, attractive rendering of results, web-scale cooperative development, etc. This article describes SWISH, a web front-end for Prolog. A public website exposes SWI-Prolog using SWISH; it is used to run small Prolog programs for demonstration, experimentation and education. We connected SWISH to the ClioPatria semantic web toolkit, where it allows for collaborative development of programs and queries related to a dataset as well as for performing maintenance tasks on the running server, and we embedded SWISH in the Learn Prolog Now! online Prolog book. Comment: Proceedings of the International Workshop on User-Oriented Logic Programming (IULP 2015), co-located with the 31st International Conference on Logic Programming (ICLP 2015), Editors: Stefan Ellmauthaler and Claudia Schulz, pages 99-113, August 2015

    kLog: A Language for Logical and Relational Learning with Kernels

    We introduce kLog, a novel approach to statistical relational learning. Unlike standard approaches, kLog does not represent a probability distribution directly. It is rather a language for performing kernel-based learning on expressive logical and relational representations. kLog allows users to specify learning problems declaratively. It builds on simple but powerful concepts: learning from interpretations, entity/relationship data modeling, logic programming, and deductive databases. Access by the kernel to the rich representation is mediated by a technique we call graphicalization: the relational representation is first transformed into a graph, in particular a grounded entity/relationship diagram. Subsequently, a choice of graph kernel defines the feature space. kLog supports mixed numerical and symbolic data, as well as background knowledge in the form of Prolog or Datalog programs, as in inductive logic programming systems. The kLog framework can be applied to tackle the same range of tasks that has made statistical relational learning so popular, including classification, regression, multitask learning, and collective classification. We also report on empirical comparisons, showing that kLog can be either more accurate, or much faster at the same level of accuracy, than Tilde and Alchemy. kLog is GPLv3 licensed and is available at http://klog.dinfo.unifi.it along with tutorials.
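    To make "learning from interpretations" over an entity/relationship representation more concrete, a single interpretation (here a toy molecule) can be written as a set of ground Prolog facts; the graphicalization step would then turn such entities and relations into the nodes and edges of a grounded E/R graph. The predicate names below are purely illustrative and are not kLog's actual declaration syntax.

        % One interpretation (a toy molecule), given as plain ground facts.
        % Entities:
        atm(a1, c).            % atom a1 is a carbon
        atm(a2, o).            % atom a2 is an oxygen
        atm(a3, h).            % atom a3 is a hydrogen
        % Relations between entities:
        bond(a1, a2, double).  % a double bond between a1 and a2
        bond(a1, a3, single).  % a single bond between a1 and a3
        % Target property to be predicted for this interpretation:
        mutagenic(yes).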

    Neural Probabilistic Logic Programming in Discrete-Continuous Domains

    Neural-symbolic AI (NeSy) allows neural networks to exploit symbolic background knowledge in the form of logic. It has been shown to aid learning in the limited data regime and to facilitate inference on out-of-distribution data. Probabilistic NeSy focuses on integrating neural networks with both logic and probability theory, which additionally allows learning under uncertainty. A major limitation of current probabilistic NeSy systems, such as DeepProbLog, is their restriction to finite probability distributions, i.e., discrete random variables. In contrast, deep probabilistic programming (DPP) excels in modelling and optimising continuous probability distributions. Hence, we introduce DeepSeaProbLog, a neural probabilistic logic programming language that incorporates DPP techniques into NeSy. Doing so results in the support of inference and learning of both discrete and continuous probability distributions under logical constraints. Our main contributions are 1) the semantics of DeepSeaProbLog and its corresponding inference algorithm, 2) a proven asymptotically unbiased learning algorithm, and 3) a series of experiments that illustrate the versatility of our approach. Comment: 27 pages, 9 figures
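    As a hedged sketch of the kind of program such a language targets, written in ProbLog-like pseudocode (the syntax below is illustrative and is not DeepSeaProbLog's actual surface syntax): a continuous random variable and a neural predicate feed into an ordinary logical rule, so inference has to handle discrete and continuous randomness under a logical constraint.

        % Illustrative pseudocode only; not DeepSeaProbLog's actual API.
        % A continuous random variable: the measured size of an object.
        size(Obj) ~ normal(3.0, 1.0).
        % A neural predicate: a network classifies the object's image as a digit
        % (this clause shape mirrors DeepProbLog-style neural annotations).
        nn(digit_net, [Img], D, [0,1,2,3,4,5,6,7,8,9]) :: digit(Img, D).
        % A logical rule combining both: the object passes if it is small enough
        % and its predicted digit label is even.
        passes(Obj, Img) :- size(Obj) < 2.5, digit(Img, D), 0 =:= D mod 2.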