
    Computation of Kullback–Leibler Divergence in Bayesian Networks

    Kullback–Leibler divergence KL(p, q) is the standard measure of error when a true probability distribution p is approximated by a probability distribution q. Its efficient computation is essential in many tasks, for example in approximate computation or as a measure of error when learning a probability distribution. For high-dimensional distributions, such as those associated with Bayesian networks, a direct computation can be infeasible. This paper considers the problem of efficiently computing the Kullback–Leibler divergence of two probability distributions, each coming from a different Bayesian network, possibly with different structures. The approach is based on an auxiliary deletion algorithm that computes the necessary marginal distributions, using a cache of operations with potentials in order to reuse past computations whenever they are needed. The algorithms are tested with Bayesian networks from the bnlearn repository. Computer code in Python is provided, built on pgmpy, a library for working with probabilistic graphical models. Funding: Spanish Ministry of Education and Science under project PID2019-106758GB-C31; European Regional Development Fund (FEDER).
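
    As a point of reference (not the cache-based deletion algorithm the paper describes), the sketch below illustrates the quantity in question: KL(p, q) = Σ_x p(x) log(p(x)/q(x)) over the joint distributions of two small discrete Bayesian networks with different structures, computed by brute-force enumeration. The network structures, variable names, and CPT numbers are illustrative assumptions; brute force is only viable for a handful of variables, which is exactly the limitation the paper's method addresses.

    ```python
    import itertools
    import math

    # Illustrative two-variable example (all CPT numbers are made up).
    # Network p:  A -> B           joint p(a, b) = p(a) * p(b | a)
    # Network q:  A   B (no edge)  joint q(a, b) = q(a) * q(b)
    p_A = {0: 0.7, 1: 0.3}
    p_B_given_A = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}  # p_B_given_A[a][b]
    q_A = {0: 0.6, 1: 0.4}
    q_B = {0: 0.75, 1: 0.25}

    def joint_p(a, b):
        return p_A[a] * p_B_given_A[a][b]

    def joint_q(a, b):
        return q_A[a] * q_B[b]

    def kl_divergence():
        """KL(p, q) = sum over all joint assignments x of p(x) * log(p(x) / q(x))."""
        total = 0.0
        for a, b in itertools.product([0, 1], repeat=2):
            px, qx = joint_p(a, b), joint_q(a, b)
            if px > 0.0:  # the term for p(x) = 0 is taken as 0
                total += px * math.log(px / qx)
        return total

    print(f"KL(p, q) = {kl_divergence():.4f}")
    ```

    Enumerating the joint is exponential in the number of variables; the paper's contribution is computing the same quantity from the two networks' factorizations via marginal computations and a cache of reused potential operations.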

    Reduction of Computational Complexity in Bayesian Networks through Removal of Weak Dependences

    The paper presents a method for reducing the computational complexity of Bayesian networks through identification and removal of weak dependences (removal of links from the (moralized) independence graph). The removal of a small number of links may reduce the computational complexity dramatically, since several fill-ins and moral links may be rendered superfluous by the removal. The method is described in terms of its impact on the independence graph, the junction tree, and the potential functions associated with these. An empirical evaluation of the method using large real-world networks demonstrates its applicability. Further, the method, which has been implemented in Hugin, complements the approximation method suggested by Jensen & Andersen (1990).
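
    One natural way to quantify how "weak" a link is, shown below purely as an illustration (the paper's exact criterion may differ), is the expected KL divergence between a child's original CPT and the CPT obtained by averaging the candidate parent out under that parent's marginal. All variable names, the table layout, and the toy numbers are assumptions for the sketch.

    ```python
    import math

    def link_weakness(cpt, parent_marginal):
        """
        Score a link parent -> child by the expected KL divergence between the
        original CPT P(child | parent, rest) and the approximation obtained by
        averaging the parent out: P'(child | rest) = sum_p P(parent=p) P(child | p, rest).

        cpt:             dict mapping (parent_state, rest_state) -> {child_state: prob}
        parent_marginal: dict mapping parent_state -> prob
        A small score suggests the link can be removed with little error.
        """
        rest_states = {r for (_, r) in cpt}
        parent_states = list(parent_marginal)

        score = 0.0
        for r in rest_states:
            # Child distribution with the parent averaged out.
            child_states = cpt[(parent_states[0], r)].keys()
            averaged = {c: sum(parent_marginal[p] * cpt[(p, r)][c] for p in parent_states)
                        for c in child_states}
            # Expected KL between the original rows and the averaged row.
            for p in parent_states:
                kl = sum(cpt[(p, r)][c] * math.log(cpt[(p, r)][c] / averaged[c])
                         for c in child_states if cpt[(p, r)][c] > 0.0)
                score += parent_marginal[p] * kl
        return score

    # Toy example: child B with parent A (candidate for removal) and one other parent ("rest").
    # The rows for A=0 and A=1 are nearly identical, so the link A -> B scores as weak.
    cpt_B = {
        (0, 0): {0: 0.90, 1: 0.10}, (1, 0): {0: 0.88, 1: 0.12},
        (0, 1): {0: 0.30, 1: 0.70}, (1, 1): {0: 0.32, 1: 0.68},
    }
    print(f"weakness of A -> B: {link_weakness(cpt_B, {0: 0.5, 1: 0.5}):.5f}")
    ```

    Once the weakest links are removed, the moralized graph loses the corresponding moral links and fill-ins, which is where the reduction in junction-tree size, and hence in computational complexity, comes from.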

    Value of information in decision systems

    Ph.D. thesis (Doctor of Philosophy)