579 research outputs found

    Logarithmic Time Parallel Bayesian Inference

    I present a parallel algorithm for exact probabilistic inference in Bayesian networks. For polytree networks with n variables, the worst-case time complexity is O(log n) on a CREW PRAM (concurrent-read, exclusive-write parallel random-access machine) with n processors, for any constant number of evidence variables. For arbitrary networks, the time complexity is O(r^{3w} log n) for n processors, or O(w log n) for r^{3w} n processors, where r is the maximum range of any variable and w is the induced width (the maximum clique size after moralizing and triangulating the network). Comment: Appears in Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence (UAI-1998).
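    The O(log n) bound rests on the fact that an associative combination of n quantities can be evaluated in logarithmic parallel depth. The sketch below illustrates only that reduction pattern in plain Python, with each while-iteration standing in for one parallel round; it is not the paper's PRAM algorithm, and the function name is hypothetical.

```python
# Hypothetical sketch: an associative combine over n items can be evaluated
# in O(log n) parallel rounds by halving the list each round. This only
# illustrates the logarithmic-depth idea, not the paper's PRAM algorithm.
from functools import reduce

def log_depth_reduce(items, combine):
    """Pairwise-reduce `items`; each while-iteration is one parallel round."""
    while len(items) > 1:
        paired = [combine(items[i], items[i + 1])
                  for i in range(0, len(items) - 1, 2)]
        if len(items) % 2:          # odd element carries over unchanged
            paired.append(items[-1])
        items = paired              # O(log n) rounds in total
    return items[0]

# Toy check against a sequential fold (multiplication is associative).
vals = [2, 3, 5, 7, 11]
assert log_depth_reduce(vals, lambda a, b: a * b) == reduce(lambda a, b: a * b, vals)
```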

    Global Conditioning for Probabilistic Inference in Belief Networks

    In this paper we propose a new approach to probabilistic inference on belief networks, global conditioning, which is a simple generalization of Pearl's (1986b) method of loop-cutset conditioning. We show that global conditioning, as well as loop-cutset conditioning, can be thought of as a special case of the method of Lauritzen and Spiegelhalter (1988) as refined by Jensen et al. (1990a; 1990b). Nonetheless, this approach provides new opportunities for parallel processing and, in the case of sequential processing, a tradeoff of time for memory. We also show how a hybrid method (Suermondt et al., 1990) combining loop-cutset conditioning with Jensen's method can be viewed within our framework. By exploring the relationships between these methods, we develop a unifying framework in which the advantages of each approach can be combined successfully. Comment: Appears in Proceedings of the Tenth Conference on Uncertainty in Artificial Intelligence (UAI-1994).
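    The time-for-memory tradeoff in conditioning methods comes from enumerating instantiations of a cutset: clamping the cutset renders the network singly connected, so a cheap polytree algorithm can be run per instantiation and the answers mixed. A minimal sketch of that enumeration, assuming a hypothetical polytree_posterior solver and network format that are not taken from the paper:

```python
# Hypothetical sketch of conditioning: instantiate the cutset variables,
# solve each simplified (singly connected) problem, and mix the answers.
# `polytree_posterior` and the network format are assumptions, not the paper's.
from itertools import product

def conditioned_posterior(query, evidence, cutset_domains, polytree_posterior):
    """Mix posteriors over all cutset instantiations, weighted by their scores.

    cutset_domains: dict var -> iterable of values
    polytree_posterior(query, evidence) -> (posterior dict, evidence weight)
    """
    mix, total = {}, 0.0
    names = list(cutset_domains)
    for values in product(*(cutset_domains[v] for v in names)):
        ev = {**evidence, **dict(zip(names, values))}  # clamp the cutset
        post, weight = polytree_posterior(query, ev)   # cheap: now a polytree
        total += weight
        for x, p in post.items():
            mix[x] = mix.get(x, 0.0) + weight * p
    return {x: p / total for x, p in mix.items()}
```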

    Optimal Decomposition of Belief Networks

    In this paper, optimal decomposition of belief networks is discussed. Some methods of decomposition are examined and a new method - the method of Minimum Total Number of States (MTNS) - is proposed. The problem of optimal belief network decomposition under our framework, as under all the other frameworks, is shown to be NP-hard. Guided by this computational complexity analysis, a belief network decomposition algorithm based on simulated annealing is proposed in (Wee, 1990a). Comment: Appears in Proceedings of the Sixth Conference on Uncertainty in Artificial Intelligence (UAI-1990).
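    Since exact optimization is NP-hard, a stochastic local search such as simulated annealing is a natural fallback. The sketch below is a generic annealing loop over candidate decompositions minimizing an MTNS-like cost; total_states and random_neighbor are assumed placeholders, not the paper's definitions:

```python
# Hypothetical sketch of simulated annealing over candidate decompositions,
# minimizing a cost such as the total number of clique states (MTNS-like).
# `random_neighbor` and `total_states` are assumed placeholders.
import math
import random

def anneal(initial, total_states, random_neighbor,
           t0=10.0, cooling=0.99, steps=10_000):
    current, best = initial, initial
    t = t0
    for _ in range(steps):
        cand = random_neighbor(current)
        delta = total_states(cand) - total_states(current)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current = cand                      # accept downhill or lucky uphill
            if total_states(current) < total_states(best):
                best = current
        t *= cooling                            # geometric cooling schedule
    return best
```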

    Logarithmic-Time Updates and Queries in Probabilistic Networks

    Traditional databases commonly support efficient query and update procedures that operate in time which is sublinear in the size of the database. Our goal in this paper is to take a first step toward dynamic reasoning in probabilistic databases with comparable efficiency. We propose a dynamic data structure that supports efficient algorithms for updating and querying singly connected Bayesian networks. In the conventional algorithm, new evidence is absorbed in O(1) time and queries are processed in time O(N), where N is the size of the network. We propose an algorithm which, after a preprocessing phase, allows us to answer queries in time O(log N) at the expense of O(log N) time per evidence absorption. The usefulness of sublinear processing time manifests itself in applications requiring (near) real-time response over large probabilistic databases. We briefly discuss a potential application of dynamic probabilistic reasoning in computational biology. Comment: See http://www.jair.org/ for any accompanying files.
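    The flavor of this O(log N)-update / O(log N)-query tradeoff can be conveyed by a segment tree: precomputed partial combinations mean a single change touches only the O(log N) nodes on its path to the root. This is only an illustrative analogue (with a commutative combine), not the paper's data structure:

```python
# Hypothetical sketch: a segment tree stores partial combinations of N leaf
# factors so a single leaf update touches only O(log N) internal nodes.
# Assumes a commutative, associative combine; illustrative only.
class SegmentTree:
    def __init__(self, leaves, combine, identity):
        self.n = len(leaves)
        self.combine, self.identity = combine, identity
        self.tree = [identity] * self.n + list(leaves)
        for i in range(self.n - 1, 0, -1):
            self.tree[i] = combine(self.tree[2 * i], self.tree[2 * i + 1])

    def update(self, i, value):                  # O(log N): walk to the root
        i += self.n
        self.tree[i] = value
        while i > 1:
            i //= 2
            self.tree[i] = self.combine(self.tree[2 * i], self.tree[2 * i + 1])

    def query(self, lo, hi):                     # combine over [lo, hi), O(log N)
        res = self.identity
        lo += self.n
        hi += self.n
        while lo < hi:
            if lo & 1:
                res = self.combine(res, self.tree[lo]); lo += 1
            if hi & 1:
                hi -= 1; res = self.combine(res, self.tree[hi])
            lo //= 2
            hi //= 2
        return res

st = SegmentTree([1, 2, 3, 4], combine=lambda a, b: a * b, identity=1)
st.update(2, 10)                                 # absorb "new evidence" at leaf 2
assert st.query(0, 4) == 1 * 2 * 10 * 4
```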

    Computational Advantages of Relevance Reasoning in Bayesian Belief Networks

    This paper introduces a computational framework for reasoning in Bayesian belief networks that derives significant advantages from focused inference and relevance reasoning. This framework is based on d-separation and other simple and computationally efficient techniques for pruning irrelevant parts of a network. Our main contribution is a technique that we call relevance-based decomposition. Relevance-based decomposition approaches belief updating in large networks by focusing on their parts and decomposing them into partially overlapping subnetworks. This makes reasoning in some intractable networks possible and, in addition, often results in significant speedup, as the total time taken to update all subnetworks is in practice often considerably less than the time taken to update the network as a whole. We report results of empirical tests that demonstrate the practical significance of our approach. Comment: Appears in Proceedings of the Thirteenth Conference on Uncertainty in Artificial Intelligence (UAI-1997).
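    One of the simplest relevance techniques the paper's family of methods builds on is barren-node removal: for P(Q | E), nodes that are not ancestors of the query or evidence nodes cannot affect the answer and can be pruned before inference. A minimal sketch of that ancestor computation, with an assumed parents-dict DAG format (the paper's framework goes further, using d-separation as well):

```python
# Hypothetical sketch of one relevance technique (barren-node removal):
# for P(Q | E), only ancestors of Q or E can affect the answer, so the
# rest of the network may be pruned. The DAG format is an assumption.
def relevant_nodes(parents, query, evidence):
    """parents: dict node -> list of parent nodes (a DAG).
    Returns the set of ancestors of query | evidence, inclusive."""
    keep, stack = set(), list(query | evidence)
    while stack:
        node = stack.pop()
        if node in keep:
            continue
        keep.add(node)
        stack.extend(parents.get(node, []))
    return keep

# Toy DAG: A -> B -> C, D -> C, C -> E; C and E are barren for this query.
dag = {"B": ["A"], "C": ["B", "D"], "E": ["C"]}
assert relevant_nodes(dag, query={"B"}, evidence={"D"}) == {"A", "B", "D"}
```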

    An Algorithm for the Construction of Bayesian Network Structures from Data

    Previous algorithms for the construction of Bayesian belief network structures from data have been either highly dependent on conditional independence (CI) tests, or have required an ordering on the nodes to be supplied by the user. We present an algorithm that integrates these two approaches - CI tests are used to generate an ordering on the nodes from the database, which is then used to recover the underlying Bayesian network structure using a non-CI-based method. Results of a preliminary evaluation of the algorithm on two networks (ALARM and LED) are presented. We also discuss some algorithm performance issues and open problems. Comment: Appears in Proceedings of the Ninth Conference on Uncertainty in Artificial Intelligence (UAI-1993).
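    Once an ordering is fixed, the classic non-CI approach is a K2-style greedy score-based parent search: each node may only draw parents from nodes earlier in the ordering, added one at a time while the score improves. A minimal sketch, with score(node, parents, data) an assumed placeholder (e.g., a Bayesian scoring metric) rather than the paper's exact procedure:

```python
# Hypothetical sketch of the second stage: given a node ordering, a K2-style
# greedy search adds the parent that most improves a scoring function.
# `score(node, parents, data)` is an assumed placeholder.
def greedy_parents(order, data, score, max_parents=3):
    structure = {}
    for i, node in enumerate(order):
        parents, best = set(), score(node, set(), data)
        candidates = set(order[:i])          # only earlier nodes may be parents
        improved = True
        while improved and len(parents) < max_parents:
            improved = False
            gains = {c: score(node, parents | {c}, data)
                     for c in candidates - parents}
            if gains:
                c, s = max(gains.items(), key=lambda kv: kv[1])
                if s > best:
                    parents.add(c)
                    best, improved = s, True
        structure[node] = parents
    return structure
```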

    Ergo: A Graphical Environment for Constructing Bayesian Belief Networks

    We describe an environment that considerably simplifies the process of generating Bayesian belief networks. The system has been implemented on readily available, inexpensive hardware, and provides clarity and high performance. We present an introduction to Bayesian belief networks, discuss algorithms for inference with these networks, and delineate the classes of problems that can be solved with this paradigm. We then describe the hardware and software that constitute the system, and illustrate Ergo's use with several examples. Comment: Appears in Proceedings of the Sixth Conference on Uncertainty in Artificial Intelligence (UAI-1990).

    Evidence Absorption and Propagation through Evidence Reversals

    The arc reversal/node reduction approach to probabilistic inference is extended to include the case of instantiated evidence by an operation called "evidence reversal." This not only provides a technique for computing posterior joint distributions on general belief networks, but also provides insight into the methods of Pearl [1986b] and Lauritzen and Spiegelhalter [1988]. Although it is well understood that the latter two algorithms are closely related, in fact all three algorithms are identical whenever the belief network is a forest. Comment: Appears in Proceedings of the Fifth Conference on Uncertainty in Artificial Intelligence (UAI-1989).
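    At its core, reversing an arc X -> Y is an application of Bayes' rule: the pair p(X), p(Y|X) is replaced by p(Y), p(X|Y). A toy numeric sketch for two binary nodes, with parameter values made up for illustration:

```python
# Hypothetical numeric sketch of arc reversal on two binary nodes X -> Y:
# Bayes' rule turns p(X) and p(Y|X) into p(Y) and p(X|Y), reversing the arc.
import numpy as np

p_x = np.array([0.7, 0.3])                 # p(X); values assumed
p_y_given_x = np.array([[0.9, 0.1],        # p(Y|X); rows: X, columns: Y
                        [0.2, 0.8]])

p_xy = p_x[:, None] * p_y_given_x          # joint p(X, Y)
p_y = p_xy.sum(axis=0)                     # new marginal p(Y)
p_x_given_y = p_xy / p_y                   # new conditional p(X | Y)

assert np.allclose(p_x_given_y.sum(axis=0), 1.0)   # columns are distributions
```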

    Anytime Inference in Valuation Algebras

    Anytime inference is inference performed incrementally, with the accuracy of the inference being controlled by a tunable parameter, usually time. Such anytime inference algorithms are also usually interruptible, gradually converging to the exact inference value until terminated. While anytime inference algorithms for specific domains like probability potentials exist in the literature, our objective in this article is to obtain an anytime inference algorithm which is sufficiently generic to cover a wide range of domains. To this end we utilise the theory of generic inference as a basis for constructing an anytime inference algorithm, in particular by extending work done on ordered valuation algebras. The novel contribution of this work is the construction of anytime algorithms in a generic framework, which automatically gives us instantiations in various useful domains. We also show how to apply this generic framework for anytime inference in semiring-induced valuation algebras, an important subclass of valuation algebras, which includes instances like probability potentials, disjunctive normal forms and distributive lattices. Keywords: Approximation; Anytime algorithms; Resource-bounded computation; Generic inference; Valuation algebras; Local computation; Binary join trees. Comment: 9 pages, 1 figure.
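    A valuation algebra needs only two operations, combination and marginalization, satisfying a few axioms, and a semiring supplies both. The sketch below is a generic table-based valuation in Python; the class and its interface are assumptions for illustration, not the article's formalism. Instantiating it with (+, *) on the reals yields probability potentials:

```python
# Hypothetical sketch of a semiring-induced valuation: tables over variable
# sets with combination (pointwise semiring-multiply) and marginalization
# (semiring-add over dropped variables). Interface assumed for illustration.
from itertools import product

class Valuation:
    def __init__(self, variables, domains, table, add, mul):
        self.vars, self.doms = tuple(variables), domains
        self.table, self.add, self.mul = table, add, mul   # table: tuple -> value

    def combine(self, other):
        vs = self.vars + tuple(v for v in other.vars if v not in self.vars)
        out = {}
        for cfg in product(*(self.doms[v] for v in vs)):
            env = dict(zip(vs, cfg))
            out[cfg] = self.mul(self.table[tuple(env[v] for v in self.vars)],
                                other.table[tuple(env[v] for v in other.vars)])
        return Valuation(vs, self.doms, out, self.add, self.mul)

    def marginalize(self, keep):
        vs = tuple(v for v in self.vars if v in keep)
        out = {}
        for cfg, val in self.table.items():
            env = dict(zip(self.vars, cfg))
            key = tuple(env[v] for v in vs)
            out[key] = self.add(out[key], val) if key in out else val
        return Valuation(vs, self.doms, out, self.add, self.mul)

# The (+, *) instantiation recovers probability potentials.
doms = {"A": [0, 1], "B": [0, 1]}
add, mul = lambda a, b: a + b, lambda a, b: a * b
p_a = Valuation(["A"], doms, {(0,): 0.4, (1,): 0.6}, add, mul)
p_b_a = Valuation(["A", "B"], doms,
                  {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}, add, mul)
p_b = p_a.combine(p_b_a).marginalize(["B"])
assert abs(sum(p_b.table.values()) - 1.0) < 1e-9
```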

    Tutorial on Exact Belief Propagation in Bayesian Networks: from Messages to Algorithms

    In Bayesian networks, exact belief propagation is achieved through message passing algorithms. These algorithms (e.g., inward and outward) provide only a recursive definition of the corresponding messages. In contrast, when working with hidden Markov models and their variants, one classically first defines these messages explicitly (the forward and backward quantities), and then derives all results and algorithms from them. In this paper, we generalize the hidden Markov model approach by introducing an explicit definition of the messages in Bayesian networks, from which we derive all the relevant properties and results, including the recursive algorithms that allow these messages to be computed. Two didactic examples (the precipitation hidden Markov model and the pedigree Bayesian network) are used throughout the paper to illustrate the new formalism, and standalone R source code is provided in the appendix.
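    The explicit-message viewpoint is easiest to see on an HMM, where the forward message alpha_t(x) = p(y_1, ..., y_t, X_t = x) has a closed recursive definition. The paper's appendix provides R code; the Python toy below, with made-up parameters, mirrors just the forward recursion:

```python
# Hypothetical sketch of the explicit-message idea on an HMM: the forward
# message alpha_t(x) = p(y_1..t, X_t = x) is defined by a closed recursion.
import numpy as np

def forward(init, trans, emit, obs):
    """init[x], trans[x, x'], emit[x, y]; returns alpha of shape (T, n_states)."""
    alpha = np.zeros((len(obs), len(init)))
    alpha[0] = init * emit[:, obs[0]]                       # base case
    for t in range(1, len(obs)):
        alpha[t] = (alpha[t - 1] @ trans) * emit[:, obs[t]]  # recursion
    return alpha

# Two-state toy chain (parameters assumed); p(y_{1:T}) is the sum of the
# final message, which is how the likelihood falls out of the formalism.
init = np.array([0.5, 0.5])
trans = np.array([[0.7, 0.3], [0.4, 0.6]])
emit = np.array([[0.9, 0.1], [0.2, 0.8]])
alpha = forward(init, trans, emit, obs=[0, 1, 0])
print(alpha[-1].sum())                                      # sequence likelihood
```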