
    An Axiomatic Framework for Bayesian and Belief-function Propagation

    In this paper, we describe an abstract framework and axioms under which exact local computation of marginals is possible. The primitive objects of the framework are variables and valuations. The primitive operators of the framework are combination and marginalization; these operate on valuations. We state three axioms for these operators and derive the possibility of local computation from the axioms. Next, we describe a propagation scheme for computing marginals of a valuation when we have a factorization of the valuation on a hypertree. Finally, we show how the problem of computing marginals of joint probability distributions and joint belief functions fits the general framework.
    Comment: Appears in Proceedings of the Fourth Conference on Uncertainty in Artificial Intelligence (UAI1988)
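
    For the probabilistic instance of the framework, combination is pointwise multiplication of potentials and marginalization sums variables out. The following is a minimal Python sketch under that reading; the class and function names (Potential, combine, marginalize) are ours, not the paper's.

```python
# A minimal sketch of the valuation framework for probability potentials,
# assuming valuations are tables over named discrete variables.
import itertools

class Potential:
    def __init__(self, variables, domains, table):
        self.variables = tuple(variables)   # e.g. ("A", "B")
        self.domains = dict(domains)        # e.g. {"A": [0, 1], "B": [0, 1]}
        self.table = dict(table)            # value tuples (in variable order) -> reals

def combine(p, q):
    """Combination: pointwise product on the union of the variable sets."""
    variables = p.variables + tuple(v for v in q.variables if v not in p.variables)
    domains = {**p.domains, **q.domains}
    table = {}
    for values in itertools.product(*(domains[v] for v in variables)):
        assignment = dict(zip(variables, values))
        pv = tuple(assignment[v] for v in p.variables)
        qv = tuple(assignment[v] for v in q.variables)
        table[values] = p.table[pv] * q.table[qv]
    return Potential(variables, domains, table)

def marginalize(p, keep):
    """Marginalization: sum out every variable not in `keep`."""
    keep = tuple(v for v in p.variables if v in keep)
    table = {}
    for values, weight in p.table.items():
        assignment = dict(zip(p.variables, values))
        key = tuple(assignment[v] for v in keep)
        table[key] = table.get(key, 0.0) + weight
    return Potential(keep, {v: p.domains[v] for v in keep}, table)

# Example: two potentials sharing variable "B".
pa = Potential(("A", "B"), {"A": [0, 1], "B": [0, 1]},
               {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4})
pb = Potential(("B",), {"B": [0, 1]}, {(0,): 0.5, (1,): 0.5})
joint = combine(pa, pb)
print(marginalize(joint, {"A"}).table)   # {(0,): 0.25, (1,): 0.25}
```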

    Beliefs in Markov Trees - From Local Computations to Local Valuation

    This paper is devoted to the expressiveness of hypergraphs for which uncertainty propagation by local computations via the Shenoy/Shafer method applies. It is demonstrated that, for this propagation method and a given joint belief distribution, no valuation of the hyperedges of a hypergraph can provide a simpler hypergraph structure than valuation of the hyperedges by conditional distributions. This has the vital implication that methods recovering belief networks from data have no better alternative for finding the simplest hypergraph structure for belief propagation. A method for recovering tree-structured belief networks has been developed and specialized for Dempster-Shafer belief functions.
    Comment: Preliminary version of the conference paper: M.A. Kłopotek: Beliefs in Markov Trees - From Local Computations to Local Valuation. [in:] R. Trappl, Ed.: Cybernetics and Systems Research, Proc. 12th European Meeting on Cybernetics and System Research, Vienna, 5-8 April 1994, World Scientific Publishers, Vol. 1, pp. 351-35
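
    Recovering a tree-structured network typically amounts to building a maximum-weight spanning tree over pairwise dependence scores, in the Chow-Liu style. The abstract does not spell out the paper's algorithm, so the sketch below only illustrates that standard skeleton; the `scores` input (e.g. empirical mutual information) is an assumed, precomputed ingredient.

```python
# A hedged sketch of tree-structured network recovery in the Chow-Liu style.
# `scores[(u, v)]` is assumed to hold a pairwise dependence score, e.g.
# mutual information estimated from data; computing it is omitted here.

def recover_tree(variables, scores):
    """Greedy maximum-weight spanning tree (Kruskal) over dependence scores."""
    parent = {v: v for v in variables}

    def find(v):                       # union-find with path compression
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    edges = sorted(scores, key=scores.get, reverse=True)
    tree = []
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:                   # adding the edge keeps the graph acyclic
            parent[ru] = rv
            tree.append((u, v))
    return tree

# Example: four variables with made-up scores.
scores = {("A", "B"): 0.9, ("B", "C"): 0.7, ("A", "C"): 0.2, ("C", "D"): 0.5}
print(recover_tree("ABCD", scores))    # [('A', 'B'), ('B', 'C'), ('C', 'D')]
```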

    Independence, Conditionality and Structure of Dempster-Shafer Belief Functions

    Several approaches to structuring (factorization, decomposition) of Dempster-Shafer joint belief functions from the literature are reviewed, with special emphasis on their capability to capture independence, from the point of view of the claim that belief functions generalize the Bayesian notion of probability. It is demonstrated that Zhu and Lee's {Zhu:93} logical networks and Smets' {Smets:93} directed acyclic graphs are unable to capture the statistical dependence/independence of Bayesian networks {Pearl:88}. On the other hand, though Shenoy and Shafer's hypergraphs can explicitly represent the Bayesian network factorization of Bayesian belief functions, they disclaim any need for representation of independence of variables in belief functions. Cano et al. {Cano:93} reject the hypergraph representation of Shenoy and Shafer precisely on the grounds that it lacks a representation of variable independence, but in their framework some belief functions that are factorizable in the Shenoy/Shafer framework cannot be factorized. The approach in {Klopotek:93f}, on the other hand, combines the merits of both the Cano et al. and the Shenoy/Shafer approaches: for the Shenoy/Shafer approach, no factorization simpler than that of {Klopotek:93f} exists, while all independences among variables captured in the Cano et al. framework, and many more, are captured in the {Klopotek:93f} approach.
    Comment: 1994 internal report

    Evidential Reasoning with Conditional Belief Functions

    In existing evidential networks with belief functions, the relations among the variables are always represented by joint belief functions on the product space of the involved variables. In this paper, we use conditional belief functions to represent such relations in the network and show some relations between these two kinds of representations. We also present a propagation algorithm for such networks. By analyzing the properties of some special evidential networks with conditional belief functions, we show that the reasoning process can be simplified in such networks.
    Comment: Appears in Proceedings of the Tenth Conference on Uncertainty in Artificial Intelligence (UAI1994)
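
    For concreteness, the basic operation underlying propagation with belief functions is Dempster's rule of combination on mass functions. A minimal sketch follows; representing focal elements as frozensets is our choice, not the paper's.

```python
# A minimal sketch of Dempster's rule of combination for two mass functions
# over the same frame of discernment.

def dempster_combine(m1, m2):
    """Combine mass functions m1, m2: dicts mapping frozenset -> mass."""
    joint = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            c = a & b
            if c:
                joint[c] = joint.get(c, 0.0) + wa * wb
            else:
                conflict += wa * wb        # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    k = 1.0 - conflict                     # normalization constant
    return {c: w / k for c, w in joint.items()}

# Two simple pieces of evidence over the frame {x, y, z}.
frame = frozenset("xyz")
m1 = {frozenset("x"): 0.6, frame: 0.4}
m2 = {frozenset("xy"): 0.7, frame: 0.3}
print(dempster_combine(m1, m2))   # {x}: 0.6, {x,y}: 0.28, frame: 0.12 (approx.)
```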

    Knowledge Acquisition, Representation & Manipulation in Decision Support Systems

    In this paper we present a methodology and discuss some implementation issues for a project on a statistical/expert approach to data analysis and knowledge acquisition. We discuss some general assumptions underlying the project. Further, the requirements for a user-friendly computer assistant are specified, along with the nature of tools aiding the researcher. Next, we show some aspects of the belief network approach and the Dempster-Shafer (DST) methodology introduced in practice in the system SEAD. Specifically, we present the application of DST methodology to the belief revision problem. Further, a concept of an interface to probabilistic and DS belief networks, enabling a user to understand the communication with a belief-network-based reasoning system, is presented.
    Comment: Intelligent Information Systems: Proceedings of a Workshop held in Augustów, Poland, 7-11 June 1993, pages 210-23

    Belief Revision in Probability Theory

    In a probability-based reasoning system, Bayes' theorem and its variations are often used to revise the system's beliefs. However, if the explicit conditions and the implicit conditions of probability assignments are properly distinguished, it follows that Bayes' theorem is not a generally applicable revision rule. Upon properly distinguishing belief revision from belief updating, we see that Jeffrey's rule and its variations are not revision rules, either. Without these distinctions, the limitation of the Bayesian approach is often ignored or underestimated. Revision, in its general form, cannot be done in the Bayesian approach, because a probability distribution function alone does not contain the information needed by the operation.
    Comment: Appears in Proceedings of the Ninth Conference on Uncertainty in Artificial Intelligence (UAI1993)
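
    As a concrete point of contrast between the two update rules the abstract mentions: Bayes conditioning fixes P(E) = 1, while Jeffrey's rule only shifts the probability of the evidence partition. A small numeric sketch, with invented numbers:

```python
# Bayes conditioning versus Jeffrey's rule on an invented joint prior.

# Joint prior over (H, E): P(H, E)
p = {("h", "e"): 0.32, ("h", "not_e"): 0.08,
     ("not_h", "e"): 0.18, ("not_h", "not_e"): 0.42}

p_e = p[("h", "e")] + p[("not_h", "e")]          # P(E) = 0.5

# Bayes conditioning on E: P(H | E)
bayes_h = p[("h", "e")] / p_e                    # 0.64

# Jeffrey's rule with new evidence weight P'(E) = 0.8:
# P'(H) = P(H | E) P'(E) + P(H | not E) P'(not E)
q_e = 0.8
p_h_given_not_e = p[("h", "not_e")] / (1 - p_e)  # 0.16
jeffrey_h = bayes_h * q_e + p_h_given_not_e * (1 - q_e)
print(bayes_h, jeffrey_h)                        # 0.64 0.544 (up to float rounding)
```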

    Generalized Variational Inference: Three arguments for deriving new Posteriors

    We advocate an optimization-centric view on, and introduce a novel generalization of, Bayesian inference. Our inspiration is the representation of Bayes' rule as an infinite-dimensional optimization problem (Csiszar, 1975; Donsker and Varadhan, 1975; Zellner, 1988). First, we use it to prove an optimality result for standard Variational Inference (VI): under the proposed view, the standard Evidence Lower Bound (ELBO)-maximizing VI posterior is preferable to alternative approximations of the Bayesian posterior. Next, we argue for generalizing standard Bayesian inference. The need for this arises in situations of severe misalignment between reality and three assumptions underlying standard Bayesian inference: (1) well-specified priors, (2) well-specified likelihoods, (3) the availability of infinite computing power. Our generalization addresses these shortcomings with three arguments and is called the Rule of Three (RoT). We derive it axiomatically and recover existing posteriors as special cases, including the Bayesian posterior and its approximation by standard VI. In contrast, approximations based on alternative ELBO-like objectives violate the axioms. Finally, we study a special case of the RoT that we call Generalized Variational Inference (GVI). GVI posteriors are a large and tractable family of belief distributions specified by three arguments: a loss, a divergence, and a variational family. GVI posteriors have appealing properties, including consistency and an interpretation as approximate ELBO. The last part of the paper explores some attractive applications of GVI in popular machine learning models, including robustness and more appropriate marginals. After deriving black-box inference schemes for GVI posteriors, their predictive performance is investigated on Bayesian Neural Networks and Deep Gaussian Processes, where GVI can comprehensively improve upon existing methods.
    Comment: 103 pages, 23 figures (comprehensive revision of previous version)
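
    The optimization-centric representation referred to here writes the exact Bayesian posterior as the minimizer of expected negative log likelihood plus a KL penalty, and GVI swaps in its three arguments. A LaTeX reconstruction, with notation chosen by us to match the abstract's description of a loss $L$, a divergence $D$, and a variational family $\mathcal{Q}$:

```latex
% Bayes' rule as an infinite-dimensional optimization problem
% (Csiszar, 1975; Donsker and Varadhan, 1975; Zellner, 1988):
q^{*}_{\mathrm{Bayes}}
  = \arg\min_{q \in \mathcal{P}(\Theta)}
    \left\{ \mathbb{E}_{q(\theta)}\!\left[ -\sum_{i=1}^{n} \log p(x_i \mid \theta) \right]
            + \mathrm{KL}\!\left( q \,\|\, \pi \right) \right\}

% The GVI posterior, specified by the three arguments (L, D, Q):
q^{*}_{\mathrm{GVI}}
  = \arg\min_{q \in \mathcal{Q}}
    \left\{ \mathbb{E}_{q(\theta)}\!\left[ \sum_{i=1}^{n} L(\theta, x_i) \right]
            + D\!\left( q \,\|\, \pi \right) \right\}
```

    Standard Bayes is recovered by taking $L$ to be the negative log likelihood, $D = \mathrm{KL}$, and $\mathcal{Q}$ the set of all distributions on $\Theta$.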

    Identification and Interpretation of Belief Structure in Dempster-Shafer Theory

    The Mathematical Theory of Evidence, also called Dempster-Shafer Theory (DST), is known as a foundation for reasoning when knowledge is expressed at various levels of detail. Though much research effort has been committed to this theory since its foundation, many questions remain open. One of the most important open questions seems to be the relationship between frequencies and the Mathematical Theory of Evidence. The theory is blamed for leaving frequencies outside (or aside of) its framework. The seriousness of this accusation is obvious: (1) no experiment may be run to compare the performance of DST-based models of real-world processes against real-world data; (2) data may not serve as the foundation for construction of an appropriate belief model. In this paper we develop a frequentist interpretation of DST that defeats the above argument against it. An immediate consequence is the possibility of developing algorithms that automatically acquire DST belief models from data. We propose three such algorithms for various classes of belief model structures: for tree-structured belief networks, for poly-tree belief networks, and for general-type belief networks.
    Comment: An internal report 199
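
    Any frequentist reading has to connect mass assignments to observable quantities, and the standard candidates are belief and plausibility. The sketch below only shows these derived measures for a given mass function; the frozenset representation is our choice.

```python
# Belief and plausibility derived from a mass function: the two quantities
# a frequentist interpretation must tie to observed data.

def belief(m, a):
    """Bel(A) = sum of masses of focal elements contained in A."""
    return sum(w for b, w in m.items() if b <= a)

def plausibility(m, a):
    """Pl(A) = sum of masses of focal elements intersecting A."""
    return sum(w for b, w in m.items() if b & a)

m = {frozenset("x"): 0.5, frozenset("xy"): 0.3, frozenset("xyz"): 0.2}
a = frozenset("xy")
print(belief(m, a), plausibility(m, a))   # 0.8 1.0
```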

    Robustness Analysis of Bayesian Networks with Local Convex Sets of Distributions

    Robust Bayesian inference is the calculation of posterior probability bounds given perturbations in a probabilistic model. This paper focuses on perturbations that can be expressed locally in Bayesian networks through convex sets of distributions. Two approaches for combining local models are considered. The first approach takes the largest set of joint distributions that is compatible with the local sets of distributions; we show how to reduce this type of robust inference to a linear programming problem. The second approach takes the convex hull of the joint distributions generated from the local sets of distributions; we demonstrate how to apply interior-point optimization methods to generate posterior bounds and how to generate approximations that are guaranteed to converge to correct posterior bounds. We also discuss calculation of bounds for expected utilities and variances, and global perturbation models.
    Comment: Appears in Proceedings of the Thirteenth Conference on Uncertainty in Artificial Intelligence (UAI1997)
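
    The linear-programming flavour of the first approach can be pictured on a toy credal set: with interval constraints on a single distribution, the bounds on the probability of an event are a small LP. This is only an illustration of the LP step, not the paper's full network reduction; the intervals are invented, and `scipy.optimize.linprog` minimizes, so the upper bound negates the objective.

```python
# Bounding P(event) = p0 + p1 over a toy credal set: a distribution over
# three states with invented interval constraints.
from scipy.optimize import linprog

# 0.1 <= p0 <= 0.4, 0.2 <= p1 <= 0.6, 0.1 <= p2 <= 0.5, and p sums to 1.
bounds = [(0.1, 0.4), (0.2, 0.6), (0.1, 0.5)]
A_eq, b_eq = [[1.0, 1.0, 1.0]], [1.0]
c = [1.0, 1.0, 0.0]                     # objective: p0 + p1

lower = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
upper = linprog([-x for x in c], A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(lower.fun, -upper.fun)            # ~0.5 and ~0.9
```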

    Possibilistic Conditioning and Propagation

    We give an axiomatization of confidence transfer - a known conditioning scheme - from the perspective of expectation-based inference in the sense of Gardenfors and Makinson. Then, we use the notion of belief independence to "filter out" different proposals of possibilistic conditioning rules, all of which are variations of confidence transfer. Among the three rules that we consider, only Dempster's rule of conditioning passes the test of supporting the notion of belief independence. Using this conditioning rule, we then show that we can use local computation for computing the desired conditional marginal possibilities of the joint possibility satisfying the given constraints. It turns out that our local computation scheme was already proposed by Shenoy. However, our intuitions are completely different from those of Shenoy. While Shenoy simply defines a local computation scheme that fits his framework of valuation-based systems, we derive that local computation scheme from $\Pi(\beta) = \Pi(\beta \mid \alpha) \cdot \Pi(\alpha)$ and appropriate independence assumptions, just as the Bayesians derive their local computation scheme.
    Comment: Appears in Proceedings of the Tenth Conference on Uncertainty in Artificial Intelligence (UAI1994)
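
    Dempster's rule of conditioning for possibility measures is the product-based rule $\Pi(\beta \mid \alpha) = \Pi(\alpha \wedge \beta) / \Pi(\alpha)$, which makes the chaining identity above hold by construction. A tiny sketch over a finite domain, with an invented possibility distribution:

```python
# Product-based (Dempster) conditioning for a possibility distribution
# over a finite domain; the distribution values are invented.

def possibility(pi, event):
    """Pi(A) = max of the possibility distribution over the event."""
    return max(pi[w] for w in event)

def condition(pi, alpha):
    """Dempster-style conditioning: rescale inside alpha, zero outside."""
    norm = possibility(pi, alpha)
    return {w: (pi[w] / norm if w in alpha else 0.0) for w in pi}

pi = {"w1": 1.0, "w2": 0.8, "w3": 0.4}
alpha = {"w2", "w3"}
post = condition(pi, alpha)
print(post)   # {'w1': 0.0, 'w2': 1.0, 'w3': 0.5}

# Check the chaining identity: Pi(beta | alpha) * Pi(alpha) = Pi(beta & alpha).
beta = {"w3"}
assert abs(possibility(post, beta) * possibility(pi, alpha)
           - possibility(pi, beta & alpha)) < 1e-9
```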