
    Multi-Source Neural Variational Inference

    Learning from multiple sources of information is an important problem in machine-learning research. The key challenges are learning representations and formulating inference methods that take into account the complementarity and redundancy of the various information sources. In this paper we formulate a multi-source learning framework based on variational autoencoders, in which each encoder is conditioned on a different information source. This allows us to relate the sources via the shared latent variables by computing divergence measures between the individual sources' posterior approximations. We explore a variety of options for learning these encoders and for integrating the beliefs they compute into a consistent posterior approximation. We visualise the learned beliefs on a toy dataset and evaluate our methods for learning shared representations and structured output prediction, showing the trade-offs of learning separate encoders for each information source. Furthermore, we demonstrate how conflict detection and redundancy can increase the robustness of inference in a multi-source setting.

    Comment: AAAI 2019, Association for the Advancement of Artificial Intelligence (AAAI) 2019
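    The idea of relating per-source beliefs over a shared latent variable can be sketched numerically. The following is a minimal illustration, not the paper's actual method: it assumes each source's encoder outputs a Gaussian belief over a scalar latent, fuses them by precision-weighted averaging (a product-of-experts combination), and uses the KL divergence between two sources' beliefs as a conflict score. All function names are invented for this sketch.

    ```python
    import numpy as np

    def fuse_gaussians(mus, sigmas):
        """Combine independent Gaussian beliefs N(mu_i, sigma_i^2) over the
        same latent variable into one Gaussian (product of experts)."""
        precisions = [1.0 / s**2 for s in sigmas]
        total_prec = sum(precisions)
        mu = sum(p * m for p, m in zip(precisions, mus)) / total_prec
        return mu, np.sqrt(1.0 / total_prec)

    def kl_gaussian(mu1, s1, mu2, s2):
        """KL(N(mu1, s1^2) || N(mu2, s2^2)): a divergence between two
        sources' posterior approximations, usable as a conflict score."""
        return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

    # Two agreeing sources tighten the fused belief; a strongly
    # disagreeing source shows up as a large pairwise divergence.
    mu, sigma = fuse_gaussians([0.0, 0.2], [1.0, 1.0])
    conflict = kl_gaussian(0.0, 1.0, 5.0, 1.0)
    ```

    In this toy setting the fused mean lands between the two source means with reduced variance, while the divergence between conflicting beliefs is large, which is the kind of signal a conflict-detection mechanism can threshold on.
    
    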

    Representation of Interrelationships among Binary Variables under Dempster-Shafer Theory of Belief Functions

    This is the peer-reviewed version of the following article: Srivastava, R. P., L. Gao, and P. Gillett, "Representation of Interrelationships among Binary Variables under Dempster-Shafer Theory of Belief Functions" (pre-publication version), 2009, International Journal of Intelligent Systems, Volume 24, Issue 4, pp. 459-475, which has been published in final form at http://doi.org/10.1002/int.20347. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.

    This paper presents an algorithm for developing models under the Dempster-Shafer theory of belief functions for categorical and 'uncertain' logical relationships among binary variables. We illustrate the use of the algorithm by developing belief-function representations of the following categorical relationships: 'AND', 'OR', 'Exclusive OR (EOR)', 'Not Exclusive OR (NEOR)', and 'AND-NEOR', and of the following uncertain relationships: 'Discounted AND', 'Conditional OR', and 'Weighted Average'. Such representations are needed to fully model and analyze a problem with a network of interrelated variables under the Dempster-Shafer theory of belief functions. In addition, we compare our belief-function representation of the 'Weighted Average' relationship with the 'Weighted Average' representation developed and used by Shenoy and Shenoy [8]. We find that the Shenoy and Shenoy representation of the weighted average relationship is an approximation and yields significantly different values under certain conditions.
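    To make the idea of a belief-function representation of a categorical relationship concrete, here is a small sketch, not the paper's algorithm: mass functions are dictionaries over frozensets of joint states (a, b, c), Dempster's rule combines them by intersecting focal elements and renormalising away conflict, and a categorical 'AND' (c = a AND b) is encoded as m-value 1 on the set of joint states satisfying it. All names are invented for this illustration.

    ```python
    from itertools import product

    def combine(m1, m2):
        """Dempster's rule of combination: intersect focal elements,
        then renormalise by 1 - K, where K is the conflict mass
        assigned to the empty set."""
        raw = {}
        for (A, mA), (B, mB) in product(m1.items(), m2.items()):
            inter = A & B
            raw[inter] = raw.get(inter, 0.0) + mA * mB
        k = raw.pop(frozenset(), 0.0)  # conflict mass K
        return {A: v / (1.0 - k) for A, v in raw.items()}

    def belief(m, A):
        """Bel(A): total mass committed to subsets of A."""
        return sum(v for B, v in m.items() if B <= A)

    # Frame of discernment: joint states (a, b, c) of three binary variables.
    theta = frozenset(product([0, 1], repeat=3))

    # Categorical 'AND' relationship: all mass on states where c = a AND b.
    m_and = {frozenset(s for s in theta if s[2] == (s[0] and s[1])): 1.0}

    # Evidence: a is true with m-value 0.9, b is true with m-value 0.9
    # (the remaining 0.1 stays uncommitted on the whole frame).
    m_a = {frozenset(s for s in theta if s[0] == 1): 0.9, theta: 0.1}
    m_b = {frozenset(s for s in theta if s[1] == 1): 0.9, theta: 0.1}

    m = combine(combine(m_and, m_a), m_b)
    bel_c = belief(m, frozenset(s for s in theta if s[2] == 1))
    ```

    Propagating the evidence through the 'AND' relationship yields a belief of 0.9 x 0.9 = 0.81 that c is true, which is the qualitative behaviour one expects from a conjunction of two partially supported inputs.
    
    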