
    Optimized Realization of Bayesian Networks in Reduced Normal Form using Latent Variable Model

    Bayesian networks in their Factor Graph Reduced Normal Form (FGrn) are a powerful paradigm for implementing inference graphs. Unfortunately, the computational and memory costs of these networks can be considerable even for relatively small networks, and this is one of the main reasons why these structures have often been underused in practice. In this work, through a detailed algorithmic and structural analysis, various solutions for cost reduction are proposed. An online version of the classic batch learning algorithm is also analyzed and shown to give very similar results in an unsupervised context; an online algorithm is essential if multilevel structures are to be built. The proposed solutions, together with the online learning algorithm, are included in a C++ library that is quite efficient, especially when compared to the direct use of the well-known sum-product and Maximum Likelihood (ML) algorithms. The results are discussed with particular reference to a Latent Variable Model (LVM) structure.
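
    As a rough illustration of the message-passing primitives the abstract names, the following is a minimal sketch of sum-product updates in a Reduced Normal Form factor graph, assuming discrete variables and row-stochastic conditional matrices. The names (SisoFactor, diverter_out) are hypothetical and are not taken from the library described in the paper.

```cpp
// Minimal sketch of sum-product message passing in a Reduced Normal Form
// factor graph: every factor is a SISO block holding a conditional
// probability matrix, and diverter (equality) nodes combine the messages
// of replicated variables by element-wise product. Illustrative names only.
#include <vector>
#include <numeric>

using Message = std::vector<double>;  // discrete distribution over states

static void normalize(Message& m) {
    double s = std::accumulate(m.begin(), m.end(), 0.0);
    if (s > 0.0) for (double& v : m) v /= s;
}

// SISO block: P(y|x) stored row-major, rows indexed by x, columns by y.
struct SisoFactor {
    std::vector<std::vector<double>> P;  // |X| x |Y|, rows sum to one

    // Forward message: b(y) = sum_x a(x) P(y|x)
    Message forward(const Message& a) const {
        Message b(P[0].size(), 0.0);
        for (size_t x = 0; x < P.size(); ++x)
            for (size_t y = 0; y < b.size(); ++y)
                b[y] += a[x] * P[x][y];
        normalize(b);
        return b;
    }

    // Backward message: a(x) = sum_y b(y) P(y|x)
    Message backward(const Message& b) const {
        Message a(P.size(), 0.0);
        for (size_t x = 0; x < a.size(); ++x)
            for (size_t y = 0; y < b.size(); ++y)
                a[x] += b[y] * P[x][y];
        normalize(a);
        return a;
    }
};

// Diverter (equality constraint): the outgoing message on edge k is the
// normalized element-wise product of the incoming messages on all others.
Message diverter_out(const std::vector<Message>& in, size_t k) {
    Message out(in[0].size(), 1.0);
    for (size_t j = 0; j < in.size(); ++j) {
        if (j == k) continue;
        for (size_t s = 0; s < out.size(); ++s) out[s] *= in[j][s];
    }
    normalize(out);
    return out;
}
```

    In the FGrn paradigm every conditional dependency is reduced to such a SISO block and every variable replication goes through a diverter, which is what keeps the inference and learning equations localized.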

    Two-dimensional multi-layer Factor Graphs in Reduced Normal Form

    We build a multi-layer architecture using the Bayesian framework of Factor Graphs in Reduced Normal Form (FGrn). This model allows great modularity and relies on unique, localized learning equations. The multi-layer architecture implements a hierarchical data representation that, via belief propagation, can be used for learning and inference in pattern completion, correction and classification. We apply the framework to images extracted from a standard data set.
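
    To make the layered use concrete, here is a compact sketch of a single LVM layer performing pattern completion via belief propagation, under the same sum-product conventions as above. The star-shaped slot layout, the matrices and the function names (up, down, complete) are assumptions for illustration, not the paper's implementation.

```cpp
// Hypothetical single Latent Variable Model (LVM) layer used for pattern
// completion: N visible slots, each tied to one shared discrete latent h
// through its own conditional matrix P_i(v_i | h) (rows indexed by h).
// The missing slot is excluded from the product over upward messages,
// which is equivalent to giving it a uniform incoming message.
#include <vector>
#include <numeric>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;  // |H| x |V_i|, rows sum to one

static Vec normalized(Vec m) {
    double s = std::accumulate(m.begin(), m.end(), 0.0);
    if (s > 0.0) for (double& v : m) v /= s;
    return m;
}

// Upward message to the latent: u(h) = sum_v P_i(v|h) * msg_i(v)
static Vec up(const Mat& P, const Vec& msg) {
    Vec u(P.size(), 0.0);
    for (size_t h = 0; h < P.size(); ++h)
        for (size_t v = 0; v < msg.size(); ++v)
            u[h] += P[h][v] * msg[v];
    return normalized(u);
}

// Downward belief on a slot: d(v) = sum_h q(h) * P_i(v|h)
static Vec down(const Mat& P, const Vec& q) {
    Vec d(P[0].size(), 0.0);
    for (size_t h = 0; h < P.size(); ++h)
        for (size_t v = 0; v < d.size(); ++v)
            d[v] += q[h] * P[h][v];
    return normalized(d);
}

// Complete slot `miss` from the other visible messages (one-hot or soft)
// and a prior over the latent variable.
Vec complete(const std::vector<Mat>& Ps, const std::vector<Vec>& vis,
             size_t miss, const Vec& prior) {
    Vec q = prior;
    for (size_t i = 0; i < Ps.size(); ++i) {
        if (i == miss) continue;            // exclude the slot being completed
        Vec u = up(Ps[i], vis[i]);
        for (size_t h = 0; h < q.size(); ++h) q[h] *= u[h];
    }
    q = normalized(q);                      // posterior over the latent
    return down(Ps[miss], q);               // predictive distribution
}
```

    Stacking such layers, with the belief over each latent variable feeding the layer above, gives the hierarchical representation the abstract describes.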