
    The Libra Toolkit for Probabilistic Models

    The Libra Toolkit is a collection of algorithms for learning and inference with discrete probabilistic models, including Bayesian networks, Markov networks, dependency networks, and sum-product networks. Compared to other toolkits, Libra places a greater emphasis on learning the structure of tractable models in which exact inference is efficient. It also includes a variety of algorithms for learning graphical models in which inference is potentially intractable, and for performing exact and approximate inference. Libra is released under a 2-clause BSD license to encourage broad use in academia and industry.
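
    The tractability the abstract emphasizes can be illustrated with a minimal Python sketch (this is not Libra's own API; Libra is implemented in OCaml, and the structure and parameters below are hypothetical): in a sum-product network, any marginal is computed by a single bottom-up pass, with marginalized-out leaves set to 1.

    ```python
    # Minimal sum-product network sketch: leaves, product nodes, and weighted
    # sum (mixture) nodes, each represented as a function of the evidence dict.

    def leaf(var, dist):
        # Univariate leaf: P(var = observed value), or 1.0 if var is marginalized out.
        return lambda ev: dist[ev[var]] if var in ev else 1.0

    def product(*children):
        def node(ev):
            p = 1.0
            for child in children:
                p *= child(ev)
            return p
        return node

    def weighted_sum(weighted_children):
        def node(ev):
            return sum(w * child(ev) for w, child in weighted_children)
        return node

    # A two-component mixture over binary variables A and B (hypothetical parameters).
    spn = weighted_sum([
        (0.6, product(leaf('A', {0: 0.2, 1: 0.8}), leaf('B', {0: 0.5, 1: 0.5}))),
        (0.4, product(leaf('A', {0: 0.9, 1: 0.1}), leaf('B', {0: 0.3, 1: 0.7}))),
    ])

    p_a1 = spn({'A': 1})             # marginal P(A=1) = 0.6*0.8 + 0.4*0.1 = 0.52
    p_joint = spn({'A': 1, 'B': 1})  # joint P(A=1, B=1), same single pass
    ```

    Each query costs one pass over the network, which is the "exact inference is efficient" property; in general graphical models, computing such marginals is #P-hard.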

    Online Inference for Adaptive Diagnosis via Arithmetic Circuit Compilation of Bayesian Networks

    As technology and system complexity evolve, designing fully reliable embedded systems becomes prohibitively complex and costly. Onboard diagnosis is a first solution, and it can be achieved by means of Bayesian networks. We propose an efficient compilation of Bayesian inference using arithmetic circuits (ACs), which can be implemented in hardware to obtain very fast response times. This approach has recently been applied to software health management of aircraft and UAVs. However, two kinds of obstacles must be addressed: first, the tree complexity can lead to intractable solutions, and second, an offline static analysis cannot capture the dynamic behaviour of a system that has multiple configurations and applications. In this paper, we present our direction for solving these issues. Our approach relies on an adaptive version of the diagnosis computation for different kinds of UAV applications and missions; in particular, we consider incremental generation of the AC structure. This adaptive diagnosis can be implemented using dynamic reconfiguration of FPGA circuits.
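
    The core idea of AC compilation can be sketched in a few lines (a hedged illustration, not the paper's compiler; the network and CPT numbers are hypothetical): a Bayesian network's distribution is a polynomial in evidence indicators and CPT parameters, which an AC evaluates in one feed-forward pass, making it attractive for hardware implementation.

    ```python
    # Network polynomial for the two-node Bayesian network A -> B.
    # Hypothetical CPTs: P(A=1)=0.3; P(B=1|A=0)=0.2, P(B=1|A=1)=0.9.
    theta_a = {0: 0.7, 1: 0.3}
    theta_b = {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.9}

    def evaluate(lam_a, lam_b):
        """Evaluate the network polynomial: sum over assignments (a, b) of
        lam_a[a] * theta_a[a] * lam_b[b] * theta_b[(a, b)].
        An arithmetic circuit represents this same polynomial compactly."""
        return sum(lam_a[a] * theta_a[a] * lam_b[b] * theta_b[(a, b)]
                   for a in (0, 1) for b in (0, 1))

    # P(B=1): clamp B's indicator to (0, 1); set both of A's indicators
    # to 1 to marginalize A out.
    p_b1 = evaluate({0: 1, 1: 1}, {0: 0, 1: 1})  # 0.7*0.2 + 0.3*0.9 = 0.41
    ```

    Answering a new evidence query means setting indicators and re-evaluating the same fixed circuit, which is why an offline-compiled AC maps well to an FPGA, and why changing the network structure online (the paper's incremental generation) requires reconfiguring the circuit itself.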

    Conditional Sum-Product Networks: Imposing Structure on Deep Probabilistic Architectures

    Probabilistic graphical models are a central tool in AI; however, they are generally not as expressive as deep neural models, and inference is notoriously hard and slow. In contrast, deep probabilistic models such as sum-product networks (SPNs) capture joint distributions in a tractable fashion, but still lack the expressive power of intractable models based on deep neural networks. Therefore, we introduce conditional SPNs (CSPNs), conditional density estimators for multivariate and potentially hybrid domains which allow harnessing the expressive power of neural networks while still maintaining tractability guarantees. One way to implement CSPNs is to use an existing SPN structure and condition its parameters on the input, e.g., via a deep neural network. This approach, however, might misrepresent the conditional independence structure present in data. Consequently, we also develop a structure-learning approach that derives both the structure and parameters of CSPNs from data. Our experimental evidence demonstrates that CSPNs are competitive with other probabilistic models and yield superior performance on multilabel image classification compared to mean field and mixture density networks. Furthermore, they can successfully be employed as building blocks for structured probabilistic models, such as autoregressive image models.
    Comment: 13 pages, 6 figures
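
    The "condition its parameters on the input" idea the abstract mentions can be sketched as follows (a hedged toy illustration, not the authors' implementation; the gating function and all parameters are hypothetical): the structure stays fixed and tractable, while a function of the input x produces the mixture weights.

    ```python
    import math

    def softmax(z):
        # Numerically stable softmax over a list of logits.
        m = max(z)
        e = [math.exp(v - m) for v in z]
        s = sum(e)
        return [v / s for v in e]

    def cspn_py_given_x(x, y):
        """P(y | x) from a fixed two-component structure over a binary target y,
        with mixture weights gated by the input x (a stand-in for a deep net)."""
        w = softmax([1.5 * x, -1.5 * x])                # input-dependent weights
        comp = [{0: 0.9, 1: 0.1}, {0: 0.2, 1: 0.8}]    # fixed leaf distributions
        return sum(wi * c[y] for wi, c in zip(w, comp))

    p = cspn_py_given_x(0.0, 1)  # equal gating at x=0: 0.5*0.1 + 0.5*0.8 = 0.45
    ```

    Because only the parameters depend on x while the sum-product structure is fixed, the conditional distribution remains exactly normalized and queryable in one pass for every input, which is the tractability guarantee the abstract refers to.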