
    Hybrid Bayesian Networks with Linear Deterministic Variables

    When a hybrid Bayesian network has conditionally deterministic variables with continuous parents, the joint density function for the continuous variables does not exist. Conditional linear Gaussian distributions can handle such cases when the continuous variables have a multivariate normal distribution and the discrete variables do not have continuous parents. In this paper, operations required for performing inference with conditionally deterministic variables in hybrid Bayesian networks are developed. These methods allow inference in networks with deterministic variables where the continuous variables may be non-Gaussian, and their density functions can be approximated by mixtures of truncated exponentials. There are no constraints on the placement of continuous and discrete nodes in the network.
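    For reference (an illustrative form, not taken from this abstract): a mixture-of-truncated-exponentials potential is typically piecewise, taking on each interval A of a partition of the domain the form

```latex
f(x) = a_0 + \sum_{i=1}^{k} a_i \, e^{b_i x}, \qquad x \in A,
```

    where the number of exponential terms k and the coefficients a_i, b_i are chosen to approximate the target density on that interval.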

    Inference in Hybrid Bayesian Networks with Deterministic Variables

    An important class of hybrid Bayesian networks consists of those with conditionally deterministic variables (variables that are deterministic functions of their parents). In this case, if some of the parents are continuous, then the joint density function does not exist. Conditional linear Gaussian (CLG) distributions can handle such cases when the deterministic function is linear and the continuous variables are normally distributed. In this paper, we develop operations required for performing inference with conditionally deterministic variables using relationships derived from joint cumulative distribution functions (CDFs). These methods allow inference in networks with deterministic variables where the continuous variables are non-Gaussian.
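    As an illustrative sketch of the CDF-based idea (assuming, for concreteness, a single continuous parent X with CDF F_X and a linear deterministic child Y = aX + b with a > 0; these symbols are not from the abstract):

```latex
F_Y(y) \;=\; P(aX + b \le y) \;=\; F_X\!\left(\frac{y - b}{a}\right), \qquad a > 0,
```

    so the child's distribution can be recovered from the parent's CDF even though no joint density of (X, Y) exists.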

    Inference in Hybrid Bayesian Networks Using Mixtures of Gaussians

    The main goal of this paper is to describe a method for exact inference in general hybrid Bayesian networks (BNs), i.e., networks with a mixture of discrete and continuous chance variables. Our method consists of approximating general hybrid Bayesian networks by mixture-of-Gaussians (MoG) BNs. There is a fast algorithm by Lauritzen and Jensen (LJ) for exact inference in MoG Bayesian networks, and a commercial implementation of this algorithm exists. However, the algorithm can only be used for MoG BNs, which are limited as follows: all continuous chance variables must have conditional linear Gaussian distributions, and discrete chance nodes cannot have continuous parents. The methods described in this paper enable the LJ algorithm to be used for a larger class of hybrid Bayesian networks. This class includes networks with continuous chance nodes that have non-Gaussian distributions, networks with no restrictions on the topology of discrete and continuous variables, networks with conditionally deterministic variables that are nonlinear functions of their continuous parents, and networks with continuous chance variables whose variances are functions of their parents.
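    A minimal sketch of the general idea (illustrative, not the paper's procedure): approximate a non-Gaussian marginal by a mixture of Gaussians, the representation that MoG-based machinery such as the LJ algorithm can work with. The exponential example and the scikit-learn fit below are assumptions chosen for concreteness.

```python
# Minimal sketch (illustrative, not the paper's method): approximate a
# non-Gaussian variable by a mixture of Gaussians so that MoG-based
# inference machinery could be applied to it.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical non-Gaussian variable: exponential with scale 1.
samples = rng.exponential(scale=1.0, size=10_000).reshape(-1, 1)

# Fit a 3-component mixture of Gaussians to the samples.
mog = GaussianMixture(n_components=3, random_state=0).fit(samples)

print("weights:", mog.weights_)
print("means:  ", mog.means_.ravel())
print("stdevs: ", np.sqrt(mog.covariances_).ravel())
```

    The number of mixture components trades approximation accuracy against the cost of downstream exact inference.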