Parallelization of the PC Algorithm
This paper describes a parallel version of the PC algorithm
for learning the structure of a Bayesian network from data. The PC
algorithm is a constraint-based algorithm consisting of five steps where
the first step is to perform a set of (conditional) independence tests
while the remaining four steps relate to identifying the structure of the
Bayesian network using the results of the (conditional) independence
tests. In this paper, we describe a new approach to parallelization of the
(conditional) independence testing as experiments illustrate that this is
by far the most time consuming step. The proposed parallel PC algorithm
is evaluated on data sets generated at random from five different
real-world Bayesian networks. The results demonstrate that significant
time performance improvements are possible using the proposed algorithm.
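The parallelization the abstract describes targets the (conditional) independence tests: each test depends only on the data, not on the other tests, so the whole batch can be dispatched to a worker pool before the structure-identification steps run. A minimal sketch of that idea in Python (not the authors' implementation; the Fisher-z test, the three-variable chain, and the thread-pool dispatch are illustrative assumptions):

```python
# Sketch (not the paper's code): dispatching the independence tests of
# the PC algorithm's first step to a worker pool. Each test is
# independent work, so the batch parallelizes trivially.
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations
import math
import random

random.seed(0)

# Synthetic linear-Gaussian chain X -> Y -> Z, so X is independent of Z
# given Y but dependent on Z marginally.
n = 2000
X = [random.gauss(0, 1) for _ in range(n)]
Y = [x + random.gauss(0, 1) for x in X]
Z = [y + random.gauss(0, 1) for y in Y]
data = {"X": X, "Y": Y, "Z": Z}

def corr(a, b):
    # Pearson correlation, stdlib only.
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def partial_corr(a, b, c):
    # First-order partial correlation of a and b given c.
    rab, rac, rbc = corr(a, b), corr(a, c), corr(b, c)
    return (rab - rac * rbc) / math.sqrt((1 - rac**2) * (1 - rbc**2))

def fisher_z_pvalue(r, n, k):
    # Fisher z-transform test for zero (partial) correlation;
    # k is the size of the conditioning set.
    z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(n - k - 3)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def run_test(task):
    # One unit of work: a single (conditional) independence test.
    a, b, cond = task
    if cond is None:
        r, k = corr(data[a], data[b]), 0
    else:
        r, k = partial_corr(data[a], data[b], data[cond]), 1
    return (a, b, cond, fisher_z_pvalue(r, n, k))

# Enumerate all order-0 and order-1 tests, then farm them out.
tasks = [(a, b, None) for a, b in combinations(data, 2)]
tasks += [(a, b, c) for a, b in combinations(data, 2)
          for c in data if c not in (a, b)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_test, tasks))

pvals = {(a, b, cond): p for a, b, cond, p in results}
print(f"{len(results)} independence tests run")
```

The paper's experiments distribute this same step across workers because it dominates runtime; the remaining four steps then consume the collected p-values sequentially.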
Inferring dynamic genetic networks with low order independencies
In this paper, we propose a novel inference method for dynamic genetic
networks which makes it possible to cope with a number of time measurements n
much smaller than the number of genes p. The approach is based on the concept
of low order conditional dependence graph that we extend here in the case of
Dynamic Bayesian Networks. Most of our results are based on the theory of
graphical models associated with the Directed Acyclic Graphs (DAGs). In this
way, we define a minimal DAG G which describes exactly the full order
conditional dependencies given the past of the process. Then, to cope with
the large p and small n estimation case, we propose to approximate DAG G by
considering low order conditional independencies. We introduce partial qth
order conditional dependence DAGs G(q) and analyze their probabilistic
properties. In general, DAGs G(q) differ from DAG G but still reflect relevant
dependence facts for sparse networks such as genetic networks. By using this
approximation, we set out a non-Bayesian inference method and demonstrate the
effectiveness of this approach on both simulated and real data. The
inference procedure is implemented in the R package 'G1DBN', freely available
from the CRAN archive.
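The low-order idea can be illustrated with a small sketch (this is not the G1DBN implementation; the q = 1 screening rule, the threshold, and the synthetic dynamics below are assumptions made for illustration): instead of conditioning on the full past, which is infeasible when n is much smaller than p, an edge is kept only if its first-order partial correlation stays strong for every single-gene conditioning set.

```python
# Sketch of low-order (q = 1) edge screening for a dynamic network,
# not the G1DBN code: an edge i(t) -> j(t+1) survives only if the
# partial correlation of X_i(t) and X_j(t+1) remains large given each
# single other gene in turn.
import math
import random

random.seed(1)
p, n = 5, 50  # p genes, n time points (n small relative to p)

# Synthetic dynamics: gene 1 at time t drives gene 0 at time t+1.
series = [[random.gauss(0, 1) for _ in range(p)]]
for _ in range(n):
    prev = series[-1]
    nxt = [random.gauss(0, 0.3) for _ in range(p)]
    nxt[0] += 0.9 * prev[1]
    series.append(nxt)

past, present = series[:-1], series[1:]  # X(t) and X(t+1)

def col(rows, j):
    return [r[j] for r in rows]

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / math.sqrt(sum((x - ma) ** 2 for x in a)
                           * sum((y - mb) ** 2 for y in b))

def pcor1(a, b, c):
    # First-order partial correlation of a and b given c.
    rab, rac, rbc = corr(a, b), corr(a, c), corr(b, c)
    return (rab - rac * rbc) / math.sqrt((1 - rac**2) * (1 - rbc**2))

def score(i, j):
    # Weakest first-order partial correlation over all single
    # conditioning genes: a robust edge stays strong for every set.
    a, b = col(past, i), col(present, j)
    return min(abs(pcor1(a, b, col(past, k))) for k in range(p) if k != i)

edges = [(i, j) for i in range(p) for j in range(p) if score(i, j) > 0.5]
print(edges)
```

Taking the minimum over conditioning sets mirrors the spirit of the qth-order approximation: only p - 1 first-order tests per edge are needed, which remains tractable when n << p, at the cost of the G(q) graph differing in general from the full-order DAG G.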