A Parallel Algorithm for Exact Bayesian Structure Discovery in Bayesian Networks
Exact Bayesian structure discovery in Bayesian networks requires exponential
time and space. Using dynamic programming (DP), the fastest known sequential
algorithm computes the exact posterior probabilities of structural features in
time and space exponential in the number of nodes (variables) $n$ of the
Bayesian network, assuming the in-degree (the number of parents) per node is
bounded by a constant $d$. Here we present a parallel algorithm capable of
computing the exact posterior probabilities for all edges with optimal
parallel space efficiency and nearly optimal parallel time efficiency. That is,
if $p$ processors are used, the run-time decreases nearly in proportion to $p$
and the space usage per processor decreases in proportion to $p$.
Our algorithm is based on the observation that the subproblems in the
sequential DP algorithm constitute an $n$-dimensional hypercube. We carefully
coordinate the computation of the correlated DP procedures so that large
amounts of data exchange are avoided. Further, we develop parallel techniques
for two variants of the well-known \emph{zeta transform}, which have
applications outside the context of Bayesian networks. We demonstrate the
capability of our algorithm on datasets with up to 33 variables and its
scalability on up to 2048 processors. We apply our algorithm to a biological
data set for discovering the yeast pheromone response pathways.
Comment: 32 pages, 12 figures
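The abstract above centers on parallelizing the \emph{zeta transform} over the subset lattice of DP subproblems. As a point of reference, here is a minimal sequential sketch of that transform (this is an illustration of the standard $O(n 2^n)$ technique, not the authors' parallel code; the function names are ours):

```python
def zeta_transform(f, n):
    """Fast zeta transform: given f on all subsets of an n-element ground
    set (as a list of length 2**n indexed by bitmask), return g with
    g[S] = sum of f[T] over all T that are subsets of S, in O(n * 2^n)."""
    g = list(f)
    for i in range(n):                  # fold in one ground-set element at a time
        bit = 1 << i
        for mask in range(1 << n):
            if mask & bit:              # add the value of the subset lacking element i
                g[mask] += g[mask ^ bit]
    return g

def zeta_naive(f, n):
    """Direct O(4^n) definition, used only to check the fast version."""
    return [sum(f[t] for t in range(1 << n) if t & ~s == 0)
            for s in range(1 << n)]

f = [1, 2, 3, 4, 5, 6, 7, 8]            # n = 3: values on all subsets of {0,1,2}
assert zeta_transform(f, 3) == zeta_naive(f, 3)
```

The inner loop structure (one sweep per ground-set element) is also what makes the transform amenable to the hypercube-style parallel decomposition the abstract describes: each sweep touches pairs of subsets that differ in a single bit.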
Algorithms for Exact Structure Discovery in Bayesian Networks
Bayesian networks are compact, flexible, and interpretable representations of a joint distribution. When the network structure is unknown but there are observational data at hand, one can try to learn the network structure. This is called structure discovery. This thesis contributes to two areas of structure discovery in Bayesian networks: space--time tradeoffs and learning ancestor relations.
The fastest exact algorithms for structure discovery in Bayesian networks are based on dynamic programming and use excessive amounts of space. Motivated by the space usage, several schemes for trading space against time are presented. These schemes are presented in a general setting for a class of computational problems called permutation problems; structure discovery in Bayesian networks is seen as a challenging variant of the permutation problems. The main contribution in the area of the space--time tradeoffs is the partial order approach, in which the standard dynamic programming algorithm is extended to run over partial orders. In particular, a certain family of partial orders called parallel bucket orders is considered. A partial order scheme that provably yields an optimal space--time tradeoff within parallel bucket orders is presented. Also practical issues concerning parallel bucket orders are discussed.
Learning ancestor relations, that is, directed paths between nodes, is motivated by the need for robust summaries of the network structures when there are unobserved nodes at work. Ancestor relations are nonmodular features and hence learning them is more difficult than learning modular features. A dynamic programming algorithm is presented for computing posterior probabilities of ancestor relations exactly. Empirical tests suggest that ancestor relations can be learned from observational data almost as accurately as arcs even in the presence of unobserved nodes.
Algorithms for exact structure learning of Bayesian networks
Bayesian networks are probabilistic models that describe relationships between variables. A Bayesian network consists of two parts: a structure, and a conditional probability distribution attached to each variable. The structure, in turn, is a directed acyclic graph describing the dependencies between the variables. When a Bayesian network that describes the phenomenon of interest well is not known in advance, but observational data on the relevant variables have been collected, suitable algorithms can be used to try to find a network structure that fits the data as well as possible.
The fastest exact structure-learning algorithms are based on so-called dynamic programming: they keep intermediate results in memory and thereby avoid carrying out the same computations repeatedly. Although such methods are relatively fast, their drawback is high memory usage, which prevents learning the structure of large networks. The first part of the thesis concerns structure-learning algorithms that balance time and memory usage. It presents methods by which a network structure can be learned efficiently while exploiting all of the available space, making it possible to learn the structure of larger networks than before. The method is further generalized to solve, besides Bayesian network structure learning, also so-called permutation problems, the best known of which is perhaps the travelling salesman problem.
The final part of the thesis concerns learning ancestor relations between variables. These relations are of interest because they provide additional information about both direct and indirect cause-effect relationships between variables. The thesis presents an algorithm for computing the probabilities of ancestor relations. The behavior of the algorithm is studied in practice, and it is found that ancestor relations can be learned fairly well even when several unobserved variables affect the variables in the data.
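The permutation problems mentioned above share a common subset-based dynamic program. A minimal sketch of that pattern on its best-known instance, the travelling salesman problem (the classic Held–Karp recurrence; the cost matrix below is a made-up example):

```python
from itertools import combinations

def held_karp(cost):
    """Subset DP for the travelling salesman problem: dp[(S, j)] is the
    cheapest path that starts at city 0, visits exactly the cities in
    bitmask S, and ends at city j. Runs in O(n^2 * 2^n) time and
    O(n * 2^n) space -- the space bottleneck that the thesis's
    space--time tradeoff schemes address."""
    n = len(cost)
    dp = {(1 << j, j): cost[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = sum(1 << j for j in subset)
            for j in subset:
                dp[(S, j)] = min(dp[(S ^ (1 << j), k)] + cost[k][j]
                                 for k in subset if k != j)
    full = (1 << n) - 2                  # every city except the start, city 0
    return min(dp[(full, j)] + cost[j][0] for j in range(1, n))

cost = [[0, 10, 15, 20],
        [10, 0, 35, 25],
        [15, 35, 0, 30],
        [20, 25, 30, 0]]
assert held_karp(cost) == 80             # tour 0 -> 1 -> 3 -> 2 -> 0
```

Structure discovery in Bayesian networks fits the same template with subsets of variables in place of subsets of cities, which is why techniques such as partial orders that shrink the stored subset table apply to both.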
Bayesian Discovery of Multiple Bayesian Networks via Transfer Learning
Bayesian network structure learning algorithms with limited data are being
used in domains such as systems biology and neuroscience to gain insight into
the underlying processes that produce observed data. Learning reliable networks
from limited data is difficult, therefore transfer learning can improve the
robustness of learned networks by leveraging data from related tasks. Existing
transfer learning algorithms for Bayesian network structure learning give a
single maximum a posteriori estimate of network models. Yet, many other models
may be equally likely, and so a more informative result is provided by Bayesian
structure discovery. Bayesian structure discovery algorithms estimate posterior
probabilities of structural features, such as edges. We present transfer
learning for Bayesian structure discovery which allows us to explore the shared
and unique structural features among related tasks. Efficient computation
requires that our transfer learning objective factor into local calculations,
a property we prove holds for a broad class of transfer biases. Theoretically, we
show the efficiency of our approach. Empirically, we show that compared to
single task learning, transfer learning is better able to positively identify
true edges. We apply the method to whole-brain neuroimaging data.
Comment: 10 pages
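The core idea the abstract relies on, estimating posterior probabilities of structural features such as edges, can be illustrated with a brute-force single-task toy (this is not the paper's transfer method, and the score function below is a made-up stand-in for a real marginal likelihood):

```python
from itertools import product, permutations

def is_dag(edges, n):
    """A directed graph is acyclic iff some ordering of the nodes sends
    every edge forward (fine for the tiny n used here)."""
    return any(all(order.index(u) < order.index(v) for u, v in edges)
               for order in permutations(range(n)))

def edge_posteriors(n, score):
    """Exact posterior of each directed edge by enumerating all DAGs:
    P(u->v | data) = (sum of scores of DAGs containing u->v) / (total score)."""
    pairs = [(u, v) for u in range(n) for v in range(n) if u != v]
    total = 0.0
    post = {p: 0.0 for p in pairs}
    for present in product([False, True], repeat=len(pairs)):
        edges = [p for p, on in zip(pairs, present) if on]
        if not is_dag(edges, n):
            continue
        s = score(edges)                 # unnormalized posterior weight of this DAG
        total += s
        for e in edges:
            post[e] += s
    return {e: w / total for e, w in post.items()}

# Toy score favouring sparse graphs; by symmetry every edge gets the
# same posterior, 1.875 / 7.75 (about 0.242).
posts = edge_posteriors(3, lambda edges: 0.5 ** len(edges))
```

Enumeration is exponential in the number of node pairs, which is why practical Bayesian structure discovery uses dynamic programming, and why the paper's transfer objective must factor into local calculations to stay tractable.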