Bayesian Learning of Markov Network Structure


We propose a simple and efficient approach to building undirected probabilistic classification models (Markov networks) that extend naive Bayes classifiers and outperform existing directed probabilistic classifiers (Bayesian networks) of similar complexity. Our Markov network model is represented as a set of consistent probability distributions on subsets of variables. Inference with such a model can be done efficiently in closed form for problems like class probability estimation. We also propose a highly efficient Bayesian structure learning algorithm for conditional prediction problems, based on integrating along a hill-climb in the structure space. Our prior, based on the degrees of freedom, effectively prevents overfitting.
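To make the structure-search idea concrete, the following is a minimal sketch of a greedy hill-climb over pairwise edges. It is not the paper's algorithm: the paper integrates a Bayesian score along the hill-climb, whereas this sketch substitutes a simple penalized empirical mutual-information gain, with the penalty scaled by the degrees of freedom an edge adds (a BIC-style stand-in for the degrees-of-freedom prior). All function names and the `penalty` parameter are hypothetical.

```python
import math
from itertools import combinations
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in nats) between two discrete columns."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # c/n * log( p(x,y) / (p(x) p(y)) ), with counts substituted for probabilities
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi

def degrees_of_freedom(xs, ys):
    """Free parameters an edge adds in a pairwise model over discrete variables."""
    return (len(set(xs)) - 1) * (len(set(ys)) - 1)

def hill_climb_structure(data, penalty=0.5):
    """Greedy hill-climb in edge space: repeatedly add the edge whose
    penalized score gain is largest; stop when no edge improves the score.

    data: dict mapping variable name -> list of discrete values (equal lengths).
    Returns the set of accepted edges as sorted name pairs."""
    n = len(next(iter(data.values())))
    edges = set()
    candidates = set(combinations(sorted(data), 2))
    while True:
        best, best_gain = None, 0.0
        for (a, b) in candidates - edges:
            # Likelihood gain of joining a and b, minus a df-scaled penalty
            gain = n * mutual_information(data[a], data[b]) \
                   - penalty * degrees_of_freedom(data[a], data[b])
            if gain > best_gain:
                best, best_gain = (a, b), gain
        if best is None:
            return edges
        edges.add(best)
```

On a toy dataset where X and Y are perfectly dependent but Z is independent of both, the climb accepts only the X–Y edge, since the independent pairs have near-zero mutual information and the penalty makes their gain negative.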

This paper was published in ePrints.FRI.
