4 research outputs found
Expressive power of binary relevance and chain classifiers based on Bayesian Networks for multi-label classification
Bayesian network classifiers are widely used in machine learning because they offer an intuitive representation of causal relations. Multi-label classification requires each instance to be assigned a subset of a predefined set of h labels; equivalently, one must find a multi-valued decision function that predicts a vector of h binary classes. In this paper we derive the decision boundaries of two widely used Bayesian network approaches to building multi-label classifiers: multi-label Bayesian network classifiers built with the binary relevance method, and Bayesian network chain classifiers. We extend our previous single-label results to multi-label chain classifiers, and we prove that, as expected, chain classifiers provide a more expressive model than the binary relevance method.
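The structural difference between the two approaches can be illustrated with a minimal sketch (an assumed toy example, not the paper's Bayesian network models): under binary relevance each label is predicted from the features alone, while in a chain each classifier additionally sees the labels predicted earlier in the chain. The hand-written threshold classifiers below are purely illustrative.

```python
def binary_relevance_predict(x, classifiers):
    # Binary relevance: each label is predicted independently from x.
    return [clf(x) for clf in classifiers]

def chain_predict(x, classifiers):
    # Chain: each classifier also receives the previously predicted labels.
    preds = []
    for clf in classifiers:
        preds.append(clf(x, preds))
    return preds

# Toy 2-label problem: label 0 fires when x[0] > 0; label 1 should fire
# only when label 0 is already on and x[1] > 0 -- a label dependency that
# binary relevance cannot express, since its classifiers never see other labels.
br_clfs = [lambda x: int(x[0] > 0),
           lambda x: int(x[1] > 0)]
chain_clfs = [lambda x, prev: int(x[0] > 0),
              lambda x, prev: int(prev[0] == 1 and x[1] > 0)]

print(binary_relevance_predict([-0.5, 0.8], br_clfs))  # -> [0, 1]
print(chain_predict([-0.5, 0.8], chain_clfs))          # -> [0, 0]
```

On this instance the chain correctly suppresses label 1 because label 0 is off, whereas binary relevance cannot condition on the other label at all, which is in line with the expressiveness gap the abstract describes.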
A geometric characterisation of sensitivity analysis in monomial models
Sensitivity analysis in probabilistic discrete graphical models is usually conducted by varying one probability value at a time and observing how this affects output probabilities of interest. When one probability is varied, the others are proportionally covaried so that the sum-to-one condition of probability laws is respected. The choice of proportional covariation is justified by a variety of optimality conditions, under which the original and the varied distributions are as close as possible under different measures of closeness. For variations of more than one parameter at a time, however, proportional covariation is justified only in some special cases. In this work, for the large class of discrete statistical models admitting a regular monomial parametrisation, we demonstrate the optimality of newly defined proportional multi-way schemes with respect to an optimality criterion based on the notion of I-divergence. We show that there are choices of varying parameters for which proportional covariation is not optimal, and we identify the sub-family of model distributions for which the distance between the original distribution and the one whose probabilities are covaried proportionally is minimal. This is shown by adopting a new formal, geometric characterisation of sensitivity analysis in monomial models, a class that includes a wide array of probabilistic graphical models. We also demonstrate the optimality of proportional covariation for multi-way analyses in Naive Bayes classifiers.
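For a single varied parameter, proportional covariation and the I-divergence criterion are easy to sketch: fix one entry of a discrete distribution at a new value and rescale the remaining entries by a common factor so the mass still sums to one, then compare the I-divergence (Kullback-Leibler divergence) of the varied distribution from the original against that of a non-proportional alternative. This is an assumed illustrative sketch, not the paper's multi-way construction.

```python
import math

def proportional_covariation(p, i, new_value):
    """Set p[i] to new_value and rescale all other entries by a common
    factor so that the varied distribution still sums to one."""
    scale = (1.0 - new_value) / (1.0 - p[i])
    return [new_value if j == i else pj * scale for j, pj in enumerate(p)]

def i_divergence(q, p):
    """I-divergence KL(q || p) between discrete distributions q and p."""
    return sum(qj * math.log(qj / pj) for qj, pj in zip(q, p) if qj > 0)

p = [0.5, 0.3, 0.2]                           # original distribution
q_prop = proportional_covariation(p, 0, 0.6)  # -> [0.6, 0.24, 0.16] (up to rounding)

# A non-proportional alternative: absorb the change uniformly in the
# remaining entries (subtract 0.05 from each).
q_unif = [0.6, 0.25, 0.15]

print(i_divergence(q_prop, p))  # smaller of the two
print(i_divergence(q_unif, p))
```

In this one-parameter case the proportionally covaried distribution has the smaller I-divergence from the original, matching the classical optimality of proportional covariation; the abstract's contribution concerns when (and when not) this extends to multi-way variations.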