Abstract

From causal theory it is known that the independencies entailed by deterministic relations in a stochastic distribution cannot be represented by a faithful causal model. Deterministic relations lead to situations in which either of two variables X and Y becomes conditionally independent of a third variable Z by conditioning on the other variable. More generally, this occurs whenever X and Y contain the same information about Z; such variables are called information-equivalent. The joint distribution defines an equivalence partitioning of the domains of X and Y, which relates exactly those states for which the conditional distribution of the target Z is the same, so that P(Z | X) = P(Z | Y). We propose to select the relation with the target variable that has the least complexity. Under the assumption that complexity does not increase along a Markov chain, this selection criterion results in consistent models. Faithfulness of the graph can be re-established by restricting the conditional independencies with the simplicity criterion in cases of equivalent information. Conversely, all conditional independencies among the variables can be retrieved from the graph by a generalized definition of the d-separation property. Finally, the PC algorithm is extended to learn models containing information-equivalent variables from data.
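The situation described above can be made concrete with a small worked example. The following sketch is not from the paper; the toy distribution and function names are illustrative assumptions. It builds a joint distribution in which Y is a deterministic, coarser function of X and Z depends on X only through Y, then verifies that P(Z | X=x) = P(Z | Y=f(x)) for every state x, i.e. X and Y are information-equivalent with respect to Z, so conditioning on either renders the other independent of Z:

```python
from fractions import Fraction

# Toy model (an illustrative assumption, not taken from the paper):
# X uniform on {0,1,2,3}; Y = X mod 2 is a deterministic function of X;
# Z in {0,1} depends on X only through Y.
xs = [0, 1, 2, 3]

def f(x):
    # Deterministic relation Y = f(X); Y is the "simpler" variable.
    return x % 2

def p_z1_given_y(y):
    # P(Z=1 | Y=y)
    return Fraction(3, 4) if y == 1 else Fraction(1, 4)

# Exact joint distribution P(X, Y, Z); mass only where Y = f(X).
joint = {}
for x in xs:
    y = f(x)
    pz1 = p_z1_given_y(y)
    joint[(x, y, 1)] = Fraction(1, 4) * pz1
    joint[(x, y, 0)] = Fraction(1, 4) * (1 - pz1)

def cond_z1(**given):
    """P(Z=1 | given), with keys 'x' and/or 'y', read off the joint table."""
    match = [(k, p) for k, p in joint.items()
             if all(k["xyz".index(v)] == w for v, w in given.items())]
    total = sum(p for _, p in match)
    return sum(p for (x, y, z), p in match if z == 1) / total

# Information equivalence: P(Z | X=x) == P(Z | Y=f(x)) for every state x ...
for x in xs:
    assert cond_z1(x=x) == cond_z1(y=f(x))
# ... hence conditioning on Y makes Z independent of X, and vice versa,
# which is exactly the unfaithful pattern the abstract describes.
for x in xs:
    assert cond_z1(x=x, y=f(x)) == cond_z1(y=f(x)) == cond_z1(x=x)
```

Under the proposed simplicity criterion, the binary Y would be preferred over the four-valued X as the variable related to Z, since it carries the same information about Z at lower complexity.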
