Many methods for causal inference generate directed acyclic graphs (DAGs)
that formalize causal relations between n variables. Given the joint
distribution on all these variables, the DAG contains all information about how
intervening on one variable changes the distribution of the other n−1
variables. However, quantifying the causal influence of one variable on
another remains a nontrivial question. Here we propose a set of natural, intuitive
postulates that a measure of causal strength should satisfy. We then introduce
a communication scenario, where edges in a DAG play the role of channels that
can be locally corrupted by interventions. Causal strength is then the relative
entropy distance between the old and the new distribution. Many other measures
of causal strength have been proposed, including average causal effect,
transfer entropy, directed information, and information flow. We explain how
they fail to satisfy the postulates on simple DAGs of ≤3 nodes. Finally,
we investigate the behavior of our measure on time series, supporting our
claims with experiments on simulated data.

Comment: Published at http://dx.doi.org/10.1214/13-AOS1145 in the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org).
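The edge-cutting idea above can be sketched for the simplest case, a two-node DAG X → Y: the edge is "corrupted" by feeding Y an independent copy of X drawn from its marginal, and causal strength is the relative entropy between the observational joint distribution and the resulting post-cutting distribution. The function name and setup below are illustrative, not from the paper; note that for this particular two-node graph the measure reduces to the mutual information I(X; Y).

```python
import numpy as np

def causal_strength_two_node(p_x, p_y_given_x):
    """Relative-entropy causal strength of the edge X -> Y in the DAG X -> Y.

    Illustrative sketch: the edge is cut and fed the marginal of X,
    yielding a post-cutting distribution P'; causal strength is then
    the relative entropy D(P || P') in nats.
    """
    p_x = np.asarray(p_x, dtype=float)
    p_y_given_x = np.asarray(p_y_given_x, dtype=float)  # rows: x, cols: y

    # Observational joint P(x, y) = P(x) P(y|x)
    p_joint = p_x[:, None] * p_y_given_x

    # After cutting the edge, Y receives an independent copy of X drawn
    # from its marginal, so P'(x, y) = P(x) * sum_x' P(x') P(y|x')
    p_y_cut = p_x @ p_y_given_x
    p_cut = p_x[:, None] * p_y_cut[None, :]

    # Relative entropy D(P || P'), skipping zero-probability cells
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log(p_joint[mask] / p_cut[mask])))

# Example: X is a fair coin, Y copies X with probability 0.9
strength = causal_strength_two_node([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]])
```

For these illustrative numbers the result equals the mutual information of a binary symmetric channel with crossover probability 0.1 (about 0.37 nats); if Y is independent of X, the strength is zero, as one would demand of any causal-strength measure.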