On directed information theory and Granger causality graphs
Directed information theory deals with communication channels with feedback.
When applied to networks, a natural extension based on causal conditioning is
needed. We show here that measures built from directed information theory in
networks can be used to assess Granger causality graphs of stochastic
processes. We show that directed information theory includes measures such as
the transfer entropy, and that it is the adequate information theoretic
framework needed for neuroscience applications, such as connectivity inference
problems.
Comment: accepted for publication, Journal of Computational Neuroscience
Unifying Gaussian LWF and AMP Chain Graphs to Model Interference
An intervention may have an effect on units other than those to which it was
administered. This phenomenon is called interference and it usually goes
unmodeled. In this paper, we propose to combine Lauritzen-Wermuth-Frydenberg
and Andersson-Madigan-Perlman chain graphs to create a new class of causal
models that can represent both interference and non-interference relationships
for Gaussian distributions. Specifically, we define the new class of models,
introduce global, local, and pairwise Markov properties for them, and prove
their equivalence. We also propose an algorithm for maximum likelihood
parameter estimation for the new models, and report experimental results.
Finally, we show how to compute the effects of interventions in the new models.
Comment: v2: Section 6 has been added. v3: Sections 7 and 8 have been added.
v4: Major reorganization. v5: Major reorganization. v6-v7: Minor changes. v8:
Addition of Appendix B. v9: Section 7 has been rewritten
- …