On directed information theory and Granger causality graphs
Directed information theory deals with communication channels with feedback.
When applied to networks, a natural extension based on causal conditioning is
needed. We show here that measures built from directed information theory in
networks can be used to assess Granger causality graphs of stochastic
processes. We show that directed information theory includes measures such as
the transfer entropy, and that it is the appropriate information-theoretic
framework for neuroscience applications such as connectivity inference
problems.
Comment: accepted for publication, Journal of Computational Neuroscience
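The transfer entropy mentioned above has a direct plug-in estimator for discrete-valued series. A minimal sketch in Python, assuming history length 1 and binary data; the function name and toy example are mine, not the paper's:

    import numpy as np
    from collections import Counter

    def transfer_entropy(x, y):
        """Plug-in estimate of T_{X->Y} with history length 1:
        sum of p(y_{t+1}, y_t, x_t) * log2[ p(y_{t+1}|y_t,x_t) / p(y_{t+1}|y_t) ]."""
        n = len(y) - 1
        triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
        pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
        pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
        marg_y = Counter(y[:-1])                        # y_t
        te = 0.0
        for (y1, y0, x0), c in triples.items():
            p_joint = c / n
            p_cond_full = c / pairs_yx[(y0, x0)]            # p(y_{t+1} | y_t, x_t)
            p_cond_past = pairs_yy[(y1, y0)] / marg_y[y0]   # p(y_{t+1} | y_t)
            te += p_joint * np.log2(p_cond_full / p_cond_past)
        return te

    # Toy check: y copies x with a one-step lag, so T_{X->Y} should be ~1 bit.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 5000).tolist()
    y = [0] + x[:-1]
    print(transfer_entropy(x, y))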
Causal conditioning and instantaneous coupling in causality graphs
The paper investigates the link between Granger causality graphs recently
formalized by Eichler and directed information theory developed by Massey and
Kramer. We focus in particular on two notions of causality that may occur in
physical systems. It is well accepted that dynamical causality is assessed by
the conditional transfer entropy, a measure that appears naturally as a part of
directed information. Surprisingly, the notion of instantaneous causality is
often overlooked, even though it was clearly understood in early works.
in early works. In the bivariate case, instantaneous coupling is measured
adequately by the instantaneous information exchange, a measure that
supplements the transfer entropy in the decomposition of directed information.
In this paper, the focus is put on the multivariate case and conditional graph
modeling issues. In this framework, we show that the decomposition of directed
information into the sum of transfer entropy and information exchange does not
hold anymore. Nevertheless, the discussion allows us to put forward the two
measures as pillars for the inference of causality graphs. We illustrate this
on two synthetic examples which allow us to discuss not only the theoretical
concepts, but also the practical estimation issues.
Comment: submitted
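In the bivariate case, the decomposition referred to above is an exact chain-rule identity for Massey's directed information. A sketch in LaTeX, with the term names following the abstract (the multivariate conditional versions are what the paper shows do not sum up so simply):

    I(X^N \to Y^N)
      = \sum_{i=1}^{N} I(X^{i} ; Y_i \mid Y^{i-1})
      = \underbrace{\sum_{i=1}^{N} I(X^{i-1} ; Y_i \mid Y^{i-1})}_{\text{transfer entropy}}
      + \underbrace{\sum_{i=1}^{N} I(X_i ; Y_i \mid X^{i-1}, Y^{i-1})}_{\text{instantaneous information exchange}}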
Quantifying causal influences
Many methods for causal inference generate directed acyclic graphs (DAGs)
that formalize causal relations between variables. Given the joint
distribution on all these variables, the DAG contains all information about how
intervening on one variable changes the distribution of the other
variables. However, quantifying the causal influence of one variable on
another remains a nontrivial question. Here we propose a set of natural, intuitive
postulates that a measure of causal strength should satisfy. We then introduce
a communication scenario, where edges in a DAG play the role of channels that
can be locally corrupted by interventions. Causal strength is then the relative
entropy distance between the old and the new distribution. Many other measures
of causal strength have been proposed, including average causal effect,
transfer entropy, directed information, and information flow. We explain how
they fail to satisfy the postulates on simple DAGs of three or fewer nodes. Finally,
we investigate the behavior of our measure on time-series, supporting our
claims with experiments on simulated data.
Comment: Published at http://dx.doi.org/10.1214/13-AOS1145 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
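For the simplest DAG X -> Y, cutting the single edge and feeding an independent copy of X into Y's mechanism makes the cut distribution a product of the marginals, so the relative-entropy causal strength reduces to the mutual information I(X; Y). A hedged Python sketch of exactly that special case (the joint distribution and function name are illustrative, not the paper's):

    import numpy as np

    # Joint distribution P(x, y) for a two-node DAG X -> Y, as a matrix.
    P = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

    def causal_strength_single_edge(P):
        """Relative-entropy causal strength of the edge X -> Y in a two-node DAG.
        Cutting the edge yields P_cut(x, y) = P(x) * P(y), so the causal
        strength D(P || P_cut) equals the mutual information I(X; Y)."""
        px = P.sum(axis=1, keepdims=True)
        py = P.sum(axis=0, keepdims=True)
        P_cut = px * py
        mask = P > 0
        return float(np.sum(P[mask] * np.log2(P[mask] / P_cut[mask])))

    print(causal_strength_single_edge(P))  # I(X; Y) in bits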
Quantifying 'causality' in complex systems: Understanding Transfer Entropy
'Causal' direction is of great importance when dealing with complex systems.
Often, large volumes of data in the form of time series are available, and it
is important to develop methods that can reveal possible causal connections
between the different observables. Here we investigate the ability of the
Transfer Entropy measure to identify causal relations embedded in emergent
coherent correlations. We do this by first applying Transfer Entropy to an
amended Ising model. In addition, we use a simple Random Transition model to
test the reliability of Transfer Entropy as a measure of 'causal' direction in
the presence of stochastic fluctuations. In particular, we systematically study
the effect of the finite size of data sets.
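A common practical guard against the finite-size bias studied here is a surrogate test: randomly permute the source series, which destroys any directed coupling while preserving the marginal distribution, and compare. A minimal sketch, assuming a transfer-entropy estimator like the one sketched earlier (all names mine):

    import numpy as np

    def shuffle_test(x, y, te_fn, n_surrogates=200, seed=0):
        """Compare TE(x -> y) against surrogates with x randomly permuted.
        On finite data the surrogate TEs are typically nonzero: that is the bias."""
        rng = np.random.default_rng(seed)
        te_obs = te_fn(x, y)
        te_null = np.array([te_fn(list(rng.permutation(x)), y)
                            for _ in range(n_surrogates)])
        p_value = float(np.mean(te_null >= te_obs))
        return te_obs, float(te_null.mean()), p_value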
Measuring Shared Information and Coordinated Activity in Neuronal Networks
Most nervous systems encode information about stimuli in the responding
activity of large neuronal networks. This activity often manifests itself as
dynamically coordinated sequences of action potentials. Since multiple
electrode recordings are now a standard tool in neuroscience research, it is
important to have a measure of such network-wide behavioral coordination and
information sharing, applicable to multiple neural spike train data. We propose
a new statistic, informational coherence, which measures how much better one
unit can be predicted by knowing the dynamical state of another. We argue
informational coherence is a measure of association and shared information
which is superior to traditional pairwise measures of synchronization and
correlation. To find the dynamical states, we use a recently-introduced
algorithm which reconstructs effective state spaces from stochastic time
series. We then extend the pairwise measure to a multivariate analysis of the
network by estimating the network multi-information. We illustrate our method
by testing it on a detailed model of the transition from gamma to beta rhythms.
Comment: 8 pages, 6 figures
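If informational coherence is taken as the mutual information between the two units' reconstructed state sequences, normalized by the smaller state entropy (my reading of the construction described above, not a definition lifted from the paper), a plug-in estimate is short:

    import numpy as np
    from collections import Counter

    def entropy(seq):
        """Plug-in Shannon entropy (bits) of a discrete sequence."""
        n = len(seq)
        return -sum((c / n) * np.log2(c / n) for c in Counter(seq).values())

    def informational_coherence(s1, s2):
        """Normalized mutual information between two reconstructed state
        sequences: I(S1; S2) / min(H(S1), H(S2)), in [0, 1]."""
        h1, h2 = entropy(s1), entropy(s2)
        if min(h1, h2) == 0:
            return 0.0
        mi = h1 + h2 - entropy(list(zip(s1, s2)))
        return mi / min(h1, h2)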
Causal inference using the algorithmic Markov condition
Inferring the causal structure that links n observables is usually based upon
detecting statistical dependences and choosing simple graphs that make the
joint measure Markovian. Here we argue why causal inference is also possible
when only single observations are present.
We develop a theory of how to generate causal graphs explaining similarities
between single objects. To this end, we replace the notion of conditional
stochastic independence in the causal Markov condition with the vanishing of
conditional algorithmic mutual information and describe the corresponding
causal inference rules.
We explain why a consistent reformulation of causal inference in terms of
algorithmic complexity implies a new inference principle that takes into
account also the complexity of conditional probability densities, making it
possible to select among Markov equivalent causal graphs. This insight provides
a theoretical foundation of a heuristic principle proposed in earlier work.
We also discuss how to replace Kolmogorov complexity with decidable
complexity criteria. This can be seen as an algorithmic analog of replacing the
empirically undecidable question of statistical independence with practical
independence tests that are based on implicit or explicit assumptions on the
underlying distribution.
Comment: 16 figures
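One decidable stand-in for Kolmogorov complexity (a common surrogate in the compression-based inference literature, not necessarily the criterion the authors have in mind) is compressed length, which makes an approximate algorithmic mutual information something one can actually evaluate:

    import zlib

    def C(data: bytes) -> int:
        """Compressed length: a crude, computable upper bound on Kolmogorov
        complexity."""
        return len(zlib.compress(data, 9))

    def algorithmic_mi(x: bytes, y: bytes) -> int:
        """Approximate algorithmic mutual information C(x) + C(y) - C(xy);
        values near zero suggest approximate algorithmic independence."""
        return C(x) + C(y) - C(x + y)

    # Two texts sharing structure compress better jointly than separately.
    a = b"the quick brown fox jumps over the lazy dog " * 20
    b = b"the quick brown fox naps under the lazy dog " * 20
    print(algorithmic_mi(a, b))  # clearly positive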
Causal Dependence Tree Approximations of Joint Distributions for Multiple Random Processes
We investigate approximating joint distributions of random processes with
causal dependence tree distributions. Such distributions are particularly
useful in providing a parsimonious representation when causal dynamics exist
among the processes. By extending the results of Chow and Liu on
dependence tree approximations, we show that the best causal dependence tree
approximation is the one which maximizes the sum of directed informations on
its edges, where best is defined in terms of minimizing the KL-divergence
between the original and the approximate distribution. Moreover, we describe a
low-complexity algorithm to efficiently pick this approximate distribution.
Comment: 9 pages, 15 figures
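Since the best causal dependence tree maximizes the sum of directed informations on its edges, the search reduces to a maximum-weight spanning arborescence problem, which Edmonds' algorithm solves. A minimal sketch assuming pairwise directed-information estimates di[(i, j)] have already been computed by some estimator (the function name is mine):

    import networkx as nx

    def best_causal_tree(di):
        """Return the arborescence maximizing the sum of edge weights, where
        di[(i, j)] is an estimate of the directed information I(X_i -> X_j)."""
        G = nx.DiGraph()
        for (i, j), w in di.items():
            G.add_edge(i, j, weight=w)
        # Edmonds' algorithm finds the maximum-weight spanning arborescence.
        return nx.maximum_spanning_arborescence(G, attr="weight")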