Information Decomposition and Synergy
Recently, a series of papers addressed the problem of decomposing the information of two random variables into shared information, unique information and synergistic information. Several measures have been proposed, although no consensus has yet been reached. Here, we compare these proposals with an older approach that defines synergistic information via projections onto exponential families containing only up to k-th order interactions. We show that these measures are not compatible with a decomposition into unique, shared and synergistic information if one requires that all terms are always non-negative (local positivity). We illustrate the difference between the two measures for multivariate Gaussians.
EC/FP7/31872
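The decomposition the abstract describes can be illustrated with a toy sketch. The minimal-mutual-information (MMI) redundancy used below is one common proposal, not necessarily among the measures the paper compares; the XOR target is the standard example in which all the information is synergistic.

```python
import math
from collections import defaultdict

def mutual_info(joint, t_idx, s_idx):
    """I(T; S) in bits from a joint distribution dict {(t, x, y): p}."""
    pt, ps, pts = defaultdict(float), defaultdict(float), defaultdict(float)
    for outcome, p in joint.items():
        t = outcome[t_idx]
        s = tuple(outcome[i] for i in s_idx)
        pt[t] += p
        ps[s] += p
        pts[(t, s)] += p
    return sum(p * math.log2(p / (pt[t] * ps[s]))
               for (t, s), p in pts.items() if p > 0)

# XOR example: T = X xor Y, with X and Y independent uniform bits.
joint = {(x ^ y, x, y): 0.25 for x in (0, 1) for y in (0, 1)}

i_x  = mutual_info(joint, 0, (1,))      # I(T; X)
i_y  = mutual_info(joint, 0, (2,))      # I(T; Y)
i_xy = mutual_info(joint, 0, (1, 2))    # I(T; X, Y)

red     = min(i_x, i_y)                 # MMI redundancy (one of several proposals)
uniq_x  = i_x - red
uniq_y  = i_y - red
synergy = i_xy - i_x - i_y + red
print(red, uniq_x, uniq_y, synergy)     # for XOR, all information is synergistic
```

The four terms sum to the joint mutual information by construction, which is the consistency requirement every proposed decomposition shares.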
Intersection Information based on Common Randomness
The introduction of the partial information decomposition generated a flurry
of proposals for defining an intersection information that quantifies how much
of "the same information" two or more random variables specify about a target
random variable. As of yet, none is wholly satisfactory. A palatable measure of
intersection information would provide a principled way to quantify slippery
concepts, such as synergy. Here, we introduce an intersection information
measure based on the G\'acs-K\"orner common random variable that is the first
to satisfy the coveted target monotonicity property. Our measure is imperfect,
too, and we suggest directions for improvement.
Comment: 19 pages, 5 figures
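For small finite distributions, the Gács-Körner common random variable on which the proposed measure is built can be computed directly: its values are the connected components of the joint support. A minimal sketch (a hypothetical helper, not the paper's construction):

```python
import math
from collections import defaultdict

def gk_common_information(joint):
    """H of the Gacs-Korner common part of (X, Y), given {(x, y): p}.

    Two outcomes share a common-part value iff their x and y values are
    linked through the support of the joint distribution (union-find on
    the bipartite support graph).
    """
    parent = {}

    def find(a):
        while parent.setdefault(a, a) != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for (x, y), p in joint.items():
        if p > 0:
            union(('x', x), ('y', y))

    comp_prob = defaultdict(float)
    for (x, y), p in joint.items():
        comp_prob[find(('x', x))] += p
    return -sum(p * math.log2(p) for p in comp_prob.values() if p > 0)

# Perfectly correlated bits: the common part is the bit itself (1 bit).
perfect = gk_common_information({(0, 0): 0.5, (1, 1): 0.5})
# Independent uniform bits: the support is connected, so the common part
# is trivial (0 bits) even though the variables can be weakly dependent.
trivial = gk_common_information({(x, y): 0.25 for x in (0, 1) for y in (0, 1)})
```

The all-or-nothing behaviour of the second example is exactly why the resulting intersection information is conservative, as the abstract concedes.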
Synergy as a warning sign of transitions: the case of the two-dimensional Ising model
We consider the formalism of information decomposition of target effects from
multi-source interactions, i.e. the problem of defining redundant and
synergistic components of the information that a set of source variables
provides about a target, and apply it to the two-dimensional Ising model as a
paradigm of a critically transitioning system. Intuitively, synergy is the
information about the target variable that is uniquely obtained by taking the
sources together, but not considering them alone; redundancy is the information
which is shared by the sources. To disentangle the components of the
information both at the static level and at the dynamical one, the
decomposition is applied respectively to the mutual information and to the
transfer entropy between a given spin, the target, and a pair of neighbouring
spins (taken as the drivers). We show that a key signature of an impending
phase transition (approached from the disordered side) is the fact that the
synergy peaks in the disordered phase, both in the static and in the dynamic
case: the synergy can thus be considered a precursor of the transition. The
redundancy, instead, reaches its maximum at the critical temperature. The peak
of the synergy of the transfer entropy is far more pronounced than that of the
static mutual information. We show that these results are robust w.r.t. the
details of the information decomposition approach, as we find the same results
using two different methods; moreover, w.r.t. previous literature rooted in the
notion of Global Transfer Entropy, our results demonstrate that considering as
few as three variables is sufficient to construct a precursor of the
transition, and provide a paradigm for the investigation of a variety of
systems prone to crisis, like financial markets, social media, or epileptic
seizures.
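The dynamic part of the analysis rests on the transfer entropy from the driver spins to the target spin; for history length one this is a conditional mutual information that can be estimated from time series by plug-in counting. A minimal sketch on toy dynamics (not the Ising simulation itself):

```python
import math
import random
from collections import Counter

def joint_entropy(*series):
    """Plug-in joint entropy (bits) of aligned discrete sequences."""
    joint = list(zip(*series))
    n = len(joint)
    return -sum(c / n * math.log2(c / n) for c in Counter(joint).values())

def transfer_entropy(t_next, t_past, s_past):
    """TE(S -> T) = I(T_{t+1}; S_t | T_t), history length 1, via the
    standard four-entropy expansion of conditional mutual information."""
    return (joint_entropy(t_next, t_past) + joint_entropy(s_past, t_past)
            - joint_entropy(t_next, s_past, t_past) - joint_entropy(t_past))

# Toy dynamics: the target spin copies the source spin with 10% noise.
random.seed(1)
s = [random.choice([-1, 1]) for _ in range(20001)]
t = [1] + [si if random.random() > 0.1 else -si for si in s[:-1]]
te = transfer_entropy(t[1:], t[:-1], s[:-1])
print(te)  # clearly positive: the source drives the target
```

In the setting of the abstract, the same estimator would be applied with a pair of neighbouring spins as the source, and the resulting transfer entropy then decomposed into redundant and synergistic parts.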
Information theoretical study of cross-talk mediated signal transduction in MAPK pathways
Biochemical networks related to similar functional pathways are often
correlated due to cross-talk among the homologous proteins in the different
networks. Using a stochastic framework, we address the functional significance
of the cross-talk between two pathways. Our theoretical analysis on generic
MAPK pathways reveals that cross-talk is responsible for developing coordinated
fluctuations between the pathways. The extent of correlation, evaluated in terms
of an information-theoretic measure, indicates the direction of net information
propagation. Stochastic time series and scatter plots suggest that the
cross-talk generates synchronization within a cell as well as in a cellular
population. Depending on the number of inputs and outputs, we identify signal
integration and signal bifurcation motifs that arise due to inter-pathway
connectivity in the composite network. Analysis using partial information
decomposition quantifies the net synergy in the information propagation through
these branched pathways.
Comment: Revised version, 17 pages, 5 figures
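For jointly Gaussian approximations of the pathway fluctuations, one common definition of net synergy is the joint mutual information minus the sum of the single-source ones; it is positive when the two branches are jointly more informative than the sum of their parts. A sketch with a hypothetical covariance matrix (not fitted to the MAPK model):

```python
import numpy as np

def gaussian_mi(cov, a, b):
    """I(A; B) in bits for jointly Gaussian variables, from the full
    covariance matrix; a and b are lists of variable indices."""
    det = lambda idx: np.linalg.det(cov[np.ix_(idx, idx)])
    return 0.5 * np.log2(det(a) * det(b) / det(a + b))

def net_synergy(cov, target, s1, s2):
    """I(T; S1, S2) - I(T; S1) - I(T; S2): positive means the sources
    together carry more information than the sum of their parts."""
    return (gaussian_mi(cov, target, s1 + s2)
            - gaussian_mi(cov, target, s1)
            - gaussian_mi(cov, target, s2))

# Hypothetical covariance for Z = X + Y + noise (order: X, Y, Z),
# standing in for an output driven additively by two pathway inputs.
cov = np.array([[1.0, 0.0, 1.0],
                [0.0, 1.0, 1.0],
                [1.0, 1.0, 2.5]])
ns = net_synergy(cov, [2], [0], [1])
print(ns)  # positive: the two inputs are synergistic about the output
```

The same quantity applied to the stochastic pathway model is what the partial information decomposition refines into separate redundant, unique and synergistic terms.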