Abstract

An information-theoretic measure is derived that quantifies the statistical coherence between systems evolving in time. The standard time-delayed mutual information fails to distinguish information that is actually exchanged from shared information due to common history and input signals. In our new approach, these influences are excluded by appropriate conditioning of transition probabilities. The resulting transfer entropy is able to distinguish driving and responding elements and to detect asymmetry in the coupling of subsystems.

Comment: 4 pages, 4 figures, RevTeX
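As a point of reference, the conditioning of transition probabilities described above is commonly quoted in the following form; this is a sketch of the standard definition rather than a reproduction of the paper's derivation, and the embedding vectors x_n^{(k)} = (x_n, ..., x_{n-k+1}) and y_n^{(l)} = (y_n, ..., y_{n-l+1}) with lengths k and l are notational assumptions:

\[
T_{Y \to X} \;=\; \sum p\!\left(x_{n+1},\, x_n^{(k)},\, y_n^{(l)}\right)
\log \frac{p\!\left(x_{n+1} \mid x_n^{(k)},\, y_n^{(l)}\right)}{p\!\left(x_{n+1} \mid x_n^{(k)}\right)}
\]

Because the logarithm compares the transition probability of X with and without knowledge of Y, contributions from common history and shared inputs already contained in x_n^{(k)} cancel, and in general T_{Y \to X} differs from T_{X \to Y}, which is what allows the measure to separate driving from responding elements.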
