Measuring information-transfer delays

By Michael Wibral, Nicolae Pampu, Viola Priesemann, Felix Siebenhühner, Hannes Seiwert, Michael Lindner, Joseph T. Lizier and Raul Vicente


In complex networks such as gene networks, traffic systems, or brain circuits, it is important to understand how long it takes for the different parts of the network to effectively influence one another. In the brain, for example, axonal delays between brain areas can amount to several tens of milliseconds, adding an intrinsic component to any timing-based processing of information. Inferring neural interaction delays is thus needed to interpret the information transfer revealed by any analysis of directed interactions across brain structures. However, robust estimation of interaction delays from neural activity faces several challenges if modeling assumptions on interaction mechanisms are wrong or cannot be made. Here, we propose a robust estimator for neuronal interaction delays rooted in an information-theoretic framework, which allows a model-free exploration of interactions. In particular, we extend transfer entropy to account for delayed source-target interactions, while crucially retaining the conditioning on the embedded target state at the immediately previous time step. We prove that this particular extension is indeed guaranteed to identify interaction delays between two coupled systems and is the only relevant option in keeping with Wiener’s principle of causality. We demonstrate the performance of our approach in detecting interaction delays on finite data by numerical simulations of stochastic and deterministic processes, as well as on local field potential recordings. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.
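The central idea of the abstract — computing a delay-aware transfer entropy from source to target while still conditioning on the target's immediately previous state, then taking the delay that maximizes it as the estimate — can be illustrated with a minimal sketch. This is not the paper's estimator (which uses Kraskov-type nearest-neighbor estimation and full delay embeddings); it is a simplified discrete-binned version with target history length 1, and the toy coupled process, function names, and parameter values are all illustrative:

```python
import numpy as np

def _digitize(series, bins):
    """Map a real-valued series onto integer bin indices 0..bins-1."""
    edges = np.linspace(series.min(), series.max(), bins + 1)
    return np.clip(np.digitize(series, edges[1:-1]), 0, bins - 1)

def delayed_transfer_entropy(source, target, delay, bins=6):
    """Discrete-binned transfer entropy (in bits) from source to target
    at a given source-target delay, conditioning on the target's
    immediately previous sample (history length 1 for simplicity):
    TE(d) = I(y_t ; x_{t-d} | y_{t-1})."""
    assert delay >= 1
    ys, xs = _digitize(target, bins), _digitize(source, bins)
    a = ys[delay:]        # y_t
    b = ys[delay - 1:-1]  # y_{t-1}
    c = xs[:-delay]       # x_{t-delay}
    # Joint histogram over (y_t, y_{t-1}, x_{t-delay}).
    joint = np.zeros((bins, bins, bins))
    np.add.at(joint, (a, b, c), 1.0)
    p_abc = joint / joint.sum()
    p_ab = p_abc.sum(axis=2)      # p(y_t, y_{t-1})
    p_bc = p_abc.sum(axis=0)      # p(y_{t-1}, x_{t-delay})
    p_b = p_abc.sum(axis=(0, 2))  # p(y_{t-1})
    num = p_abc * p_b[None, :, None]
    den = p_ab[:, :, None] * p_bc[None, :, :]
    mask = p_abc > 0
    return float(np.sum(p_abc[mask] * np.log2(num[mask] / den[mask])))

# Toy system: target driven by the source at a known delay of 3 samples.
rng = np.random.default_rng(1)
n, true_delay = 5000, 3
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(true_delay, n):
    y[t] = 0.4 * y[t - 1] + 0.8 * x[t - true_delay] \
           + 0.3 * rng.standard_normal()

# Scan candidate delays; the estimate is the delay maximizing TE.
te_by_delay = {d: delayed_transfer_entropy(x, y, d) for d in range(1, 7)}
estimated_delay = max(te_by_delay, key=te_by_delay.get)
```

Because the source here is temporally white and the target's own past is conditioned out, the TE curve should peak sharply at the true coupling delay, which is the property the paper proves for its extended estimator.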

Topics: ddc:610
Year: 2013
