
    Assessing Transfer Entropy in cardiovascular and respiratory time series: A VARFI approach

    In the study of complex biomedical systems represented by multivariate stochastic processes, such as the cardiovascular and respiratory systems, an issue of great relevance is the description of the system dynamics spanning multiple temporal scales. Recently, the quantification of multiscale complexity based on linear parametric models incorporating autoregressive coefficients and fractional integration, which together encompass short-term dynamics and long-range correlations, was extended to multivariate time series. Within this Vector AutoRegressive Fractionally Integrated (VARFI) framework, formalized for Gaussian processes, in this work we propose to estimate the Transfer Entropy, or equivalently Granger Causality, in the cardiovascular and respiratory systems. This makes it possible to quantify the information flow and assess directed interactions while accounting for the simultaneous presence of short-term dynamics and long-range correlations. The proposed approach is first tested on simulations of benchmark VARFI processes, where the transfer entropy can be computed from the known model parameters. It is then applied to experimental data consisting of heart period, systolic arterial pressure, and respiration time series measured in healthy subjects monitored at rest and during mental and postural stress. Both the simulations and the real-data analysis reveal that the proposed method highlights the dependence of the information transfer on the balance between short-term and long-range correlations in coupled dynamical systems.
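    As a hedged illustration of the quantity being estimated (not the authors' VARFI code, which additionally models fractional integration for the long-range part): for jointly Gaussian processes, transfer entropy reduces to half the log-ratio of residual variances between a restricted and a full linear regression. The function name and lag order below are illustrative assumptions.

```python
import numpy as np

def linear_te(x, y, p=5):
    """Transfer entropy x -> y for jointly Gaussian processes: half the
    log-ratio of residual variances between the restricted model (past
    of y only) and the full model (past of y and x). The lag order p is
    an illustrative choice."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(y)
    # Lagged design matrices: row t holds samples t-1 ... t-p
    Y_past = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    X_past = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    target = y[p:]
    # Restricted model: predict y(t) from its own past only
    res_r = target - Y_past @ np.linalg.lstsq(Y_past, target, rcond=None)[0]
    # Full model: add the past of the source x
    full = np.hstack([Y_past, X_past])
    res_f = target - full @ np.linalg.lstsq(full, target, rcond=None)[0]
    return 0.5 * np.log(np.var(res_r) / np.var(res_f))
```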

    Measuring information-transfer delays

    In complex networks such as gene networks, traffic systems or brain circuits it is important to understand how long it takes for the different parts of the network to effectively influence one another. In the brain, for example, axonal delays between brain areas can amount to several tens of milliseconds, adding an intrinsic component to any timing-based processing of information. Inferring neural interaction delays is thus needed to interpret the information transfer revealed by any analysis of directed interactions across brain structures. However, a robust estimation of interaction delays from neural activity faces several challenges if modeling assumptions on interaction mechanisms are wrong or cannot be made. Here, we propose a robust estimator for neuronal interaction delays rooted in an information-theoretic framework, which allows a model-free exploration of interactions. In particular, we extend transfer entropy to account for delayed source-target interactions, while crucially retaining the conditioning on the embedded target state at the immediately previous time step. We prove that this particular extension is indeed guaranteed to identify interaction delays between two coupled systems and is the only relevant option in keeping with Wiener’s principle of causality. We demonstrate the performance of our approach in detecting interaction delays on finite data by numerical simulations of stochastic and deterministic processes, as well as on local field potential recordings. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.
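    A minimal sketch of the delay-reconstruction logic: scan a candidate source lag u while keeping the conditioning on the target's immediately preceding state, then take the u that maximizes the delayed transfer entropy. A linear-Gaussian proxy is used here for brevity; the paper's estimator is model-free, and all names and defaults below are illustrative.

```python
import numpy as np

def delayed_te(x, y, u, p=3):
    """Linear-Gaussian proxy for delayed transfer entropy x -> y: the
    target y(t) is conditioned on its immediately preceding state while
    the source enters at lag u."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    idx = np.arange(max(p, u), len(y))
    Y_past = np.column_stack([y[idx - k - 1] for k in range(p)])  # target state
    target = y[idx]
    res_r = target - Y_past @ np.linalg.lstsq(Y_past, target, rcond=None)[0]
    full = np.hstack([Y_past, x[idx - u, None]])                  # delayed source
    res_f = target - full @ np.linalg.lstsq(full, target, rcond=None)[0]
    return 0.5 * np.log(np.var(res_r) / np.var(res_f))

def scan_delay(x, y, max_u=50, p=3):
    """The reconstructed interaction delay is the u maximizing delayed TE."""
    te = [delayed_te(x, y, u, p) for u in range(1, max_u + 1)]
    return 1 + int(np.argmax(te)), te
```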

    Efficient transfer entropy analysis of non-stationary neural time series

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these observations, available estimators assume stationarity of processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that deals with the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation on a graphics processing unit to handle the most computationally demanding aspects of the ensemble method. We test the performance and robustness of our implementation on data from simulated stochastic processes and demonstrate the method's applicability to magnetoencephalographic data. While we mainly evaluate the proposed method on neuroscientific data, we expect it to be applicable in a variety of fields concerned with the analysis of information transfer in complex biological, social, and artificial systems.
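    A sketch of the ensemble idea under a simplifying Gaussian assumption: at each time point, the estimate is formed across trials instead of across time, yielding one transfer-entropy value per sample without requiring stationarity. The paper's actual estimator is nearest-neighbour based and GPU-parallelized; everything named below is illustrative.

```python
import numpy as np

def ensemble_te(X, Y, t, p=3, u=1):
    """Transfer entropy x -> y at a single time t, estimated by pooling
    observations across trials (rows of X and Y) rather than over time.
    Linear-Gaussian stand-in for the nearest-neighbour estimator used in
    the paper; p and u are illustrative embedding choices."""
    Y_past = np.column_stack([Y[:, t - k - 1] for k in range(p)])  # target embedding
    target = Y[:, t]
    res_r = target - Y_past @ np.linalg.lstsq(Y_past, target, rcond=None)[0]
    full = np.hstack([Y_past, X[:, t - u, None]])                  # delayed source
    res_f = target - full @ np.linalg.lstsq(full, target, rcond=None)[0]
    return 0.5 * np.log(np.var(res_r) / np.var(res_f))

# Time-resolved profile over the recording, one estimate per sample:
# te_t = [ensemble_te(X, Y, t) for t in range(3, X.shape[1])]
```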

    Multiscale Analysis of Information Dynamics for Linear Multivariate Processes

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer, but also to potentially misleading behaviors.
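    The data-level counterpart of the rescaling analyzed in the paper can be sketched as follows (the paper works analytically on model parameters rather than on samples): averaging over windows of length tau and downsampling by tau is the moving-average filtering that turns a VAR process into a VARMA process.

```python
import numpy as np

def coarse_grain(x, tau):
    """Multiscale rescaling at scale tau: average tau consecutive samples
    (a moving-average filter) and downsample by tau. Applied to a VAR
    process, this is the filtering step that introduces the MA component
    handled exactly by the state-space formulation in the paper."""
    x = np.asarray(x, float)
    n = (x.shape[0] // tau) * tau           # drop the ragged tail
    return x[:n].reshape(n // tau, tau, *x.shape[1:]).mean(axis=1)
```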

    Towards a sharp estimation of transfer entropy for identifying causality in financial time series

    We present an improvement of an estimator of causality in financial time series via transfer entropy, which includes the side information that may affect the cause-effect relation in the system, i.e., a causality measure based on conditional information transfer. We show that for weakly stationary time series the conditional transfer entropy measure is nonnegative and bounded below by Geweke's measure of Granger causality. We use k-nearest-neighbor distances to estimate entropy and approximate the distribution of the estimator with bootstrap techniques. We give examples of the application of the estimator in detecting causal effects in a simulated stationary autoregressive system of three random variables with linear and non-linear couplings; in a system of non-stationary variables; and with real financial data.
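    A sketch of the estimator family referred to above, assuming the standard Frenzel-Pompe construction built on k-nearest-neighbour distances; the function and parameter names are illustrative, not the authors' code.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def cmi_knn(x, y, z, k=4):
    """Frenzel-Pompe k-nearest-neighbour estimator of the conditional
    mutual information I(x ; y | z) under the max-norm. With x the past
    of the source, y the present of the target, and z the past of the
    target plus any side information, this is a conditional transfer
    entropy. Continuous data are assumed (break ties with tiny noise)."""
    x, y, z = (np.asarray(v, float).reshape(len(v), -1) for v in (x, y, z))
    joint = np.hstack([x, y, z])
    # Distance to the k-th neighbour in the joint space
    eps = cKDTree(joint).query(joint, k + 1, p=np.inf)[0][:, -1]
    def count(pts):
        # Neighbours strictly inside eps; the count includes the point itself
        return cKDTree(pts).query_ball_point(pts, eps - 1e-12, p=np.inf,
                                             return_length=True)
    n_xz = count(np.hstack([x, z]))
    n_yz = count(np.hstack([y, z]))
    n_z = count(z)
    return digamma(k) - np.mean(digamma(n_xz) + digamma(n_yz) - digamma(n_z))
```

    The distribution of the estimate can then be approximated by re-running the estimator on resampled or source-shuffled surrogates, in the spirit of the bootstrap procedure described above.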

    Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes

    Exploiting the theory of state-space models, we derive exact expressions for the information transfer, as well as the redundant and synergistic transfer, for coupled Gaussian processes observed at multiple temporal scales. All of the terms constituting the frameworks known as interaction information decomposition and partial information decomposition can thus be obtained analytically, for different time scales, from the parameters of the VAR model that fits the processes. We first apply the proposed methodology to benchmark Gaussian systems, showing that this class of systems may generate patterns of information decomposition characterized by mainly redundant or synergistic information transfer persisting across multiple time scales, or even by the alternating prevalence of redundant and synergistic source interaction depending on the time scale. We then apply our method to an important topic in neuroscience, i.e., the detection of causal interactions in human epilepsy networks, for which we show the relevance of partial information decomposition to the detection of multiscale information transfer spreading from the seizure onset zone.
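    A single-scale sketch of the decomposition, assuming the minimum-mutual-information definition of redundancy commonly adopted for Gaussian systems; the paper obtains these terms across time scales from state-space parameters, whereas here they are computed from a given covariance matrix, and the names below are illustrative.

```python
import numpy as np

def gaussian_mi(cov, a, b):
    """Mutual information between Gaussian variable groups a and b
    (index lists), computed from the joint covariance matrix."""
    ld = lambda idx: np.linalg.slogdet(cov[np.ix_(idx, idx)])[1]
    return 0.5 * (ld(a) + ld(b) - ld(a + b))

def mmi_pid(cov, s1, s2, t):
    """Minimum-mutual-information PID for a Gaussian system: redundancy
    is the smaller of the two source-target informations; the unique and
    synergistic terms follow from the decomposition's bookkeeping."""
    i1, i2 = gaussian_mi(cov, s1, t), gaussian_mi(cov, s2, t)
    i12 = gaussian_mi(cov, s1 + s2, t)
    red = min(i1, i2)
    return {"redundant": red, "unique_1": i1 - red,
            "unique_2": i2 - red, "synergistic": i12 - i1 - i2 + red}

# Example: two sources (columns 0, 1) and one target (column 2)
# cov = np.cov(data, rowvar=False); mmi_pid(cov, [0], [1], [2])
```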

    Assessing coupling dynamics from an ensemble of time series

    Finding interdependency relations between (possibly multivariate) time series provides valuable knowledge about the processes that generate the signals. Information theory sets a natural framework for non-parametric measures of several classes of statistical dependencies. However, reliable estimation of information-theoretic functionals is hampered when the dependency to be assessed is brief or evolves in time. Here, we show that these limitations can be overcome when we have access to an ensemble of independent repetitions of the time series. In particular, we gear a data-efficient estimator of probability densities to make use of the full structure of trial-based measures. By doing so, we can obtain time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy, and their conditional counterparts) which are more accurate than the simple average of individual estimates over trials. We show with simulated and real data that the proposed approach makes it possible to recover the time-resolved dynamics of the coupling between different subsystems.
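    A minimal time-resolved sketch under a Gaussian assumption: pooling over trials yields one coupling estimate per time point, so brief or evolving dependencies are resolved. The paper gears a data-efficient nearest-neighbour density estimator to the trial structure; the Gaussian shortcut below only illustrates the trial-pooling idea.

```python
import numpy as np

def time_resolved_mi(X, Y):
    """Time-resolved mutual information from an ensemble of repetitions:
    at each time t the estimate uses the sample across trials (rows of
    the (n_trials, n_samples) arrays X and Y). Gaussian approximation
    shown for brevity."""
    r = np.array([np.corrcoef(X[:, t], Y[:, t])[0, 1]
                  for t in range(X.shape[1])])
    return -0.5 * np.log(1.0 - r ** 2)
```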