
    Efficient transfer entropy analysis of non-stationary neural time series

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. In particular, the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these observations, available estimators assume stationarity of processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that deals with the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally heaviest aspects of the ensemble method. We test the performance and robustness of our implementation on data from simulated stochastic processes and demonstrate the method's applicability to magnetoencephalographic data. While we mainly evaluate the proposed method for neuroscientific data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and artificial systems.
    Comment: 27 pages, 7 figures, submitted to PLOS ONE
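    As a minimal illustration of the ensemble idea (pooling observations across trials at a fixed time point rather than over time), the sketch below computes a plug-in transfer entropy estimate for discrete-valued trial data. It is a toy under stated assumptions, not the GPU-based estimator the abstract describes; the function name and interface are illustrative.

```python
import math
from collections import Counter

def transfer_entropy_ensemble(x_trials, y_trials, t):
    """Plug-in transfer entropy X -> Y at time point t, pooling
    observations over trials (the ensemble method) instead of over
    time, so stationarity in t is not required.
    x_trials, y_trials: lists of equal-length discrete sequences."""
    triples = Counter()  # (y_{t+1}, y_t, x_t)
    pairs = Counter()    # (y_t, x_t)
    cond = Counter()     # (y_{t+1}, y_t)
    marg = Counter()     # y_t
    n = len(x_trials)
    for x, y in zip(x_trials, y_trials):
        triples[(y[t + 1], y[t], x[t])] += 1
        pairs[(y[t], x[t])] += 1
        cond[(y[t + 1], y[t])] += 1
        marg[y[t]] += 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_yx = c / pairs[(y0, x0)]
        p_y1_given_y = cond[(y1, y0)] / marg[y0]
        te += p_joint * math.log2(p_y1_given_yx / p_y1_given_y)
    return te  # bits; >= 0 for the plug-in estimate
```

    For continuous-valued neural data, histogram counts would be replaced by nearest-neighbour density estimates; the pooling-over-trials structure stays the same.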

    Entropy production from stochastic dynamics in discrete full phase space

    The stochastic entropy generated during the evolution of a system interacting with an environment may be separated into three components, but only two of these have a non-negative mean. The third component of entropy production is associated with the relaxation of the system probability distribution towards a stationary state and with nonequilibrium constraints within the dynamics that break detailed balance. It exists when at least some of the coordinates of the system phase space change sign under time reversal, and when the stationary state is asymmetric in these coordinates. We illustrate the various components of entropy production, both in detail for particular trajectories and in the mean, using simple systems defined on a discrete phase space of spatial and velocity coordinates. These models capture features of the drift and diffusion of a particle in a physical system, including the processes of injection and removal and the effect of a temperature gradient. The examples demonstrate how entropy production in stochastic thermodynamics depends on the detail that is included in a model of the dynamics of a process. Entropy production from such a perspective is a measure of the failure of such models to meet Loschmidt's expectation of dynamic reversibility.
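    The trajectory-level bookkeeping can be sketched for a discrete-time Markov chain. This toy computes only the mean total entropy production (a system term from the change of -ln p plus a medium term from forward/reverse jump probabilities), not the three-way decomposition discussed above; the transition matrices in the usage are illustrative.

```python
import math
from itertools import product

def mean_entropy_production(P, p0, T):
    """Average total entropy production <dS_tot> over all length-T
    trajectories of a discrete-time Markov chain with transition
    matrix P and initial distribution p0.  dS_tot is the system term
    ln p_0(x_0) - ln p_T(x_T) plus the medium term, the summed log
    ratio of forward to reverse jump probabilities.  Assumes every
    allowed jump has a nonzero reverse (P[b][a] > 0 when P[a][b] > 0)."""
    n = len(P)
    # marginal distribution at each time step
    ps = [list(p0)]
    for _ in range(T):
        prev = ps[-1]
        ps.append([sum(prev[i] * P[i][j] for i in range(n))
                   for j in range(n)])
    total = 0.0
    for traj in product(range(n), repeat=T + 1):
        prob = p0[traj[0]]
        for a, b in zip(traj, traj[1:]):
            prob *= P[a][b]
        if prob == 0.0:
            continue
        ds_sys = math.log(ps[0][traj[0]]) - math.log(ps[T][traj[T]])
        ds_med = sum(math.log(P[a][b] / P[b][a])
                     for a, b in zip(traj, traj[1:]))
        total += prob * (ds_sys + ds_med)
    return total
```

    For a detailed-balanced chain started in its stationary state the mean vanishes; starting it away from stationarity yields a strictly positive relaxation contribution, matching the abstract's second component.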

    Biophotons and emergence of quantum coherence: a diffusion entropy analysis

    We study the emission of photons from germinating seeds using an experimental technique designed to detect light of extremely small intensity. We analyze the dark count signal without germinating seeds as well as the photon emission during the germination process. The technique of analysis adopted here, called diffusion entropy analysis (DEA) and originally designed to measure the temporal complexity of astrophysical, sociological and physiological processes, rests on Kolmogorov complexity. The updated version of DEA used in this paper is designed to determine whether the signal complexity is generated by non-ergodic crucial events with a non-stationary correlation function, by the infinite memory of a stationary but non-integrable correlation function, or by a mixture of both processes. We find that the dark count yields the ordinary scaling, thereby showing that no complexity of either kind occurs without seeds in the chamber. In the presence of seeds in the chamber, anomalous scaling emerges, reminiscent of that found in neuro-physiological processes. However, this scaling reflects a mixture of both processes, and with the progress of germination the non-ergodic component tends to vanish and complexity becomes dominated by the stationary infinite memory. We illustrate some conjectures ranging from stress-induced annihilation of crucial events to the emergence of quantum coherence.
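    A bare-bones version of the basic diffusion entropy procedure (without the updated non-ergodicity diagnostics described above) can be sketched as follows: overlapping-window sums of the signal play the role of diffusion trajectories, and the scaling exponent delta is read off from the slope of the Shannon entropy S(t) against ln t. The bin count and window sizes below are illustrative choices, not the paper's settings.

```python
import math

def diffusion_entropy(signal, window_sizes, n_bins=20):
    """For each window size t, form overlapping-window sums x(t)
    (the 'diffusion' trajectories), histogram them, and return the
    Shannon entropy S(t) of the histogram, corrected by log(bin width)
    so that S(t) tracks the continuous entropy.  For a scaling
    process, S(t) ~ A + delta * ln(t)."""
    out = []
    for t in window_sizes:
        sums = [sum(signal[i:i + t]) for i in range(len(signal) - t + 1)]
        lo, hi = min(sums), max(sums)
        width = (hi - lo) / n_bins
        if width == 0.0:
            width = 1.0  # degenerate case: all sums identical
        counts = [0] * n_bins
        for s in sums:
            counts[min(int((s - lo) / width), n_bins - 1)] += 1
        n = len(sums)
        entropy = -sum(c / n * math.log(c / n) for c in counts if c)
        out.append(entropy + math.log(width))
    return out

def dea_scaling(signal, window_sizes, n_bins=20):
    """Least-squares slope of S(t) versus ln t: the DEA exponent delta."""
    S = diffusion_entropy(signal, window_sizes, n_bins)
    xs = [math.log(t) for t in window_sizes]
    mx, my = sum(xs) / len(xs), sum(S) / len(S)
    return (sum((x - mx) * (s - my) for x, s in zip(xs, S))
            / sum((x - mx) ** 2 for x in xs))
```

    Uncorrelated noise gives the ordinary scaling delta close to 0.5, as in the dark-count case above; anomalous scaling shows up as delta departing from that value.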

    Pathwise Sensitivity Analysis in Transient Regimes

    The instantaneous relative entropy (IRE) and the corresponding instantaneous Fisher information matrix (IFIM) for transient stochastic processes are presented in this paper. These novel tools for sensitivity analysis of stochastic models serve as an extension of the well-known relative entropy rate (RER) and the corresponding Fisher information matrix (FIM) that apply to stationary processes. Three cases are studied here: discrete-time Markov chains, continuous-time Markov chains and stochastic differential equations. A biological reaction network is presented as a numerical demonstration example.
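    For the discrete-time Markov chain case, one plausible reading of the instantaneous relative entropy is the per-step Kullback-Leibler rate between two chains, evaluated along the transient marginal distribution. The sketch below follows that reading as an assumption, not the paper's exact definition, and the matrices in the usage are illustrative.

```python
import math

def instantaneous_relative_entropy(P, Q, p0, T):
    """Per-step KL rate between two discrete-time Markov chains P and Q
    sharing initial distribution p0:
        IRE(t) = sum_i p_t(i) sum_j P_ij * ln(P_ij / Q_ij),
    with p_t the transient marginal of the P-chain.  As p_t relaxes to
    the stationary distribution, this converges to the relative entropy
    rate (RER) used for stationary processes."""
    n = len(P)
    p = list(p0)
    ire = []
    for _ in range(T):
        rate = sum(p[i] * P[i][j] * math.log(P[i][j] / Q[i][j])
                   for i in range(n) for j in range(n) if P[i][j] > 0)
        ire.append(rate)
        # advance the transient marginal one step under P
        p = [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]
    return ire
```

    In the transient regime the IRE varies with t and, once the marginal has relaxed, settles at the stationary RER, which is the sense in which it extends the stationary tool.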