Information-Theoretic Methods for Identifying Relationships among Climate Variables
Information-theoretic quantities, such as entropy, are used to quantify the
amount of information a given variable provides. Entropies can be used together
to compute the mutual information, which quantifies the amount of information
two variables share. However, accurately estimating these quantities from data
is extremely challenging. We have developed a set of computational techniques
that allow one to accurately compute marginal and joint entropies. These
algorithms are probabilistic in nature and thus provide information on the
uncertainty in our estimates, which enables us to establish the statistical
significance of our findings. We demonstrate these methods by identifying
relations between cloud data from the International Satellite Cloud Climatology
Project (ISCCP) and data from other sources, such as equatorial Pacific sea
surface temperatures (SST).
Comment: Presented at and appears in the Proceedings of the Earth-Sun System Technology Conference (ESTC 2008), Adelphi, MD. http://esto.nasa.gov/conferences/estc2008/ 3 pages, 3 figures.
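As a point of reference for the quantities discussed above, the following is a minimal sketch of a plain histogram (plug-in) estimate of the mutual information between two series. It is not the probabilistic estimator described in the abstract, and the synthetic cloud_fraction and sst arrays are placeholders rather than ISCCP or SST data.

```python
import numpy as np

def plugin_mutual_information(x, y, bins=16):
    """Naive histogram (plug-in) estimate of I(X; Y) in nats.

    This ignores estimation bias and provides no uncertainty measure;
    the abstract above describes a probabilistic estimator that does,
    which is not reproduced here.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()             # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])))

# Hypothetical example with synthetic "cloud" and "SST" series.
rng = np.random.default_rng(0)
sst = rng.normal(size=5000)
cloud_fraction = 0.7 * sst + rng.normal(scale=0.5, size=5000)
print(plugin_mutual_information(cloud_fraction, sst))
```

Plug-in estimates of this kind are biased and give no measure of uncertainty, which is precisely the gap the probabilistic techniques described above are meant to address.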
Analysis of parameter changes of a neuronal network model using transfer entropy
Understanding the dynamics of coupled neurons is one of the fundamental problems in the analysis of neuronal model dynamics. Transfer entropy (TE) is one of the primary methods for exploring the information flow between neuronal populations. We perform TE analysis on a two-neuron conductance-based Hodgkin-Huxley (HH) network to analyze how its effective connectivity changes with the synaptic conductances. Through numerical simulations, the TE analysis shows that the direction of information flow arising from the underlying synaptic connectivity changes as the conductances are varied individually and/or simultaneously.
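For context, here is a minimal sketch of a plug-in transfer entropy estimate with one-step histories between two discretized time series. The paper does not specify its estimator, so the binning, history length, and function names below are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, bins=4):
    """Plug-in estimate of TE(source -> target) in nats, one-step histories.

    Both series are discretized into `bins` equal-width states; the TE
    definition follows Schreiber (2000). This is a bare-bones illustration,
    not the analysis pipeline of the paper above.
    """
    s = np.digitize(source, np.histogram_bin_edges(source, bins)[1:-1])
    t = np.digitize(target, np.histogram_bin_edges(target, bins)[1:-1])
    n = len(t) - 1
    triples = Counter(zip(t[1:], t[:-1], s[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xx = Counter(zip(t[1:], t[:-1]))          # (x_{t+1}, x_t)
    pairs_xy = Counter(zip(t[:-1], s[:-1]))         # (x_t, y_t)
    singles = Counter(t[:-1])                       # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        # p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t)
        te += p_joint * np.log(
            (c / pairs_xy[(x0, y0)]) / (pairs_xx[(x1, x0)] / singles[x0])
        )
    return te
```

Comparing transfer_entropy(source, target) with transfer_entropy(target, source) gives the asymmetry from which a direction of information flow can be read off; such estimates are sensitive to the number of bins and the history lengths.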
Effects of neuronal noise on neural communication
In this work, we propose an approach to better understand the effects of neuronal noise on neural communication systems. We extend the fundamental Hodgkin-Huxley (HH) model by adding synaptic couplings to represent the statistical dependencies among different neurons under additional noise. We estimate directional information-theoretic quantities, such as the transfer entropy (TE), to infer the couplings between neurons under different noise levels. Based on our computational simulations, we demonstrate that these nonlinear systems can behave in ways that are difficult to predict and that TE is an ideal tool for extracting such dependencies from data.
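As a minimal sketch of the noise mechanism described above, the following integrates a single standard HH neuron with additive Gaussian current noise using the Euler-Maruyama scheme. The drive current I_ext, the noise level sigma, and the omission of the synaptic coupling are simplifying assumptions made here, not details taken from the paper.

```python
import numpy as np

def hh_with_noise(T=200.0, dt=0.01, I_ext=10.0, sigma=2.0, seed=0):
    """Single Hodgkin-Huxley neuron with additive current noise,
    integrated with Euler-Maruyama. Units: mV, ms, uA/cm^2.

    I_ext and sigma are illustrative values; the synaptic coupling used
    in the paper is omitted for brevity.
    """
    rng = np.random.default_rng(seed)
    C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3
    ENa, EK, EL = 50.0, -77.0, -54.387
    steps = int(T / dt)
    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    trace = np.empty(steps)
    for i in range(steps):
        # Standard HH voltage-dependent rate functions.
        am = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
        bm = 4.0 * np.exp(-(V + 65.0) / 18.0)
        ah = 0.07 * np.exp(-(V + 65.0) / 20.0)
        bh = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
        an = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
        bn = 0.125 * np.exp(-(V + 65.0) / 80.0)
        # Ionic currents.
        I_ion = (gNa * m**3 * h * (V - ENa)
                 + gK * n**4 * (V - EK)
                 + gL * (V - EL))
        # Euler-Maruyama step: deterministic drift plus sqrt(dt)-scaled noise.
        V += dt * (I_ext - I_ion) / C + sigma * np.sqrt(dt) * rng.standard_normal()
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        trace[i] = V
    return trace
```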
Statistical approaches for the analysis of dependency among neurons under noise
Neuronal noise is a major factor affecting the communication between coupled neurons. In this work, we propose a statistical toolset to infer the coupling between two neurons under noise. We estimate these statistical dependencies from data generated by a coupled Hodgkin–Huxley (HH) model with additive noise. To infer the coupling from observation data, we employ copulas and information-theoretic quantities, such as the mutual information (MI) and the transfer entropy (TE). Copulas and MI between two variables are symmetric quantities, whereas TE is asymmetric. We demonstrate the performance of copulas and MI as a function of the noise level and show that they are effective in identifying the interactions due to coupling and noise. Moreover, we analyze the inferred TE values between neurons as a function of noise and conclude that TE is an effective tool for identifying the direction of coupling between neurons under the effects of noise.
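To illustrate the symmetric, copula-based part of the toolset above, here is a minimal sketch that maps two observed series to pseudo-observations, fits a Gaussian copula through their normal scores, and reports the implied mutual information. The Gaussian-copula assumption is an illustrative choice, not necessarily the copula family used in the paper.

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_mi(x, y):
    """MI (in nats) implied by a Gaussian copula fitted to x and y.

    The data are mapped to pseudo-observations (ranks scaled to (0, 1)),
    transformed to normal scores, and the correlation of the scores gives
    MI = -0.5 * log(1 - rho^2). A sketch under the Gaussian-copula
    assumption, not the paper's full copula analysis.
    """
    n = len(x)
    u = rankdata(x) / (n + 1.0)            # pseudo-observations of x
    v = rankdata(y) / (n + 1.0)            # pseudo-observations of y
    zx, zy = norm.ppf(u), norm.ppf(v)      # normal scores
    rho = np.corrcoef(zx, zy)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)
```

Because the estimate depends only on ranks, it is invariant to monotone transformations of either series, which makes copula-based measures convenient for noisy, nonlinearly coupled signals; like MI, it is symmetric and therefore cannot indicate the direction of coupling.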
Transfer entropy
Statistical relationships among the variables of a complex system reveal a lot about its physical behavior. Therefore, identification of the relevant variables and characterization of their interactions are crucial for a better understanding of a complex system. Linear methods, such as correlation, are widely used to identify these relationships. However, information-theoretic quantities, such as mutual information and transfer entropy, have been proven to be superior in the case of nonlinear dependencies. Mutual information quantifies the amount of information obtained about one random variable through the other random variable, and it is symmetric. As an asymmetric measure, transfer entropy quantifies the amount of directed (time-asymmetric) transfer of information between random processes and, thus, is related to concepts such as Granger causality.
This Special Issue includes 16 papers elucidating the state of the art of data-based transfer entropy estimation techniques and applications, in areas such as finance, biomedicine, fluid dynamics and cellular automata. Analytical derivations in special cases, improvements on the estimation methods and comparisons between certain techniques are some of the other contributions of this Special Issue. The diversity of approaches and applications makes this book unique as a single source of invaluable contributions from experts in the field.
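For reference, the editorial's informal description of transfer entropy corresponds to the standard definition (Schreiber, 2000), written here with history lengths k and l:

```latex
% Transfer entropy from process Y to process X, where x_t^{(k)} and y_t^{(l)}
% denote the length-k and length-l histories of X and Y at time t.
T_{Y \to X} = \sum p\!\left(x_{t+1}, x_t^{(k)}, y_t^{(l)}\right)
              \log \frac{p\!\left(x_{t+1} \mid x_t^{(k)}, y_t^{(l)}\right)}
                        {p\!\left(x_{t+1} \mid x_t^{(k)}\right)}
```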