Permutation Complexity and Coupling Measures in Hidden Markov Models
In [Haruna, T. and Nakajima, K., 2011. Physica D 240, 1370-1377], the authors
introduced the duality between values (words) and orderings (permutations) as a
basis to discuss the relationship between information theoretic measures for
finite-alphabet stationary stochastic processes and their permutation
analogues. It has been used to give a simple proof of the equality between the
entropy rate and the permutation entropy rate for any finite-alphabet
stationary stochastic process, and to show some results on the excess entropy
and the transfer entropy for finite-alphabet stationary ergodic Markov processes.
In this paper, we extend our previous results to hidden Markov models and show
the equalities between various information theoretic complexity and coupling
measures and their permutation analogues. In particular, we show the following
two results within the realm of hidden Markov models with ergodic internal
processes: the two permutation analogues of the transfer entropy, the symbolic
transfer entropy and the transfer entropy on rank vectors, are both equivalent
to the transfer entropy when considered as rates; and directed information
theory can be captured by the permutation entropy approach. Comment: 26 pages
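The word–permutation duality starts from the map that sends each word to the ordering (rank vector) of its letters. A minimal Python sketch of that map, with position-based tie-breaking as is common in the permutation-entropy literature (the function name is illustrative, not from the paper):

```python
def ordinal_pattern(word):
    """Return the permutation (rank vector) induced by a word.

    Ties are broken by position (earlier index ranks first), a common
    convention in the permutation-entropy literature.
    """
    # Sort indices by (value, position); the resulting index sequence
    # is the permutation associated with the word.
    return tuple(sorted(range(len(word)), key=lambda i: (word[i], i)))

# Distinct words can share an ordinal pattern: the map is many-to-one,
# which is what makes the value/ordering relationship non-trivial.
assert ordinal_pattern((0, 2, 1)) == (0, 2, 1)
assert ordinal_pattern((0, 9, 3)) == (0, 2, 1)
```

Counting how many words of each length collapse onto each permutation is the combinatorial core of the duality arguments.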
JIDT: An information-theoretic toolkit for studying the dynamics of complex systems
Complex systems are increasingly being viewed as distributed information
processing systems, particularly in the domains of computational neuroscience,
bioinformatics and Artificial Life. This trend has resulted in a strong uptake
in the use of (Shannon) information-theoretic measures to analyse the dynamics
of complex systems in these fields. We introduce the Java Information Dynamics
Toolkit (JIDT): a Google Code project which provides a standalone, open-source
(GNU GPL v3 licensed) implementation for empirical estimation of
information-theoretic measures from time-series data. While the toolkit
provides classic information-theoretic measures (e.g. entropy, mutual
information, conditional mutual information), it ultimately focusses on
implementing higher-level measures for information dynamics. That is, JIDT
focusses on quantifying information storage, transfer and modification, and the
dynamics of these operations in space and time. For this purpose, it includes
implementations of the transfer entropy and active information storage, their
multivariate extensions and local or pointwise variants. JIDT provides
implementations for both discrete and continuous-valued data for each measure,
including various types of estimator for continuous data (e.g. Gaussian,
box-kernel and Kraskov-Stoegbauer-Grassberger) which can be swapped at run-time
due to Java's object-oriented polymorphism. Furthermore, while written in Java,
the toolkit can be used directly in MATLAB, GNU Octave, Python and other
environments. We present the principles behind the code design, and provide
several examples to guide users. Comment: 37 pages, 4 figures
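JIDT itself is a Java library, and the snippet below does not use its API; it is an independent, minimal plug-in estimator of the discrete transfer entropy, included only to make the central measure concrete (all names here are illustrative):

```python
from collections import Counter
from math import log2

def transfer_entropy(source, dest, k=1):
    """Plug-in estimate of the transfer entropy (bits) from `source`
    to `dest` with target history length k, for discrete sequences."""
    joint = Counter()
    for t in range(k, len(dest) - 1):
        # (next target value, target past block, current source value)
        joint[(dest[t + 1], tuple(dest[t - k + 1 : t + 1]), source[t])] += 1
    total = sum(joint.values())
    c_ps, c_dp, c_p = Counter(), Counter(), Counter()
    for (d, p, s), c in joint.items():
        c_ps[(p, s)] += c
        c_dp[(d, p)] += c
        c_p[p] += c
    # TE = sum over samples of p(d,p,s) * log2[ p(d|p,s) / p(d|p) ]
    return sum((c / total) * log2(c * c_p[p] / (c_ps[(p, s)] * c_dp[(d, p)]))
               for (d, p, s), c in joint.items())
```

A toolkit such as JIDT adds what this sketch lacks: continuous-valued estimators (Gaussian, box-kernel, Kraskov-Stoegbauer-Grassberger), multivariate extensions, and local variants.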
Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation
We adapt tools from information theory to analyze how an observer comes to
synchronize with the hidden states of a finitary, stationary stochastic
process. We show that synchronization is determined by both the process's
internal organization and by an observer's model of it. We analyze these
components using the convergence of state-block and block-state entropies,
comparing them to the previously known convergence properties of the Shannon
block entropy. Along the way, we introduce a hierarchy of information
quantifiers as derivatives and integrals of these entropies, which parallels a
similar hierarchy introduced for block entropy. We also draw out the duality
between synchronization properties and a process's controllability. The tools
lead to a new classification of a process's alternative representations in
terms of minimality, synchronizability, and unifilarity. Comment: 25 pages, 13 figures, 1 table
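The convergence analysis here works with the Shannon block entropy H(L) and quantities derived from it. A minimal empirical sketch of H(L) and its discrete derivative, the length-L entropy-rate estimate (function names are mine, not the paper's):

```python
from collections import Counter
from math import log2

def block_entropy(symbols, L):
    """Shannon entropy (bits) of the empirical distribution of
    length-L blocks in a symbol sequence."""
    blocks = Counter(tuple(symbols[t : t + L])
                     for t in range(len(symbols) - L + 1))
    total = sum(blocks.values())
    return -sum((c / total) * log2(c / total) for c in blocks.values())

def entropy_gain(symbols, L):
    """Discrete derivative h(L) = H(L) - H(L-1): the entropy-rate
    estimate at block length L, whose convergence in L is the kind of
    behaviour analysed in the text."""
    return block_entropy(symbols, L) - block_entropy(symbols, L - 1)
```

For a period-2 sequence, h(L) drops to (nearly) zero once L spans the period, a toy instance of the convergence behaviour the hierarchy of quantifiers is built from.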
Permutation Complexity via Duality between Values and Orderings
We study the permutation complexity of finite-state stationary stochastic
processes based on a duality between values and orderings between values.
First, we establish a duality between the set of all words of a fixed length
and the set of all permutations of the same length. Second, on this basis, we
give an elementary alternative proof of the equality between the permutation
entropy rate and the entropy rate for finite-state stationary stochastic
processes, first proved in [Amigo, J.M., Kennel, M. B., Kocarev, L., 2005.
Physica D 210, 77-95]. Third, we show that further information on the
relationship between the structure of values and the structure of orderings for
finite-state stationary stochastic processes beyond the entropy rate can be
obtained from the established duality. In particular, we prove that the
permutation excess entropy is equal to the excess entropy, which is a measure
of global correlation present in a stationary stochastic process, for
finite-state stationary ergodic Markov processes. Comment: 26 pages
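The quantity being equated to the entropy rate can be made concrete. A minimal sketch of the empirical permutation entropy at a fixed order, with position-based tie-breaking (names are illustrative, not from the paper):

```python
from collections import Counter
from math import log2

def permutation_entropy(series, order):
    """Empirical permutation entropy (bits) at a given order: the
    Shannon entropy of the distribution of ordinal patterns of
    length `order`, ties broken by position."""
    patterns = Counter()
    for t in range(len(series) - order + 1):
        window = series[t : t + order]
        # Ordinal pattern: indices sorted by (value, position).
        patterns[tuple(sorted(range(order), key=lambda i: (window[i], i)))] += 1
    total = sum(patterns.values())
    return -sum((c / total) * log2(c / total) for c in patterns.values())
```

The permutation entropy rate is then the growth rate of this quantity in the order, and the paper's result is that it coincides with the ordinary entropy rate.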
Measuring information-transfer delays
In complex networks such as gene networks, traffic systems or brain circuits it is important to understand how long it takes for the different parts of the network to effectively influence one another. In the brain, for example, axonal delays between brain areas can amount to several tens of milliseconds, adding an intrinsic component to any timing-based processing of information. Inferring neural interaction delays is thus needed to interpret the information transfer revealed by any analysis of directed interactions across brain structures. However, a robust estimation of interaction delays from neural activity faces several challenges if modeling assumptions on interaction mechanisms are wrong or cannot be made. Here, we propose a robust estimator for neuronal interaction delays rooted in an information-theoretic framework, which allows a model-free exploration of interactions. In particular, we extend transfer entropy to account for delayed source-target interactions, while crucially retaining the conditioning on the embedded target state at the immediately previous time step. We prove that this particular extension is indeed guaranteed to identify interaction delays between two coupled systems and is the only relevant option in keeping with Wiener’s principle of causality. We demonstrate the performance of our approach in detecting interaction delays on finite data by numerical simulations of stochastic and deterministic processes, as well as on local field potential recordings. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.
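The core idea, scanning a candidate source-target delay while keeping the conditioning on the target's immediately preceding state, can be sketched for discrete data as follows. This is a plug-in illustration under simplifying assumptions (target history length 1, discrete values), not the authors' estimator for continuous neural data:

```python
from collections import Counter
from math import log2

def te_at_delay(source, dest, u):
    """Plug-in transfer entropy (bits) from `source` to `dest` at a
    candidate source-target delay u, conditioning on the target's
    immediately preceding state (target history length 1)."""
    joint = Counter()
    for t in range(u, len(dest) - 1):
        # (next target value, current target value, delayed source value)
        joint[(dest[t + 1], dest[t], source[t + 1 - u])] += 1
    total = sum(joint.values())
    c_ps, c_dp, c_p = Counter(), Counter(), Counter()
    for (d, p, s), c in joint.items():
        c_ps[(p, s)] += c
        c_dp[(d, p)] += c
        c_p[p] += c
    return sum((c / total) * log2(c * c_p[p] / (c_ps[(p, s)] * c_dp[(d, p)]))
               for (d, p, s), c in joint.items())

# Scanning candidate delays and taking the maximiser recovers the
# interaction delay, e.g.:
#     best_u = max(range(1, 6), key=lambda u: te_at_delay(x, y, u))
```

For a target that simply copies the source with a fixed lag, the scan peaks at that lag, which is the behaviour the paper proves in general for two coupled systems.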
Symbolic local information transfer
Recently, the permutation-information theoretic approach has been used in a
broad range of research fields. In particular, in the study of high-dimensional
dynamical systems, it has been shown that this approach can be effective in
characterizing global properties, including the complexity of their
spatiotemporal dynamics. Here, we show that this approach can also be applied
to reveal local spatiotemporal profiles of distributed computations existing at
each spatiotemporal point in the system. J. T. Lizier et al. have recently
introduced the concept of local information dynamics, which consists of
information storage, transfer, and modification. This concept has been
intensively studied with regard to cellular automata, and has provided
quantitative evidence of several characteristic behaviors observed in the
system. In this paper, by focusing on the local information transfer, we
demonstrate that the application of the permutation-information theoretic
approach, which introduces natural symbolization methods, makes the concept
easily extendible to systems that have continuous states. We propose measures
called symbolic local transfer entropies, and apply these measures to two test
models, the coupled map lattice (CML) system and the Bak-Sneppen model
(BS-model), to show their relevance to spatiotemporal systems that have
continuous states. Comment: 20 pages, 7 figures
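A minimal sketch of the idea: symbolize the continuous series by ordinal patterns, then compute pointwise (local) transfer entropy on the symbol sequences, here with target history length 1 (function names are mine, not the paper's):

```python
from collections import Counter
from math import log2

def ordinal_symbols(series, order=2):
    """Symbolize a continuous-valued series: each length-`order`
    window is replaced by its ordinal pattern (ties by position)."""
    return [tuple(sorted(range(order), key=lambda i: (series[t + i], i)))
            for t in range(len(series) - order + 1)]

def local_transfer_entropy(source, dest):
    """Pointwise (local) transfer entropy values in bits from symbol
    sequence `source` to `dest`, with target history length 1."""
    trips = [(dest[t + 1], dest[t], source[t]) for t in range(len(dest) - 1)]
    joint, c_ps, c_dp, c_p = Counter(), Counter(), Counter(), Counter()
    for d, p, s in trips:
        joint[(d, p, s)] += 1
        c_ps[(p, s)] += 1
        c_dp[(d, p)] += 1
        c_p[p] += 1
    # Local TE at time t is log2[ p(d|p,s) / p(d|p) ]; averaging the
    # local values over t recovers the plug-in transfer entropy.
    return [log2(joint[(d, p, s)] * c_p[p] / (c_ps[(p, s)] * c_dp[(d, p)]))
            for d, p, s in trips]
```

Negative local values flag time points where the source was misinformative about the target's next state, which is exactly the kind of spatiotemporal profile the measures above are designed to reveal.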