    Measuring information-transfer delays

    In complex networks such as gene networks, traffic systems or brain circuits it is important to understand how long it takes for the different parts of the network to effectively influence one another. In the brain, for example, axonal delays between brain areas can amount to several tens of milliseconds, adding an intrinsic component to any timing-based processing of information. Inferring neural interaction delays is thus needed to interpret the information transfer revealed by any analysis of directed interactions across brain structures. However, robust estimation of interaction delays from neural activity faces several challenges when modeling assumptions about the interaction mechanisms are wrong or cannot be made. Here, we propose a robust estimator for neuronal interaction delays rooted in an information-theoretic framework, which allows a model-free exploration of interactions. In particular, we extend transfer entropy to account for delayed source-target interactions, while crucially retaining the conditioning on the embedded target state at the immediately previous time step. We prove that this particular extension is indeed guaranteed to identify interaction delays between two coupled systems and is the only relevant option in keeping with Wiener’s principle of causality. We demonstrate the performance of our approach in detecting interaction delays on finite data by numerical simulations of stochastic and deterministic processes, as well as on local field potential recordings. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.
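    As a concrete reference for the estimator described above, the sketch below scans candidate delays u with a delay-aware transfer entropy TE_{X->Y}(u) = I(Y_t ; X_{t-u} | Y_{t-k..t-1}) on symbolized data and picks the delay that maximizes it. This is a minimal plug-in illustration under our own assumptions (binary symbols, naive frequency estimates, a short target embedding); the paper's estimator and data handling are more sophisticated.

        import numpy as np
        from collections import Counter

        def delayed_transfer_entropy(x, y, u, k=1):
            """Plug-in estimate (bits) of TE_{X->Y}(u) = I(Y_t ; X_{t-u} | Y_{t-k..t-1})
            for integer-symbol series x, y and a candidate source-target delay u."""
            rows = [(y[t], tuple(y[t - k:t]), x[t - u]) for t in range(max(u, k), len(y))]
            n = len(rows)
            def H(counter):
                p = np.array(list(counter.values()), dtype=float) / n
                return -np.sum(p * np.log2(p))
            return (H(Counter((yt, yp) for yt, yp, _ in rows))
                    + H(Counter((yp, xs) for _, yp, xs in rows))
                    - H(Counter(rows))
                    - H(Counter(yp for _, yp, _ in rows)))

        # toy coupled pair: y depends on x three steps earlier, plus occasional bit flips
        rng = np.random.default_rng(0)
        x = rng.integers(0, 2, 5000)
        y = np.roll(x, 3) ^ (rng.random(5000) < 0.1).astype(int)
        te = {u: delayed_transfer_entropy(x, y, u) for u in range(1, 8)}
        print(max(te, key=te.get))  # the TE-maximizing delay; expected to recover u = 3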

    Information transfer and causality in the sensorimotor loop

    This thesis investigates information-theoretic tools for detecting and describing causal influences in embodied agents. It presents an analysis of philosophical and statistical approaches to causation, and in particular focuses on causal Bayes nets and transfer entropy. It argues for a novel perspective that explicitly incorporates the epistemological role of information as a tool for inference. This approach clarifies and resolves some of the known problems associated with such methods. It is argued here, through a series of experiments, mathematical results, and philosophical accounts, that universally applicable measures of causal influence strength are unlikely to exist. Instead, the focus should be on the role that information-theoretic tools can play in inferential tests for causal relationships, in embodied agents particularly and in dynamical systems in general. The thesis details how these two approaches differ. Following directly from these arguments, the thesis proposes a concept of “hidden” information transfer to describe situations where causal influences passing through a chain of variables may be more easily detected at the end-points than at intermediate nodes. This is described using theoretical examples, and it also appears in the information dynamics of the computer-simulated and real robots developed herein. Practical examples include minimal models of agent-environment systems as well as a novel complete system for generating locomotion gait patterns using a biologically inspired decentralized architecture on a walking hexapod robot.
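    To make the notion of “hidden” information transfer concrete, the toy construction below (our own illustrative assumption, not one of the thesis's agent or robot models) shows a driving signal whose influence is statistically invisible at an intermediate node of a chain but fully recoverable at the end-point.

        import numpy as np

        def mutual_info_bits(a, b):
            """Plug-in mutual information (bits) between two binary sequences."""
            p = np.zeros((2, 2))
            for i, j in zip(a, b):
                p[i, j] += 1
            p /= p.sum()
            px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
            nz = p > 0
            return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

        rng = np.random.default_rng(1)
        n = 20000
        x = rng.integers(0, 2, n)      # driving variable
        k = rng.integers(0, 2, n)      # independent "key" noise entering the chain
        y = x ^ k                      # intermediate node: marginally unrelated to x
        z = y ^ k                      # end-point recombines with the key
        print(mutual_info_bits(x, y))  # ~0 bits: the influence is hidden mid-chain
        print(mutual_info_bits(x, z))  # ~1 bit: the influence reappears at the end-point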

    Temporal information partitioning networks to infer ecohydrologic behaviors

    An ecohydrologic system is a complex network, in which the shifting behavior of individual components and the connectivity between them determines the dynamics. This connectivity between components can act to constrain, accentuate, or otherwise modify the variability of individuals. In an ecohydrologic system, connectivity exists in the form of many time-dependent relationships between states and fluxes related to water, energy, nutrients, soils, and vegetation. Although relationships are constrained by conservation laws, they exhibit a wide range of variability at many timescales due to nonlinear interactions, threshold behavior, forcing, and feedback. Moreover, these aspects of connectivity and variability exist at a single location or over a spatial gradient. Understanding this connectivity within the system as a whole requires an appropriate framework in which evolving interactions can be identified from time-series observations. The goals of this thesis are to (i) develop a Temporal Information Partitioning Network (TIPNet) framework for understanding the joint variability of network components as characterized by time-series data, and (ii) apply this framework to understand ecohydrologic systems across climate gradients based on flux tower and weather station observations. In the TIPNet framework, nodes in the network are time-series variables, and links are information-theoretic measures that quantify multivariate time dependencies from lagged "source" nodes to "target" nodes. The strength of this framework is its ability to characterize information flow between variables over short time windows, and further to distinguish aspects of unique, redundant, and synergistic dependencies. Redundant information is overlapping information provided by multiple sources to a target, unique information is provided by only a single source, and synergistic information is provided only when two or more sources are known together. Based on data from three Critical Zone Observatories, we find that network structure shifts according to conditions at sub-daily time scales and constraints imposed by seasonal energy and water availability. TIPNets constructed from 1-minute weather station data reveal shifts in time scales and in levels of uniqueness, synergy, and redundancy between wet and dry conditions. A more complex network of synergistic interactions characterizes several-hour windows when surfaces are wet, and peaks in information flow during the growing season correspond to shifts in precipitation patterns. Networks based on half-hourly flux tower data reveal seasonal shifts in the nature of forcing to carbon and heat fluxes from radiation, atmospheric, and soil subsystems. Along two study transects, we attribute variability in heat and carbon fluxes, within constraints imposed by energy and moisture availability, to joint interactions that are more synergistic in the spring and more redundant in the fall. Finally, we explore the nature of information flow along an elevation gradient from flux towers located along a transect to gauge local versus non-local connectivity. While the strength of shared information between variables at a site reflects local connectivity, shared information between variables at different sites reflects non-local connectivity. Along two elevation transects, we find that information flow between distant sites indicates directional connectivity that is related to dominant weather patterns.
At the Southern Sierra CZO in California, non-local information flow is dominantly west to east, corresponding to weather forcing from the Pacific Ocean eastward, while non-local flow has less directionality at Reynolds Creek CZO, where sites are much closer together and there is no dominant weather forcing direction along the transect. The developed framework and applications presented in this thesis reveal the common presence of multivariate process interactions at timescales from minutes to hours, many of which would not be detected using traditional approaches. For an ecohydrologic system, this complex network of relationships dictates ecosystem resilience to perturbations such as climate change, drought, or human influences. More broadly, the methods and framework developed here contribute toward a holistic understanding of complex systems and are applicable to a range of studies of evolving networks.
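    The sketch below illustrates the kind of link the TIPNet framework quantifies: a decomposition of the information that two (in practice, lagged) source nodes provide about a target node into redundant, unique, and synergistic parts. It is a minimal plug-in example using the Williams-Beer minimum-specific-information redundancy, which is one common choice and not necessarily the measure adopted in the thesis; the function names and the XOR toy data are our own assumptions.

        import numpy as np

        def _joint(*cols):
            """Joint probability table over small integer-coded variables."""
            stacked = np.stack(cols, axis=1)
            vals, counts = np.unique(stacked, axis=0, return_counts=True)
            return {tuple(v): c / len(stacked) for v, c in zip(vals, counts)}

        def mi(s, t):
            pj, ps, pt = _joint(s, t), _joint(s), _joint(t)
            return sum(p * np.log2(p / (ps[(a,)] * pt[(b,)])) for (a, b), p in pj.items())

        def specific_info(s, t):
            """I(S ; T = t) for each target state, used by the Williams-Beer redundancy."""
            pj, ps, pt = _joint(s, t), _joint(s), _joint(t)
            return {b: sum((pab / pb) * np.log2(pab / (ps[(a,)] * pb))
                           for (a, b2), pab in pj.items() if b2 == b)
                    for (b,), pb in pt.items()}

        def pid_two_sources(s1, s2, t):
            """Redundant / unique / synergistic information two sources carry about a target."""
            pt = _joint(t)
            i1, i2 = specific_info(s1, t), specific_info(s2, t)
            redundant = sum(pb * min(i1[b], i2[b]) for (b,), pb in pt.items())
            unique1 = mi(s1, t) - redundant
            unique2 = mi(s2, t) - redundant
            joint_source = s1 * (s2.max() + 1) + s2   # encode (S1, S2) as one variable
            synergy = mi(joint_source, t) - unique1 - unique2 - redundant
            return redundant, unique1, unique2, synergy

        # in TIPNet the sources would be lagged copies, e.g. s1[:-tau] paired with t[tau:]
        rng = np.random.default_rng(2)
        s1, s2 = rng.integers(0, 2, 10000), rng.integers(0, 2, 10000)
        t = s1 ^ s2                         # XOR target: a purely synergistic dependence
        print(pid_two_sources(s1, s2, t))   # approx. (0, 0, 0, 1) bits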

    Information-theoretic Reasoning in Distributed and Autonomous Systems

    The increasing prevalence of distributed and autonomous systems is transforming decision making in industries as diverse as agriculture, environmental monitoring, and healthcare. Despite significant efforts, challenges remain in robustly planning under uncertainty. In this thesis, we present a number of information-theoretic decision rules for improving the analysis and control of complex adaptive systems. We begin with the problem of quantifying the data storage (memory) and transfer (communication) within information processing systems. We develop an information-theoretic framework to study nonlinear interactions within cooperative and adversarial scenarios, solely from observations of each agent's dynamics. This framework is applied to simulations of robotic soccer games, where the measures reveal insights into team performance, including correlations of the information dynamics with the scoreline. We then study the communication between processes with latent nonlinear dynamics that are observed only through a filter. Using methods from differential topology, we show that the information-theoretic measures commonly used to infer communication in observed systems can also be used in certain partially observed systems. For robotic environmental monitoring, the quality of data depends on the placement of sensors. These locations can be improved either by better estimating the quality of future viewpoints or by deploying a team of robots operating concurrently. By robustly handling the uncertainty of sensor model measurements, we are able to present the first end-to-end robotic system for autonomously tracking small dynamic animals, with performance comparable to human trackers. We then address the problem of coordinating multi-robot systems through distributed optimisation techniques. These allow us to develop non-myopic robot trajectories for these tasks and, importantly, to show that these algorithms provide guarantees on convergence rates to the optimal payoff sequence.
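    As a small illustration of the "memory" side of the information dynamics described above, the sketch below estimates active information storage, i.e. how much of an agent's next state is predictable from its own recent past, with a naive plug-in estimator on discretized observations. The function name, estimator choice, and toy trajectories are our own assumptions; the thesis pairs such storage measures with transfer entropy for the "communication" side.

        import numpy as np
        from collections import Counter

        def active_information_storage(x, k=2):
            """Plug-in estimate (bits) of A_X(k) = I(X_t ; X_{t-k..t-1}):
            the information an agent's own past provides about its next state."""
            rows = [(x[t], tuple(x[t - k:t])) for t in range(k, len(x))]
            n = len(rows)
            def H(counter):
                p = np.array(list(counter.values()), dtype=float) / n
                return -np.sum(p * np.log2(p))
            return (H(Counter(a for a, _ in rows)) + H(Counter(b for _, b in rows))
                    - H(Counter(rows)))

        # a periodic behaviour stores far more information in its past than pure noise
        rng = np.random.default_rng(3)
        periodic = np.tile([0, 1, 2, 3], 2500)
        noise = rng.integers(0, 4, 10000)
        print(active_information_storage(periodic))  # ~2 bits
        print(active_information_storage(noise))     # ~0 bits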

    Causal history analysis of complex system dynamics

    A complex system arises as a result of interdependencies between multiple components. The nonlinear interactions occurring in the system usually lead to emergent behaviors. Such emergence prevails in many natural systems, such as the fractal dynamics of stream chemistry, the chaotic behavior of atmospheric convection, and the entropy production due to the dissipative structure of plants. Multivariate interactions across the entire system play a key role in sustaining these emergent behaviors, which cannot arise solely from the dynamics of individual variables or from interactions within a specific subset of variables. Therefore, improving the understanding of whole-system dynamics requires considering how the entire evolutionary dynamics of a system, termed its causal history, jointly shapes its present state. In this dissertation, the primary goal is to establish a framework for the study of whole-system evolutionary dynamics arising from multivariate interactions. To achieve that, an information-theoretic formulation is developed to characterize the joint influence of the entire causal history on the present state of each variable using a directed acyclic graph representation. The proposed framework builds on the quantification and characterization of information flow from one source through a causal pathway and from two sources through the interaction of separable pathways, taking advantage of the ideas of momentary information transfer and partial information decomposition. Momentary information transfer captures the amount of information flow between any two variables lagged at two specific points in time. Partial information decomposition characterizes the joint effect of two sources in terms of redundant, synergistic, and unique contributions. To evaluate the joint influence of the causal history, we partition it into an immediate causal history, defined as a function of a lag τ from the present to capture the influence of recent dynamics, and the complementary distant causal history. Further, each of these influences is decomposed into self- and cross-feedbacks. Such a partition allows the characterization of the information flow arising from self- and cross-dependencies with other variables in both histories. This causal history analysis approach is then implemented to investigate the dynamics of different types of systems. It successfully illustrates the memory dependencies of short- and long-memory processes. Further, we find that the information characterization differs from system to system, reflecting their different dynamics. A long-memory process, for instance, is sustained by self-feedback-dominated recent dynamics and cross-dependency-dominated earlier dynamics. Applied to observed stream chemistry data, the analysis indicates the key role of flow rate in creating cross-connectivities among stream solutes and also its influence on the dynamics of each solute. Meanwhile, the information from cross-dependencies is non-negligible even after correcting the raw data for the dependency on flow rate, suggesting that, besides its self-feedback interaction, the resulting 1/f signature of each solute is also maintained by interactions with other variables in the stream. Last, we evaluate the structure of numerical models based on the idea of information flow between variables. Since we have the ability to intervene in numerical models, the evaluation analyzes how intervening on or freezing one or more lagged source variables impacts the dynamics of each target variable.
Such an interventional effect differs from the preceding analysis of observational data anchored in statistical dependencies, and thus provides a complementary view of component interactions. The analysis of the Lorenz model illustrates the potentially contradictory conclusions drawn from the two perspectives in terms of the extent of information transferred from source variables. It therefore highlights the importance of numerical modelling, in addition to the analysis of observational data, in providing insights into the dynamics of the simulated natural systems. A better and deeper understanding of complex system dynamics is becoming a necessity given the growing demand for multidisciplinary research. With the increasing availability of observational data and the growing complexity of numerical models, the information-theoretic metrics proposed and utilized here open new avenues for understanding complex system dynamics.
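    As a minimal, self-contained illustration of the immediate-versus-distant partition of causal history described above, the sketch below splits the information that a single discretized series' past window carries about its present state into a recent part (lags 1..tau) and the complementary distant part, via the chain rule for mutual information. The dissertation's framework is considerably richer (multivariate directed acyclic graph representation, momentary information transfer, partial information decomposition), so the function names and the toy process here are illustrative assumptions only.

        import numpy as np
        from collections import Counter

        def cond_mi(a, b, c):
            """Plug-in conditional mutual information I(A ; B | C) in bits
            for aligned sequences of hashable symbols."""
            n = len(a)
            def H(seq):
                p = np.array(list(Counter(seq).values()), dtype=float) / n
                return -np.sum(p * np.log2(p))
            return (H(list(zip(a, c))) + H(list(zip(b, c)))
                    - H(list(zip(a, b, c))) - H(list(c)))

        def causal_history_partition(y, tau, lmax):
            """Split I(Y_t ; past window of length lmax) into an immediate part
            (lags 1..tau) and a distant part (lags tau+1..lmax) via the chain rule."""
            rows = [(y[t], tuple(y[t - tau:t]), tuple(y[t - lmax:t - tau]))
                    for t in range(lmax, len(y))]
            yt, immediate, distant = zip(*rows)
            empty = [()] * len(yt)                       # empty conditioning set
            i_immediate = cond_mi(yt, immediate, empty)      # I(Y_t ; immediate)
            i_distant = cond_mi(yt, distant, immediate)      # I(Y_t ; distant | immediate)
            return i_immediate, i_distant

        # short-memory binary process: nearly all information sits in the immediate history
        rng = np.random.default_rng(4)
        y = [0]
        for _ in range(20000):
            y.append(y[-1] if rng.random() < 0.8 else int(rng.integers(0, 2)))
        print(causal_history_partition(np.array(y), tau=2, lmax=6))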