
    JIDT: An information-theoretic toolkit for studying the dynamics of complex systems

    Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake of (Shannon) information-theoretic measures to analyse the dynamics of complex systems in these fields. We introduce the Java Information Dynamics Toolkit (JIDT): a Google Code project which provides a standalone, open-source (GNU GPL v3 licensed) implementation for empirical estimation of information-theoretic measures from time-series data. While the toolkit provides classic information-theoretic measures (e.g. entropy, mutual information, conditional mutual information), it ultimately focusses on implementing higher-level measures for information dynamics. That is, JIDT focusses on quantifying information storage, transfer and modification, and the dynamics of these operations in space and time. For this purpose, it includes implementations of the transfer entropy and active information storage, their multivariate extensions, and local or pointwise variants. JIDT provides implementations for both discrete and continuous-valued data for each measure, including various types of estimator for continuous data (e.g. Gaussian, box-kernel and Kraskov-Stoegbauer-Grassberger), which can be swapped at run-time thanks to Java's object-oriented polymorphism. Furthermore, while written in Java, the toolkit can be used directly in MATLAB, GNU Octave, Python and other environments. We present the principles behind the code design, and provide several examples to guide users.
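    To make the estimator-swapping design concrete, here is a minimal sketch in the spirit of JIDT's documented examples: it estimates transfer entropy between two series with the Kraskov (KSG) estimator, declared against the generic interface so a Gaussian or box-kernel estimator could be dropped in instead. The data arrays are placeholders; check the toolkit's javadocs for the authoritative signatures.

        import infodynamics.measures.continuous.TransferEntropyCalculator;
        import infodynamics.measures.continuous.kraskov.TransferEntropyCalculatorKraskov;

        public class TeSketch {
            public static void main(String[] args) throws Exception {
                // Placeholder data: load real time series here in practice.
                double[] source = {0.1, 0.3, 0.2, 0.5, 0.4, 0.6, 0.5, 0.7};
                double[] dest   = {0.0, 0.1, 0.3, 0.2, 0.5, 0.4, 0.6, 0.5};

                // Swappable at run-time thanks to the common interface:
                TransferEntropyCalculator teCalc = new TransferEntropyCalculatorKraskov();
                teCalc.setProperty("k", "4");  // KSG nearest-neighbour count
                teCalc.initialise(1);          // destination history length k = 1
                teCalc.setObservations(source, dest);
                System.out.println("TE(source -> dest) = "
                        + teCalc.computeAverageLocalOfObservations() + " nats");
            }
        }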

    Informative and misinformative interactions in a school of fish

    It is generally accepted that, when moving in groups, animals process information to coordinate their motion. Recent studies have begun to apply rigorous methods based on Information Theory to quantify such distributed computation. Following this perspective, we use transfer entropy to quantify dynamic information flows locally in space and time across a school of fish during directional changes around a circular tank, i.e. U-turns. This analysis reveals peaks in information flows during collective U-turns and identifies two different flows: an informative flow (positive transfer entropy) from fish that have already turned to fish that are turning, and a misinformative flow (negative transfer entropy) from fish that have not yet turned to fish that are turning. We also reveal that the information flows are related to relative position and alignment between fish, and identify spatial patterns of information and misinformation cascades. This study offers several methodological contributions, and we expect further application of these methodologies to reveal intricacies of self-organisation in other animal groups and active matter in general.
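    For reference, the local (pointwise) transfer entropy behind this analysis is the standard quantity from the information-dynamics literature (notation may differ from the paper's). Writing \mathbf{x}_t^{(k)} for the target fish's length-k state history,

        t_{Y \to X}(t+1) = \log \frac{p(x_{t+1} \mid \mathbf{x}_t^{(k)}, y_t)}{p(x_{t+1} \mid \mathbf{x}_t^{(k)})}

    Averaging these local values over time recovers the usual non-negative transfer entropy, but individual values can be negative: the source was then misleading about the target's next state, which is exactly the "misinformative flow" the study identifies.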

    Are there new models of computation? Reply to Wegner and Eberbach

    Wegner and Eberbach [Weg04b] have argued that there are fundamental limitations to Turing Machines as a foundation of computability, and that these can be overcome by so-called super-Turing models such as interaction machines, the π-calculus and the $-calculus. In this paper we contest Wegner and Eberbach's claims.

    Coherence and measurement in quantum thermodynamics

    Thermodynamics is a highly successful macroscopic theory widely used across the natural sciences and for the construction of everyday devices, from car engines and fridges to power plants and solar cells. With thermodynamics predating quantum theory, research now aims to uncover the thermodynamic laws that govern finite-size systems which may in addition host quantum effects. Here we identify information processing tasks, the so-called "projections", that can only be formulated within the framework of quantum mechanics. We show that the physical realisation of such projections can come with a non-trivial thermodynamic work only for quantum states with coherences. This contrasts with information erasure, first investigated by Landauer, for which a thermodynamic work cost applies to classical and quantum erasure alike. The implications are far-reaching, adding a thermodynamic dimension to measurements performed in quantum thermodynamics experiments, and providing key input for the construction of a future quantum thermodynamic framework. Repercussions are discussed for quantum work fluctuation relations and thermodynamic single-shot approaches.
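    As a hedged sketch of the underlying reasoning (the paper's exact statement may differ): projecting a state \rho onto the eigenbasis \{\Pi_k\} of an observable yields the dephased state \eta = \sum_k \Pi_k \rho \Pi_k, and S(\eta) \ge S(\rho) with equality exactly when \rho has no coherences between the projectors. If the projection commutes with the Hamiltonian, so that the average energy is unchanged, the free-energy difference bounds the extractable work by

        W \le k_B T \, [\, S(\eta) - S(\rho) \,], \qquad \eta = \sum_k \Pi_k \rho \Pi_k,

    which is non-zero only for states with coherences, in line with the abstract's claim.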

    Quantifying criticality, information dynamics and thermodynamics of collective motion

    Active matter consists of self-propelled particles whose interactions give rise to coherent collective motion. Well-known examples include schools of fish, flocks of birds, swarms of insects and herds of ungulates. On the micro-scale, cells, enzymes and bacteria also move collectively as active matter, inspiring engineering of artificial materials and devices. These diverse systems exhibit similar collective behaviours, including gathering, alignment and quick propagation of perturbations, which emerge from relatively simple local interactions. This phenomenon is known as self-organisation and is observed in active matter as well as in many other complex collective phenomena, including urban agglomeration, financial crises, ecosystem dynamics and technological cascading failures. Some open challenges in the study of self-organisation include (a) how the information processing across the collective and over time gives rise to emergent behaviour, (b) how to identify the regimes in which different collective behaviours exist and their phase transitions, and (c) how to quantify the thermodynamics associated with these phenomena. This thesis aims to investigate these topics in the context of active matter, while building a rigorous theoretical framework. Specifically, this thesis provides three main contributions. Firstly, the question of how to formally measure information transfer across the collective is addressed and applied to a real system, i.e., a school of fish. Secondly, general relations between statistical mechanical and thermodynamical quantities are analytically derived and applied to a model of active matter, resulting in the formulation of the concept of “thermodynamic efficiency of computation during collective motion”. This concept is then extended to the domain of urban dynamics. Thirdly, this thesis provides a rigorous quantification of the non-equilibrium entropy production associated with the collective motion of active Brownian particles.
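    For context, a sketch of the kind of model the final contribution refers to: the standard active Brownian particle dynamics (the thesis may use a different variant) evolve each particle's position \mathbf{r}_i and heading \theta_i as

        \dot{\mathbf{r}}_i = v_0 \, \hat{\mathbf{n}}(\theta_i) + \sqrt{2 D_t} \, \boldsymbol{\xi}_i(t), \qquad \dot{\theta}_i = \sqrt{2 D_r} \, \eta_i(t),

    with self-propulsion speed v_0, translational and rotational diffusion constants D_t and D_r, and independent Gaussian white noises \boldsymbol{\xi}_i, \eta_i. The self-propulsion term continuously breaks detailed balance, which is why a non-zero entropy production accompanies this kind of collective motion.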

    Integrated information theory in complex neural systems

    This thesis concerns Integrated Information Theory (IIT), a branch of information theory aimed at providing a fundamental theory of consciousness. At its core lie two powerful intuitions:
    • That a system that is somehow more than the sum of its parts has non-zero integrated information, Φ; and
    • That a system with non-zero integrated information is conscious.
    The audacity of IIT’s claims about consciousness has (understandably) sparked vigorous criticism, and experimental evidence for IIT as a theory of consciousness remains scarce and indirect. Nevertheless, I argue that IIT still has merits as a theory of informational complexity within complexity science, leaving aside all claims about consciousness. In my work I follow this broad line of reasoning: showcasing applications where IIT yields rich analyses of complex systems, while critically examining its merits and limitations as a theory of consciousness. This thesis is divided into three parts. First, I describe three example applications of IIT to complex systems from the computational neuroscience literature (coupled oscillators, spiking neurons, and cellular automata), and develop novel Φ estimators to extend IIT’s range of applicability. Second, I show two important limitations of current IIT: that its axiomatic foundation is not specific enough to determine a unique measure of integrated information; and that available measures do not behave as predicted by the theory when applied to neurophysiological data. Finally, I present new theoretical developments aimed at alleviating some of IIT’s flaws. These are based on the concepts of partial information decomposition and lead to a unification of both theories, Integrated Information Decomposition, or ΦID. The thesis concludes with two experimental studies on M/EEG data, showing that a much simpler informational theory of consciousness – the entropic brain hypothesis – can yield valuable insight without the mathematical challenges brought by IIT.
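    To make the first intuition concrete, one widely used empirical variant (a "whole-minus-sum" measure in the style of Barrett and Seth, not necessarily the exact estimator developed in the thesis) compares the temporal mutual information of the whole system with that of its parts under a partition \{M^1, \dots, M^r\}:

        \Phi(X; \tau) = I(X_{t-\tau}; X_t) - \sum_{k=1}^{r} I(M^k_{t-\tau}; M^k_t)

    A positive value says the joint past predicts the joint present better than the parts predict themselves, i.e. the system is "more than the sum of its parts"; such measures can also go negative, one of the behavioural issues examined in the thesis.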

    Investigating Information Flows in Spiking Neural Networks With High Fidelity

    The brains of many organisms are capable of a wide variety of complex computations. This capability must be undergirded by a more general-purpose computational capacity. The exact nature of this capacity, how it is distributed across the brains of organisms and how it arises throughout the course of development is an open topic of scientific investigation. Individual neurons are widely considered to be the fundamental computational units of brains. Moreover, the finest scale at which large-scale recordings of brain activity can be performed is the spiking activity of neurons, and our ability to perform these recordings over large numbers of neurons and with fine spatial resolution is increasing rapidly. This makes the spiking activity of individual neurons a highly attractive data modality on which to study neural computation. The framework of information dynamics has proven to be a successful approach towards interrogating the capacity for general-purpose computation. It does this by revealing the atomic information processing operations of information storage, transfer and modification. Unfortunately, the study of information flows and other information processing operations from the spiking activity of neurons has been severely hindered by the lack of effective tools for estimating these quantities on this data modality. This thesis remedies this situation by presenting an estimator for information flows, as measured by Transfer Entropy (TE), that operates in continuous time on event-based data such as spike trains. Unlike the previous approach to the estimation of this quantity, which discretised the process into time bins, this estimator operates on the raw inter-spike intervals. It is demonstrated to be far superior to the previous discrete-time approach in terms of consistency, rate of convergence and bias. Most importantly, unlike the discrete-time approach, which requires a hard tradeoff between capturing fine temporal precision and capturing history effects occurring over reasonable time intervals, this estimator can capture history effects occurring over relatively large intervals without any loss of temporal precision. This estimator is applied to developing dissociated cultures of cortical rat neurons, thereby providing the first high-fidelity study of information flows on spiking data. It is found that the spatial structure of the flows locks in, to a significant extent, at the point of their emergence, and that certain nodes occupy specialised computational roles as either transmitters, receivers or mediators of information flow. Moreover, these roles are also found to lock in early. In order to fully understand the structure of neural information flows, however, we are required to go beyond pairwise interactions, and indeed multivariate information flows have become an important tool in the inference of effective networks from neuroscience data. These are directed networks where each node is connected to a minimal set of sources which maximally reduce the uncertainty in its present state. However, the application of multivariate information flows to the inference of effective networks from spiking data has been hampered by the above-mentioned issues with preexisting estimation techniques.
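    For orientation, the quantity such a continuous-time estimator targets can be written as a transfer entropy rate that accrues only at the target's spike times (following the continuous-time formulation of Spinney, Prokopenko and Lizier; the thesis's notation may differ). With \lambda denoting the target's conditional spiking intensity given the spiking histories \mathbf{x}_{<t} and \mathbf{y}_{<t},

        \dot{T}_{Y \to X} = \lim_{\tau \to \infty} \frac{1}{\tau} \sum_{\text{spikes } t_i \le \tau} \ln \frac{\lambda(t_i \mid \mathbf{x}_{<t_i}, \mathbf{y}_{<t_i})}{\lambda(t_i \mid \mathbf{x}_{<t_i})}

    Because the sum runs over raw spike times rather than time bins, estimating it requires no discretisation, which is precisely the property exploited here.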
Here, a greedy algorithm which iteratively builds a set of parents for each target node using multivariate transfer entropies, and which has already been well validated in the context of traditional discretely sampled time series, is adapted for use in conjunction with the newly developed estimator for event-based data. The combination of the greedy algorithm and continuous-time estimator is then validated on simulated examples for which the ground truth is known. The new capabilities in the estimation of information flows and the inference of effective networks on event-based data presented in this work represent a very substantial step forward in our ability to perform these analyses on the ever-growing set of high-resolution, large-scale recordings of interacting neurons. As such, this work promises to enable substantial quantitative insights in the future regarding how neurons interact, how they process information, and how this changes under different conditions such as disease.
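    A hedged sketch of the greedy parent-selection loop described above, in Java for consistency with the earlier example; conditionalTe and isSignificant are hypothetical placeholders for a conditional transfer entropy estimator and a surrogate-based significance test, not real library calls.

        import java.util.ArrayList;
        import java.util.List;

        public class GreedyParentSelection {

            /** Greedily build a parent set for one target node. */
            static List<Integer> selectParents(int target, int numNodes) {
                List<Integer> parents = new ArrayList<>();
                while (true) {
                    int bestSource = -1;
                    double bestTe = Double.NEGATIVE_INFINITY;
                    // Pick the candidate with the highest transfer entropy into
                    // the target, conditioned on the parents chosen so far.
                    for (int source = 0; source < numNodes; source++) {
                        if (source == target || parents.contains(source)) continue;
                        double te = conditionalTe(source, target, parents);
                        if (te > bestTe) { bestTe = te; bestSource = source; }
                    }
                    // Stop once no candidate adds statistically significant information.
                    if (bestSource < 0 || !isSignificant(bestSource, target, parents, bestTe)) break;
                    parents.add(bestSource);
                }
                return parents; // a final pruning pass is typically applied as well
            }

            // Hypothetical placeholders: wire these to a real estimator and test.
            static double conditionalTe(int source, int target, List<Integer> parents) {
                return 0.0; // stub
            }
            static boolean isSignificant(int source, int target, List<Integer> parents, double te) {
                return false; // stub: e.g. compare te against a surrogate distribution
            }
        }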

    Information-theoretic Reasoning in Distributed and Autonomous Systems

    Get PDF
    The increasing prevalence of distributed and autonomous systems is transforming decision making in industries as diverse as agriculture, environmental monitoring, and healthcare. Despite significant efforts, challenges remain in robustly planning under uncertainty. In this thesis, we present a number of information-theoretic decision rules for improving the analysis and control of complex adaptive systems. We begin with the problem of quantifying the data storage (memory) and transfer (communication) within information processing systems. We develop an information-theoretic framework to study nonlinear interactions within cooperative and adversarial scenarios, solely from observations of each agent's dynamics. This framework is applied to simulations of robotic soccer games, where the measures reveal insights into team performance, including correlations of the information dynamics to the scoreline. We then study the communication between processes with latent nonlinear dynamics that are observed only through a filter. By using methods from differential topology, we show that the information-theoretic measures commonly used to infer communication in observed systems can also be used in certain partially observed systems. For robotic environmental monitoring, the quality of data depends on the placement of sensors. These locations can be improved either by better estimating the quality of future viewpoints or by deploying a team of robots to operate concurrently. By robustly handling the uncertainty of sensor model measurements, we are able to present the first end-to-end robotic system for autonomously tracking small dynamic animals, with a performance comparable to human trackers. We then address the issue of coordinating multi-robot systems through distributed optimisation techniques. These allow us to develop non-myopic robot trajectories for these tasks and, importantly, to show that these algorithms provide guarantees on convergence rates to the optimal payoff sequence.
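    One standard way to formalise the "quality of future viewpoints" mentioned above (a generic formulation, not necessarily the thesis's exact objective) is to choose the sensing action whose measurement carries the most mutual information about the monitored state X:

        a^{\star} = \arg\max_{a \in \mathcal{A}} I(X; Z_a) = \arg\max_{a \in \mathcal{A}} \big[ H(X) - H(X \mid Z_a) \big]

    i.e. the action expected to most reduce uncertainty about the state. For monotone submodular objectives of this kind, greedy selection retains the classic (1 - 1/e) approximation guarantee, which is related in spirit to the convergence guarantees the abstract mentions.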