13 research outputs found

    The use of information theory in evolutionary biology

    Full text link
    Information is a key concept in evolutionary biology. Information is stored in biological organisms' genomes and used to generate the organism as well as to maintain and control it. Information is also "that which evolves": when a population adapts to a local environment, information about this environment is fixed in a representative genome. However, when an environment changes, information can be lost. At the same time, information is processed by animal brains to survive in complex environments, and the capacity for information processing also evolves. Here I review applications of information theory to the evolution of proteins as well as to the evolution of information processing in simulated agents that adapt to perform a complex task.
    Comment: 25 pages, 7 figures. To appear in "The Year in Evolutionary Biology", Annals of the New York Academy of Sciences
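The notion of information stored in genomes is commonly made concrete as per-site information content of a sequence alignment: the maximum possible entropy of a site minus its observed entropy. A minimal sketch of that standard calculation, using made-up sequences rather than data from the paper:

```python
from collections import Counter
from math import log2

def site_entropy(column):
    """Shannon entropy (bits) of one alignment column."""
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def site_information(column, alphabet_size=20):
    """Information content of a site: maximum entropy (log2 of the
    alphabet size, 20 for amino acids) minus observed entropy."""
    return log2(alphabet_size) - site_entropy(column)

# Toy alignment of four 5-residue sequences (hypothetical data)
alignment = ["MKVLA", "MKVIA", "MRVLA", "MKVLG"]
columns = ["".join(seq[i] for seq in alignment) for i in range(5)]
info = [site_information(col) for col in columns]
```

Perfectly conserved sites carry the maximal log2(20) ≈ 4.32 bits; variable sites carry less, which is how conserved functional residues show up as information peaks.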

    The Minimal Complexity of Adapting Agents Increases with Fitness

    Get PDF
    What is the relationship between the complexity and the fitness of evolved organisms, whether natural or artificial? It has been asserted, primarily based on empirical data, that the complexity of plants and animals increases as their fitness within a particular environment increases via evolution by natural selection. We simulate the evolution of the brains of simple organisms living in a planar maze that they have to traverse as rapidly as possible. Their connectome evolves over tens of thousands of generations. We evaluate their circuit complexity, using four information-theoretical measures, including one that emphasizes the extent to which any network is an irreducible entity. We find that their minimal complexity increases with their fitness.

    Integrated information increases with fitness in the evolution of animats

    Get PDF
    One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent ("animat") evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its "fit" to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing and integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) with fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data.
    Comment: 27 pages, 8 figures, one supplementary figure. Three supplementary video files available on request. Version commensurate with published text in PLoS Comput. Biol.

    Inferring Coupling of Distributed Dynamical Systems via Transfer Entropy

    Full text link
    In this work, we are interested in structure learning for a set of spatially distributed dynamical systems, where individual subsystems are coupled via latent variables and observed through a filter. We represent this model as a directed acyclic graph (DAG) that characterises the unidirectional coupling between subsystems. Standard approaches to structure learning are not applicable in this framework due to the hidden variables; however, we can exploit the properties of certain dynamical systems to formulate exact methods based on state space reconstruction. We approach the problem by using reconstruction theorems to analytically derive a tractable expression for the KL-divergence of a candidate DAG from the observed dataset. We show this measure can be decomposed as a function of two information-theoretic measures, transfer entropy and stochastic interaction. We then present two mathematically robust scoring functions based on transfer entropy and statistical independence tests. These results support the previously held conjecture that transfer entropy can be used to infer effective connectivity in complex networks.
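Transfer entropy itself has a simple plug-in estimator for discrete time series: how much the source's past reduces uncertainty about the target's next state beyond the target's own past. The sketch below uses history length 1 and toy data; it illustrates the quantity, not the paper's exact estimator:

```python
from collections import Counter
from math import log2

def transfer_entropy(source, target):
    """Plug-in estimate of transfer entropy T_{source -> target} in bits,
    with history length 1 for both processes."""
    # Joint counts over (target_next, target_now, source_now)
    triples = Counter(zip(target[1:], target[:-1], source[:-1]))
    pairs_xy = Counter(zip(target[:-1], source[:-1]))
    pairs_xx = Counter(zip(target[1:], target[:-1]))
    singles = Counter(target[:-1])
    n = len(target) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]          # p(x1 | x0, y0)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

# X copies Y with one step of lag, so Y's past fully determines X's future
y = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
x = [0] + y[:-1]
```

For this lagged copy, `transfer_entropy(y, x)` is positive, while a constant source yields exactly zero, matching the intuition that transfer entropy detects directed coupling.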

    Minimising the Kullback-Leibler divergence for model selection in distributed nonlinear systems

    Full text link
    The Kullback-Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this measure can be decomposed as a function of two information-theoretic measures, transfer entropy and stochastic interaction. More specifically, these measures are applicable when selecting a candidate model for a distributed system, where individual subsystems are coupled via latent variables and observed through a filter. We represent this model as a directed acyclic graph (DAG) that characterises the unidirectional coupling between subsystems. Standard approaches to structure learning are not applicable in this framework due to the hidden variables; however, we can exploit the properties of certain dynamical systems to formulate exact methods based on differential topology. We approach the problem by using reconstruction theorems to derive an analytical expression for the KL divergence of a candidate DAG from the observed dataset. Using this result, we present a scoring function based on transfer entropy to be used as a subroutine in a structure learning algorithm. We then demonstrate its use in recovering the structure of coupled Lorenz and Rössler systems.
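The KL divergence being minimised is elementary for discrete distributions. A minimal sketch for illustration only (it shows the basic quantity, not the paper's decomposition into transfer entropy and stochastic interaction):

```python
from math import log2

def kl_divergence(p, q):
    """D_KL(p || q) in bits for discrete distributions on the same support,
    given as aligned probability lists; terms with p_i = 0 contribute zero."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A biased coin measured against a fair-coin model (hypothetical numbers)
d = kl_divergence([0.9, 0.1], [0.5, 0.5])
```

The divergence is zero exactly when the candidate model matches the data distribution, which is what makes it usable as a model-selection score.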

    Integrated Information in Discrete Dynamical Systems: Motivation and Theoretical Framework

    Get PDF
    This paper introduces a time- and state-dependent measure of integrated information, φ, which captures the repertoire of causal states available to a system as a whole. Specifically, φ quantifies how much information is generated (uncertainty is reduced) when a system enters a particular state through causal interactions among its elements, above and beyond the information generated independently by its parts. Such mathematical characterization is motivated by the observation that integrated information captures two key phenomenological properties of consciousness: (i) there is a large repertoire of conscious experiences so that, when one particular experience occurs, it generates a large amount of information by ruling out all the others; and (ii) this information is integrated, in that each experience appears as a whole that cannot be decomposed into independent parts. This paper extends previous work on stationary systems and applies integrated information to discrete networks as a function of their dynamics and causal architecture. An analysis of basic examples indicates the following: (i) φ varies depending on the state entered by a network, being higher if active and inactive elements are balanced and lower if the network is inactive or hyperactive. (ii) φ varies for systems with identical or similar surface dynamics depending on the underlying causal architecture, being low for systems that merely copy or replay activity states. (iii) φ varies as a function of network architecture. High φ values can be obtained by architectures that conjoin functional specialization with functional integration. Strictly modular and homogeneous systems cannot generate high φ because the former lack integration, whereas the latter lack information. Feedforward and lattice architectures are capable of generating high φ but are inefficient. (iv) In Hopfield networks, φ is low for attractor states and neutral states, but increases if the networks are optimized to achieve tension between local and global interactions. These basic examples appear to match well against neurobiological evidence concerning the neural substrates of consciousness. More generally, φ appears to be a useful metric to characterize the capacity of any physical system to integrate information.
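The "information generated when a system enters a particular state" can be illustrated for a tiny deterministic system: start from a uniform (maximum-entropy) repertoire over prior states and measure how much uncertainty the current state removes. The sketch below shows only this whole-system effective information; φ would additionally subtract the information generated by the parts under the minimum-information partition, which is omitted here:

```python
from math import log2

def effective_information(transition, state):
    """Information (bits) generated when a deterministic system, whose
    dynamics map state s to transition[s], is observed in `state`:
    log2(number of states) minus log2(number of possible causes)."""
    n = len(transition)
    causes = [s for s in range(n) if transition[s] == state]
    if not causes:
        return 0.0  # an unreachable state generates no information
    return log2(n) - log2(len(causes))

# 4-state system: a permutation pins down its cause exactly (2 bits) ...
perm = [1, 2, 3, 0]
# ... while a map collapsing everything onto state 0 generates none,
# mirroring the abstract's point about copying/collapsing dynamics.
collapse = [0, 0, 0, 0]
```

The permutation generates the full log2(4) = 2 bits because exactly one prior state is compatible with each observation; the collapsing map generates 0 bits because the current state rules nothing out.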

    Information-theoretic Reasoning in Distributed and Autonomous Systems

    Get PDF
    The increasing prevalence of distributed and autonomous systems is transforming decision making in industries as diverse as agriculture, environmental monitoring, and healthcare. Despite significant efforts, challenges remain in robustly planning under uncertainty. In this thesis, we present a number of information-theoretic decision rules for improving the analysis and control of complex adaptive systems. We begin with the problem of quantifying the data storage (memory) and transfer (communication) within information processing systems. We develop an information-theoretic framework to study nonlinear interactions within cooperative and adversarial scenarios, solely from observations of each agent's dynamics. This framework is applied to simulations of robotic soccer games, where the measures reveal insights into team performance, including correlations of the information dynamics with the scoreline. We then study the communication between processes with latent nonlinear dynamics that are observed only through a filter. By using methods from differential topology, we show that the information-theoretic measures commonly used to infer communication in observed systems can also be used in certain partially observed systems. For robotic environmental monitoring, the quality of data depends on the placement of sensors. These locations can be improved by either better estimating the quality of future viewpoints or by deploying a team of robots to operate concurrently. By robustly handling the uncertainty of sensor model measurements, we are able to present the first end-to-end robotic system for autonomously tracking small dynamic animals, with a performance comparable to human trackers. We then address the coordination of multi-robot systems through distributed optimisation techniques. These allow us to develop non-myopic robot trajectories for these tasks and, importantly, show that these algorithms provide guarantees for convergence rates to the optimal payoff sequence.

    New Trends in Statistical Physics of Complex Systems

    Get PDF
    A topical research activity in statistical physics concerns the study of complex and disordered systems. Generally, these systems are characterized by an elevated level of interconnection and interaction between the parts, giving rise to a rich structure in the phase space that self-organizes under the control of internal non-linear dynamics. These emergent collective dynamics confer new behaviours on the whole system that are no longer the direct consequence of the properties of the single parts, but rather characterize the whole system as a new entity with its own features, giving rise to new phenomenologies. As highlighted in this collection of papers, the methodologies of statistical physics have become very promising for understanding these new phenomena. This volume groups together 12 research works showing the use of typical tools developed within the framework of statistical mechanics, in non-linear kinetics and information geometry, to investigate emerging features in complex physical and physical-like systems.