64,277 research outputs found

    A framework for the local information dynamics of distributed computation in complex systems

    The nature of distributed computation has often been described in terms of the component operations of universal computation: information storage, transfer and modification. We review the first complete framework that quantifies each of these individual information dynamics on a local scale within a system, and describes the manner in which they interact to create non-trivial computation where "the whole is greater than the sum of the parts". We describe the application of the framework to cellular automata, a simple yet powerful model of distributed computation. This is an important application, because the framework is the first to provide quantitative evidence for several important conjectures about distributed computation in cellular automata: that blinkers embody information storage, particles are information transfer agents, and particle collisions are information modification events. The framework is also shown to contrast the computations conducted by several well-known cellular automata, highlighting the importance of information coherence in complex computation. The results reviewed here provide important quantitative insights into the fundamental nature of distributed computation and the dynamics of complex systems, as well as impetus for the framework to be applied to the analysis and design of other systems. Comment: 44 pages, 8 figures
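    The framework's core quantities (local active information storage and local transfer entropy) can be estimated directly from cellular-automaton runs. The sketch below is only a minimal illustration under simplifying assumptions: rule 54, binary states, a fixed history length k=8, and plug-in probability estimates pooled over all cells. It is not the authors' reference code; toolkits such as JIDT provide proper estimators for these measures.

```python
# Minimal sketch, assuming rule 54, binary states and history length k=8.
# Plug-in estimates pooled over all cells; not the authors' reference code.
import numpy as np

def run_eca(rule, width=200, steps=600, seed=0):
    """Simulate an elementary cellular automaton with periodic boundaries."""
    rng = np.random.default_rng(seed)
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    ca = np.zeros((steps, width), dtype=np.uint8)
    ca[0] = rng.integers(0, 2, width)
    for t in range(1, steps):
        left, centre, right = np.roll(ca[t - 1], 1), ca[t - 1], np.roll(ca[t - 1], -1)
        ca[t] = table[4 * left + 2 * centre + right]
    return ca

def local_active_info_storage(ca, k=8):
    """Local AIS a(i, n+1) = log2 p(x_{n+1} | k-step past) - log2 p(x_{n+1})."""
    steps, width = ca.shape
    past = np.zeros((steps - k, width), dtype=np.int64)
    for j in range(k):                      # encode each cell's k-step past as an integer
        past = past * 2 + ca[j:steps - k + j]
    nxt = ca[k:].astype(int)
    joint, p_past = {}, {}
    p_next = np.zeros(2)
    for p, x in zip(past.ravel(), nxt.ravel()):
        joint[(p, x)] = joint.get((p, x), 0) + 1
        p_past[p] = p_past.get(p, 0) + 1
        p_next[x] += 1
    n = past.size
    local = np.empty(nxt.shape)
    for (i, j), x in np.ndenumerate(nxt):
        cond = joint[(past[i, j], x)] / p_past[past[i, j]]   # p(next | past)
        local[i, j] = np.log2(cond / (p_next[x] / n))
    # Positive values mark stored information (e.g. blinkers and domains);
    # negative, "misinformative" values typically appear where particles arrive.
    return local

ais = local_active_info_storage(run_eca(rule=54), k=8)
print("mean local AIS (bits): %.3f" % ais.mean())
```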

    Identifying Sources and Sinks in the Presence of Multiple Agents with Gaussian Process Vector Calculus

    In systems of multiple agents, identifying the cause of observed agent dynamics is challenging. Often, these agents operate in diverse, non-stationary environments, where models rely on hand-crafted environment-specific features to infer influential regions in the system's surroundings. To overcome the limitations of these inflexible models, we present GP-LAPLACE, a technique for locating sources and sinks from trajectories in time-varying fields. Using Gaussian processes, we jointly infer a spatio-temporal vector field, as well as canonical vector calculus operations on that field. Notably, we do this from only agent trajectories without requiring knowledge of the environment, and also obtain a metric for denoting the significance of inferred causal features in the environment by exploiting our probabilistic method. To evaluate our approach, we apply it to both synthetic and real-world GPS data, demonstrating the applicability of our technique in the presence of multiple agents, as well as its superiority over existing methods. Comment: KDD '18 Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Pages 1254-1262, 9 pages, 5 figures, conference submission, University of Oxford. arXiv admin note: text overlap with arXiv:1709.0235
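    The underlying idea can be illustrated with off-the-shelf tools: regress the velocities observed along trajectories with Gaussian processes, then apply a vector-calculus operator (here the divergence) to the inferred field, so that strongly negative divergence flags sinks and strongly positive divergence flags sources. The sketch below is not the GP-LAPLACE implementation: it assumes a static 2-D field, independent GPs per velocity component, and finite-difference derivatives of the posterior mean rather than the paper's treatment of time variation and uncertainty.

```python
# Minimal sketch, assuming a static 2-D field, independent GPs per velocity
# component, and finite differences of the GP mean. Kernel and grid settings
# are illustrative choices; this is not the GP-LAPLACE implementation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def divergence_from_trajectories(trajs, dt=1.0, grid_n=40):
    """trajs: list of (T_i, 2) arrays of positions sampled every dt seconds."""
    pos, vel = [], []
    for p in trajs:
        vel.append(np.diff(p, axis=0) / dt)   # finite-difference velocity estimates
        pos.append(p[:-1])
    X, V = np.vstack(pos), np.vstack(vel)
    kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
    gp_u = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, V[:, 0])
    gp_v = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, V[:, 1])
    # div F = du/dx + dv/dy, evaluated on a grid by central differences of the mean.
    xs = np.linspace(X[:, 0].min(), X[:, 0].max(), grid_n)
    ys = np.linspace(X[:, 1].min(), X[:, 1].max(), grid_n)
    gx, gy = np.meshgrid(xs, ys)
    pts = np.column_stack([gx.ravel(), gy.ravel()])
    h = 1e-3
    du_dx = (gp_u.predict(pts + [h, 0.0]) - gp_u.predict(pts - [h, 0.0])) / (2 * h)
    dv_dy = (gp_v.predict(pts + [0.0, h]) - gp_v.predict(pts - [0.0, h])) / (2 * h)
    div = (du_dx + dv_dy).reshape(grid_n, grid_n)
    return xs, ys, div   # div < 0 suggests a sink, div > 0 a source

# Example: trajectories contracting towards an attractor at the origin (a sink).
rng = np.random.default_rng(0)
trajs = []
for _ in range(10):
    p = rng.uniform(-2, 2, size=2)
    path = [p.copy()]
    for _ in range(60):
        p += 0.1 * (-p + 0.05 * rng.standard_normal(2))   # noisy contraction
        path.append(p.copy())
    trajs.append(np.array(path))
xs, ys, div = divergence_from_trajectories(trajs, dt=0.1)
print("most negative divergence (candidate sink):", div.min())
```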

    Probability as a physical motive

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics. Comment: Replaced at the request of the publisher. Minor corrections to references and to Equation 1 added

    Maximum Entropy Models of Shortest Path and Outbreak Distributions in Networks

    Properties of networks are often characterized in terms of features such as node degree distributions, average path lengths, diameters, or clustering coefficients. Here, we study shortest path length distributions. On the one hand, average as well as maximum distances can be determined therefrom; on the other hand, they are closely related to the dynamics of network spreading processes. Because of the combinatorial nature of networks, we apply maximum entropy arguments to derive a general, physically plausible model. In particular, we establish the generalized Gamma distribution as a continuous characterization of shortest path length histograms of networks of arbitrary topology. Experimental evaluations corroborate our theoretical results.
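    The claim is easy to probe empirically: build a shortest-path-length histogram for a graph and fit a generalized Gamma distribution to it. The sketch below does this for an illustrative Barabasi-Albert graph using scipy's gengamma parameterization; the graph model, sizes, and sampling of source nodes are choices made for this example, not the paper's experimental setup.

```python
# Minimal sketch, assuming a Barabasi-Albert graph and scipy's gengamma
# parameterization; graph model, sizes and source sampling are illustrative.
import networkx as nx
import numpy as np
from scipy import stats

G = nx.barabasi_albert_graph(n=2000, m=3, seed=1)

# Shortest-path-length histogram, using a sample of source nodes to keep it cheap.
rng = np.random.default_rng(1)
sources = rng.choice(G.number_of_nodes(), size=200, replace=False)
lengths = []
for s in sources:
    dists = nx.single_source_shortest_path_length(G, int(s))
    lengths.extend(d for d in dists.values() if d > 0)
lengths = np.asarray(lengths, dtype=float)

# Continuous generalized Gamma fit to the (discrete) distance histogram.
a, c, loc, scale = stats.gengamma.fit(lengths, floc=0)
print(f"generalized Gamma fit: a={a:.2f}, c={c:.2f}, scale={scale:.2f}")
print(f"average path length ~ {lengths.mean():.2f}, sampled maximum distance = {int(lengths.max())}")
```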

    General anesthesia reduces complexity and temporal asymmetry of the informational structures derived from neural recordings in Drosophila

    We apply techniques from the field of computational mechanics to evaluate the statistical complexity of neural recording data from fruit flies. First, we connect statistical complexity to the flies' level of conscious arousal, which is manipulated by general anesthesia (isoflurane). We show that the complexity of even single channel time series data decreases under anesthesia. The observed difference in complexity between the two states of conscious arousal increases as higher orders of temporal correlations are taken into account. We then go on to show that, in addition to reducing complexity, anesthesia also modulates the informational structure between the forward- and reverse-time neural signals. Specifically, using three distinct notions of temporal asymmetry we show that anesthesia reduces temporal asymmetry on information-theoretic and information-geometric grounds. In contrast to prior work, our results show that: (1) Complexity differences can emerge at very short timescales and across broad regions of the fly brain, thus heralding the macroscopic state of anesthesia in a previously unforeseen manner, and (2) that general anesthesia also modulates the temporal asymmetry of neural signals. Together, our results demonstrate that anesthetized brains become both less structured and more reversible. Comment: 14 pages, 6 figures. Comments welcome; Added time-reversal analysis, updated discussion, new figures (Fig. 5 & Fig. 6) and Tables (Tab. 1
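    As a rough intuition for what "temporal asymmetry of neural signals" means operationally, one can compare the transition statistics of a discretised time series with those of its time-reversed copy; a statistically reversible signal gives near-identical statistics in both directions. The toy sketch below does exactly that with a first-order Markov approximation and a Kullback-Leibler divergence. It is not one of the three asymmetry measures used in the paper, nor a statistical-complexity estimator; it only illustrates the forward/reverse comparison on synthetic data.

```python
# Toy sketch: first-order Markov approximation of a quantile-discretised signal,
# compared against its time reversal with a KL divergence. Not one of the
# paper's asymmetry measures; binning and state count are arbitrary choices.
import numpy as np

def irreversibility_index(x, n_states=4, eps=1e-6):
    edges = np.quantile(x, np.linspace(0, 1, n_states + 1)[1:-1])
    s = np.digitize(x, edges)                      # discretise by quantiles
    def transition_matrix(seq):
        counts = np.full((n_states, n_states), eps)
        np.add.at(counts, (seq[:-1], seq[1:]), 1.0)
        return counts / counts.sum(axis=1, keepdims=True)
    P_fwd = transition_matrix(s)
    P_rev = transition_matrix(s[::-1])
    occ = np.bincount(s[:-1], minlength=n_states) / (len(s) - 1)
    # Occupation-weighted KL(P_fwd || P_rev); 0 for a statistically reversible signal.
    return float(np.sum(occ[:, None] * P_fwd * np.log2(P_fwd / P_rev)))

# White noise is (near-)reversible; a sawtooth-like relaxation process is not.
rng = np.random.default_rng(0)
noise = rng.standard_normal(20000)
relax = np.concatenate([np.linspace(1.0, 0.0, 50) + 0.05 * rng.standard_normal(50)
                        for _ in range(400)])
print("white noise:        ", irreversibility_index(noise))
print("relaxation sawtooth:", irreversibility_index(relax))
```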

    Perspectives on the Neuroscience of Cognition and Consciousness

    The origin and current use of the concepts of computation, representation and information in Neuroscience are examined and conceptual flaws are identified which vitiate their usefulness for addressing problems of the neural basis of Cognition and Consciousness. In contrast, a convergence of views is presented to support the characterization of the Nervous System as a complex dynamical system operating in the metastable regime, and capable of evolving to configurations and transitions in phase space with potential relevance for Cognition and Consciousness

    Metastability, Criticality and Phase Transitions in brain and its Models

    This essay extends the previously deposited paper "Oscillations, Metastability and Phase Transitions" to incorporate the theory of Self-organizing Criticality. The twin concepts of Scaling and Universality from the theory of nonequilibrium phase transitions are applied to the role of reentrant activity in neural circuits of cerebral cortex and subcortical neural structures.