    Complexity over Uncertainty in Generalized Representational Information Theory (GRIT): A Structure-Sensitive General Theory of Information

    What is information? Although researchers have used the construct of information liberally to refer to pertinent forms of domain-specific knowledge, relatively few have attempted to generalize and standardize the construct. Shannon and Weaver (1949) offered the best-known attempt at a quantitative generalization, in terms of the number of discriminable symbols required to communicate the state of an uncertain event. This idea, although useful, does not capture the role that structural context and complexity play in the process of understanding an event as being informative. In what follows, we discuss the limitations and futility of any generalization (and particularly Shannon's) that is not based on the way that agents extract patterns from their environment. More specifically, we shall argue that agent concept acquisition, and not the communication of states of uncertainty, lies at the heart of generalized information, and that the best way of characterizing information is via the relative gain or loss in concept complexity that is experienced when a set of known entities (regardless of their nature or domain of origin) changes. We show that Representational Information Theory (RIT) perfectly captures this crucial aspect of information, and we conclude with the first generalization of RIT to continuous domains.
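
    As a concrete illustration of the structure-blindness this abstract criticizes (my sketch, not the paper's): Shannon's measure is a function of symbol probabilities alone, so any two collections of alternatives with matching frequencies receive exactly the same information value, no matter how differently they are structured. A minimal Python sketch:

        import math

        def shannon_entropy(probs):
            # H = -sum_i p_i * log2(p_i), in bits; depends only on the
            # probabilities, never on what the symbols are or how they relate
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # A tightly structured concept (four rotations of one shape) and an
        # arbitrary grab-bag (a shoe, a cloud, a prime, a melody), each uniform
        # over four alternatives: Shannon's measure cannot tell them apart.
        print(shannon_entropy([0.25] * 4))  # 2.0 bits for either collection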

    Information dynamics: patterns of expectation and surprise in the perception of music

    This is a postprint of an article submitted for consideration in Connection Science © 2009 [copyright Taylor & Francis]; Connection Science is available online at: http://www.tandfonline.com/openurl?genre=article&issn=0954-0091&volume=21&issue=2-3&spage=8
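
    The record above carries no abstract, but in this line of work "information dynamics" is standardly quantified by surprisal, the negative log-probability of each event under a listener's predictive model, with high surprisal marking the unexpected notes. A minimal sketch under that assumption (the pitch model and its probabilities below are hypothetical):

        import math

        # Surprisal of each event under a hypothetical first-order pitch model:
        # s_t = -log2 P(x_t | x_{t-1}); larger values mark more surprising notes.
        transition = {
            ("C", "D"): 0.6, ("C", "E"): 0.4,
            ("D", "E"): 0.7, ("D", "C"): 0.3,
            ("E", "C"): 0.8, ("E", "D"): 0.2,
        }
        melody = ["C", "D", "E", "C", "E"]
        for prev, cur in zip(melody, melody[1:]):
            s = -math.log2(transition[(prev, cur)])
            print(f"{prev}->{cur}: surprisal = {s:.2f} bits")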

    Synchronizing to the Environment: Information Theoretic Constraints on Agent Learning

    We show that the way in which the Shannon entropy of sequences produced by an information source converges to the source's entropy rate can be used to monitor how an intelligent agent builds and effectively uses a predictive model of its environment. We introduce natural measures of the environment's apparent memory and of the amounts of information that must be (i) extracted from observations for an agent to synchronize to the environment and (ii) stored by an agent for optimal prediction. If structural properties are ignored, the missed regularities are converted to apparent randomness. Conversely, using representations that assume too much memory results in false predictability.
    Comment: 6 pages, 5 figures, Santa Fe Institute Working Paper 01-03-020, http://www.santafe.edu/projects/CompMech/papers/stte.htm
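
    To make the convergence this abstract describes concrete (my sketch, not the paper's code): for the golden-mean process, a binary Markov source that never emits two 1s in a row, the length-L entropy-rate estimate h(L) = H(L) - H(L-1) starts near 0.918 bits and settles onto the true entropy rate of 2/3 bit as L grows. A self-contained Python sketch:

        import math
        import random

        def block_entropy(seq, L):
            # Empirical Shannon entropy (bits) of the length-L blocks in seq.
            counts = {}
            n = len(seq) - L + 1
            for i in range(n):
                block = tuple(seq[i:i + L])
                counts[block] = counts.get(block, 0) + 1
            return -sum((c / n) * math.log2(c / n) for c in counts.values())

        # Golden-mean source: after a 1 emit 0; after a 0 flip a fair coin.
        random.seed(0)
        seq, x = [], 0
        for _ in range(200_000):
            x = 0 if x == 1 else random.randint(0, 1)
            seq.append(x)

        # h(L) = H(L) - H(L-1) converges to the entropy rate (2/3 bit here);
        # how quickly it converges reflects the source's apparent memory.
        prev = 0.0
        for L in range(1, 9):
            H = block_entropy(seq, L)
            print(L, round(H - prev, 4))
            prev = H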

    Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation

    We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined both by the process's internal organization and by an observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way, we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for block entropy. We also draw out the duality between synchronization properties and a process's controllability. The tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity.
    Comment: 25 pages, 13 figures, 1 table
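
    For reference, the parallel block-entropy hierarchy this abstract alludes to is standardly written as follows (standard definitions from the block-entropy literature, not results of this paper): the entropy rate appears as the discrete derivative of the block entropy H(L), and the excess entropy as a discrete integral over that derivative's transient.

        % Block entropy over length-L words w drawn from alphabet A:
        %   H(L) = -\sum_{w \in \mathcal{A}^L} \Pr(w) \log_2 \Pr(w)
        % Discrete derivative (length-L entropy-rate estimate) and its limit:
        \[
          h_\mu(L) = H(L) - H(L-1), \qquad h_\mu = \lim_{L \to \infty} h_\mu(L)
        \]
        % Discrete integral of the transient part: the excess entropy
        \[
          \mathbf{E} = \sum_{L=1}^{\infty} \bigl[ h_\mu(L) - h_\mu \bigr]
        \]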