
    Some Aspects of Finite State Channel related to Hidden Markov Process

    We have no satisfactory capacity formula for most finite-state channels. Here we consider some instructive examples of finite-state channels, such as the Gilbert-Elliott channel and the trapdoor channel, to reveal the special character of the problems and the difficulties in determining their capacities. We also give a simple expression for the capacity of the Gilbert-Elliott channel, using a hidden Markov source as the optimal input process. This idea should extend to other finite-state channels.
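
    To illustrate why the hidden state makes these channels hard, here is a minimal numerical sketch (parameters and variable names are illustrative assumptions, not the paper's): it simulates a Gilbert-Elliott channel and estimates the entropy rate of its hidden-Markov noise process with the forward recursion. Because the noise is independent of the input, an i.i.d. uniform input achieves rate 1 - H(Z), where H(Z) is that entropy rate.

```python
import numpy as np

# Gilbert-Elliott channel: a two-state Markov chain (Good/Bad), each state a
# binary symmetric channel. All parameters below are illustrative assumptions.
b, g = 0.1, 0.3            # P(Good -> Bad), P(Bad -> Good)
pG, pB = 0.01, 0.4         # crossover probability in each state

P = np.array([[1 - b, b],
              [g, 1 - g]])         # state transitions, row = current state
pe = np.array([pG, pB])            # error probability per state

rng = np.random.default_rng(0)
n = 200_000

# Simulate the hidden state chain and the binary noise process Z.
s, z = 0, np.empty(n, dtype=np.uint8)
for t in range(n):
    z[t] = rng.random() < pe[s]
    s = rng.choice(2, p=P[s])

# Entropy rate of Z via the forward (state-belief) recursion:
# h(Z) ~ -(1/n) log2 P(z_1 ... z_n), accumulated stepwise for stability.
alpha = np.array([g, b]) / (g + b)   # stationary belief over hidden states
log_prob = 0.0
for t in range(n):
    obs = pe if z[t] else 1 - pe     # per-state probability of this noise bit
    joint = alpha * obs
    step = joint.sum()               # P(z_t | z_1 ... z_{t-1})
    log_prob += np.log2(step)
    alpha = (joint / step) @ P       # condition on z_t, then advance one step

h_noise = -log_prob / n
print(f"noise entropy rate ~ {h_noise:.4f} bits")
print(f"rate with i.i.d. uniform input: 1 - h(Z) ~ {1 - h_noise:.4f} bits/use")
```

    The forward recursion is exactly where the difficulty lives: the belief state `alpha` takes values in a continuum, which is why no closed-form capacity formula is known for general finite-state channels.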

    Synchronizing to the Environment: Information Theoretic Constraints on Agent Learning

    We show that the way in which the Shannon entropy of sequences produced by an information source converges to the source's entropy rate can be used to monitor how an intelligent agent builds and effectively uses a predictive model of its environment. We introduce natural measures of the environment's apparent memory and the amounts of information that must be (i) extracted from observations for an agent to synchronize to the environment and (ii) stored by an agent for optimal prediction. If structural properties are ignored, the missed regularities are converted to apparent randomness. Conversely, using representations that assume too much memory results in false predictability.
    Comment: 6 pages, 5 figures, Santa Fe Institute Working Paper 01-03-020, http://www.santafe.edu/projects/CompMech/papers/stte.htm
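
    A minimal sketch of the central quantity (the source, sample size, and function names are assumptions for illustration, not the paper's): the plug-in block entropies H(L) of a simulated binary Markov chain, whose increments h(L) = H(L) - H(L-1) converge to the entropy rate from above. The summed gap over L is one version of the information an agent must store to predict optimally.

```python
import numpy as np
from collections import Counter

# Simulate a binary Markov source (illustrative transition probabilities).
p01, p10 = 0.2, 0.5                 # P(0 -> 1), P(1 -> 0)
rng = np.random.default_rng(1)
n = 100_000
x = np.empty(n, dtype=np.uint8)
x[0] = 0
for t in range(1, n):
    flip = p01 if x[t - 1] == 0 else p10
    x[t] = x[t - 1] ^ (rng.random() < flip)

def block_entropy(seq, L):
    """Empirical Shannon entropy (bits) of the L-block distribution."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    f = np.array(list(counts.values()), dtype=float)
    f /= f.sum()
    return -(f * np.log2(f)).sum()

# Exact entropy rate of the chain, for comparison.
def H2(p):
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)
pi0 = p10 / (p01 + p10)              # stationary probability of symbol 0
h_mu = pi0 * H2(p01) + (1 - pi0) * H2(p10)

prev = 0.0
for L in range(1, 9):
    H_L = block_entropy(x, L)
    print(f"L={L}  H(L)={H_L:.4f}  h(L)={H_L - prev:.4f}  h_mu={h_mu:.4f}")
    prev = H_L
```

    For this order-1 Markov source, h(L) reaches h_mu already at L = 2 (up to estimation error); sources with longer apparent memory converge more slowly, which is what the abstract's measures quantify.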

    Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation

    We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined by both the process's internal organization and by an observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way, we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for block entropy. We also draw out the duality between synchronization properties and a process's controllability. The tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity.
    Comment: 25 pages, 13 figures, 1 table
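
    One of the representation properties named above, unifilarity, is easy to make concrete. The sketch below is an illustration under assumed conventions (not the paper's code): a hidden Markov presentation is encoded as symbol-labeled transition matrices T[x][i, j] = P(next state j, emit x | state i), and it is unifilar when each (state, symbol) pair determines at most one next state, which is what lets an observer track a single state rather than a belief distribution.

```python
import numpy as np

def is_unifilar(T, tol=1e-12):
    """True iff every (state, symbol) row has at most one positive entry."""
    return all(
        np.count_nonzero(row > tol) <= 1
        for Tx in T for row in Tx
    )

# The even process, a standard unifilar example on states {A, B}, symbols {0, 1}:
# A emits 0 and stays (prob 1/2) or emits 1 and moves to B (prob 1/2);
# B emits 1 and returns to A with probability 1.
T_even = [
    np.array([[0.5, 0.0], [0.0, 0.0]]),     # symbol 0
    np.array([[0.0, 0.5], [1.0, 0.0]]),     # symbol 1
]
print(is_unifilar(T_even))   # True

# A non-unifilar presentation: from state A, symbol 0 may lead to A or to B.
T_bad = [
    np.array([[0.25, 0.25], [0.0, 0.5]]),   # symbol 0
    np.array([[0.5, 0.0], [0.5, 0.0]]),     # symbol 1
]
print(is_unifilar(T_bad))    # False
```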

    Error threshold in optimal coding, numerical criteria and classes of universalities for complexity

    The free energy of the Random Energy Model at the transition point between the ferromagnetic and spin glass phases is calculated. At this point, equivalent to the decoding error threshold in optimal codes, the free energy has finite-size corrections proportional to the square root of the number of degrees of freedom. The response of the magnetization to the ferromagnetic couplings is maximal when the magnetization equals one half. We give several criteria of complexity and define different universality classes. According to our classification, in the lowest class of complexity are random graphs, Markov models, and hidden Markov models. At the next level is the Sherrington-Kirkpatrick spin glass, connected with neural-network models. On a higher level are critical theories, the spin glass phase of the Random Energy Model, percolation, and self-organized criticality (SOC). The top class involves HOT design, the error threshold in optimal coding, language, and perhaps financial markets. Living systems are also related to this last class. A concept of anti-resonance is suggested for complex systems.
    Comment: 17 pages
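
    For orientation, the textbook thermodynamic-limit free energy of the Random Energy Model, whose transition the abstract builds on, can be stated as follows; the normalization assumed here (2^N i.i.d. Gaussian levels with variance N J^2 / 2, Derrida's convention) may differ from the paper's.

```latex
% Free energy per spin of the Random Energy Model in the thermodynamic limit,
% assuming the standard normalization <E^2> = N J^2 / 2; the glass transition
% at T_c is the point the abstract identifies with the decoding error threshold.
\[
  f(T) =
  \begin{cases}
    -\,T\ln 2 - \dfrac{J^{2}}{4T}, & T \ge T_{c},\\[6pt]
    -\,J\sqrt{\ln 2},              & T \le T_{c},
  \end{cases}
  \qquad
  T_{c} = \frac{J}{2\sqrt{\ln 2}} .
\]
```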

    Consistency of maximum likelihood estimation for some dynamical systems

    We consider the asymptotic consistency of maximum likelihood parameter estimation for dynamical systems observed with noise. Under suitable conditions on the dynamical systems and the observations, we show that maximum likelihood parameter estimation is consistent. Our proof involves ideas from both information theory and dynamical systems. Furthermore, we show how some well-studied properties of dynamical systems imply the general statistical properties related to maximum likelihood estimation. Finally, we exhibit classical families of dynamical systems for which maximum likelihood estimation is consistent. Examples include shifts of finite type with Gibbs measures and Axiom A attractors with SRB measures.
    Comment: Published at http://dx.doi.org/10.1214/14-AOS1259 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
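
    The sketch below illustrates the setting, not the paper's proof or examples: a chaotic logistic map observed through additive Gaussian noise, with the parameter recovered by maximizing the likelihood over a grid. The specific map, noise level, and known initial condition are simplifying assumptions.

```python
import numpy as np

# Logistic map x_{t+1} = a x_t (1 - x_t) observed with Gaussian noise;
# recover a by grid-search maximum likelihood (illustrative parameters).
rng = np.random.default_rng(2)
a_true, x0, sigma, n = 3.8, 0.3, 0.05, 200

def trajectory(a, x0, n):
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = a * x[t - 1] * (1 - x[t - 1])
    return x

y = trajectory(a_true, x0, n) + sigma * rng.normal(size=n)

def log_likelihood(a):
    r = y - trajectory(a, x0, n)
    return -0.5 * np.sum(r ** 2) / sigma ** 2   # Gaussian, up to a constant

# Chaos makes the likelihood surface extremely rugged in a, so a fine grid
# (rather than a gradient method) is used here.
grid = np.linspace(3.5, 4.0, 2001)
a_hat = grid[np.argmax([log_likelihood(a) for a in grid])]
print(f"true a = {a_true}, MLE estimate = {a_hat:.4f}")
```

    The ruggedness of this likelihood surface, caused by sensitive dependence on the parameter, is one reason consistency results of this kind require genuine input from dynamical systems theory rather than standard smoothness arguments.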