
    Limit theorems for the sample entropy of hidden Markov chains

    The Shannon-McMillan-Breiman theorem asserts that the sample entropy of a stationary and ergodic stochastic process converges almost surely to the entropy rate of the process as the sample size tends to infinity. In this paper, we restrict our attention to the convergence behavior of the sample entropy of hidden Markov chains. Under certain positivity assumptions, we prove a central limit theorem (CLT) with a Berry-Esseen bound for the sample entropy of a hidden Markov chain, and we use this CLT to establish a law of the iterated logarithm (LIL) for the sample entropy. © 2011 IEEE.
    The 2011 IEEE International Symposium on Information Theory (ISIT), St. Petersburg, Russia, 31 July-5 August 2011. In Proceedings of ISIT, 2011, p. 3009-301
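
    A minimal sketch of the statistic in question: the sample entropy -(1/n) log p(y_1, ..., y_n) of a hidden Markov chain can be computed with a normalized forward pass, and by the Shannon-McMillan-Breiman theorem it converges almost surely to the entropy rate. The two-state chain, emission matrix, and uniform initial distribution below are illustrative assumptions, not the paper's setting.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical two-state hidden Markov chain: transition matrix P,
        # emission matrix B (rows: hidden states, columns: output symbols).
        P = np.array([[0.9, 0.1],
                      [0.2, 0.8]])
        B = np.array([[0.7, 0.3],
                      [0.4, 0.6]])

        def sample_hmm(n):
            """Draw a length-n observation sequence from the chain."""
            x = rng.integers(2)
            ys = np.empty(n, dtype=int)
            for t in range(n):
                ys[t] = rng.choice(2, p=B[x])
                x = rng.choice(2, p=P[x])
            return ys

        def sample_entropy(ys):
            """-(1/n) log p(y_1, ..., y_n) via a normalized forward pass."""
            alpha = np.full(2, 0.5)        # assumed uniform initial law
            log_p = 0.0
            for y in ys:
                alpha = alpha * B[:, y]    # weight by emission probability
                c = alpha.sum()            # p(y_t | y_1, ..., y_{t-1})
                log_p += np.log(c)
                alpha = (alpha / c) @ P    # normalize, propagate one step
            return -log_p / len(ys)

        print(sample_entropy(sample_hmm(100_000)))  # near the entropy rate

    The CLT of the paper then describes the fluctuations of this statistic around the entropy rate at scale 1/sqrt(n).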

    A Randomized Algorithm for the Capacity of Finite-State Channels

    Inspired by ideas from the field of stochastic approximation, we propose a randomized algorithm to compute the capacity of a finite-state channel with a Markovian input. When the mutual information rate of the channel is concave with respect to the chosen parameterization, the proposed algorithm converges almost surely to the capacity of the channel, with a derived convergence rate. We also discuss the convergence behavior of the algorithm without the concavity assumption.
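
    As a hedged illustration of the stochastic-approximation idea, the sketch below runs a simultaneous-perturbation (SPSA-style) ascent on an estimated mutual information with decreasing gains. For brevity it uses a memoryless Z-channel with a Bernoulli(theta) input rather than a finite-state channel with a Markovian input, so the channel model, batch size, and gain sequences are all illustrative assumptions, not the algorithm of the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        EPS = 0.3  # Z-channel: a transmitted 1 is received as 0 with prob. EPS

        def estimate_mi(theta, n=20_000):
            """Plug-in estimate of I(X;Y) in bits from n simulated channel uses."""
            x = rng.random(n) < theta
            y = x & (rng.random(n) >= EPS)          # only 1 -> 0 errors occur
            pxy = np.array([[np.mean(~x & ~y), np.mean(~x & y)],
                            [np.mean(x & ~y),  np.mean(x & y)]])
            px = pxy.sum(1, keepdims=True)
            py = pxy.sum(0, keepdims=True)
            mask = pxy > 0
            return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

        theta = 0.5
        for k in range(1, 201):
            a_k, c_k = 0.2 / k, 0.1 / k ** 0.25     # decreasing gain sequences
            delta = rng.choice([-1.0, 1.0])         # random perturbation direction
            grad = (estimate_mi(theta + c_k * delta) -
                    estimate_mi(theta - c_k * delta)) / (2 * c_k * delta)
            theta = np.clip(theta + a_k * grad, 0.01, 0.99)

        print(theta, estimate_mi(theta))  # near the capacity-achieving input

    Concavity of the objective, as in the paper's assumption, is what lets such decreasing-gain iterates converge to the global maximum rather than a local one.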

    Consistency of Feature Markov Processes

    We study long-term sequence prediction (forecasting). We approach this by investigating criteria for choosing a compact, useful state representation; the state is supposed to summarize useful information from the history. We want a method that is asymptotically consistent in the sense that it will provably, eventually, only choose between alternatives that satisfy an optimality property related to the criterion used. We extend our work to the case where there is side information that one can take advantage of and, furthermore, we briefly discuss the active setting, where an agent takes actions to achieve desirable outcomes.
    Comment: 16 LaTeX pages
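
    To make the flavor of such a criterion concrete, the simplified stand-in below chooses a Markov order k, so the "state" is the last k symbols, by minimizing a penalized negative log-likelihood. The MDL/BIC-style penalty and the binary test sequence are assumptions for illustration; the criterion analyzed in the paper is more general.

        from collections import Counter
        import math
        import random

        def cost(seq, k, alphabet=2):
            """Negative log-likelihood of a k-th order Markov model plus an
            MDL/BIC-style penalty on the number of free parameters."""
            trans, ctx = Counter(), Counter()
            for t in range(k, len(seq)):
                c = tuple(seq[t - k:t])     # state = last k symbols
                trans[c, seq[t]] += 1
                ctx[c] += 1
            nll = -sum(n * math.log(n / ctx[c]) for (c, s), n in trans.items())
            n_params = (alphabet - 1) * alphabet ** k
            return nll + 0.5 * n_params * math.log(len(seq))

        # A sequence with genuine order-2 structure: x_t = x_{t-2} with noise.
        random.seed(0)
        seq = [0, 1]
        for _ in range(5000):
            seq.append(seq[-2] if random.random() < 0.9 else 1 - seq[-2])

        print(min(range(5), key=lambda k: cost(seq, k)))  # selects k = 2

    A consistency result of the kind described above would guarantee that, as the sample grows, only representations satisfying the relevant optimality property survive such a comparison.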

    How Random is a Coin Toss? Bayesian Inference and the Symbolic Dynamics of Deterministic Chaos

    Symbolic dynamics has proven to be an invaluable tool in analyzing the mechanisms that lead to unpredictability and random behavior in nonlinear dynamical systems. Surprisingly, a discrete partition of continuous state space can produce a coarse-grained description of the behavior that accurately captures the invariant properties of an underlying chaotic attractor. In particular, measures of the rate of information production, the topological and metric entropy rates, can be estimated from the outputs of Markov or generating partitions. Here we develop Bayesian inference for k-th order Markov chains as a method for finding generating partitions and estimating entropy rates from finite samples of discretized data produced by coarse-grained dynamical systems.
    Comment: 8 pages, 1 figure; http://cse.ucdavis.edu/~cmg/compmech/pubs/hrct.ht
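
    A minimal sketch of the plug-in side of such an estimate, under simplifying assumptions (a first-order binary chain and a uniform Dirichlet prior on each transition-matrix row, rather than the paper's full k-th order treatment): form the posterior mean transition matrix from transition counts and evaluate the entropy rate h = -sum_i pi_i sum_j T_ij log2 T_ij. The logistic-map usage is a standard example in which the threshold partition at x = 1/2 is generating and the metric entropy rate is 1 bit per symbol.

        import numpy as np

        def posterior_entropy_rate(seq, n_symbols=2, alpha=1.0):
            """Entropy rate of the posterior-mean transition matrix, given a
            Dirichlet(alpha, ..., alpha) prior on each row."""
            counts = np.full((n_symbols, n_symbols), alpha)  # prior pseudo-counts
            for a, b in zip(seq[:-1], seq[1:]):
                counts[a, b] += 1
            T = counts / counts.sum(axis=1, keepdims=True)   # posterior mean rows
            w, V = np.linalg.eig(T.T)                        # stationary distribution
            pi = np.abs(np.real(V[:, np.argmax(np.real(w))]))
            pi /= pi.sum()
            return float(-np.sum(pi[:, None] * T * np.log2(T)))

        # Symbols from the logistic map coarse-grained by a generating partition.
        x, seq = 0.3, []
        for _ in range(50_000):
            x = 4.0 * x * (1.0 - x)          # logistic map at r = 4
            seq.append(int(x > 0.5))         # partition at x = 1/2
        print(posterior_entropy_rate(seq))   # close to 1 bit per symbol

    With a non-generating partition (e.g., a threshold away from 1/2), the same estimator undershoots the true entropy rate, which is the diagnostic the search for generating partitions exploits.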

    Information Geometry Approach to Parameter Estimation in Markov Chains

    We consider parameter estimation for a Markov chain when the unknown transition matrix belongs to an exponential family of transition matrices. We show that the sample mean of the generator of the exponential family is an asymptotically efficient estimator. Further, we define a curved exponential family of transition matrices and, using a transition-matrix version of the Pythagorean theorem, give an asymptotically efficient estimator for the curved family as well.
    Comment: Appendix D is added
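
    A sketch of the estimator described above, under illustrative assumptions (a two-state chain, a uniform base matrix P0, and a generator g that indicates a state change): the family tilts P0 by exp(theta g(x, x')) and renormalizes via the Perron-Frobenius eigenvector, and the parameter is recovered by matching the stationary mean of g to its sample mean along the observed path.

        import numpy as np
        from scipy.optimize import brentq

        rng = np.random.default_rng(2)
        P0 = np.array([[0.5, 0.5],
                       [0.5, 0.5]])          # base transition matrix (assumption)
        g = np.array([[0.0, 1.0],
                      [1.0, 0.0]])           # generator: indicator of a state change

        def tilt(theta):
            """P_theta(x'|x) = P0(x'|x) e^{theta g(x,x')} v(x') / (lam v(x))."""
            M = P0 * np.exp(theta * g)
            w, V = np.linalg.eig(M)
            i = np.argmax(w.real)
            lam, v = w[i].real, np.abs(V[:, i].real)   # Perron-Frobenius pair
            return M * v[None, :] / (lam * v[:, None])

        def stationary_mean(theta):
            """E_theta[g(X_{t-1}, X_t)] under the stationary law of P_theta."""
            P = tilt(theta)
            w, V = np.linalg.eig(P.T)
            pi = np.abs(V[:, np.argmax(w.real)].real)
            pi /= pi.sum()
            return float(np.sum(pi[:, None] * P * g))

        # Simulate a path at the true parameter, then match moments.
        theta_true, n = 0.8, 20_000
        P = tilt(theta_true)
        x, total = 0, 0.0
        for _ in range(n):
            x_next = rng.choice(2, p=P[x])
            total += g[x, x_next]
            x = x_next
        sample_mean = total / n

        theta_hat = brentq(lambda t: stationary_mean(t) - sample_mean, -5.0, 5.0)
        print(theta_hat)  # close to theta_true = 0.8

    Moment matching coincides with maximum likelihood in exponential families, which is the mechanism behind the asymptotic efficiency claimed above.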