
    Novel Lower Bounds on the Entropy Rate of Binary Hidden Markov Processes

    Recently, Samorodnitsky proved a strengthened version of Mrs. Gerber's Lemma, where the output entropy of a binary symmetric channel is bounded in terms of the average entropy of the input projected on a random subset of coordinates. Here, this result is applied for deriving novel lower bounds on the entropy rate of binary hidden Markov processes. For symmetric underlying Markov processes, our bound improves upon the best known bound in the very noisy regime. The nonsymmetric case is also considered, and explicit bounds are derived for Markov processes that satisfy the (1,∞)-RLL constraint.
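    The object these abstracts study can be made concrete with a small numerical sketch. The snippet below (an illustration, not any paper's method; the parameters p and eps are arbitrary choices) computes the exact block entropy of a symmetric binary Markov chain observed through a binary symmetric channel, using the forward algorithm; the conditional entropies H(Y_n | Y_1..Y_{n-1}) then form a decreasing sequence of upper bounds on the entropy rate.

    ```python
    import itertools
    import numpy as np

    def block_entropy(n, p, eps):
        """H(Y_1..Y_n) in bits for a symmetric binary Markov chain
        (flip probability p) observed through a BSC (crossover eps),
        computed exactly with the forward algorithm."""
        P = np.array([[1 - p, p], [p, 1 - p]])          # Markov transitions
        E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # BSC emission probs
        pi = np.array([0.5, 0.5])                       # stationary distribution
        H = 0.0
        for y in itertools.product([0, 1], repeat=n):
            alpha = pi * E[:, y[0]]                     # joint P(x_1, y_1)
            for sym in y[1:]:
                alpha = (alpha @ P) * E[:, sym]         # forward recursion
            prob = alpha.sum()                          # P(y_1..y_n)
            if prob > 0:
                H -= prob * np.log2(prob)
        return H

    # H(Y_n | Y_1..Y_{n-1}) = H(Y_1..Y_n) - H(Y_1..Y_{n-1}) decreases
    # monotonically to the entropy rate, giving tighter upper bounds.
    p, eps = 0.1, 0.3
    upper = [block_entropy(n, p, eps) - block_entropy(n - 1, p, eps)
             for n in range(2, 8)]
    ```

    Exhaustive enumeration is exponential in n, which is exactly why closed-form bounds and expansions like the ones above are of interest.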

    Taylor series expansions for the entropy rate of Hidden Markov Processes

    Finding the entropy rate of Hidden Markov Processes is an active research topic, of both theoretical and practical importance. A recently used approach is studying the asymptotic behavior of the entropy rate in various regimes. In this paper we generalize and prove a previous conjecture relating the entropy rate to entropies of finite systems. Building on our new theorems, we establish series expansions for the entropy rate in two different regimes. We also study the radius of convergence of the two series expansions.
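    The relation between finite-system quantities and the entropy rate can also be seen empirically: by the Shannon-McMillan-Breiman theorem, the normalized negative log-likelihood of one long sample path converges to the entropy rate. A generic sketch (illustrative parameters, not tied to any specific paper above):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_hmm(n, p, eps):
        """Sample a length-n output of a symmetric binary Markov chain
        (flip probability p) observed through a BSC (crossover eps)."""
        x = int(rng.integers(0, 2))
        ys = []
        for _ in range(n):
            ys.append(x ^ int(rng.random() < eps))  # noisy observation
            x ^= int(rng.random() < p)              # advance the chain
        return ys

    def neg_log_prob(ys, p, eps):
        """-log2 P(y_1..y_n) via the forward algorithm, normalizing the
        state distribution at every step for numerical stability."""
        P = np.array([[1 - p, p], [p, 1 - p]])
        E = np.array([[1 - eps, eps], [eps, 1 - eps]])
        alpha = np.array([0.5, 0.5])       # P(x_t | past observations)
        nll = 0.0
        for y in ys:
            alpha = alpha * E[:, y]        # condition on y_t
            s = alpha.sum()                # = P(y_t | y_1..y_{t-1})
            nll -= np.log2(s)
            alpha = (alpha / s) @ P        # predict x_{t+1}
        return nll

    # -log2 P(Y_1..Y_n) / n converges to the entropy rate as n grows.
    p, eps, n = 0.1, 0.3, 200_000
    est = neg_log_prob(sample_hmm(n, p, eps), p, eps) / n
    ```

    Monte Carlo estimates like this converge slowly, which motivates the exact series expansions the abstract describes.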

    The Entropy of a Binary Hidden Markov Process

    The entropy of a binary symmetric Hidden Markov Process is calculated as an expansion in the noise parameter epsilon. We map the problem onto a one-dimensional Ising model in a large field of random signs and calculate the expansion coefficients up to second order in epsilon. Using a conjecture we extend the calculation to 11th order and discuss the convergence of the resulting series.

    Entropy rate calculations of algebraic measures

    Let K = \{0,1,...,q-1\}. We use a special class of translation invariant measures on K^\mathbb{Z} called algebraic measures to study the entropy rate of hidden Markov processes. Under some irreducibility assumptions on the Markov transition matrix we derive exact formulas for the entropy rate of a general q-state hidden Markov process derived from a Markov source corrupted by a specific noise model. We obtain upper bounds on the error when using an approximation to the formulas and numerically compute the entropy rates of two and three state hidden Markov models.

    Derivatives of Entropy Rate in Special Families of Hidden Markov Chains

    Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Zuk et al. [13], [14] showed how, in principle, one can explicitly compute the derivatives of the entropy rate at extreme values of the noise. Namely, they showed that the derivatives of standard upper approximations to the entropy rate actually stabilize at an explicit finite time. We generalize this result to a natural class of hidden Markov chains called ``Black Holes.'' We also discuss in depth special cases of binary Markov chains observed in binary symmetric noise, and give an abstract formula for the first derivative in terms of a measure on the simplex due to Blackwell.
    Comment: The relaxed conditions for entropy rate and examples are taken out (to be part of another paper). The section about general principle and an example to determine the domain of analyticity is taken out (to be part of another paper). A section about binary Markov chains corrupted by binary symmetric noise is added.
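    Blackwell's measure, mentioned at the end of this abstract, is the stationary distribution of the predictive belief process; the entropy rate equals the expected conditional output entropy under it. A hedged sketch of this idea (generic belief recursion along one simulated path, with illustrative parameters; not the paper's construction):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def h2(q):
        """Binary entropy in bits."""
        return 0.0 if q in (0.0, 1.0) else -q * np.log2(q) - (1 - q) * np.log2(1 - q)

    def blackwell_estimate(n, p, eps):
        """Estimate the entropy rate as the average of H(Y_t | belief),
        where the belief P(X_t | Y_1..Y_{t-1}) is updated by the forward
        recursion along one simulated path.  The visited beliefs sample
        (approximately) Blackwell's measure on the simplex."""
        P = np.array([[1 - p, p], [p, 1 - p]])
        E = np.array([[1 - eps, eps], [eps, 1 - eps]])
        x = int(rng.integers(0, 2))
        w = np.array([0.5, 0.5])            # belief over the hidden state
        total = 0.0
        for _ in range(n):
            y = x ^ int(rng.random() < eps)  # noisy observation of x
            total += h2(w @ E[:, 1])         # H(Y_t | current belief)
            w = w * E[:, y]
            w = (w / w.sum()) @ P            # condition on y, then step
            x ^= int(rng.random() < p)       # advance the hidden chain
        return total / n

    est = blackwell_estimate(100_000, 0.1, 0.3)
    ```

    Averaging the belief-conditional entropy, rather than the raw log-likelihood, is a variance-reduced form of the same Monte Carlo estimate.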

    Asymptotics of input-constrained binary symmetric channel capacity

    We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman [In Proceedings of IEEE Information Theory Workshop (2004) 117--122], we derive an asymptotic formula (when the noise parameter is small) for the entropy rate of a hidden Markov chain, observed when a Markov chain passes through a BSC. Using this result, we establish an asymptotic formula for the capacity of a BSC with input process supported on an irreducible finite type constraint, as the noise parameter tends to zero.
    Comment: Published in the Annals of Applied Probability (http://www.imstat.org/aap/), http://dx.doi.org/10.1214/08-AAP570, by the Institute of Mathematical Statistics (http://www.imstat.org).
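    In the zero-noise limit, the noisy constrained capacity reduces to the combinatorial (Shannon) capacity of the constraint, computable as the log of the Perron eigenvalue of the constraint graph's adjacency matrix. A minimal sketch for the (1,∞)-RLL constraint mentioned earlier in this listing:

    ```python
    import numpy as np

    # Adjacency matrix of the (1,inf)-RLL constraint graph (no two
    # consecutive 1s): state 0 = last bit was 0, state 1 = last bit was 1.
    # From state 0 one may emit 0 or 1; from state 1 only 0.
    A = np.array([[1, 1],
                  [1, 0]])

    # Shannon capacity of the constrained system = log2 of the spectral
    # radius of A; here that radius is the golden ratio (1 + sqrt(5)) / 2.
    cap = np.log2(max(abs(np.linalg.eigvals(A))))  # about 0.694 bits/symbol
    ```

    The asymptotic formulas in the abstract describe how the noisy capacity deviates from this noiseless value as the crossover probability tends to zero.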