
    Entropy rate of continuous-state hidden Markov chains

    We prove that, under mild positivity assumptions, the entropy rate of a continuous-state hidden Markov chain, observed when passing a finite-state Markov chain through a discrete-time continuous-output channel, is analytic as a function of the transition probabilities of the underlying Markov chain. We further prove that the entropy rate of a continuous-state hidden Markov chain, observed when passing a mixing finite-type constrained Markov chain through a discrete-time Gaussian channel, is smooth as a function of the transition probabilities of the underlying Markov chain. © 2010 IEEE. The IEEE International Symposium on Information Theory (ISIT 2010), Austin, TX, 13-18 June 2010. In Proceedings of ISIT, 2010, p. 1468-147
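    A minimal sketch of the object being computed: for a finite-state chain observed through a memoryless Gaussian channel, the exact forward recursion still applies (only the emission densities are continuous), so the differential entropy rate can be estimated via the Shannon-McMillan-Breiman limit -(1/n) log p(Y_1, ..., Y_n). All parameter values below (P, the per-state means, sigma) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Monte Carlo estimate of the differential entropy rate of a two-state
# Markov chain observed through a memoryless Gaussian channel, using
# h = lim -(1/n) log p(Y_1..Y_n) (Shannon-McMillan-Breiman).
# All parameters are illustrative assumptions.

rng = np.random.default_rng(0)

P = np.array([[0.9, 0.1],       # hidden-chain transition matrix (assumed)
              [0.2, 0.8]])
means = np.array([-1.0, 1.0])   # channel output mean per hidden state
sigma = 0.5                     # Gaussian channel noise std
n = 100_000

# stationary distribution of P (left eigenvector for eigenvalue 1)
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

# simulate the hidden chain and its noisy observations
x = np.empty(n, dtype=int)
x[0] = rng.choice(2, p=pi)
for t in range(1, n):
    x[t] = rng.choice(2, p=P[x[t - 1]])
y = means[x] + sigma * rng.standard_normal(n)

def emission_pdf(y_t):
    """Gaussian channel density p(y_t | X_t = i), vectorized over states."""
    return np.exp(-0.5 * ((y_t - means) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# normalized forward recursion accumulating log p(y_1..y_n)
alpha, log_lik = pi.copy(), 0.0
for t in range(n):
    alpha = alpha * emission_pdf(y[t])   # joint with current observation
    s = alpha.sum()                      # p(y_t | y_1..y_{t-1})
    log_lik += np.log(s)
    alpha = (alpha / s) @ P              # predict the next hidden state

print("estimated differential entropy rate (nats):", -log_lik / n)
```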

    Analyticity of Entropy Rate of Hidden Markov Chains

    We prove that, under mild positivity assumptions, the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle for determining the domain of analyticity is stated. An example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity assumptions can be relaxed, and examples are given for the relaxed conditions. We study a special class of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, and we give necessary and sufficient conditions for analyticity of the entropy rate in this case. Finally, we show that under the positivity assumptions the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
    Comment: The title has been changed. The new main theorem now combines the old main theorem and the remark following it. A new section is added as an introduction to complex analysis. A general principle and an example for determining the domain of analyticity of the entropy rate have been added. Relaxed conditions for analyticity of the entropy rate, with corresponding examples, are added. The section about a binary Markov chain corrupted by binary symmetric noise is taken out (to be part of another paper).
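    For reference, the quantity whose analyticity is asserted is the entropy rate of the observed process; the notation below is assumed here, not taken from the paper.

```latex
% Entropy rate of the observed (hidden Markov) process Y = (Y_n):
\[
  H(Y) \;=\; \lim_{n\to\infty} \frac{1}{n}\, H(Y_1,\dots,Y_n)
        \;=\; \lim_{n\to\infty} H(Y_n \mid Y_{n-1},\dots,Y_1).
\]
% The theorem says the map (p_{ij}) -> H(Y), where (p_{ij}) are the
% transition probabilities of the underlying Markov chain, is real-analytic
% near any positive transition matrix, i.e. it extends to a complex-analytic
% function on a neighborhood in parameter space.
```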

    Entropy rate calculations of algebraic measures

    Let K = {0,1,...,q-1}. We use a special class of translation-invariant measures on K^Z, called algebraic measures, to study the entropy rate of hidden Markov processes. Under some irreducibility assumptions on the Markov transition matrix, we derive exact formulas for the entropy rate of a general q-state hidden Markov process derived from a Markov source corrupted by a specific noise model. We obtain upper bounds on the error incurred when using an approximation to the formulas, and we numerically compute the entropy rates of two- and three-state hidden Markov models.
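    Such entropy rates can be sanity-checked by brute force on a small model: for a stationary process the conditional block entropies H(Y_n | Y_{n-1}, ..., Y_1) decrease monotonically to the entropy rate. The two-state chain and symmetric noise matrix below are assumptions for illustration, not the paper's noise model.

```python
import itertools
import numpy as np

# Brute-force the upper approximations H(Y_m | Y_1..Y_{m-1}) >= entropy rate
# for a small binary hidden Markov model. All parameters are assumed.

P = np.array([[0.7, 0.3],       # hidden-chain transitions
              [0.4, 0.6]])
E = np.array([[0.9, 0.1],       # emission (noise) matrix p(y | x)
              [0.1, 0.9]])

w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

def p_obs(y_seq):
    """p(y_1, ..., y_m) via the forward algorithm."""
    alpha = pi * E[:, y_seq[0]]
    for y in y_seq[1:]:
        alpha = (alpha @ P) * E[:, y]
    return alpha.sum()

def block_entropy(m):
    """H(Y_1, ..., Y_m) by exhaustive enumeration (fine for small m)."""
    probs = [p_obs(seq) for seq in itertools.product(range(2), repeat=m)]
    return -sum(p * np.log(p) for p in probs if p > 0)

for m in range(2, 9):
    # H(Y_m | Y_1..Y_{m-1}) = H(Y_1..Y_m) - H(Y_1..Y_{m-1})
    print(m, block_entropy(m) - block_entropy(m - 1))
```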

    Consistency of the maximum likelihood estimator for general hidden Markov models

    Consider a parametrized family of general hidden Markov models, where both the observed and unobserved components take values in a complete separable metric space. We prove that the maximum likelihood estimator (MLE) of the parameter is strongly consistent under a rather minimal set of assumptions. As special cases of our main result, we obtain consistency in a large class of nonlinear state space models, as well as general results on linear Gaussian state space models and finite state models. A novel aspect of our approach is an information-theoretic technique for proving identifiability, which does not require an explicit representation for the relative entropy rate. Our method of proof could therefore form a foundation for the investigation of MLE consistency in more general dependent and non-Markovian time series. Also of independent interest is a general concentration inequality for V-uniformly ergodic Markov chains.
    Comment: Published at http://dx.doi.org/10.1214/10-AOS834 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
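    A toy instance of the finite-state special case: fit a single symmetric-noise parameter by maximizing the exact HMM likelihood (computed with the forward algorithm) and check that the estimate approaches the truth as the sample grows. The model, the one-parameter family, and all values below are assumptions for the demonstration, not the paper's setting.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# MLE for the flip probability eps of a binary Markov chain observed
# through a binary symmetric channel. All parameters are assumed.

rng = np.random.default_rng(1)
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
eps_true = 0.15
n = 20_000

w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

x = np.empty(n, dtype=int)
x[0] = rng.choice(2, p=pi)
for t in range(1, n):
    x[t] = rng.choice(2, p=P[x[t - 1]])
y = np.where(rng.random(n) < eps_true, 1 - x, x)   # BSC observations

def neg_log_lik(eps):
    """Exact negative log-likelihood via the normalized forward recursion."""
    E = np.array([[1 - eps, eps],
                  [eps, 1 - eps]])
    alpha, ll = pi.copy(), 0.0
    for t in range(n):
        alpha = alpha * E[:, y[t]]
        s = alpha.sum()
        ll += np.log(s)
        alpha = (alpha / s) @ P
    return -ll

res = minimize_scalar(neg_log_lik, bounds=(1e-3, 0.5 - 1e-3), method="bounded")
print("true eps:", eps_true, " MLE:", round(res.x, 4))   # close for large n
```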

    Derivatives of Entropy Rate in Special Families of Hidden Markov Chains

    Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Zuk et al. [13], [14] showed how, in principle, one can explicitly compute the derivatives of the entropy rate at extreme values of the noise. Namely, they showed that the derivatives of standard upper approximations to the entropy rate actually stabilize at an explicit finite time. We generalize this result to a natural class of hidden Markov chains called "Black Holes." We also discuss in depth special cases of binary Markov chains observed in binary symmetric noise, and give an abstract formula for the first derivative in terms of a measure on the simplex due to Blackwell.
    Comment: The relaxed conditions for the entropy rate and the corresponding examples are taken out (to be part of another paper). The section about a general principle and an example for determining the domain of analyticity is taken out (to be part of another paper). A section about binary Markov chains corrupted by binary symmetric noise is added.
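    The Blackwell measure mentioned at the end enters through the classical integral formula for the entropy rate; in one standard notation (assumed here, not the paper's), it reads as follows.

```latex
% Blackwell's formula for the entropy rate of a hidden Markov chain.
% W is the simplex of beliefs w = P(X_n = . | Y_1, ..., Y_n), Q is the
% stationary (Blackwell) measure of the belief process on W, and
% r_a(w) = P(Y_{n+1} = a | belief w).
\[
  H(Y) \;=\; -\int_{W} \sum_{a} r_a(w)\,\log r_a(w)\,\mathrm{d}Q(w).
\]
```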