    Analyticity of Entropy Rate of Hidden Markov Chains With Continuous Alphabet

    We first prove that, under certain mild assumptions, the entropy rate of a hidden Markov chain, observed when passing a finite-state stationary Markov chain through a discrete-time continuous-output channel, is analytic with respect to the input Markov chain parameters. We then further prove, under strengthened assumptions on the channel, that the entropy rate is jointly analytic as a function of both the input Markov chain parameters and the channel parameters. In particular, the main theorems establish the analyticity of the entropy rate for two representative channels: Cauchy and Gaussian.
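The abstract states analyticity results without a computational illustration. As a minimal sketch of the object being studied, the following estimates the differential entropy rate of a Markov chain observed through a discrete-time Gaussian channel, using the normalized forward recursion and the Shannon-McMillan-Breiman theorem. The two-state transition matrix, channel means, and noise level here are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (invented for illustration, not from the paper):
# a 2-state stationary Markov chain passed through a Gaussian channel,
# y_t = mu[x_t] + N(0, sigma^2) noise.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
mu = np.array([-1.0, 1.0])
sigma = 0.5
n = 50_000

# stationary distribution of P (left eigenvector for eigenvalue 1)
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# simulate the chain and its noisy observations
x = np.empty(n, dtype=int)
x[0] = rng.choice(2, p=pi)
for t in range(1, n):
    x[t] = rng.choice(2, p=P[x[t - 1]])
y = mu[x] + sigma * rng.standard_normal(n)

# Gaussian emission densities for each hidden state
def emission(y_t):
    return np.exp(-(y_t - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# normalized forward recursion: -log p(y_1..n) / n estimates the
# differential entropy rate by the Shannon-McMillan-Breiman theorem
alpha = pi * emission(y[0])
loglik = np.log(alpha.sum())
alpha /= alpha.sum()
for t in range(1, n):
    alpha = (alpha @ P) * emission(y[t])
    s = alpha.sum()
    loglik += np.log(s)
    alpha /= s

h_est = -loglik / n
print(f"estimated differential entropy rate: {h_est:.3f} nats")
```

The estimate lies between the conditional entropy of the channel noise and that value plus the entropy of the hidden chain, consistent with the entropy rate being a well-defined function of the chain and channel parameters.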

    Entropy rate of continuous-state hidden Markov chains

    We prove that, under mild positivity assumptions, the entropy rate of a continuous-state hidden Markov chain, observed when passing a finite-state Markov chain through a discrete-time continuous-output channel, is analytic as a function of the transition probabilities of the underlying Markov chain. We further prove that the entropy rate of a continuous-state hidden Markov chain, observed when passing a mixing finite-type constrained Markov chain through a discrete-time Gaussian channel, is smooth as a function of the transition probabilities of the underlying Markov chain. © 2010 IEEE. Presented at the IEEE International Symposium on Information Theory (ISIT 2010), Austin, TX, 13-18 June 2010. In Proceedings of ISIT, 2010, p. 1468-147.

    Limit Theorems in Hidden Markov Models

    In this paper, under mild assumptions, we derive a law of large numbers, a central limit theorem with an error estimate, an almost sure invariance principle, and a variant of the Chernoff bound in finite-state hidden Markov models. These limit theorems are of interest in certain areas of statistics and information theory. In particular, we apply the limit theorems to derive the rate of convergence of the maximum likelihood estimator in finite-state hidden Markov models.
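The simplest of the limit theorems listed above, the law of large numbers, is easy to check numerically. The sketch below (a hypothetical toy model; the chain, channel, and parameters are invented, not from the paper) compares the empirical mean of a two-state Markov chain observed through a binary symmetric channel against its stationary expectation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical toy model (not from the paper): a 2-state Markov chain
# observed through a binary symmetric channel with crossover prob eps.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
eps = 0.1

pi = np.array([4 / 7, 3 / 7])  # stationary distribution: solves pi = pi @ P

n = 100_000
x = np.empty(n, dtype=int)
x[0] = rng.choice(2, p=pi)
for t in range(1, n):
    x[t] = rng.choice(2, p=P[x[t - 1]])
y = x ^ (rng.random(n) < eps)  # flip each output symbol with probability eps

# Law of large numbers: the empirical mean of Y converges to E[Y].
mean_y = y.mean()
expected = pi[1] * (1 - eps) + pi[0] * eps
print(f"empirical mean {mean_y:.4f} vs E[Y] = {expected:.4f}")
```

The gap between the two numbers shrinks at the O(1/sqrt(n)) rate predicted by the central limit theorem for such additive functionals.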

    Dynamical determinants and their applications

    This thesis is concerned with situations where we can define trace-class transfer operators and extract useful information from their determinants. The first topic is Lyapunov exponents of random products of matrices: we obtain a new expression for the Lyapunov exponent of a continuous family of matrices, and a slightly different version of existing work for the discrete case. The second topic explores possibilities of using similar theory to approximate eigenfunctions of the Laplacian for surfaces of constant negative curvature. The third topic gives a variety of approximations of Mahler measures, which occur in many different areas of mathematics, by manipulating the integrals into a form that can be numerically integrated using work of Pollicott and Jenkinson. The final topic of the thesis works out the details of earlier ideas of Pollicott, giving a method for the numerical approximation of entropy rates of hidden Markov processes.
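For context on the first topic, the quantity in question is the top Lyapunov exponent lim (1/n) log ||A_n ... A_1 v|| of an i.i.d. matrix product. The sketch below is only the standard Monte Carlo baseline with two invented example matrices, not the determinant-based method the thesis develops; determinant methods exist precisely because this naive estimator converges slowly.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two invented nonnegative 2x2 matrices, applied i.i.d. with equal
# probability; the top Lyapunov exponent is lim (1/n) log ||A_n ... A_1 v||.
mats = [np.array([[2.0, 1.0], [1.0, 1.0]]),
        np.array([[1.0, 0.0], [1.0, 1.0]])]

n = 100_000
v = np.array([1.0, 0.0])
log_growth = 0.0
for _ in range(n):
    v = mats[rng.integers(2)] @ v
    norm = np.linalg.norm(v)
    log_growth += np.log(norm)
    v /= norm  # renormalize at each step to avoid overflow

lyap_est = log_growth / n
print(f"Monte Carlo Lyapunov exponent estimate: {lyap_est:.3f}")
```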