2,472 research outputs found

    Taylor series expansions for the entropy rate of Hidden Markov Processes

    Finding the entropy rate of Hidden Markov Processes is an active research topic, of both theoretical and practical importance. A recently used approach is studying the asymptotic behavior of the entropy rate in various regimes. In this paper we generalize and prove a previous conjecture relating the entropy rate to entropies of finite systems. Building on our new theorems, we establish series expansions for the entropy rate in two different regimes. We also study the radius of convergence of the two series expansions.
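    The finite-system entropies this abstract refers to can be computed exactly for small block lengths. A minimal sketch (all parameter values are illustrative assumptions, not taken from the paper): for a binary symmetric Markov chain observed through a binary symmetric channel, the conditional block entropies H(Y_n | Y_1, ..., Y_{n-1}) = H_n - H_{n-1} are non-increasing and converge to the entropy rate.

    ```python
    import itertools
    import math

    # Hypothetical 2-state binary hidden Markov process: the hidden chain
    # flips state with probability p, and each observed symbol is the hidden
    # state corrupted by noise with probability eps (values are assumptions).
    p, eps = 0.3, 0.1

    T = [[1 - p, p], [p, 1 - p]]          # hidden-chain transition matrix
    E = [[1 - eps, eps], [eps, 1 - eps]]  # emission (noise) matrix
    pi = [0.5, 0.5]                       # stationary distribution (symmetric chain)

    def block_prob(obs):
        """P(Y_1..Y_n = obs) via the standard forward recursion."""
        alpha = [pi[s] * E[s][obs[0]] for s in range(2)]
        for y in obs[1:]:
            alpha = [sum(alpha[s] * T[s][t] for s in range(2)) * E[t][y]
                     for t in range(2)]
        return sum(alpha)

    def block_entropy(n):
        """H(Y_1..Y_n) in bits, by exact enumeration of all 2^n words."""
        return -sum(q * math.log2(q)
                    for obs in itertools.product([0, 1], repeat=n)
                    for q in [block_prob(obs)] if q > 0)

    # Conditional entropies H(Y_n | Y_1..Y_{n-1}) = H_n - H_{n-1} decrease
    # monotonically toward the entropy rate of the process.
    prev = 0.0
    for n in range(1, 9):
        h = block_entropy(n)
        print(n, round(h - prev, 6))
        prev = h
    ```

    Exact enumeration is only feasible for short blocks (2^n words), which is precisely why asymptotic expansions of the kind studied in the paper are useful.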

    The Entropy of a Binary Hidden Markov Process

    The entropy of a binary symmetric Hidden Markov Process is calculated as an expansion in the noise parameter epsilon. We map the problem onto a one-dimensional Ising model in a large field of random signs and calculate the expansion coefficients up to second order in epsilon. Using a conjecture we extend the calculation to 11th order and discuss the convergence of the resulting series.

    Asymptotics of Entropy Rate in Special Families of Hidden Markov Chains

    We derive an asymptotic formula for the entropy rate of a hidden Markov chain around a "weak Black Hole". We also discuss applications of the asymptotic formula to the asymptotic behaviors of certain channels.

    Asymptotics of entropy rate in special families of hidden Markov chains

    We derive an asymptotic formula for the entropy rate of a hidden Markov chain under certain parameterizations. We also discuss applications of the asymptotic formula to the asymptotic behavior of the entropy rate of hidden Markov chains arising as outputs of certain channels, such as the binary symmetric channel, the binary erasure channel, and some special Gilbert-Elliott channels. © 2006 IEEE.

    Analyticity of Entropy Rate of Hidden Markov Chains

    We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated, and an example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity assumptions can be relaxed, and give examples for the relaxed conditions. We study one special class of hidden Markov chains in more detail, binary hidden Markov chains with an unambiguous symbol, and give necessary and sufficient conditions for analyticity of the entropy rate in this case. Finally, we show that under the positivity assumptions the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.

    Derivatives of Entropy Rate in Special Families of Hidden Markov Chains

    Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Zuk et al. [13], [14] showed how, in principle, one can explicitly compute the derivatives of the entropy rate at extreme values of the noise. Namely, they showed that the derivatives of standard upper approximations to the entropy rate actually stabilize at an explicit finite time. We generalize this result to a natural class of hidden Markov chains called "Black Holes". We also discuss in depth special cases of binary Markov chains observed in binary symmetric noise, and give an abstract formula for the first derivative in terms of a measure on the simplex due to Blackwell.

    Entropy of hidden Markov models


    Recent advances in directional statistics

    Mainstream statistical methodology is generally applicable to data observed in Euclidean space. There are, however, numerous contexts of considerable scientific interest in which the natural supports for the data under consideration are Riemannian manifolds like the unit circle, torus, sphere and their extensions. Typically, such data can be represented using one or more directions, and directional statistics is the branch of statistics that deals with their analysis. In this paper we provide a review of the many recent developments in the field since the publication of Mardia and Jupp (1999), still the most comprehensive text on directional statistics. Many of those developments have been stimulated by interesting applications in fields as diverse as astronomy, medicine, genetics, neurology, aeronautics, acoustics, image analysis, text mining, environmetrics, and machine learning. We begin by considering developments for the exploratory analysis of directional data before progressing to distributional models, general approaches to inference, hypothesis testing, regression, nonparametric curve estimation, methods for dimension reduction, classification and clustering, and the modelling of time series, spatial and spatio-temporal data. An overview of currently available software for analysing directional data is also provided, and potential future developments discussed.
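    The basic obstacle this review addresses can be seen with a toy calculation: the naive arithmetic mean of angles clustered around 0 (mod 2π) lands far from the cluster, while the circular mean, obtained by averaging unit vectors, does not. A minimal sketch with made-up data:

    ```python
    import math

    # Hypothetical wind-direction sample in radians; every angle is close
    # to 0 (mod 2*pi), but some are written as values near 2*pi.
    angles = [0.1, 0.2, 6.2, 6.1, 0.05]

    # Circular mean: average the unit vectors (cos, sin) and take atan2.
    c = sum(math.cos(a) for a in angles) / len(angles)
    s = sum(math.sin(a) for a in angles) / len(angles)
    mean_dir = math.atan2(s, c) % (2 * math.pi)

    # Mean resultant length R in [0, 1]: a measure of concentration.
    R = math.hypot(c, s)

    naive = sum(angles) / len(angles)  # badly wrong: near the middle of the range
    print(round(mean_dir, 3), round(R, 3), round(naive, 3))
    ```

    The circular mean stays near 0 and R is close to 1 (a tightly concentrated sample), whereas the naive mean sits near the middle of the numeric range, pointing in an entirely wrong direction.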