
    The Entropy of a Binary Hidden Markov Process

    The entropy of a binary symmetric hidden Markov process is calculated as an expansion in the noise parameter ε. We map the problem onto a one-dimensional Ising model in a large field of random signs and calculate the expansion coefficients up to second order in ε. Using a conjecture, we extend the calculation to 11th order and discuss the convergence of the resulting series.
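    As a numerical companion (a minimal sketch of my own, not the paper's Ising-model calculation): for small block lengths the conditional entropy H(Y_n | Y_1..Y_{n-1}) of a binary symmetric hidden Markov process can be computed exactly with the forward recursion and compared against the zeroth-order term h(p) of the expansion. The helper names and the parameter values p and eps below are illustrative assumptions.

```python
import numpy as np
from itertools import product

def sequence_prob(y, p, eps):
    # P(Y_1..Y_n) for a binary symmetric Markov chain (flip probability p)
    # observed through a binary symmetric channel BSC(eps), via the
    # standard forward (alpha) recursion
    T = np.array([[1 - p, p], [p, 1 - p]])          # hidden transitions
    E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # emission probabilities
    alpha = 0.5 * E[:, y[0]]                        # uniform stationary start
    for s in y[1:]:
        alpha = (alpha @ T) * E[:, s]
    return alpha.sum()

def block_entropy(n, p, eps):
    # H(Y_1..Y_n) in bits, by exact enumeration (feasible for small n)
    probs = np.array([sequence_prob(y, p, eps)
                      for y in product((0, 1), repeat=n)])
    return -(probs * np.log2(probs)).sum()

p, eps = 0.2, 0.01
# H(Y_n | Y_1..Y_{n-1}) = H(n) - H(n-1) converges to the entropy rate
estimate = block_entropy(10, p, eps) - block_entropy(9, p, eps)
h_p = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
print("conditional-entropy estimate:", estimate)
print("zeroth-order term h(p):      ", h_p)
```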

    Derivatives of Entropy Rate in Special Families of Hidden Markov Chains

    Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Zuk et al. [13], [14] showed how, in principle, one can explicitly compute the derivatives of the entropy rate at extreme values of the noise. Namely, they showed that the derivatives of standard upper approximations to the entropy rate actually stabilize at an explicit finite time. We generalize this result to a natural class of hidden Markov chains called "Black Holes." We also discuss in depth special cases of binary Markov chains observed in binary symmetric noise, and give an abstract formula for the first derivative in terms of a measure on the simplex due to Blackwell. Comment: The relaxed conditions for entropy rate and examples are taken out (to be part of another paper). The section about a general principle and an example to determine the domain of analyticity is taken out (to be part of another paper). A section about binary Markov chains corrupted by binary symmetric noise is added.
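    The stabilization is easy to observe numerically. Below is a sketch of my own (assuming a binary symmetric Markov chain with flip probability p observed through BSC(eps), one of the special cases discussed): a one-sided finite-difference estimate of the first derivative of the upper approximation H(Y_n | Y_1..Y_{n-1}) at eps = 0 agrees across the values of n shown.

```python
import numpy as np
from itertools import product

def cond_entropy(n, p, eps):
    # upper approximation H(Y_n | Y_1..Y_{n-1}) for a binary symmetric
    # Markov chain (flip probability p) observed through BSC(eps)
    T = np.array([[1 - p, p], [p, 1 - p]])
    E = np.array([[1 - eps, eps], [eps, 1 - eps]])
    def block_H(m):
        H = 0.0
        for y in product((0, 1), repeat=m):
            alpha = 0.5 * E[:, y[0]]
            for s in y[1:]:
                alpha = (alpha @ T) * E[:, s]
            q = alpha.sum()
            if q > 0:            # skip zero-probability blocks at eps = 0
                H -= q * np.log2(q)
        return H
    return block_H(n) - block_H(n - 1)

# one-sided finite difference for dH/d(eps) at eps = 0: the derivative of
# the upper approximations has already stabilized at these small n
p, d = 0.2, 1e-6
for n in (2, 3, 4, 5):
    print(n, (cond_entropy(n, p, d) - cond_entropy(n, p, 0.0)) / d)
```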

    Novel Lower Bounds on the Entropy Rate of Binary Hidden Markov Processes

    Recently, Samorodnitsky proved a strengthened version of Mrs. Gerber's Lemma, where the output entropy of a binary symmetric channel is bounded in terms of the average entropy of the input projected on a random subset of coordinates. Here, this result is applied to derive novel lower bounds on the entropy rate of binary hidden Markov processes. For symmetric underlying Markov processes, our bound improves upon the best known bound in the very noisy regime. The nonsymmetric case is also considered, and explicit bounds are derived for Markov processes that satisfy the (1,∞)-RLL constraint.
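    For orientation (this is the classic baseline, not the new bound): conditioning on the previous hidden state gives H(Y_n | Y_1..Y_{n-1}) >= H(Y_n | X_{n-1}) = h(p * eps) for a binary symmetric Markov chain with flip probability p observed through BSC(eps), where a * b = a(1-b) + b(1-a) is binary convolution. A minimal sketch with illustrative helper names:

```python
import numpy as np

def h2(x):
    # binary entropy in bits
    x = np.clip(x, 1e-12, 1 - 1e-12)
    return -(x * np.log2(x) + (1 - x) * np.log2(1 - x))

def conv(a, b):
    # binary convolution: overall flip probability of two cascaded flips
    return a * (1 - b) + b * (1 - a)

p, eps = 0.1, 0.3
print("baseline lower bound h(p * eps):", h2(conv(p, eps)))
```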

    Prediction and Generation of Binary Markov Processes: Can a Finite-State Fox Catch a Markov Mouse?

    Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independent and identically distributed. Comment: 12 pages, 12 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/gmc.ht
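    For concreteness (a sketch of my own; the paper's minimal generators are not reproduced here): an arbitrary binary Markov process is specified by two parameters, p = Pr(next = 1 | current = 0) and q = Pr(next = 0 | current = 1), and the near-IID regime mentioned above corresponds to p + q ≈ 1.

```python
import numpy as np

def sample_binary_markov(p, q, n, rng=None):
    # p = Pr(1 | previous 0), q = Pr(0 | previous 1): the two parameters
    # of an arbitrary binary Markov process
    rng = np.random.default_rng() if rng is None else rng
    x = np.empty(n, dtype=np.int8)
    x[0] = rng.random() < p / (p + q)   # stationary probability of 1
    for t in range(1, n):
        u = rng.random()
        x[t] = (u < p) if x[t - 1] == 0 else (u >= q)
    return x

# p + q close to 1 makes the process approximately IID
print(sample_binary_markov(0.3, 0.6, 20, np.random.default_rng(0)))
```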

    Statistical Physics Analysis of Maximum a Posteriori Estimation for Multi-channel Hidden Markov Models

    The performance of maximum a posteriori (MAP) estimation is studied analytically for binary symmetric multi-channel hidden Markov processes. We reduce the estimation problem to a 1D Ising spin model and define order parameters that correspond to different characteristics of the MAP-estimated sequence. The solution to the MAP estimation problem has different operational regimes separated by first-order phase transitions. The transition points for an L-channel system with identical noise levels are determined only by whether L is odd or even, irrespective of the actual number of channels. We demonstrate that for lower noise intensities the number of solutions is uniquely determined for odd L, whereas for even L there are exponentially many solutions. We also develop a semi-analytical approach to calculate the estimation error without resorting to brute-force simulations. Finally, we examine the tradeoff between a system with a single low-noise channel and one with multiple noisy channels. Comment: The paper has been submitted to the Journal of Statistical Physics with submission number JOSS-S-12-0039.
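    The estimation problem itself is easy to state in code. Below is a sketch using standard Viterbi decoding (my own construction, not the authors' statistical-physics analysis): a binary symmetric Markov chain with flip probability p is observed through L independent BSC(eps) channels and MAP-decoded; all names and parameter values are illustrative.

```python
import numpy as np

def viterbi_map(Y, p, eps):
    # MAP estimate of a binary symmetric Markov chain (flip probability p)
    # from Y of shape (L, n): L independent BSC(eps) observations of the
    # same hidden sequence
    L, n = Y.shape
    logT = np.log(np.array([[1 - p, p], [p, 1 - p]]))
    def loglik(t, s):
        # log P(Y[:, t] | X_t = s): channels are conditionally independent
        flips = np.sum(Y[:, t] != s)
        return flips * np.log(eps) + (L - flips) * np.log(1 - eps)
    score = np.array([np.log(0.5) + loglik(0, s) for s in (0, 1)])
    back = np.zeros((n, 2), dtype=np.int8)
    for t in range(1, n):
        cand = score[:, None] + logT              # cand[i, j]: state i -> j
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + np.array([loglik(t, 0), loglik(t, 1)])
    x = np.empty(n, dtype=np.int8)
    x[-1] = score.argmax()
    for t in range(n - 1, 0, -1):                 # trace back the best path
        x[t - 1] = back[t, x[t]]
    return x

rng = np.random.default_rng(0)
p, eps, n, L = 0.1, 0.2, 30, 3
x = np.zeros(n, dtype=np.int8)
for t in range(1, n):
    x[t] = x[t - 1] ^ (rng.random() < p)
Y = x ^ (rng.random((L, n)) < eps)
print("bit errors:", int(np.sum(viterbi_map(Y, p, eps) != x)))
```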

    On Hidden Markov Processes with Infinite Excess Entropy

    We investigate stationary hidden Markov processes for which the mutual information between the past and the future is infinite. It is assumed that the number of observable states is finite and the number of hidden states is countably infinite. Under this assumption, we show that the block mutual information of a hidden Markov process is upper bounded by a power law determined by the tail index of the hidden state distribution. Moreover, we exhibit three examples of such processes. The first example, considered previously, is nonergodic and the mutual information between the blocks is bounded by the logarithm of the block length. The second example is also nonergodic, but the mutual information between the blocks obeys a power law. The third example obeys the power law and is ergodic. Comment: 12 pages.
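    Block mutual information is straightforward to estimate from data: for a stationary process, I(n) = 2H(n) - H(2n), with H(n) the n-block entropy. A plug-in sketch of my own (the toy sample below is a finite-state Markov chain, for which I(n) saturates; the paper's infinite-state examples are where it can grow without bound):

```python
import numpy as np
from collections import Counter

def plug_in_block_entropy(x, n):
    # empirical (plug-in) estimate of the n-block entropy H(n) in bits;
    # biased downward when 2^n is not small relative to len(x)
    blocks = Counter(tuple(x[i:i + n]) for i in range(len(x) - n + 1))
    counts = np.array(list(blocks.values()), dtype=float)
    probs = counts / counts.sum()
    return -(probs * np.log2(probs)).sum()

def block_mutual_information(x, n):
    # I(past n symbols; future n symbols) = 2 H(n) - H(2n)
    return 2 * plug_in_block_entropy(x, n) - plug_in_block_entropy(x, 2 * n)

rng = np.random.default_rng(1)
# toy sample: binary symmetric Markov chain, flip probability 0.1
x = (np.cumsum(rng.random(200_000) < 0.1) % 2).astype(np.int8)
for n in (1, 2, 4, 8):
    print(n, block_mutual_information(x, n))
```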

    Analyticity of Entropy Rate of Hidden Markov Chains

    We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity assumptions can be relaxed, and examples are given for the relaxed conditions. We study a special class of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, and we give necessary and sufficient conditions for analyticity of the entropy rate in this case. Finally, we show that under the positivity assumptions the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters. Comment: The title has been changed. The new main theorem now combines the old main theorem and the remark following it. A new section is added as an introduction to complex analysis. A general principle and an example to determine the domain of analyticity of the entropy rate have been added. Relaxed conditions for analyticity of the entropy rate and the corresponding examples are added. The section about binary Markov chains corrupted by binary symmetric noise is taken out (to be part of another paper).

    Taylor series expansions for the entropy rate of Hidden Markov Processes

    Finding the entropy rate of hidden Markov processes is an active research topic of both theoretical and practical importance. A recent approach studies the asymptotic behavior of the entropy rate in various regimes. In this paper we generalize and prove a previous conjecture relating the entropy rate to entropies of finite systems. Building on our new theorems, we establish series expansions for the entropy rate in two different regimes. We also study the radii of convergence of the two series expansions.
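    One way to see the finite-system connection numerically (a rough sketch of my own, ill-conditioned beyond low orders, and not the paper's method): fit a low-degree polynomial in eps to the finite-system conditional entropy H(Y_n | Y_1..Y_{n-1}) and read off approximate leading series coefficients; the intercept should be close to h(p).

```python
import numpy as np
from itertools import product

def cond_entropy(n, p, eps):
    # H(Y_n | Y_1..Y_{n-1}) for a binary symmetric Markov chain (flip
    # probability p) observed through BSC(eps), by exact enumeration
    T = np.array([[1 - p, p], [p, 1 - p]])
    E = np.array([[1 - eps, eps], [eps, 1 - eps]])
    def block_H(m):
        H = 0.0
        for y in product((0, 1), repeat=m):
            alpha = 0.5 * E[:, y[0]]
            for s in y[1:]:
                alpha = (alpha @ T) * E[:, s]
            q = alpha.sum()
            H -= q * np.log2(q)
        return H
    return block_H(n) - block_H(n - 1)

p, n = 0.3, 8
grid = np.linspace(0.002, 0.05, 12)
vals = [cond_entropy(n, p, e) for e in grid]
c = np.polyfit(grid, vals, deg=3)[::-1]     # approximate c0, c1, c2, c3
h_p = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
print("c0 vs h(p):", c[0], h_p)
print("c1, c2, c3:", list(c[1:]))
```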