
    Concavity of the mutual information rate for input-restricted memoryless channels at high SNR

    We consider a memoryless channel with an input Markov process supported on a mixing finite-type constraint. We continue the development of asymptotics for the entropy rate of the output hidden Markov chain and deduce that, at high signal-to-noise ratio, the mutual information rate of such a channel is concave with respect to "almost" all input Markov chains of a given order. © 2012 IEEE.
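
    For orientation, the mutual information rate analyzed here is usually written via the decomposition below (notation mine, not from the abstract); the term H(Y) is the entropy rate of the hidden Markov output chain, which is what makes the high-SNR asymptotics nontrivial.

```latex
% Mutual information rate for a stationary input process X sent over a
% memoryless channel with output process Y. H(Y) is the entropy rate of the
% hidden Markov output chain; for a memoryless channel H(Y|X) reduces to a
% per-symbol noise entropy.
I(X;Y) \;=\; \lim_{n\to\infty} \frac{1}{n}\, I\!\bigl(X_1^{n};\, Y_1^{n}\bigr)
       \;=\; H(Y) \;-\; H(Y \mid X).
```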

    Mixing, Ergodic, and Nonergodic Processes with Rapidly Growing Information between Blocks

    We construct mixing processes over an infinite alphabet and ergodic processes over a finite alphabet for which the Shannon mutual information between adjacent blocks of length n grows as n^β, where β ∈ (0,1). The processes are a modification of nonergodic Santa Fe processes, which were introduced in the context of natural language modeling. The rates of mutual information for the latter processes are similar and are also established in this paper. As an auxiliary result, it is shown that infinite direct products of mixing processes are also mixing. Comment: 21 pages.
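
    In symbols, the growth condition described above says that the mutual information between adjacent length-n blocks scales, up to constants, as a sublinear power law:

```latex
% Shannon mutual information between adjacent length-n blocks of the process,
% growing as a power law with exponent beta strictly between 0 and 1.
I\bigl(X_{1}^{n};\, X_{n+1}^{2n}\bigr) \;\asymp\; n^{\beta},
\qquad \beta \in (0,1).
```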

    Taylor series expansions for the entropy rate of Hidden Markov Processes

    Finding the entropy rate of Hidden Markov Processes is an active research topic, of both theoretical and practical importance. A recently used approach is to study the asymptotic behavior of the entropy rate in various regimes. In this paper we generalize and prove a previous conjecture relating the entropy rate to entropies of finite systems. Building on our new theorems, we establish series expansions for the entropy rate in two different regimes. We also study the radius of convergence of the two series expansions.
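
    As a concrete toy illustration of "relating the entropy rate to entropies of finite systems", the sketch below computes exact conditional block entropies H(Y_n | Y_1..Y_{n-1}) for a small hidden Markov process, a binary Markov chain observed through a binary symmetric channel; these values decrease monotonically toward the entropy rate that the paper's expansions describe analytically. The model, parameter values, and function names are my own choices, not taken from the paper.

```python
import itertools
import numpy as np

# Toy hidden Markov process: a 2-state Markov chain observed through a binary
# symmetric channel with crossover probability eps.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])       # hidden-state transition matrix
eps = 0.1                        # observation noise

# Stationary distribution of the hidden chain.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

def prob(y):
    """p(y_1..n) computed with the forward recursion over hidden states."""
    alpha = pi.copy()
    for t, yt in enumerate(y):
        emit = np.where(np.arange(2) == yt, 1 - eps, eps)   # p(y_t | x_t)
        alpha = (alpha if t == 0 else alpha @ P) * emit
    return alpha.sum()

def block_entropy(n):
    """H(Y_1..Y_n) in bits, by exact enumeration of all 2^n output strings."""
    ps = np.array([prob(y) for y in itertools.product((0, 1), repeat=n)])
    return -(ps * np.log2(ps)).sum()

# The conditional entropies H(Y_n | Y_1..Y_{n-1}) = H(n) - H(n-1) decrease
# monotonically toward the entropy rate of the hidden Markov process.
H_prev = 0.0
for n in range(1, 11):
    H_n = block_entropy(n)
    print(f"n={n:2d}  H(Y_n | past) = {H_n - H_prev:.6f} bits")
    H_prev = H_n
```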

    Concavity of Mutual Information Rate for Input-Restricted Finite-State Memoryless Channels at High SNR

    We consider a finite-state memoryless channel with i.i.d. channel state and an input Markov process supported on a mixing finite-type constraint. We discuss the asymptotic behavior of the entropy rate of the output hidden Markov chain and deduce that the mutual information rate of such a channel is concave with respect to the parameters of the input Markov processes at high signal-to-noise ratio. In principle, the concavity result enables good numerical approximation of the maximum mutual information rate and capacity of such a channel, as sketched below. Comment: 26 pages.
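
    The closing claim about numerical approximation can be illustrated with a Monte Carlo sketch. Below, a binary Markov input obeying the (1,∞) run-length constraint (no two consecutive 1s, a mixing finite-type constraint) is passed through a binary symmetric channel; the output entropy rate is estimated with the normalized forward recursion, and the mutual information rate follows as H(Y) − H(Y|X). The specific chain, crossover probability eps, and function names are illustrative assumptions of mine, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Input Markov chain on {0, 1} respecting the (1, inf) constraint: state 1
# must always be followed by state 0.
a = 0.4
P = np.array([[1 - a, a],
              [1.0,   0.0]])
eps = 0.05                       # BSC crossover probability; small eps ~ high SNR

def simulate_markov(P, n, rng):
    """Sample a length-n path of the 2-state chain with transition matrix P."""
    x = np.empty(n, dtype=int)
    x[0] = 0
    for t in range(1, n):
        x[t] = int(rng.random() < P[x[t - 1], 1])
    return x

def output_entropy_rate(y, P, eps):
    """Estimate H(Y) as -(1/n) log2 p(y_1..n), via the normalized forward recursion."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi /= pi.sum()               # stationary distribution of the input chain
    alpha = pi.copy()
    loglik = 0.0
    for yt in y:
        emit = np.where(np.arange(2) == yt, 1 - eps, eps)   # p(y_t | x_t)
        alpha = (alpha @ P) * emit
        c = alpha.sum()
        loglik += np.log2(c)
        alpha /= c
    return -loglik / len(y)

def h2(p):
    """Binary entropy in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

x = simulate_markov(P, 200_000, rng)
y = x ^ (rng.random(len(x)) < eps).astype(int)     # BSC output
# For a binary symmetric channel, H(Y|X) is h2(eps) per symbol.
I_rate = output_entropy_rate(y, P, eps) - h2(eps)
print(f"estimated mutual information rate at eps={eps}: {I_rate:.4f} bits/symbol")
```

    Sweeping the input parameter a over a grid and plotting the resulting rate gives a numerical picture of the concavity in the input parameters that the paper establishes analytically at high SNR.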

    Consistency of the maximum likelihood estimator for general hidden Markov models

    Consider a parametrized family of general hidden Markov models, where both the observed and unobserved components take values in a complete separable metric space. We prove that the maximum likelihood estimator (MLE) of the parameter is strongly consistent under a rather minimal set of assumptions. As special cases of our main result, we obtain consistency in a large class of nonlinear state space models, as well as general results on linear Gaussian state space models and finite state models. A novel aspect of our approach is an information-theoretic technique for proving identifiability, which does not require an explicit representation for the relative entropy rate. Our method of proof could therefore form a foundation for the investigation of MLE consistency in more general dependent and non-Markovian time series. Also of independent interest is a general concentration inequality for V-uniformly ergodic Markov chains. Comment: Published at http://dx.doi.org/10.1214/10-AOS834 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
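
    A minimal sketch of the estimator whose consistency is at issue: maximize the hidden Markov likelihood, computed with the normalized forward recursion, over the parameter. The toy model below (two hidden states, known switching probability, unit-variance Gaussian observations with unknown means) and all names are my own illustrative choices, far simpler than the general setting the paper covers.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Toy model: 2 hidden states, known switching probability, Gaussian observations
# with unit variance; unknown parameter theta = (mean in state 0, mean in state 1).
p_switch = 0.2
theta_true = np.array([-1.0, 1.0])
P = np.array([[1 - p_switch, p_switch],
              [p_switch, 1 - p_switch]])

def simulate(n):
    """Draw n observations from the true model."""
    x = np.empty(n, dtype=int)
    x[0] = 0
    for t in range(1, n):
        x[t] = x[t - 1] if rng.random() >= p_switch else 1 - x[t - 1]
    return theta_true[x] + rng.standard_normal(n)

def neg_log_lik(theta, y):
    """-log p_theta(y_1..n) via the normalized forward recursion (Gaussian constant dropped)."""
    alpha = np.array([0.5, 0.5])
    ll = 0.0
    for yt in y:
        emit = np.exp(-0.5 * (yt - theta) ** 2)   # Gaussian density up to a constant
        alpha = (alpha @ P) * emit
        c = alpha.sum()
        ll += np.log(c)
        alpha /= c
    return -ll

y = simulate(5000)
res = minimize(neg_log_lik, x0=np.array([-0.5, 0.5]), args=(y,), method="Nelder-Mead")
print("MLE of the state means:", res.x)   # approaches theta_true (up to label order) as n grows
```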

    Synchronization and Control in Intrinsic and Designed Computation: An Information-Theoretic Analysis of Competing Models of Stochastic Computation

    We adapt tools from information theory to analyze how an observer comes to synchronize with the hidden states of a finitary, stationary stochastic process. We show that synchronization is determined by both the process's internal organization and by an observer's model of it. We analyze these components using the convergence of state-block and block-state entropies, comparing them to the previously known convergence properties of the Shannon block entropy. Along the way, we introduce a hierarchy of information quantifiers as derivatives and integrals of these entropies, which parallels a similar hierarchy introduced for block entropy. We also draw out the duality between synchronization properties and a process's controllability. The tools lead to a new classification of a process's alternative representations in terms of minimality, synchronizability, and unifilarity. Comment: 25 pages, 13 figures, 1 table.
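
    A small numerical illustration of both kinds of convergence: for a unifilar toy model (the "even" process, chosen by me, not taken from the paper), the sketch below computes the block-entropy increments, which converge to the entropy rate, and the observer's remaining uncertainty about the hidden state after seeing a length-L word, which converges to zero as the observer synchronizes.

```python
import itertools
import numpy as np

# Even process: from state A emit 0 (stay in A) or 1 (go to B) with probability
# 1/2 each; from state B always emit 1 and return to A, so maximal runs of 1s
# have even length. T[y][i, j] = probability of emitting y while moving i -> j.
T = {0: np.array([[0.5, 0.0],
                  [0.0, 0.0]]),
     1: np.array([[0.0, 0.5],
                  [1.0, 0.0]])}
pi = np.array([2 / 3, 1 / 3])    # stationary distribution over hidden states

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

H_prev = 0.0
for L in range(1, 11):
    H_block = 0.0             # Shannon block entropy H(L)
    H_state_given_word = 0.0  # average uncertainty about the hidden state after L symbols
    for word in itertools.product((0, 1), repeat=L):
        v = pi.copy()
        for y in word:
            v = v @ T[y]      # v[j] = p(word so far, current state j)
        pw = v.sum()          # probability of the word
        if pw == 0:
            continue          # word forbidden by the process
        H_block += -pw * np.log2(pw)
        H_state_given_word += pw * entropy(v / pw)
    print(f"L={L:2d}  H(L)-H(L-1) = {H_block - H_prev:.4f} bits   "
          f"H[state | word] = {H_state_given_word:.4f} bits")
    H_prev = H_block
```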