Taylor series expansions for the entropy rate of Hidden Markov Processes
Finding the entropy rate of hidden Markov processes is an active research
topic of both theoretical and practical importance. A recent approach is to
study the asymptotic behavior of the entropy rate in various regimes. In
this paper we generalize and prove a previous conjecture relating the entropy
rate to entropies of finite systems. Building on our new theorems, we establish
series expansions for the entropy rate in two different regimes. We also study
the radius of convergence of the two series expansions.
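The idea of approximating the entropy rate by entropies of finite systems can be illustrated numerically. The sketch below (a hypothetical binary Markov chain observed through a binary symmetric channel; all parameters are illustrative, not taken from the paper) computes the conditional block entropies H(Y_n | Y_1..Y_{n-1}), which decrease monotonically to the entropy rate:

```python
import itertools
import math

# Hypothetical example: a binary Markov chain observed through a binary
# symmetric channel (BSC).  All parameters are illustrative.
P = [[0.7, 0.3],     # transition matrix of the hidden Markov chain
     [0.4, 0.6]]
pi = [4 / 7, 3 / 7]  # stationary distribution of P
eps = 0.1            # BSC crossover probability

def emit(x, y):
    """P(observe y | hidden state x) for the BSC."""
    return 1 - eps if x == y else eps

def block_prob(y):
    """P(Y_1..Y_n = y) via the forward algorithm."""
    alpha = [pi[x] * emit(x, y[0]) for x in (0, 1)]
    for sym in y[1:]:
        alpha = [sum(alpha[x] * P[x][xp] for x in (0, 1)) * emit(xp, sym)
                 for xp in (0, 1)]
    return sum(alpha)

def block_entropy(n):
    """H(Y_1..Y_n) in bits, by enumerating all 2^n output blocks."""
    return -sum(p * math.log2(p)
                for y in itertools.product((0, 1), repeat=n)
                for p in [block_prob(y)] if p > 0)

# The differences H(Y_1..Y_n) - H(Y_1..Y_{n-1}) = H(Y_n | Y_1..Y_{n-1})
# decrease monotonically to the entropy rate of the hidden Markov chain.
for n in range(2, 9):
    print(n, block_entropy(n) - block_entropy(n - 1))
```

The printed sequence gives successively tighter upper bounds on the entropy rate; for larger alphabets or block lengths the enumeration becomes infeasible, which is exactly why series expansions are of interest.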
Asymptotics of entropy rate in special families of hidden Markov chains
We derive an asymptotic formula for the entropy rate of a hidden Markov chain under certain parameterizations. We also discuss applications of the asymptotic formula to the asymptotic behavior of the entropy rate of hidden Markov chains arising as outputs of certain channels, such as the binary symmetric channel, the binary erasure channel, and some special Gilbert-Elliott channels. © 2006 IEEE.
Concavity of Mutual Information Rate for Input-Restricted Finite-State Memoryless Channels at High SNR
We consider a finite-state memoryless channel with i.i.d. channel state and
the input Markov process supported on a mixing finite-type constraint. We
discuss the asymptotic behavior of entropy rate of the output hidden Markov
chain and deduce that the mutual information rate of such a channel is concave
with respect to the parameters of the input Markov processes at high
signal-to-noise ratio. In principle, the concavity result enables good
numerical approximation of the maximum mutual information rate and capacity of
such a channel.
Analyticity of Entropy Rate of Hidden Markov Chains With Continuous Alphabet
We first prove that under certain mild assumptions, the entropy rate of a hidden Markov chain, observed when passing a finite-state stationary Markov chain through a discrete-time continuous-output channel, is analytic with respect to the input Markov chain parameters. We then further prove, under strengthened assumptions on the channel, that the entropy rate is jointly analytic as a function of both the input Markov chain parameters and the channel parameters. In particular, the main theorems establish the analyticity of the entropy rate for two representative channels: Cauchy and Gaussian.
Analyticity of Entropy Rates of Continuous-State Hidden Markov Models
The analyticity of the entropy and relative entropy rates of continuous-state
hidden Markov models is studied here. Using the analytic continuation principle
and the stability properties of the optimal filter, the analyticity of these
rates is shown for analytically parameterized models. The obtained results hold
under relatively mild conditions and cover several classes of hidden Markov
models met in practice. These results are relevant for several (theoretically
and practically) important problems arising in statistical inference, system
identification and information theory.
Derivatives of Entropy Rate in Special Families of Hidden Markov Chains
Consider a hidden Markov chain obtained as the observation process of an
ordinary Markov chain corrupted by noise. Zuk et al. [13], [14] showed how,
in principle, one can explicitly compute the derivatives of the entropy rate
at extreme values of the noise. Namely, they showed that the derivatives of
standard upper approximations to the entropy rate actually stabilize at an
explicit finite time. We generalize this result to a natural class of hidden
Markov chains called ``Black Holes.'' We also discuss in depth special cases of
binary Markov chains observed in binary symmetric noise, and give an abstract
formula for the first derivative in terms of a measure on the simplex due to
Blackwell.
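Blackwell's formula expresses the entropy rate as an integral against the invariant measure of the optimal filter on the simplex. A Monte Carlo version of this idea is easy to sketch: run the filter along a simulated output sequence and average -log2 P(Y_t | Y_1..Y_{t-1}). The parameters below are illustrative, not taken from the paper:

```python
import math
import random

random.seed(0)

# Illustrative model: a binary Markov chain observed through a binary
# symmetric channel with crossover probability eps.
P = [[0.7, 0.3],
     [0.4, 0.6]]
pi = [4 / 7, 3 / 7]  # stationary distribution of P
eps = 0.1

def emit(x, y):
    """P(observe y | hidden state x) for the BSC."""
    return 1 - eps if x == y else eps

# The filter state (a point on the simplex) evolves under the Bayes
# recursion; averaging -log2 P(Y_t | Y_1..Y_{t-1}) along a long sample
# path converges to the entropy rate.
T = 200_000
x = 0 if random.random() < pi[0] else 1  # hidden state, stationary start
belief = pi[:]                           # filter: P(X_t = . | past outputs)
total = 0.0
for _ in range(T):
    x = 0 if random.random() < P[x][0] else 1          # hidden transition
    pred = [sum(belief[a] * P[a][b] for a in (0, 1)) for b in (0, 1)]
    y = x if random.random() > eps else 1 - x          # noisy observation
    py = sum(pred[b] * emit(b, y) for b in (0, 1))     # P(Y_t | past)
    total += -math.log2(py)
    belief = [pred[b] * emit(b, y) / py for b in (0, 1)]  # filter update

estimate = total / T
print(estimate)  # entropy rate estimate, in bits per symbol
```

This path-averaging estimator gives a numerical handle on the quantity whose derivatives the abstract above studies analytically.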
Asymptotics of Entropy Rate in Special Families of Hidden Markov Chains
We derive an asymptotic formula for the entropy rate of a hidden Markov chain
around a "weak Black Hole". We also discuss applications of the asymptotic
formula to the asymptotic behaviors of certain channels.
A Randomized Algorithm for the Capacity of Finite-State Channels
Inspired by ideas from the field of stochastic approximation, we propose a randomized algorithm to compute the capacity of a finite-state channel with a Markovian input. When the mutual information rate of the channel is concave with respect to the chosen parameterization, the proposed algorithm converges almost surely to the capacity of the channel, with an explicit convergence rate. We also discuss the convergence behavior of the algorithm without the concavity assumption.
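The abstract does not spell out the algorithm, but the underlying stochastic-approximation idea can be sketched generically. The following simultaneous-perturbation (SPSA-style) loop maximizes a concave function observed only through noisy evaluations; the objective here is a stand-in for a channel's mutual information rate, not the authors' actual algorithm:

```python
import random

random.seed(1)

# Generic SPSA sketch: maximize a concave objective known only through
# noisy evaluations.  The quadratic below is a stand-in, not a mutual
# information rate.
def noisy_objective(theta):
    return -(theta - 0.3) ** 2 + random.gauss(0, 0.01)

theta = 0.9  # initial guess for the input-process parameter
for k in range(1, 5001):
    a = 0.5 / k           # decreasing step size
    c = 0.1 / k ** 0.25   # decreasing perturbation size
    d = random.choice((-1, 1))
    # Two-sided randomized finite-difference gradient estimate.
    grad = (noisy_objective(theta + c * d)
            - noisy_objective(theta - c * d)) / (2 * c * d)
    theta += a * grad     # ascent step: the objective is concave

print(theta)  # drifts toward the maximizer 0.3
```

Concavity is what makes such a scheme trustworthy: with a concave objective, the Robbins-Monro iterates converge to the global maximizer rather than a local one, which is the point of the concavity result discussed above.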