Analyticity of Entropy Rate of Hidden Markov Chains


We prove that, under mild positivity assumptions, the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle for determining the domain of analyticity is stated, and an example is given to estimate the radius of convergence of the entropy rate. We then show that the positivity assumptions can be relaxed, and give examples under the relaxed conditions. We study one special class of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, for which we give necessary and sufficient conditions for analyticity of the entropy rate. Finally, we show that under the positivity assumptions the hidden Markov chain {\em itself} varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.

Comment: The title has been changed. The new main theorem combines the old main theorem and the remark that followed it. A new section has been added as an introduction to complex analysis. A general principle and an example for determining the domain of analyticity of the entropy rate have been added, as have relaxed conditions for analyticity of the entropy rate and corresponding examples. The section on a binary Markov chain corrupted by binary symmetric noise has been removed (it will be part of another paper).
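The entropy rate studied here, H(Y) = lim (1/n) H(Y_1, ..., Y_n), has no closed form in general, but it can be estimated numerically along a long sample path by accumulating -log p(y_t | y_1, ..., y_{t-1}) from the forward (filtering) recursion. The following sketch illustrates this for a hypothetical two-state Markov chain observed through binary symmetric noise; the transition matrix P and crossover probability eps are illustrative choices, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not from the paper): a 2-state Markov chain
# with transition matrix P, observed through a binary symmetric channel
# with crossover probability eps.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
eps = 0.1
# Emission matrix: B[state, symbol] = p(symbol | state)
B = np.array([[1 - eps, eps],
              [eps, 1 - eps]])

def estimate_entropy_rate(P, B, n=200_000, rng=rng):
    """Monte Carlo estimate (in bits) of the entropy rate
    H(Y) = lim (1/n) H(Y_1, ..., Y_n): average -log2 p(y_t | y_1..t-1)
    along one long sample path, using the forward recursion."""
    k = P.shape[0]
    # Stationary distribution of the hidden chain (left eigenvector of P).
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi = pi / pi.sum()

    x = rng.choice(k, p=pi)   # hidden state x_1
    belief = pi.copy()        # predictive distribution p(x_t | y_1..t-1)
    total = 0.0
    for _ in range(n):
        y = rng.choice(B.shape[1], p=B[x])   # noisy observation of x_t
        py = belief @ B[:, y]                # p(y_t | y_1..t-1)
        total += -np.log2(py)
        belief = belief * B[:, y] / py       # Bayes update: p(x_t | y_1..t)
        belief = belief @ P                  # propagate: p(x_{t+1} | y_1..t)
        x = rng.choice(k, p=P[x])            # advance the hidden chain
    return total / n

print(estimate_entropy_rate(P, B))
```

Since the output process is binary, the estimate is bounded above by 1 bit, and below by the entropy of the noise, h(eps); re-running with perturbed entries of P gives a direct numerical sense of the smooth parameter dependence that the paper establishes analytically.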