We present and analyse three online algorithms for learning in discrete
Hidden Markov Models (HMMs) and compare them with the Baldi-Chauvin algorithm.
Using the Kullback-Leibler divergence as a measure of generalisation error we
draw learning curves in simplified situations. The performance of one of the
presented algorithms in learning drifting concepts is analysed and compared
with that of the Baldi-Chauvin algorithm in the same situations. A brief discussion
of learning and symmetry breaking based on our results is also presented.

Comment: 8 pages, 6 figures