Comparing and Evaluating HMM Ensemble Training Algorithms Using Train and Test and Condition Number Criteria

Abstract

Hidden Markov Models have many applications in signal processing and pattern recognition, but their convergence-based training algorithms are known to suffer from over-sensitivity to the initial random model choice. This paper describes the boundary between regions in which ensemble learning is superior to Rabiner's multiple-sequence Baum-Welch training method, and proposes techniques for determining the best method in any arbitrary situation. It also studies the suitability of the training methods using the condition number, a recently proposed diagnostic tool for testing the quality of the model. A new method for training Hidden Markov Models, called the Viterbi Path Counting algorithm, is introduced and is found to produce significantly better performance than current methods in a range of trials.
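The condition-number diagnostic mentioned above can be illustrated with a minimal sketch: compute the condition number of an estimated HMM transition matrix with NumPy. The matrix values here are hypothetical, and the interpretation (a large condition number flagging a poorly conditioned model estimate) is the general linear-algebra reading, not the paper's specific criterion.

```python
import numpy as np

# Hypothetical row-stochastic transition matrix of a 3-state HMM
# (each row sums to 1; values are illustrative only).
A = np.array([
    [0.8, 0.1, 0.1],
    [0.2, 0.6, 0.2],
    [0.1, 0.3, 0.6],
])

# 2-norm condition number: ratio of the largest to the smallest
# singular value of A. Larger values indicate an ill-conditioned
# estimate whose parameters are sensitive to perturbations.
cond = np.linalg.cond(A)
print(cond)
```

Comparing this quantity across models trained from different random initialisations gives a quick, label-free way to rank estimates, complementing the train-and-test criterion.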