KL-HMM and Probabilistic Lexical Modeling

Abstract

Kullback-Leibler divergence based hidden Markov model (KL-HMM) is an approach in which a posteriori probabilities of phonemes estimated by artificial neural networks (ANN) are modeled directly as feature observations. In this paper, we show the relation between the standard HMM-based automatic speech recognition (ASR) approach and the KL-HMM approach. More specifically, we show that KL-HMM is a probabilistic lexical modeling approach applicable to both HMM/GMM and hybrid HMM/ANN ASR systems. Through experimental studies on the DARPA Resource Management task, we show that the KL-HMM approach can improve over a state-of-the-art ASR system.
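To illustrate the idea of using ANN phoneme posteriors as observations, the following is a minimal sketch, not the paper's implementation: each HMM state is parameterized by a categorical distribution over phoneme classes, the local score between a state and a posterior feature vector is a KL divergence, and decoding accumulates these scores in a Viterbi pass. The function names, the particular KL direction (state distribution against posterior), and the uniform initial-state assumption are illustrative choices; KL-HMM variants differ in the divergence used.

```python
import numpy as np

def kl_local_score(state_dist, posterior, eps=1e-10):
    """KL(y_s || z_t): divergence between the state's categorical
    distribution y_s over phoneme classes and the ANN posterior z_t.
    Lower values indicate a better state/observation match.
    (Direction of the KL is an assumption; variants exist.)"""
    y = np.clip(state_dist, eps, 1.0)
    z = np.clip(posterior, eps, 1.0)
    return float(np.sum(y * np.log(y / z)))

def viterbi_kl(posteriors, state_dists, log_trans):
    """Viterbi decoding where the usual log-likelihood emission term is
    replaced by the negative KL local score, KL-HMM style.
    posteriors : (T, D) ANN phoneme posteriors per frame
    state_dists: (S, D) categorical distribution per HMM state
    log_trans  : (S, S) log transition probabilities
    Initial state distribution is taken as uniform for simplicity."""
    T, S = posteriors.shape[0], state_dists.shape[0]
    # Emission-like scores: negative KL divergence per (frame, state).
    scores = -np.array([[kl_local_score(state_dists[s], posteriors[t])
                         for s in range(S)] for t in range(T)])
    delta = np.full((T, S), -np.inf)
    back = np.zeros((T, S), dtype=int)
    delta[0] = scores[0]
    for t in range(1, T):
        cand = delta[t - 1][:, None] + log_trans   # (prev_state, cur_state)
        back[t] = np.argmax(cand, axis=0)
        delta[t] = cand[back[t], np.arange(S)] + scores[t]
    # Backtrace the best state sequence.
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

Under this sketch, training a KL-HMM amounts to estimating the per-state categorical distributions (e.g. by minimizing the accumulated KL divergence over aligned frames), while the ANN posterior estimator plays the role that the GMM or ANN acoustic model plays in standard systems.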
