

By Dr Dimitri Kanevsky


The discriminative technique for estimating the parameters of Gaussian mixtures based on the Extended Baum transformations (EB) has had a significant impact on the speech recognition community. A proof that these transformations increase the value of an objective function with each iteration (i.e., that they are so-called "growth transformations") was presented by the author two years ago for diagonal Gaussian mixture densities. In this paper that proof is extended to multidimensional, multivariate Gaussian mixtures. The proof presented here is based on a linearization process and an explicit growth estimate for linear forms of Gaussian mixtures.
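To make the setting concrete, the following is a minimal sketch of the standard Extended Baum-Welch re-estimation step for a single diagonal-covariance Gaussian, as commonly used in discriminative (e.g., MMI) training. It is illustrative only: the function and variable names are assumptions, not notation from the paper, and the smoothing constant D is simply supplied by the caller (in practice D must be chosen large enough to keep the updated variances positive, which is what the growth-transformation proofs address).

```python
# Illustrative sketch (assumed names, not from the paper): one EBW
# growth-transformation step for a diagonal-covariance Gaussian.
import numpy as np

def ebw_update(mu, var, num_stats, den_stats, D):
    """One Extended Baum-Welch step for a diagonal Gaussian.

    num_stats / den_stats are (gamma, sum_x, sum_x2) accumulated over the
    numerator (reference) and denominator (competing-hypothesis) statistics:
    gamma is the total occupancy, sum_x the weighted sum of observations,
    sum_x2 the weighted sum of squared observations. D is the EBW
    smoothing constant controlling the step size.
    """
    g_num, x_num, x2_num = num_stats
    g_den, x_den, x2_den = den_stats
    denom = g_num - g_den + D
    # Mean update: numerator-minus-denominator statistics, smoothed
    # toward the current mean by D.
    new_mu = (x_num - x_den + D * mu) / denom
    # Variance update: same form on second moments, then subtract the
    # new squared mean; D must be large enough that this stays positive.
    new_var = (x2_num - x2_den + D * (var + mu ** 2)) / denom - new_mu ** 2
    return new_mu, new_var

# Toy one-dimensional example with synthetic sufficient statistics.
mu = np.array([0.0])
var = np.array([1.0])
num_stats = (10.0, np.array([10.0]), np.array([20.0]))
den_stats = (2.0, np.array([1.0]), np.array([2.0]))
new_mu, new_var = ebw_update(mu, var, num_stats, den_stats, D=5.0)
```

For a diagonal model this update is applied dimension by dimension; the paper's contribution is proving the growth property for full multivariate mixtures rather than only the diagonal case.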

Topics: Statistical Models
Year: 2005


