Position Models and Language Modeling

Abstract

In statistical language modeling, the classic model is the n-gram. This model is not able, however, to capture long-term dependencies, i.e. dependencies longer than n. An alternative to this model is the probabilistic automaton. Unfortunately, preliminary experiments on the use of this model in language modeling show that it is not yet competitive, partly because it tries to model dependencies that are too long. We propose here to improve the use of this model by restricting the dependency length to a more reasonable value. Experiments show a 45% reduction in the perplexity obtained on the Wall Street Journal language modeling task.
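For concreteness, the baseline the abstract refers to can be sketched as a count-based n-gram model, where the context window of n-1 tokens is exactly the dependency limit discussed above, evaluated by perplexity. This is a minimal illustrative sketch, not the paper's method; the add-alpha smoothing, the <s>/</s> boundary markers, and all function names are assumptions of this example.

    from collections import defaultdict
    import math

    def train_ngram(corpus, n):
        """Collect n-gram and context counts from a tokenized corpus."""
        counts = defaultdict(int)          # (context, word) -> count
        context_counts = defaultdict(int)  # context -> count
        for sentence in corpus:
            tokens = ["<s>"] * (n - 1) + sentence + ["</s>"]
            for i in range(n - 1, len(tokens)):
                context = tuple(tokens[i - n + 1:i])
                counts[(context, tokens[i])] += 1
                context_counts[context] += 1
        return counts, context_counts

    def perplexity(corpus, counts, context_counts, n, vocab_size, alpha=1.0):
        """Per-token perplexity with add-alpha smoothing: the exponential
        of the average negative log-probability of the test tokens."""
        log_prob, n_tokens = 0.0, 0
        for sentence in corpus:
            tokens = ["<s>"] * (n - 1) + sentence + ["</s>"]
            for i in range(n - 1, len(tokens)):
                context = tuple(tokens[i - n + 1:i])
                p = (counts[(context, tokens[i])] + alpha) / \
                    (context_counts[context] + alpha * vocab_size)
                log_prob += math.log(p)
                n_tokens += 1
        return math.exp(-log_prob / n_tokens)

    # Toy usage: a bigram model (n = 2) conditions only on the previous
    # token, so any dependency longer than one word is invisible to it.
    train = [["the", "cat", "sat"], ["the", "dog", "sat"]]
    test = [["the", "cat", "sat"]]
    counts, context_counts = train_ngram(train, n=2)
    vocab = {w for s in train for w in s} | {"</s>"}
    print(perplexity(test, counts, context_counts, n=2, vocab_size=len(vocab)))

The probabilistic automaton discussed in the abstract replaces this fixed window with states that can, in principle, summarize unbounded history; the paper's contribution is to bound that history to a length between the two extremes.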
