    Position Models and Language Modeling

    In statistical language modelling, the classic model is the n-gram. This model is, however, unable to capture long-term dependencies, i.e. dependencies spanning more than n words. An alternative to this model is the probabilistic automaton. Unfortunately, preliminary experiments with this model in language modelling show that it is not yet competitive, partly because it tries to model dependencies that are too long. We propose here to improve the use of this model by restricting the dependency length to a more reasonable value. Experiments show a 45% reduction in perplexity on the Wall Street Journal language modeling task.
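
    For context, perplexity is the standard intrinsic metric behind the reported 45% figure, and n is the history length of the n-gram factorization the abstract refers to. A minimal sketch of these standard definitions in LaTeX (textbook notation, not taken from the paper itself):

        % n-gram approximation: each word is conditioned on only the n-1 preceding words
        \[
          P(w_1 \dots w_T) \approx \prod_{t=1}^{T} P(w_t \mid w_{t-n+1}, \dots, w_{t-1})
        \]
        % Perplexity over a T-word test set; lower is better. The paper reports
        % a 45% reduction in this quantity on the Wall Street Journal task.
        \[
          \mathrm{PPL} = P(w_1 \dots w_T)^{-1/T}
                       = \exp\left( -\frac{1}{T} \sum_{t=1}^{T} \log P(w_t \mid w_1, \dots, w_{t-1}) \right)
        \]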