4 research outputs found

    A compact statistical model of the song syntax in Bengalese finch

    Songs of many songbird species consist of variable sequences of a finite number of syllables. A common approach for characterizing the syntax of these complex syllable sequences is to use transition probabilities between the syllables. This is equivalent to the Markov model, in which each syllable is associated with one state, and the transition probabilities between the states do not depend on the state transition history. Here we analyze the song syntax in a Bengalese finch. We show that the Markov model fails to capture the statistical properties of the syllable sequences. Instead, a state transition model that accurately describes the statistics of the syllable sequences includes adaptation of the self-transition probabilities when states are repeatedly revisited, and allows associations of more than one state to the same syllable. Such a model does not increase the model complexity significantly. Mathematically, the model is a partially observable Markov model with adaptation (POMMA). The success of the POMMA supports the branching chain network hypothesis of how syntax is controlled within the premotor song nucleus HVC, and suggests that adaptation and many-to-one mapping from neural substrates to syllables are important features of the neural control of complex song syntax.
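    The two ingredients named in the abstract — adaptation of self-transition probabilities on repeated visits, and many-to-one mapping from states to syllables — can be sketched as a small generative model. Everything below (the states "a", "b1", "b2", the transition probabilities, and the decay factor `alpha`) is illustrative and not taken from the paper; note that states "b1" and "b2" both emit the same syllable "b".

```python
import random

# Illustrative POMMA-style generator (hypothetical parameters, not the
# paper's fitted model). Two states map to the same syllable "b"
# (many-to-one), and each state's self-transition probability is
# multiplied by `alpha` every time the state repeats (adaptation).

SYLLABLE = {"a": "a", "b1": "b", "b2": "b"}  # "end" emits nothing

TRANS = {
    "start": {"a": 1.0},
    "a":     {"a": 0.6, "b1": 0.4},
    "b1":    {"b1": 0.5, "b2": 0.5},
    "b2":    {"b2": 0.7, "end": 0.3},
}

def sample_song(alpha=0.5, rng=random):
    """Sample one syllable sequence from the toy POMMA."""
    state, song = "start", []
    self_p = {}  # current (adapted) self-transition probability per state
    while state != "end":
        probs = dict(TRANS[state])
        if state in probs:
            # Replace the base self-transition probability with the
            # adapted one, rescaling the other transitions to sum to 1.
            p_self = self_p.get(state, probs[state])
            rest = 1.0 - probs[state]
            scale = (1.0 - p_self) / rest if rest > 0 else 0.0
            probs = {s: (p_self if s == state else p * scale)
                     for s, p in probs.items()}
        nxt = rng.choices(list(probs), weights=list(probs.values()))[0]
        if nxt == state:  # repeated visit: adapt (shrink) self-transition
            self_p[state] = self_p.get(state, TRANS[state][state]) * alpha
        else:             # adaptation resets once the state is left
            self_p.pop(state, None)
        if nxt in SYLLABLE:
            song.append(SYLLABLE[nxt])
        state = nxt
    return "".join(song)
```

    Because the self-transition probability decays geometrically under adaptation, repeat counts of a syllable fall off faster than the geometric distribution a plain Markov model would produce, which is the kind of statistic the abstract says the Markov model fails to capture.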

    Inducing Hidden Markov Models to Model Long-Term Dependencies


    A Markovian approach to the induction of regular string distributions

    We propose in this paper a novel approach to the induction of the structure of Hidden Markov Models (HMMs). The notion of partially observable Markov models (POMMs) is introduced. POMMs form a particular case of HMMs where any state emits a single letter with probability one, but several states can emit the same letter. It is shown that any HMM can be represented by an equivalent POMM. The proposed induction algorithm aims at finding a POMM fitting a sample drawn from an unknown target POMM. The induced model is built to fit the dynamics of the target machine observed in the sample. A POMM is seen as a lumped process of a Markov chain and the induced POMM is constructed to best approximate the stationary distribution and the mean first passage times (MFPT) observed in the sample. The induction relies on iterative state splitting from an initial maximum likelihood model. The transition probabilities of the updated model are found by solving an optimization problem to minimize the difference between the observed MFPT and their values computed in the induced model.
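    The two statistics the induction tries to match — the stationary distribution and the mean first passage times (MFPT) of the underlying Markov chain — can be computed in closed form for a known transition matrix. A minimal sketch, using an illustrative 3-state matrix `P` that is not from the paper:

```python
import numpy as np

# Stationary distribution and MFPT of a Markov chain with transition
# matrix P. The matrix below is an arbitrary 3-state example.

P = np.array([
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.4, 0.0, 0.6],
])

def stationary(P):
    """Solve pi @ P = pi with sum(pi) = 1 (least squares on the
    overdetermined system)."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

def mfpt(P):
    """M[i, j] = expected steps to first reach j from i (i != j),
    from M[i, j] = 1 + sum_{k != j} P[i, k] * M[k, j]."""
    n = P.shape[0]
    M = np.zeros((n, n))
    for j in range(n):
        idx = [i for i in range(n) if i != j]
        Q = P[np.ix_(idx, idx)]            # P with row/column j removed
        m = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
        M[np.ix_(idx, [j])] = m[:, None]
        # Diagonal left at 0; the mean return time to j is 1/pi_j
        # (Kac's formula), a separate quantity.
    return M
```

    In the induction setting these quantities are estimated from the sample rather than computed from a known `P`; the optimization then adjusts the induced model's transition probabilities so its MFPT values match the observed ones.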
