Institute of Electrical and Electronics Engineers (IEEE)
Abstract
We present a sequential Monte Carlo (SMC) method for maximum
likelihood (ML) parameter estimation in latent variable models. Standard
methods rely on gradient algorithms such as the Expectation-Maximization
(EM) algorithm and its Monte Carlo variants. Our approach is
different and is motivated by considerations similar to those of simulated
annealing (SA); that is, we propose to sample from a sequence
of artificial distributions whose support concentrates on the set
of ML estimates. To achieve this, we use SMC methods. We conclude
by presenting simulation results on a toy problem and on a nonlinear,
non-Gaussian time series model.
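The annealing idea described above can be illustrated with a minimal sketch. This is a hypothetical toy example, not the paper's actual algorithm or model: it estimates the mean of a Gaussian (whose ML estimate is the sample mean, so concentration is easy to verify) by propagating particles through a sequence of targets proportional to the likelihood raised to an increasing power, with resampling and a Metropolis move at each step. All names, the annealing schedule, and the proposal scale are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (assumed for illustration): i.i.d. N(theta, 1) data,
# so the ML estimate of theta is the sample mean.
data = rng.normal(loc=2.0, scale=1.0, size=50)

def log_lik(theta):
    # Log-likelihood of the data under N(theta, 1), up to an additive constant,
    # evaluated for a vector of particle values at once.
    return -0.5 * np.sum((data[None, :] - theta[:, None]) ** 2, axis=1)

n = 500
theta = rng.uniform(-5.0, 5.0, size=n)   # particles from a flat initial distribution
gammas = np.linspace(0.0, 5.0, 21)       # annealing schedule: targets proportional to L(theta)^gamma

for g_prev, g in zip(gammas[:-1], gammas[1:]):
    # Importance weights bridging two successive annealed targets.
    logw = (g - g_prev) * log_lik(theta)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Multinomial resampling.
    theta = theta[rng.choice(n, size=n, p=w)]
    # Random-walk Metropolis move leaving the current target invariant.
    prop = theta + 0.1 * rng.normal(size=n)
    log_acc = g * (log_lik(prop) - log_lik(theta))
    accept = np.log(rng.uniform(size=n)) < log_acc
    theta = np.where(accept, prop, theta)

# As gamma grows, the particle cloud concentrates near the ML estimate.
estimate = theta.mean()
```

As the annealing exponent increases, the artificial targets become sharply peaked around the likelihood maximizer, so the surviving particles cluster near the sample mean; this mirrors the support-concentration argument in the abstract, with SMC providing the sampling machinery.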