Data augmentation improves the convergence of iterative algorithms, such as
the EM algorithm and the Gibbs sampler, by introducing carefully designed latent
variables. In this article, we first propose a data augmentation scheme for the
first-order autoregression plus noise model, where optimal values of the working
parameters introduced to recenter and rescale the latent states can
be derived analytically by minimizing the fraction of missing information in
the EM algorithm. The proposed data augmentation scheme is then utilized to
design efficient Markov chain Monte Carlo (MCMC) algorithms for Bayesian
inference in some non-Gaussian and nonlinear state space models, via a
mixture-of-normals approximation coupled with a block-specific reparametrization
strategy. Applications to simulated and benchmark real datasets indicate that
the proposed MCMC sampler can yield improvements in simulation efficiency
compared with centering, noncentering and even the ancillarity-sufficiency
interweaving strategy.

Keywords: Data augmentation, State space model, Stochastic volatility model, EM algorithm, Reparametrization, Markov chain Monte Carlo, Ancillarity-sufficiency interweaving strategy
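
To fix notation for the model referenced above, a minimal sketch of the first-order autoregression plus noise model, together with a generic recentering and rescaling of the latent states via working parameters $(a, b)$, is written below. The specific transformation shown is an illustrative assumption, not necessarily the exact scheme derived in the paper.

% Illustrative sketch (assumed notation): AR(1) plus noise observation and
% state equations, and a generic working-parameter transformation of the states.
\begin{align*}
  y_t              &= \alpha_t + \epsilon_t,                        & \epsilon_t &\sim \mathrm{N}(0, \sigma_\epsilon^2), \\
  \alpha_t         &= \mu + \phi\,(\alpha_{t-1} - \mu) + \eta_t,    & \eta_t     &\sim \mathrm{N}(0, \sigma_\eta^2), \\
  \tilde{\alpha}_t &= \frac{\alpha_t - a}{b},                       &            & a, b \ \text{working parameters.}
\end{align*}

In this generic form, $a = 0$, $b = 1$ leaves the states in their original (centered) form, while choices such as $a = \mu$, $b = \sigma_\eta$ correspond to noncentered-style transformations; per the abstract, the optimal $(a, b)$ are obtained analytically by minimizing the fraction of missing information in the EM algorithm.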