
Note on neural network sampling for Bayesian inference of mixture processes

Abstract

In this paper we show some further experiments with neural network sampling, a class of sampling methods that make use of neural network approximations to (posterior) densities, introduced by Hoogerheide et al. (2007). We consider a method where a mixture of Student's t densities, which can be interpreted as a neural network function, is used as a candidate density in importance sampling or the Metropolis-Hastings algorithm. It is applied to an illustrative 2-regime mixture model for the US real GNP growth rate. We explain the non-elliptical shapes of the posterior distribution, and show that the proposed method outperforms Gibbs sampling with data augmentation and the griddy Gibbs sampler.
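To give a rough feel for the candidate-density idea described above (not the authors' implementation), the following is a minimal sketch of importance sampling with a mixture-of-Student's-t candidate for a generic one-dimensional target. The function names (mixture_t_rvs, importance_sampling) and the toy bimodal target are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy import stats

def mixture_t_rvs(weights, locs, scales, dfs, size, rng):
    """Draw from a 1-D mixture of Student's t densities (the candidate)."""
    comps = rng.choice(len(weights), size=size, p=weights)
    return locs[comps] + scales[comps] * rng.standard_t(dfs[comps], size=size)

def mixture_t_pdf(x, weights, locs, scales, dfs):
    """Evaluate the mixture-of-t candidate density at x."""
    return sum(w * stats.t.pdf(x, df=df, loc=m, scale=s)
               for w, m, s, df in zip(weights, locs, scales, dfs))

def importance_sampling(log_target, weights, locs, scales, dfs,
                        n_draws=10_000, seed=0):
    """Importance sampling with a mixture-of-t candidate.

    Returns draws from the candidate and self-normalized importance
    weights, which can be used to estimate posterior expectations.
    """
    rng = np.random.default_rng(seed)
    draws = mixture_t_rvs(weights, locs, scales, dfs, n_draws, rng)
    log_w = log_target(draws) - np.log(
        mixture_t_pdf(draws, weights, locs, scales, dfs))
    w = np.exp(log_w - log_w.max())  # stabilize before normalizing
    return draws, w / w.sum()

if __name__ == "__main__":
    # Toy bimodal (unnormalized) target, standing in for a non-elliptical posterior.
    log_target = lambda x: np.logaddexp(stats.norm.logpdf(x, -2.0, 0.5),
                                        stats.norm.logpdf(x, 3.0, 1.0))
    draws, w = importance_sampling(
        log_target,
        weights=np.array([0.5, 0.5]), locs=np.array([-2.0, 3.0]),
        scales=np.array([1.0, 1.5]), dfs=np.array([5.0, 5.0]))
    print("Posterior mean estimate:", np.sum(w * draws))
```

The same candidate density could instead be used as an independence proposal in a Metropolis-Hastings chain; the fat tails of the Student's t components are what make the candidate suitable for non-elliptical, possibly multimodal posteriors such as those of the 2-regime mixture model.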
