Modified Wynn's Sequential Algorithm for Constructing D-Optimal Designs: Adding Two Points at a Time
On Approximations of the Beta Process in Latent Feature Models: Point Processes Approach
On simulations from the two-parameter Poisson-Dirichlet process and the normalized inverse-Gaussian process
An adaptive truncation method for inference in Bayesian nonparametric models
Many exact Markov chain Monte Carlo algorithms have been developed for posterior inference in Bayesian nonparametric models which involve infinite-dimensional priors. However, these methods are not generic, and special methodology must be developed for different classes of prior or different models. Alternatively, the infinite-dimensional prior can be truncated and standard Markov chain Monte Carlo methods used for inference. However, the error in approximating the infinite-dimensional posterior can be hard to control for many models. This paper describes an adaptive truncation method which allows the level of the truncation to be decided by the algorithm and so can avoid large errors in approximating the posterior. A sequence of truncated priors is constructed and sampled using Markov chain Monte Carlo methods embedded in a sequential Monte Carlo algorithm. Implementation details for infinite mixture models with stick-breaking priors and for normalized random measures with independent increments priors are discussed. The methodology is illustrated on infinite mixture models, a semiparametric linear mixed model and a nonparametric time series model.
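
The sketch below illustrates the general idea of adaptive truncation in Python for one concrete case: a Dirichlet process mixture of normals with a stick-breaking prior. It is a minimal illustration, not the authors' implementation; the function names (weights_from_sticks, mixture_loglik, adaptive_truncation_smc), the use of prior draws in place of MCMC-initialised particles, and the simple stopping rule based on the size of the incremental weights are all assumptions made for brevity.

```python
import numpy as np

def weights_from_sticks(v):
    """Stick-breaking weights from beta draws v; the unbroken remainder of the
    stick is folded into the last atom so the truncated weights sum to one."""
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    w[-1] += 1.0 - w.sum()
    return w

def mixture_loglik(y, w, mu, sigma=1.0):
    """Log likelihood of data y under a normal mixture with weights w, means mu."""
    comp = np.exp(-0.5 * ((y[:, None] - mu[None, :]) / sigma) ** 2)
    comp /= sigma * np.sqrt(2 * np.pi)
    return np.log(comp @ w + 1e-300).sum()

def adaptive_truncation_smc(y, alpha=1.0, K0=3, K_max=30, n_particles=200,
                            tol=1e-3, seed=0):
    """Move through a sequence of truncated stick-breaking priors, letting the
    sampler choose the truncation level (hypothetical, simplified version)."""
    rng = np.random.default_rng(seed)
    # Initialise particles from the prior truncated at K0 atoms (a full
    # implementation would instead run MCMC on the truncated posterior and
    # rejuvenate particles with MCMC moves after each reweighting).
    v = rng.beta(1.0, alpha, size=(n_particles, K0))
    mu = rng.normal(0.0, 3.0, size=(n_particles, K0))
    log_w = np.zeros(n_particles)
    prev_ll = np.array([mixture_loglik(y, weights_from_sticks(v[i]), mu[i])
                        for i in range(n_particles)])
    K = K0
    for K in range(K0 + 1, K_max + 1):
        # Extend each particle with one more stick-breaking draw and atom.
        v = np.hstack([v, rng.beta(1.0, alpha, size=(n_particles, 1))])
        mu = np.hstack([mu, rng.normal(0.0, 3.0, size=(n_particles, 1))])
        new_ll = np.array([mixture_loglik(y, weights_from_sticks(v[i]), mu[i])
                           for i in range(n_particles)])
        incr = new_ll - prev_ll          # incremental SMC log weights
        log_w += incr
        prev_ll = new_ll
        # Stop once adding an atom barely changes any particle's posterior weight,
        # i.e. the truncation error is negligible (illustrative stopping rule).
        if np.abs(incr).max() < tol:
            break
    w_norm = np.exp(log_w - log_w.max())
    w_norm /= w_norm.sum()
    return K, w_norm, (v, mu)

if __name__ == "__main__":
    # Toy usage: two well-separated clusters.
    rng = np.random.default_rng(1)
    y = np.concatenate([rng.normal(-2, 1, 50), rng.normal(3, 1, 50)])
    K, w, _ = adaptive_truncation_smc(y)
    print("stopped at truncation level", K)
```

The key design point the sketch tries to convey is that the truncation level is not fixed in advance: each increase of the truncation is an SMC step whose incremental weights measure how much the extra atom changes the posterior, so the algorithm can stop once further atoms no longer matter.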
