On the equivalence between standard and sequentially ordered hidden Markov models
Chopin (2007) introduced a sequentially ordered hidden Markov model, for
which states are ordered according to their order of appearance, and claimed
that such a model is a re-parametrisation of a standard hidden Markov model.
This note gives a formal proof that this equivalence holds in Bayesian terms,
as both formulations generate equivalent posterior distributions, but does not
hold in Frequentist terms, as both formulations generate incompatible
likelihood functions. Perhaps surprisingly, this shows that Bayesian
re-parametrisation and Frequentist re-parametrisation are not identical
concepts.
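The "ordered according to their order of appearance" relabelling can be made concrete with a short sketch. The function name and the example sequence below are illustrative, not taken from the paper:

```python
def relabel_by_appearance(states):
    """Relabel a state sequence so that labels reflect order of first appearance.

    The first distinct state seen becomes 0, the second becomes 1, and so on.
    This is the re-parametrisation used by sequentially ordered hidden Markov
    models (illustrative sketch, not the paper's implementation).
    """
    mapping = {}
    relabelled = []
    for s in states:
        if s not in mapping:
            mapping[s] = len(mapping)  # assign the next unused label
        relabelled.append(mapping[s])
    return relabelled

# State 2 appears first, then 0, then 1:
print(relabel_by_appearance([2, 2, 0, 2, 1, 0]))  # [0, 0, 1, 0, 2, 1]
```

Any permutation of the original labels maps to the same relabelled sequence, which is why the ordered model collapses the label-switching symmetry of the standard parametrisation.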
Free energy Sequential Monte Carlo, application to mixture modelling
We introduce a new class of Sequential Monte Carlo (SMC) methods, which we
call free energy SMC. This class is inspired by free energy methods, which
originate from Physics, and where one samples from a biased distribution such
that a given function ξ(θ) of the state θ is forced to be
uniformly distributed over a given interval. From an initial sequence of
distributions π_t of interest, and a particular choice of ξ(θ),
a free energy SMC sampler computes sequentially a sequence of biased
distributions π̃_t with the following properties: (a) the
marginal distribution of ξ(θ) with respect to π̃_t is
approximately uniform over a specified interval, and (b) π̃_t
and π_t have the same conditional distribution with respect to ξ(θ). We
apply our methodology to mixture posterior distributions, which are highly
multimodal. In the mixture context, forcing certain hyper-parameters to higher
values greatly facilitates mode swapping, and makes it possible to recover a
symmetric output. We illustrate our approach with univariate and bivariate
Gaussian mixtures and two real-world datasets.
Comment: presented at "Bayesian Statistics 9" (Valencia meetings, 4-8 June
2010, Benidorm)
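The core biasing idea, reweighting so that the marginal of a chosen function of the state becomes uniform over an interval, can be sketched in a toy setting. Everything below is an illustrative stand-in: the target is a standard normal rather than a mixture posterior, the function of the state is the identity, and the free-energy bias is estimated with a simple histogram rather than the paper's sequential scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unbiased target: theta ~ N(0, 1).  We bias so that xi(theta) = theta
# is roughly uniform on [lo, hi] (properties (a)/(b) in the abstract).
def xi(theta):
    return theta

theta = rng.normal(size=100_000)            # draws from the unbiased target
lo, hi = -2.0, 2.0
inside = (xi(theta) >= lo) & (xi(theta) <= hi)
kept = theta[inside]

# Histogram estimate of the marginal density of xi on the interval; weighting
# each particle by 1/density flattens that marginal (the free-energy bias).
edges = np.linspace(lo, hi, 41)
density, _ = np.histogram(xi(kept), bins=edges, density=True)
bin_idx = np.clip(np.digitize(xi(kept), edges) - 1, 0, len(density) - 1)
w = 1.0 / density[bin_idx]
w /= w.sum()

# Resample according to the biasing weights: the resampled xi values should
# now be close to uniform on [lo, hi].
resampled = rng.choice(kept, size=50_000, p=w)
hist, _ = np.histogram(resampled, bins=edges, density=True)
print("relative spread of biased marginal:", hist.std() / hist.mean())
```

A small relative spread (a flat histogram) indicates the biased marginal is near-uniform; in the mixture application, the same mechanism pushes selected hyper-parameters through the high values that let modes swap.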