Bayesian Adaptive Hamiltonian Monte Carlo with an Application to High-Dimensional BEKK GARCH Models
Hamiltonian Monte Carlo (HMC) is a recent statistical procedure to sample from complex distributions. Distant proposal draws are taken in a sequence of steps following the Hamiltonian dynamics of the underlying parameter space, often yielding superior mixing properties of the resulting Markov chain. However, its performance can deteriorate sharply with the degree of irregularity of the underlying likelihood, due to its lack of local adaptability in the parameter space. Riemann Manifold HMC (RMHMC), a locally adaptive version of HMC, alleviates this problem, but at a substantially increased computational cost that can become prohibitive in high-dimensional scenarios. In this paper we propose Adaptive HMC (AHMC), an alternative inferential method based on HMC that is both fast and locally adaptive, combining the advantages of HMC and RMHMC. The benefits become more pronounced with higher dimensionality of the parameter space and with the degree of irregularity of the underlying likelihood surface. We show that AHMC satisfies detailed balance for a valid MCMC scheme and provide a comparison with RMHMC in terms of effective sample size, highlighting substantial efficiency gains of AHMC. Simulation examples and an application to the BEKK GARCH model show the usefulness of the new posterior sampler.
Keywords: High-dimensional joint sampling; Markov chain Monte Carlo; Multivariate GARCH
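As background for the abstract above, the following is a minimal sketch of a standard (non-adaptive) HMC transition: leapfrog integration of the Hamiltonian dynamics followed by a Metropolis accept/reject step, which is what preserves detailed balance. The fixed step size and identity mass matrix are precisely the non-adaptive choices that RMHMC and the proposed AHMC improve on; all names and parameter values here are illustrative, not taken from the paper.

```python
import numpy as np

def hmc_step(log_prob, grad_log_prob, theta, eps=0.1, n_leapfrog=20, rng=None):
    """One HMC transition targeting exp(log_prob), identity mass matrix."""
    rng = rng or np.random.default_rng()
    p0 = rng.standard_normal(theta.shape)          # resample momentum
    theta_new, p = theta.copy(), p0.copy()
    # leapfrog integration of the Hamiltonian dynamics
    p = p + 0.5 * eps * grad_log_prob(theta_new)
    for _ in range(n_leapfrog - 1):
        theta_new = theta_new + eps * p
        p = p + eps * grad_log_prob(theta_new)
    theta_new = theta_new + eps * p
    p = p + 0.5 * eps * grad_log_prob(theta_new)
    # Metropolis accept/reject: corrects the integration error and
    # makes the chain satisfy detailed balance
    h_old = -log_prob(theta) + 0.5 * p0 @ p0
    h_new = -log_prob(theta_new) + 0.5 * p @ p
    if np.log(rng.uniform()) < h_old - h_new:
        return theta_new, True
    return theta, False

# Usage: sample a 2-D standard normal
log_prob = lambda x: -0.5 * x @ x
grad_log_prob = lambda x: -x
rng = np.random.default_rng(0)
theta = np.zeros(2)
draws = []
for _ in range(2000):
    theta, _ = hmc_step(log_prob, grad_log_prob, theta, rng=rng)
    draws.append(theta)
draws = np.asarray(draws)
```

On an irregular likelihood surface, a single global `eps` is exactly what fails: a step size small enough for the narrow regions wastes computation in the wide ones, which is the local-adaptability gap the abstract describes.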
Connecting the Dots: Towards Continuous Time Hamiltonian Monte Carlo
Continuous time Hamiltonian Monte Carlo is introduced, as a powerful
alternative to Markov chain Monte Carlo methods for continuous target
distributions. The method is constructed in two steps: first, Hamiltonian
dynamics are chosen as the deterministic dynamics in a continuous time
piecewise deterministic Markov process. Under very mild restrictions, such a
process will have the desired target distribution as an invariant distribution.
Second, the numerical implementation of such processes, based on adaptive
numerical integration of second-order ordinary differential equations, is
considered. The numerical implementation yields an approximate, yet highly
robust algorithm that, unlike conventional Hamiltonian Monte Carlo, enables the
exploitation of the complete Hamiltonian trajectories (hence the title). The
proposed algorithm may yield large speedups and improvements in stability
relative to relevant benchmarks, while incurring numerical errors that are
negligible relative to the overall Monte Carlo errors.
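The abstract's central idea, using the complete Hamiltonian trajectory rather than only its endpoint, can be loosely illustrated as follows. This is not the authors' piecewise deterministic algorithm; it is a sketch of the related randomized-HMC construction, in which momentum is refreshed at exponentially distributed times and all points along each adaptively integrated trajectory segment contribute samples. All names and parameters are assumptions for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def ct_hmc(grad_neg_log_prob, dim, t_total=400.0, refresh_rate=1.0,
           n_eval=8000, seed=0):
    """Sample by integrating Hamiltonian dynamics between momentum refreshes,
    keeping many points per trajectory (time-weighted occupation measure)."""
    rng = np.random.default_rng(seed)

    def ham_ode(t, z):                      # second-order dynamics as a first-order system
        q, p = z[:dim], z[dim:]
        return np.concatenate([p, -grad_neg_log_prob(q)])

    q, t_used, samples = np.zeros(dim), 0.0, []
    while t_used < t_total:
        p = rng.standard_normal(dim)        # full momentum refresh
        tau = rng.exponential(1.0 / refresh_rate)
        sol = solve_ivp(ham_ode, (0.0, tau), np.concatenate([q, p]),
                        dense_output=True, rtol=1e-8, atol=1e-8)
        # exploit the whole trajectory: evaluate at many interior times,
        # with the point count proportional to the segment duration
        ts = np.linspace(0.0, tau, max(2, int(n_eval * tau / t_total)))
        samples.append(sol.sol(ts)[:dim].T)
        q = sol.y[:dim, -1]
        t_used += tau
    return np.vstack(samples)

# Target: standard 2-D normal, so grad(-log p)(q) = q
draws = ct_hmc(lambda q: q, dim=2)
```

Because the adaptive solver controls the integration error, no accept/reject step appears here; this mirrors the abstract's claim that the numerical error can be made negligible relative to the Monte Carlo error.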
MCMC inference for Markov Jump Processes via the Linear Noise Approximation
Bayesian analysis for Markov jump processes is a non-trivial and challenging
problem. Although exact inference is theoretically possible, it is
computationally demanding, and thus its applicability is limited to a small class of
problems. In this paper we describe the application of Riemann manifold MCMC
methods using an approximation to the likelihood of the Markov jump process
which is valid when the system modelled is near its thermodynamic limit. The
proposed approach is both statistically and computationally efficient, while the
convergence rate and mixing of the chains allow for fast MCMC inference. The
methodology is evaluated using numerical simulations on two problems from
chemical kinetics and one from systems biology.
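The linear noise approximation invoked in this abstract can be sketched on the simplest chemical-kinetics example, an immigration-death process; the rates below are illustrative, not from the paper. Near the thermodynamic limit the copy number is approximated as a Gaussian whose mean follows the macroscopic rate equation and whose variance follows a linear ODE, which is what makes the likelihood tractable for MCMC.

```python
# Immigration-death process: 0 -> X at rate b (per unit volume),
# X -> 0 at rate d per molecule. The LNA gives
#   dphi/dt   = b - d*phi                         (macroscopic mean)
#   dSigma/dt = -2*d*Sigma + (b + d*phi)          (fluctuation variance)
# For this model the exact stationary law is Poisson(b/d), so the
# stationary LNA mean and variance should both equal b/d.
b, d = 2.0, 0.5
phi, Sigma, dt = 0.0, 0.0, 1e-3
for _ in range(40000):                            # Euler integration to t = 40
    D = b + d * phi                               # diffusion term at current mean
    phi += dt * (b - d * phi)
    Sigma += dt * (-2.0 * d * Sigma + D)

# phi and Sigma both approach b/d = 4.0 at stationarity
```

In an inference setting, these mean and variance ODEs would supply a Gaussian approximate likelihood for observed counts, to which gradient-based samplers such as the Riemann manifold MCMC methods mentioned above can then be applied.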