375 research outputs found
Discussion of "Geodesic Monte Carlo on Embedded Manifolds"
Contributed discussion and rejoinder to "Geodesic Monte Carlo on Embedded Manifolds" (arXiv:1301.6064). Comment: Discussion of arXiv:1301.6064. To appear in the Scandinavian Journal of Statistics. 18 pages.
Curvature and Concentration of Hamiltonian Monte Carlo in High Dimensions
In this article, we analyze Hamiltonian Monte Carlo (HMC) by placing it in
the setting of Riemannian geometry using the Jacobi metric, so that each step
corresponds to a geodesic on a suitable Riemannian manifold. We then combine
the notion of curvature of a Markov chain due to Joulin and Ollivier with the
classical sectional curvature from Riemannian geometry to derive error bounds
for HMC in important cases where we have positive curvature. These cases
include several classical distributions such as multivariate Gaussians, and
also distributions arising in the study of Bayesian image registration. The
theoretical development suggests the sectional curvature as a new diagnostic
tool for the convergence of certain Markov chains. Comment: Comments welcome.
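For reference, the Jacobi-metric construction mentioned above rests on the classical Maupertuis principle: for a Hamiltonian $H(q,p) = U(q) + \tfrac{1}{2}\|p\|^2$ at fixed total energy $h$, the position-space trajectories coincide, up to reparametrization, with geodesics of the conformally rescaled metric

$$ g_J = 2\,\bigl(h - U(q)\bigr)\, g, $$

where $g$ is the underlying kinetic-energy metric. This is what lets each HMC step be read as a geodesic on a Riemannian manifold, so that sectional-curvature bounds can be combined with the Joulin-Ollivier coarse Ricci curvature of the chain.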
Differential geometric MCMC methods and applications
This thesis presents novel Markov chain Monte Carlo methodology that exploits the natural representation of a statistical model as a Riemannian manifold. The methods developed provide generalisations of the Metropolis-adjusted Langevin algorithm and the Hybrid Monte Carlo algorithm for Bayesian statistical inference, and resolve many shortcomings of existing Monte Carlo algorithms when sampling from target densities that may be high dimensional and exhibit strong correlation structure. The performance of these Riemannian manifold Markov chain Monte Carlo algorithms is rigorously assessed by performing Bayesian inference on logistic regression models, log-Gaussian Cox point process models, stochastic volatility models, and both parameter and model level inference of dynamical systems described by nonlinear differential equations.
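The Langevin-algorithm generalisation summarised above can be illustrated with a minimal sketch. This is not the thesis's method: the full Riemannian manifold MALA uses a position-dependent metric (e.g. the Fisher information); here a fixed metric `G` (with inverse `G_inv`) simply preconditions the Langevin proposal, which is the constant-metric special case.

```python
import numpy as np

def mala_step(x, log_post, grad_log_post, G_inv, eps, rng):
    """One Metropolis-adjusted Langevin step preconditioned by a fixed
    metric G (simplification: manifold MALA lets G vary with position)."""
    L = np.linalg.cholesky(G_inv)
    # Drift toward higher posterior density, scaled by the inverse metric.
    mean_fwd = x + 0.5 * eps**2 * G_inv @ grad_log_post(x)
    prop = mean_fwd + eps * L @ rng.standard_normal(x.size)
    mean_bwd = prop + 0.5 * eps**2 * G_inv @ grad_log_post(prop)

    def log_q(y, mean):
        # Gaussian proposal log-density up to a constant
        # (covariance eps^2 * G_inv is the same in both directions).
        d = y - mean
        return -0.5 * d @ np.linalg.solve(G_inv, d) / eps**2

    log_alpha = (log_post(prop) - log_post(x)
                 + log_q(x, mean_bwd) - log_q(prop, mean_fwd))
    if np.log(rng.uniform()) < log_alpha:
        return prop, True
    return x, False
```

Choosing `G_inv` close to the target covariance makes the proposal adapt to the correlation structure, which is precisely the shortcoming of plain MALA that the Riemannian machinery addresses.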
The geometric foundations of Hamiltonian Monte Carlo
Although Hamiltonian Monte Carlo has proven an empirical success, the lack of a rigorous theoretical understanding of the algorithm has in many ways impeded both principled developments of the method and use of the algorithm in practice. In this paper we develop the formal foundations of the algorithm through the construction of measures on smooth manifolds, and demonstrate how the theory naturally identifies efficient implementations and motivates promising generalizations.
Hybrid Classical-Quantum Computing: Applications to Statistical Mechanics of Neocortical Interactions
Several commercial quantum computers are now available that offer hybrid classical-quantum computing. Application is made to a classical-quantum model of the human neocortex, Statistical Mechanics of Neocortical Interactions (SMNI), which has had its applications published in many papers since 1981. However, this project only uses classical supercomputers. Since 2015, PATHINT has been used as a numerical algorithm for folding path integrals. Applications in several systems across several disciplines have been generalized from one dimension to N dimensions and from classical to quantum systems (qPATHINT). Papers have applied qPATHINT to neocortical interactions and financial options. The classical space described by SMNI applies nonlinear nonequilibrium multivariate statistical mechanics to synaptic neuronal interactions, while the quantum space described by qPATHINT applies synaptic contributions from Ca2+ waves generated by astrocytes at tripartite neuron-astrocyte-neuron sites.
Modern Monte Carlo Methods and Their Application in Semiparametric Regression
Indiana University-Purdue University Indianapolis (IUPUI)
The essence of Bayesian data analysis is to ascertain posterior distributions. Posteriors
generally do not have closed-form expressions for direct computation in practical applications.
Analysts, therefore, resort to Markov Chain Monte Carlo (MCMC) methods for the generation
of sample observations that approximate the desired posterior distribution. Standard MCMC
methods simulate sample values from the desired posterior distribution via random proposals.
As a result, the mechanism used to generate the proposals inevitably determines the
efficiency of the algorithm. One of the modern MCMC techniques designed to explore
the high-dimensional space more efficiently is Hamiltonian Monte Carlo (HMC), based on
the Hamiltonian differential equations. Inspired by classical mechanics, these equations
incorporate a latent variable to generate MCMC proposals that are likely to be accepted.
This dissertation discusses how such a powerful computational approach can be used for
implementing statistical models. Along this line, I created a unified computational procedure
for using HMC to fit various types of statistical models. The procedure that I proposed can
be applied to a broad class of models, including linear models, generalized linear models,
mixed-effects models, and various types of semiparametric regression models. To facilitate
the fitting of a diverse set of models, I incorporated new parameterization and decomposition
schemes to ensure the numerical performance of Bayesian model fitting without sacrificing
the procedure’s general applicability. As a concrete application, I demonstrate how to use the
proposed procedure to fit a multivariate generalized additive model (GAM), a nonstandard
statistical model with a complex covariance structure and numerous parameters. Byproducts of the research include two software packages that allow practical data analysts to use the
proposed computational method to fit their own models. The research’s main methodological
contribution is the unified computational approach that it presents for Bayesian model
fitting that can be used for standard and nonstandard statistical models. Availability of
such a procedure has greatly enhanced statistical modelers’ toolbox for implementing new
and nonstandard statistical models.
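The proposal mechanism this abstract describes, augmenting the target with a latent momentum variable and integrating Hamilton's equations, can be sketched as a single generic HMC transition. This is a textbook leapfrog step, not the dissertation's unified fitting procedure; `log_post` and `grad_log_post` are placeholders for whatever model is being fit.

```python
import numpy as np

def hmc_step(q, log_post, grad_log_post, eps, n_leap, rng):
    """One HMC transition: draw a latent momentum, simulate Hamiltonian
    dynamics with the leapfrog integrator, then Metropolis-correct for
    the discretization error so the exact posterior is preserved."""
    p = rng.standard_normal(q.size)            # latent momentum ~ N(0, I)
    current_H = -log_post(q) + 0.5 * p @ p     # H = potential + kinetic
    q_new, p_new = q.copy(), p.copy()
    p_new += 0.5 * eps * grad_log_post(q_new)  # initial half step in momentum
    for _ in range(n_leap):
        q_new += eps * p_new                   # full step in position
        p_new += eps * grad_log_post(q_new)    # full step in momentum
    p_new -= 0.5 * eps * grad_log_post(q_new)  # roll back to a final half step
    proposed_H = -log_post(q_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < current_H - proposed_H:
        return q_new, True                     # distant proposal, often accepted
    return q, False
```

Because the leapfrog integrator nearly conserves the Hamiltonian, proposals can travel far across the parameter space while retaining high acceptance rates, which is what makes HMC attractive for the high-dimensional models listed above.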
Variational Hamiltonian Monte Carlo via Score Matching
Traditionally, the field of computational Bayesian statistics has been
divided into two main subfields: variational methods and Markov chain Monte
Carlo (MCMC). In recent years, however, several methods have been proposed
based on combining variational Bayesian inference and MCMC simulation in order
to improve their overall accuracy and computational efficiency. This marriage
of fast evaluation and flexible approximation provides a promising means of
designing scalable Bayesian inference methods. In this paper, we explore the
possibility of incorporating variational approximation into a state-of-the-art
MCMC method, Hamiltonian Monte Carlo (HMC), to reduce the required gradient
computation in the simulation of Hamiltonian flow, which is the bottleneck for
many applications of HMC in big data problems. To this end, we use a {\it
free-form} approximation induced by a fast and flexible surrogate function
based on single-hidden layer feedforward neural networks. The surrogate
provides sufficiently accurate approximation while allowing for fast
exploration of parameter space, resulting in an efficient approximate inference
algorithm. We demonstrate the advantages of our method on both synthetic and
real data problems.
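The key structural idea in this abstract, running the leapfrog integrator on a cheap surrogate gradient while keeping the exact log posterior in the accept/reject step, can be sketched as follows. This is illustrative only: a user-supplied `grad_surrogate` stands in for the paper's trained neural-network surrogate, and the step is otherwise plain HMC. The Metropolis correction evaluates the true target, so the chain still has the correct stationary distribution even when the surrogate is inexact.

```python
import numpy as np

def surrogate_hmc_step(q, log_post, grad_surrogate, eps, n_leap, rng):
    """HMC where the leapfrog dynamics use a cheap surrogate gradient.
    The surrogate leapfrog map is still volume-preserving and reversible,
    and the acceptance test uses the exact log posterior, so the
    stationary distribution is the true target."""
    p = rng.standard_normal(q.size)
    current_H = -log_post(q) + 0.5 * p @ p
    q_new, p_new = q.copy(), p.copy()
    p_new += 0.5 * eps * grad_surrogate(q_new)   # cheap gradient in dynamics
    for _ in range(n_leap):
        q_new += eps * p_new
        p_new += eps * grad_surrogate(q_new)
    p_new -= 0.5 * eps * grad_surrogate(q_new)
    proposed_H = -log_post(q_new) + 0.5 * p_new @ p_new
    if np.log(rng.uniform()) < current_H - proposed_H:  # exact target here
        return q_new, True
    return q, False
```

The better the surrogate tracks the true gradient, the smaller the Hamiltonian error and the higher the acceptance rate; the expensive gradient is never evaluated inside the integrator, only the log posterior at the two endpoints.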
- …