Bayesian log-Gaussian Cox process regression: applications to meta-analysis of neuroimaging working memory studies
Working memory (WM) was one of the first cognitive processes studied with
functional magnetic resonance imaging. With now over 20 years of studies on WM,
many with small sample sizes, there is a need for meta-analysis to
identify the brain regions that are consistently activated by WM tasks, and to
understand the interstudy variation in those activations. However, current
methods in the field cannot fully account for the spatial nature of
neuroimaging meta-analysis data or the heterogeneity observed among WM studies.
In this work, we propose a fully Bayesian random-effects meta-regression model
based on log-Gaussian Cox processes, which can be used for meta-analysis of
neuroimaging studies. An efficient Markov chain Monte Carlo scheme for
posterior simulations is presented which makes use of some recent advances in
parallel computing using graphics processing units. Application of the proposed
model to a real data set provides valuable insights into the function of
WM.
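To make the log-Gaussian Cox process idea concrete, here is a minimal sketch (not the authors' model): a latent Gaussian process is drawn on a grid, exponentiated into a positive intensity, and voxel counts are then drawn from a Poisson distribution with that intensity. The grid size, kernel length-scale, and variance below are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1D grid of voxel centres (hypothetical resolution)
n = 50
x = np.linspace(0.0, 1.0, n)

# Squared-exponential GP covariance with illustrative hyperparameters
ell, sigma2 = 0.1, 1.0
K = sigma2 * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
K += 1e-8 * np.eye(n)  # jitter for numerical stability

# Draw a latent Gaussian field and exponentiate -> LGCP intensity
g = rng.multivariate_normal(np.zeros(n), K)
lam = np.exp(g)

# Poisson counts per grid cell given the intensity (cell area = 1/n)
counts = rng.poisson(lam / n)
```

In a meta-analysis setting the counts would correspond to reported activation foci per voxel; here they are purely synthetic.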
Langevin and Hamiltonian based Sequential MCMC for Efficient Bayesian Filtering in High-dimensional Spaces
Nonlinear non-Gaussian state-space models arise in numerous applications in
statistics and signal processing. In this context, one of the most successful
and popular approximation techniques is the Sequential Monte Carlo (SMC)
algorithm, also known as particle filtering. Nevertheless, this method tends to
be inefficient when applied to high-dimensional problems. In this paper, we
focus on another class of sequential inference methods, namely the Sequential
Markov Chain Monte Carlo (SMCMC) techniques, which represent a promising
alternative to SMC methods. After providing a unifying framework for the class
of SMCMC approaches, we propose novel efficient strategies based on the
principle of Langevin diffusion and Hamiltonian dynamics in order to cope with
the increasing number of high-dimensional applications. Simulation results show
that the proposed algorithms achieve significantly better performance than
existing algorithms.
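As a concrete illustration of the Hamiltonian-dynamics ingredient, the sketch below implements a single generic HMC transition: leapfrog integration of the Hamiltonian followed by a Metropolis correction. This is a textbook building block, not the paper's SMCMC algorithm itself; the step size `eps`, path length `L`, and Gaussian target are illustrative.

```python
import numpy as np

def hmc_step(x, logp, grad, eps=0.1, L=20, rng=None):
    """One HMC transition targeting exp(logp) with unit mass matrix."""
    rng = rng or np.random.default_rng()
    p = rng.standard_normal(x.shape)          # resample momentum
    x_new, p_new = x.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics
    p_new += 0.5 * eps * grad(x_new)
    for _ in range(L - 1):
        x_new += eps * p_new
        p_new += eps * grad(x_new)
    x_new += eps * p_new
    p_new += 0.5 * eps * grad(x_new)
    # Metropolis accept/reject on the change in total energy
    h_old = -logp(x) + 0.5 * p @ p
    h_new = -logp(x_new) + 0.5 * p_new @ p_new
    return x_new if np.log(rng.uniform()) < h_old - h_new else x

# Illustrative target: standard Gaussian in 10 dimensions
logp = lambda z: -0.5 * z @ z
grad = lambda z: -z
rng = np.random.default_rng(1)
x = np.zeros(10)
samples = []
for _ in range(500):
    x = hmc_step(x, logp, grad, rng=rng)
    samples.append(x.copy())
samples = np.array(samples)
```

In the SMCMC setting such a kernel would be applied sequentially to the joint state of the filtering problem rather than to a static target.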
STANLEY: Stochastic Gradient Anisotropic Langevin Dynamics for Learning Energy-Based Models
In this paper we propose STANLEY, a STochastic gradient ANisotropic LangEvin
dYnamics method, for sampling high-dimensional data. With the growing efficacy
and potential of Energy-Based modeling, also known as non-normalized
probabilistic modeling, for generative modeling of diverse kinds of
high-dimensional data, we present an end-to-end learning algorithm for
Energy-Based models (EBMs) aimed at improving the quality of the resulting
sampled data points. While the unknown normalizing constant of EBMs
makes the training procedure intractable, resorting to Markov Chain Monte Carlo
(MCMC) is in general a viable option. Recognizing what MCMC entails for EBM
training, we propose in this paper a novel high-dimensional sampling method,
based on an anisotropic stepsize and a gradient-informed covariance matrix,
embedded into a discretized Langevin diffusion. We motivate the necessity for
an anisotropic update of the negative samples in the Markov Chain by the
nonlinearity of the backbone of the EBM, here a Convolutional Neural Network.
Our resulting method, namely STANLEY, is an optimization algorithm for training
Energy-Based models via our newly introduced MCMC method. We provide a
theoretical understanding of our sampling scheme by proving that the sampler
leads to a geometrically uniformly ergodic Markov Chain. Several image
generation experiments are provided in our paper to show the effectiveness of
our method.
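The anisotropic-stepsize idea can be illustrated with a diagonal, gradient-informed preconditioner inside a discretized Langevin update. The RMSProp-style running second moment below is a hypothetical stand-in for the paper's covariance construction, chosen only to show how per-coordinate scaling enters the step.

```python
import numpy as np

def anisotropic_langevin(x, grad_logp, v, eps=0.01, beta=0.9, rng=None):
    """One Langevin step with a diagonal, gradient-informed preconditioner.
    The running second moment `v` (RMSProp-style) is an illustrative choice,
    not the exact STANLEY construction."""
    rng = rng or np.random.default_rng()
    g = grad_logp(x)
    v = beta * v + (1 - beta) * g ** 2        # running 2nd moment of gradients
    c = 1.0 / np.sqrt(v + 1e-4)               # per-coordinate scale (floored)
    noise = rng.standard_normal(x.shape)
    x = x + 0.5 * eps * c * g + np.sqrt(eps * c) * noise
    return x, v

# Usage: sample from a standard Gaussian, grad log p(x) = -x
rng = np.random.default_rng(2)
x, v = 3.0 * np.ones(5), np.zeros(5)
for _ in range(200):
    x, v = anisotropic_langevin(x, lambda z: -z, v, rng=rng)
```

Coordinates with persistently large gradients get smaller effective steps, while flat directions are explored with larger ones, which is the motivation the abstract gives for anisotropic updates of the negative samples.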
Exact algorithms for simulation of diffusions with discontinuous drift and robust Curvature Metropolis-adjusted Langevin algorithms
In this work we propose new Exact Algorithms for the simulation of diffusions with discontinuous drift, together with new methodology for simulating Brownian motion jointly with its local time. In the second part of the thesis we introduce a Metropolis-adjusted Langevin algorithm that uses local geometry, and we prove geometric ergodicity in the case of benchmark distributions with light tails.
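For reference, the standard Metropolis-adjusted Langevin algorithm (MALA) that the thesis builds on combines a Langevin proposal with a Metropolis accept/reject step. This is a minimal generic sketch, not the thesis's curvature-adapted variant; the step size and Gaussian target are illustrative.

```python
import numpy as np

def mala_step(x, logp, grad_logp, eps, rng):
    """One MALA transition: Langevin proposal + Metropolis correction."""
    g = grad_logp(x)
    prop = x + 0.5 * eps * g + np.sqrt(eps) * rng.standard_normal(x.shape)
    gp = grad_logp(prop)
    # Log densities of the forward and backward Gaussian proposals
    fwd = -np.sum((prop - x - 0.5 * eps * g) ** 2) / (2 * eps)
    bwd = -np.sum((x - prop - 0.5 * eps * gp) ** 2) / (2 * eps)
    log_alpha = logp(prop) - logp(x) + bwd - fwd
    return prop if np.log(rng.uniform()) < log_alpha else x

# Usage: sample a standard Gaussian in 2 dimensions
rng = np.random.default_rng(3)
x = np.zeros(2)
chain = []
for _ in range(2000):
    x = mala_step(x, lambda z: -0.5 * z @ z, lambda z: -z, 0.5, rng)
    chain.append(x)
chain = np.array(chain)
```

A curvature-aware variant would replace the fixed step size `eps` with a position-dependent preconditioner derived from local geometry, which is the direction the thesis pursues.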