
Bayesian computation in astronomy: novel methods for parallel and gradient-free inference

The goal of this thesis is twofold: to introduce the fundamentals of Bayesian inference and computation, focusing on astronomical and cosmological applications, and to present recent advances in probabilistic computational methods developed by the author that aim to facilitate Bayesian data analysis for the next generation of astronomical observations and theoretical models.

The first part of this thesis familiarises the reader with the notion of probability and its relevance to science through the prism of Bayesian reasoning, introducing the key constituents of the theory and discussing its best practices. The second part offers a pedagogical introduction to the principles of Bayesian computation, motivated by the geometric characteristics of probability distributions and followed by a detailed exposition of various methods, including Markov chain Monte Carlo (MCMC), Sequential Monte Carlo (SMC), and Nested Sampling (NS). Finally, the third part presents two novel computational methods and their respective software implementations.

The first such development is Ensemble Slice Sampling (ESS), a new class of MCMC algorithms that extends the applicability of the standard Slice Sampler by adaptively tuning its only hyperparameter and utilising an ensemble of parallel walkers to handle strong correlations between parameters efficiently. The parallel, black-box, and gradient-free nature of the method renders it ideal for use with the computationally expensive and non-differentiable models often encountered in astronomy. ESS is implemented in Python in the well-tested, open-source software package zeus, which is specifically designed to tackle the computational challenges posed by modern astronomical and cosmological analyses. In particular, using the code requires minimal, if any, hand-tuning of hyperparameters, its performance is insensitive to linear correlations, and it can scale up to thousands of CPUs without any extra effort.
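To make the starting point of ESS concrete, the standard univariate slice sampler that it generalises can be sketched in plain NumPy. This is an illustrative sketch of Neal's stepping-out and shrinkage procedure under simplified assumptions, not the zeus implementation; the function name and interface are invented for this example, and `w` plays the role of the single width hyperparameter that ESS tunes adaptively.

```python
import numpy as np

def slice_sample_1d(log_prob, x0, w=1.0, max_steps=50, rng=None):
    """One univariate slice-sampling update (stepping-out + shrinkage).

    Illustrative sketch only; `w` is the interval width, the sole
    hyperparameter of the standard slice sampler.
    """
    rng = rng or np.random.default_rng()
    # Draw the slice height: log u with u ~ Uniform(0, f(x0)) is
    # log f(x0) minus a standard exponential variate.
    y = log_prob(x0) - rng.exponential()
    # Stepping-out: place an interval of width w around x0 and expand
    # each end until it falls outside the slice {x : log f(x) > y}.
    L = x0 - w * rng.uniform()
    R = L + w
    for _ in range(max_steps):
        if log_prob(L) <= y:
            break
        L -= w
    for _ in range(max_steps):
        if log_prob(R) <= y:
            break
        R += w
    # Shrinkage: sample uniformly on [L, R]; on rejection, shrink the
    # interval towards x0 (which always remains inside the slice).
    while True:
        x1 = rng.uniform(L, R)
        if log_prob(x1) > y:
            return x1
        if x1 < x0:
            L = x1
        else:
            R = x1
```

ESS replaces the fixed width `w` with an adaptively tuned value and uses the positions of an ensemble of parallel walkers to construct update directions, which is what makes the method robust to strong linear correlations between parameters.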
The second contribution is Preconditioned Monte Carlo (PMC), a novel Monte Carlo method for Bayesian inference that facilitates effective sampling of probability distributions with non-trivial geometry. PMC utilises a Normalising Flow (NF) to decorrelate the parameters of the distribution and then samples from the preconditioned target distribution using an adaptive SMC scheme. Through its Python implementation pocoMC, PMC achieves excellent sampling performance, including accurate estimation of the model evidence, for highly correlated, non-Gaussian, and multimodal target distributions. Finally, the code is directly parallelisable, exhibiting linear scaling up to thousands of CPUs.
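The preconditioning idea at the heart of PMC can be illustrated with a linear whitening map standing in for the normalising flow. This is a toy sketch under simplified assumptions, not the pocoMC implementation: a real NF learns a nonlinear invertible map, whereas here a Cholesky factor fitted to pilot samples suffices to show how pulling the target back through the map yields decorrelated coordinates. All names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# A toy, strongly correlated 2-D Gaussian target (correlation 0.95).
cov = np.array([[1.0, 0.95], [0.95, 1.0]])
cov_inv = np.linalg.inv(cov)

def log_target(x):
    # Unnormalised log-density of the correlated target.
    return -0.5 * x @ cov_inv @ x

# "Preconditioning" with a linear map (a stand-in for the normalising
# flow): fit u = L^{-1} x from pilot samples so that u is decorrelated.
pilot = rng.multivariate_normal(np.zeros(2), cov, size=2000)
L_chol = np.linalg.cholesky(np.cov(pilot.T))

def log_target_preconditioned(u):
    # Pull the target back through x = L u; the Jacobian of a linear
    # map is constant, so it can be dropped for sampling purposes.
    return log_target(L_chol @ u)

# In the preconditioned coordinates the parameters are decorrelated,
# so a simple SMC or MCMC kernel mixes far more easily there.
u_samples = np.linalg.solve(L_chol, pilot.T).T
corr = np.corrcoef(u_samples.T)[0, 1]
```

In PMC itself the linear map is replaced by a learned normalising flow, which can also undo non-Gaussian and multimodal structure, and the sampling in the preconditioned space proceeds via an adaptive SMC scheme rather than the fixed kernel suggested here.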