
    Diffusive Nested Sampling

    We introduce a general Monte Carlo method based on Nested Sampling (NS) for sampling complex probability distributions and estimating the normalising constant. The method uses one or more particles, which explore a mixture of nested probability distributions, each successive distribution occupying ~e^-1 times the enclosed prior mass of the previous one. While NS technically requires independent generation of particles, Markov Chain Monte Carlo (MCMC) exploration fits naturally into this technique. We illustrate the new method on a test problem and find that it can achieve four times the accuracy of classic MCMC-based Nested Sampling for the same computational effort, equivalent to a factor of 16 speedup. An additional benefit is that more samples and a more accurate evidence value can be obtained simply by continuing the run for longer, as in standard MCMC. Comment: Accepted for publication in Statistics and Computing. C++ code available at http://lindor.physics.ucsb.edu/DNes
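    The accuracy comparison above is against classic MCMC-based Nested Sampling. A minimal single-particle-replacement sketch of that baseline on a toy problem (the problem, names, and tuning choices here are illustrative assumptions, not the paper's code) makes the shrinking prior-mass bookkeeping concrete:

    ```python
    import math, random

    def log_likelihood(x):
        # Toy problem: uniform prior on [0, 1], Gaussian likelihood (sigma = 0.1),
        # so the true log-evidence is analytically about log(0.1*sqrt(2*pi)) ~= -1.38
        return -0.5 * math.log(2 * math.pi * 0.1**2) - 0.5 * ((x - 0.5) / 0.1) ** 2

    def nested_sampling(num_live=100, num_iters=1000, mcmc_steps=20, seed=0):
        rng = random.Random(seed)
        live = [rng.random() for _ in range(num_live)]
        logl = [log_likelihood(x) for x in live]
        log_z, log_x = -math.inf, 0.0   # running evidence; log enclosed prior mass
        for i in range(num_iters):
            worst = min(range(num_live), key=lambda j: logl[j])
            log_x_new = -(i + 1) / num_live   # mass shrinks ~e^(-1/num_live) per step
            log_w = logl[worst] + math.log(math.exp(log_x) - math.exp(log_x_new))
            a, b = max(log_z, log_w), min(log_z, log_w)
            log_z = a + math.log1p(math.exp(b - a))   # logsumexp(log_z, log_w)
            log_x = log_x_new
            # replace the worst particle: copy another live point, then take a short
            # MCMC walk constrained to L > L_worst (step scaled to live-point spread)
            threshold, step = logl[worst], max(live) - min(live) + 1e-9
            idx = rng.randrange(num_live)
            while idx == worst:
                idx = rng.randrange(num_live)
            x = live[idx]
            for _ in range(mcmc_steps):
                prop = x + rng.gauss(0.0, step)
                if 0.0 <= prop <= 1.0 and log_likelihood(prop) > threshold:
                    x = prop
            live[worst], logl[worst] = x, log_likelihood(x)
        return log_z

    print(nested_sampling())   # the true log-evidence is roughly -1.38
    ```

    Diffusive NS replaces the hard one-way constraint with a weighted mixture of the nested levels, which is what lets a run be continued indefinitely.
    
    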

    Modelling of the Complex CASSOWARY/SLUGS Gravitational Lenses

    We present the first high-resolution images of CSWA 31, a gravitational lens system observed as part of the SLUGS (Sloan Lenses Unravelled by Gemini Studies) program. These systems exhibit complex image structure with the potential to strongly constrain the mass distribution of the massive lens galaxies, as well as the complex morphology of the sources. In this paper, we describe the strategy used to reconstruct the unlensed source profile and the lens galaxy mass profiles. We introduce a prior distribution over multi-wavelength sources that is realistic as a representation of our knowledge about the surface brightness profiles of galaxies and groups of galaxies. To carry out the inference computationally, we use Diffusive Nested Sampling, an efficient variant of Nested Sampling that uses Markov Chain Monte Carlo (MCMC) to sample the complex posterior distributions and compute the normalising constant. We demonstrate the efficacy of this approach with the reconstruction of the group-group gravitational lens system CSWA 31, finding the source to be composed of five merging spiral galaxies magnified by a factor of 13. Comment: Accepted for publication in MNRA

    Inference for Trans-dimensional Bayesian Models with Diffusive Nested Sampling

    Many inference problems involve inferring the number N of components in some region, along with their properties {x_i}, i = 1, ..., N, from a dataset D. A common statistical example is finite mixture modelling. In the Bayesian framework, these problems are typically solved using one of the following two methods: i) by executing a Monte Carlo algorithm (such as Nested Sampling) once for each possible value of N, and calculating the marginal likelihood or evidence as a function of N; or ii) by doing a single run that allows the model dimension N to change (such as Markov Chain Monte Carlo with birth/death moves), and obtaining the posterior for N directly. In this paper we present a general approach to this problem that uses trans-dimensional MCMC embedded within a Nested Sampling algorithm, allowing us to explore the posterior distribution and calculate the marginal likelihood (summed over N) even if the problem contains a phase transition or other difficult features such as multimodality. We present two example problems: finding sinusoidal signals in noisy data, and finding and measuring galaxies in a noisy astronomical image. Both examples demonstrate phase transitions in the relationship between the likelihood and the cumulative prior mass, highlighting the need for Nested Sampling. Comment: Only published here for the time being. 17 pages, 10 figures. Software available at https://github.com/eggplantbren/RJObjec
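    Approach (ii), a single run with birth/death moves, can be sketched in a few lines. The toy problem below (inferring how many Exponential(1)-flux components sum to one noisy measurement) and all names are invented for illustration, not the paper's examples; because birth draws the new component from its prior, the prior factor cancels and the acceptance ratio reduces to a likelihood ratio:

    ```python
    import math, random

    D, SIGMA, N_MAX = 3.0, 0.5, 10   # observed total flux, noise sd, prior cap on N

    def log_likelihood(fluxes):
        # one measurement: D ~ Normal(sum of component fluxes, SIGMA);
        # priors: N uniform on {0, ..., N_MAX}, each flux ~ Exponential(1)
        return -0.5 * ((D - sum(fluxes)) / SIGMA) ** 2

    def rjmcmc(num_steps=50000, seed=1):
        rng = random.Random(seed)
        fluxes, logl = [], log_likelihood([])
        n_samples = []
        for _ in range(num_steps):
            move = rng.randrange(3)
            if move == 0 and len(fluxes) < N_MAX:
                # birth: draw the new flux from its prior, so the prior and
                # proposal densities cancel in the acceptance ratio
                prop = fluxes + [rng.expovariate(1.0)]
            elif move == 1 and fluxes:
                # death: drop the last component (the likelihood is exchangeable,
                # so append-on-birth / remove-last-on-death stays reversible)
                prop = fluxes[:-1]
            elif move == 2 and fluxes:
                # within-model move: symmetric random walk on one flux
                i = rng.randrange(len(fluxes))
                f_new = fluxes[i] + rng.gauss(0.0, 0.5)
                if f_new <= 0.0:              # zero prior density: reject, stay put
                    n_samples.append(len(fluxes))
                    continue
                prop = fluxes[:i] + [f_new] + fluxes[i + 1:]
            else:
                n_samples.append(len(fluxes))  # invalid move at a boundary: stay
                continue
            logl_new = log_likelihood(prop)
            log_alpha = logl_new - logl
            if move == 2:
                log_alpha += fluxes[i] - prop[i]   # Exponential(1) prior ratio
            if math.log(rng.random()) < log_alpha:
                fluxes, logl = prop, logl_new
            n_samples.append(len(fluxes))
        return n_samples

    samples = rjmcmc()
    print(sum(samples) / len(samples))   # posterior mean of N
    ```

    With these numbers the posterior mean of N comes out near 4; the paper's point is that embedding such moves inside Nested Sampling additionally yields the marginal likelihood and survives phase transitions that defeat plain MCMC.
    
    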

    DNest4: Diffusive Nested Sampling in C++ and Python

    In probabilistic (Bayesian) inference, we typically want to compute properties of the posterior distribution, which describes knowledge of unknown quantities in the context of a particular dataset and the assumed prior information. The marginal likelihood, also known as the "evidence", is a key quantity in Bayesian model selection. The diffusive nested sampling algorithm, a variant of nested sampling, is a powerful tool for generating posterior samples and estimating marginal likelihoods. It is effective at solving complex problems, including many where the posterior distribution is multimodal or has strong dependencies between variables. DNest4 is an open source (MIT licensed), multi-threaded implementation of this algorithm in C++11, along with associated utilities including: (i) 'RJObject', a class template for finite mixture models; and (ii) a Python package allowing basic use without C++ coding. In this paper we demonstrate DNest4 usage through examples including simple Bayesian data analysis, finite mixture models, and approximate Bayesian computation.
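    In DNest4's Python examples, a problem is specified as a class exposing from_prior, perturb (which mutates the parameters in place and returns the log Metropolis-Hastings correction), and log_likelihood. The sketch below is hand-rolled to show that shape only; the problem, data, and all details are invented and it does not use the library itself:

    ```python
    import math, random

    DATA = [4.8, 5.1, 5.3, 4.9, 5.2]   # invented measurements of an unknown mean mu
    SIGMA = 0.5                        # noise sd, assumed known

    class GaussianMeanModel:
        """Sketch of the three-method model interface in the style of
        DNest4's Python examples; everything here is invented."""

        def from_prior(self):
            # draw parameters from the prior: mu ~ Uniform(-10, 10)
            return [random.uniform(-10.0, 10.0)]

        def perturb(self, params):
            # in-place proposal; returns the log Metropolis-Hastings correction
            # (zero here: a symmetric walk, wrapped back into the prior range)
            params[0] += random.gauss(0.0, 1.0)
            params[0] = -10.0 + (params[0] + 10.0) % 20.0
            return 0.0

        def log_likelihood(self, params):
            mu = params[0]
            return sum(-0.5 * math.log(2.0 * math.pi * SIGMA**2)
                       - 0.5 * ((y - mu) / SIGMA) ** 2 for y in DATA)

    model = GaussianMeanModel()
    params = model.from_prior()
    print(model.log_likelihood(params))
    ```

    In the library itself, an object like this is handed to the sampler, which manages the nested levels and output files; consult the DNest4 repository for the actual API.
    
    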

    Computing Entropies With Nested Sampling

    The Shannon entropy, and related quantities such as mutual information, can be used to quantify uncertainty and relevance. However, in practice, it can be difficult to compute these quantities for arbitrary probability distributions, particularly if the probability mass functions or densities cannot be evaluated. This paper introduces a computational approach, based on Nested Sampling, to evaluate entropies of probability distributions that can only be sampled. I demonstrate the method on three examples: (i) a simple Gaussian example where the key quantities are available analytically; (ii) an experimental design example about scheduling observations in order to measure the period of an oscillating signal; and (iii) predicting the future from the past in a heavy-tailed scenario. Comment: Accepted for publication in Entropy. 21 pages, 3 figures. Software available at https://github.com/eggplantbren/InfoNes
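    In the easy case this paper contrasts with, where the density can be both sampled and evaluated, the Shannon entropy is simply H[p] = -E[log p(X)] and plain Monte Carlo suffices; the nested-sampling machinery is needed precisely when p can only be sampled. A minimal sketch of the easy case (all names invented), using a Gaussian whose entropy is analytically 0.5*log(2*pi*e*sigma^2):

    ```python
    import math, random

    def gaussian_entropy_mc(sigma=1.0, n=200000, seed=0):
        # H[p] = -E[log p(X)], estimated by averaging -log p(x) over draws x ~ p
        rng = random.Random(seed)
        log_norm = 0.5 * math.log(2 * math.pi * sigma**2)
        total = 0.0
        for _ in range(n):
            x = rng.gauss(0.0, sigma)
            total += log_norm + 0.5 * (x / sigma) ** 2   # = -log p(x)
        return total / n

    print(gaussian_entropy_mc())   # analytic value: 0.5*log(2*pi*e) ~= 1.4189
    ```

    When log p(x) cannot be evaluated, this estimator is unavailable, which is the gap the paper's nested-sampling approach fills.
    
    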