
    Multimodal nested sampling: an efficient and robust alternative to MCMC methods for astronomical data analysis

    In performing a Bayesian analysis of astronomical data, two difficult problems often emerge. First, in estimating the parameters of some model for the data, the resulting posterior distribution may be multimodal or exhibit pronounced (curving) degeneracies, which can cause problems for traditional MCMC sampling methods. Second, in selecting between a set of competing models, calculation of the Bayesian evidence for each model is computationally expensive. The nested sampling method introduced by Skilling (2004) has greatly reduced the computational expense of calculating evidences and also produces posterior inferences as a by-product. This method has been applied successfully in cosmological applications by Mukherjee et al. (2006), but their implementation was efficient only for unimodal distributions without pronounced degeneracies. Shaw et al. (2007) recently introduced a clustered nested sampling method which is significantly more efficient in sampling from multimodal posteriors and also determines the expectation and variance of the final evidence from a single run of the algorithm, hence providing a further increase in efficiency. In this paper, we build on the work of Shaw et al. and present three new methods for sampling and evidence evaluation from distributions that may contain multiple modes and significant degeneracies; we also present an even more efficient technique for estimating the uncertainty on the evaluated evidence. These methods lead to a further substantial improvement in sampling efficiency and robustness, and are applied to toy problems to demonstrate the accuracy and economy of the evidence calculation and parameter estimation. Finally, we discuss the use of these methods in performing Bayesian object detection in astronomical datasets.
    Comment: 14 pages, 11 figures, submitted to MNRAS; some major additions to the previous version in response to the referee's comments
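    To make the evidence bookkeeping behind nested sampling concrete, the minimal Python sketch below runs the basic loop described by Skilling (2004) on a toy two-dimensional Gaussian likelihood with a uniform unit-cube prior. It is an illustration under stated assumptions, not the authors' code: the toy likelihood, the number of live points, and the naive rejection step used to replace the worst point are all choices made for brevity.

    import numpy as np

    rng = np.random.default_rng(0)

    def log_likelihood(theta):
        # Toy unimodal Gaussian likelihood centred on 0.5 in each dimension.
        return -0.5 * np.sum(((theta - 0.5) / 0.1) ** 2)

    def nested_sampling(n_live=400, n_iter=2400, ndim=2):
        live = rng.uniform(size=(n_live, ndim))       # live points drawn from the unit-cube prior
        live_logl = np.array([log_likelihood(t) for t in live])
        log_z, log_x = -np.inf, 0.0                   # running log-evidence and enclosed log prior volume
        for i in range(n_iter):
            worst = np.argmin(live_logl)              # lowest-likelihood live point
            log_x_new = -(i + 1) / n_live             # expected log prior-volume shrinkage
            log_w = np.log(np.exp(log_x) - np.exp(log_x_new))
            log_z = np.logaddexp(log_z, live_logl[worst] + log_w)
            log_x = log_x_new
            # Replace the worst point with a prior draw above its likelihood.
            # Plain rejection sampling is used here; this is exactly the step the
            # clustered and ellipsoidal schemes in the abstract make efficient.
            while True:
                candidate = rng.uniform(size=ndim)
                cand_logl = log_likelihood(candidate)
                if cand_logl > live_logl[worst]:
                    live[worst], live_logl[worst] = candidate, cand_logl
                    break
        # Contribution of the remaining live points at the final prior volume.
        log_z = np.logaddexp(
            log_z,
            log_x + live_logl.max() + np.log(np.mean(np.exp(live_logl - live_logl.max()))),
        )
        return log_z

    print("log-evidence estimate:", nested_sampling())   # analytic value is ln(2*pi*0.1**2) ~ -2.77

    The rejection step is where the acceptance rate collapses for multimodal or strongly degenerate posteriors, which is the inefficiency the methods summarised in the abstract are designed to remove.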

    Use of the MultiNest algorithm for gravitational wave data analysis

    We describe an application of the MultiNest algorithm to gravitational wave data analysis. MultiNest is a multimodal nested sampling algorithm designed to efficiently evaluate the Bayesian evidence and return posterior probability densities for likelihood surfaces containing multiple secondary modes. The algorithm employs a set of live points which are updated by partitioning the set into multiple overlapping ellipsoids and sampling uniformly from within them. This set of live points climbs up the likelihood surface through nested iso-likelihood contours, and the evidence and posterior distributions can be recovered from the point-set evolution. The algorithm is model-independent in the sense that the specific problem being tackled enters only through the likelihood computation, and does not change how the live point set is updated. In this paper, we consider the use of the algorithm for gravitational wave data analysis by searching a simulated LISA data set containing two non-spinning supermassive black hole binary signals. The algorithm is able to rapidly identify all the modes of the solution and recover the true parameters of the sources to high precision.
    Comment: 18 pages, 4 figures, submitted to Class. Quantum Grav.; v2 includes various changes in light of the referee's comments
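    The live-point update described in the abstract, bounding the live set with ellipsoids and sampling uniformly inside them, can be sketched for the single-ellipsoid case as below. This is a hedged Python illustration only: the enlargement factor, the helper names, and the restriction to one ellipsoid are assumptions, not the published MultiNest implementation.

    import numpy as np

    rng = np.random.default_rng(1)

    def bounding_ellipsoid(live_points, enlarge=1.2):
        """Fit an ellipsoid (centre, scaled covariance) enclosing the live points."""
        centre = live_points.mean(axis=0)
        cov = np.cov(live_points, rowvar=False)
        diffs = live_points - centre
        # The largest Mahalanobis distance fixes the ellipsoid scale; enlarge it for safety.
        d2 = np.einsum('ij,jk,ik->i', diffs, np.linalg.inv(cov), diffs)
        return centre, cov * d2.max() * enlarge

    def sample_ellipsoid(centre, scaled_cov):
        """Draw a point uniformly from the ellipsoid (x-c)^T A^{-1} (x-c) <= 1."""
        ndim = len(centre)
        chol = np.linalg.cholesky(scaled_cov)
        direction = rng.normal(size=ndim)
        direction /= np.linalg.norm(direction)        # uniform direction on the sphere
        radius = rng.uniform() ** (1.0 / ndim)        # uniform radius within the unit ball
        return centre + chol @ (radius * direction)

    def replace_worst(live, logl, log_likelihood):
        """One ellipsoidal update: resample the worst live point inside the bound."""
        worst = np.argmin(logl)
        centre, scaled_cov = bounding_ellipsoid(np.delete(live, worst, axis=0))
        while True:
            candidate = sample_ellipsoid(centre, scaled_cov)
            cand_logl = log_likelihood(candidate)
            if cand_logl > logl[worst]:
                live[worst], logl[worst] = candidate, cand_logl
                return

    Sampling from a tight bound around the live points, rather than from the whole prior, is what keeps the acceptance rate high as the iso-likelihood contours shrink; MultiNest extends this idea to multiple overlapping ellipsoids so that separated modes are each bounded efficiently.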

    Bayesian methods of astronomical source extraction

    We present two new source extraction methods, based on Bayesian model selection and using the Bayesian Information Criterion (BIC). The first is a source detection filter, able to simultaneously detect point sources and estimate the image background. The second is an advanced photometry technique, which measures the flux, position (to sub-pixel accuracy), local background and point spread function. We apply the source detection filter to simulated Herschel-SPIRE data and show the filter's ability to both detect point sources and simultaneously estimate the image background. We use the photometry method to analyse a simple simulated image containing a source of unknown flux, position and point spread function; we not only accurately measure these parameters, but also determine their uncertainties (using Markov chain Monte Carlo sampling). The method also characterises the nature of the source (distinguishing between a point source and an extended source). We demonstrate the effect of including additional prior knowledge. Prior knowledge of the point spread function increases the precision of the flux measurement, while prior knowledge of the background has only a small impact. In the presence of higher noise levels, we show that prior positional knowledge (such as might arise from a strong detection in another waveband) allows us to accurately measure the source flux even when the source is too faint to be detected directly. These methods are incorporated in SUSSEXtractor, the source extraction pipeline for the forthcoming Akari FIS far-infrared all-sky survey. They are also implemented in a stand-alone, beta-version public tool that can be obtained at http://astronomy.sussex.ac.uk/~rss23/sourceMiner_v0.1.2.0.tar.gz
    Comment: Accepted for publication by ApJ (this version compiled using emulateapj.cls)
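    As a rough illustration of the kind of BIC-based model comparison described above, the Python sketch below compares a flat-background model with a background-plus-point-source model on a small image patch and keeps whichever has the lower BIC. The Gaussian PSF, known noise level, and fixed source position are simplifying assumptions made for illustration; this is not the SUSSEXtractor pipeline.

    import numpy as np

    def bic(residual_ss, n_data, n_params, sigma):
        # BIC = k ln n - 2 ln L_max, with a Gaussian likelihood of known noise sigma
        # (constants common to both models are dropped, since they cancel in the comparison).
        log_l_max = -0.5 * residual_ss / sigma**2
        return n_params * np.log(n_data) - 2.0 * log_l_max

    def detect_source(patch, psf, sigma):
        """Return True if 'background + source' is preferred over 'background only'."""
        n = patch.size
        # Model 0: constant background (1 parameter, fitted by the mean).
        bg = patch.mean()
        bic0 = bic(np.sum((patch - bg) ** 2), n, 1, sigma)
        # Model 1: background plus a scaled PSF at the patch centre (2 parameters, linear fit).
        design = np.column_stack([np.ones(n), psf.ravel()])
        coeffs, *_ = np.linalg.lstsq(design, patch.ravel(), rcond=None)
        resid = patch.ravel() - design @ coeffs
        bic1 = bic(np.sum(resid ** 2), n, 2, sigma)
        return bic1 < bic0

    # Toy usage: a 9x9 patch with a faint Gaussian source on a constant background.
    y, x = np.mgrid[-4:5, -4:5]
    psf = np.exp(-(x**2 + y**2) / (2 * 1.5**2))
    rng = np.random.default_rng(2)
    patch = 10.0 + 3.0 * psf + rng.normal(0.0, 1.0, size=psf.shape)
    print("source detected:", detect_source(patch, psf, sigma=1.0))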

    Addressing the shortcomings of three recent Bayesian methods for detecting interspecific recombination in DNA sequence alignments

    We address a potential shortcoming of three probabilistic models for detecting interspecific recombination in DNA sequence alignments: the multiple change-point model (MCP) of Suchard et al. (2003), the dual multiple change-point model (DMCP) of Minin et al. (2005), and the phylogenetic factorial hidden Markov model (PFHMM) of Husmeier (2005). These models are based on the Bayesian paradigm, which requires the solution of an integral over the space of branch lengths. To render this integration analytically tractable, all three models make the same assumption that the vectors of branch lengths of the phylogenetic tree are independent among sites. While this approximation reduces the computational complexity considerably, we show that it leads to the systematic prediction of spurious topology changes in the Felsenstein zone, that is, the area in the branch-length configuration space where maximum parsimony consistently infers the wrong topology due to long-branch attraction. We apply two Bayesian hypothesis tests, based on an inter- and an intra-model approach to estimating the marginal likelihood. We then propose a revised model that addresses these shortcomings, and compare it with the aforementioned models on a set of synthetic DNA sequence alignments systematically generated around the Felsenstein zone.
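    For readers unfamiliar with the marginal-likelihood comparison underlying such Bayesian hypothesis tests, the toy Python sketch below estimates each model's marginal likelihood by averaging the likelihood over prior draws and forms the log Bayes factor. The Gaussian toy data and priors are assumptions made purely for illustration and bear no relation to the phylogenetic models discussed in the abstract.

    import numpy as np

    rng = np.random.default_rng(3)

    def log_marginal_likelihood(data, prior_draws, log_likelihood):
        # Simple Monte Carlo estimate of log E_prior[ L(data | theta) ].
        logl = np.array([log_likelihood(data, theta) for theta in prior_draws])
        return np.logaddexp.reduce(logl) - np.log(len(logl))

    # Toy data: i.i.d. Gaussian observations with unit variance.
    data = rng.normal(0.3, 1.0, size=50)

    def logl_fixed_mean(data, theta):
        # Model 0: mean fixed at zero; theta is unused (a parameter-free model).
        return -0.5 * np.sum(data ** 2)

    def logl_free_mean(data, mu):
        # Model 1: unknown mean with a N(0, 1) prior.
        return -0.5 * np.sum((data - mu) ** 2)

    prior0 = np.zeros(2000)                  # dummy draws for the parameter-free model
    prior1 = rng.normal(0.0, 1.0, size=2000)

    log_z0 = log_marginal_likelihood(data, prior0, logl_fixed_mean)
    log_z1 = log_marginal_likelihood(data, prior1, logl_free_mean)
    print("log Bayes factor (model 1 vs model 0):", log_z1 - log_z0)

    A positive log Bayes factor favours the model with the free mean; in the papers above the same ratio is formed between competing tree topologies or recombination models, with the hard part being the estimation of each marginal likelihood over branch lengths.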