The impact of priors and observables on parameter inferences in the Constrained MSSM
We use a newly released version of the SuperBayeS code to analyze the impact
of the choice of priors and the influence of various constraints on the
statistical conclusions for the preferred values of the parameters of the
Constrained MSSM. We assess the effect in a Bayesian framework and compare it
with an alternative statistical measure, the profile likelihood. We employ
a new scanning algorithm (MultiNest) which increases the computational
efficiency by a factor ~200 with respect to previously used techniques. We
demonstrate that the currently available data are not yet sufficiently
constraining to allow one to determine the preferred values of CMSSM parameters
in a way that is completely independent of the choice of priors and statistical
measures. While b->s gamma generally favors large m_0, this is in some contrast
with the preference for low values of m_0 and m_1/2 that is almost entirely a
consequence of a combination of prior effects and a single constraint coming
from the anomalous magnetic moment of the muon, which remains somewhat
controversial. Using an information-theoretical measure, we find that the
cosmological dark matter abundance determination provides at least 80% of the
total constraining power of all available observables. Despite the remaining
uncertainties, prospects for direct detection in the CMSSM remain excellent,
with the spin-independent neutralino-proton cross section almost guaranteed
above sigma_SI ~ 10^{-10} pb, independently of the choice of priors or
statistics. Likewise, gluino and lightest Higgs discovery at the LHC remain
highly encouraging. While in this work we have used the CMSSM as particle
physics model, our formalism and scanning technique can be readily applied to a
wider class of models with several free parameters.

Comment: Minor changes, extended discussion of profile likelihood. Matches
JHEP accepted version. SuperBayeS code with MultiNest algorithm available at
http://www.superbayes.or
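The abstract's central point, that the Bayesian posterior and the profile likelihood can prefer different parameter regions when the data are not strongly constraining, can be illustrated with a small toy model. The sketch below is not the CMSSM likelihood; it uses a hypothetical two-parameter likelihood (a narrow high peak plus a broad low ridge, all numbers invented for illustration) to show how marginalizing over a parameter rewards volume while profiling rewards the best-fit point.

```python
import numpy as np

# Toy 2-parameter likelihood on a grid over hypothetical parameters
# (labelled m0, m12 only by analogy with the abstract; the shape and
# numbers here are invented for illustration).
m0 = np.linspace(0.0, 4.0, 401)
m12 = np.linspace(0.0, 4.0, 401)
M0, M12 = np.meshgrid(m0, m12, indexing="ij")

# A narrow, tall peak at (3, 3) plus a broad, lower ridge near m0 = 0.5:
# the broad region dominates the integral (posterior mass), while the
# narrow peak dominates the maximum (profile likelihood).
like = (np.exp(-((M0 - 3.0) ** 2 + (M12 - 3.0) ** 2) / (2 * 0.05 ** 2))
        + 0.5 * np.exp(-((M0 - 0.5) ** 2) / (2 * 1.0 ** 2)))

posterior = like * 1.0  # flat prior on the grid

# Marginal posterior of m0: integrate (sum) out m12.
marginal = posterior.sum(axis=1)
# Profile likelihood of m0: maximize over m12.
profile = like.max(axis=1)

print("marginal posterior peaks at m0 =", m0[marginal.argmax()])
print("profile likelihood peaks at m0 =", m0[profile.argmax()])
```

With flat priors, the two summaries disagree: the marginal posterior peaks on the broad ridge at low m0, the profile likelihood at the narrow peak. Changing the prior reweights the marginal but leaves the profile untouched, which is the prior-dependence the abstract quantifies.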
Unbiased and Consistent Nested Sampling via Sequential Monte Carlo
We introduce a new class of sequential Monte Carlo methods called Nested
Sampling via Sequential Monte Carlo (NS-SMC), which reframes the Nested
Sampling method of Skilling (2006) in terms of sequential Monte Carlo
techniques. This new framework allows convergence results to be obtained in the
setting when Markov chain Monte Carlo (MCMC) is used to produce new samples. An
additional benefit is that marginal likelihood estimates are unbiased. In
contrast to NS, the analysis of NS-SMC does not require the (unrealistic)
assumption that the simulated samples be independent. As the original NS
algorithm is a special case of NS-SMC, this provides insights as to why NS
seems to produce accurate estimates despite a typical violation of its
assumptions. For applications of NS-SMC, we give advice on tuning MCMC kernels
in an automated manner via a preliminary pilot run, and present a new method
for appropriately choosing the number of MCMC repeats at each iteration.
Finally, a numerical study is conducted where the performance of NS-SMC and
temperature-annealed SMC is compared on several challenging and realistic
problems. MATLAB code for our experiments is made available at
https://github.com/LeahPrice/SMC-NS.

Comment: 45 pages, some minor typographical errors fixed since last version
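For orientation, here is a minimal sketch of the original Nested Sampling scheme of Skilling that NS-SMC reframes, on a toy one-dimensional problem with a uniform prior on [0, 1] and a Gaussian likelihood. To keep the sketch self-contained it replaces constrained-prior sampling with simple rejection sampling (feasible in 1-D); real implementations use MCMC here, which is exactly the setting the NS-SMC convergence analysis addresses. All tuning numbers are illustrative.

```python
import math
import random

random.seed(1)

def loglike(x):
    """Toy Gaussian log-likelihood, peaked at 0.5 with width 0.01."""
    s = 0.01
    return -0.5 * ((x - 0.5) / s) ** 2 - math.log(s * math.sqrt(2 * math.pi))

n_live = 100
live = [random.random() for _ in range(n_live)]  # draws from the prior
log_z = -math.inf   # running log-evidence estimate
log_x = 0.0         # log of the prior volume still enclosed

for _ in range(1000):
    # Discard the worst live point; its likelihood becomes the threshold.
    i = min(range(n_live), key=lambda j: loglike(live[j]))
    l_min = loglike(live[i])
    # Enclosed prior volume shrinks by ~exp(-1/n_live) per iteration.
    log_x_next = log_x - 1.0 / n_live
    log_w = l_min + math.log(math.exp(log_x) - math.exp(log_x_next))
    # Accumulate log_z = logsumexp(log_z, log_w).
    a, b = max(log_z, log_w), min(log_z, log_w)
    log_z = a + math.log1p(math.exp(b - a))
    # Replace the discarded point with a prior draw above the threshold
    # (rejection sampling; an MCMC kernel in realistic problems).
    while True:
        x = random.random()
        if loglike(x) > l_min:
            live[i] = x
            break
    log_x = log_x_next

# The Gaussian is essentially contained in [0, 1], so the true evidence
# is ~1 and the estimated log-evidence should be close to 0.
print("estimated log-evidence:", log_z)
```

The deterministic volume shrinkage `-1/n_live` and the assumption that replacements are independent prior draws are precisely the points where classical NS analysis is fragile; NS-SMC replaces them with a sequential Monte Carlo construction that yields unbiased marginal likelihood estimates and convergence results under MCMC sampling.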