
    The Virtues of Frugality - Why cosmological observers should release their data slowly

    Get PDF
    Cosmologists will soon be in a unique position. Observational noise will gradually be replaced by cosmic variance as the dominant source of uncertainty in an increasing number of observations. We reflect on the ramifications for the discovery and verification of new models. If there are features in the full data set that call for a new model, there will be no subsequent observations to test that model's predictions. We give specific examples of the problem by discussing the pitfalls of model discovery by prior adjustment in the context of dark energy models and inflationary theories. We show how the gradual release of data can mitigate this difficulty, allowing anomalies to be identified, and new models to be proposed and tested. We advocate that observers plan for the frugal release of data from future cosmic variance limited observations. Comment: 5 pages, expanded discussion of Lambda and of blind analysis, added refs. Matches version to appear in MNRAS Letters.
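
    As a quick numerical illustration of why cosmic variance is irreducible: for an angular power spectrum C_l the standard cosmic-variance floor is Delta C_l / C_l = sqrt(2 / ((2l+1) f_sky)), independent of instrument noise. The sketch below is not from the paper; the multipoles and f_sky are arbitrary.

```python
# Minimal sketch (not from the paper): the cosmic-variance floor on a measured
# angular power spectrum C_l. No additional integration time reduces this term;
# f_sky < 1 roughly accounts for partial sky coverage.
import numpy as np

def cosmic_variance_fraction(ell, f_sky=1.0):
    """Fractional 1-sigma uncertainty on C_l from cosmic variance alone."""
    ell = np.asarray(ell, dtype=float)
    return np.sqrt(2.0 / ((2.0 * ell + 1.0) * f_sky))

for ell in (2, 10, 100, 1000):
    print(f"l = {ell:4d}:  Delta C_l / C_l = {cosmic_variance_fraction(ell):.3f}")
```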

    Should we doubt the cosmological constant?

    Get PDF
    While Bayesian model selection is a useful tool to discriminate between competing cosmological models, it only gives a relative rather than an absolute measure of how good a model is. Bayesian doubt introduces an unknown benchmark model against which the known models are compared, thereby obtaining an absolute measure of model performance in a Bayesian framework. We apply this new methodology to the problem of the dark energy equation of state, comparing an absolute upper bound on the Bayesian evidence for a presently unknown dark energy model against a collection of known models including a flat LambdaCDM scenario. We find a strong absolute upper bound to the Bayes factor B between the unknown model and LambdaCDM, giving B < 3. The posterior probability for doubt is found to be less than 6% (with a 1% prior doubt) while the probability for LambdaCDM rises from an initial 25% to just over 50% in light of the data. We conclude that LambdaCDM remains a sufficient phenomenological description of currently available observations and that there is little statistical room for model improvement. Comment: 10 pages, 2 figures.
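
    A minimal numerical sketch of the doubt bookkeeping described above: posterior model probabilities follow from Bayes factors (relative to LambdaCDM) and prior probabilities, with the unknown model's evidence capped by the quoted bound B < 3. The alternative models and their Bayes factors below are invented placeholders, not values from the paper.

```python
# Toy bookkeeping for Bayesian doubt. Evidences are expressed as Bayes factors
# relative to LambdaCDM (B = 1). Only the bound B < 3 for the unknown model X
# comes from the abstract; the alternative models and their Bayes factors are
# invented placeholders for illustration.
def posterior_probs(bayes_factors, priors):
    """Posterior model probabilities from Bayes factors and prior probabilities."""
    norm = sum(b * p for b, p in zip(bayes_factors, priors))
    return [b * p / norm for b, p in zip(bayes_factors, priors)]

models = ["LambdaCDM", "alt. model 1", "alt. model 2", "alt. model 3", "unknown X"]
bayes  = [1.0, 0.5, 0.3, 0.1, 3.0]               # B_X = 3 is the quoted upper bound
priors = [0.2475, 0.2475, 0.2475, 0.2475, 0.01]  # 1% prior doubt, rest split equally

for m, p in zip(models, posterior_probs(bayes, priors)):
    print(f"{m:12s}: {p:6.2%}")
# The entry for "unknown X" is the (upper bound on the) posterior doubt;
# LambdaCDM's probability grows because the data do not favour the alternatives.
```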

    A Coverage Study of the CMSSM Based on ATLAS Sensitivity Using Fast Neural Networks Techniques

    Get PDF
    We assess the coverage properties of confidence and credible intervals on the CMSSM parameter space inferred from a Bayesian posterior and the profile likelihood based on an ATLAS sensitivity study. In order to make those calculations feasible, we introduce a new method based on neural networks to approximate the mapping between CMSSM parameters and weak-scale particle masses. Our method reduces the computational effort needed to sample the CMSSM parameter space by a factor of ~ 10^4 with respect to conventional techniques. We find that both the Bayesian posterior and the profile likelihood intervals can significantly over-cover, and we trace the origin of this effect to physical boundaries in the parameter space. Finally, we point out that the effects intrinsic to the statistical procedure are conflated with simplifications to the likelihood functions from the experiments themselves. Comment: Further checks of the accuracy of the neural network approximation, fixed typos, added refs. Main results unchanged. Matches version accepted by JHEP.
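
    To make the notion of coverage concrete, the toy check below (a stand-in, not the paper's ATLAS/CMSSM machinery) runs many pseudo-experiments for a parameter restricted to a physical region and counts how often a nominal 68% interval contains the truth; the boundary is what drives the over-coverage in this example.

```python
# Toy coverage check (not the paper's pipeline): for a parameter mu restricted
# to mu >= 0, build naive "x +- 1 sigma" intervals clipped to the physical
# region and count how often they contain the true value.
import numpy as np

rng = np.random.default_rng(0)

def coverage(mu_true, n_trials=100_000, sigma=1.0):
    x = rng.normal(mu_true, sigma, size=n_trials)   # pseudo-experiments
    lo = np.maximum(x - sigma, 0.0)                 # clip to physical boundary
    hi = np.maximum(x + sigma, 0.0)
    return np.mean((lo <= mu_true) & (mu_true <= hi))

for mu in (0.0, 0.5, 3.0):
    print(f"true mu = {mu}:  empirical coverage = {coverage(mu):.3f}  (nominal 0.683)")
# Near the boundary (mu = 0) the interval over-covers; far away it is nominal.
```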

    The impact of priors and observables on parameter inferences in the Constrained MSSM

    Get PDF
    We use a newly released version of the SuperBayeS code to analyze the impact of the choice of priors and the influence of various constraints on the statistical conclusions for the preferred values of the parameters of the Constrained MSSM. We assess the effect in a Bayesian framework and compare it with an alternative likelihood-based measure, the profile likelihood. We employ a new scanning algorithm (MultiNest) which increases the computational efficiency by a factor of ~200 with respect to previously used techniques. We demonstrate that the currently available data are not yet sufficiently constraining to allow one to determine the preferred values of CMSSM parameters in a way that is completely independent of the choice of priors and statistical measures. While b->s gamma generally favors large m_0, this is in some contrast with the preference for low values of m_0 and m_1/2 that is almost entirely a consequence of a combination of prior effects and a single constraint coming from the anomalous magnetic moment of the muon, which remains somewhat controversial. Using an information-theoretical measure, we find that the cosmological dark matter abundance determination provides at least 80% of the total constraining power of all available observables. Despite the remaining uncertainties, prospects for direct detection in the CMSSM remain excellent, with the spin-independent neutralino-proton cross section almost guaranteed to lie above sigma_SI ~ 10^{-10} pb, independently of the choice of priors or statistics. Likewise, prospects for gluino and lightest Higgs discovery at the LHC remain highly encouraging. While in this work we have used the CMSSM as the particle physics model, our formalism and scanning technique can be readily applied to a wider class of models with several free parameters. Comment: Minor changes, extended discussion of profile likelihood. Matches JHEP accepted version. SuperBayeS code with MultiNest algorithm available at http://www.superbayes.or
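
    The contrast between the two statistical measures can be illustrated with mock scan samples: the 1D marginal posterior integrates the posterior over the remaining parameters (and so feels prior-volume effects), while the profile likelihood maximizes over them. The sketch below uses an invented two-parameter toy, not CMSSM physics.

```python
# Toy contrast between a 1D marginal posterior and a 1D profile likelihood,
# built from mock scan samples. The two-parameter model is invented; in the
# paper the samples come from a MultiNest scan of the CMSSM.
import numpy as np

rng = np.random.default_rng(1)

# Mock "scan": parameter of interest m0 and a nuisance direction x, with a
# likelihood whose allowed volume in x grows with m0 (a caricature of a
# prior-volume effect).
m0 = rng.uniform(0.0, 4.0, size=200_000)
x  = rng.uniform(-5.0, 5.0, size=200_000)
weights = np.exp(-0.5 * (x / (0.2 + m0)) ** 2)   # likelihood values, flat priors

bins = np.linspace(0.0, 4.0, 41)
idx = np.digitize(m0, bins) - 1
marginal = np.zeros(len(bins) - 1)   # posterior mass per m0 bin
profile  = np.zeros(len(bins) - 1)   # best likelihood per m0 bin
for i in range(len(bins) - 1):
    sel = idx == i
    marginal[i] = weights[sel].sum()
    profile[i]  = weights[sel].max()

marginal /= marginal.sum()
profile  /= profile.max()
print(f"marginal posterior, low vs high m0 bin: {marginal[0]:.4f} vs {marginal[-1]:.4f}")
print(f"profile likelihood, low vs high m0 bin: {profile[0]:.3f} vs {profile[-1]:.3f}")
# The marginal posterior is pulled toward large m0 purely by volume, while the
# profile likelihood stays essentially flat.
```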

    Capture barrier distributions: Some insights and details

    No full text
    The “experimental barrier distribution” provides a parameter-free representation of experimental heavy-ion capture cross sections that highlights the effects of entrance-channel couplings. Its relation to the s-wave transmission is discussed, and in particular it is shown how the full capture cross section can be generated from an l=0 coupled-channels calculation. Furthermore, it is shown how this transmission can be simply exploited in calculations of quasifission and evaporation-residue cross sections. The system ^{48}Ca+^{154}Sm is studied in detail. A calculation of the compound-nucleus spin distribution reveals a possible energy dependence of barrier weights due to polarization arising from target and projectile quadrupole phonon states; this effect also gives rise to an entrance-channel “extra-push”.
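
    For reference, the experimental barrier distribution is conventionally obtained as the second energy derivative of E times the capture cross section, D(E) = d^2(E sigma)/dE^2, evaluated by finite differences. The sketch below applies this to synthetic single-barrier cross sections; the barrier height and radius are illustrative only and stand in for measured data.

```python
# Minimal sketch of the "experimental barrier distribution"
# D(E) = d^2(E*sigma)/dE^2, evaluated with a three-point second difference.
# The cross sections are synthetic (classical single-barrier formula); the
# barrier height and radius are illustrative, standing in for measured capture data.
import numpy as np

E = np.arange(130.0, 160.0, 1.0)                  # c.m. energy grid (MeV)
B, R = 140.0, 11.0                                # barrier height (MeV), radius (fm)
sigma = np.where(E > B, 10.0 * np.pi * R**2 * (1.0 - B / E), 0.0)   # mb

Esig = E * sigma
dE = E[1] - E[0]
D = (Esig[2:] - 2.0 * Esig[1:-1] + Esig[:-2]) / dE**2   # mb/MeV

i_peak = int(np.argmax(D))
print(f"D(E) peaks at E = {E[1:-1][i_peak]:.1f} MeV, i.e. at the barrier B = {B} MeV")
# For a single sharp barrier D(E) is concentrated at E ~ B; entrance-channel
# couplings spread it into the structured distributions discussed above.
```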

    Solving Cosmological Problems of Supersymmetric Axion Models in an Inflationary Universe

    Full text link
    We revisit the inflationary cosmology of axion models in the light of recent developments on the inflaton decay in supergravity. We find that all the cosmological difficulties, including gravitino and axino overproduction and axionic isocurvature fluctuations, can be avoided if the saxion field has a large initial amplitude during inflation and decays before big-bang nucleosynthesis. Comment: 19 pages, 4 figures.

    Rate-Control or Rhythm-Control: Where do we stand?

    Get PDF
    Atrial fibrillation is the most common sustained rhythm disturbance, and its prevalence is increasing worldwide due to the progressive aging of the population. Current guidelines clearly depict the gold-standard management of acute symptomatic atrial fibrillation, but the best long-term approach for first or recurrent atrial fibrillation is still debated with regard to quality of life, risk of new hospitalizations, and possible disabling complications such as thromboembolic stroke, major bleeds and death. Some authors propose that regaining sinus rhythm in all cases, thus re-establishing physiologic cardiac function that does not require prolonged antithrombotic therapy, avoids the threat of intracranial or extracranial haemorrhages due to vitamin K antagonists or aspirin. By contrast, advocates of a rate-control approach with accurate antithrombotic prophylaxis argue that such a strategy may avoid the risk of cardiovascular and non-cardiovascular side effects related to antiarrhythmic drugs. This review explores the state of our knowledge in order to summarize the evidence and the issues that still need to be clarified.

    Phonon-Assisted Two-Photon Interference from Remote Quantum Emitters

    Get PDF
    Photonic quantum technologies are on the verge of finding applications in everyday life, with quantum cryptography and quantum simulators on the horizon. Extensive research has been carried out to identify suitable quantum emitters, and single epitaxial quantum dots have emerged as near-optimal sources of bright, on-demand, highly indistinguishable single photons and entangled photon-pairs. In order to build up quantum networks, it is essential to interface remote quantum emitters. However, this is still an outstanding challenge, as the quantum states of dissimilar “artificial atoms” have to be prepared on-demand with high fidelity and the generated photons have to be made indistinguishable in all possible degrees of freedom. Here, we overcome this major obstacle and show an unprecedented two-photon interference (visibility of 51±5%) from remote strain-tunable GaAs quantum dots emitting on-demand photon-pairs. We achieve this result by exploiting for the first time the full potential of a novel phonon-assisted two-photon excitation scheme, which allows for the generation of highly indistinguishable (visibility of 71±9%) entangled photon-pairs (fidelity of 90±2%), enables push-button biexciton state preparation (fidelity of 80±2%), and outperforms conventional resonant two-photon excitation schemes in terms of robustness against environmental decoherence. Our results mark an important milestone for the practical realization of quantum repeaters and complex multiphoton entanglement experiments involving dissimilar artificial atoms.
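
    For context, a common way to extract a two-photon interference visibility in such remote-emitter experiments is from the zero-delay coincidences recorded with the two photons co- and cross-polarized, V = (A_perp - A_par)/A_perp; this is a generic estimator, not necessarily the exact analysis used here, and the counts below are purely illustrative.

```python
# Generic estimator (not necessarily the authors' exact analysis): two-photon
# interference visibility from zero-delay coincidence counts recorded with the
# interfering photons co-polarized (a_par) and cross-polarized (a_perp).
def tpi_visibility(a_par, a_perp):
    """V = (A_perp - A_par) / A_perp."""
    return (a_perp - a_par) / a_perp

# Illustrative counts only:
print(f"V = {tpi_visibility(a_par=490, a_perp=1000):.2f}")   # ~0.51, cf. the value quoted above
```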