
    Preparing for Dark Matter: Maximising our discrimination power in the event of detection

    Numerous experimental observations place Dark Matter (DM) as a central character in our cosmological history. Many extensions to the Standard Model of particle physics provide candidates for DM, often predicting interactions beyond gravity. This gives us the opportunity to experimentally probe these extensions and determine the nature of DM. In this thesis, we explore how direct DM detection could be used most effectively to achieve this goal. With this in mind, we have developed a tool for performing multidimensional parameter scans. This tool allows us to evaluate the capabilities of current and future detectors for detecting and understanding DM interactions. We show that by extending the energy region analysed, detection sensitivities and parameter reconstruction can be improved substantially. These insights play an important role in more global analyses, where hints of DM could come from other experiments, but verification depends on direct detection.
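    As a toy illustration of the kind of multidimensional scan described above (a sketch, not the thesis' actual tool), one could evaluate a binned Poisson likelihood over a grid of dark matter masses and cross-sections; every number below, including the recoil spectrum, energy window and background level, is a placeholder.

        import numpy as np
        from scipy.stats import poisson

        # Placeholder signal model: a toy exponential recoil spectrum, not a real
        # detector response or halo model.
        def expected_signal(m_dm, sigma, bin_edges):
            E = 0.5 * (bin_edges[:-1] + bin_edges[1:])        # bin centres in keV
            rate = sigma / 1e-46 * np.exp(-E / (2.0 + 0.1 * m_dm))
            return rate * np.diff(bin_edges)

        def log_likelihood(observed, m_dm, sigma, background, bin_edges):
            mu = expected_signal(m_dm, sigma, bin_edges) + background
            return poisson.logpmf(observed, mu).sum()

        # Widening the analysed energy window adds bins, and hence information,
        # which is the effect the abstract refers to.
        bin_edges = np.linspace(1.0, 40.0, 20)                # keV, placeholder window
        background = np.full(len(bin_edges) - 1, 0.5)         # flat background guess
        observed = np.random.poisson(background)              # background-only pseudo-data

        masses = np.logspace(0, 3, 50)                        # GeV
        sigmas = np.logspace(-47, -44, 50)                    # cm^2
        grid = [[log_likelihood(observed, m, s, background, bin_edges)
                 for s in sigmas] for m in masses]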

    Does the Orientation of an Euler Diagram Affect User Comprehension?

    Euler diagrams, which form the basis of numerous visual languages, can be an effective representation of information when they are both well-matched and well-formed. However, being well-matched and well-formed alone does not imply effectiveness; other diagrammatic properties need to be considered. Information visualization theorists have known for some time that orientation has the potential to affect our interpretation of diagrams. This paper begins by explaining why well-matched and well-formed drawing principles are insufficient and discusses why we should study the orientation of Euler diagrams. To this end, an empirical study is presented, designed to observe the effect of orientation on the comprehension of Euler diagrams. The paper concludes that the orientation of Euler diagrams does not significantly affect comprehension.

    Extending preferred axion models via heavy-quark induced early matter domination

    We examine the cosmological consequences of the heavy quarks in KSVZ-type axion models. We find that their presence often causes an early matter domination phase, altering the evolution of the Universe. This extends the axion mass into the region where standard cosmology leads to overproduction, and allows a greater number of axion models with non-renormalizable terms to be viable. Quantitatively, we find that decays proceeding through effective terms of up to dimension 9 ($d=9$) remain consistent with cosmological constraints, in contrast with the result $d\leq 5$ previously found in the literature. As a consequence, the heavy quarks can be much heavier and the axion mass window with the correct relic density for dark matter is extended by orders of magnitude, down to $m_a\approx 6\times 10^{-9}\,{\rm eV}$. This is achieved without resorting to fine-tuning of the initial misalignment angle, bolstering the motivation for many future axion haloscope experiments. Additionally, we explore how these models can be probed through measurements of the number of relativistic degrees of freedom at recombination. Comment: 24 pages, 6 figures.
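    A rough dimensional estimate (not the paper's detailed calculation) of why the operator dimension matters here: a heavy quark of mass $m_Q$ decaying through an effective operator of dimension $d$ suppressed by a scale $\Lambda$ has a width of order

        $\Gamma_Q \sim \frac{m_Q}{8\pi}\left(\frac{m_Q}{\Lambda}\right)^{2(d-4)},$

    so larger $d$ means a longer-lived quark, a longer period of early matter domination, and more entropy dilution of the axion relic abundance, which is what opens up the lower axion mass window.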

    Light and Darkness: consistently coupling dark matter to photons via effective operators

    We revise the treatment of fermionic dark matter interacting with photons via dimension-5 and -6 effective operators. We show how applying the effective operators beyond their validity introduces unphysical, gauge-violating effects that are relevant for current experimental searches. Restoring gauge invariance by coupling dark matter to the hypercharge gauge field has implications for the parameter space above and below the electroweak scale. We review the phenomenology of these hypercharge form factors at the LHC as well as for direct and indirect detection experiments. We highlight where the electromagnetic and hypercharge descriptions lead to wildly different conclusions about the viable parameter space and the relative sensitivity of various probes. These include a drastic weakening of vector boson fusion versus mono-jet searches at the LHC, and the incorrect impression that indirect searches could lead to better constraints than direct detection for larger dark matter masses. We find that the dimension-5 operators are strongly constrained by direct detection bounds, while for dimension-6 operators LHC mono-jet searches are competitive with or perform better than the other probes we consider. Comment: 24 pages, 14 figures, 2 tables. Matches published version, additional information in figures.
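    For orientation, the dimension-5 couplings in question are the magnetic and electric dipole operators; a schematic electromagnetic form (normalisations are illustrative) is

        $\mathcal{L} \supset \tfrac{1}{2}\,\mu_\chi\,\bar\chi\sigma^{\mu\nu}\chi\,F_{\mu\nu} + \tfrac{i}{2}\,d_\chi\,\bar\chi\sigma^{\mu\nu}\gamma^5\chi\,F_{\mu\nu},$

    while the gauge-invariant hypercharge version replaces $F_{\mu\nu}$ with $B_{\mu\nu} = \cos\theta_W F_{\mu\nu} - \sin\theta_W Z_{\mu\nu}$, so above the electroweak scale the same operator necessarily introduces a coupling to the $Z$ as well as to the photon.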

    Isospin-violating dark matter at liquid noble detectors: new constraints, future projections, and an exploration of target complementarity

    There is no known reason that dark matter interactions with the Standard Model should couple to neutrons and protons in the same way. This isospin violation can have large consequences, modifying the sensitivity of existing and future direct detection experimental constraints by orders of magnitude. Previous works in the literature have focused on the zero-momentum limit, which has its limitations when the analysis is extended to the Non-Relativistic Effective Field Theory (NREFT) basis. In this paper, we study isospin violation in a detailed manner, paying specific attention to the experimental setups of liquid noble detectors. We analyse two Standard Model gauge-invariant effective models as interesting case studies, as well as the more model-independent NREFT operators. This work demonstrates the high degree of complementarity between the target nuclei xenon and argon. Most notably, we show that the Standard Model gauge-invariant formulation of the standard spin-dependent interaction often generates a sizeable response from argon, a target nucleus with zero spin. This work is meant as an update and a useful reference for model builders and experimentalists. Comment: 22 pages in total, 13 figures, 1 table, 3 appendices. Data from the main results of this paper is available at https://doi.org/10.1140/epjc/s10052-023-11826-
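    A minimal sketch of the xenon-argon complementarity for isospin-violating couplings, using only the standard coherent spin-independent scaling $\propto [Z f_p + (A-Z) f_n]^2$ summed over isotopes (approximate natural abundances, no nuclear form factors and none of the NREFT structure studied in the paper):

        # Dominant isotopes as (mass number A, approximate natural abundance).
        XENON = [(128, 0.019), (129, 0.264), (130, 0.041), (131, 0.212),
                 (132, 0.269), (134, 0.104), (136, 0.089)]
        ARGON = [(40, 0.996)]

        def si_response(isotopes, Z, fn_over_fp):
            """Coherent SI response ~ sum_i x_i * [Z + (A_i - Z) * fn/fp]^2, with fp = 1."""
            return sum(x * (Z + (A - Z) * fn_over_fp) ** 2 for A, x in isotopes)

        for r in (1.0, -0.7):   # isospin-conserving vs. a strongly "xenophobic" choice
            xe = si_response(XENON, 54, r)
            ar = si_response(ARGON, 18, r)
            print(f"fn/fp = {r:+.1f}:  Xe response {xe:9.1f},  Ar response {ar:7.1f}")

    For $f_n/f_p \approx -0.7$ the xenon response nearly cancels while argon's does not, which is one face of the target complementarity the abstract highlights.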

    Dark matter production through a non-thermal flavon portal

    The Froggatt-Nielsen (FN) mechanism provides an attractive way of generating the observed fermion mass hierarchy and quark mixing matrix elements in the Standard Model (SM). Here we extend it by coupling the FN field, the flavon, to a dark sector containing one or more dark matter particles, which are produced non-thermally through flavon production. Non-thermal flavon production occurs efficiently via freeze-in and through field oscillations. We explore this in the regime of a high breaking scale $\Lambda$ of the global $U(1)_{\rm FN}$ group and a reheating temperature $T_R\ll \Lambda$, where the flavon remains out of equilibrium at all times. We identify phenomenologically acceptable regions of $T_R$ and the flavon mass where the relic abundance of dark matter and other cosmological constraints are satisfied. In the case of one-component dark matter we find an effective upper limit on the FN charges at high $\Lambda$, i.e. $Q_{\rm FN}^{\rm DM}\leq 13$. In the multi-component dark sector scenario, the dark matter can be the heaviest dark particle, which can be effectively stable on cosmological timescales, or it can be produced sequentially by decays of the heavier states. For scenarios where dark decays occur at intermediate timescales, i.e. $t\sim 0.1-10^{28}\,{\rm s}$, we find that existing searches can effectively probe interesting regions of parameter space. These include indirect probes of decays, such as $\gamma$-ray and neutrino telescopes and analyses of the Cosmic Microwave Background, as well as constraints on small-scale structure formation from the Lyman-$\alpha$ forest. We comment on the future prospects of such probes and present projected sensitivities. Comment: 21 pages, 4 figures.
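    For context, in the Froggatt-Nielsen mechanism an operator carrying net FN charge $n$ is suppressed by powers of the small parameter $\epsilon = \langle\Phi\rangle/\Lambda \sim 0.2$ (of order the Cabibbo angle), giving effective couplings of the schematic form

        $y_{\rm eff} \sim \left(\frac{\langle\Phi\rangle}{\Lambda}\right)^{n} = \epsilon^{\,n}.$

    The bound $Q_{\rm FN}^{\rm DM}\leq 13$ quoted above is then an upper limit on how much of this suppression the dark matter coupling to the flavon can carry while still satisfying the relic abundance and other cosmological constraints.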

    Evaporation of Primordial Black Holes in the Early Universe: Mass and Spin Distributions

    Many cosmological phenomena lead to the production of primordial black holes in the early Universe. These phenomena often create a population of black holes with extended mass and spin distributions. As these black holes evaporate via Hawking radiation, they can modify various cosmological observables, lead to the production of dark matter, modify the effective number of relativistic degrees of freedom, $N_{\rm eff}$, source a stochastic gravitational wave background and alter the dynamics of baryogenesis. We consider the Hawking evaporation of primordial black holes that feature non-trivial mass and spin distributions in the early Universe. We demonstrate that the shape of such a distribution can strongly affect most of the aforementioned cosmological observables. We outline the numerical machinery we use to undertake this task. We also release a new version of FRISBHEE that handles the evaporation of primordial black holes with an arbitrary mass and spin distribution throughout cosmic history. Comment: 16 pages, 6 figures. Numerical codes released at https://github.com/yfperezg/frisbhe
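    As a rough companion to the above (a back-of-the-envelope sketch, not FRISBHEE itself): for a non-rotating black hole the Hawking temperature scales as $1/M$ and the evaporation lifetime as $M^3$, so an extended mass distribution maps directly onto a broad spread of evaporation times. The snippet uses the textbook photon-only lifetime formula; emission of the full set of Standard Model species, spin and greybody factors (which FRISBHEE treats properly) would shorten the lifetimes considerably.

        import numpy as np

        def hawking_temperature_GeV(M_grams):
            # T_H = hbar c^3 / (8 pi G M k_B) ~ 1.06e13 GeV * (1 g / M)
            return 1.06e13 / M_grams

        def evaporation_time_s(M_grams):
            # Photon-only textbook estimate: t ~ 5120 pi G^2 M^3 / (hbar c^4)
            return 8.4e-26 * M_grams ** 3

        # A log-normal mass distribution, one example of the extended distributions
        # discussed in the abstract.
        masses = np.random.lognormal(mean=np.log(1e8), sigma=0.5, size=10_000)  # grams
        lifetimes = evaporation_time_s(masses)
        print(f"median lifetime: {np.median(lifetimes):.2e} s, "
              f"central 95%: {np.percentile(lifetimes, 2.5):.2e}"
              f" - {np.percentile(lifetimes, 97.5):.2e} s")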

    Concert recording 2016-09-16

    [Tracks 1-3]. Passione amorosa / Giovanni Bottesini -- [Tracks 4-5]. Gran duo concertante for violin, double bass, and piano / Bottesini -- [Track 6]. Quintet for piano and strings in A major, op. 114, D. 667, Trout quintet / Franz Schubert

    The problem with Kappa

    It is becoming clear that traditional evaluation measures used in Computational Linguistics (including Error Rates, Accuracy, Recall, Precision and F-measure) are of limited value for unbiased evaluation of systems, and are not meaningful for comparison of algorithms unless both the dataset and algorithm parameters are strictly controlled for skew (Prevalence and Bias). The use of techniques originally designed for other purposes, in particular Receiver Operating Characteristic Area Under the Curve, plus variants of Kappa, has been proposed to fill the void. This paper aims to clear up some of the confusion relating to evaluation by demonstrating that the usefulness of each evaluation method is highly dependent on the assumptions made about the distributions of the dataset and the underlying populations. The behaviour of a number of evaluation measures is compared under common assumptions. Deploying a system in a context which has the opposite skew from its validation set can be expected to approximately negate Fleiss Kappa and halve Cohen Kappa, but leave Powers Kappa unchanged. For most performance evaluation purposes, the latter is thus most appropriate, whilst for comparison of behaviour, Matthews Correlation is recommended.
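    A minimal sketch of two of the quantities being contrasted, using their standard textbook definitions rather than anything from the paper itself: Cohen Kappa corrects observed agreement by a chance term built from both marginals, while Powers Kappa (Informedness, or Bookmaker Informedness) reduces to sensitivity + specificity - 1 in the binary case. The toy deployment below does not reproduce the paper's exact skew-reversal scenario, but it shows Cohen Kappa moving with class prevalence for a classifier whose true/false positive rates are fixed, while Informedness stays put.

        def cohen_kappa(tp, fn, fp, tn):
            n = tp + fn + fp + tn
            po = (tp + tn) / n                                            # observed agreement
            pe = ((tp + fn) * (tp + fp) + (fn + tn) * (fp + tn)) / n**2   # chance agreement
            return (po - pe) / (1 - pe)

        def informedness(tp, fn, fp, tn):
            # Powers Kappa / Bookmaker Informedness = recall + inverse recall - 1.
            return tp / (tp + fn) + tn / (tn + fp) - 1

        # Same classifier behaviour (80% TPR, 90% TNR) on two oppositely skewed populations.
        for pos, neg in [(900, 100), (100, 900)]:
            tp, fn = 0.8 * pos, 0.2 * pos
            tn, fp = 0.9 * neg, 0.1 * neg
            print(f"P:N = {pos}:{neg}   Cohen = {cohen_kappa(tp, fn, fp, tn):.3f}"
                  f"   Powers = {informedness(tp, fn, fp, tn):.3f}")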