
    Phenomenology of Dark Matter Annihilations in the Sun

    The annihilation of dark matter (DM) particles accumulated in the Sun could produce a flux of neutrinos that is potentially detectable with neutrino detectors/telescopes, allowing the DM elastic scattering cross section to be constrained. Although the capture of DM in astrophysical objects like the Sun is commonly assumed to proceed only through interactions with nucleons, there are scenarios in which tree-level DM couplings to quarks are absent; even if loop-induced interactions with nucleons are allowed, scattering off electrons could be the dominant capture mechanism. We consider this possibility and study in detail all the ingredients necessary to compute the neutrino production rates from DM annihilations in the Sun (the capture, annihilation and evaporation rates) for velocity-independent and isotropic, velocity-dependent and isotropic, and momentum-dependent scattering cross sections for DM interactions with electrons, and compare them with the results obtained for the case of interactions with nucleons. Moreover, we improve the usual calculations in a number of ways and provide analytical expressions. Interestingly, we find that the evaporation mass in the case of interactions with electrons could lie below the GeV range, depending on the high-velocity tail of the DM distribution in the Sun, which would open a new mass window for searches for this type of scenario.
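    The capture, annihilation and evaporation rates mentioned above combine in the standard population equation dN/dt = C_cap − C_ann NÂČ − C_evap N, which admits a closed-form solution. A minimal sketch in Python, with placeholder rate constants rather than values from the paper:

        # Standard DM population equation in the Sun:
        #   dN/dt = C_cap - C_ann * N^2 - C_evap * N,
        # with closed-form solution
        #   N(t) = C_cap * tanh(t/zeta) / (1/zeta + (C_evap/2) * tanh(t/zeta)),
        #   zeta = 1 / sqrt(C_cap * C_ann + C_evap^2 / 4).
        # All rate constants below are placeholders for illustration only.

        import numpy as np

        C_cap  = 1.0e20   # capture rate [s^-1] (placeholder)
        C_ann  = 1.0e-55  # annihilation rate coefficient [s^-1] (placeholder)
        C_evap = 1.0e-16  # evaporation rate [s^-1] (placeholder)

        zeta = 1.0 / np.sqrt(C_cap * C_ann + 0.25 * C_evap**2)  # equilibration time scale

        def N_dm(t):
            """Number of captured DM particles after time t [s]."""
            th = np.tanh(t / zeta)
            return C_cap * th / (1.0 / zeta + 0.5 * C_evap * th)

        t_sun = 4.6e9 * 3.15e7               # age of the Sun [s]
        N_now = N_dm(t_sun)
        Gamma_ann = 0.5 * C_ann * N_now**2   # annihilation rate sourcing the neutrino flux
        print(f"N = {N_now:.3e}, Gamma_ann = {Gamma_ann:.3e} s^-1")

    In the familiar capture-annihilation equilibrium limit (t_Sun >> zeta) the tanh saturates, Gamma_ann approaches C_cap/2, and the neutrino flux depends on the scattering cross section only through the capture rate.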

    Warm Dark Matter Galaxy Formation

    Numerous hypothetical particles have been predicted which might possibly make up the dark matter content of the Universe. One class of these particle candidates comprises warm dark matter (WDM) particles, which have large early-time thermal velocities that serve to erase small-scale perturbations. This creates a cutoff in the linear power spectrum - the scale of which depends on the mass of the WDM particle - and results in a suppression of the numbers of low-mass halos. Since the number of satellite galaxies around Milky Way-mass host galaxies is sensitive to this cutoff, we can use the number of satellites actually observed around our own galaxy as a test of different WDM models (such as sterile neutrinos). First, we explore the simplest case of a thermal relic WDM particle (and alternatively a sterile neutrino produced via non-resonant oscillations). We use the galform semi-analytic model of galaxy formation to compare predicted satellite luminosity functions to Milky Way data and determine a lower bound on the WDM particle mass. This depends strongly on the Milky Way halo mass and, to some extent, on the baryonic physics assumed. For our fiducial model we find that for a thermal relic particle mass of 3.3 keV (the 2σ lower limit from an analysis of the Lyman-α forest by Viel et al.) the Milky Way halo mass is required to be > 1.4 × 10ÂčÂČ M⊙. For this same fiducial model, we also find that all WDM particle masses are ruled out (at 95% confidence) if the Milky Way halo mass is smaller than 1.0 × 10ÂčÂČ M⊙, while if the mass of the Galactic halo is less than 1.8 × 10ÂčÂČ M⊙, only WDM relic particle masses larger than 2 keV are allowed. Next, we consider models in which some of the WDM particles are resonantly produced sterile neutrinos, which behave “colder” than the non-resonantly produced population also being generated. This model of sterile neutrino dark matter is well motivated theoretically, and is also in less conflict with current Lyman-α bounds. This scenario then becomes a two-parameter problem involving both the particle mass and the resonant fraction. We repeat the satellite abundance test on this new problem to rule out parts of the parameter space for different Milky Way halo masses. Focusing on a 7 keV sterile neutrino particle, which may have been hinted at by recent observations, we find that if the Milky Way halo mass is 2 × 10ÂčÂČ M⊙ then most cases are allowed, but if the mass is 1 × 10ÂčÂČ M⊙ then this particle is likely ruled out.
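    The power-spectrum cutoff referred to above can be made concrete with the widely used Viel et al. (2005) fitting formula for the thermal relic WDM transfer function, T(k) = [1 + (αk)^(2Îœ)]^(−5/Îœ) with Îœ = 1.12. A minimal sketch (the cosmological parameters are illustrative assumptions, not necessarily the thesis's exact choices):

        # Thermal relic WDM cutoff via the Viel et al. (2005) transfer-function fit.
        import numpy as np

        def alpha_wdm(m_keV, omega_wdm=0.25, h=0.7):
            # Cutoff scale alpha in h^-1 Mpc (Viel et al. 2005 fit).
            return 0.049 * m_keV**(-1.11) * (omega_wdm / 0.25)**0.11 * (h / 0.7)**1.22

        def k_half_mode(m_keV, nu=1.12):
            # Wavenumber [h Mpc^-1] where T(k) drops to 1/2.
            return (2.0**(nu / 5.0) - 1.0)**(1.0 / (2.0 * nu)) / alpha_wdm(m_keV)

        def half_mode_mass(m_keV, omega_m=0.3, rho_crit=2.775e11):
            # Mass [h^-1 Msun] of a sphere of radius lambda_hm/2 at the mean
            # matter density; halos below this scale are strongly suppressed.
            lam = 2.0 * np.pi / k_half_mode(m_keV)   # h^-1 Mpc
            return (4.0 * np.pi / 3.0) * omega_m * rho_crit * (lam / 2.0)**3

        for m in (2.0, 3.3):                         # masses quoted in the abstract
            print(f"m = {m} keV: k_hm = {k_half_mode(m):.1f} h/Mpc, "
                  f"M_hm = {half_mode_mass(m):.2e} h^-1 Msun")

    For a 3.3 keV relic this gives k_hm ≈ 35 h Mpc⁻Âč and a half-mode mass of order 10⁞ h⁻Âč M⊙, the halo-mass regime probed by Milky Way satellite counts.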

    Current, September 26, 2011

    https://irl.umsl.edu/current2010s/1087/thumbnail.jp

    Characterization and Optimization of the KATRIN Tritium Source

    The Karlsruhe Tritium Neutrino (KATRIN) experiment aims to measure the effective electron anti-neutrino mass via high-precision spectroscopy of the energy spectrum of ÎČ-decay electrons of tritium near the 18.6 keV endpoint with unprecedented accuracy. The specifications of the KATRIN experiment result in an experimental setup with a target discovery potential of 5σ for a neutrino mass of 350 meV/cÂČ and the expected capability to push the upper limit on the neutrino mass down to a target of 200 meV/cÂČ (90% C.L.) if no neutrino mass signal is detected. To achieve this unprecedented sensitivity, both statistical and systematic uncertainties have to be stringently minimized. The reduction of statistical uncertainties requires a total measurement duration of 3 years with a windowless gaseous tritium source (WGTS) capable of producing 10Âčq ÎČ-electrons per second. To keep the systematic uncertainties at the level required to reach the target sensitivity, the activity of this ÎČ-electron source needs to be stable at the level of 0.1%. This stability is influenced by several factors, such as the purity and pressure of the tritium gas, as well as the temperature of the WGTS beam tube enclosing the gaseous tritium. Therefore, in order to minimize systematic uncertainties, a very stable injection of tritium gas into the beam tube is necessary. This is achieved by the tritium loop system. The main focus of this thesis is the optimization of the stabilized injection of tritium gas into the WGTS, the in-depth characterization and modeling of the isotopic gas composition inside the WGTS, and the measurement and continuous monitoring methods for the WGTS column density. Summarizing the results presented in this thesis, it was shown that the stability of the tritium column density of the WGTS meets the requirement of 0.1%. A variety of column density monitoring methods were implemented and have been used to derive the current best upper limit on the neutrino mass from direct measurement of 1.1 eV/cÂČ. This outstanding performance in both stability and monitoring can be achieved reliably and reproducibly, as required for the upcoming measurement campaigns that the KATRIN experiment needs in order to meet its scientific goal of a 200 meV/cÂČ sensitivity on the neutrino mass.
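    The mass sensitivity comes entirely from the last few eV below the endpoint, where a nonzero neutrino mass distorts the ÎČ-decay phase space. A minimal sketch of the simplified spectrum shape (Fermi function and final-state corrections omitted; the endpoint value is approximate and all numbers are illustrative):

        # Simplified tritium beta spectrum near the 18.6 keV endpoint:
        #   dGamma/dE ~ p_e * (E + m_e) * eps * sqrt(eps^2 - m_nu^2),  eps = E0 - E,
        # nonzero only for eps >= m_nu. Natural units, energies in eV.
        import numpy as np

        M_E = 510998.95   # electron mass [eV]
        E0  = 18574.0     # tritium endpoint energy [eV] (approximate)

        def beta_spectrum(E, m_nu):
            """Differential rate (arbitrary units) at electron kinetic energy E [eV]."""
            eps = E0 - E
            rate = np.zeros_like(E)
            ok = eps >= m_nu
            p = np.sqrt(E[ok]**2 + 2.0 * E[ok] * M_E)   # electron momentum [eV/c]
            rate[ok] = p * (E[ok] + M_E) * eps[ok] * np.sqrt(eps[ok]**2 - m_nu**2)
            return rate

        E = np.linspace(E0 - 5.0, E0, 500)
        massless = beta_spectrum(E, 0.0)
        massive  = beta_spectrum(E, 1.1)   # 1.1 eV, the direct limit quoted above
        # Only a fraction ~(eps/E0)^3 ~ 2e-13 of all decays falls in the last eV,
        # which is why the WGTS must supply ~1e11 beta-electrons per second.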

    Prototype of machine learning “as a service” for CMS physics in signal vs background discrimination

    Big volumes of data are collected and analysed by the LHC experiments at CERN. The success of this scientific challenge is ensured by a great amount of computing power and storage capacity, operated over high-performance networks, in very complex LHC computing models on the LHC Computing Grid infrastructure. Now in Run-2 data taking, the LHC has an ambitious and broad experimental programme for the coming decades: it includes large investments in detector hardware, and it similarly requires commensurate investment in R&D in software and computing to acquire, manage, process, and analyse the sheer amounts of data to be recorded in the High-Luminosity LHC (HL-LHC) era. The new rise of Artificial Intelligence - related to the current Big Data era, to technological progress, and to the democratization and efficient allocation of resources at affordable costs through cloud solutions - is posing new challenges but also offering extremely promising techniques, not only for the commercial world but also for scientific enterprises such as HEP experiments. Machine Learning and Deep Learning are rapidly evolving approaches to characterising and describing data, with the potential to radically change how data is reduced and analysed, including at the LHC. This thesis aims at contributing to the construction of a Machine Learning “as a service” solution for CMS physics needs, namely an end-to-end data service that serves trained Machine Learning models to the CMS software framework. Toward this ambitious goal, this thesis work contributes firstly a proof of concept of a first prototype of such an infrastructure, and secondly a specific physics use case: signal versus background discrimination in the study of CMS all-hadronic top quark decays, done with scalable Machine Learning techniques.
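    As an illustration of the use case, signal-versus-background discrimination reduces to binary classification on event-level features. A minimal, self-contained sketch on synthetic data (the thesis's actual features, model, and serving infrastructure differ):

        # Toy signal-vs-background discrimination as binary classification.
        # The synthetic features stand in for high-level kinematic variables.
        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        n = 20000

        # "Signal" events are shifted in feature space relative to "background".
        background = rng.normal(loc=0.0, scale=1.0, size=(n, 5))
        signal     = rng.normal(loc=0.5, scale=1.0, size=(n, 5))
        X = np.vstack([background, signal])
        y = np.concatenate([np.zeros(n), np.ones(n)])

        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.3, random_state=0)

        clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
        clf.fit(X_train, y_train)

        # Area under the ROC curve, the usual figure of merit for S/B separation.
        auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
        print(f"ROC AUC = {auc:.3f}")

    The per-event score from clf.predict_proba plays the role of the discriminant that an “as a service” deployment would return to the experiment's software framework.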

    Quantum Ecologies in Cosmological Infrastructures: A Critical Holographer's Encounters with the Meta/Physics of Landscape-Laboratories

    Quantum Ecologies interrogates the role of physics in the construction of an indifferent and disenchanted universe. It explores conceptual resonances within and between new materialism, Indigenous philosophy of place, science fiction, and art. Quantum Ecologies recognizes that the world is alive and wise and considers relevant modes of responsible address within and as the Earth. Through theoretical and historical analysis, site-based research, and a/v installation, Quantum Ecologies has developed the heuristic of the ‘holographic’ as a way to attend to the multi-temporal, co-present, and multi-scalar pluralities and layers of knowing, agency, and landscape. This feminist, anti-colonial art-science framework for critically engaging (physics) sites and philosophies addresses the scientific cosmology of the West that (inadvertently) legitimates the exploitation, dispossession, and extraction of Earthly beings and bodies. Holography as critical interferometry is applied to experimental sites and assemblages known as ‘landscape-laboratories’ as a mode of both reading and (re)writing them. My field/work has taken place in remote environmentally protected sites that are entangled and instrumentalized as cosmological sensing arrays, experimental nuclear fusion energy facilities, or dark matter particle physics laboratories in Russia, France, the UK, Germany, and Canada. By thinking through the strangeness of these planetary quantum assemblages alongside science's inheritances and genealogies in magic, alchemy, and mysticism, I argue for the necessity of ‘another science’ that is situated, compassionate, and responsible. Quantum Ecologies proposes a plural, poly-perspectival assessment of place, where accounting for the promiscuous more-than of materials, sites, forces, and energies is a necessary and continuous (re)configuring of meta/physics and respectful anti-colonial engagement with Land.
    • 
