
    JUNO Conceptual Design Report

    The Jiangmen Underground Neutrino Observatory (JUNO) is proposed to determine the neutrino mass hierarchy using an underground liquid scintillator detector. It is located 53 km from both the Yangjiang and Taishan nuclear power plants in Guangdong, China. The experimental hall, spanning more than 50 m, lies under a granite mountain with an overburden of more than 700 m. Within six years of running, the detection of reactor antineutrinos can resolve the neutrino mass hierarchy at a significance of 3-4σ and determine the neutrino oscillation parameters sin²θ₁₂, Δm²₂₁, and |Δm²ₑₑ| to better than 1%. The JUNO detector can also be used to study terrestrial and extraterrestrial neutrinos and new physics beyond the Standard Model. The central detector contains 20,000 tons of liquid scintillator in an acrylic sphere 35 m in diameter. About 17,000 PMTs of 508 mm diameter with high quantum efficiency provide roughly 75% optical coverage. The current choice of liquid scintillator is linear alkyl benzene (LAB) as the solvent, with PPO as the scintillation fluor and bis-MSB as a wavelength shifter. The number of detected photoelectrons per MeV exceeds 1,100, and the energy resolution is expected to be 3% at 1 MeV. The calibration system is designed to deploy multiple sources that cover the entire energy range of reactor antineutrinos and to achieve full-volume position coverage inside the detector. The veto system, consisting of a water Cherenkov detector and a Top Tracker, is used for muon detection and for the study and reduction of muon-induced backgrounds. The readout system, the detector control system, and the offline system ensure efficient and stable data acquisition and processing.
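
    As a quick plausibility check (a sketch added here, not taken from the report): if the energy response were limited purely by photoelectron counting statistics, the quoted light yield of more than 1,100 photoelectrons per MeV already implies a resolution of about 3% at 1 MeV, since σ_E/E ≈ 1/√N_pe.

```cpp
// Back-of-envelope check (assumption: purely Poisson photoelectron statistics,
// ignoring non-stochastic terms) of the energy resolution implied by the
// quoted light yield of the JUNO central detector.
#include <cmath>
#include <cstdio>

int main() {
    const double pe_per_MeV = 1100.0;                 // detected photoelectrons per MeV (quoted above)
    const double energies[] = {1.0, 2.0, 4.0, 8.0};   // example visible energies in MeV
    for (double E : energies) {
        double n_pe = pe_per_MeV * E;                 // expected number of photoelectrons
        double reso = 1.0 / std::sqrt(n_pe);          // relative resolution in the Poisson limit
        std::printf("E = %4.1f MeV  ->  sigma_E/E = %.2f%%\n", E, 100.0 * reso);
    }
    return 0;  // prints ~3.0% at 1 MeV, consistent with the expected 3% resolution there
}
```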

    Energy reconstruction on the LHC ATLAS TileCal upgraded front end: feasibility study for a sROD co-processing unit

    Dissertation presented in fulfilment of the requirements for the degree of Master of Science in Physics, 2016. The Phase-II upgrade of the Large Hadron Collider at CERN in the early 2020s will enable an order of magnitude increase in the data produced, unlocking the potential for new physics discoveries. In the ATLAS detector, the upgraded hadronic Tile Calorimeter (TileCal) Phase-II front-end readout system is currently being prototyped to handle a total data throughput of 5.1 TB/s, up from the current 20.4 GB/s. The FPGA-based Super Read Out Driver (sROD) prototype must perform an energy reconstruction algorithm on 2.88 GB/s of raw data, or 275 million events per second. Because FPGA firmware development is time-consuming and demands a very high level of proficiency, it may be more effective to implement certain complex energy reconstruction and monitoring algorithms on a general-purpose, CPU-based sROD co-processor. Hence, the feasibility of a general-purpose ARM System-on-Chip based co-processing unit (PU) for the sROD is determined in this work. A PCI-Express test platform was designed and constructed to link two ARM Cortex-A9 SoCs via their PCI-Express Gen-2 x1 interfaces. Test results indicate that the latency of the PCI-Express interface is sufficiently low, and its data throughput superior to that of alternative interfaces such as Ethernet, for use as an interconnect between the SoCs and the sROD. CPU performance benchmarks were performed on five ARM development platforms to determine CPU integer, floating-point, and memory system performance as well as energy efficiency. To complement the benchmarks, Fast Fourier Transform and Optimal Filtering (OF) applications were also tested. Based on the test results, for the PU to process 275 million events per second with OF within the 6 µs timing budget of the ATLAS triggering system, a cluster of three Tegra K1 (Cortex-A15) SoCs connected to the sROD via a Gen-2 x8 PCI-Express interface would be suitable. A high-level design for the PU is proposed which surpasses the requirements for the sROD co-processor and can also be used in a general-purpose, high-data-throughput system, with 80 Gb/s Ethernet and 15 GB/s PCI-Express throughput, using four X-Gene SoCs.
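
    For context, the Optimal Filtering reconstruction mentioned above reduces, per channel, to a short weighted sum over the digitized samples, which is what makes it a plausible candidate for offloading to a general-purpose co-processor. The sketch below shows only that arithmetic; the weights, samples, and pedestal are placeholder values (real weights are derived per channel from the pulse shape and noise autocorrelation), and it is not the thesis implementation.

```cpp
// Minimal sketch of the amplitude step of non-iterative Optimal Filtering (OF):
// the amplitude is a weighted sum of the pedestal-subtracted digitized samples.
// Weights and samples below are placeholders, for illustration only.
#include <array>
#include <cstdio>

constexpr int kSamples = 7;  // number of digitized samples per pulse (7 for TileCal)

double of_amplitude(const std::array<double, kSamples>& s,   // digitized samples (ADC counts)
                    const std::array<double, kSamples>& a,   // per-channel OF weights
                    double pedestal) {                       // known pedestal level
    double amp = 0.0;
    for (int i = 0; i < kSamples; ++i)
        amp += a[i] * (s[i] - pedestal);   // weighted sum of pedestal-subtracted samples
    return amp;                            // amplitude in ADC counts (calibration applied later)
}

int main() {
    // Hypothetical weights and one hypothetical digitized pulse.
    std::array<double, kSamples> a = {-0.05, 0.10, 0.45, 0.60, 0.35, 0.08, -0.03};
    std::array<double, kSamples> s = {50, 52, 180, 420, 310, 120, 60};
    std::printf("reconstructed amplitude: %.1f ADC counts\n", of_amplitude(s, a, 50.0));
    return 0;
}
```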

    Belle II Technical Design Report

    The Belle detector at the KEKB electron-positron collider has collected almost 1 billion Y(4S) events in its decade of operation. Super-KEKB, an upgrade of KEKB, is under construction to increase the luminosity by two orders of magnitude during a three-year shutdown, with an ultimate goal of 8×10³⁵ cm⁻² s⁻¹. To exploit the increased luminosity, an upgrade of the Belle detector has been proposed, and a new international collaboration, Belle II, is being formed. The Technical Design Report presents the physics motivation, the basic methods of the accelerator upgrade, and the key improvements of the detector.

    The ATLAS experiment at the CERN Large Hadron Collider: a description of the detector configuration for Run 3

    The ATLAS detector is installed in its experimental cavern at Point 1 of the CERN Large Hadron Collider. During Run 2 of the LHC, a luminosity of ℒ = 2 × 10³⁴ cm⁻² s⁻¹ was routinely achieved at the start of fills, twice the design luminosity. For Run 3, accelerator improvements, notably luminosity levelling, allow sustained running at an instantaneous luminosity of ℒ = 2 × 10³⁴ cm⁻² s⁻¹, with an average of up to 60 interactions per bunch crossing. The ATLAS detector has been upgraded to recover Run 1 single-lepton trigger thresholds while operating comfortably under the sustained pileup conditions of Run 3. A fourth pixel layer 3.3 cm from the beam axis was added before Run 2 to improve vertex reconstruction and b-tagging performance. New Liquid Argon Calorimeter digital trigger electronics, with corresponding upgrades to the Trigger and Data Acquisition system, take advantage of a factor of 10 finer granularity to improve triggering on electrons, photons, taus, and hadronic signatures through increased pileup rejection. The inner muon endcap wheels were replaced by New Small Wheels with Micromegas and small-strip Thin Gap Chamber detectors, providing both precision tracking and Level-1 muon trigger functionality. Trigger coverage of the inner barrel muon layer near one endcap region was augmented with modules integrating new thin-gap resistive plate chambers and smaller-diameter drift-tube chambers. Tile Calorimeter scintillation counters were added to improve electron energy resolution and background rejection. Upgrades to the Minimum Bias Trigger Scintillators and Forward Detectors improve luminosity monitoring and enable measurements of the total proton-proton cross section, diffractive physics, and heavy ions. These upgrades are all compatible with operation in the much harsher environment anticipated after the High-Luminosity upgrade of the LHC and represent the first steps in preparing ATLAS for it. This paper describes the Run 3 configuration of the ATLAS detector.
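
    The quoted pileup level follows directly from the luminosity: the mean number of interactions per crossing is μ = L·σ_inel / (n_b·f_rev). The back-of-envelope check below assumes an inelastic pp cross section of about 80 mb and roughly 2500 colliding bunch pairs; neither figure appears in the text above.

```cpp
// Back-of-envelope pileup estimate (assumptions: sigma_inel ~ 80 mb and
// ~2500 colliding bunch pairs, neither stated in the abstract) showing how
// the quoted luminosity translates into ~60 interactions per crossing.
#include <cstdio>

int main() {
    const double lumi       = 2.0e34;    // instantaneous luminosity [cm^-2 s^-1]
    const double sigma_inel = 80.0e-27;  // inelastic pp cross section, ~80 mb, in cm^2
    const double f_rev      = 11245.0;   // LHC revolution frequency [Hz]
    const double n_bunches  = 2500.0;    // colliding bunch pairs (approximate)

    double interactions_per_s = lumi * sigma_inel;   // ~1.6e9 pp interactions per second
    double crossings_per_s    = n_bunches * f_rev;   // ~2.8e7 bunch crossings per second
    std::printf("mean pileup mu = %.0f\n", interactions_per_s / crossings_per_s);  // ~57
    return 0;
}
```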

    The search for new physics in the diphoton decay channel and the upgrade of the Tile-Calorimeter electronics of the ATLAS detector

    A thesis submitted to the Faculty of Science, University of the Witwatersrand, Johannesburg, in fulfilment of the requirements for the degree of Doctor of Philosophy, School of Physics, Johannesburg, February 2017. The discovery of the Higgs boson at the Large Hadron Collider in Switzerland marks the beginning of a new era: physics beyond the Standard Model (SM). A model is proposed to describe numerous Run I features observed with both the ATLAS and CMS experiments. The model introduces a heavy scalar with a mass estimated to be around 270 GeV and an intermediate scalar which can decay into both dark matter and SM particles. Three different final-state searches, linked by the new hypothesis, are presented: the hh → γγbb̄, γγ + E_T^miss, and high-mass diphoton channels. No significant excesses were observed in any channel using the available datasets, and limits were set on the relevant cross sections times branching ratios. The lack of statistics in the γγbb̄ analysis prevents any conclusive statement regarding the excess observed with Run I data. Observing no excess in the γγ + E_T^miss channel with the current amount of data is also consistent with the intermediate scalar decaying to SM particles. This could explain the excess of Higgs bosons produced in association with top quarks in the multilepton final states observed in ATLAS and CMS in Run I and Run II. The work presented provides a deeper understanding of the underlying phenomenology of the hypothesis and a foundation for future work. The ATLAS detector underwent a stringent consolidation and validation effort before data taking could commence in 2015. A high-voltage board was designed and implemented in a portable test bench used in the certification and validation process. In addition to these efforts, the electronics of the ATLAS detector are being improved for the Phase-II upgrade program in 2024. A software tool has been designed which integrates the envisioned Phase-II back-end infrastructure into the existing ATLAS detector control system. This software is now an ATLAS-wide common tool used by multiple sub-detectors in the community.

    Evaluation of an Associative Memory and FPGA-based System for the Track Trigger of the CMS-Detector

    In 2025, the luminosity of the particle beams at the Large Hadron Collider (LHC), the world's largest and highest-energy particle accelerator, will be increased further. As a result, even more particles will collide simultaneously at the centre of the Compact Muon Solenoid (CMS) experiment. To deliver usable data under these new conditions, a track trigger is being developed for CMS for the first time. It processes the data of the outer tracking detector and delivers the parameters of the particle tracks to the first trigger stage of CMS. Because the technical requirements on such a track-trigger system are enormous, a track trigger has never before been used at the first trigger level of a particle physics experiment. The data rate at the input of the CMS track trigger will be almost 100 Tbit/s, and the processing time must not exceed 4 μs. Meeting these exceptional requirements calls for a unique, heterogeneous embedded system. This dissertation presents a newly designed system-level simulation environment for the CMS track trigger. The simulation environment enables the evaluation of the CMS track-trigger electronics as a whole: from the modules carrying the silicon detectors to the components that execute the track-finding algorithms. The simulation provides the system designer with three functions: first, system properties such as latency, bandwidth, and required buffer sizes can be estimated; second, different system architectures can be compared; third, the simulation environment serves as a test bench for algorithms and code implemented in Field-Programmable Gate Arrays (FPGAs). To obtain realistic results, data from a simulation of the CMS experiment are used as input to the simulation environment. One of the concepts investigated for the CMS track trigger consists of up to 48 large crates housing hundreds of boards. The data are processed by FPGAs and by associative memories developed specifically for finding particle tracks. Prototypes of a board with FPGAs and associative-memory chips were produced and tested at the Karlsruhe Institute of Technology. In addition, an essential part of the CMS track trigger was simulated with the new simulation environment, demonstrating that such a large system can be simulated within it. Within the simulation, many elements of the CMS track trigger are instantiated many times, and these elements are often arranged in regular structures such as two- or three-dimensional grids. A SystemC library was developed to simplify the modelling and configuration of such structures. Furthermore, an independent cost estimate of the CMS track trigger was carried out. It shows that the budgeted 11.9 million euros are sufficient to build the associative-memory-based CMS track trigger; if the figures are extrapolated to 2022 in line with technological progress, significantly lower costs can even be expected.
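
    The SystemC library mentioned above targets a recurring pattern: instantiating many identical elements arranged in regular grids and wiring them uniformly. The sketch below shows that bare pattern in plain SystemC; the module, the grid size, and the toy payload are hypothetical and are not taken from the dissertation.

```cpp
// Minimal sketch (not the dissertation code): instantiating a regular 2D grid
// of processing elements in SystemC, the kind of structure the library
// described above is meant to simplify. Names and sizes are hypothetical.
#include <systemc.h>
#include <iostream>

SC_MODULE(ProcessingElement) {
    sc_in<bool> clk;           // clock input shared by the whole grid
    unsigned long cycles = 0;  // toy payload: count clock edges

    void on_clk() { ++cycles; }

    SC_CTOR(ProcessingElement) {
        SC_METHOD(on_clk);
        sensitive << clk.pos();
        dont_initialize();
    }
};

int sc_main(int, char*[]) {
    const unsigned rows = 4, cols = 6;   // grid dimensions (example values)
    sc_clock clk("clk", 25, SC_NS);      // 40 MHz LHC bunch-crossing clock

    // A flat sc_vector indexed as r*cols + c models a 2D raster of elements.
    sc_vector<ProcessingElement> grid("pe", rows * cols);
    for (unsigned r = 0; r < rows; ++r)
        for (unsigned c = 0; c < cols; ++c)
            grid[r * cols + c].clk(clk);  // bind every element to the shared clock

    sc_start(1, SC_US);                   // run 1 microsecond of simulated time
    std::cout << "clock edges seen by PE(0,0): " << grid[0].cycles << std::endl;
    return 0;
}
```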