Evolution of online algorithms in ATLAS and CMS in Run 2
The Large Hadron Collider has entered a new era in Run 2, with centre-of-mass
energy of 13 TeV and instantaneous luminosity reaching
10^34 cm^-2 s^-1 for pp
collisions. In order to cope with those harsher conditions, the ATLAS and CMS
collaborations have improved their online selection infrastructure to keep a
high efficiency for important physics processes - like W, Z and Higgs bosons in
their leptonic and diphoton modes - whilst keeping the size of data stream
compatible with the bandwidth and disk resources available. In this note, we
describe some of the trigger improvements implemented for Run 2, including
algorithms for selection of electrons, photons, muons and hadronic final
states.

Comment: 6 pages. Presented at The Fifth Annual Conference on Large Hadron
Collider Physics (LHCP 2017), Shanghai, China, May 15-20, 2017.
An X-ray Survey in SA 57 with XMM-Newton
The maximum number density of Active Galactic Nuclei (AGNs), as deduced from
X-ray studies, occurs at z<~1, with lower luminosity objects peaking at smaller
redshifts. Optical studies lead to a different evolutionary behaviour, with a
number density peaking at z~2 independently of the intrinsic luminosity, but
this result is limited to active nuclei brighter than the host galaxy. A
selection based on optical variability can detect low luminosity AGNs (LLAGNs),
where the host galaxy light prevents the identification by non-stellar colours.
We want to collect X-ray data in a field where an optically-selected sample of
"variable galaxies" exists, i.e. variable objects with diffuse appearance,
to investigate the X-ray and optical properties of the population of AGNs,
particularly of low luminosity ones, where the host galaxy is visible. We
observed a field of 0.2 deg^2 in the Selected Area 57, for 67 ks with
XMM-Newton. We detected X-ray sources, and we correlated the list with a
photographic survey of SA 57, complete to B_J~23 and with available
spectroscopic data. We obtained a catalogue of 140 X-ray sources down to
limiting fluxes of 5x10^-16 and 2x10^-15 erg/cm^2/s in the 0.5-2 keV and
2-10 keV bands respectively, 98 of which are identified in the optical bands.
The X-ray
detection of part of the variability-selected candidates confirms their AGN
nature. Diffuse variable objects populate the low luminosity side of the
sample. Only 25/44 optically-selected QSOs are detected in X-rays. 15% of all
QSOs in the field have X/O < 0.1.

Comment: 13 pages, 6 figures, 4 tables, A&A in press.
The CMS Trigger Upgrade for the HL-LHC
The CMS experiment has been designed with a two-level trigger system: the
Level-1 Trigger, implemented on custom-designed electronics, and the High Level
Trigger, a streamlined version of the CMS offline reconstruction software
running on a computer farm. During its second phase the LHC will reach a
luminosity of 7.5x10^34 cm^-2 s^-1 with a
pileup of 200 collisions, producing an integrated luminosity greater than 3000
fb^-1 over the full experimental run. To fully exploit the higher
luminosity, the CMS experiment will introduce a more advanced Level-1 Trigger
and increase the full readout rate from 100 kHz to 750 kHz. CMS is designing an
efficient data-processing hardware trigger (Level-1) that will include tracking
information and high-granularity calorimeter information. The current
conceptual system design is expected to take full advantage of advances in FPGA
and link technologies over the coming years, providing a high-performance,
low-latency system for large throughput and sophisticated data correlation
across diverse sources. The higher luminosity, event complexity and input rate
present an unprecedented challenge to the High Level Trigger, which aims to
achieve a similar efficiency and rejection factor as today despite the higher
pileup and purer preselection. In this presentation we will discuss the
ongoing studies and prospects for the online reconstruction and selection
algorithms for the high-luminosity era.

Comment: 6 pages, 4 figures. Presented at CHEP 2019 - 24th International
Conference on Computing in High Energy and Nuclear Physics, Adelaide,
Australia, November 04-08, 2019. Replaced with published version.
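As a rough sense of scale for the 750 kHz Level-1 output rate quoted above, one can sketch the raw bandwidth the High Level Trigger farm would need to absorb. This is a back-of-envelope illustration only: the per-event size below is a placeholder assumption, not a CMS design figure.

```python
# Back-of-envelope HLT input bandwidth at the Phase-2 Level-1 rate.
# The 750 kHz rate comes from the abstract; the event size is an
# illustrative assumption, not a CMS number.
l1_rate_hz = 750e3        # Level-1 accept rate (abstract)
event_size_mb = 2.0       # assumed raw event size in megabytes
bandwidth_gb_s = l1_rate_hz * event_size_mb / 1024  # MB/s -> GB/s
print(f"assumed HLT input bandwidth ~ {bandwidth_gb_s:.0f} GB/s")
```

Even under this modest event-size assumption the farm must ingest on the order of a terabyte per second, which motivates the abstract's emphasis on link technology and throughput in the system design.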
Measurements and optimization of the light yield of a TeO2 crystal
Bolometers have proven to be good instruments to search for rare processes
because of their excellent energy resolution and their extremely low intrinsic
background. In this kind of detector, the capability of discriminating alpha
particles from electrons represents an important aspect for the background
reduction. One possibility for obtaining such a discrimination is provided by
the detection of the Cherenkov light which, at the low energies of the natural
radioactivity, is only emitted by electrons. This paper describes the method
developed to evaluate the amount of light produced by a crystal of TeO2 when
hit by a 511 keV photon. The experimental measurements and the results of a
detailed simulation of the crystal and the readout system are shown and
compared. A light yield of about 52 Cherenkov photons per deposited MeV was
measured. The effect of wrapping the crystal with a PTFE layer, with the aim of
maximizing the light collection, is also presented.
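As a quick consistency check, the expected photon count for a fully absorbed 511 keV annihilation photon follows directly from the measured yield of 52 Cherenkov photons per deposited MeV. This is a minimal sketch using only the numbers quoted in the abstract:

```python
# Expected Cherenkov photon count for a 511 keV photon fully
# absorbed in the crystal, using the light yield from the abstract.
light_yield_per_mev = 52.0   # measured Cherenkov photons / MeV
deposited_mev = 0.511        # annihilation-photon energy in MeV
n_photons = light_yield_per_mev * deposited_mev
print(f"expected photons ~ {n_photons:.1f}")  # ~ 26.6
```

Photon numbers this small explain why maximizing light collection, e.g. with the PTFE wrapping, matters so much for the discrimination.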
Analysis of flash flood scenarios in an urbanized catchment using a two-dimensional hydraulic model
Abstract. In Italy, growing urbanization is leading to a higher risk of flooding of small water courses, especially in steep catchments of limited area, where severe flash flood events can occur. The assessment of flash flood hazard requires new modelling tools that can reproduce both the rainfall–runoff processes in the catchment, and the flow processes in the drainage network. In this paper we propose the use of a simple two-dimensional hydraulic model for analysing a flood scenario in a small valley within the urban area of the city of Bologna, Italy. Historically this area has been prone to severe flood events, the most recent of which occurred in 1955 and 1932. Since then there has been a significant increase in urbanization of the lower portion of the catchment, while the natural stream bed has been partially replaced by a culvert. The two-dimensional hydraulic model was therefore applied at catchment scale, in order to simulate the possible effects of historical scenarios in the present catchment configuration. Rainfall and runoff data measured during recent rainfall events were used to calibrate model parameters. Model results show that the current culvert section would be insufficient to drain the runoff produced by intense rainfall events, with potential inundation of surrounding urban areas
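The paper's conclusion about the undersized culvert can be illustrated with a much cruder tool than its two-dimensional model: a steady-flow capacity check via Manning's equation. Every number below (culvert geometry, slope, roughness) is a placeholder assumption for illustration, not a value from the Bologna study:

```python
# Full-flow capacity of a rectangular culvert via Manning's equation:
#   Q = (1/n) * A * R^(2/3) * sqrt(S)
# All inputs are illustrative assumptions, not data from the study.
width_m, depth_m = 3.0, 2.0   # assumed culvert cross-section
slope = 0.01                  # assumed longitudinal bed slope
n_manning = 0.015             # typical roughness for smooth concrete
area = width_m * depth_m                      # flow area A
wetted_perimeter = width_m + 2.0 * depth_m    # open-channel approximation
hydraulic_radius = area / wetted_perimeter    # R = A / P
q_capacity = (1.0 / n_manning) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5
print(f"full-flow capacity ~ {q_capacity:.1f} m^3/s")
```

Comparing such a capacity against the peak discharge from a rainfall-runoff simulation is the simplest version of the check the paper performs with its two-dimensional model.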
Ancient Biomolecules Unravel our History: A Technical Update with Examples from the Middle East
Context: The study of ancient biomolecules represents a useful tool to address questions related to human history.
Objective: This manuscript provides an overview of the major categories of ancient biomolecules, highlighting their
potentialities when applied to research.
Methods: This study gathered knowledge from recently published papers on paleogenomics, paleoproteomics, ancient lipids
and stable isotope analyses with the aim of providing a technical and historical background on ancient biomolecules, and examples
of their application in the Arabian Peninsula and Middle East in general.
Results: The progress seen in the past decade with regard to the study of ancient biomolecules has led to a dramatic expansion
of the studies that apply those analyses. Increasing attention has also been paid to the development and optimization of protocols
aimed at reducing and/or preventing the risk of contamination. While extensively applied to Western areas, the study of ancient
biomolecules in the Middle East and the Arabian Peninsula has been limited.
Conclusions: Research on ancient biomolecules represents the most valuable source of information to understand our
evolutionary past at an inconceivable level of detail, especially when applied to areas so far underrepresented in this field, such as
the Middle East and the Arabian Peninsula in particular.
TeO2 bolometers with Cherenkov signal tagging: towards next-generation neutrinoless double beta decay experiments
CUORE, an array of 988 TeO2 bolometers, is about to be one of the most
sensitive experiments searching for neutrinoless double-beta decay. Its
sensitivity could be further improved by removing the background from alpha
radioactivity. A few years ago it was pointed out that the signal from betas
can be tagged by detecting the emitted Cherenkov light, which is not produced
by alphas. In this paper we confirm this possibility. For the first time we
measured the Cherenkov light emitted by a CUORE crystal, and found it to be
100 eV at the Q-value of the decay. To completely reject the alpha
background, we compute that one needs light detectors with baseline noise below
20 eV RMS, a value which is 3-4 times smaller than the average noise of the
bolometric light detectors we are using. We point out that an improved light
detector technology must be developed to obtain TeO2 bolometric experiments
able to probe the inverted hierarchy of neutrino masses.

Comment: 5 pages, 4 figures. Added referee corrections.
Canteens and kitchen staff: an assessment of occupational risks
The aim of this study is to evaluate the occupational risks among food service workers and cooks. The risk assessment must cover the following factors: musculoskeletal disorders, chemical risk (cleaning of kitchen work surfaces, dishes, utensils, etc.), biological risk (contact with foods or biological agents), carcinogenic risk (from inhalation of cooking fumes), and psycho-social stress. The study also evaluates the preventive measures and protective equipment that prevent health hazards for these workers (e.g. extraction hoods, adequate ventilation, choice of less harmful cooking methods, etc.). In particular, strict compliance with behavioural norms and hygienic procedures is essential for cooks and food service workers to reduce the risk of occupational infections.
An early evaluation of the 2050 Calculator international outreach programme
This paper presents the findings of an early evaluation of the UK Department of Energy and Climate Change’s 2050 Calculator International Outreach Programme. The programme supported eleven countries to develop their own versions of the 2050 Calculator. Drawing on interviews with stakeholders who were involved directly and indirectly in the development of the 2050 Calculators, this paper evaluates the process of developing these tools in different national contexts and discusses the lessons learnt so far. The findings discussed include the original motivations for involvement and how these evolved through the project, and the process of stakeholder engagement. The latter was expected to be a key benefit of the Calculator, and one which would open up debate about long term energy futures. While the teams developing the Calculators faced challenges, including data availability, political buy-in, and defining scenario trajectories, a flexible approach enabled countries to develop Calculators that were tailored to their national objectives and political environments. Overall, the 2050 Calculators have led to a wide range of benefits and there is ongoing commitment to develop new iterations and applications to use these Calculators to support planning of, and debate on, future energy and emissions trajectories