
    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents covering a variety of research fields, such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research. Peer reviewed.
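As a concrete illustration of one of the baseline methods named above, the sketch below ranks candidate documents against a seed document by TF-IDF-weighted cosine similarity. This is a minimal, stdlib-only toy and not the consortium's benchmark implementation; the whitespace tokenization, the `tfidf_vectors`/`cosine` names, and the plain log-scaled IDF variant are all assumptions made for illustration.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build a sparse TF-IDF vector (dict term -> weight) per tokenized document."""
    n = len(docs)
    df = Counter()                       # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    idf = {t: math.log(n / df[t]) for t in df}
    vecs = []
    for doc in docs:
        tf = Counter(doc)                # raw term frequency
        vecs.append({t: tf[t] * idf[t] for t in tf})
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

In a RELISH-style evaluation one would compute `cosine` between the seed article's title/abstract vector and every candidate's, then rank candidates by descending similarity.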

    Volume I. Introduction to DUNE

    The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. This TDR is intended to justify the technical choices for the far detector that flow down from the high-level physics goals through requirements at all levels of the Project. Volume I contains an executive summary that introduces the DUNE science program, the far detector and the strategy for its modular designs, and the organization and management of the Project. The remainder of Volume I provides more detail on the science program that drives the choice of detector technologies and on the technologies themselves. It also introduces the designs for the DUNE near detector and the DUNE computing model, for which DUNE is planning design reports. Volume II of this TDR describes DUNE's physics program in detail. Volume III describes the technical coordination required for the far detector design, construction, installation, and integration, and its organizational structure. Volume IV describes the single-phase far detector technology. A planned Volume V will describe the dual-phase technology.

    Measurement of the gamma ray background in the Davis Cavern at the Sanford Underground Research Facility

    Deep underground environments are ideal for low background searches due to the attenuation of cosmic rays by passage through the earth. However, they are affected by backgrounds from γ-rays emitted by 40K and the 238U and 232Th decay chains in the surrounding rock. The LUX-ZEPLIN (LZ) experiment will search for dark matter particle interactions with a liquid xenon TPC located within the Davis campus at the Sanford Underground Research Facility, Lead, South Dakota, at the 4,850-foot level. In order to characterise the cavern background, in-situ γ-ray measurements were taken with a sodium iodide detector in various locations and with lead shielding. The integral count rates (0–3300 keV) varied from 596 Hz to 1355 Hz for unshielded measurements, corresponding to a total flux in the cavern of 1.9±0.4 γ cm−2 s−1. The resulting activity in the walls of the cavern can be characterised as 220±60 Bq/kg of 40K, 29±15 Bq/kg of 238U, and 13±3 Bq/kg of 232Th.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
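The property that makes this workload GPU-friendly is that each pixel's induced-current waveform can be computed independently of every other pixel's. The sketch below mimics that data-parallel structure with NumPy broadcasting on the CPU; a CUDA kernel would assign one thread per (pixel, time-sample) pair in the same grid. The Gaussian pulse shape, array names, and units here are invented for illustration and are not the microphysical model used by the actual simulator.

```python
import numpy as np

def induced_currents(t, t0, q, sigma):
    """Toy model: each pixel's induced current is a Gaussian pulse.
    t: (n_samples,) time grid; t0, q, sigma: (n_pixels,) per-pixel
    arrival time, collected charge, and pulse width.
    Broadcasting evaluates all (pixel, sample) pairs at once, the
    same parallel structure a Numba CUDA kernel would exploit."""
    # result has shape (n_pixels, n_samples): one waveform row per pixel
    return q[:, None] * np.exp(
        -0.5 * ((t[None, :] - t0[:, None]) / sigma[:, None]) ** 2
    )

rng = np.random.default_rng(0)
n_pixels = 1000
t = np.linspace(0.0, 10.0, 256)        # time samples (toy units)
t0 = rng.uniform(2.0, 8.0, n_pixels)   # charge arrival time per pixel
q = rng.uniform(0.5, 2.0, n_pixels)    # collected charge per pixel
sigma = np.full(n_pixels, 0.3)         # diffusion-driven pulse width
waveforms = induced_currents(t, t0, q, sigma)
```

Because no element of the output depends on any other, the loop-free CPU version above translates directly into a massively parallel kernel, which is the source of the four-orders-of-magnitude speed-up reported in the abstract.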

    Deep Underground Neutrino Experiment (DUNE), far detector technical design report, volume III: DUNE far detector technical coordination

    The preponderance of matter over antimatter in the early universe, the dynamics of the supernovae that produced the heavy elements necessary for life, and whether protons eventually decay—these mysteries at the forefront of particle physics and astrophysics are key to understanding the early evolution of our universe, its current state, and its eventual fate. The Deep Underground Neutrino Experiment (DUNE) is an international world-class experiment dedicated to addressing these questions as it searches for leptonic charge-parity symmetry violation, stands ready to capture supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the standard model. The DUNE far detector technical design report (TDR) describes the DUNE physics program and the technical designs of the single- and dual-phase DUNE liquid argon TPC far detector modules. Volume III of this TDR describes how the activities required to design, construct, fabricate, install, and commission the DUNE far detector modules are organized and managed. This volume details the organizational structures that will carry out and/or oversee the planned far detector activities safely, successfully, on time, and on budget. It presents overviews of the facilities, supporting infrastructure, and detectors for context, and it outlines the project-related functions and methodologies used by the DUNE technical coordination organization, focusing on the areas of integration engineering, technical reviews, quality assurance and control, and safety oversight. Because of its more advanced stage of development, functional examples presented in this volume focus primarily on the single-phase (SP) detector module.

    Search for pair production of boosted Higgs bosons via vector-boson fusion in the bb̄bb̄ final state using pp collisions at √s = 13 TeV with the ATLAS detector

    A search for Higgs boson pair production via vector-boson fusion is performed in the Lorentz-boosted regime, where a Higgs boson candidate is reconstructed as a single large-radius jet, using 140 fb−1 of proton–proton collision data at √s = 13 TeV recorded by the ATLAS detector at the Large Hadron Collider. Only Higgs boson decays into bottom quark pairs are considered. The search is particularly sensitive to κ2V, the quartic coupling between two vector bosons and two Higgs bosons relative to its Standard Model prediction. This study constrains κ2V to 0.55 < κ2V < 1.49 at the 95% confidence level. The value κ2V = 0 is excluded with a significance of 3.8 standard deviations with other Higgs boson couplings fixed to their Standard Model values. A search for new heavy spin-0 resonances that would mediate Higgs boson pair production via vector-boson fusion is carried out in the mass range of 1–5 TeV for the first time under several model and decay-width assumptions. No significant deviation from the Standard Model hypothesis is observed and exclusion limits at the 95% confidence level are derived.
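For readers converting between the quoted significances and confidence levels, the standard relation is the one-sided Gaussian tail: a significance of Z standard deviations corresponds to a p-value of erfc(Z/√2)/2. A minimal sketch, assuming the usual one-sided convention (the function names are ours, and the inverse is a simple bisection rather than a library quantile function):

```python
import math

def one_sided_p(z):
    """One-sided Gaussian tail probability for a significance of z sigma."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def significance(p):
    """Inverse of one_sided_p via bisection (one_sided_p is decreasing in z)."""
    lo, hi = 0.0, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if one_sided_p(mid) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For Z = 3.8 this gives a p-value of about 7×10⁻⁵, and a 95% confidence-level exclusion corresponds to a one-sided p-value of 0.05.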

    Search for heavy resonances decaying into a Z or W boson and a Higgs boson in final states with leptons and b-jets in 139 fb−1 of pp collisions at √s = 13 TeV with the ATLAS detector

    This article presents a search for new resonances decaying into a Z or W boson and a 125 GeV Higgs boson h, and it targets the νν̄bb̄, ℓ+ℓ−bb̄, or ℓ±νbb̄ final states, where ℓ = e or μ, in proton-proton collisions at √s = 13 TeV. The data used correspond to a total integrated luminosity of 139 fb−1 collected by the ATLAS detector during Run 2 of the LHC at CERN. The search is conducted by examining the reconstructed invariant or transverse mass distributions of Zh or Wh candidates for evidence of a localised excess in the mass range from 220 GeV to 5 TeV. No significant excess is observed and 95% confidence-level upper limits between 1.3 pb and 0.3 fb are placed on the production cross section times branching fraction of neutral and charged spin-1 resonances and CP-odd scalar bosons. These limits are converted into constraints on the parameter space of the Heavy Vector Triplet model and the two-Higgs-doublet model.

    The ATLAS trigger system for LHC Run 3 and trigger performance in 2022

    The ATLAS trigger system is a crucial component of the ATLAS experiment at the LHC. It is responsible for selecting events in line with the ATLAS physics programme. This paper presents an overview of the changes to the trigger and data acquisition system during the second long shutdown of the LHC, and shows the performance of the trigger system and its components in the proton-proton collisions during the 2022 commissioning period as well as its expected performance in proton-proton and heavy-ion collisions for the remainder of the third LHC data-taking period (2022–2025).

    Search for boosted diphoton resonances in the 10 to 70 GeV mass range using 138 fb−1 of 13 TeV pp collisions with the ATLAS detector

    A search for diphoton resonances in the mass range between 10 and 70 GeV with the ATLAS experiment at the Large Hadron Collider (LHC) is presented. The analysis is based on pp collision data corresponding to an integrated luminosity of 138 fb−1 at a centre-of-mass energy of 13 TeV recorded from 2015 to 2018. Previous searches for diphoton resonances at the LHC have explored masses down to 65 GeV, finding no evidence of new particles. This search exploits the particular kinematics of events with pairs of closely spaced photons reconstructed in the detector, allowing examination of invariant masses down to 10 GeV. The presented strategy covers a region previously unexplored at hadron colliders because of the experimental challenges of recording low-energy photons and estimating the backgrounds. No significant excess is observed and the reported limits provide the strongest bound on promptly decaying axion-like particles coupling to gluons and photons for masses between 10 and 70 GeV.

    Evidence for the charge asymmetry in pp → tt̄ production at √s = 13 TeV with the ATLAS detector

    Inclusive and differential measurements of the top–antitop (tt̄) charge asymmetry A_C^tt̄ and the leptonic asymmetry A_C^ℓℓ̄ are presented in proton–proton collisions at √s = 13 TeV recorded by the ATLAS experiment at the CERN Large Hadron Collider. The measurement uses the complete Run 2 dataset, corresponding to an integrated luminosity of 139 fb−1, combines data in the single-lepton and dilepton channels, and employs reconstruction techniques adapted to both the resolved and boosted topologies. A Bayesian unfolding procedure is performed to correct for detector resolution and acceptance effects. The combined inclusive tt̄ charge asymmetry is measured to be A_C^tt̄ = 0.0068 ± 0.0015, which differs from zero by 4.7 standard deviations. Differential measurements are performed as a function of the invariant mass, transverse momentum and longitudinal boost of the tt̄ system. Both the inclusive and differential measurements are found to be compatible with the Standard Model predictions, at next-to-next-to-leading order in quantum chromodynamics perturbation theory with next-to-leading-order electroweak corrections. The measurements are interpreted in the framework of the Standard Model effective field theory, placing competitive bounds on several Wilson coefficients.
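In the standard convention, the inclusive asymmetry quoted above is a counting asymmetry in Δ|y| = |y_t| − |y_t̄|, the difference of the absolute rapidities of the top quark and antiquark: A_C = (N(Δ|y| > 0) − N(Δ|y| < 0)) / (N(Δ|y| > 0) + N(Δ|y| < 0)). The toy function below computes this from per-event Δ|y| values; it deliberately ignores the unfolding, channel combination, and systematic treatment of the actual measurement, and the names are ours.

```python
def charge_asymmetry(delta_abs_y):
    """Counting asymmetry A_C from per-event values of
    Delta|y| = |y_t| - |y_tbar| (toy, detector-level only)."""
    n_pos = sum(1 for d in delta_abs_y if d > 0)
    n_neg = sum(1 for d in delta_abs_y if d < 0)
    total = n_pos + n_neg
    return (n_pos - n_neg) / total if total else 0.0
```

A positive A_C means top quarks are preferentially produced at larger absolute rapidities than antitops; the per-mille size of the measured value is why the full 139 fb−1 Run 2 dataset is needed to separate it from zero.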