Topological Reconstruction of Particle Physics Processes using Graph Neural Networks
We present a new approach, the Topograph, which reconstructs underlying
physics processes, including the intermediary particles, by leveraging
underlying priors from the nature of particle physics decays and the
flexibility of message passing graph neural networks. The Topograph not only
solves the combinatoric assignment of observed final state objects, associating
them to their original mother particles, but directly predicts the properties
of intermediate particles in hard scatter processes and their subsequent
decays. In comparison to standard combinatoric approaches or modern approaches
using graph neural networks, which scale exponentially or quadratically, the
complexity of Topographs scales linearly with the number of reconstructed
objects.
We apply Topographs to top quark pair production in the all-hadronic decay
channel, where we outperform the standard approach and match the performance of
the state-of-the-art machine learning technique.
Comment: 25 pages, 24 figures, 8 tables
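The linear scaling claimed above comes from message passing over a sparse graph of reconstructed objects. As a rough illustration only (not the actual Topograph architecture; the layer sizes, edge list, and single-linear-layer "networks" are all invented), one message-passing step over an O(N) edge list costs O(N) in the number of objects:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inputs: N reconstructed final-state objects, each with a feature vector
# (e.g. pT, eta, phi, mass). Shapes and weights are illustrative only.
N, F, H = 6, 4, 8
x = rng.normal(size=(N, F))

# Sparse edge list (source -> target). With O(N) edges, the update below costs
# O(N) per step, unlike a fully connected graph, which would cost O(N^2).
edges = np.array([[0, 1], [1, 2], [2, 3], [3, 4], [4, 5], [5, 0]])

W_msg = rng.normal(size=(F, H))      # stand-in for a message network
W_upd = rng.normal(size=(F + H, H))  # stand-in for a node-update network

def message_passing_step(x, edges):
    """One step: form a message per edge, sum at targets, update each node."""
    msgs = np.maximum(x[edges[:, 0]] @ W_msg, 0.0)  # ReLU message per edge
    agg = np.zeros((x.shape[0], H))
    np.add.at(agg, edges[:, 1], msgs)               # sum messages at targets
    return np.maximum(np.concatenate([x, agg], axis=1) @ W_upd, 0.0)

h = message_passing_step(x, edges)
print(h.shape)  # one updated hidden state per reconstructed object: (6, 8)
```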
Flow Away your Differences: Conditional Normalizing Flows as an Improvement to Reweighting
We present an alternative to reweighting techniques for modifying
distributions to account for a desired change in an underlying conditional
distribution, as is often needed to correct for mis-modelling in a simulated
sample. We employ conditional normalizing flows to learn the full conditional
probability distribution from which we sample new events for conditional values
drawn from the target distribution to produce the desired, altered
distribution. In contrast to common reweighting techniques, this procedure is
independent of binning choice and does not rely on an estimate of the density
ratio between two distributions.
In several toy examples we show that normalizing flows outperform reweighting
approaches in matching the target distribution. We demonstrate that the
corrected distribution closes well with the ground truth, and that a statistical
uncertainty on the training dataset can be ascertained with bootstrapping. In
our examples, this leads to a statistical precision up to three times greater
than using reweighting techniques with identical sample sizes for the source
and target distributions. We also explore an application in the context of
high-energy particle physics.
Comment: 21 pages, 9 figures
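The core idea above, sampling from a learned conditional density rather than weighting events by a density ratio, can be illustrated with a Gaussian toy in which the conditional distribution is known analytically and stands in for a trained conditional flow (all distributions and sample sizes here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (illustrative, not the paper's setup): x | c ~ N(c, 1).
# Source sample has c ~ N(0, 1); we want the distribution of x under the
# target condition c ~ N(1, 1). Since the true conditional is known exactly
# here, sampling from it plays the role of sampling a trained flow.
n = 100_000
c_source = rng.normal(0.0, 1.0, n)
x_source = rng.normal(c_source, 1.0)

# Flow-style correction: draw c from the target and sample x | c directly.
c_target = rng.normal(1.0, 1.0, n)
x_flow = rng.normal(c_target, 1.0)

# Reweighting alternative: weight source events by the density ratio
# p_target(c) / p_source(c), which in practice must itself be estimated.
def gauss(v, mu, sigma):
    return np.exp(-0.5 * ((v - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

w = gauss(c_source, 1.0, 1.0) / gauss(c_source, 0.0, 1.0)

# Both approaches should reproduce the target mean of x, E[x] = E[c] = 1,
# but the weights reduce the effective sample size of the source sample.
print(x_flow.mean(), np.average(x_source, weights=w))
```

The effective-sample-size loss from the weights is the toy analogue of the paper's observation that resampling can reach better statistical precision than reweighting at equal sample size.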
Study of the decay
The decay is studied
in proton-proton collisions at a center-of-mass energy of TeV
using data corresponding to an integrated luminosity of 5
collected by the LHCb experiment. In the system, the
state observed at the BaBar and Belle experiments is
resolved into two narrower states, and ,
whose masses and widths are measured to be where the first uncertainties are statistical and the second
systematic. The results are consistent with a previous LHCb measurement using a
prompt sample. Evidence of a new
state is found with a local significance of , whose mass and width
are measured to be and , respectively. In addition, evidence of a new decay mode
is found with a significance of
. The relative branching fraction of with respect to the
decay is measured to be , where the first
uncertainty is statistical, the second systematic and the third originates from
the branching fractions of charm hadron decays.
Comment: All figures and tables, along with any supplementary material and
additional information, are available at
https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-028.html (LHCb
public pages)
Multidifferential study of identified charged hadron distributions in -tagged jets in proton-proton collisions at 13 TeV
Jet fragmentation functions are measured for the first time in proton-proton
collisions for charged pions, kaons, and protons within jets recoiling against
a boson. The charged-hadron distributions are studied longitudinally and
transversely to the jet direction for jets with transverse momentum 20 GeV and in the pseudorapidity range . The
data sample was collected with the LHCb experiment at a center-of-mass energy
of 13 TeV, corresponding to an integrated luminosity of 1.64 fb. Triple
differential distributions as a function of the hadron longitudinal momentum
fraction, hadron transverse momentum, and jet transverse momentum are also
measured for the first time. This helps constrain transverse-momentum-dependent
fragmentation functions. Differences in the shapes and magnitudes of the
measured distributions for the different hadron species provide insights into
the hadronization process for jets predominantly initiated by light quarks.
Comment: All figures and tables, along with machine-readable versions and any
supplementary material and additional information, are available at
https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-013.html (LHCb
public pages)
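The longitudinal and transverse variables studied above follow the standard jet-fragmentation definitions, z = p_h · p_jet / |p_jet|² and j_T = |p_h × p_jet| / |p_jet|. A minimal sketch of those textbook formulas (the momenta below are invented GeV three-vectors, not LHCb data):

```python
import numpy as np

def fragmentation_variables(p_hadron, p_jet):
    """Longitudinal momentum fraction z and transverse momentum j_T of a
    hadron relative to the jet axis (standard definitions)."""
    p_hadron = np.asarray(p_hadron, dtype=float)
    p_jet = np.asarray(p_jet, dtype=float)
    jet_mag2 = np.dot(p_jet, p_jet)
    z = np.dot(p_hadron, p_jet) / jet_mag2                       # projection
    jt = np.linalg.norm(np.cross(p_hadron, p_jet)) / np.sqrt(jet_mag2)
    return z, jt

# Example: a hadron carrying a quarter of the jet momentum along the axis,
# plus a small transverse kick (GeV).
z, jt = fragmentation_variables([10.0, 0.5, 0.0], [40.0, 0.0, 0.0])
print(z, jt)  # 0.25 0.5
```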
Measurement of the ratios of branching fractions and
The ratios of branching fractions
and are measured, assuming isospin symmetry, using a
sample of proton-proton collision data corresponding to 3.0 fb of
integrated luminosity recorded by the LHCb experiment during 2011 and 2012. The
tau lepton is identified in the decay mode
. The measured values are
and
, where the first uncertainty is
statistical and the second is systematic. The correlation between these
measurements is . Results are consistent with the current average
of these quantities and are at a combined 1.9 standard deviations from the
predictions based on lepton flavor universality in the Standard Model.
Comment: All figures and tables, along with any supplementary material and
additional information, are available at
https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-039.html (LHCb
public pages)
SOFTWARE Manual for VMM3 Slow Control
For the New Small Wheel upgrade of the ATLAS detector, a new readout chip, called VMM3(a), was developed. To make this new technology available to a larger community, the RD51 collaboration is integrating the VMM3 into its scalable readout system (SRS). For this purpose, a new slow-control and calibration tool is necessary. This new software was developed and improved within a CERN Summer Student project.
Search for Production in the Lepton + Jets Channel and Quark Flavour Tagging with Deep Learning at the ATLAS Experiment
For several decades, the predictions of the Standard Model (SM) of particle physics have been probed and validated. One major success of the Large Hadron Collider (LHC) at CERN was the discovery of the Higgs boson in 2012. With the increasing number of proton-proton collisions recorded with the experiments located at the LHC, precise Higgs measurements are now possible and rare processes are accessible. ATLAS and CMS recently discovered the production process of a Higgs boson in association with a pair of top quarks using LHC RUN II data. The process allows for a direct measurement of the Top-Yukawa coupling, which is the strongest fermion-Higgs coupling in the Standard Model and therefore plays an important role in Higgs physics. The challenging final state with at least 4 b-jets requires an advanced analysis strategy as well as sophisticated b-jet identification methods. b-tagging is not only crucial in this analysis; most physics analyses within ATLAS make use of it. The reoptimisation of the deep-learning-based heavy-flavour tagger in ATLAS is shown in this thesis for two different jet collections. Various improvements were made, resulting in a drastic performance increase of up to a factor of two in certain regions of the phase space. The analysis is performed using 139 fb-1 of RUN II ATLAS data at a centre-of-mass energy of √s = 13 TeV. The signal strength, being the ratio of the measured cross-section over the predicted cross-section in the SM, was measured to be 0.43 +0.20/-0.19 (stat.) +0.30/-0.27 (syst.), with an observed (expected) significance of 1.3 (3.0) standard deviations in the inclusive cross-section measurement. In addition, a simplified template cross-section (STXS) measurement in different Higgs pT bins is performed, which is possible because of the ability to reconstruct the Higgs boson. The measurement is limited by the capability to describe the challenging irreducible background and by systematic uncertainties.
Signal Region Optimisation Studies Based on BDT and Multi-Bin Approaches in the Context of Supersymmetry Searches in Hadronic Final States with the ATLAS Detector
The searches for supersymmetric phenomena are mostly based on simple Cut & Count methods. One example is the search for squarks and gluinos in final states with multiple jets, missing transverse momentum and no leptons. This analysis, based on of collision data at √s = 13 TeV recorded with the ATLAS detector, uses Cut & Count based methods in the signal regions. In order to improve the analysis sensitivity, the use of more sophisticated techniques, such as boosted decision trees (BDTs) and a Multi-Bin approach, is investigated in this thesis. The focus of the study lies on squark and gluino searches. These techniques are evaluated using Monte Carlo simulation. The goal is to find a new approach that remains simple while allowing for a significant improvement. A gain of up to approximately 200 GeV in the neutralino mass and an enhancement of about 200 GeV in the squark and gluino mass is achieved with these new techniques.
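The motivation for a Multi-Bin approach can be sketched with the standard Asimov significance per bin, Z = sqrt(2((s+b)ln(1+s/b) − s)), combined in quadrature over bins; splitting a region isolates bins with high signal-to-background ratio. The yields below are invented for illustration and are not ATLAS results:

```python
import math

def asimov_z(s, b):
    """Asimov discovery significance for expected signal s over background b."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

signal = [3.0, 5.0, 8.0]        # hypothetical expected signal per bin
background = [50.0, 20.0, 5.0]  # hypothetical expected background per bin

# Single Cut & Count region: pool all bins before computing Z.
z_incl = asimov_z(sum(signal), sum(background))

# Multi-Bin: combine the per-bin significances in quadrature.
z_multi = math.sqrt(sum(asimov_z(s, b) ** 2
                        for s, b in zip(signal, background)))

# The binned combination benefits from the low-background bin and is at
# least as sensitive as the pooled region.
print(z_incl, z_multi)
```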
Search for the Standard Model Higgs boson produced in association with top quarks and decaying into a pair of b-quarks with the ATLAS detector
Testing the Yukawa couplings of the Higgs boson to fermions is an important part of understanding the origin of fermion masses. The strongest Yukawa coupling is that of the Higgs boson to the top quark, and Higgs-boson production in association with two top quarks is a direct probe of this interaction. The decay of the Higgs boson into two b-quarks has the highest branching ratio and in principle allows for the kinematic reconstruction of the Higgs boson. The talk presents the latest results on this process from the ATLAS Collaboration, based on pp collision data collected at 13 TeV.
Search for ttH(bb) production in the lepton+jets channel and heavy-flavour quark tagging with deep learning at the ATLAS experiment
For several decades, the predictions of the Standard Model (SM) of particle physics have been probed and validated. One major success of the Large Hadron Collider (LHC) at CERN was the discovery of the Higgs boson in 2012. With the increasing number of proton-proton collisions recorded with the experiments located at the LHC, precise Higgs measurements are now possible and rare processes are accessible. ATLAS and CMS recently discovered the ttH production process using LHC RUN II data. The ttH(bb) process allows for a direct measurement of the Top-Yukawa coupling, which is the strongest fermion-Higgs coupling in the Standard Model and therefore plays an important role in Higgs physics. The challenging final state with at least 4 b-jets requires an advanced analysis strategy as well as sophisticated b-jet identification methods. b-tagging is not only crucial in the ttH(bb) analysis; most physics analyses within ATLAS make use of it. The reoptimisation of the deep-learning-based heavy-flavour tagger in ATLAS is shown in this thesis for two different jet collections. Various improvements were made, resulting in a drastic performance increase of up to a factor of two in certain regions of the phase space. The ttH(bb) analysis is performed using 139 fb-1 of RUN II ATLAS data at a centre-of-mass energy of √s = 13 TeV. The signal strength, being the ratio of the measured cross-section over the predicted cross-section in the SM, was measured to be 0.43 +0.20/-0.19 (stat.) +0.30/-0.27 (syst.), with an observed (expected) significance of 1.3 (3.0) standard deviations in the inclusive cross-section measurement. In addition, a simplified template cross-section (STXS) measurement in different Higgs pT bins is performed, which is possible because of the ability to reconstruct the Higgs boson. The measurement is limited by the capability to describe the challenging irreducible ttbar+bb background and by systematic uncertainties.
