41 research outputs found

    Potentiality of automatic parameter tuning suite available in ACTS track reconstruction software framework

    Full text link
    Particle tracking is among the most sophisticated and complex parts of the full event reconstruction chain. A number of reconstruction algorithms work in sequence to build particle trajectories from detector hits. These algorithms use many configuration parameters that need to be fine-tuned to properly account for the detector/experimental setup, the available CPU budget and the desired physics performance. The most popular way to tune these parameters is hand-tuning with brute-force techniques, which can be inefficient and raises issues for the long-term maintainability of such algorithms. The open-source track reconstruction framework "A Common Tracking Software (ACTS)" offers an alternative through automatic parameter optimization algorithms. ACTS comes equipped with an auto-tuning suite that provides the setup needed to optimize the input parameters of track reconstruction algorithms. The user can flexibly choose the tunable parameters and define a cost/benefit function for optimizing the full reconstruction chain. The fast execution speed of ACTS allows several optimization iterations to run within a reasonable time. The performance of these optimizers has been demonstrated on different track reconstruction algorithms, such as trajectory seed reconstruction and selection, particle vertex reconstruction and the generation of simplified material maps, and on different detector geometries, such as the Generic Detector and the Open Data Detector (ODD). We aim to bring this approach to all aspects of trajectory reconstruction through a more flexible integration of tunable parameters within ACTS
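    The tuning loop the abstract describes can be sketched in a few lines. The following is an illustrative toy, not the actual ACTS auto-tuning interface: the parameter names (maxPtScattering, impactMax) and their ranges are assumptions borrowed from typical seeding configurations, and the cost function stands in for a real run of the reconstruction chain scored on efficiency, fake rate and CPU time.

```python
import random

# Illustrative sketch of an auto-tuning loop in the spirit of the ACTS suite.
# Parameter names, ranges and the toy cost model are assumptions.

def cost(params):
    """Toy stand-in for running the reconstruction chain and scoring it.

    A real cost/benefit function would combine tracking efficiency, fake
    rate and CPU time; here we just penalise distance from a pretend optimum.
    """
    return ((params["maxPtScattering"] - 10.0) ** 2
            + (params["impactMax"] - 3.0) ** 2)

def random_search(n_trials=200, seed=42):
    """Draw random parameter points and keep the cheapest one found."""
    rng = random.Random(seed)
    best_params, best_cost = None, float("inf")
    for _ in range(n_trials):
        trial = {
            "maxPtScattering": rng.uniform(1.0, 50.0),
            "impactMax": rng.uniform(0.1, 10.0),
        }
        c = cost(trial)
        if c < best_cost:
            best_params, best_cost = trial, c
    return best_params, best_cost
```

    In a real setup the cost evaluation (a full reconstruction run) dominates the runtime, which is why the fast execution speed of ACTS is what makes many optimization iterations affordable.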

    Optimization of the rejection of the ttbar background for the search of SuperSymmetry

    No full text
    This paper presents a method to reject the ttbar background, applied to the supersymmetric process in which two gluinos decay into 2 b quarks and the LSP (lightest supersymmetric particle). The rejection is done using a veto on the tau lepton. This report shows that even though this optimisation could in theory greatly improve the significance, the low efficiency of the tau identification limits the gain to a few percent of the significance. This report also contains an analysis of the ATLAS trigger

    A High-Granularity Timing Detector in ATLAS: Performance at the HL-LHC

    No full text
    The large increase of pileup is one of the main experimental challenges for the HL-LHC physics program. A powerful new way to address this challenge is to exploit the time spread of the interactions to distinguish between collisions occurring very close in space but well separated in time. A High-Granularity Timing Detector, based on low gain avalanche detector technology, is proposed for the ATLAS Phase-II upgrade. Covering the pseudorapidity region between 2.4 and 4.0, with a timing resolution of 30 ps for minimum-ionizing particles, this device will significantly improve the performance in the forward region. High-precision timing greatly improves the track-to-vertex association, leading to a performance similar to that in the central region for the reconstruction of both jets and leptons, as well as for the tagging of heavy-flavor jets. These improvements in object reconstruction performance translate into important sensitivity gains and enhance the reach of the HL-LHC physics program. In addition, the High-Granularity Timing Detector offers unique capabilities for the online and offline luminosity determination
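    The key idea above, separating collisions close in space but well separated in time, amounts to a time-compatibility test between a track and a vertex. The function below is a hypothetical sketch, not ATLAS code: times are in nanoseconds, the 30 ps track resolution matches the figure quoted in the abstract, and the vertex resolution and 3-sigma cut are illustrative assumptions.

```python
def time_compatible(track_t, vertex_t,
                    sigma_track=0.030, sigma_vertex=0.010, n_sigma=3.0):
    """Decide whether a track's time is compatible with a vertex time.

    Times and resolutions are in nanoseconds. The 30 ps (0.030 ns) track
    resolution matches the HGTD target; the vertex resolution and the
    n-sigma cut are illustrative assumptions.
    """
    sigma = (sigma_track ** 2 + sigma_vertex ** 2) ** 0.5
    return abs(track_t - vertex_t) < n_sigma * sigma
```

    Two pileup vertices overlapping in z but, say, 200 ps apart would then fail the compatibility test and be kept distinct, which is what drives the improved track-to-vertex association.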

    ATLAS : Search for Supersymmetry and optimization of the High Granularity timing detector

    No full text
    The Standard Model of particle physics has been extremely successful in describing the elementary particles and their interactions. Nevertheless, open questions are left unanswered. Whether supersymmetry can provide answers to some of these is being studied in 13 TeV proton-proton collisions in the ATLAS experiment at the LHC. In this thesis a search for pair-produced colored particles decaying into pairs of jets in ATLAS, using data from 2016, 2017 and 2018, is presented. Such particles would escape standard supersymmetry searches due to the absence of missing transverse energy in the final state. Stops decaying via an R-parity-violating coupling, and sgluons, the scalar partners of the gluino, were considered. In the absence of a signal, an improvement of 200 GeV on the limit on the stop mass is expected. The HL-LHC will increase the integrated luminosity delivered in order to probe even higher mass ranges and to improve the precision of Standard Model measurements. The instantaneous luminosity will be increased by a factor of 5 and an integrated luminosity of 4000 fb⁻¹ should be reached by the end of the LHC in 2037. A study of the Higgs coupling measurement prospects at the HL-LHC using SFitter is performed. Studies in the Delta and EFT frameworks show that the increase in luminosity will significantly improve the precision of the coupling measurements. The High-Granularity Timing Detector will be installed in ATLAS for the HL-LHC. A simulation of the detector that takes the timing resolution into account was developed and used to optimize its layout, and the detector performance was studied. More than 80 % of the tracks have their time correctly reconstructed, with a resolution of 20 ps before irradiation and 50 ps after. Using the timing information, the electron isolation efficiency is improved by 10 %

    Ranking-based neural network for ambiguity resolution in ACTS

    No full text
    The reconstruction of particle trajectories is a key challenge of particle physics experiments, as it directly impacts particle identification and physics performance while also representing one of the main CPU consumers of many high-energy physics experiments. As the luminosity of particle colliders increases, this reconstruction will become more challenging and resource-intensive. New algorithms are thus needed to address these challenges efficiently. One potential step of track reconstruction is ambiguity resolution. In this step, performed at the end of the tracking chain, we select which track candidates should be kept and which must be discarded. The speed of this algorithm is directly driven by the number of track candidates, which can be reduced at the cost of some physics performance. Since this problem is fundamentally one of comparison and classification, we propose a machine-learning-based approach to ambiguity resolution. Using a shared-hits-based clustering algorithm, we can efficiently determine which candidates belong to the same truth particle. Afterwards, we apply a Neural Network (NN) to compare those tracks and decide which ones are duplicates and which should be kept. This approach is implemented within the A Common Tracking Software (ACTS) framework and tested on the Open Data Detector (ODD), a realistic virtual detector similar to a future ATLAS one. This new approach was shown to be 15 times faster than the default ACTS algorithm while removing 32 times more duplicates, down to less than one duplicated track per event
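    The shared-hits clustering step described above can be sketched as follows. This is a hypothetical illustration, not the ACTS implementation: tracks are represented as sets of hit identifiers, a single greedy pass groups candidates that share hits, and a plain hit-count score stands in for the trained neural network that ranks candidates in the approach above.

```python
def resolve_ambiguities(tracks, min_shared=1, score=len):
    """tracks: list of sets of hit ids; returns the kept candidates.

    A single greedy pass groups tracks that share at least `min_shared`
    hits (full transitive clustering is skipped for brevity), then the
    highest-scoring candidate of each group is kept. In the NN-based
    approach, `score` would be the network's output for each track.
    """
    clusters = []  # each cluster is a list of indices into `tracks`
    for i, hits in enumerate(tracks):
        for cluster in clusters:
            if any(len(hits & tracks[j]) >= min_shared for j in cluster):
                cluster.append(i)
                break
        else:
            clusters.append([i])
    # keep only the best-scored candidate per cluster
    return [tracks[max(c, key=lambda j: score(tracks[j]))] for c in clusters]
```

    Because only candidates within a cluster are ever compared, the expensive pairwise comparison is confined to small groups, which is what makes the approach fast relative to a global duplicate search.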

    Auto-tuning capabilities of the ACTS track reconstruction suite

    No full text
    The reconstruction of charged particle trajectories is a crucial challenge of particle physics experiments, as it directly impacts particle reconstruction and physics performance. To reconstruct these trajectories, different reconstruction algorithms are used sequentially. Each of these algorithms uses many configuration parameters that must be fine-tuned to properly account for the detector/experimental setup, the available CPU budget and the desired physics performance. Examples of such parameters are cut values limiting the algorithm's search space, approximations accounting for complex phenomena, or parameters controlling algorithm performance. Until now, these parameters had to be optimised by human experts, which is inefficient and raises issues for the long-term maintainability of such algorithms. Previous experience using machine learning for particle reconstruction (such as in the TrackML challenge) has shown that such methods can be adapted to different experiments by learning directly from the data. We propose to bring the same approach to the classic track reconstruction algorithms by connecting them to an agent-driven optimiser, allowing us to find the best input parameters using an iterative tuning approach. We have so far demonstrated this method on different track reconstruction algorithms within the A Common Tracking Software (ACTS) framework using the Open Data Detector (ODD). These algorithms include trajectory seed reconstruction and selection, particle vertex reconstruction and the generation of simplified material maps used for trajectory reconstruction
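    The agent-driven iterative tuning described above can be sketched as a hill-climbing loop: propose a perturbation of one parameter, re-evaluate the user-defined cost, and keep the change if it improves. This is an illustrative toy, not the actual optimiser used with ACTS; the parameter name in the usage below is hypothetical, and in practice the cost would come from running the reconstruction chain.

```python
import random

# Toy hill-climbing tuner: perturb one parameter at a time and keep the
# change when the user-defined cost improves. Parameter names and step
# sizes are illustrative assumptions, not the ACTS interface.

def iterative_tune(cost, params, steps, n_iters=300, seed=0):
    """Iteratively improve `params` (dict) under `cost`, with per-parameter
    perturbation ranges given by `steps` (dict of the same keys)."""
    rng = random.Random(seed)
    best = dict(params)
    best_cost = cost(best)
    for _ in range(n_iters):
        name = rng.choice(sorted(best))       # pick one parameter to vary
        trial = dict(best)
        trial[name] += rng.uniform(-steps[name], steps[name])
        c = cost(trial)
        if c < best_cost:                     # greedy: keep only improvements
            best, best_cost = trial, c
    return best, best_cost
```

    For example, with a toy cost of `(p["seedDeltaRMax"] - 1.0) ** 2` and a starting value of 5.0, the loop walks the hypothetical seedDeltaRMax parameter toward its optimum; real agent-driven optimisers replace this greedy rule with smarter proposal strategies.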