28 research outputs found
Leptoquark manoeuvres in the dark: a simultaneous solution of the dark matter problem and the RD(*) anomalies
The measured branching fractions of B-mesons into leptonic final states derived by the LHCb, Belle and BaBar collaborations hint towards the breakdown of lepton flavour universality. In this work we take at face value the so-called RD(*) observables that are defined as the ratios of neutral B-meson charged-current decays into a D-meson, a charged lepton and a neutrino final state in the tau and light lepton channels (an explicit definition is sketched below this entry). A well-studied and simple solution to this charged current anomaly is to introduce a scalar leptoquark S that couples to the second and third generation of fermions. We investigate how S can also serve as a mediator between the Standard Model and a dark sector. We study this scenario in detail and estimate the constraints arising from collider searches for leptoquarks, collider searches for missing energy signals, direct detection experiments and the dark matter relic abundance. We stress that the production of a pair of leptoquarks that decays into different final states (i.e. the commonly called "mixed" channels) provides critical information for identifying the underlying dynamics, and we exemplify this by studying the tτbν and the resonant S plus missing energy channels. We find that direct detection data provides non-negligible constraints on the leptoquark coupling to the dark sector, which in turn affects the relic abundance. We also show that the correct relic abundance can arise not only via standard freeze-out, but also through conversion-driven freeze-out. We illustrate the rich phenomenology of the model with a few selected benchmark points, providing a broad stroke of the interesting connection between lepton flavour universality violation and dark matter.
The work of AJ is supported in part by a KIAS Individual Grant No. QP084401 via the Quantum Universe Center at Korea Institute for Advanced Study and by the National Research Foundation of Korea, Grant No. NRF-2019R1A2C1009419. The work of AL was supported by the São Paulo Research Foundation (FAPESP), project 2015/20570-1. JH acknowledges support from the DFG via the Collaborative Research Center TRR 257 and the F.R.S.-FNRS as a Chargé de recherche. The work of AP and GB was funded by the RFBR and CNRS project number 20-52-15005. The work of AP was also supported in part by an AAP-USMB grant and by the Interdisciplinary Scientific and Educational School of Moscow University for Fundamental and Applied Space Research. The work of DS is based upon work supported by the National Science Foundation under Grant No. PHY-1915147. JZ is supported by the Generalitat Valenciana (Spain) through the plan GenT program (CIDEGENT/2019/068), by the Spanish Government (Agencia Estatal de Investigación) and ERDF funds from European Commission (MCIN/AEI/10.13039/501100011033, Grant No. PID2020-114473GB-I00).
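For reference, the RD(*) observables described in the abstract above follow the standard definition sketched below (added here for readability, not quoted from the paper), with a light lepton denoted by ℓ:

\[
  R_{D^{(*)}} \;=\; \frac{\mathcal{B}(B \to D^{(*)}\,\tau\,\bar{\nu}_\tau)}{\mathcal{B}(B \to D^{(*)}\,\ell\,\bar{\nu}_\ell)}\,, \qquad \ell = e,\ \mu\,.
\]

The "anomalies" in the title refer to the measured ratios deviating from the Standard Model expectation.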
Les Houches 2015: Physics at TeV Colliders Standard Model Working Group Report
This Report summarizes the proceedings of the 2015 Les Houches workshop on
Physics at TeV Colliders. Session 1 dealt with (I) new developments relevant
for high precision Standard Model calculations, (II) the new PDF4LHC parton
distributions, (III) issues in the theoretical description of the production of
Standard Model Higgs bosons and how to relate experimental measurements, (IV) a
host of phenomenological studies essential for comparing LHC data from Run I
with theoretical predictions and projections for future measurements in Run II,
and (V) new developments in Monte Carlo event generators.
Comment: Proceedings of the Standard Model Working Group of the 2015 Les Houches Workshop, Physics at TeV Colliders, Les Houches, 1-19 June 2015. 227 pages.
Benchmark data and model independent event classification for the large hadron collider
We describe the outcome of a data challenge conducted as part of the Dark Machines (https://www.darkmachines.org) initiative and the Les Houches 2019 workshop on Physics at TeV colliders. The challenge aims to detect signals of new physics at the Large Hadron Collider (LHC) using unsupervised machine learning algorithms. First, we propose how an anomaly score could be implemented to define model-independent signal regions in LHC searches (a toy illustration of this idea follows this entry). We define and describe a large benchmark dataset, consisting of > 1 billion simulated LHC events corresponding to 10 fb⁻¹ of proton-proton collisions at a center-of-mass energy of 13 TeV. We then review a wide range of anomaly detection and density estimation algorithms, developed in the context of the data challenge, and we measure their performance in a set of realistic analysis environments. We draw a number of useful conclusions that will aid the development of unsupervised new physics searches during the third run of the LHC, and provide our benchmark dataset for future studies at https://www.phenoMLdata.org. Code to reproduce the analysis is provided at https://github.com/bostdiek/DarkMachines-UnsupervisedChallenge.
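The following is a schematic, self-contained Python sketch of the general idea described above: score events with an unsupervised model and define a model-independent signal region by cutting on the anomaly score. It uses scikit-learn's IsolationForest on synthetic, kinematic-like features purely for illustration; it is not one of the algorithms studied in the challenge, and the feature choices and threshold are illustrative assumptions.

```python
# Toy illustration (not the challenge's actual pipeline): define a
# model-independent signal region by thresholding an anomaly score.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Stand-in "events": rows of simple kinematic-like features
# (e.g. MET, leading-jet pT, HT, jet multiplicity). Purely synthetic.
background = rng.normal(loc=[50, 120, 400, 4], scale=[20, 40, 100, 1], size=(10_000, 4))
anomalous = rng.normal(loc=[250, 300, 900, 7], scale=[30, 50, 120, 1], size=(100, 4))
events = np.vstack([background, anomalous])

# Train the unsupervised model on background-dominated data only.
model = IsolationForest(n_estimators=200, random_state=0).fit(background)

# Higher score = more anomalous (flip the sign of sklearn's convention).
scores = -model.score_samples(events)

# Model-independent signal region: the most anomalous 0.1% of events,
# i.e. a cut on the anomaly score rather than on any BSM-specific variable.
threshold = np.quantile(scores, 0.999)
signal_region = events[scores > threshold]

print(f"score cut: {threshold:.3f}, events in signal region: {len(signal_region)}")
```

A counting analysis in such a score-defined region can then be reinterpreted for many signal hypotheses, which is the sense in which the signal region is model-independent.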
Reinterpretation of LHC Results for New Physics: Status and recommendations after Run 2
We report on the status of efforts to improve the reinterpretation of searches and measurements at the LHC in terms of models for new physics, in the context of the LHC Reinterpretation Forum. We detail current experimental offerings in direct searches for new particles, measurements, technical implementations and Open Data, and provide a set of recommendations for further improving the presentation of LHC results in order to better enable reinterpretation in the future. We also provide a brief description of existing software reinterpretation frameworks and recent global analyses of new physics that make use of the current data.
Les Houches 2019 Physics at TeV Colliders: New Physics Working Group Report
This report presents the activities of the 'New Physics' working group for the 'Physics at TeV Colliders' workshop (Les Houches, France, 10-28 June 2019). These activities include studies of direct searches for new physics, approaches to exploit published data to constrain new physics, as well as the development of tools to further facilitate these investigations. Benefits of machine learning for both the search for new physics and the interpretation of these searches are also presented.
Boosting probes of CP violation in the top Yukawa coupling with Deep Learning
The precise measurement of the top-Higgs coupling is crucial in particle physics, offering insights into potential new physics Beyond the Standard Model (BSM) carrying CP Violation (CPV) effects. In this paper, we explore the CP properties of a Higgs boson coupling with a top quark pair, focusing on events where the Higgs state decays into a pair of b-quarks and the top-antitop system decays leptonically. The novelty of our analysis resides in the exploitation of two conditional Deep Learning (DL) networks: a Multi-Layer Perceptron (MLP) and a Graph Convolution Network (GCN). These models are trained for selected CPV phase values and then used to interpolate over the full range of possible phase values. This enables a comprehensive assessment of sensitivity across all CP phase values, thereby streamlining the process, as the models are trained only once. Notably, the conditional GCN exhibits superior performance over the conditional MLP, owing to the nature of graph-based Neural Network (NN) structures. Our Machine Learning (ML) informed findings indicate that assessment of the CP properties of the Higgs coupling to the top quark pair can be within reach of the HL-LHC, quantitatively surpassing the sensitivity of more traditional approaches.
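As a rough illustration of the conditional-network idea described above (not the paper's actual architecture, features or dataset), the sketch below builds a small MLP in PyTorch that takes event features together with the CP phase as a conditioning input, so a single trained model can be evaluated at arbitrary phase values. Layer sizes, feature counts, the phase grid and the toy labelling are all illustrative assumptions.

```python
# Schematic conditional MLP (illustrative only): event features are
# concatenated with the CP phase used as a conditioning input, so one
# network can be queried at any phase value after a single training.
import torch
import torch.nn as nn

class ConditionalMLP(nn.Module):
    def __init__(self, n_features: int = 12, hidden: int = 64):
        super().__init__()
        # +1 input for the conditioning variable (the CP phase).
        self.net = nn.Sequential(
            nn.Linear(n_features + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # logit: signal-like vs background-like
        )

    def forward(self, x: torch.Tensor, phase: torch.Tensor) -> torch.Tensor:
        # phase has shape (batch, 1); concatenate it with the event features.
        return self.net(torch.cat([x, phase], dim=1))

# Toy training loop on synthetic data, trained only at a few phase values.
torch.manual_seed(0)
model = ConditionalMLP()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

train_phases = torch.tensor([0.0, 0.5, 1.0])  # illustrative phase grid
for step in range(200):
    phase = train_phases[torch.randint(len(train_phases), (256, 1))]
    x = torch.randn(256, 12) + 0.3 * phase                    # phase-dependent toy features
    y = (x.mean(dim=1, keepdim=True) > 0.15 * phase).float()  # toy labels
    loss = loss_fn(model(x, phase), y)
    optimizer.zero_grad(); loss.backward(); optimizer.step()

# After training, the same model can be evaluated at an intermediate,
# unseen phase value -- the interpolation step described in the abstract.
with torch.no_grad():
    test_phase = torch.full((5, 1), 0.75)
    scores = torch.sigmoid(model(torch.randn(5, 12), test_phase))
print(scores.squeeze())
```

A conditional GCN variant would replace the dense layers with graph convolutions over a per-event particle graph, but the phase-conditioning idea is the same.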