
    Search for long-lived particles in ATLAS and CMS

    The ATLAS and CMS detectors can be used to search for heavy long-lived particles that might signal physics beyond the Standard Model. Such new states can be distinguished from Standard Model particles by exploiting their unique signatures, ranging from multi-lepton and/or jet production anywhere within the detector volume to minimum-ionizing particles with low velocity and high momentum. We review the strategies proposed by ATLAS and CMS to search for these signals, with particular emphasis on possible challenges to trigger and detector operations. Comment: Parallel talk at ICHEP08, Philadelphia, USA, July 2008. 4 pages, LaTeX, 4 pdf figures

    Tau lepton identification with graph neural networks at future electron-positron colliders

    Efficient and accurate reconstruction and identification of tau lepton decays plays a crucial role in the program of measurements and searches under study for future high-energy particle colliders. Leveraging recent advances in machine learning algorithms, which have dramatically improved the state of the art in visual object recognition, we have developed novel tau identification methods that can classify tau decays into leptons and hadrons and discriminate them against QCD jets. We present the methodology and the results of its application to an interesting use case: the IDEA dual-readout calorimeter detector concept proposed for the future FCC-ee electron-positron collider.
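As a rough illustration of the idea behind graph-based tau identification (not the authors' actual network), a single message-passing layer over a particle "graph" can be sketched in plain NumPy; the chain adjacency, feature sizes, and three-class output are all invented for the example:

```python
import numpy as np

def gnn_layer(h, adj, w):
    """One message-passing step: average neighbor features, then apply a
    learned linear map with a ReLU nonlinearity.
    h: (n_nodes, d_in) node features; adj: (n_nodes, n_nodes) adjacency
    (self-loops included); w: (d_in, d_out) weights."""
    deg = adj.sum(axis=1, keepdims=True)
    agg = (adj @ h) / np.maximum(deg, 1)   # mean over connected nodes
    return np.maximum(agg @ w, 0)          # ReLU

rng = np.random.default_rng(0)
n, d = 6, 4                                # 6 hypothetical hits, 4 features each
h = rng.normal(size=(n, d))
adj = np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)  # toy chain graph + self-loops
w = rng.normal(size=(d, 3))
scores = gnn_layer(h, adj, w).mean(axis=0) # pool nodes to one per-jet vector
print(scores.shape)  # (3,) e.g. logits for tau->lep / tau->had / QCD jet
```

Real taggers stack several such layers and learn the graph structure itself (e.g. from k-nearest neighbors in feature space), but the aggregate-then-transform pattern is the core operation.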

    Quantum circuit noise simulation with reinforcement learning

    Quantum computing in the NISQ era requires powerful tools to reduce the gap between simulations and quantum hardware execution. In this work, we present a machine learning approach for reproducing the noise of a specific quantum device during simulations. The proposed algorithm is meant to be more flexible, in reproducing different noise conditions, than standard techniques such as randomized benchmarking or heuristic noise models. The model has been tested both in simulation and on real superconducting qubits.
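For intuition about what a learned noise model must reproduce, here is a minimal sketch of the simplest textbook channel, single-qubit depolarizing noise, applied to a density matrix; the paper's reinforcement-learning model is more flexible than this fixed-parameter form, which is shown only as a baseline:

```python
import numpy as np

def depolarize(rho, p):
    """Depolarizing channel: with probability p, replace the single-qubit
    state rho by the maximally mixed state I/2."""
    return (1 - p) * rho + p * np.eye(2) / 2

rho0 = np.array([[1, 0], [0, 0]], dtype=complex)  # pure |0><0| state
rho_noisy = depolarize(rho0, 0.1)                  # 10% depolarization (toy value)
fidelity = np.real(rho_noisy[0, 0])                # <0|rho|0> for this diagonal state
print(round(fidelity, 3))  # 0.95
```

A device-specific noise model would effectively learn channel parameters like `p` (and richer, gate-dependent channels) so that simulated and measured outcome distributions match.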

    TracIn in Semantic Segmentation of Brain Tumors in MRI, an Extended Approach

    In recent years, thanks to improved computational power and the availability of big data, AI has become a fundamental tool in basic research and industry. Despite this very rapid development, deep neural networks remain black boxes that are difficult to explain. While a multitude of explainability (xAI) methods have been developed, their effectiveness and usefulness in realistic use cases are understudied. This is a major limitation for the application of these algorithms in sensitive fields such as clinical diagnosis, where the robustness, transparency and reliability of the algorithm are indispensable for its use. In addition, the majority of works have focused on feature attribution techniques (e.g., saliency maps), neglecting other interesting families of xAI methods such as data influence methods. The aim of this work is to implement, extend and test, for the first time, data influence functions on a challenging clinical problem, namely the segmentation of brain tumors in Magnetic Resonance Images (MRI). We present a new methodology to calculate an influence score that generalizes to all semantic segmentation tasks in which the different labels are mutually exclusive, which is the standard framework for these tasks.
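The TracIn family of data-influence scores, which this work extends, is to first order a learning-rate-weighted sum of gradient dot products across saved training checkpoints; the sketch below uses made-up gradient vectors and learning rates, not the paper's segmentation setup:

```python
import numpy as np

def tracin_score(train_grads, test_grads, lrs):
    """First-order TracIn influence of one training example on one test
    example: sum over checkpoints t of
    lr_t * <grad L(z_train; theta_t), grad L(z_test; theta_t)>.
    For segmentation, each gradient would come from the mean per-pixel
    cross-entropy over mutually exclusive labels."""
    return sum(lr * float(g_tr @ g_te)
               for lr, g_tr, g_te in zip(lrs, train_grads, test_grads))

# Toy gradients at two checkpoints (hypothetical numbers).
train_grads = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]
test_grads = [np.array([1.0, 1.0]), np.array([1.0, 1.0])]
score = tracin_score(train_grads, test_grads, lrs=[0.1, 0.1])
print(round(score, 2))  # 0.3
```

A positive score marks the training example as a "proponent" (it pushed the loss on the test example down); a negative score marks an "opponent".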

    Minimal dark matter in type III seesaw

    We explore the possibility of a new dark matter candidate in the supersymmetric type III seesaw mechanism, where a neutral scalar component of the Y=0 triplet can be the lightest supersymmetric particle. Its thermal abundance can be in the right range if non-standard cosmology such as kination domination is assumed. The enhanced cross-section of dark matter annihilation to W+W- can leave detectable astrophysical and cosmological signals, whose current observational data put a lower bound on the dark matter mass. The model predicts the existence of a charged scalar almost degenerate with the dark matter scalar, whose decay length lies between 5.5 cm and 6.3 m. This provides a novel opportunity to measure the dark matter mass by identifying slowly-moving, highly-ionizing tracks in the LHC experiments. If the ordinary lightest supersymmetric particle is the usual Bino, its decay leads to clean signatures of same-sign di-leptons and di-charged-scalars associated with observable displaced vertices, which are essentially background-free and can be fully reconstructed. Comment: 3 figures, 12 pages; an error in the antiproton limit corrected; the lower bound on the dark matter mass strengthened; references added; typos corrected

    Artificial neural networks exploiting point cloud data for fragmented solid objects classification

    This paper presents a novel approach to fragmented solid object classification using neural networks based on point clouds. This work is the initial step of a project in collaboration with the ‘Ente Parco Archeologico del Colosseo’ in Rome, which aims to reconstruct ancient artifacts from their fragments. We built from scratch a synthetic dataset (DS) of fragments of different 3D objects, including aging effects. We used this DS to train deep learning models for the task of classifying internal and external fragments. As model architectures, we adopted PointNet and the dynamic graph convolutional neural network, which take as input a point cloud representing the spatial geometry of a fragment, and we optimized model performance by adding features sensitive to local geometry characteristics. We tested the approach by performing several experiments to check the robustness and generalization capabilities of the models. Finally, we tested the models on a real case using 3D scans of artifacts preserved in different museums, artificially fragmented, obtaining good performance.
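The property that makes PointNet-style models suitable for point clouds is permutation invariance: a shared per-point MLP followed by a symmetric max pool gives the same embedding regardless of point ordering. A minimal NumPy sketch with random weights (not the trained models used in the paper):

```python
import numpy as np

def pointnet_embed(points, w1, w2):
    """PointNet-style global embedding: the same two-layer MLP is applied
    to every point independently, then a max pool over points makes the
    result invariant to the order of the input points."""
    h = np.maximum(points @ w1, 0)   # shared MLP layer 1 (ReLU)
    h = np.maximum(h @ w2, 0)        # shared MLP layer 2 (ReLU)
    return h.max(axis=0)             # symmetric max pool over all points

rng = np.random.default_rng(1)
pts = rng.normal(size=(128, 3))      # one toy fragment as an (x, y, z) cloud
w1 = rng.normal(size=(3, 16))
w2 = rng.normal(size=(16, 8))
emb = pointnet_embed(pts, w1, w2)

# Shuffling the points leaves the embedding unchanged.
perm = rng.permutation(128)
assert np.allclose(emb, pointnet_embed(pts[perm], w1, w2))
```

A classifier head (e.g. a small MLP mapping the embedding to internal/external logits) would sit on top of this embedding; the paper additionally feeds in handcrafted local-geometry features.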

    Interferon regulatory factor 8-deficiency determines massive neutrophil recruitment but T cell defect in fast growing granulomas during tuberculosis

    Following Mycobacterium tuberculosis (Mtb) infection, immune cell recruitment to the lungs is pivotal in establishing protective immunity through granuloma formation and neogenesis of lymphoid structures (LS). Interferon regulatory factor-8 (IRF-8) plays an important role in host defense against Mtb, although the mechanisms driving anti-mycobacterial immunity remain unclear. In this study, IRF-8 deficient mice (IRF-8−/−) were aerogenously infected with a low-dose Mtb Erdman virulent strain, and the course of infection was compared with that induced in wild-type (WT-B6) counterparts. Tuberculosis (TB) progression was examined in both groups using pathological, microbiological and immunological parameters. Following Mtb exposure, the bacterial load in lungs and spleens progressed comparably in the two groups for two weeks, after which IRF-8−/− mice developed fatal acute TB, whereas in WT-B6 mice the disease reached a chronic stage. In the lungs of IRF-8−/− mice, uncontrolled growth of pulmonary granulomas and impaired development of LS were observed, associated with unbalanced homeostatic chemokines, progressive loss of infiltrating T lymphocytes, and a massive prevalence of neutrophils at late infection stages. Our data define IRF-8 as an essential factor for the maintenance of proper immune cell recruitment in granulomas and LS, required to restrain Mtb infection. Moreover, IRF-8−/− mice, which carry a genetic defect common to humans and mice and linked to the susceptibility and severity of mycobacterial diseases, represent a valuable model of acute TB for comparative studies with chronically infected congenic WT-B6 mice, useful for dissecting protective and pathological immune reactions.

    Measurement of the properties of Higgs boson production at √s = 13 TeV in the H → γγ channel using 139 fb−1 of p p collision data with the ATLAS experiment

    Measurements of Higgs boson production cross-sections are carried out in the diphoton decay channel using 139 fb−1 of pp collision data at √s = 13 TeV collected by the ATLAS experiment at the LHC. The analysis is based on the definition of 101 distinct signal regions using machine-learning techniques. The inclusive Higgs boson signal strength in the diphoton channel is measured to be 1.04 +0.10/−0.09. Cross-sections for gluon-gluon fusion, vector-boson fusion, associated production with a W or Z boson, and top associated production processes are reported. An upper limit of 10 times the Standard Model prediction is set for the associated production process of a Higgs boson with a single top quark, which has a unique sensitivity to the sign of the top quark Yukawa coupling. Higgs boson production is further characterized through measurements of Simplified Template Cross-Sections (STXS). In total, cross-sections of 28 STXS regions are measured. The measured STXS cross-sections are compatible with their Standard Model predictions, with a p-value of 93%. The measurements are also used to set constraints on Higgs boson coupling strengths, as well as on new interactions beyond the Standard Model in an effective field theory approach. No significant deviations from the Standard Model predictions are observed in these measurements, which provide significant sensitivity improvements compared to the previous ATLAS results.
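Conceptually, a signal strength is the ratio of the background-subtracted observed yield to the Standard Model signal expectation; a toy sketch with invented event counts chosen to reproduce the headline value (the real analysis is a likelihood fit over 101 categories, not this simple subtraction):

```python
def signal_strength(n_obs, n_bkg, n_sig_sm):
    """mu = (observed events - expected background) / SM-predicted signal.
    mu = 1 means Higgs production at exactly the Standard Model rate."""
    return (n_obs - n_bkg) / n_sig_sm

# Hypothetical counts, chosen so the toy reproduces the quoted mu = 1.04.
mu = signal_strength(n_obs=11040.0, n_bkg=10000.0, n_sig_sm=1000.0)
print(mu)  # 1.04
```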

    Strong constraints on jet quenching in centrality-dependent p+Pb collisions at 5.02 TeV from ATLAS

    Jet quenching is the process by which color-charged partons lose energy through interactions with the quark-gluon plasma droplets created in heavy-ion collisions. The collective expansion of such droplets is well described by viscous hydrodynamics. Similar evidence of collectivity is consistently observed in smaller collision systems, including pp and p+Pb collisions. In contrast, while jet quenching is observed in Pb+Pb collisions, no evidence has been found in these small systems to date, raising fundamental questions about the nature of the system created in these collisions. The ATLAS experiment at the Large Hadron Collider has measured the yield of charged hadrons correlated with reconstructed jets in 0.36 nb−1 of p+Pb and 3.6 pb−1 of pp collisions at 5.02 TeV. The yields of charged hadrons with pT > 0.5 GeV near and opposite in azimuth to jets with jet pT > 30 or 60 GeV, and the ratios of these yields between p+Pb and pp collisions, I_pPb, are reported. The collision centrality of p+Pb events is categorized by the energy deposited by forward neutrons from the struck nucleus. The I_pPb values are consistent with unity within a few percent for hadrons with pT > 4 GeV at all centralities. These data provide new, strong constraints which preclude almost any parton energy loss in central p+Pb collisions.
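The I_pPb observable is a simple per-jet yield ratio; a toy sketch with hypothetical yields and uncorrelated uncertainties combined in quadrature (not the measured values):

```python
import math

def i_ppb(y_ppb, y_pp, s_ppb, s_pp):
    """I_pPb = Y(p+Pb) / Y(pp), the per-jet correlated-hadron yield in
    p+Pb divided by the same quantity in pp, with uncorrelated relative
    errors added in quadrature. Values consistent with unity mean no
    visible parton energy loss."""
    ratio = y_ppb / y_pp
    err = ratio * math.hypot(s_ppb / y_ppb, s_pp / y_pp)
    return ratio, err

# Hypothetical per-jet yields and uncertainties for illustration only.
r, e = i_ppb(y_ppb=1.02, y_pp=1.00, s_ppb=0.02, s_pp=0.01)
print(round(r, 2), round(e, 3))
```

Quenching would deplete the high-pT hadron yield in p+Pb relative to pp, driving I_pPb below unity, which is why values compatible with 1 constrain the allowed energy loss.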