90 research outputs found


    FCC-hh: The Hadron Collider: Future Circular Collider Conceptual Design Report Volume 3

    Overview of the physics potential of a future hadron collider.
    In response to the 2013 Update of the European Strategy for Particle Physics (EPPSU), the Future Circular Collider (FCC) study was launched as a worldwide international collaboration hosted by CERN. The FCC study covered an energy-frontier hadron collider (FCC-hh), a highest-luminosity high-energy lepton collider (FCC-ee), the corresponding 100 km tunnel infrastructure, the physics opportunities of these two colliders, and a high-energy LHC based on FCC-hh technology. This document constitutes the third volume of the FCC Conceptual Design Report, devoted to the hadron collider FCC-hh. It summarizes the FCC-hh physics discovery opportunities, presents the accelerator design, performance reach and staged operation plan, discusses the underlying technologies, civil engineering and technical infrastructure, and sketches a possible implementation. Combining ingredients from the Large Hadron Collider (LHC) and the high-luminosity LHC upgrade with novel technologies and approaches, the FCC-hh design aims to extend the energy frontier significantly, to 100 TeV. Its unprecedented centre-of-mass collision energy would make the FCC-hh a unique instrument to explore physics beyond the Standard Model, offering great direct sensitivity to new physics and discoveries.

    FCC-ee: The Lepton Collider: Future Circular Collider Conceptual Design Report Volume 2

    In response to the 2013 Update of the European Strategy for Particle Physics, the Future Circular Collider (FCC) study was launched as an international collaboration hosted by CERN. The study covers a highest-luminosity high-energy lepton collider (FCC-ee) and an energy-frontier hadron collider (FCC-hh), which could successively be installed in the same 100 km tunnel. The scientific capabilities of the integrated FCC programme would serve the worldwide community throughout the 21st century. The FCC study also investigates an LHC energy upgrade using FCC-hh technology. This document constitutes the second volume of the FCC Conceptual Design Report, devoted to the electron-positron collider FCC-ee. After summarizing the physics discovery opportunities, it presents the accelerator design, performance reach, a staged operation scenario, the underlying technologies, civil engineering, technical infrastructure, and an implementation plan. FCC-ee can be built with today’s technology. Most of the FCC-ee infrastructure could be reused for FCC-hh. Combining concepts from past and present lepton colliders and adding a few novel elements, the FCC-ee design promises outstandingly high luminosity. This would make the FCC-ee a unique precision instrument to study the heaviest known particles (the Z, W and H bosons and the top quark), offering great direct and indirect sensitivity to new physics.

    Statistical Learning and Inference at Particle Collider Experiments

    Advances in data analysis techniques may play a decisive role in the discovery reach of particle collider experiments. However, importing expertise and methods from other data-centric disciplines such as machine learning and statistics faces significant hurdles, mainly due to the established use of different language and constructs. A large part of this document, also conceived as an introduction to the description of an analysis searching for non-resonant Higgs pair production in data collected by the CMS detector at the Large Hadron Collider (LHC), is therefore devoted to a broad redefinition of the relevant concepts for problems in experimental particle physics. The aim is to better connect these issues with those in other fields of research, so that the solutions found can be repurposed. The formal exploration of the properties of statistical models at particle colliders highlights the main challenges posed by statistical inference in this context: the multi-dimensional nature of the models, which can be studied only in a generative manner via forward simulation of observations, and the effect of nuisance parameters. The first issue can be tackled with likelihood-free inference methods coupled with low-dimensional summary statistics, which may be constructed either with machine learning techniques or from physically motivated variables (e.g. event reconstruction). The second, i.e. the misspecification of the generative model that is addressed by the inclusion of nuisance parameters, reduces the effectiveness of summary statistics constructed with machine learning techniques. A subset of the data analysis techniques formally discussed in the introductory part of the document is also exploited to study the non-resonant production process pp → HH → bbbb at the LHC in the context of the Standard Model (SM) and its extensions in effective field theories (EFTs), based on anomalous couplings of the Higgs field.
    Data collected in 2016 by the CMS detector, corresponding to a total of 35.9 fb−1 of proton-proton collisions, are used to set a 95% confidence-level upper limit of 847 fb on the production cross section σ(pp → HH → bbbb) in the SM. Upper limits are also obtained for the cross sections corresponding to a representative set of points in the EFT parameter space. The combination of these results with those obtained from the study of other HH decay channels is also discussed. In addition, the exercise of reformulating the goals of a high-energy physics analysis as a statistical inference problem is combined with modern machine learning technologies to develop a new technique, referred to as inference-aware neural optimisation. The technique produces summary statistics that directly minimise the expected uncertainty on the parameters of interest, optimally accounting for the effect of nuisance parameters. The application of this technique to a synthetic problem demonstrates that the resulting summary statistics are considerably more effective than those obtained with standard supervised learning methods when the effect of the nuisance parameters is significant. Assuming its scalability to LHC data scenarios, this technique has ground-breaking potential for analyses dominated by systematic uncertainties.
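    To illustrate the kind of statistical statement quoted in the abstract above, a 95% confidence-level upper limit can be sketched for the simplest possible case: a single-bin Poisson counting experiment with a known expected background and no nuisance parameters. This is a minimal, generic example, not the thesis's actual analysis (which uses full likelihood templates, many bins, and nuisance parameters); the function names and the chosen event counts are illustrative only.

    ```python
    import math

    def poisson_cdf(n_obs: int, mu: float) -> float:
        """P(N <= n_obs) for N ~ Poisson(mu), summed term by term."""
        term = math.exp(-mu)   # k = 0 term
        total = term
        for k in range(1, n_obs + 1):
            term *= mu / k     # recurrence: mu^k e^-mu / k!
            total += term
        return total

    def upper_limit(n_obs: int, background: float, cl: float = 0.95) -> float:
        """Classical upper limit on the signal yield s: the value of s at which
        P(N <= n_obs | s + background) drops to 1 - cl, found by bisection.
        The CDF is monotonically decreasing in s, so bisection converges."""
        lo, hi = 0.0, 50.0 + 10.0 * n_obs
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            if poisson_cdf(n_obs, mid + background) > 1.0 - cl:
                lo = mid   # CDF still above 5%: limit lies higher
            else:
                hi = mid   # CDF below 5%: limit lies lower
        return 0.5 * (lo + hi)

    # Hypothetical counting experiment: 3 events observed, 1.0 expected
    # from background; print the 95% CL upper limit on the signal yield.
    print(round(upper_limit(3, 1.0), 2))
    ```

    A real analysis converts such a yield limit into a cross-section limit by dividing by luminosity, efficiency and acceptance, and profiles over nuisance parameters rather than fixing the background.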

    Status and Future Perspectives for Lattice Gauge Theory Calculations to the Exascale and Beyond

    In this and a set of companion whitepapers, the USQCD Collaboration lays out a program of science and computing for lattice gauge theory. These whitepapers describe how calculations using lattice QCD (and other gauge theories) can aid the interpretation of ongoing and upcoming experiments in particle and nuclear physics, as well as inspire new ones. Comment: 44 pages. 1 of the USQCD whitepapers.