    Measurement of the Drell–Yan differential cross section with the CMS detector at the LHC

    This thesis describes precision measurements of electroweak interactions in a new energy regime and the application of these measurements to improve our understanding of the structure of the proton. The results are based on proton-proton collision data at √s = 7 and 8 TeV recorded with the Compact Muon Solenoid detector at the CERN Large Hadron Collider during the first years of operation. Measurements of the differential Drell–Yan cross section in the dimuon and dielectron channels are presented, covering the dilepton mass range of 15 to 2000 GeV and absolute dilepton rapidity from 0 to 2.4. The Drell–Yan cross section in proton-proton collisions depends on empirical quantities known as parton distribution functions (PDFs), which parameterize the structure of the proton. In addition to the differential cross sections, ratios of the normalized differential cross sections (double ratios) at √s = 7 and 8 TeV are measured to provide further constraints on the PDFs, substantially reducing theoretical systematic uncertainties through correlations between the two measurements. These measurements are compared to predictions of perturbative QCD at next-to-next-to-leading order computed with various PDF sets. The measured differential cross sections and double ratios in bins of absolute rapidity are sufficiently precise to constrain the proton PDFs. The inclusion of the Drell–Yan data in PDF fits provides substantial constraints on the strange quark and light sea quark distribution functions in a region of phase space that has not previously been accessible at hadron colliders.
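
    For orientation, the PDF dependence is already explicit at leading order: in the parton model, the double-differential Drell–Yan cross section (the thesis compares to its NNLO refinement) reads

        \frac{d^2\sigma}{dM\,dy} = \frac{8\pi\alpha^2}{9Ms} \sum_q e_q^2 \left[ f_q(x_1, M^2)\, f_{\bar q}(x_2, M^2) + f_{\bar q}(x_1, M^2)\, f_q(x_2, M^2) \right], \qquad x_{1,2} = \frac{M}{\sqrt{s}}\, e^{\pm y},

    so each bin in dilepton mass M and absolute rapidity |y| probes the quark and antiquark densities f_q, f_{\bar q} at definite momentum fractions x_1 and x_2.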

    Large-scale text processing pipeline with Apache Spark

    In this paper, we evaluate Apache Spark for a data-intensive machine learning problem. Our use case focuses on policy diffusion detection across the state legislatures in the United States over time. Previous work on policy diffusion has been unable to make an all-pairs comparison between bills because of the computational cost; as a substitute, scholars have studied single topic areas. We provide an implementation of this analysis workflow as a distributed text processing pipeline with Spark DataFrames and the Scala application programming interface (API). We discuss the challenges and strategies of unstructured data processing, data formats for storage and efficient access, and graph processing at scale.
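
    As a rough illustration of what such a pipeline can look like, here is a minimal Scala sketch of an all-pairs similarity search over bill texts with the Spark DataFrame and MLlib APIs. This is not the authors' actual implementation: the toy bill data, the column names, and the choice of MinHash LSH for the candidate-pair step are all assumptions made for the example.

        import org.apache.spark.ml.feature.{HashingTF, MinHashLSH, Tokenizer}
        import org.apache.spark.sql.SparkSession

        object BillSimilarity {
          def main(args: Array[String]): Unit = {
            val spark = SparkSession.builder.appName("PolicyDiffusion").getOrCreate()
            import spark.implicits._

            // Toy stand-in for the bill corpus: one row per bill.
            val bills = Seq(
              ("CA-2015-AB1", "an act relating to renewable energy portfolio standards"),
              ("TX-2015-HB2", "an act relating to renewable portfolio standards for utilities"),
              ("NY-2016-S3",  "an act concerning school lunch nutrition requirements")
            ).toDF("billId", "text")

            // Tokenize and hash each bill into a sparse term vector.
            val words = new Tokenizer().setInputCol("text").setOutputCol("words").transform(bills)
            val vectors = new HashingTF().setInputCol("words").setOutputCol("features")
              .setNumFeatures(1 << 18).transform(words)

            // MinHash LSH keeps the all-pairs comparison tractable: it emits only
            // candidate pairs below a Jaccard-distance threshold instead of
            // materializing the full O(n^2) cross join.
            val model = new MinHashLSH().setInputCol("features").setOutputCol("hashes")
              .setNumHashTables(5).fit(vectors)

            model.approxSimilarityJoin(vectors, vectors, 0.9, "jaccardDist")
              .filter($"datasetA.billId" < $"datasetB.billId") // drop self and mirrored pairs
              .select($"datasetA.billId".alias("billA"), $"datasetB.billId".alias("billB"), $"jaccardDist")
              .show(truncate = false)

            spark.stop()
          }
        }

    The LSH step is the design point that matters for policy diffusion at this scale: it trades exactness for a join whose cost grows with the number of genuinely similar pairs rather than with the square of the corpus size.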

    Big Data in HEP: A comprehensive use case study

    Experimental particle physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems, collectively called Big Data technologies, have emerged to support the analysis of petabyte and exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches, promise a fresh look at the analysis of very large datasets, and could potentially reduce the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start from the official experiment data formats and produce publication-quality physics plots. We discuss the advantages and disadvantages of each approach and give an outlook on further studies needed.
    Comment: Proceedings for the 22nd International Conference on Computing in High Energy and Nuclear Physics (CHEP 2016).
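
    To make the comparison concrete, the following minimal Scala sketch shows the general shape of an event selection in Spark; the input path, column names, and cut values are invented placeholders, not the CMS data format or the actual analysis selection.

        import org.apache.spark.sql.SparkSession
        import org.apache.spark.sql.functions.{abs, col, floor}

        object DarkMatterSkim {
          def main(args: Array[String]): Unit = {
            val spark = SparkSession.builder.appName("CMSDarkMatterTestbed").getOrCreate()

            // Hypothetical flat event table with one row per collision event.
            val events = spark.read.parquet("hdfs:///data/events.parquet")

            // The traditional NTuple event loop becomes declarative filter and
            // transform stages that Spark distributes across the cluster.
            val selected = events
              .filter(col("met") > 200.0)              // missing transverse energy cut
              .filter(col("nJets") >= 1)               // at least one jet
              .filter(abs(col("leadJetEta")) < 2.4)    // leading jet within acceptance

            // Binned MET spectrum: the analogue of filling a histogram,
            // ready to be collected and turned into a physics plot.
            selected
              .withColumn("metBin", floor(col("met") / 25.0) * 25.0)
              .groupBy("metBin").count()
              .orderBy("metBin")
              .show()

            spark.stop()
          }
        }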