
    A first determination of parton distributions with theoretical uncertainties

    The parton distribution functions (PDFs) which characterize the structure of the proton are currently one of the dominant sources of uncertainty in the predictions for most processes measured at the Large Hadron Collider (LHC). Here we present the first extraction of the proton PDFs that accounts for the missing higher order uncertainty (MHOU) in the fixed-order QCD calculations used in PDF determinations. We demonstrate that the MHOU can be included as a contribution to the covariance matrix used for the PDF fit, and then introduce prescriptions for the computation of this covariance matrix using scale variations. We validate our results at next-to-leading order (NLO) by comparison to the known next order (NNLO) corrections. We then construct variants of the NNPDF3.1 NLO PDF set that include the effect of the MHOU, and assess their impact on the central values and uncertainties of the resulting PDFs.
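The construction described above can be sketched numerically. A common form for a scale-variation covariance matrix is an outer product of the shifts between scale-varied and central predictions; the function and toy numbers below are illustrative (normalisation conventions differ between prescriptions, and the simple 1/N factor here is an assumption for the sketch, not the paper's exact prescription):

```python
import numpy as np

def theory_covariance(central, shifted):
    """Build a theory (MHOU) covariance matrix from scale-varied predictions.

    central : shape (n_data,)        -- central-scale predictions
    shifted : shape (n_var, n_data)  -- one row per scale variation

    S_ij = (1/N) * sum_k Delta_i^(k) * Delta_j^(k),
    with Delta^(k) = shifted^(k) - central.  The 1/N normalisation is
    illustrative; actual prescriptions fix it per variation scheme.
    """
    deltas = shifted - central            # (n_var, n_data)
    return deltas.T @ deltas / len(shifted)

# toy example: 3 data points, 4 scale variations shifting predictions by +-10%-ish
central = np.array([1.0, 2.0, 3.0])
shifted = central + 0.1 * np.array([[1, 1, 1],
                                    [-1, -1, -1],
                                    [1, -1, 1],
                                    [-1, 1, -1]])
S = theory_covariance(central, shifted)
# S is symmetric and positive semi-definite by construction,
# so it can be added directly to an experimental covariance matrix
```

Because S is built as an average of outer products, it is automatically a valid covariance matrix and encodes correlations of the scale shifts across data points.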

    Les Houches 2015: Physics at TeV Colliders Standard Model Working Group Report

    This Report summarizes the proceedings of the 2015 Les Houches workshop on Physics at TeV Colliders. Session 1 dealt with (I) new developments relevant for high precision Standard Model calculations, (II) the new PDF4LHC parton distributions, (III) issues in the theoretical description of the production of Standard Model Higgs bosons and how to relate experimental measurements, (IV) a host of phenomenological studies essential for comparing LHC data from Run I with theoretical predictions and projections for future measurements in Run II, and (V) new developments in Monte Carlo event generators. Comment: Proceedings of the Standard Model Working Group of the 2015 Les Houches Workshop, Physics at TeV Colliders, Les Houches 1-19 June 2015. 227 pages

    Reweighting a parton shower using a neural network: the final-state case

    The use of QCD calculations that include the resummation of soft-collinear logarithms via parton-shower algorithms is currently not possible in PDF fits due to the high computational cost of evaluating observables for each variation of the PDFs. Unfortunately, the interpolation methods that are otherwise applied to overcome this issue are not readily generalised to all-order parton-shower contributions. Instead, we propose an approximation based on training a neural network to predict the effect of varying the input parameters of a parton shower on the cross section in a given observable bin, interpolating between the variations of a training data set. This first publication focuses on providing a proof of principle for the method, by varying the shower dependence on α_S for both a simplified shower model and a complete shower implementation for three different observables: the leading emission scale, the number of emissions, and the Thrust event shape. The extension to the PDF dependence of the initial-state shower evolution that is needed for the application to PDF fits is left to a forthcoming publication. Comment: additional references added in introduction
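The core idea, learning the map from a shower parameter to per-bin predictions from a few expensive training runs and then interpolating, can be illustrated in miniature. The paper trains a neural network; the sketch below swaps in a simple per-bin quadratic fit as the interpolator, and `run_shower` is a hypothetical stand-in for an expensive parton-shower run:

```python
import numpy as np

# alpha_s values at which full "shower runs" are available (training variations)
alphas_train = np.array([0.110, 0.118, 0.126, 0.134])

def run_shower(alpha_s, n_bins=5):
    """Hypothetical stand-in for an expensive parton-shower run:
    returns a smooth per-bin 'cross section' depending on alpha_s."""
    bins = np.arange(n_bins)
    return np.exp(-bins * alpha_s * 10.0)

# training data: one full run per alpha_s variation -> shape (n_var, n_bins)
train = np.array([run_shower(a) for a in alphas_train])

# learn one interpolator per observable bin (quadratic stand-in for the NN)
coeffs = [np.polyfit(alphas_train, train[:, b], 2) for b in range(train.shape[1])]

def predict(alpha_s):
    """Cheap surrogate prediction: evaluate the fitted interpolators."""
    return np.array([np.polyval(c, alpha_s) for c in coeffs])

# prediction at an unseen alpha_s value, without rerunning the shower
approx = predict(0.122)
exact = run_shower(0.122)
```

In a PDF fit the surrogate `predict` would be called at every parameter variation, replacing thousands of shower runs; only the handful of training runs pay the full cost.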

    Parton distributions with theory uncertainties: general formalism and first phenomenological studies

    Abstract: We formulate a general approach to the inclusion of theoretical uncertainties, specifically those related to the missing higher order uncertainty (MHOU), in the determination of parton distribution functions (PDFs). We demonstrate how, under quite generic assumptions, theory uncertainties can be included as an extra contribution to the covariance matrix when determining PDFs from data. We then review, clarify, and systematize the use of renormalization and factorization scale variations as a means to estimate MHOUs consistently in deep inelastic and hadronic processes. We define a set of prescriptions for constructing a theory covariance matrix using scale variations, which can be used in global fits of data from a wide range of different processes, based on choosing a set of independent scale variations suitably correlated within and across processes. We set up an algebraic framework for the choice and validation of an optimal prescription by comparing the estimate of MHOU encoded in the next-to-leading order (NLO) theory covariance matrix to the observed shifts between NLO and NNLO predictions. We perform a NLO PDF determination which includes the MHOU, assess the impact of the inclusion of MHOUs on the PDF central values and uncertainties, and validate the results by comparison to the known shift between NLO and NNLO PDFs. We finally study the impact of the inclusion of MHOUs in a global PDF determination on LHC cross-sections, and provide guidelines for their use in precision phenomenology. In addition, we compare the results based on the theory covariance matrix formalism to those obtained by performing PDF determinations based on different scale choices.
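Operationally, including the theory covariance matrix as "an extra contribution" means the fit's figure of merit uses the sum of experimental and theory covariances. A minimal sketch, with purely illustrative toy numbers:

```python
import numpy as np

def chi2_with_mhou(data, theory, C_exp, S_theory):
    """chi^2 with the MHOU included as an extra covariance contribution:
    chi^2 = (d - t)^T (C + S)^{-1} (d - t)."""
    resid = data - theory
    cov = C_exp + S_theory
    return resid @ np.linalg.solve(cov, resid)

# toy numbers (illustrative only): 2 data points
data   = np.array([1.05, 2.10])
theory = np.array([1.00, 2.00])
C_exp  = np.diag([0.01, 0.04])   # experimental covariance
S_th   = np.diag([0.01, 0.04])   # theory (MHOU) covariance

chi2_c  = chi2_with_mhou(data, theory, C_exp, np.zeros_like(S_th))  # C only
chi2_cs = chi2_with_mhou(data, theory, C_exp, S_th)                 # C + S
# adding S inflates the total covariance, so the chi^2 decreases:
# the fit tolerates data-theory differences of the size of the MHOU
```

This is why including the MHOU generically loosens the pull of data on the PDFs wherever the theory uncertainty is comparable to the experimental one.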

    Parton distributions for the LHC run II

    We present NNPDF3.0, the first set of parton distribution functions (PDFs) determined with a methodology validated by a closure test. NNPDF3.0 uses a global dataset including HERA-II deep-inelastic inclusive cross-sections, the combined HERA charm data, jet production from ATLAS and CMS, vector boson rapidity and transverse momentum distributions from ATLAS, CMS and LHCb, W+c data from CMS and top quark pair production total cross sections from ATLAS and CMS. Results are based on LO, NLO and NNLO QCD theory and also include electroweak corrections. To validate our methodology, we show that PDFs determined from pseudo-data generated from a known underlying law correctly reproduce the statistical distributions expected on the basis of the assumed experimental uncertainties. This closure test ensures that our methodological uncertainties are negligible in comparison to the generic theoretical and experimental uncertainties of PDF determination. This enables us to determine with confidence PDFs at different perturbative orders and using a variety of experimental datasets ranging from HERA-only up to a global set including the latest LHC results, all using precisely the same validated methodology. We explore some of the phenomenological implications of our results for the upcoming 13 TeV Run of the LHC, in particular for Higgs production cross-sections. Comment: 151 pages, 69 figures. More typos corrected: published version
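The closure-test logic described above can be demonstrated on a toy problem: generate pseudo-data from a known law with assumed uncertainties, fit many replicas, and check that the fitted parameters scatter around the truth exactly as the assumed uncertainties predict. NNPDF fits neural-network PDFs; a straight-line law and least-squares fit stand in here purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0.1, 0.9, 20)
true_params = np.array([2.0, -1.0])        # known underlying "law": y = 2 - x
sigma = 0.05                               # assumed data uncertainty

A = np.vstack([np.ones_like(x), x]).T      # design matrix for y = a + b*x

pulls = []
for _ in range(500):
    # pseudo-data replica: known law plus noise of the assumed size
    pseudo = A @ true_params + rng.normal(0.0, sigma, size=x.size)
    fit, *_ = np.linalg.lstsq(A, pseudo, rcond=None)
    cov = sigma**2 * np.linalg.inv(A.T @ A)  # parameter covariance
    pulls.append((fit - true_params) / np.sqrt(np.diag(cov)))

pulls = np.array(pulls)
# closure criterion: pulls should follow a standard normal
# (mean ~ 0, standard deviation ~ 1); a wider or narrower spread would
# signal a methodological bias or mis-estimated uncertainties
```

Passing this check is what the abstract means by reproducing "the statistical distributions expected on the basis of the assumed experimental uncertainties".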

    Report of the Snowmass 2013 energy frontier QCD working group

    This is the summary report of the energy frontier QCD working group prepared for Snowmass 2013. We review the status of tools, both theoretical and experimental, for understanding the strong interactions at colliders. We attempt to prioritize important directions that future developments should take. Most of the efforts of the QCD working group concentrate on proton-proton colliders, at 14 TeV as planned for the next run of the LHC, and at 33 and 100 TeV, possible energies of the colliders that will be necessary to carry on the physics program started at 14 TeV. We also examine QCD predictions and measurements at lepton-lepton and lepton-hadron colliders, and in particular their ability to improve our knowledge of the strong coupling constant and parton distribution functions. Comment: 62 pages, 31 figures, Snowmass community summer study 2013

    An open-source machine learning framework for global analyses of parton distributions

    Abstract: We present the software framework underlying the NNPDF4.0 global determination of parton distribution functions (PDFs). The code is released under an open source licence and is accompanied by extensive documentation and examples. The code base is composed of a PDF fitting package, tools to handle experimental data and to efficiently compare it to theoretical predictions, and a versatile analysis framework. In addition to ensuring the reproducibility of the NNPDF4.0 (and subsequent) determinations, the public release of the NNPDF fitting framework enables a number of phenomenological applications and the production of PDF fits under user-defined data and theory assumptions.