5 research outputs found

    Investigations into the effect of Hadron Collider data on MSTW Parton Distribution Functions

    The latest jet data from the Large Hadron Collider are studied in the context of the MSTW Parton Distribution Functions (PDFs). The effect of recent ATLAS inclusive-jet and dijet results on the current standard PDF sets is investigated and compared and contrasted with similar results from the Tevatron. A parton reweighting technique is used to gain information on the optimum PDFs for each new data set. New PDF sets are produced and studied using the new LHC data; these jet results provide a new central value of the PDFs and reduce the uncertainty on the distributions. Additionally, a new method of parametrising the quark PDFs using Chebyshev polynomials is tested against the ATLAS W/Z rapidity data, which are described poorly by the standard PDFs. The effect of parton showering on jet physics is studied using various shower Monte Carlo generators in the context of jets produced in deep inelastic scattering, and the possibility of using charged-current jet production in PDF fits is tested.
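    As a rough illustration of the reweighting idea mentioned in the abstract (a minimal sketch, not the thesis's implementation), the snippet below assigns Giele-Keller/NNPDF-style weights to an ensemble of PDF replicas according to their chi-squared against a new data set; the replica chi-squared values used here are hypothetical placeholders.

    import numpy as np

    def reweight_replicas(chi2, n_data):
        """Bayesian reweighting of PDF replicas with new data.

        chi2   : chi^2 of each replica against the new data set
        n_data : number of points in the new data set
        Weights follow w_k ~ chi2_k**((n-1)/2) * exp(-chi2_k/2).
        """
        log_w = 0.5 * (n_data - 1) * np.log(chi2) - 0.5 * chi2
        log_w -= log_w.max()               # work in log space to avoid overflow
        w = np.exp(log_w)
        return w / w.sum()

    # hypothetical usage: 1000 replicas confronted with 90 new jet data points
    chi2 = np.random.chisquare(90, size=1000)       # placeholder chi^2 values
    weights = reweight_replicas(chi2, n_data=90)
    # effective number of replicas after reweighting (exp of the weight entropy)
    n_eff = np.exp(-np.sum(weights * np.log(weights)))

    Reweighted predictions are then weighted averages over replicas, and a small effective number of replicas signals that the new data carry substantial information (or tension) relative to the prior set.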

    aMCfast: automation of fast NLO computations for PDF fits

    We present the interface between MadGraph5 aMC@NLO, a self-contained program that calculates cross sections up to next-to-leading-order accuracy in an automated manner, and APPLgrid, a code that parametrises such cross sections in the form of look-up tables which can be used for the fast computations needed in the context of PDF fits. The main characteristic of this interface, which we dub aMCfast, is that it is fully automated as well: it removes the need to extract the process-specific information manually for additional physics processes, as is the case with other matrix-element calculators, and makes it straightforward to include any new process in PDF fits. We demonstrate this by studying several cases which are easily measured at the LHC and have good constraining power on PDFs, some of which were previously unavailable in the form of a fast interface.
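    The look-up-table idea can be sketched as follows (a schematic toy with hypothetical array and function names, not the actual APPLgrid interface): the NLO code fills a grid of weights once, after which any candidate PDF set can be convolved with the grid without rerunning the full calculation.

    import numpy as np

    def convolve_grid(weights, x_nodes, q2_nodes, pdf_a, pdf_b, alpha_s, power):
        """Schematic fast convolution of a pre-filled weight grid with PDFs.

        weights : array of shape (n_x, n_x, n_q2), filled once by the NLO run
        pdf_a/b : callables f(x, Q2) giving the two incoming parton densities
        alpha_s : callable alpha_s(Q2); 'power' is the order in the coupling
        """
        sigma = 0.0
        for ia, xa in enumerate(x_nodes):
            for ib, xb in enumerate(x_nodes):
                for iq, q2 in enumerate(q2_nodes):
                    sigma += (weights[ia, ib, iq]
                              * pdf_a(xa, q2) * pdf_b(xb, q2)
                              * alpha_s(q2) ** power)
        return sigma

    In a real grid there is one such weight table per perturbative order and per partonic subprocess, and interpolation in x and Q2 replaces the fixed nodes used here; the point is only that the expensive matrix-element evaluation happens once, while each PDF variation costs a cheap sum.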

    Parton distributions for the LHC run II

    We present NNPDF3.0, the first set of parton distribution functions (PDFs) determined with a methodology validated by a closure test. NNPDF3.0 uses a global dataset including HERA-II deep-inelastic inclusive cross-sections, the combined HERA charm data, jet production from ATLAS and CMS, vector boson rapidity and transverse momentum distributions from ATLAS, CMS and LHCb, W+c data from CMS, and top quark pair production total cross-sections from ATLAS and CMS. Results are based on LO, NLO and NNLO QCD theory and also include electroweak corrections. To validate our methodology, we show that PDFs determined from pseudo-data generated from a known underlying law correctly reproduce the statistical distributions expected on the basis of the assumed experimental uncertainties. This closure test ensures that our methodological uncertainties are negligible in comparison to the generic theoretical and experimental uncertainties of PDF determination. This enables us to determine with confidence PDFs at different perturbative orders and using a variety of experimental datasets, ranging from HERA-only up to a global set including the latest LHC results, all with precisely the same validated methodology. We explore some of the phenomenological implications of our results for the upcoming 13 TeV run of the LHC, in particular for Higgs production cross-sections.
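    The closure-test logic can be illustrated with a toy example (a sketch under simplified assumptions, not the NNPDF fitting code): pseudo-data are generated from a known "true" law with the assumed experimental noise, a flexible parametrisation is fitted back, and the deviations of the fit from the truth are checked against the fit's own uncertainty estimate.

    import numpy as np

    rng = np.random.default_rng(0)

    def true_law(x):
        # known underlying law used to generate the pseudo-data (a toy choice)
        return x * (1.0 - x) ** 3

    x = np.linspace(0.05, 0.9, 40)
    sigma = 0.05 * true_law(x) + 1e-3          # assumed experimental uncertainties

    pulls = []
    for _ in range(200):                       # many independent pseudo-experiments
        data = true_law(x) + rng.normal(0.0, sigma)
        # fit a flexible toy parametrisation (degree-4 polynomial), keep its covariance
        coeffs, cov = np.polyfit(x, data, deg=4, w=1.0 / sigma, cov="unscaled")
        fit = np.polyval(coeffs, x)
        jac = np.vander(x, 5)                  # d(fit)/d(coeffs), highest power first
        fit_err = np.sqrt(np.einsum("ij,jk,ik->i", jac, cov, jac))
        pulls.extend((fit - true_law(x)) / fit_err)

    # closure criterion: pulls compatible with N(0,1) means the uncertainty
    # estimate is faithful and methodological bias is negligible
    print(np.mean(pulls), np.std(pulls))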

    Incremental grouping of image elements in vision

    One important task for the visual system is to group image elements that belong to an object and to segregate them from other objects and the background. Here we present an incremental grouping theory (IGT) that addresses the role of object-based attention in perceptual grouping at a psychological level and, at the same time, outlines the mechanisms for grouping at the neurophysiological level. The IGT proposes that there are two processes for perceptual grouping. The first process is base grouping, which relies on neurons that are tuned to feature conjunctions. Base grouping is fast and occurs in parallel across the visual scene, but not all possible feature conjunctions can be coded as base groupings. If there are no neurons tuned to the relevant feature conjunctions, a second process called incremental grouping comes into play. Incremental grouping is a time-consuming and capacity-limited process that requires the gradual spread of enhanced neuronal activity across the representation of an object in the visual cortex. The spread of enhanced neuronal activity corresponds to the labeling of image elements with object-based attention.
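    As a loose computational analogy to incremental grouping (an illustrative sketch only, not the IGT model itself), one can picture enhanced activity spreading step by step from an attended seed element to connected neighbours with similar features, so that larger objects take more iterations to label:

    import numpy as np
    from collections import deque

    def incremental_grouping(feature_map, seed, tol=0.1):
        """Toy spreading of an attentional label from a seed image element.

        feature_map : 2D array of local feature values (e.g. orientation)
        seed        : (row, col) of the attended element (the base grouping)
        tol         : maximum feature difference for neighbours to be grouped
        Returns a boolean mask of the grouped elements.
        """
        rows, cols = feature_map.shape
        labeled = np.zeros((rows, cols), dtype=bool)
        labeled[seed] = True
        frontier = deque([seed])
        while frontier:                      # activity spreads one step per element
            r, c = frontier.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols and not labeled[nr, nc]
                        and abs(feature_map[nr, nc] - feature_map[r, c]) <= tol):
                    labeled[nr, nc] = True   # enhanced activity reaches the neighbour
                    frontier.append((nr, nc))
        return labeled

    The serial, capacity-limited character of incremental grouping shows up here as run time growing with the size of the attended object, whereas the base-grouping step (the seed) is available immediately.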