Physics Analysis Expert PAX: First Applications
PAX (Physics Analysis Expert) is a novel, C++ based toolkit designed to assist teams working on particle physics data analysis. The core of PAX consists of event interpretation containers, which hold relevant information about, and possible interpretations of, a physics event. By providing this level of abstraction beyond the results of the detector reconstruction programs, PAX facilitates the construction and use of modern analysis factories. The class structure and user command syntax of PAX are set up to support expert teams as well as newcomers in preparing for the challenges expected to arise in the data analysis at future hadron colliders.
Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics conference (CHEP03), La Jolla, CA, USA, March 2003. 7 pages, LaTeX, 10 eps figures. PSN THLT00
Challenges of the LHC Computing Grid by the CMS experiment
This document summarises the status of the existing grid infrastructure and functionality for the high-energy physics experiment CMS, as well as the operational expertise attained during the so-called "Computing, Software and Analysis Challenge" performed in 2006 (CSA06). The report focuses in particular on the role of the participating German computing centres located at Karlsruhe, Hamburg and Aachen.
CMS Software Distribution on the LCG and OSG Grids
The efficient exploitation of the worldwide distributed storage and computing resources available in the grids requires a robust, transparent and fast deployment of experiment-specific software. The approach followed by the CMS experiment at CERN to enable Monte-Carlo simulations, data analysis and software development in an international collaboration is presented. The current status and future improvement plans are described.
Comment: 4 pages, 1 figure, LaTeX with hyperref
High precision fundamental constants at the TeV scale
This report summarizes the proceedings of the 2014 Mainz Institute for Theoretical Physics (MITP) scientific program on "High precision fundamental constants at the TeV scale". The two outstanding Standard Model parameters dealt with during the MITP scientific program are the strong coupling constant αs and the top-quark mass mt. Imprecise knowledge of the values of these fundamental constants is often the limiting factor in the accuracy of theoretical predictions. The current status of αs(MZ) and mt has been reviewed and directions for future research have been identified.
Comment: 57 pages, 24 figures, pdflatex
High-precision αs measurements from LHC to FCC-ee
This document provides a writeup of all contributions to the workshop on "High precision measurements of αs: From LHC to FCC-ee" held at CERN, Oct. 12--13, 2015. The workshop explored in depth the latest developments on the determination of the QCD coupling αs from 15 methods where high precision measurements are (or will be) available. Those include low-energy observables: (i) lattice QCD, (ii) pion decay factor, (iii) quarkonia and (iv) τ decays, (v) soft parton-to-hadron fragmentation functions, as well as high-energy observables: (vi) global fits of parton distribution functions, (vii) hard parton-to-hadron fragmentation functions, (viii) jets in e±p DIS and γ-p photoproduction, (ix) the photon structure function in γ-γ collisions, (x) event shapes and (xi) jet cross sections in e+e− collisions, (xii) W boson and (xiii) Z boson decays, and (xiv) jets and (xv) top-quark cross sections in proton-(anti)proton collisions. The current status of the theoretical and experimental uncertainties associated with each extraction method, the improvements expected from LHC data in the coming years, and the future perspectives achievable in e+e− collisions at the Future Circular Collider (FCC-ee), with (1--100 ab^-1) integrated luminosities yielding 10^12 Z bosons and jets and 10^8 W bosons and τ leptons, are thoroughly reviewed. The current uncertainty of the (preliminary) 2015 strong coupling world-average value, αs(MZ) = 0.1177 ± 0.0013, is about 1%. Some participants believed this may be reduced by a factor of three in the near future by including novel high-precision observables, although this opinion was not universally shared. At the FCC-ee facility, a factor of ten reduction in the αs uncertainty should be possible, mostly thanks to the huge Z and W data samples available.
Comment: 135 pages, 56 figures. CERN-PH-TH-2015-299, CoEPP-MN-15-13. This document is dedicated to the memory of Guido Altarelli.
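The world-average value quoted above combines many independent αs(MZ) determinations. As a rough illustration of such a combination (the real world-average procedure uses method pre-averages and correlated uncertainties; the input values below are invented for the sketch), an inverse-variance weighted mean can be computed as:

```python
import math

def weighted_average(values, errors):
    """Inverse-variance weighted mean and its combined uncertainty."""
    weights = [1.0 / e**2 for e in errors]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    return mean, 1.0 / math.sqrt(total)

# Hypothetical alpha_s(MZ) determinations from different extraction methods
values = [0.1184, 0.1170, 0.1181, 0.1173]
errors = [0.0012, 0.0019, 0.0011, 0.0016]

mean, err = weighted_average(values, errors)
print(f"alpha_s(MZ) = {mean:.4f} +/- {err:.4f}")
```

The combined uncertainty is always smaller than the best single input, which is why adding novel high-precision observables can shrink the average.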
Calculations for deep inelastic scattering using fast interpolation grid techniques at NNLO in QCD and the extraction of αs from HERA data
The extension of interpolation-grid frameworks for perturbative QCD calculations at next-to-next-to-leading order (NNLO) is presented for deep inelastic scattering (DIS) processes. A fast and flexible evaluation of higher-order predictions for any a posteriori choice of parton distribution functions (PDFs) or value of the strong coupling constant αs is essential in iterative fitting procedures to extract PDFs and Standard Model parameters, as well as for a detailed study of the scale dependence. The APPLfast project, described here, provides a generic interface between the parton-level Monte Carlo program NNLOjet and both the APPLgrid and fastNLO libraries for the production of interpolation grids at NNLO accuracy. Details of the interface for DIS processes are presented together with the required interpolation grids at NNLO, which are made available. They cover numerous inclusive jet measurements by the H1 and ZEUS experiments at HERA. An extraction of the strong coupling constant is performed as an application of the use of such grids, and a best-fit value of αs(MZ) = 0.1170 (15)exp (25)th is obtained using the HERA inclusive jet cross section data.
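The a-posteriori evaluation described above rests on storing the perturbative coefficients once, on an interpolation grid, so that a cross section can be re-assembled for any PDF or αs choice without re-running the Monte Carlo. A minimal toy sketch of the idea (the weight table, PDF and αs below are invented stand-ins; real APPLgrid/fastNLO tables are multi-dimensional in x, the scale and the αs power, and PDFs would come from a library such as LHAPDF):

```python
# Toy a-posteriori convolution: sigma = sum_ij w_ij * alpha_s(mu_j)^p * f(x_i, mu_j)
# The weight table would be filled once by the Monte Carlo run; here it is invented.

x_nodes = [0.01, 0.05, 0.1, 0.3]   # PDF interpolation nodes in x
mu_nodes = [10.0, 50.0, 100.0]     # scale nodes in GeV
power = 2                          # alpha_s power of this contribution

# weights[i][j]: precomputed perturbative coefficient at node (x_i, mu_j)
weights = [[0.8, 0.5, 0.3],
           [1.2, 0.9, 0.6],
           [0.7, 0.4, 0.2],
           [0.1, 0.05, 0.02]]

def toy_pdf(x, mu):
    """Invented stand-in for a real PDF evaluation."""
    return (1.0 - x) ** 3 / x

def toy_alpha_s(mu):
    """Invented stand-in for the running coupling; not a real alpha_s."""
    return 0.118 / (1.0 + 0.02 * (mu / 91.2 - 1.0))

def convolve(weights, alpha_s, pdf):
    """Cheap re-evaluation of the cross section for any PDF / alpha_s choice."""
    sigma = 0.0
    for i, x in enumerate(x_nodes):
        for j, mu in enumerate(mu_nodes):
            sigma += weights[i][j] * alpha_s(mu) ** power * pdf(x, mu)
    return sigma

sigma = convolve(weights, toy_alpha_s, toy_pdf)
```

Swapping in a different PDF or αs only repeats the cheap sum, which is what makes iterative PDF and αs fits against such grids feasible.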
NNLO interpolation grids for jet production at the LHC
Fast interpolation-grid frameworks facilitate an efficient and flexible evaluation of higher-order predictions for any choice of parton distribution functions or value of the strong coupling αs. They constitute an essential tool for the extraction of parton distribution functions and Standard Model parameters, as well as for studies of the dependence of cross sections on the renormalisation and factorisation scales. The APPLfast project provides a generic interface between the parton-level Monte Carlo generator NNLOjet and both the APPLgrid and the fastNLO libraries for the grid interpolation. The extension of the project to include hadron–hadron collider processes at next-to-next-to-leading order in perturbative QCD is presented, together with an application for jet production at the LHC.
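An αs extraction of the kind these entries describe can be caricatured as a χ² scan: the grid delivers the theory prediction as a function of αs, and the best-fit value minimises the χ² against the measured cross sections. A toy sketch with invented pseudo-data and an invented theory model (real fits use grid-based NNLO predictions, correlated uncertainties and theory errors):

```python
# Toy chi2 scan for alpha_s: theory(alpha_s) per bin vs pseudo-data.

def theory(alpha_s):
    """Invented stand-in for a grid-based prediction in two cross-section bins."""
    return [100.0 * alpha_s / 0.118, 40.0 * (alpha_s / 0.118) ** 2]

data = [101.0, 39.5]      # pseudo-measurements
data_err = [2.0, 1.5]     # uncorrelated uncertainties

def chi2(alpha_s):
    t = theory(alpha_s)
    return sum(((d - ti) / e) ** 2 for d, ti, e in zip(data, t, data_err))

# Scan a range of alpha_s values and keep the minimum
scan = [0.110 + 0.0005 * k for k in range(33)]  # 0.110 .. 0.126
best = min(scan, key=chi2)
```

In practice the minimisation is done with a proper optimiser and the experimental and theoretical uncertainties are propagated to the quoted (15)exp (25)th style error budget.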
New parton distributions in fixed flavour factorization scheme from recent deep-inelastic-scattering data
We present our QCD analysis of the proton structure function F2(x,Q^2) to determine the parton distributions at next-to-leading order (NLO). The heavy quark contributions to F2^i(x,Q^2), with i = c, b, have been included in the framework of the 'fixed flavour number scheme' (FFNS). The results obtained in the FFNS are compared with available results such as the general-mass variable-flavour-number scheme (GM-VFNS) and other prescriptions used in global fits of PDFs. In the present QCD analysis, we use a wide range of inclusive neutral-current deep-inelastic-scattering (NC DIS) data, including the most recent data for the charm F2^c and bottom F2^b structure functions, the longitudinal structure function FL, and also the reduced DIS cross sections from HERA experiments. The most recent HERMES data for the proton and deuteron structure functions are also added. We take into account ZEUS neutral-current DIS inclusive jet cross section data from HERA together with the recent Tevatron Run-II inclusive jet cross section data from CDF and DØ. The impact of these recent DIS data on the PDFs extracted from the global fits is studied. We present two families of PDFs, KKT12 and KKT12C, without and with the HERA 'combined' DIS data sets. We find these to be in good agreement with the available theoretical models.
Comment: 23 pages, 26 figures and 4 tables. V3: only a few comments and references added in the replaced version, results unchanged. Code can be found at http://particles.ipm.ir/links/QCD.ht
Tevatron-for-LHC Report of the QCD Working Group
The experiments at Run 2 of the Tevatron have each accumulated over 1 inverse femtobarn of high-transverse-momentum data. Such a dataset allows for the first precision tests of QCD at a hadron collider, i.e. comparisons between theory and experiment at the few-percent level. While the Large Hadron Collider has been designed as a discovery machine, basic QCD analyses will still need to be performed to understand the working environment. The Tevatron-for-LHC workshop was conceived as a communication link to pass on the expertise of the Tevatron and to test new analysis ideas coming from the LHC community. The TeV4LHC QCD Working Group focussed on important aspects of QCD at hadron colliders: jet definitions, extraction and use of parton distribution functions, the underlying event, Monte Carlo tunes, and diffractive physics. This report summarizes some of the results achieved during this workshop.
Comment: 156 pages, Tevatron-for-LHC Conference Report of the QCD Working Group