
    ATLAS and CMS applications on the WorldGrid testbed

    WorldGrid is an intercontinental testbed spanning Europe and the US, integrating architecturally different Grid implementations based on the Globus toolkit. It has been developed in the context of the DataTAG and iVDGL projects and successfully demonstrated during the WorldGrid demos at IST2002 (Copenhagen) and SC2002 (Baltimore). Two HEP experiments, ATLAS and CMS, successfully exploited the WorldGrid testbed for executing jobs simulating the response of their detectors to physics events produced by the real collisions expected at the LHC accelerator starting from 2007. This data-intensive activity has been run for many years on local dedicated computing farms consisting of hundreds of nodes and Terabytes of disk and tape storage. Within the WorldGrid testbed, for the first time HEP simulation jobs were submitted and run interchangeably on US and European resources, despite their different underlying Grid implementations. The jobs produced data which could be retrieved and further analysed on the submitting machine, or simply stored on the remote resources and registered in a Replica Catalogue, which made them available to the Grid for further processing. In this contribution we describe the job submission from Europe for both ATLAS and CMS applications, performed through the GENIUS portal operating on top of an EDG User Interface submitting to an EDG Resource Broker. We point out the chosen interoperability solutions which made US and European resources equivalent from the applications' point of view, the data management in the WorldGrid environment, and the CMS-specific production tools which were interfaced to the GENIUS portal. Comment: Poster paper from the 2003 Computing in High Energy and Nuclear Physics (CHEP03), La Jolla, CA, USA, March 2003, 10 pages, PDF. PSN TUCP004; added credit to funding agencies
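
    As an illustration of the submission path just described, here is a minimal sketch of what the GENIUS portal effectively did on the user's behalf: write a JDL job description and hand it to the EDG Resource Broker from an EDG User Interface. The wrapper script and sandbox file names are hypothetical placeholders, and real WorldGrid jobs set many more attributes.

        # Minimal sketch: submitting a simulation job from an EDG User
        # Interface to the EDG Resource Broker. File names are hypothetical.
        import subprocess
        import textwrap

        jdl = textwrap.dedent("""\
            Executable    = "simulate.sh";
            Arguments     = "run01";
            StdOutput     = "std.out";
            StdError      = "std.err";
            InputSandbox  = {"simulate.sh"};
            OutputSandbox = {"std.out", "std.err"};
        """)

        with open("job.jdl", "w") as f:
            f.write(jdl)

        # The Resource Broker matches the job to a suitable Computing
        # Element, US or European, transparently to the user.
        subprocess.run(["edg-job-submit", "--vo", "cms", "job.jdl"], check=True)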

    The Grid-distributed data analysis in CMS

    The CMS experiment will soon produce a huge amount of data (a few PBytes per year) that will be distributed and stored in many computing centres spread across the countries participating in the collaboration. The data will be available to all CMS physicists, thanks to the services provided by the supported Grids. CRAB is the tool developed by the CMS collaboration to allow physicists to access and analyze data stored at sites world-wide. It aims to simplify the data discovery process and the job creation, execution, and monitoring tasks, hiding the details of both the Grid infrastructures and the CMS analysis framework. We discuss the recent evolution of this tool from its standalone version to the client-server architecture adopted for particularly challenging workload volumes, and we report usage statistics collected from the CRAB community, which so far comprises almost 600 distinct users.
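
    To make the CRAB workflow concrete, the sketch below drives the standalone CRAB client from Python: a minimal crab.cfg is generated, and the usual create/submit/status cycle is run through the crab command-line tool. The dataset path and parameter-set file are hypothetical, and real configurations set many more options.

        # Sketch of driving CRAB (standalone mode). Dataset path and
        # parameter-set file are hypothetical placeholders.
        import configparser
        import subprocess

        cfg = configparser.ConfigParser()
        cfg["CRAB"] = {"jobtype": "cmssw", "scheduler": "glite"}
        cfg["CMSSW"] = {
            "datasetpath": "/Primary/Processed/RECO",  # hypothetical dataset
            "pset": "analysis_cfg.py",                 # CMSSW configuration
            "total_number_of_events": "100000",
            "events_per_job": "10000",
        }
        cfg["USER"] = {"return_data": "1"}

        with open("crab.cfg", "w") as f:
            cfg.write(f)

        # CRAB splits the dataset into jobs, submits them to the Grid and
        # tracks them; the user never touches the middleware directly.
        subprocess.run(["crab", "-create"], check=True)
        subprocess.run(["crab", "-submit"], check=True)
        subprocess.run(["crab", "-status"], check=True)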

    Distributed Computing Grid Experiences in CMS

    The CMS experiment is currently developing a computing system capable of serving, processing and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004 CMS undertook a large-scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of the startup rate. Its goals were: to run CMS event reconstruction at CERN for a sustained period at a 25 Hz input rate; to distribute the data to several regional centers; and to enable data access at those centers for analysis. Grid middleware was utilized to help complete all aspects of the challenge. To continue to provide scalable access from anywhere in the world to the data, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure and the current development of the CMS analysis system.
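
    The 25 Hz sustained rate translates into a substantial daily event count and data volume; the quick estimate below gives a sense of the scale. The event size is an illustrative assumption, not a figure quoted in the paper.

        # Back-of-envelope for a sustained 25 Hz reconstruction rate.
        # The 1.5 MB/event figure is an assumption for illustration only.
        rate_hz = 25                 # sustained input rate
        event_size_mb = 1.5          # assumed event size (illustrative)
        seconds_per_day = 86_400

        events_per_day = rate_hz * seconds_per_day
        volume_tb_per_day = events_per_day * event_size_mb / 1e6

        print(f"{events_per_day:.2e} events/day")           # ~2.16e+06
        print(f"~{volume_tb_per_day:.1f} TB/day to move")   # ~3.2 TB/day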

    Use of the gLite-WMS in CMS for production and analysis

    The CMS experiment at the LHC started using the Resource Broker (from the EDG and LCG projects) to submit Monte Carlo production and analysis jobs to distributed computing resources of the WLCG infrastructure over 6 years ago. Since 2006 the gLite Workload Management System (WMS) and Logging & Bookkeeping (LB) have been used. The interaction with the gLite-WMS/LB happens through the CMS production and analysis frameworks, respectively ProdAgent and CRAB, via a common component, BOSSLite. The important improvements recently made in the gLite-WMS/LB, as well as in the CMS tools, and the intrinsic independence of different WMS/LB instances allow CMS to reach the stability and scalability needed for LHC operations. In particular, the use of a multi-threaded approach in BOSSLite significantly increased the scalability of the system. In this work we present the operational setup of CMS production and analysis based on the gLite-WMS, and the performance obtained in past data challenges, in daily Monte Carlo production, and in user analysis in the experiment.
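
    The multi-threaded idea can be illustrated with a short sketch: a pool of worker threads spreads bulk submissions over several independent WMS endpoints, so no single instance limits throughput. The endpoint names and the submit_to_wms() helper are hypothetical stand-ins, not BOSSLite's actual interfaces.

        # Sketch of multi-threaded submission over independent WMS
        # instances; endpoints and submit_to_wms() are hypothetical.
        from concurrent.futures import ThreadPoolExecutor
        import itertools

        WMS_ENDPOINTS = ["wms1.example.org", "wms2.example.org", "wms3.example.org"]

        def submit_to_wms(endpoint: str, job: str) -> str:
            # Placeholder: a real implementation would contact this
            # endpoint's submission service and return the Grid job id.
            return f"https://{endpoint}:9000/{job}"

        jobs = [f"job{i:04d}" for i in range(1000)]
        endpoints = itertools.cycle(WMS_ENDPOINTS)  # round-robin over instances

        with ThreadPoolExecutor(max_workers=10) as pool:
            job_ids = list(pool.map(submit_to_wms, endpoints, jobs))

        print(f"submitted {len(job_ids)} jobs")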

    Search for the standard model Higgs boson in the H to ZZ to 2l 2nu channel in pp collisions at sqrt(s) = 7 TeV

    A search for the standard model Higgs boson in the H to ZZ to 2l 2nu decay channel, where l = e or mu, in pp collisions at a center-of-mass energy of 7 TeV is presented. The data were collected at the LHC, with the CMS detector, and correspond to an integrated luminosity of 4.6 inverse femtobarns. No significant excess is observed above the background expectation, and upper limits are set on the Higgs boson production cross section. The presence of the standard model Higgs boson with a mass in the 270-440 GeV range is excluded at 95% confidence level. Comment: Submitted to JHEP

    Search for New Physics with Jets and Missing Transverse Momentum in pp collisions at sqrt(s) = 7 TeV

    A search for new physics is presented based on an event signature of at least three jets accompanied by large missing transverse momentum, using a data sample corresponding to an integrated luminosity of 36 inverse picobarns collected in proton-proton collisions at sqrt(s) = 7 TeV with the CMS detector at the LHC. No excess of events is observed above the expected standard model backgrounds, which are all estimated from the data. Exclusion limits are presented for the constrained minimal supersymmetric extension of the standard model. Cross section limits are also presented using simplified models with new particles decaying to an undetected particle and one or two jets.

    Measurement of the Z/gamma* + b-jet cross section in pp collisions at 7 TeV

    The production of b jets in association with a Z/gamma* boson is studied using proton-proton collisions delivered by the LHC at a centre-of-mass energy of 7 TeV and recorded by the CMS detector. The inclusive cross section for Z/gamma* + b-jet production is measured in a sample corresponding to an integrated luminosity of 2.2 inverse femtobarns. The Z/gamma* + b-jet cross section with Z/gamma* to ll (where ll = ee or mu mu), for events with an invariant mass 60 < M(ll) < 120 GeV, at least one b jet at the hadron level with pT > 25 GeV and abs(eta) < 2.1, and a separation between the leptons and the jets of Delta R > 0.5, is found to be 5.84 +/- 0.08 (stat.) +/- 0.72 (syst.) +0.25/-0.55 (theory) pb. The kinematic properties of the events are also studied and found to be in agreement with the predictions of the MadGraph event generator, with the parton shower and hadronisation performed by PYTHIA. Comment: Submitted to the Journal of High Energy Physics
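
    For reference, the Delta R separation used in the lepton-jet requirement above is the standard distance in eta-phi space; a minimal implementation:

        # Delta R = sqrt(Delta eta^2 + Delta phi^2), with the phi
        # difference wrapped into [-pi, pi].
        import math

        def delta_r(eta1: float, phi1: float, eta2: float, phi2: float) -> float:
            deta = eta1 - eta2
            dphi = math.remainder(phi1 - phi2, 2 * math.pi)  # wrap to [-pi, pi]
            return math.hypot(deta, dphi)

        # Example: this lepton-jet pair passes the Delta R > 0.5 requirement.
        print(delta_r(0.3, 1.2, -0.2, 0.4) > 0.5)  # True (~0.94)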

    Performance of the CMS Cathode Strip Chambers with Cosmic Rays

    The Cathode Strip Chambers (CSCs) constitute the primary muon tracking device in the CMS endcaps. Their performance has been evaluated using data taken during a cosmic ray run in fall 2008. Measured noise levels are low, with the number of noisy channels well below 1%. The coordinate resolution was measured for all types of chambers and falls in the range of 47 to 243 microns. The efficiencies for local charged-track triggers and for hit and segment reconstruction were measured, and all are above 99%. The timing resolution per layer is approximately 5 ns.

    Compressed representation of a partially defined integer function over multiple arguments

    In OLAP (OnLine Analytical Processing), data are analysed in an n-dimensional cube. The cube may be represented as a partially defined function over n arguments. Since the function is often not defined everywhere, we ask: is there a known way of representing the function, or the points in which it is defined, in a more compact manner than the trivial one?
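
    A toy example of the setting: the trivial representation stores each defined point of the cube explicitly, for instance in a mapping keyed by the coordinate tuple, and the question is whether one can do better. The dimension names and values below are invented for illustration.

        # Trivial sparse representation of a partially defined integer
        # function over 3 arguments: store only the defined cells.
        cube = {
            # (store_id, product_id, day) -> sales
            (1, 42, 7): 130,
            (1, 42, 8): 95,
            (3, 17, 7): 12,
        }

        def lookup(point):
            """Return the value at `point`, or None where undefined."""
            return cube.get(point)

        print(lookup((1, 42, 7)))  # 130
        print(lookup((2, 99, 1)))  # None: cell not defined
        # A dense 100 x 1000 x 365 array would need 36.5M cells; here we
        # store 3. The paper asks whether one can be more compact still.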

    X-ray emission from the Sombrero galaxy: discrete sources

    We present a study of discrete X-ray sources in and around the bulge-dominated, massive Sa galaxy Sombrero (M104), based on new and archival Chandra observations with a total exposure of ~200 ks. With a detection limit of L_X = 1E37 erg/s and a field of view covering a galactocentric radius of ~30 kpc (11.5 arcminutes), 383 sources are detected. Cross-correlation with Spitler et al.'s catalogue of Sombrero globular clusters (GCs), identified from HST/ACS observations, reveals 41 X-ray sources in GCs, presumably low-mass X-ray binaries (LMXBs). We quantify the differential luminosity functions (LFs) for both the detected GC and field LMXBs, whose power-law indices (~1.1 for the GC LF and ~1.6 for the field LF) are consistent with previous studies of elliptical galaxies. With the precise sky positions of the GCs without a detected X-ray source, we further quantify, through a fluctuation analysis, the GC LF at fainter luminosities down to 1E35 erg/s. The derived index rules out a faint-end slope flatter than 1.1 at 2 sigma significance, contrary to recent findings in several elliptical galaxies and the bulge of M31. On the other hand, the 2-6 keV unresolved emission places a tight constraint on the field LF, implying a flattened index of ~1.0 below 1E37 erg/s. We also detect 101 sources in the halo of Sombrero. These sources cannot all be interpreted as galactic LMXBs, whose spatial distribution empirically follows the starlight. Their number is also higher than the expected number of cosmic AGNs (52+/-11 [1 sigma]), whose surface density is constrained by deep X-ray surveys. We suggest that either the cosmic X-ray background is unusually high in the direction of Sombrero, or a distinct population of X-ray sources is present in the halo of Sombrero. Comment: 11 figures, 5 tables, ApJ in press
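
    The differential luminosity functions quoted above have the power-law form dN/dL proportional to L^-alpha, so the expected number of sources in a luminosity interval follows by direct integration. In the small sketch below the normalisation k is invented; only the indices come from the abstract.

        # Expected source counts from a power-law differential LF,
        # dN/dL = k * L**-alpha, integrated from l_min to l_max.
        def n_expected(k, alpha, l_min, l_max):
            """Closed-form integral of k * L**-alpha (valid for alpha != 1)."""
            return k / (1 - alpha) * (l_max**(1 - alpha) - l_min**(1 - alpha))

        # Field LMXBs (alpha ~ 1.6) vs. GC LMXBs (alpha ~ 1.1); luminosities
        # in units of 1e37 erg/s, k chosen arbitrarily for the example.
        print(n_expected(k=100, alpha=1.6, l_min=1.0, l_max=100.0))
        print(n_expected(k=100, alpha=1.1, l_min=1.0, l_max=100.0))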