Performance of R-GMA for monitoring grid jobs for CMS data production
High energy physics experiments, such as the Compact Muon Solenoid (CMS) at the CERN laboratory in Geneva, have large-scale data processing requirements, with data accumulating at a rate of 1 Gbyte/s. This load comfortably exceeds any previous processing requirements and we believe it may be most efficiently satisfied through grid computing. Furthermore, the production of large quantities of Monte Carlo simulated data provides an ideal test bed for grid technologies and will drive their development. One important challenge when using the grid for data analysis is the ability to transparently monitor the large number of jobs that are being executed simultaneously at multiple remote sites. R-GMA is a monitoring and information management service for distributed resources based on the grid monitoring architecture of the Global Grid Forum. We have previously developed a system allowing us to test its performance under a heavy load while using few real grid resources. We present the latest results on this system running on the LCG 2 grid test bed using the LCG 2.6.0 middleware release. For a sustained load equivalent to 7 generations of 1000 simultaneous jobs, R-GMA was able to transfer all published messages and store them in a database for 98% of the individual jobs. The failures experienced were at the remote sites, rather than at the archiver's MON box as had been expected.
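The monitoring pattern described above, where each grid job publishes status messages that an archiver collects into a database, follows the Global Grid Forum's producer/consumer model. The sketch below illustrates that pattern only; the class names (JobMonitor, Archiver) and the in-memory list standing in for the R-GMA transport are hypothetical, not the real R-GMA API, which is relational and registry-mediated.

```python
# Minimal sketch of the producer/consumer monitoring pattern
# (hypothetical names; not the actual R-GMA interfaces).
import sqlite3

class JobMonitor:
    """Producer: one per grid job, publishes status tuples."""
    def __init__(self, stream, job_id):
        self.stream, self.job_id = stream, job_id
    def publish(self, status):
        self.stream.append((self.job_id, status))

class Archiver:
    """Consumer: stores published tuples in a database,
    loosely analogous to the archiver's MON box role."""
    def __init__(self):
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE job_status (job_id TEXT, status TEXT)")
    def consume(self, stream):
        self.db.executemany("INSERT INTO job_status VALUES (?, ?)", stream)
        self.db.commit()
        stream.clear()

stream = []  # stands in for the transport between remote sites and archiver
for i in range(3):
    JobMonitor(stream, f"job-{i}").publish("RUNNING")

archiver = Archiver()
archiver.consume(stream)
n = archiver.db.execute("SELECT COUNT(*) FROM job_status").fetchone()[0]
print(n)  # 3 tuples archived
```

The scalability question studied in the paper is how many such producers can publish concurrently before messages are lost between publication and archiving.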
Scalability tests of R-GMA-based grid job monitoring system for CMS Monte Carlo data production
Copyright © 2004 IEEE. High-energy physics experiments, such as the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC), have large-scale data processing computing requirements. The grid has been chosen as the solution. One important challenge when using the grid for large-scale data processing is the ability to monitor the large numbers of jobs that are being executed simultaneously at multiple remote sites. The relational grid monitoring architecture (R-GMA) is a monitoring and information management service for distributed resources based on the GMA of the Global Grid Forum. We report on the first measurements of R-GMA as part of a monitoring architecture to be used for batch submission of multiple Monte Carlo simulation jobs running on a CMS-specific LHC computing grid test bed. Monitoring information was transferred in real time from remote execution nodes back to the submitting host and stored in a database. In scalability tests, the job submission rates supported by successive releases of R-GMA improved significantly, approaching that expected in full-scale production.
Ultra-compact planoconcave zoned metallic lens based on the fishnet metamaterial
The following article appeared as Pacheco-Peña, V., Orazbayev, B., Torres, V., Beruete, M., & Navarro-Cía, M. (n.d.). Ultra-compact planoconcave zoned metallic lens based on the fishnet metamaterial. Applied Physics Letters, 103(18), and may be found at http://dx.doi.org/10.1063/1.4827876. A 1.5λ0-thick planoconcave zoned lens based on the fishnet metamaterial is demonstrated experimentally at millimeter wavelengths. The zoning technique applied allows a volume reduction of 60% compared to a full fishnet metamaterial lens without any deterioration in performance. The structure is designed to exhibit an effective refractive index n = -0.25 at f = 56.7 GHz (λ0 = 5.29 mm) with a focal length FL = 47.62 mm = 9λ0. The experimental enhancement achieved is 11.1 dB, which is in good agreement with simulation and also with previous full fishnet metamaterial lenses, and opens the door for integrated solutions. This work was supported in part by the Spanish Government under contract Consolider Engineering Metamaterials CSD2008-00066 and contract TEC2011-28664-C02-01. V.P.-P. was sponsored by the Spanish Ministerio de Educación, Cultura y Deporte under Grant No. FPU AP-2012-3796. B.O. was sponsored by the Spanish Ministerio de Economía y Competitividad under Grant No. FPI BES-2012-054909. V.T. is sponsored by the Universidad Pública de Navarra. M.B. is sponsored by the Spanish Government via RYC-2011-08221. M.N.-C. was supported by the Imperial College Junior Research Fellowship.
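The design numbers quoted in the abstract are internally consistent, which a short arithmetic check confirms: the free-space wavelength follows from λ0 = c/f, and the focal length is stated as 9λ0. All values below are taken from the abstract; nothing else is assumed.

```python
# Sanity check of the quoted lens design parameters.
c = 299_792_458.0        # speed of light, m/s
f = 56.7e9               # design frequency, Hz (from the abstract)
lam_mm = c / f * 1e3     # free-space wavelength lambda0 in mm
fl_mm = 9 * lam_mm       # focal length stated as FL = 9 * lambda0
print(round(lam_mm, 2))  # ~5.29 mm, matching the quoted lambda0
print(round(fl_mm, 2))   # ~47.59 mm, close to the quoted FL = 47.62 mm
```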
Towards the implementation of laser engineered surface structures for electron cloud mitigation
The LHC operation has proven that the electron cloud could be a significant limiting factor in machine performance, in particular for future High Luminosity LHC (HL-LHC) beams. Electron clouds, generated by electron multipacting in the beam pipes, lead to beam instabilities and beam-induced heat load in cryogenic systems. Laser Engineered Surface Structures (LESS) is a novel surface treatment which changes the morphology of the internal surfaces of vacuum chambers. The surface modification results in a reduced secondary electron yield (SEY) and, consequently, in the eradication of electron multipacting. Low SEY values of the treated surfaces and flexibility in choosing the laser parameters make LESS a promising treatment for future accelerators. LESS can be applied both in new and existing accelerators owing to the possibility of automated in-situ treatment. This approach has been developed and optimised for the LHC beam screens, in which the electron cloud has to be mitigated before the HL-LHC upgrade. We will present the latest steps towards the implementation of LESS.
Search for the standard model Higgs boson in the H to ZZ to 2l 2nu channel in pp collisions at sqrt(s) = 7 TeV
A search for the standard model Higgs boson in the H to ZZ to 2l 2nu decay channel, where l = e or mu, in pp collisions at a center-of-mass energy of 7 TeV is presented. The data were collected at the LHC, with the CMS detector, and correspond to an integrated luminosity of 4.6 inverse femtobarns. No significant excess is observed above the background expectation, and upper limits are set on the Higgs boson production cross section. The presence of the standard model Higgs boson with a mass in the 270-440 GeV range is excluded at 95% confidence level.
Comment: Submitted to JHE
Combined search for the quarks of a sequential fourth generation
Results are presented from a search for a fourth generation of quarks produced singly or in pairs in a data set corresponding to an integrated luminosity of 5 inverse femtobarns recorded by the CMS experiment at the LHC in 2011. A novel strategy has been developed for a combined search for quarks of the up and down type in decay channels with at least one isolated muon or electron. Limits on the mass of the fourth-generation quarks and the relevant Cabibbo-Kobayashi-Maskawa matrix elements are derived in the context of a simple extension of the standard model with a sequential fourth generation of fermions. The existence of mass-degenerate fourth-generation quarks with masses below 685 GeV is excluded at 95% confidence level for minimal off-diagonal mixing between the third- and the fourth-generation quarks. With a mass difference of 25 GeV between the quark masses, the obtained limit on the masses of the fourth-generation quarks shifts by about +/- 20 GeV. These results significantly reduce the allowed parameter space for a fourth generation of fermions.
Comment: Replaced with published version. Added journal reference and DO
