    CMS Offline Conditions Framework and Services

    Non-event data describing detector conditions change with time and come from different data sources. They are accessible by physicists within the offline event-processing applications for precise calibration of reconstructed data as well as for data-quality control purposes. Over the past years CMS has developed and deployed a software system managing such data. Object-relational mapping and the relational abstraction layer of the LHC persistency framework are the foundation; the offline conditions framework updates and delivers C++ data objects according to their validity. A high-level tag versioning system allows production managers to organize data in hierarchical views. A scripting API in Python, command-line tools and a web service serve physicists in their daily work. A mini-framework is available for handling data coming from external sources. Efficient data distribution over the worldwide network is guaranteed by a system of hierarchical web caches. The system has been tested and used in all major productions, test-beams and cosmic runs.
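    As a rough illustration of the interval-of-validity (IOV) lookup and tag hierarchy described above, the following Python sketch models a conditions tag as an append-only list of (since-run, payload) pairs and a global tag as a mapping from record names to such tags. All class and method names (ConditionsTag, GlobalTag, EcalPedestals) are hypothetical and do not reflect the actual CMS scripting API.

    # Illustrative sketch only: a toy model of interval-of-validity (IOV) lookup and
    # hierarchical tags, NOT the real CMS conditions API. All names are hypothetical.
    import bisect

    class ConditionsTag:
        """Holds payloads keyed by the first run for which they become valid."""
        def __init__(self):
            self._since = []     # sorted list of first-valid run numbers
            self._payloads = []  # payload valid from the corresponding run onward

        def append(self, since_run, payload):
            # IOVs are appended in increasing run order, as in an append-only store.
            assert not self._since or since_run > self._since[-1]
            self._since.append(since_run)
            self._payloads.append(payload)

        def get(self, run):
            # Find the last IOV whose 'since' run is <= the requested run.
            i = bisect.bisect_right(self._since, run) - 1
            if i < 0:
                raise KeyError(f"no payload valid for run {run}")
            return self._payloads[i]

    class GlobalTag:
        """A hierarchical view: one top-level tag resolving record names to leaf tags."""
        def __init__(self, mapping):
            self._mapping = mapping  # e.g. {"EcalPedestals": ConditionsTag(), ...}

        def get(self, record, run):
            return self._mapping[record].get(run)

    # Usage: retrieve the pedestal set valid for a given run.
    pedestals = ConditionsTag()
    pedestals.append(1000, {"mean": 200.1, "rms": 1.2})
    pedestals.append(2000, {"mean": 200.4, "rms": 1.1})
    gt = GlobalTag({"EcalPedestals": pedestals})
    print(gt.get("EcalPedestals", run=1500))  # -> payload valid since run 1000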

    A test of the Feynman scaling in the fragmentation region

    The result of a direct measurement of the fragmentation region will be presented. The result will be obtained at the CERN proton-antiproton collider by exposing silicon calorimeters inside the beam pipe. This experiment addresses a long-standing riddle of cosmic-ray physics: whether Feynman scaling is violated in the fragmentation region, or whether the iron component increases at 10^15 eV.
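    For reference, the hypothesis under test can be stated in its textbook form (the notation below is standard and not taken from the paper itself): Feynman scaling asserts that the invariant inclusive cross section becomes energy-independent when expressed in terms of the scaling variable x_F, with the fragmentation region corresponding to |x_F| close to 1.

    % Feynman scaling variable (p_L^* is the longitudinal momentum in the CM frame)
    x_F \;=\; \frac{p_L^{*}}{p_{L,\max}^{*}} \;\simeq\; \frac{2\,p_L^{*}}{\sqrt{s}}
    % Scaling hypothesis: at high energy the invariant cross section depends only
    % on x_F and p_T, not on \sqrt{s}; the fragmentation region is |x_F| \to 1.
    E\,\frac{d^{3}\sigma}{dp^{3}} \;\xrightarrow[\;s\to\infty\;]{}\; f(x_F,\, p_T)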

    The LCG PI project: using interfaces for physics data analysis

    In the context of the LHC computing grid (LCG) project, the applications area develops and maintains the part of the physics applications software and associated infrastructure that is shared among the LHC experiments. The "physicist interface" (PI) project of the LCG applications area encompasses the interfaces and tools by which physicists will directly use the software, providing implementations based on agreed standards such as the AIDA (Abstract Interfaces for Data Analysis) interfaces for data analysis. In collaboration with users from the experiments, work has started on implementing the AIDA interfaces for (binned and unbinned) histogramming, fitting and minimization, as well as manipulation of tuples. These implementations have been developed by re-using existing packages either directly or through a (thin) layer of wrappers. In addition, bindings of these interfaces to the Python interpreted language have been provided using the dictionary subsystem of the LCG applications area/SEAL project. The current status and future plans of the project are presented.
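    The flavour of such AIDA-style interfaces can be sketched with a toy Python histogram whose method names (fill, mean) mirror the AIDA conventions; this is an illustrative stand-in under assumed names, not the actual PI/SEAL binding.

    # Toy sketch of an AIDA-style 1D histogram in Python; the class and method names
    # are modeled on AIDA conventions but this is NOT the actual PI/SEAL binding.
    class Histogram1D:
        def __init__(self, title, nbins, low, high):
            self.title = title
            self.nbins, self.low, self.high = nbins, low, high
            self.bins = [0.0] * nbins   # bin contents (weighted)
            self.entries = 0            # total number of fill() calls
            self._sum = 0.0             # weighted sum of filled values

        def fill(self, x, weight=1.0):
            # Add an entry; values outside [low, high) only count toward statistics.
            if self.low <= x < self.high:
                width = (self.high - self.low) / self.nbins
                self.bins[int((x - self.low) / width)] += weight
            self.entries += 1
            self._sum += x * weight

        def mean(self):
            return self._sum / self.entries if self.entries else 0.0

    # Usage: fill a few values and inspect simple statistics.
    h = Histogram1D("pt of tracks", nbins=50, low=0.0, high=10.0)
    for x in (1.2, 3.4, 0.7, 9.9, 5.5):
        h.fill(x)
    print(h.entries, round(h.mean(), 2))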

    Distributed Computing Grid Experiences in CMS

    The CMS experiment is currently developing a computing system capable of serving, processing and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004 CMS undertook a large-scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of the startup rate. Its goals were: to run CMS event reconstruction at CERN for a sustained period at a 25 Hz input rate; to distribute the data to several regional centers; and to enable data access at those centers for analysis. Grid middleware was utilized to help complete all aspects of the challenge. To continue to provide scalable access from anywhere in the world to the data, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure and the current development of the CMS analysis system.
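    As a purely illustrative sketch of the job-splitting step such an analysis layer performs, the Python fragment below partitions a dataset into fixed-size jobs and assigns them to sites hosting the data; the function, dataset and site names are hypothetical and are not the actual CMS tools.

    # Toy illustration of how an analysis-submission layer might split a dataset into
    # Grid jobs; all names are hypothetical, this is not the actual CMS software.
    from dataclasses import dataclass

    @dataclass
    class Job:
        dataset: str
        first_event: int
        n_events: int
        site: str

    def split_into_jobs(dataset, total_events, events_per_job, sites):
        """Partition a dataset into fixed-size jobs, assigned round-robin to the
        sites that host a replica of the data."""
        jobs = []
        for i, first in enumerate(range(0, total_events, events_per_job)):
            n = min(events_per_job, total_events - first)
            jobs.append(Job(dataset, first, n, sites[i % len(sites)]))
        return jobs

    # Usage: split a 100k-event sample into 25k-event jobs over three sites.
    jobs = split_into_jobs("/RelVal/TTbar/RECO", total_events=100000,
                           events_per_job=25000, sites=["T1_US", "T2_IT", "T2_DE"])
    for j in jobs:
        print(j)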

    Persistent storage of non-event data in the CMS databases

    In the CMS experiment, the non-event data needed to set up the detector, or produced by it, and needed to calibrate the physical response of the detector itself are stored in ORACLE databases. The large amount of data to be stored, the number of clients involved and the performance requirements make the database system an essential service for the experiment to run. This note describes the CMS condition database architecture, the data flow and PopCon, the tool built to populate the offline databases. Finally, the first results obtained during the 2008 and 2009 cosmic data taking are presented.
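    The append-only population pattern that a tool like PopCon implements can be sketched as follows; this toy Python model (OfflineTag, populate) only illustrates the idea of writing payloads newer than the last stored interval of validity, and its names are hypothetical rather than the actual PopCon API.

    # Toy sketch of a PopCon-style populator: pull new payloads from an online source
    # and append them to an offline conditions tag with their interval of validity.
    # All class and function names here are hypothetical illustrations.
    class OfflineTag:
        def __init__(self):
            self.iovs = []  # list of (since_run, payload), kept in append order

        def last_since(self):
            return self.iovs[-1][0] if self.iovs else -1

        def append(self, since_run, payload):
            self.iovs.append((since_run, payload))

    def populate(tag, source_records):
        """Write only records newer than the last IOV already stored (append-only)."""
        written = 0
        for since_run, payload in sorted(source_records):
            if since_run > tag.last_since():
                tag.append(since_run, payload)
                written += 1
        return written

    # Usage: duplicates and payloads older than the last stored IOV are skipped.
    tag = OfflineTag()
    online = [(100, "pedestals_v1"), (250, "pedestals_v2"), (100, "pedestals_v1")]
    print(populate(tag, online), "payloads written")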

    Combined search for the quarks of a sequential fourth generation

    Results are presented from a search for a fourth generation of quarks produced singly or in pairs in a data set corresponding to an integrated luminosity of 5 inverse femtobarns recorded by the CMS experiment at the LHC in 2011. A novel strategy has been developed for a combined search for quarks of the up and down type in decay channels with at least one isolated muon or electron. Limits on the mass of the fourth-generation quarks and the relevant Cabibbo-Kobayashi-Maskawa matrix elements are derived in the context of a simple extension of the standard model with a sequential fourth generation of fermions. The existence of mass-degenerate fourth-generation quarks with masses below 685 GeV is excluded at 95% confidence level for minimal off-diagonal mixing between the third- and the fourth-generation quarks. With a mass difference of 25 GeV between the quark masses, the obtained limit on the masses of the fourth-generation quarks shifts by about +/- 20 GeV. These results significantly reduce the allowed parameter space for a fourth generation of fermions.

    Search for the standard model Higgs boson in the H to ZZ to 2l 2nu channel in pp collisions at sqrt(s) = 7 TeV

    A search for the standard model Higgs boson in the H to ZZ to 2l 2nu decay channel, where l = e or mu, in pp collisions at a center-of-mass energy of 7 TeV is presented. The data were collected at the LHC, with the CMS detector, and correspond to an integrated luminosity of 4.6 inverse femtobarns. No significant excess is observed above the background expectation, and upper limits are set on the Higgs boson production cross section. The presence of the standard model Higgs boson with a mass in the 270-440 GeV range is excluded at 95% confidence level.