File-based data flow in the CMS Filter Farm
During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small "documents" using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These "files" can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2. (Supported by the National Science Foundation (U.S.) and the United States Department of Energy.)
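The file-based bookkeeping described above is easy to picture as a tiny JSON document per luminosity section. The sketch below (Python; the field names, file-naming scheme and write_rate_document helper are invented for illustration, not taken from the CMS code) shows the essential pattern: write the document atomically so a concurrent aggregator never reads a half-written file, and choose a ramdisk path to keep it memory-resident or a disk path to hand it to another part of the system.

```python
import json
import os
import tempfile

def write_rate_document(run, lumisection, events_in, events_out, out_dir):
    """Write one small JSON 'rate' document (hypothetical schema)."""
    doc = {
        "run": run,
        "lumisection": lumisection,
        "events_in": events_in,    # events delivered to the HLT
        "events_out": events_out,  # events accepted by the HLT
    }
    fname = "run{:06d}_ls{:04d}_rates.jsn".format(run, lumisection)
    # Write to a temp file in the same directory, then rename:
    # rename is atomic on POSIX, so readers never see a partial document.
    fd, tmp_path = tempfile.mkstemp(dir=out_dir, suffix=".tmp")
    with os.fdopen(fd, "w") as f:
        json.dump(doc, f)
    os.rename(tmp_path, os.path.join(out_dir, fname))

# Pointing out_dir at a ramdisk (e.g. /dev/shm) keeps the document
# memory-resident; a disk path makes it available for aggregation.
```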
Online data handling and storage at the CMS experiment
During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files produced by the HLT from ~62 sources at an aggregate rate of ~2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the various technological and implementation choices for the three components of the STS: the distributed file system, the merger service and the transfer system. (Supported by the United States Department of Energy and the National Science Foundation (U.S.).)
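As a rough illustration of what the merger service has to do, the sketch below (Python; the file layout and the events_out key are assumptions carried over from the previous sketch, not the actual STS implementation) concatenates the per-source data files for one lumisection and sums the event counts from the accompanying JSON documents. Reading the HLT output back while writing the merged copy, on top of the ~2 GB/s input stream, is why concurrent read/write bandwidth well above the input rate (the quoted 7 GB/s) is needed.

```python
import glob
import json
import os
import shutil

def merge_lumisection(in_dir, out_path, run, ls):
    """Merge all per-source files for one lumisection (hypothetical layout)."""
    pattern = os.path.join(in_dir, "run{:06d}_ls{:04d}_*.dat".format(run, ls))
    total_events = 0
    with open(out_path, "wb") as out:
        for data_file in sorted(glob.glob(pattern)):
            with open(data_file, "rb") as src:
                shutil.copyfileobj(src, out)  # append this source's data
            with open(data_file.replace(".dat", ".jsn")) as meta:
                total_events += json.load(meta)["events_out"]
    # Companion JSON document for the merged file, used for bookkeeping
    # and consistency checks before transfer to the T0.
    with open(out_path + ".jsn", "w") as f:
        json.dump({"run": run, "lumisection": ls, "events": total_events}, f)
```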
Event builder and level 3 at the CDF experiment
The Event Builder and Level3 systems constitute critical components of the DAQ in the CDF experiment at Fermilab. These systems are responsible for collecting data fragments from the front-end electronics, assembling the data into complete event records, reconstructing the events, and forming the final trigger decision. With Tevatron Run IIa in progress, the systems have been running successfully at high throughput rates; the design relies on a scalable architecture and distributed event processing to meet the requirements. A brief description of current performance in Run IIa and of a possible upgrade for Run IIb is presented.
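At its core, event building is a bookkeeping problem: hold partial events until every front-end source has contributed a fragment, then hand the complete record on for processing. A minimal sketch (Python; the class and its interface are invented for illustration, not the CDF design) is:

```python
from collections import defaultdict

class EventBuilder:
    """Assemble per-event fragments from a fixed set of front-end sources."""

    def __init__(self, n_sources):
        self.n_sources = n_sources
        self.pending = defaultdict(dict)   # event_id -> {source_id: bytes}

    def add_fragment(self, event_id, source_id, fragment):
        """Store one fragment; return the complete event record once all
        sources have reported, otherwise None."""
        frags = self.pending[event_id]
        frags[source_id] = fragment
        if len(frags) == self.n_sources:
            del self.pending[event_id]
            return b"".join(frags[s] for s in sorted(frags))
        return None
```

In a distributed design like the one described above, complete records would then be farmed out to Level-3 nodes, which run the event reconstruction and make the final trigger decision.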
The XMM-Newton serendipitous survey. VII. The third XMM-Newton serendipitous source catalogue
Thanks to the large collecting area (3 x ~1500 cm^2 at 1.5 keV) and wide field of view (30' across in full field mode) of the X-ray cameras on board the European Space Agency X-ray observatory XMM-Newton, each individual pointing can result in the detection of hundreds of X-ray sources, most of which are newly discovered. Recently, many improvements in the XMM-Newton data reduction algorithms have been made. These include enhanced source characterisation and reduced spurious source detections, refined astrometric precision, greater net sensitivity and the extraction of spectra and time series for fainter sources, with better signal-to-noise. Further, almost 50% more observations are in the public domain compared to 2XMMi-DR3, allowing the XMM-Newton Survey Science Centre (XMM-SSC) to produce a much larger and better quality X-ray source catalogue. The XMM-SSC has developed a pipeline to reduce the XMM-Newton data automatically, and using improved calibration a new catalogue version has been produced from XMM-Newton data made public by 2013 Dec. 31 (13 years of data). Manual screening ensures the highest data quality. This catalogue is known as 3XMM. In the latest release, 3XMM-DR5, there are 565962 X-ray detections comprising 396910 unique X-ray sources. For the 133000 brightest sources, spectra and lightcurves are provided. For all detections, the positions on the sky, a measure of the quality of the detection, and an evaluation of the X-ray variability are provided, along with the fluxes and count rates in 7 X-ray energy bands, the total 0.2-12 keV band counts, and four hardness ratios. To identify the detections, a cross-correlation with 228 catalogues is also provided for each X-ray detection. 3XMM-DR5 is the largest X-ray source catalogue ever produced. Thanks to the large array of data products, it is an excellent resource in which to find new and extreme objects.
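For readers unfamiliar with hardness ratios: each one compares the count rates in two adjacent energy bands, giving a crude, model-independent X-ray colour. A minimal sketch of the usual definition (the exact band pairing used in 3XMM is not spelled out here) is:

```python
def hardness_ratio(rate_soft, rate_hard):
    """HR = (hard - soft) / (hard + soft): -1 for purely soft sources,
    +1 for purely hard ones."""
    total = rate_hard + rate_soft
    return (rate_hard - rate_soft) / total if total > 0 else float("nan")
```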
Theoretical Overview: The New Mesons
After commenting on the state of contemporary hadronic physics and spectroscopy, I highlight four areas where the action is: searching for the relevant degrees of freedom, mesons with beauty and charm, chiral symmetry and the D_{sJ} levels, and X(3872) and the lost tribes of charmonium.
Search for charginos in e+e- interactions at sqrt(s) = 189 GeV
An update of the searches for charginos and gravitinos is presented, based on a data sample corresponding to the 158 pb^{-1} recorded by the DELPHI detector in 1998, at a centre-of-mass energy of 189 GeV. No evidence for a signal was found. The lower mass limits are 4-5 GeV/c^2 higher than those obtained at a centre-of-mass energy of 183 GeV. The (\mu, M_2) MSSM domain excluded by combining the chargino searches with neutralino searches at the Z resonance implies a limit on the mass of the lightest neutralino which, for a heavy sneutrino, is constrained to be above 31.0 GeV/c^2 for tan(beta) \geq 1.
Energy dependence of Cronin momentum in saturation model for and collisions
We calculate the energy dependence of the Cronin momentum for and collisions in the saturation model. We show that this dependence is consistent with the expectation from a formula obtained using a simple dimensional consideration. This can be used to test the validity of the saturation model (and to distinguish among its variants) and to measure the energy dependence of the saturation momentum from experimental data.
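The abstract does not spell out the dimensional argument, but a common parametrisation in saturation models (the GBW fit; an assumption here, not necessarily the authors' formula) makes the expected trend explicit: the Cronin peak sits near the saturation momentum, which grows as a power of the energy.

```latex
% GBW-style saturation scale (assumed; \lambda \simeq 0.3 from HERA fits)
Q_s^2(x) = Q_0^2 \left( \frac{x_0}{x} \right)^{\lambda},
\qquad p_{\mathrm{Cronin}} \sim Q_s(x)
```

Since the relevant x falls with increasing \sqrt{s} at fixed transverse momentum, this places the Cronin peak at higher momentum at higher collision energy, which is the kind of dependence the abstract proposes to test against data.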