File-based data flow in the CMS Filter Farm
During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small "documents" using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These "files" can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.
Funding: National Science Foundation (U.S.); United States. Department of Energy
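As an illustration of the file-based bookkeeping described above, a minimal JSON "document" might look like the following sketch; the keys are purely hypothetical, not the actual CMS DAQ schema.

```python
import json

# Hypothetical bookkeeping "document" like those the abstract describes;
# the field names below are illustrative, not the real CMS DAQ schema.
doc = {
    "run": 123456,
    "lumisection": 42,
    "events_processed": 105000,
    "events_accepted": 1834,
    "writer": "hlt-watchdog",
}

# Encode to JSON; such a document can stay memory-resident or be written
# to disk when another part of the system (e.g. output aggregation) needs it.
encoded = json.dumps(doc, sort_keys=True)
decoded = json.loads(encoded)
```

Small self-describing documents like this let independent watchdog and merger processes reason about the data flow without sharing state with the HLT processes themselves.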
Online data handling and storage at the CMS experiment
During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ~62 sources produced with an aggregate rate of ~2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.
Funding: United States. Department of Energy; National Science Foundation (U.S.)
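The sizing figures quoted above can be cross-checked with back-of-envelope arithmetic (illustrative only; the stated "several days" presumably folds in the LHC duty cycle, since at the full 2 GB/s write rate 250 TB buffers about 1.4 days):

```python
# Back-of-envelope check of the STS figures quoted in the abstract.
aggregate_rate_bps = 2e9       # ~2 GB/s aggregate HLT output
usable_disk_bytes = 250e12     # ~250 TB total usable disk space
n_sources = 62                 # ~62 HLT output sources feeding the merger

# Average per-source rate arriving at the merger service.
per_source_bps = aggregate_rate_bps / n_sources

# How long 250 TB lasts if written continuously at the full 2 GB/s.
buffer_seconds = usable_disk_bytes / aggregate_rate_bps
buffer_days = buffer_seconds / 86400

print(f"{per_source_bps / 1e6:.0f} MB/s per source, "
      f"{buffer_days:.1f} days of buffer")
# → 32 MB/s per source, 1.4 days of buffer
```

The 7 GB/s concurrent read/write requirement is consistent with the merger simultaneously reading the ~2 GB/s of input files, writing the merged output, and serving the transfer system reading data out towards the T0.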
The XMM-Newton serendipitous survey. VII. The third XMM-Newton serendipitous source catalogue
Thanks to the large collecting area (3 x ~1500 cm^2 at 1.5 keV) and wide
field of view (30' across in full field mode) of the X-ray cameras on board the
European Space Agency X-ray observatory XMM-Newton, each individual pointing
can result in the detection of hundreds of X-ray sources, most of which are
newly discovered. Recently, many improvements in the XMM-Newton data reduction
algorithms have been made. These include enhanced source characterisation and
reduced spurious source detections, refined astrometric precision, greater net
sensitivity and the extraction of spectra and time series for fainter sources,
with better signal-to-noise. Further, almost 50% more observations are in the
public domain compared to 2XMMi-DR3, allowing the XMM-Newton Survey Science
Centre (XMM-SSC) to produce a much larger and better quality X-ray source
catalogue. The XMM-SSC has developed a pipeline to reduce the XMM-Newton data
automatically and using improved calibration a new catalogue version has been
produced from XMM-Newton data made public by 2013 Dec. 31 (13 years of data).
Manual screening ensures the highest data quality. This catalogue is known as
3XMM. In the latest release, 3XMM-DR5, there are 565962 X-ray detections
comprising 396910 unique X-ray sources. For the 133000 brightest sources,
spectra and lightcurves are provided. For all detections, the positions on the
sky, a measure of the quality of the detection, and an evaluation of the X-ray
variability is provided, along with the fluxes and count rates in 7 X-ray
energy bands, the total 0.2-12 keV band counts, and four hardness ratios. To
identify the detections, a cross correlation with 228 catalogues is also
provided for each X-ray detection. 3XMM-DR5 is the largest X-ray source
catalogue ever produced. Thanks to the large array of data products, it is an
excellent resource in which to find new and extreme objects.
Comment: 23 pages, version accepted for publication in A&A
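The four hardness ratios mentioned above are conventionally built from the count rates in adjacent energy bands; assuming the standard definition (the band numbering here is illustrative):

```latex
HR_i \;=\; \frac{R_{i+1} - R_i}{R_{i+1} + R_i}, \qquad -1 \le HR_i \le +1,
```

where $R_i$ is the count rate in band $i$, so $HR_i \to +1$ for a source detected only in the harder band and $HR_i \to -1$ for one detected only in the softer band.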
B_s - \bar{B}_s mixing in the MSSM scenario with large flavor mixing in the LL/RR sector
We show that the recent measurements of the mass difference, \Delta M_{B_s},
by the DØ and CDF collaborations give very strong constraints on the MSSM
scenario with large flavor mixing in the LL and/or RR sector of the down-type
squark mass squared matrix. In particular, the region with large mixing angle
and large mass difference between the scalar strange and scalar bottom is
ruled out by giving too large a \Delta M_{B_s}. The allowed region is
sensitive to the CP-violating phases of the squark mixing. The constraint is
most stringent on the scenario with both LL and RR mixing. We also predict the
time-dependent CP asymmetry in B_s \to J/\psi\phi decay and the semileptonic
asymmetry in B_s decay.
Comment: 15 pages, 9 figures
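For context, the constrained mass difference is related to the dispersive part of the mixing amplitude in the standard way (a textbook relation, not specific to this paper):

```latex
\Delta M_{B_s} \;=\; 2\,\bigl|M_{12}^{s}\bigr|,
\qquad
M_{12}^{s} \;=\; M_{12}^{\mathrm{SM}} + M_{12}^{\mathrm{SUSY}},
```

so the SUSY contribution from LL/RR squark mixing can interfere constructively or destructively with the Standard Model box diagrams, which is why the measured \Delta M_{B_s} cuts so sharply into the parameter space.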
Theoretical Overview: The New Mesons
After commenting on the state of contemporary hadronic physics and
spectroscopy, I highlight four areas where the action is: searching for the
relevant degrees of freedom, mesons with beauty and charm, chiral symmetry and
the D_{sJ} levels, and X(3872) and the lost tribes of charmonium.
Comment: 10 pages, uses jpconf.cls; talk at First Meeting of the APS Topical
Group on Hadronic Physics
Measurement and Interpretation of Fermion-Pair Production at LEP energies above the Z Resonance
This paper presents DELPHI measurements and interpretations of
cross-sections, forward-backward asymmetries, and angular distributions, for
the e+e- -> ffbar process for centre-of-mass energies above the Z resonance,
from sqrt(s) ~ 130 - 207 GeV at the LEP collider. The measurements are
consistent with the predictions of the Standard Model and are used to study a
variety of models including the S-Matrix ansatz for e+e- -> ffbar scattering
and several models which include physics beyond the Standard Model: the
exchange of Z' bosons, contact interactions between fermions, the exchange of
gravitons in large extra dimensions and the exchange of sneutrino in R-parity
violating supersymmetry.
Comment: 79 pages, 16 figures, Accepted by Eur. Phys. J.
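The forward-backward asymmetries measured here have the standard definition in terms of the cross-sections for the outgoing fermion scattered into the forward and backward hemispheres (a textbook relation):

```latex
A_{\mathrm{FB}} \;=\; \frac{\sigma_F - \sigma_B}{\sigma_F + \sigma_B},
\qquad
\sigma_F = \int_{0}^{1} \frac{d\sigma}{d\cos\theta}\, d\cos\theta,
\quad
\sigma_B = \int_{-1}^{0} \frac{d\sigma}{d\cos\theta}\, d\cos\theta,
```

where \theta is the polar angle of the outgoing fermion with respect to the incoming electron. Deviations of A_FB from the Standard Model prediction are what the Z', contact-interaction and extra-dimension fits are sensitive to.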
A Determination of the Centre-of-Mass Energy at LEP2 using Radiative 2-fermion Events
Using e+e- -> mu+mu-(gamma) and e+e- -> qqbar(gamma) events with radiative
return to the Z pole, DELPHI has determined the centre-of-mass energy,
sqrt{s}, using energy
and momentum constraint methods. The results are expressed as deviations from
the nominal LEP centre-of-mass energy, measured using other techniques. The
results are found to be compatible with the LEP Energy Working Group estimates
for a combination of the 1997 to 2000 data sets.
Comment: 20 pages, 6 figures, Accepted by Eur. Phys. J.
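The energy and momentum constraint exploits a standard kinematic relation (not the paper's full fit): for an ISR photon of energy E_\gamma, often lost along the beam pipe, four-momentum conservation gives the effective annihilation energy

```latex
s' \;=\; \bigl(p_{e^+} + p_{e^-} - k_\gamma\bigr)^2 \;=\; s - 2\sqrt{s}\,E_\gamma,
```

so events radiative to the Z pole satisfy \sqrt{s'} \approx M_Z, and comparing the fitted photon energy (or the fermion angles) with this constraint determines \sqrt{s} relative to the precisely known Z mass.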
Search for charginos in e+e- interactions at sqrt(s) = 189 GeV
An update of the searches for charginos and gravitinos is presented, based on
a data sample corresponding to an integrated luminosity of 158 pb^{-1}
recorded by the DELPHI detector
in 1998, at a centre-of-mass energy of 189 GeV. No evidence for a signal was
found. The lower mass limits are 4-5 GeV/c^2 higher than those obtained at a
centre-of-mass energy of 183 GeV. The (\mu,M_2) MSSM domain excluded by
combining the chargino searches with neutralino searches at the Z resonance
implies a limit on the mass of the lightest neutralino which, for a heavy
sneutrino, is constrained to be above 31.0 GeV/c^2 for tan(beta) \geq 1.
Comment: 22 pages, 8 figures