The Cathode Strip Chamber Data Acquisition System for CMS
The Cathode Strip Chamber (CSC) [1] Data Acquisition (DAQ) system for the CMS [2] experiment at the LHC [3] will be described. The CSC system is large, consisting of 218K cathode channels and 183K anode channels. This leads to a substantial data rate of ~1.5 GByte/s at LHC design luminosity (10^34 cm^-2 s^-1) and the CMS first-level trigger (L1A) rate of 100 kHz. The DAQ system consists of three parts. The first part is the on-chamber Cathode Front End Boards (CFEB) [4], which amplify, shape, store, and digitise chamber cathode signals, and Anode Front End Boards (AFEB) [5], which amplify, shape and discriminate chamber anode signals. The second part is the Peripheral Crate Data Acquisition Motherboards (DAQMB), which control the on-chamber electronics and the readout of the chamber. The third part is the off-detector DAQ interface boards, which perform real-time error checking, electronics reset requests and data concentration. These boards pass the resulting data to a CSC local DAQ farm, as well as to the CMS main DAQ [6]. All electronics in the system employ FPGAs, allowing programmability. In addition, several high-speed serial interface technologies are employed.
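As a sanity check on the quoted figures, the average event-fragment size implied by the data rate and trigger rate can be computed directly. This is a back-of-envelope sketch using only the numbers in the abstract; the paper's own event-size accounting may differ:

```python
# Back-of-envelope check: average CSC event size implied by the quoted
# aggregate data rate and the CMS first-level trigger (L1A) rate.
L1A_RATE_HZ = 100e3      # L1A rate: 100 kHz
DATA_RATE_BPS = 1.5e9    # aggregate CSC data rate: ~1.5 GByte/s

avg_event_bytes = DATA_RATE_BPS / L1A_RATE_HZ
print(f"average CSC event fragment: {avg_event_bytes / 1e3:.0f} kB")  # ~15 kB
```

So each L1A accept corresponds to roughly 15 kB of CSC data, which is consistent with zero-suppressed readout of a 400K-channel system rather than a full channel dump.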
Radiation Testing of Electronics for the CMS Endcap Muon System
The electronics used in the data readout and triggering system for the
Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC)
particle accelerator at CERN are exposed to high radiation levels. This
radiation can cause permanent damage to the electronic circuitry, as well as
temporary effects such as data corruption induced by Single Event Upsets. Once
the High Luminosity LHC (HL-LHC) accelerator upgrades are completed, it will
have an instantaneous luminosity five times higher than that of the LHC, allowing for
detection of rare physics processes, new particles and interactions. Tests have
been performed to determine the effects of radiation on the electronic
components to be used for the Endcap Muon electronics project currently being
designed for installation in the CMS experiment in 2013. During these tests the
digital components on the test boards were operating with active data readout
while being irradiated with 55 MeV protons. In reactor tests, components were
exposed to the equivalent of 30 years of the neutron radiation expected at the
HL-LHC. The highest total ionizing dose (TID) for the muon system is expected
at the innermost portion of the CMS detector, with 8900 rad over ten years.
Our results show that Commercial Off-The-Shelf (COTS) components selected for
the new electronics will operate reliably in the CMS radiation environment.
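The quoted worst-case TID figure reduces to a simple average dose rate. This is illustrative arithmetic only; it assumes a uniform rate over time, which real accelerator running schedules do not have:

```python
# Average dose rate implied by the quoted worst-case TID figure.
# Assumes uniform accumulation, which is a simplification of real LHC running.
tid_rad = 8900.0   # total ionizing dose at the innermost muon-system region, rad
years = 10

avg_rad_per_year = tid_rad / years
print(f"average dose rate: {avg_rad_per_year:.0f} rad/year")  # 890 rad/year
```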
Predicting Time to Nursing Home Care and Death in Individuals with Alzheimer Disease
Objective. —To develop and validate an approach that uses clinical features that can be determined in a standard patient visit to estimate the length of time before an individual patient with Alzheimer disease (AD) requires care equivalent to nursing home placement or dies. Design. —Prospective cohort study of 236 patients, followed up semiannually for up to 7 years. A second validation cohort of 105 patients was also followed. Setting. —Three AD research centers. Patients. —All patients met National Institute of Neurological and Communicative Disorders and Stroke—Alzheimer's Disease and Related Disorders Association (NINCDS-ADRDA) criteria for probable AD and had mild dementia at the initial visit. Intervention. —Predictive features, ascertained at the initial visit, were sex, duration of illness, age at onset, modified Mini-Mental State Examination (mMMS) score, and the presence or absence of extrapyramidal signs or psychotic features. Main Outcome Measures. —(1) Requiring the equivalent of nursing home placement and (2) death. Results. —Prediction algorithms were constructed for the 2 outcomes based on Cox proportional hazard models. For each algorithm, a predictor index is calculated based on the status of each predictive feature at the initial visit. A table that specifies the number of months in which 25%, 50%, and 75% of patients with any specific predictor index value are likely to reach the end point is then consulted. Survival curves for time to need for care equivalent to nursing home placement and for time to death derived from the algorithms for selected predictor indexes fell within the 95% confidence bands of actual survival curves for patients. When the predictor variables from the initial visit for the validation cohort patients were entered into the algorithm, the predicted survival curves for time to death fell within the 95% confidence bands of actual survival curves for the patients. Conclusions. —The prediction algorithms are a first but promising step toward providing specific prognoses to patients, families, and practitioners. This approach also has clear implications for the design and interpretation of clinical trials in patients with AD.
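The predictor-index construction described above follows the standard Cox proportional-hazards recipe: the index is the linear combination of fitted coefficients and feature values, and the relative hazard is its exponential. A minimal sketch, with coefficient values and feature coding that are entirely hypothetical (the published values are not reproduced here):

```python
import math

# Hypothetical Cox-model coefficients for the features named in the abstract.
# These are illustrative numbers, NOT the fitted values from the paper.
coeffs = {
    "male": 0.2,             # sex indicator (1 = male)
    "duration_years": 0.05,  # duration of illness at initial visit
    "onset_age": -0.01,      # age at onset
    "mmms_deficit": 0.03,    # points below a reference mMMS score
    "extrapyramidal": 0.6,   # extrapyramidal signs present (0/1)
    "psychotic": 0.5,        # psychotic features present (0/1)
}

# One hypothetical patient's feature values at the initial visit.
patient = {"male": 1, "duration_years": 3, "onset_age": 72,
           "mmms_deficit": 10, "extrapyramidal": 1, "psychotic": 0}

# Predictor index = linear predictor of the Cox model.
index = sum(coeffs[k] * patient[k] for k in coeffs)

# Hazard relative to a baseline patient whose index is 0.
relative_hazard = math.exp(index)
print(f"predictor index: {index:.2f}, relative hazard: {relative_hazard:.2f}")
```

In the paper's scheme, the computed index would then be looked up in the published table to read off the months by which 25%, 50%, and 75% of similar patients reach the end point.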
Search for the Decay tau- -> 4pi- 3pi+ (pi0) nu_tau
We have searched for the decay of the tau lepton into seven charged particles
and zero or one pi0. The data used in the search were collected with the CLEO
II detector at the Cornell Electron Storage Ring (CESR) and correspond to an
integrated luminosity of 4.61 fb^(-1). No evidence for a signal is found.
Assuming all the charged particles are pions, we set an upper limit on the
branching fraction, B(tau- -> 4pi- 3pi+ (pi0) nu_tau) < 2.4 x 10^(-6) at the
90% confidence level. This limit represents a significant improvement over the
previous limit. Comment: 9-page postscript file, also available through
http://w4.lns.cornell.edu/public/CLN
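With no signal events observed, an upper limit like the one above typically starts from the classical Poisson construction: the 90% CL upper bound on the signal mean solves exp(-n_up) = 1 - CL. A generic sketch of that step (the paper's exact procedure, e.g. treatment of background and efficiency, may differ):

```python
import math

# Classical Poisson upper limit on the signal mean for zero observed events:
# solve exp(-n_up) = 1 - CL  =>  n_up = -ln(1 - CL).
CL = 0.90
n_up = -math.log(1 - CL)
print(f"{CL:.0%} CL upper limit for 0 observed events: {n_up:.2f} events")
```

The branching-fraction limit then follows by dividing n_up by the number of produced tau decays times the selection efficiency, neither of which is quoted in the abstract.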
Search for lepton-flavor violation at HERA
A search for lepton-flavor-violating interactions e p -> mu X and e p -> tau X has been performed with the ZEUS detector using the entire HERA I
data sample, corresponding to an integrated luminosity of 130 pb^{-1}. The data
were taken at center-of-mass energies, \sqrt{s}, of 300 and 318 GeV. No
evidence of lepton-flavor violation was found, and constraints were derived on
leptoquarks (LQs) that could mediate such interactions. For LQ masses below
\sqrt{s}, limits were set on \lambda_{eq_1} \sqrt{\beta_{\ell q}}, where
\lambda_{eq_1} is the coupling of the LQ to an electron and a
first-generation quark q_1, and \beta_{\ell q} is the branching ratio of
the LQ to the final-state lepton \ell (\mu or \tau) and a quark q. For
LQ masses much larger than \sqrt{s}, limits were set on the four-fermion
interaction term \lambda_{eq_\alpha} \lambda_{\ell q_\beta} / M_{LQ}^2 for LQs that couple to an electron and a quark q_\alpha
and to a lepton \ell and a quark q_\beta, where \alpha and \beta are
quark generation indices. Some of the limits are also applicable to
lepton-flavor-violating processes mediated by squarks in R-Parity-violating
supersymmetric models. In some cases, especially when a higher-generation quark
is involved and for the process e p -> tau X, the ZEUS limits are the most
stringent to date. Comment: 37 pages, 10 figures, Accepted by EPJC. References and 1 figure (Fig. 6) added
An NLO QCD analysis of inclusive cross-section and jet-production data from the ZEUS experiment
The ZEUS inclusive differential cross-section data from HERA, for charged and
neutral current processes taken with e+ and e- beams, together with
differential cross-section data on inclusive jet production in e+ p scattering
and dijet production in \gamma p scattering, have been used in a new NLO QCD
analysis to extract the parton distribution functions of the proton. The input
of jet data constrains the gluon and allows an accurate extraction of
\alpha_s(M_Z) at NLO;
\alpha_s(M_Z) = 0.1183 \pm 0.0028(exp.) \pm 0.0008(model)
An additional uncertainty from the choice of scales is estimated as \pm
0.005. This is the first extraction of \alpha_s(M_Z) from HERA data alone. Comment: 37 pages, 14 figures, to be submitted to EPJC. PDFs available at
http://durpdg.dur.ac.uk/hepdata in LHAPDFv
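For a single combined uncertainty, the three quoted components can be added in quadrature. This is a common convention only; the paper quotes the scale uncertainty separately on purpose, so the combination below is purely illustrative:

```python
import math

# Quoted uncertainties on alpha_s(M_Z) = 0.1183 from the abstract.
exp_unc = 0.0028     # experimental
model_unc = 0.0008   # model
scale_unc = 0.005    # estimated scale uncertainty (quoted separately)

# Illustrative combination in quadrature.
total = math.sqrt(exp_unc**2 + model_unc**2 + scale_unc**2)
print(f"alpha_s(M_Z) = 0.1183 +/- {total:.4f} (all sources in quadrature)")
```

The scale term dominates, which is presumably why the authors list it separately rather than folding it into the headline error.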