Scalability tests of R-GMA-based grid job monitoring system for CMS Monte Carlo data production
Copyright © 2004 IEEE. High-energy physics experiments, such as the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC), have large-scale data processing computing requirements. The grid has been chosen as the solution. One important challenge when using the grid for large-scale data processing is the ability to monitor the large numbers of jobs that are being executed simultaneously at multiple remote sites. The Relational Grid Monitoring Architecture (R-GMA) is a monitoring and information management service for distributed resources, based on the GMA of the Global Grid Forum. We report on the first measurements of R-GMA as part of a monitoring architecture to be used for batch submission of multiple Monte Carlo simulation jobs running on a CMS-specific LHC computing grid test bed. Monitoring information was transferred in real time from remote execution nodes back to the submitting host and stored in a database. In scalability tests, the job submission rates supported by successive releases of R-GMA improved significantly, approaching the rates expected in full-scale production.
Performance of R-GMA for monitoring grid jobs for CMS data production
High energy physics experiments, such as the Compact Muon Solenoid (CMS) at the CERN laboratory in Geneva, have large-scale data processing requirements, with data accumulating at a rate of 1 Gbyte/s. This load comfortably exceeds any previous processing requirements and we believe it may be most efficiently satisfied through grid computing. Furthermore, the production of large quantities of Monte Carlo simulated data provides an ideal test bed for grid technologies and will drive their development. One important challenge when using the grid for data analysis is the ability to monitor transparently the large number of jobs that are being executed simultaneously at multiple remote sites. R-GMA is a monitoring and information management service for distributed resources, based on the Grid Monitoring Architecture of the Global Grid Forum. We have previously developed a system allowing us to test its performance under a heavy load while using few real grid resources. We present the latest results on this system running on the LCG 2 grid test bed using the LCG 2.6.0 middleware release. For a sustained load equivalent to 7 generations of 1000 simultaneous jobs, R-GMA was able to transfer all published messages and store them in a database for 98% of the individual jobs. The failures experienced were at the remote sites, rather than at the archiver's MON box as had been expected.
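The two abstracts above both rest on the producer/consumer pattern that the Grid Monitoring Architecture defines: jobs on worker nodes publish monitoring tuples, and an archiver on the submitting host stores them in a database. The following is a minimal single-process sketch of that pattern only; it is not the R-GMA API, and the class names, the in-memory queue standing in for the registry/transport, and the SQLite archiver are all illustrative.

    # Sketch of the GMA producer/consumer pattern (not the R-GMA API).
    # The queue stands in for the registry/transport; names are illustrative.
    import sqlite3
    import queue
    import time

    bus = queue.Queue()

    def producer(job_id, n_messages):
        """A Monte Carlo job publishing monitoring tuples from a worker node."""
        for step in range(n_messages):
            bus.put((job_id, step, time.time(), "RUNNING"))
        bus.put((job_id, n_messages, time.time(), "DONE"))

    def archiver(db_path, expected_jobs):
        """Consumer on the submitting host storing tuples in a database."""
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS jobstatus "
                    "(job_id TEXT, step INTEGER, ts REAL, state TEXT)")
        done = 0
        while done < expected_jobs:
            job_id, step, ts, state = bus.get()
            con.execute("INSERT INTO jobstatus VALUES (?,?,?,?)",
                        (job_id, step, ts, state))
            if state == "DONE":
                done += 1
        con.commit()
        con.close()

    # e.g. one "generation" of jobs, far smaller than the 1000 used in the tests
    for j in range(10):
        producer(f"job-{j}", n_messages=5)
    archiver("monitoring.db", expected_jobs=10)

The scalability question studied in the papers is precisely what this toy version hides: whether the transport and archiver keep up when thousands of producers publish concurrently from remote sites.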
Predictors of mathematics in primary school: Magnitude comparison, verbal and spatial working memory measures.
We determined the relative importance of the so-called approximate number system (ANS), symbolic number comparison (SNC) and verbal and spatial short-term and working memory (WM) capacity for mathematics achievement in 1,254 Grade 2, 4 and 6 children. The large sample size assured high power and low false report probability, and allowed us to determine effect sizes precisely. We used reading decoding as a control outcome measure to test whether findings were specific to mathematics. Bayesian analysis allowed us to provide support for both null and alternative hypotheses. We found very weak zero-order correlations between ANS measures and math achievement. These correlations were not specific to mathematics, became non-significant once intelligence was considered, and ANS measures were not selected as predictors of math by regression models. In contrast, overall SNC accuracy and spatial WM measures were reliable and mostly specific predictors of math achievement. Verbal short-term and working memory measures and SNC reaction time were predictors of both reading and math achievement. We conclude that ANS tasks are not suitable as measures of math development in school-age populations. In contrast, all other cognitive functions we studied are promising markers of mathematics development.
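The key analytic move above is checking whether a zero-order correlation survives once intelligence is partialled out. A minimal sketch of that step on simulated data follows; the variable names (ans, iq, math) and the simulated effect sizes are illustrative only and are not the paper's data.

    # Sketch: zero-order vs. IQ-partialled correlation, on simulated data.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1254
    iq = rng.standard_normal(n)
    ans = 0.2 * iq + rng.standard_normal(n)            # ANS score, weakly IQ-loaded
    math = 0.5 * iq + 0.05 * ans + rng.standard_normal(n)

    def corr(x, y):
        return np.corrcoef(x, y)[0, 1]

    def partial_corr(x, y, z):
        """Correlation of x and y after regressing z out of both."""
        zx = np.column_stack([np.ones_like(z), z])
        rx = x - zx @ np.linalg.lstsq(zx, x, rcond=None)[0]
        ry = y - zx @ np.linalg.lstsq(zx, y, rcond=None)[0]
        return corr(rx, ry)

    print("zero-order r(ANS, math):", corr(ans, math))
    print("r(ANS, math | IQ):     ", partial_corr(ans, math, iq))

With data generated this way, the small zero-order correlation shrinks toward zero once IQ is controlled, which is the pattern the study reports for ANS measures.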
Testing of tritium breeder blanket activation foil spectrometer during JET operations
Accurate measurement of the nuclear environment within a test tritium breeding-blanket module of a fusion reactor is crucial to determine tritium production rates, which are relevant to self-sufficiency of the tritium fuel supply, tritium accountancy, and the evaluation of localised power levels produced in blankets. This requires evaluation of the time-dependent spectral neutron flux within the test tritium breeding-blanket module under harsh radiation and temperature environments. The application of an activation foil-based spectrometer system to determine neutron flux density using a pneumatic transfer system in ITER has been studied, deployed and tested on the Joint European Torus (JET) machine in a recent deuterium-deuterium campaign for a selection of high-purity activation foils. Deployment of the spectrometer system has provided important functional and practical testing of the detector measurement system, associated hardware and post-processing techniques for the analysis of large data sets produced through the use of list-mode data collection. The testing is invaluable for the optimisation of systems for future planned testing in tritium-tritium and deuterium-tritium conditions. Analysis of the time and energy spectra collected to date and the status of the development of methods for post-processing are presented in this paper.
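For context, foil activation spectrometry rests on the standard activation relation; the form below is a textbook sketch with our own symbols, and the specific foils, corrections and unfolding procedure used at JET are not reproduced from the paper:

    A(t_c) = N \, \phi \, \sigma_{\mathrm{eff}} \left(1 - e^{-\lambda t_i}\right) e^{-\lambda t_c}

where A is the measured activity of a foil after a cooling time t_c, N the number of target atoms in the foil, φ the neutron flux density, σ_eff the spectrum-averaged activation cross section, λ the decay constant of the activation product, and t_i the irradiation time. Combining foils whose activation reactions respond to different neutron energy ranges gives a set of such relations from which the spectral flux can be unfolded.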
HEP Applications Evaluation of the EDG Testbed and Middleware
Workpackage 8 of the European DataGrid project was formed in January 2001 with representatives from the four LHC experiments, and with experiment-independent people from five of the six main EDG partners. In September 2002 WP8 was strengthened by the addition of effort from BaBar and D0. The original mandate of WP8 was, following the definition of short- and long-term requirements, to port experiment software to the EDG middleware and testbed environment. A major additional activity has been testing the basic functionality and performance of this environment. This paper reviews experiences and evaluations in the areas of job submission, data management, mass storage handling, information systems and monitoring. It also comments on the problems of remote debugging, the portability of code, and scaling problems with increasing numbers of jobs, sites and nodes. Reference is made to the pioneering work of ATLAS and CMS in integrating the use of the EDG Testbed into their data challenges. A forward look is made to essential software developments within EDG and to the necessary cooperation between EDG and LCG for the LCG prototype due in mid-2003.
Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics Conference (CHEP03), La Jolla, CA, USA, March 2003, 7 pages. PSN THCT00
Preliminary results on the performance of a TeO2 thermal detector in a search for direct interactions of WIMPs
During a double beta decay experiment performed at Laboratori Nazionali del Gran Sasso, a 1548-hour background spectrum was collected with a 340 g TeO2 thermal detector. An analysis of this spectrum has been carried out to search for possible WIMP signals. The values of parameters which are essential in the search for WIMPs, such as energy resolution (2 keV), energy threshold (13 keV) and nuclear recoil quenching factor (≥ 0.93), have been experimentally determined and are discussed in detail. The spectrum of recoils induced by α decays has been directly observed for the first time in coincidence with the α particle pulse. Preliminary limits on the spin-independent cross sections of WIMPs on Te and O nuclei have been obtained.
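For orientation, the quenching factor q quoted above has a standard definition (this is textbook usage, not quoted from the paper): it relates the measured, electron-equivalent energy of a nuclear recoil to its true recoil energy,

    E_{\mathrm{ee}} = q \, E_R , \qquad
    q = \frac{\text{signal per unit energy for nuclear recoils}}{\text{signal per unit energy for electrons}}

so a value q ≥ 0.93 means the thermal signal of a TeO2 bolometer is nearly the same for nuclear recoils as for electrons of equal energy, and measured recoil spectra need almost no rescaling.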
The bolometers as nuclear recoil detectors
Our group has been involved for ten years in experiments using bolometric detectors for rare event searches such as double beta decay or Dark Matter interactions. During the last year, to check the quenching factor of TeO2 bolometers, we measured nuclear recoils at energies as low as 15 keV in our experimental apparatus at Laboratori Nazionali del Gran Sasso. Two 72 g TeO2 detectors were exposed under vacuum to a 228Ra α source that implanted 224Ra nuclei on them. The nuclei emitted by the implanted source were detected in one bolometer in coincidence with the corresponding α particles in the other. The energy spectrum of the 103.4 keV 224Ra nuclei has been obtained with an energy resolution of about 12 keV. Furthermore, an α measurement of Roman lead has also exploited the sensitivity of this technique to check for ultralow activity in matter, taking advantage of the source = detector approach. A limit on the 210Pb contamination in Roman lead as low as 4 mBq/kg has been obtained. © 1998 Elsevier Science B.V. All rights reserved.
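As a worked check of the quoted recoil scale (the numerical assignment here is our illustration, not taken from the abstract): momentum conservation in a two-body α decay gives

    p_\alpha = p_R \quad\Rightarrow\quad
    E_R = \frac{m_\alpha}{m_R}\, E_\alpha
        \approx \frac{4}{220} \times 5.7\ \mathrm{MeV}
        \approx 103\ \mathrm{keV}

i.e. an α particle of roughly 5.7 MeV recoiling against a daughter nucleus of mass number near 220 leaves about 103 keV in the nucleus, which is the scale of the 103.4 keV recoil spectrum reported above.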
The CRESST Dark Matter Search
The current status of CRESST (Cryogenic Rare Event Search using Superconducting Thermometers) and new results concerning the detector development are presented. The basic technique of CRESST is to search for particle Dark Matter (WIMPs, Weakly Interacting Massive Particles) by the measurement of non-thermal phonons created by WIMP-induced nuclear recoils. Combined with the newly developed method of simultaneous measurement of scintillation light, strong background discrimination is possible, resulting in a substantial increase in WIMP detection sensitivity. The short- and long-term perspectives of CRESST are discussed.
Comment: 12 pages, 6 figures
Retrospective Analysis of Patients With Prostate Cancer Initiating GnRH Agonists/Antagonists Therapy Using a German Claims Database: Epidemiological and Patient Outcomes
Objective: The objective of this study was to obtain real-world information on gonadotropin-releasing hormone agonist/antagonist (GnRHa) therapy in patients with advanced prostate cancer (PCa).
Materials and methods: Anonymized, routine healthcare claims data from approx. 75 German statutory health insurance funds from 2010–2015 (n = 4,205,227) were analyzed. Patients had an enrolment of 1 year before GnRHa, 1 index quarter of initial GnRHa prescription and ≥2 years of follow-up.
Results: In total, 2,382 patients with PCa were eligible. The most frequent index therapy was leuprolide (56.6%). The rank order of PCa comorbidity prevalence was consistent over time (% at index; % at 3 years of follow-up): hypertension (71.5; 85.0), hyperlipidemia (45.2; 60.8), cardiovascular disease (CVD) (35.7; 54.1), and diabetes (28.3; 36.2). Comparing pooled therapy classes (agonists, hybrids, and antagonist), no significant differences in the incidence of CVD or diabetes were observed. For hypertension, there was a significant increase for agonists (16.4%) compared to antagonists (6.9%, p = 0.022) and the leuprolide hybrid group (11.6%, p = 0.006). During the follow-up period, 23.9% of all PCa patients died. There were no significant differences in mortality or discontinuation rates between the cohorts. In total, 11.2% of all patients discontinued GnRHa after the first prescription; the mean time to first switch to another GnRHa therapy was 100 days earlier for hybrids than for agonists (p = 0.016).
Conclusion: This comparative retrospective analysis provides real-world information about healthcare characteristics and treatment patterns, highlighting the impact of different GnRHa on clinical outcomes for patients with advanced PCa in Germany.
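Comparisons like the 16.4% vs. 6.9% hypertension incidence above are typically tested as a difference between two proportions. A sketch with a standard chi-square test follows; the cohort sizes are invented, since the abstract reports only percentages, and the paper's actual test may differ.

    # Sketch: testing a difference in incidence between two therapy cohorts,
    # as in the hypertension comparison above (16.4% vs 6.9%). Cohort sizes
    # are hypothetical; the abstract does not give group sizes.
    from scipy.stats import chi2_contingency

    n_agonist, n_antagonist = 1500, 300            # hypothetical cohort sizes
    events_agonist = round(0.164 * n_agonist)      # new hypertension cases
    events_antagonist = round(0.069 * n_antagonist)

    table = [
        [events_agonist, n_agonist - events_agonist],
        [events_antagonist, n_antagonist - events_antagonist],
    ]
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}")

Whether such a difference reaches significance depends strongly on the assumed group sizes, which is why the paper's reported p-values cannot be reproduced from the percentages alone.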