Running a Production Grid Site at the London e-Science Centre
This paper describes how the London e-Science Centre cluster MARS, a production 400+ Opteron CPU cluster, was integrated into the production Large Hadron Collider Compute Grid. It describes the practical issues that we encountered when deploying and maintaining this system, and details the techniques that were applied to resolve them. Finally, we provide a set of recommendations based on our experiences for grid software development in general that we believe would make the technology more accessible. © 2006 IEEE
GRIDCC: Real-time workflow system
The Grid is a concept which allows the sharing of resources between distributed communities, allowing each to progress towards potentially different goals. As adoption of the Grid increases, so do the activities that people wish to conduct through it. The GRIDCC project is a European Union funded project addressing the issues of integrating instruments into the Grid. This places greater demands on workflows, and on the Quality of Service (QoS) guarantees for those workflows, as many of these instruments have real-time requirements. In this paper we present the workflow management service within the GRIDCC project, which is tasked with optimising the workflows and ensuring that they meet the pre-defined QoS requirements specified for them.
Frequentist Analysis of the Parameter Space of Minimal Supergravity
We make a frequentist analysis of the parameter space of minimal supergravity
(mSUGRA), in which, as well as the gaugino and scalar soft
supersymmetry-breaking parameters being universal, there is a specific relation
between the trilinear, bilinear and scalar supersymmetry-breaking parameters,
A_0 = B_0 + m_0, and the gravitino mass is fixed by m_{3/2} = m_0. We also
consider a more general model, in which the gravitino mass constraint is
relaxed (the VCMSSM). We combine in the global likelihood function the
experimental constraints from low-energy electroweak precision data, the
anomalous magnetic moment of the muon, the lightest Higgs boson mass M_h, B
physics and the astrophysical cold dark matter density, assuming that the
lightest supersymmetric particle (LSP) is a neutralino. In the VCMSSM, we find
a preference for values of m_{1/2} and m_0 similar to those found previously in
frequentist analyses of the constrained MSSM (CMSSM) and a model with common
non-universal Higgs masses (NUHM1). On the other hand, in mSUGRA we find two
preferred regions: one with larger values of both m_{1/2} and m_0 than in the
VCMSSM, and one with large m_0 but small m_{1/2}. We compare the probabilities
of the frequentist fits in mSUGRA, the VCMSSM, the CMSSM and the NUHM1: the
probability that mSUGRA is consistent with the present data is significantly
less than in the other models. We also discuss the mSUGRA and VCMSSM
predictions for sparticle masses and other observables, identifying potential
signatures at the LHC and elsewhere. Comment: 18 pages, 27 figures
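The frequentist fits described above combine independent experimental constraints into a single global likelihood, conventionally a chi-squared sum of pulls. The sketch below illustrates that combination only; the observables, central values, uncertainties, and parameter-space points are hypothetical placeholders, not the actual mSUGRA/VCMSSM inputs or results.

```python
# Sketch of a frequentist global chi^2 combination over independent
# constraints. All numbers below are illustrative placeholders.

def chi_squared(predictions, measurements):
    """Sum of squared pulls, (prediction - value)^2 / sigma^2."""
    total = 0.0
    for name, pred in predictions.items():
        value, sigma = measurements[name]
        total += ((pred - value) / sigma) ** 2
    return total

# Hypothetical measurements: observable -> (central value, uncertainty)
measurements = {
    "M_h [GeV]":       (125.1, 1.5),
    "(g-2)_mu x 1e9":  (2.87, 0.80),
    "Omega_CDM h^2":   (0.119, 0.009),
}

# Hypothetical model predictions at two points in parameter space
point_a = {"M_h [GeV]": 124.0, "(g-2)_mu x 1e9": 1.9, "Omega_CDM h^2": 0.121}
point_b = {"M_h [GeV]": 119.5, "(g-2)_mu x 1e9": 0.4, "Omega_CDM h^2": 0.140}

chi2_a = chi_squared(point_a, measurements)
chi2_b = chi_squared(point_b, measurements)
print(f"chi2(A) = {chi2_a:.2f}, chi2(B) = {chi2_b:.2f}")
# The point (and, after profiling, the model) with the lower global chi^2
# is preferred; models are then compared via their minimum chi^2 values.
```

Comparing fit probabilities across models, as the abstract does for mSUGRA versus the VCMSSM, CMSSM and NUHM1, amounts to comparing such minimum chi-squared values given each model's number of degrees of freedom.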
HEP Applications Evaluation of the EDG Testbed and Middleware
Workpackage 8 of the European Datagrid project was formed in January 2001
with representatives from the four LHC experiments, and with experiment
independent people from five of the six main EDG partners. In September 2002
WP8 was strengthened by the addition of effort from BaBar and D0. The original
mandate of WP8 was, following the definition of short- and long-term
requirements, to port experiment software to the EDG middleware and testbed
environment. A major additional activity has been testing the basic
functionality and performance of this environment. This paper reviews
experiences and evaluations in the areas of job submission, data management,
mass storage handling, information systems and monitoring. It also comments on
the problems of remote debugging, the portability of code, and scaling problems
with increasing numbers of jobs, sites and nodes. Reference is made to the
pioneering work of Atlas and CMS in integrating the use of the EDG Testbed
into their data challenges. A forward look is made to essential software
developments within EDG and to the necessary cooperation between EDG and LCG
for the LCG prototype due in mid 2003. Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics
Conference (CHEP03), La Jolla, CA, USA, March 2003, 7 pages. PSN THCT00
Specification and design for Full Energy Beam Exploitation of the Compact Linear Accelerator for Research and Applications
The Compact Linear Accelerator for Research and Applications (CLARA) is a 250
MeV ultrabright electron beam test facility at STFC Daresbury Laboratory. A
user beam line has been designed to maximise exploitation of CLARA in a variety
of fields, including novel acceleration and new modalities of radiotherapy. In
this paper we present the specification and design of this beam line for Full
Energy Beam Exploitation (FEBE). We outline the key elements that give
users access to ultrashort, low emittance electron bunches in two large
experiment chambers. The results of start-to-end simulations are reported which
verify the expected beam parameters delivered to these chambers. Key technical
systems are detailed, including those which facilitate combination of electron
bunches with high power laser pulses. Comment: 13 pages, 12 figures
Characterisation of the muon beams for the Muon Ionisation Cooling Experiment
A novel single-particle technique to measure emittance has been developed and used to characterise seventeen different muon beams for the Muon Ionisation Cooling Experiment (MICE). The muon beams, whose mean momenta vary from 171 to 281 MeV/c, have emittances of approximately 1.2–2.3 π mm-rad horizontally and 0.6–1.0 π mm-rad vertically, a horizontal dispersion of 90–190 mm and momentum spreads of about 25 MeV/c. There is reasonable agreement between the measured parameters of the beams and the results of simulations. The beams are found to meet the requirements of MICE.
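The abstract does not spell out the single-particle technique itself, but the quantity it measures is the standard RMS emittance: the square root of the determinant of the phase-space covariance matrix built from per-particle positions and angles. A minimal sketch of that statistical definition, using synthetic Gaussian particle data rather than MICE measurements (units and the π convention vary between communities):

```python
import math
import random

def rms_emittance(xs, xps):
    """Geometric RMS emittance sqrt(<x^2><x'^2> - <x x'>^2)
    from per-particle position x and angle x' samples."""
    n = len(xs)
    mx = sum(xs) / n
    mxp = sum(xps) / n
    sxx = sum((x - mx) ** 2 for x in xs) / n          # variance of x
    spp = sum((p - mxp) ** 2 for p in xps) / n        # variance of x'
    sxp = sum((x - mx) * (p - mxp)
              for x, p in zip(xs, xps)) / n           # covariance
    return math.sqrt(sxx * spp - sxp ** 2)

# Synthetic, uncorrelated Gaussian beam: sigma_x = 2 mm, sigma_x' = 1 mrad,
# so the expected RMS emittance is close to 2 mm-mrad.
rng = random.Random(42)
xs = [rng.gauss(0.0, 2.0) for _ in range(20000)]   # mm
xps = [rng.gauss(0.0, 1.0) for _ in range(20000)]  # mrad
print(f"emittance = {rms_emittance(xs, xps):.2f} mm-mrad")
```

Measuring each muon individually, as MICE does, allows this covariance matrix to be assembled particle by particle instead of being inferred from beam-profile monitors.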
Statistical inference and the replication crisis
The replication crisis has prompted many to call for statistical reform within the psychological sciences. Here we examine issues within Frequentist statistics that may have led to the replication crisis, and we examine the alternative—Bayesian statistics—that many have suggested as a replacement. The Frequentist approach and the Bayesian approach offer radically different perspectives on evidence and inference, with the Frequentist approach prioritising error control and the Bayesian approach offering a formal method for quantifying the relative strength of evidence for hypotheses. We suggest that, rather than mere statistical reform, what is needed is a better understanding of the different modes of statistical inference and of how statistical inference relates to scientific inference.
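The contrast the abstract draws between the two modes of inference can be made concrete with a textbook example, not taken from the paper: the same binomial data evaluated with a Frequentist p-value and with a Bayes factor. The data (60 successes in 100 trials) and the uniform prior under the alternative are illustrative choices.

```python
from math import comb

# Same data, two modes of inference: 60 successes in 100 trials,
# null hypothesis p = 0.5. Illustrative numbers only.
n, k = 100, 60

# Frequentist: two-sided p-value for the exact binomial test, summing the
# probability of every outcome at least as extreme (i.e. as improbable
# under H0) as the one observed. Error control: reject if p < alpha.
null_pmf = [comb(n, i) * 0.5 ** n for i in range(n + 1)]
p_value = sum(p for p in null_pmf if p <= null_pmf[k])

# Bayesian: Bayes factor for H0 (p = 0.5) against H1 (p ~ Uniform(0, 1)).
# Under the uniform prior, the marginal likelihood of any k is 1 / (n + 1).
bf_01 = null_pmf[k] / (1.0 / (n + 1))

print(f"p-value = {p_value:.4f}")          # evidence against H0
print(f"Bayes factor BF01 = {bf_01:.2f}")  # relative evidence for H0 vs H1
```

Here the p-value comes out below 0.06, which many would read as suggestive evidence against the null, while the Bayes factor is close to 1, i.e. nearly equivocal between the hypotheses: a small illustration of how the two modes can support different conclusions from identical data.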