Lessons Learnt from WLCG Service Deployment
This paper summarises the main lessons learnt from deploying WLCG production services, with a focus on Reliability, Scalability and Accountability, which together lead to both manageability and usability. Each topic is analysed in turn. Techniques for achieving zero user-visible downtime for the main service interventions are described, together with pathological cases that need special treatment. The requirements in terms of scalability are analysed, calling for as much robustness and automation in the service as possible. The different aspects of accountability - which cover measuring, tracking, logging and monitoring what is going on, and what has gone on - are examined, with the goal of attaining a manageable service. Finally, a simple analogy is drawn with the Web in terms of usability: what do we need to achieve to cross the chasm from small-scale adoption to ubiquity?
Databases in High Energy Physics: a critical review
The year 2000 is marked by a plethora of significant milestones in the history of High Energy Physics. Not only the true numerical end to the second millennium, this watershed year saw the final run of CERN's Large Electron-Positron collider (LEP) - the world-class machine that had been the focus of the lives of many of us for such a long time. It is also closely related to the subject of this chapter in the following respects:
- Classified as a nuclear installation, information on the LEP machine must be retained indefinitely. This represents a challenge to the database community that is almost beyond discussion - archiving of data for a relatively small number of years is indeed feasible, but retaining it for centuries, millennia or more is a very different issue;
- There are strong scientific arguments as to why the data from the LEP machine should be retained for a short period. However, the complexity of the data itself, the associated metadata and the programs that manipulate it make even this a huge challenge;
- The story of databases in HEP is closely linked to that of LEP itself: what were the basic requirements that were identified in the early years of LEP preparation? How well have these been satisfied? What are the remaining issues and key messages?
- Finally, the year 2000 also marked the entry of Grid architectures onto the central stage of HEP computing. How has the Grid affected the requirements on databases or the manner in which they are deployed? Furthermore, as the LEP tunnel and even parts of the detectors that it housed are readied for re-use for the Large Hadron Collider (LHC), how have our requirements on databases evolved at this new scale of computing?
A number of the key players in the field of databases - as can be seen from the author list of the various publications - have since retired from the field or else this world.
Given the fallibility of human memory, a record of the use of databases for physics data processing is clearly needed before memories fade completely and the story is lost forever. This account is necessarily somewhat CERN-centric, although an effort has been made to cover important developments and events elsewhere. Frequent reference is made to the Computing in High Energy Physics (CHEP) conference series - the most accessible and consistent record of this field.
The correlation space of Gaussian latent tree models and model selection without fitting
We provide a complete description of possible covariance matrices consistent
with a Gaussian latent tree model for any tree. We then present techniques for
utilising these constraints to assess whether observed data is compatible with
that Gaussian latent tree model. Our method does not require us first to fit
such a tree. We demonstrate the usefulness of the inverse-Wishart distribution
for performing preliminary assessments of tree-compatibility using
semialgebraic constraints. Using results from Drton et al. (2008) we then
provide the appropriate moments required for test statistics for assessing
adherence to these equality constraints. These are shown to be effective even
for small sample sizes and can be easily adjusted to test either the entire
model or only certain macrostructures hypothesized within the tree. We
illustrate our exploratory tetrad analysis using a linguistic application and
our confirmatory tetrad analysis using a biological application.
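As a toy illustration of the kind of equality constraint involved (not the paper's own procedure), the sketch below simulates data from a one-factor Gaussian model - the simplest Gaussian latent tree, for which every tetrad sigma_ij*sigma_kl - sigma_ik*sigma_jl with distinct indices vanishes in the population - and evaluates the sample tetrad. All variable names and the acceptance threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n observations from a one-factor Gaussian model:
# X_i = lambda_i * Z + noise.  For such a model every tetrad
# sigma_ij*sigma_kl - sigma_ik*sigma_jl (i, j, k, l distinct)
# is exactly zero in the population.
n = 5000
loadings = np.array([0.9, 0.8, 0.7, 0.6])  # hypothetical loadings
z = rng.standard_normal(n)
x = np.outer(z, loadings) + 0.5 * rng.standard_normal((n, 4))

S = np.cov(x, rowvar=False)  # 4x4 sample covariance matrix

def tetrad(S, i, j, k, l):
    """Sample tetrad difference for the split {i,j} vs {k,l}."""
    return S[i, j] * S[k, l] - S[i, k] * S[j, l]

# Near zero (up to sampling error) when the tree model is plausible;
# a formal test would compare this against its sampling distribution.
print(tetrad(S, 0, 1, 2, 3))
```

A confirmatory analysis would replace the informal "near zero" check with a test statistic whose moments account for the sampling variability of S, as the abstract describes.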
The Green Guide to Specification - An Environmental Legacy. Reducing the Environmental Impact of Buildings
The Green Guide to Specification is an environmental profiling system that enables designers and constructors to select building materials and components which have the lowest environmental impact. Designed and developed at Oxford Brookes University, the Green Guide methodology provides the construction industry with reliable environmental evaluations based on quantitative Life Cycle Assessment (LCA) data. Now in its 4th edition and part of the BREEAM and Code for Sustainable Homes programmes, Green Guide has been used to reduce environmental impacts for over 230,000 recorded construction projects, with a further 1.07 million projects registered awaiting certification worldwide. In 2009, the Green Guide was adopted as the official design standard for all construction materials used in the London 2012 Olympics
Visualisation of bacterial behaviour using tapping-mode atomic force microscopy
Ex-situ and in-situ Tapping Mode AFM were used to investigate responses of attached bacteria to stressful conditions. For ex-situ measurements, the AFM was equipped with a customised re-positioning stage and sample mount to permit re-examination of the same surface area. For in-situ measurements, the inoculated pyrite coupon was immersed in solution in a flow-through cell. Initial experiments using Sulfobacillus thermosulfidooxidans indicated that increased acidity promoted EPS production but increased salinity resulted in cell detachment.
Adherence with NICE guidance on lifestyle advice for people with schizophrenia: a survey
Background
Substantial weight gain is common in people taking antipsychotics. NICE recommends these patients
are offered physical health screening and intervention. The STEPWISE trial is currently evaluating a
lifestyle education programme in addition to usual care. However, it is difficult to define what
constitutes "usual care".
Aims
To define "usual care" for lifestyle management in people with schizophrenia, schizoaffective
disorder and first episode psychosis in STEPWISE study sites.
Method
Ten NHS Mental Health Trusts participated in a bespoke survey based on NICE guidance.
Results
Eight trusts reported offering lifestyle education programmes. Nine Trusts reported offering smoking
cessation support. Reported recording of biomedical measures varied.
Conclusions
No consistent lifestyle education programme is currently offered across UK NHS Mental Health
Trusts. The survey benchmarks "usual care" for the STEPWISE study, against which changes can be
measured.
Gaussian tree constraints applied to acoustic linguistic functional data
Evolutionary models of languages are usually considered to take the form of trees. With the development of so-called tree constraints, the plausibility of the tree model assumptions can be assessed by checking whether the moments of observed variables lie within regions consistent with Gaussian latent tree models. In our linguistic application, the data set comprises acoustic samples (audio recordings) from speakers of five Romance languages or dialects. The aim is to assess these functional data for compatibility with a hereditary tree model at the language level. A novel combination of canonical function analysis (CFA) with a separable covariance structure produces a representative basis for the data. The separable-CFA basis is formed of components which emphasize language differences whilst maintaining the integrity of the observational language-groupings. A previously unexploited Gaussian tree constraint is then applied to component-by-component projections of the data to investigate adherence to an evolutionary tree. The results highlight some aspects of Romance language speech that appear compatible with an evolutionary tree model but indicate that it would be inappropriate to model all features as such.
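The separable covariance structure mentioned above has a simple algebraic form: the covariance of the vectorised functional observations factorises as a Kronecker product of a temporal and a feature covariance. The sketch below illustrates that structure with small made-up matrices; it is not the paper's estimation procedure, and the matrices T and F are purely hypothetical.

```python
import numpy as np

# A separable covariance models Cov(vec(X)) as kron(T, F), where T
# captures dependence across time points and F across feature
# dimensions.  T and F below are illustrative placeholders.
T = np.array([[1.0, 0.6],
              [0.6, 1.0]])   # hypothetical temporal covariance
F = np.array([[2.0, 0.3],
              [0.3, 1.0]])   # hypothetical feature covariance

sigma = np.kron(T, F)        # full 4x4 separable covariance

# Separability means every (time i, time j) block of sigma is a
# scalar multiple T[i, j] * F of the feature covariance:
block = sigma[0:2, 2:4]
print(np.allclose(block, T[0, 1] * F))  # True
```

This block structure is what makes separable models tractable for functional data: only the small factors T and F need to be estimated, rather than the full covariance.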
An Improved Experimental Limit on the Electric Dipole Moment of the Neutron
An experimental search for an electric-dipole moment (EDM) of the neutron has
been carried out at the Institut Laue-Langevin (ILL), Grenoble. Spurious
signals from magnetic-field fluctuations were reduced to insignificance by the
use of a cohabiting atomic-mercury magnetometer. Systematic uncertainties,
including geometric-phase-induced false EDMs, have been carefully studied. Two
independent approaches to the analysis have been adopted. The overall results
may be interpreted as an upper limit on the absolute value of the neutron EDM
of |d_n| < 2.9 x 10^{-26} e cm (90% CL). Comment: 5 pages, 2 figures. The published PRL is
slightly more terse (e.g. no section headings) than this version, due to space constraints.
Note that a small correction-to-a-correction led to an adjustment of the final limit from
3.0 to 2.9 x 10^{-26} e cm compared to the first version of this preprint.
- …