Radon backgrounds in the DEAP-1 liquid-argon-based Dark Matter detector
The DEAP-1 7 kg single-phase liquid argon scintillation detector was
operated underground at SNOLAB in order to test the techniques and measure the
backgrounds inherent to single phase detection, in support of the
DEAP-3600 Dark Matter detector. Backgrounds in DEAP are controlled
through material selection, construction techniques, pulse shape discrimination
and event reconstruction. This report details the analysis of background events
observed in three iterations of the DEAP-1 detector, and the measures taken to
reduce them.
The ²²²Rn decay rate in the liquid argon was measured to be between 16
and 26 μBq/kg. We found that the background
spectrum near the region of interest for Dark Matter detection in the DEAP-1
detector can be described considering events from three sources: radon
daughters decaying on the surface of the active volume, the expected rate of
electromagnetic events misidentified as nuclear recoils due to inefficiencies
in the pulse shape discrimination, and leakage of events from outside the
fiducial volume due to imperfect position reconstruction. These backgrounds
statistically account for all observed events, and they will be strongly
reduced in the DEAP-3600 detector due to its higher light yield and simpler
geometry.
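What "statistically account for" means here can be made concrete with a toy consistency check (Python); the component rates and observed count below are purely hypothetical placeholders, not DEAP-1 measurements:

```python
from scipy.stats import poisson

# Hypothetical rates (events in the region of interest) for the three
# background sources named in the abstract; NOT DEAP-1's measured values.
surface_rn_daughters = 0.9   # radon daughters decaying on the active-volume surface
psd_leakage = 0.3            # electromagnetic events surviving pulse-shape discrimination
fiducial_leakage = 0.8       # mis-reconstructed events leaking into the fiducial volume

predicted = surface_rn_daughters + psd_leakage + fiducial_leakage
observed = 2                 # hypothetical observed event count

# "Statistically account for" = the observed count is consistent with a
# Poisson fluctuation around the summed prediction.
p_value = poisson.sf(observed - 1, predicted)   # P(N >= observed | predicted)
print(f"predicted {predicted:.1f}, observed {observed}, P(N>=obs) = {p_value:.2f}")
```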
Improving Photoelectron Counting and Particle Identification in Scintillation Detectors with Bayesian Techniques
Many current and future dark matter and neutrino detectors are designed to
measure scintillation light with a large array of photomultiplier tubes (PMTs).
The energy resolution and particle identification capabilities of these
detectors depend in part on the ability to accurately identify individual
photoelectrons in PMT waveforms despite large variability in pulse amplitudes
and pulse pileup. We describe a Bayesian technique that can identify the times
of individual photoelectrons in a sampled PMT waveform without deconvolution,
even when pileup is present. To demonstrate the technique, we apply it to the
general problem of particle identification in single-phase liquid argon dark
matter detectors. Using the output of the Bayesian photoelectron counting
algorithm described in this paper, we construct several test statistics for
rejection of backgrounds for dark matter searches in argon. Compared to simpler
methods based on either observed charge or peak finding, the photoelectron
counting technique improves both energy resolution and particle identification
of low energy events in calibration data from the DEAP-1 detector and
simulation of the larger MiniCLEAN dark matter detector.
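The paper's algorithm is not reproduced here, but the toy sketch below (Python) illustrates the general idea of identifying individual photoelectron times in a sampled waveform without deconvolution, even under pileup; the SPE template, noise level, prior, and the greedy Bayes-factor rule are all illustrative assumptions of ours, not the paper's method:

```python
import numpy as np

# Toy sketch of Bayesian photoelectron (PE) counting: greedily add
# single-photoelectron template pulses wherever an (approximate) Bayes
# factor favors "one more PE here" over "noise only".

rng = np.random.default_rng(0)
n, sigma = 200, 0.05                       # samples, Gaussian noise RMS
t = np.arange(n)

def spe(t0):                               # unit-amplitude SPE template
    x = t - t0
    return np.where(x >= 0, (x / 3.0) * np.exp(1.0 - x / 3.0), 0.0)

true_times = [50, 58, 120]                 # includes a pileup pair
wave = sum(spe(t0) for t0 in true_times) + rng.normal(0.0, sigma, n)

templates = np.array([spe(t0) for t0 in t])
norms = (templates * templates).sum(axis=1)
norms[norms == 0] = 1e30                   # neutralize the degenerate end-of-trace template

prior_odds = (3 / n) / (1 - 3 / n)         # prior odds of a PE in any given sample
found, residual = [], wave.copy()
while True:
    amp = templates @ residual / norms     # best-fit amplitude at each candidate time
    dchi2 = amp**2 * norms / sigma**2      # chi^2 improvement from adding that pulse
    # Approximate log Bayes factor; the Occam penalty from marginalizing
    # the amplitude is omitted for brevity.
    log_bf = 0.5 * dchi2 + np.log(prior_odds)
    k = int(np.argmax(np.where(amp > 0, log_bf, -np.inf)))
    if log_bf[k] < 0 or amp[k] <= 0:
        break                              # no time bin favors another PE
    found.append(k)
    residual = residual - amp[k] * spe(k)

print("true PE times:", true_times, "reconstructed:", sorted(found))
```

A full treatment would also marginalize over pulse amplitudes (which contributes an Occam penalty to the Bayes factor) and explore the posterior over pulse counts rather than taking a greedy maximum.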
Towards Critical Human Resource Management Education (CHRME): a sociological imagination approach
This article explores the professional standing of the discipline of human resource management (HRM) in business schools in the post-financial crisis period. Using the prism of the sociological imagination, it explains the learning to be gained from teaching HRM that is sensitive to context, power and inequality. The context of crisis provides ideal circumstances for critical reflexivity and for integrating wider societal issues into the HRM curriculum. It argues for Critical Human Resource Management Education or CHRME, which, if adopted, would be an antidote to prescriptive practitioner-oriented approaches. It proceeds to set out five principles for CHRME: using the 'sociological imagination' prism; emphasizing the social nature of the employment relationship; investigating paradox within HRM; designing learning outcomes that encourage students to appraise HRM outcomes critically; and engaging in reflexive critique. Crucially, CHRME offers a teaching strategy that does not neglect or marginalize the reality of structural power, inequality and employee work experiences.
Recommendations for The Conduct of Economic Evaluations in Osteoporosis: Outcomes of An Experts’ Consensus Meeting Organized by The European Society for Clinical and Economic Aspects of Osteoporosis, Osteoarthritis and Musculoskeletal Diseases (ESCEO) And the US Branch of The International Osteoporosis Foundation
Summary
Economic evaluations are increasingly used to assess the value of health interventions, but variable quality and heterogeneity limit the use of these evaluations by decision-makers. These recommendations provide guidance for the design, conduct, and reporting of economic evaluations in osteoporosis to improve their transparency, comparability, and methodologic standards.
Introduction
This paper aims to provide recommendations for the conduct of economic evaluations in osteoporosis in order to improve their transparency, comparability, and methodologic standards.
Methods
A working group was convened by the European Society for Clinical and Economic Aspects of Osteoporosis, Osteoarthritis and Musculoskeletal Diseases to make recommendations for the design, conduct, and reporting of economic evaluations in osteoporosis, to define an osteoporosis-specific reference case to serve as a minimum standard for all economic analyses in osteoporosis, to discuss methodologic challenges, and to initiate a call for research. A literature review, a face-to-face meeting in New York City (including 11 experts), and a review/approval by a larger group of experts worldwide (23 experts in total) were conducted.
Results
Recommendations on the type of economic evaluation, methods for economic evaluation, modeling aspects, base-case analysis and population, excess mortality, fracture costs and disutility, treatment characteristics, and model validation were provided. Recommendations for reporting economic evaluations in osteoporosis were also made, and an osteoporosis-specific checklist was designed that includes items to report when performing an economic evaluation in osteoporosis. Further, 12 minimum criteria for economic evaluations in osteoporosis were identified, and 12 methodologic challenges and needs for further research were discussed.
Conclusion
While the working group acknowledges challenges and the need for further research, these recommendations are intended to supplement general and national guidelines for economic evaluations, improve the transparency, quality, and comparability of economic evaluations in osteoporosis, and maintain methodologic standards to increase their use by decision-makers.
The Large Enriched Germanium Experiment for Neutrinoless Double Beta Decay (LEGEND)
The observation of neutrinoless double-beta decay (0νββ)
would show that lepton number is violated, reveal that neutrinos are Majorana
particles, and provide information on neutrino mass. A discovery-capable
experiment covering the inverted ordering region, with effective Majorana
neutrino masses of 15 - 50 meV, will require a tonne-scale experiment with
excellent energy resolution and extremely low backgrounds, at the level of
0.1 counts/(FWHM t yr) in the region of the signal. The
current-generation ⁷⁶Ge experiments, GERDA and the MAJORANA DEMONSTRATOR,
utilizing high-purity germanium detectors with an intrinsic energy resolution
of 0.12%, have achieved the lowest backgrounds by over an order of magnitude in
the 0νββ signal region of all 0νββ
experiments. Building on this success, the LEGEND collaboration has been formed
to pursue a tonne-scale ⁷⁶Ge experiment. The collaboration aims to develop
a phased 0νββ experimental program with discovery potential
at a half-life approaching or at 10²⁸ years, using existing resources as
appropriate to expedite physics results.
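A back-of-the-envelope check (Python) shows why tonne-scale mass, excellent resolution, and a ~0.1 counts/(FWHM t yr) background go together at this target half-life; the 10-year live time and the assumption that the full tonne is ⁷⁶Ge are illustrative choices of ours:

```python
import math

# Expected number of 0vbb decays: N = ln(2) * N_atoms * t / T_half.
N_A = 6.022e23            # atoms per mol
mass_g = 1.0e6            # one tonne, assumed entirely 76Ge (illustrative)
molar_mass = 76.0         # g/mol for 76Ge
T_half = 1e28             # years, the target half-life quoted above
exposure_yr = 10.0        # illustrative live time

n_atoms = mass_g / molar_mass * N_A
decays = math.log(2) * n_atoms * exposure_yr / T_half
print(f"expected signal decays in {exposure_yr:.0f} yr: {decays:.1f}")  # ~5.5
```

With only a handful of signal events per decade, even a few background counts in the energy region of interest would erase the discovery potential, hence the stringent background goal.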
Measurement of the scintillation time spectra and pulse-shape discrimination of low-energy beta and nuclear recoils in liquid argon with DEAP-1
The DEAP-1 low-background liquid argon detector was used to measure
scintillation pulse shapes of electron and nuclear recoil events and to
demonstrate the feasibility of pulse-shape discrimination (PSD) down to an
electron-equivalent energy of 20 keV.
In the surface dataset, using a triple-coincidence tag, we found the fraction
of beta events that are misidentified as nuclear recoils to be (90% C.L.) for energies between 43 and 86 keVee and for a nuclear recoil
acceptance of at least 90%, with 4% systematic uncertainty on the absolute
energy scale. The discrimination measurement on surface was limited by nuclear
recoils induced by cosmic-ray generated neutrons. This was improved by moving
the detector to the SNOLAB underground laboratory, where the reduced background
rate allowed the same measurement with only a double-coincidence tag.
The combined data set contains events. One of those, in the
underground data set, is in the nuclear-recoil region of interest. Taking into
account the expected background of 0.48 events coming from random pileup, the
resulting upper limit on the electronic recoil contamination is
(90% C.L.) between 44 and 89 keVee and for a nuclear recoil
acceptance of at least 90%, with 6% systematic uncertainty on the absolute
energy scale.
We developed a general mathematical framework to describe PSD parameter
distributions and used it to build an analytical model of the distributions
observed in DEAP-1. Using this model, we project a misidentification fraction
of approx. for an electron-equivalent energy threshold of 15 keV for
a detector with 8 PE/keVee light yield. This reduction enables a search for
spin-independent scattering of WIMPs from 1000 kg of liquid argon with a
WIMP-nucleon cross-section sensitivity of cm², assuming
negligible contribution from nuclear recoil backgrounds.
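For context, the PSD parameter conventionally used in liquid argon is the prompt-light fraction ("Fprompt"): nuclear recoils produce mostly fast singlet light, electron recoils mostly slow triplet light. The toy Monte Carlo below (Python) illustrates the separation; the lifetimes, prompt window, and singlet fractions are generic textbook-level values, not DEAP-1's calibrated ones:

```python
import numpy as np

# Toy Fprompt PSD: singlet light decays in ~6 ns, triplet in ~1.5 us, so
# the fraction of photoelectrons in a prompt window separates recoil types.
rng = np.random.default_rng(1)

def simulate_pe_times(n_pe, singlet_fraction):
    n_fast = rng.binomial(n_pe, singlet_fraction)
    fast = rng.exponential(6.0, n_fast)          # ns, singlet component
    slow = rng.exponential(1500.0, n_pe - n_fast)  # ns, triplet component
    return np.concatenate([fast, slow])

def fprompt(times, window_ns=150.0):
    return np.mean(times < window_ns)            # prompt-light fraction

nr = [fprompt(simulate_pe_times(100, 0.7)) for _ in range(1000)]  # nuclear-recoil-like
er = [fprompt(simulate_pe_times(100, 0.3)) for _ in range(1000)]  # electron-recoil-like
print(f"mean Fprompt: NR {np.mean(nr):.2f}, ER {np.mean(er):.2f}")  # ~0.73 vs ~0.37
```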
Precise measurement of the W-boson mass with the CDF II detector
We have measured the W-boson mass MW using data corresponding to 2.2 fb⁻¹ of
integrated luminosity collected in proton-antiproton collisions at 1.96 TeV
with the CDF II detector at the Fermilab Tevatron collider. Samples consisting
of 470126 W->enu candidates and 624708 W->munu candidates yield the measurement
MW = 80387 ± 12 (stat) ± 15 (syst) = 80387 ± 19 MeV. This is the most
precise measurement of the W-boson mass to date and significantly exceeds the
precision of all previous measurements combined.
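The quoted total uncertainty is simply the statistical and systematic components combined in quadrature: √(12² + 15²) MeV = √369 MeV ≈ 19 MeV.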
The Internet for weight control in an obese sample: results of a randomised controlled trial
Rising levels of obesity coupled with the limited success of currently available weight control methods highlight the need for investigation of novel approaches to obesity treatment. This study aims to determine the effectiveness and cost-effectiveness of an Internet-based resource for obesity management.
Exercising control at the urban scale: Towards a theory of spatial organisation and surveillance
The purpose of this chapter is to explore how urban spaces are implicated in the control and surveillance of users in a culture saturated by the notion of the self as a consuming body or entity. Using the work of Foucault on disciplinary cultures, Lefebvre in relation to the production of space, and other seminal theorists such as Baudrillard, Bauman, Shields, and Walzer, a model for analysing the three dimensions of social spatialisation is proposed and illustrated by reference to contemporary public spaces, specifically spaces of mundane leisure such as shopping malls and high streets. The chapter deals with how the public realm as a controlling space has been theorised in terms of opposition to such controlling tendencies: from the flaneur, through the self-constructed narratives of De Certeau's walker, to the digitally 'enhanced' individual of today, appropriating space via technology and their own projects in Tinder and the like, and through other potentially subversive media.
Can we derive an 'exchange rate' between descriptive and preference-based outcome measures for stroke? Results from the transfer to utility (TTU) technique
Background
Stroke-specific outcome measures and descriptive measures of health-related quality of life (HRQoL) are unsuitable for informing decision-makers of the broader consequences of increasing or decreasing funding for stroke interventions. The quality-adjusted life year (QALY) provides a common metric for comparing interventions over multiple dimensions of HRQoL and mortality differentials. There are, however, many circumstances when – because of timing, lack of foresight or cost considerations – only stroke-specific or descriptive measures of health status are available and some indirect means of obtaining QALY-weights becomes necessary. In such circumstances, the use of regression-based transformations or mappings can circumvent the failure to elicit QALY-weights by allowing predicted weights to proxy for observed weights. This regression-based approach has been dubbed 'Transfer to Utility' (TTU) regression. The purpose of the present study is to demonstrate the feasibility and value of TTU regression in stroke by deriving transformations or mappings from stroke-specific and generic but descriptive measures of health status to a generic preference-based measure of HRQoL in a sample of Australians with a diagnosis of acute stroke. Findings will quantify the additional error associated with the use of condition-specific to generic transformations in stroke.
Methods
We used TTU regression to derive empirical transformations from three commonly used descriptive measures of health status for stroke (NIHSS, Barthel and SF-36) to a preference-based measure (AQoL) suitable for attaching QALY-weights to stroke disease states, based on 2570 observations drawn from a sample of 859 patients with stroke.
Results
Transformations from the SF-36 to the AQoL explained up to 71.5% of variation in observed AQoL scores. Differences between mean predicted and mean observed AQoL scores from the 'severity-specific' item- and subscale-based SF-36 algorithms and from the 'moderate to severe' index- and item-based Barthel algorithm were neither clinically nor statistically significant when 'low severity' SF-36 transformations were used to predict AQoL scores for patients in the NIHSS = 0 and NIHSS = 1-5 subgroups and when 'moderate to severe' transformations were used to predict AQoL scores for patients in the NIHSS ≥ 6 subgroup. In contrast, the differences between mean predicted and mean observed AQoL scores from the NIHSS algorithms and from the 'low severity' Barthel algorithms reached levels that could mask minimally important differences on the AQoL scale.
Conclusion
While our NIHSS to AQoL transformations proved unsuitable for most applications, our findings demonstrate that stroke-relevant outcome measures such as the SF-36 and Barthel Index can be adequately transformed to preference-based measures for the purposes of economic evaluation.
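A minimal sketch of the TTU idea (Python): regress a preference-based utility on descriptive scores, then use the fitted mapping to predict QALY-weights where only the descriptive measure was collected. The data are synthetic and the subscale names hypothetical; the actual study fits severity-specific algorithms rather than a single pooled model:

```python
import numpy as np

# Synthetic illustration of Transfer-to-Utility (TTU) regression: map
# descriptive health-status scores to a preference-based utility so that
# predicted QALY-weights can proxy for unobserved ones.
rng = np.random.default_rng(42)
n = 300
sf36_pf = rng.uniform(0, 100, n)   # hypothetical SF-36 physical functioning scores
sf36_mh = rng.uniform(0, 100, n)   # hypothetical SF-36 mental health scores
# Synthetic "observed" AQoL utilities on the 0-1 scale.
aqol = np.clip(0.1 + 0.006 * sf36_pf + 0.003 * sf36_mh
               + rng.normal(0.0, 0.08, n), 0.0, 1.0)

# Ordinary least squares: AQoL ~ intercept + SF-36 subscales.
X = np.column_stack([np.ones(n), sf36_pf, sf36_mh])
beta, *_ = np.linalg.lstsq(X, aqol, rcond=None)

# Predict a QALY-weight for a patient with only SF-36 scores recorded.
new_patient = np.array([1.0, 65.0, 70.0])
print(f"predicted AQoL utility: {new_patient @ beta:.3f}")
```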