
    The Long and Winding Road: Archiving and Re-Using Qualitative Data from 12 Research Projects Spanning 16 Years

    We describe a pilot project designed to assess the feasibility of re-use across 12 diverse qualitative datasets related to Human Immunodeficiency Virus (HIV) in the UK, from research projects undertaken between 1997 and 2013 – an approach that remains chronically underused. First, we consider the sweeping biomedical changes and imperatives relating to HIV in this time frame, offering a rationale for data re-use at this point in the epidemic. We then reflexively situate the processes and procedures we devised for this study with reference to relevant methodological literature. Hammersley’s and Leonelli’s contributions have been particularly instructive through this process, and, following their lead, we conclude with further considerations for those undertaking qualitative data re-use, reflecting on the extent to which qualitative data re-use as a practice requires attention to both the given and the constructed aspects of data when assembled as evidence.

    The Hamiltonian structure and Euler-Poincaré formulation of the Vlasov-Maxwell and gyrokinetic systems

    We present a new variational principle for the gyrokinetic system, similar to the Maxwell-Vlasov action presented in Ref. 1. The variational principle is in the Eulerian frame and based on constrained variations of the phase space fluid velocity and particle distribution function. Using a Legendre transform, we explicitly derive the field-theoretic Hamiltonian structure of the system. This is carried out with a modified Dirac theory of constraints, which is used to construct meaningful brackets from those obtained directly from Euler-Poincaré theory. Possible applications of these formulations include continuum geometric integration techniques, large-eddy simulation models and Casimir-type stability methods. [1] H. Cendra et al., Journal of Mathematical Physics 39, 3138 (1998).
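
    For orientation, the field-theoretic Hamiltonian structure referred to here is of the same general type as the well-known noncanonical structure of the Vlasov-Maxwell system. A minimal sketch (single species, rationalised units with m = q = c = 1; this is the textbook Vlasov-Maxwell case, not the paper's gyrokinetic bracket) is

        H[f, \mathbf{E}, \mathbf{B}] = \tfrac{1}{2}\int |\mathbf{v}|^{2}\, f(\mathbf{x},\mathbf{v})\,\mathrm{d}^{3}x\,\mathrm{d}^{3}v
        + \tfrac{1}{2}\int \bigl(|\mathbf{E}|^{2} + |\mathbf{B}|^{2}\bigr)\,\mathrm{d}^{3}x,
        \qquad \frac{\mathrm{d}F}{\mathrm{d}t} = \{F, H\},

    where {F, H} is a noncanonical Poisson bracket acting on functionals of (f, E, B); the bracket itself, and the Dirac-constraint modification described in the abstract, involve additional field-coupling terms whose exact form depends on sign and unit conventions.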

    Naturally Rehearsing Passwords

    We introduce quantitative usability and security models to guide the design of password management schemes: systematic strategies to help users create and remember multiple passwords. In the same way that security proofs in cryptography are based on complexity-theoretic assumptions (e.g., hardness of factoring and discrete logarithm), we quantify usability by introducing usability assumptions. In particular, password management relies on assumptions about human memory, e.g., that a user who follows a particular rehearsal schedule will successfully maintain the corresponding memory. These assumptions are informed by research in cognitive science and validated through empirical studies. Given rehearsal requirements and a user's visitation schedule for each account, we use the total number of extra rehearsals that the user would have to do to remember all of his passwords as a measure of the usability of the password scheme. Our usability model leads us to a key observation: password reuse benefits users not only by reducing the number of passwords that the user has to memorize, but more importantly by increasing the natural rehearsal rate for each password. We also present a security model which accounts for the complexity of password management with multiple accounts and associated threats, including online, offline, and plaintext password leak attacks. Observing that current password management schemes are either insecure or unusable, we present Shared Cues, a new scheme in which the underlying secret is strategically shared across accounts to ensure that most rehearsal requirements are satisfied naturally while simultaneously providing strong security. The construction uses the Chinese Remainder Theorem to achieve these competing goals.
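
    The usability measure just described (total extra rehearsals) can be made concrete with a small sketch. The expanding rehearsal intervals and the visit pattern below are hypothetical placeholders, not the schedules studied in the paper.

        # Sketch: counting "extra rehearsals" for one shared cue under an
        # expanding rehearsal schedule. Intervals and visit times are
        # hypothetical illustrations, not the paper's parameters.
        def extra_rehearsals(visit_days, horizon_days, base=1.0, factor=2.0):
            """Count rehearsal windows not covered by a natural account visit.

            visit_days   -- sorted days on which an account using this cue is visited
            horizon_days -- length of the simulated period in days
            base, factor -- expanding schedule: windows of length base, base*factor, ...
            """
            extra = 0
            start, length = 0.0, base
            while start < horizon_days:
                end = min(start + length, horizon_days)
                # A visit inside the window satisfies the rehearsal requirement
                # "naturally"; otherwise the user must do an extra rehearsal.
                if not any(start <= d < end for d in visit_days):
                    extra += 1
                start, length = end, length * factor
            return extra

        # Example: a cue used by an account visited roughly weekly for a year.
        weekly_visits = [7 * k for k in range(1, 53)]
        print(extra_rehearsals(weekly_visits, horizon_days=365))   # -> 3

    Sharing one cue across several accounts raises the visit rate seen by that cue, which is exactly the "natural rehearsal" benefit the abstract describes.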

    End TB strategy: the need to reduce risk inequalities

    Background: Diseases occur in populations whose individuals differ in essential characteristics, such as exposure to the causative agent, susceptibility given exposure, and infectiousness upon infection in the case of infectious diseases. Discussion: Concepts developed in demography more than 30 years ago assert that variability between individuals substantially affects the estimation of overall population risk from disease incidence data. Methods that ignore individual heterogeneity tend to underestimate overall risk and lead to overoptimistic expectations for control. Concerned that this phenomenon is frequently overlooked in epidemiology, here we feature its significance for interpreting global data on human tuberculosis and predicting the impact of control measures. Summary: We show that population-wide interventions have the greatest impact in populations where all individuals face an equal risk. Lowering variability in risk has great potential to increase the impact of interventions. Reducing inequality, therefore, empowers health interventions, which in turn improves health, further reducing inequality, in a virtuous circle.
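
    The underestimation described here follows from a standard frailty argument; a one-line sketch, assuming individual hazards λ are drawn from some distribution with mean \bar{\lambda}, is

        1 - \mathbb{E}\bigl[e^{-\lambda t}\bigr] \;\le\; 1 - e^{-\bar{\lambda} t} \qquad \text{(Jensen's inequality)},

    so a homogeneous model fitted to the observed cumulative incidence returns an effective hazard no larger than \bar{\lambda}, i.e. it underestimates the mean individual risk, and the gap widens as the variance of λ grows.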

    Risk related to pre–diabetes mellitus and diabetes mellitus in heart failure with reduced ejection fraction: insights from prospective comparison of ARNI with ACEI to determine impact on global mortality and morbidity in heart failure trial

    Background: The prevalence of pre–diabetes mellitus and its consequences in patients with heart failure and reduced ejection fraction are not known. We investigated these in the Prospective Comparison of ARNI With ACEI to Determine Impact on Global Mortality and Morbidity in Heart Failure (PARADIGM-HF) trial. Methods and Results: We examined clinical outcomes in 8399 patients with heart failure and reduced ejection fraction according to history of diabetes mellitus and glycemic status (baseline hemoglobin A1c [HbA1c]: <6.0% [<42 mmol/mol], 6.0%–6.4% [42–47 mmol/mol; pre–diabetes mellitus], and ≥6.5% [≥48 mmol/mol; diabetes mellitus]), in Cox regression models adjusted for known predictors of poor outcome. Patients with a history of diabetes mellitus (n=2907 [35%]) had a higher risk of the primary composite outcome of heart failure hospitalization or cardiovascular mortality compared with those without a history of diabetes mellitus: adjusted hazard ratio, 1.38; 95% confidence interval, 1.25 to 1.52; P<0.001. HbA1c measurement showed that an additional 1106 (13% of total) patients had undiagnosed diabetes mellitus and 2103 (25%) had pre–diabetes mellitus. The hazard ratios for patients with undiagnosed diabetes mellitus (HbA1c ≥6.5%) and known diabetes mellitus compared with those with HbA1c <6.0% were 1.39 (1.17–1.64; P<0.001) and 1.64 (1.43–1.87; P<0.001), respectively. Patients with pre–diabetes mellitus were also at higher risk (hazard ratio, 1.27 [1.10–1.47]; P<0.001) compared with those with HbA1c <6.0%. The benefit of LCZ696 (sacubitril/valsartan) compared with enalapril was consistent across the range of HbA1c in the trial. Conclusions: In patients with heart failure and reduced ejection fraction, dysglycemia is common and pre–diabetes mellitus is associated with a higher risk of adverse cardiovascular outcomes (compared with patients with no diabetes mellitus and HbA1c <6.0%). LCZ696 was beneficial compared with enalapril, irrespective of glycemic status.
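
    As an illustration of the type of analysis described (not the trial's code or data), HbA1c can be categorised into the same glycaemic groups and entered into a Cox proportional-hazards model; the sketch below uses the open-source lifelines package, and all data and column names are hypothetical.

        # Sketch: categorising HbA1c (<6.0, 6.0-6.4, >=6.5 %) and fitting a
        # Cox model, in the spirit of the analysis described above.
        # All data here are invented for illustration, not PARADIGM-HF data.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "days_to_event": [300, 720, 500, 650, 260, 700, 150, 400],
            "event":         [1,   0,   1,   0,   1,   0,   1,   0],    # 1 = HF hospitalisation or CV death
            "hba1c":         [5.6, 5.8, 5.4, 5.9, 6.2, 6.3, 7.1, 6.8],  # %
        })

        # Dummy variables for glycaemic status; HbA1c < 6.0 % is the reference group.
        df["prediabetes"] = ((df["hba1c"] >= 6.0) & (df["hba1c"] < 6.5)).astype(int)
        df["diabetes"]    = (df["hba1c"] >= 6.5).astype(int)

        cph = CoxPHFitter()
        cph.fit(df.drop(columns="hba1c"), duration_col="days_to_event", event_col="event")
        cph.print_summary()   # exp(coef) gives hazard ratios vs. the reference group

    In the trial itself these models were additionally adjusted for known predictors of poor outcome, which the sketch omits.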

    Crop Growth in Semi-Arid Environments

    Methods are presented for analyzing growth and yield of crops when water is and is not limiting. Examples are given from collaborative research between ICRISAT and the University of Nottingham, UK, on pearl millet grown in a range of controlled and natural environments.

    PAin SoluTions In the Emergency Setting (PASTIES) – patient controlled analgesia versus routine care in emergency department patients with non-traumatic abdominal pain: randomised trial

    © Smith et al 2015. OBJECTIVE: To determine whether patient controlled analgesia (PCA) is better than routine care in providing effective analgesia for patients presenting to emergency departments with moderate to severe non-traumatic abdominal pain. DESIGN: Pragmatic, multicentre, parallel group, randomised controlled trial. SETTING: Five English hospitals. PARTICIPANTS: 200 adults (66% (n=130) female), aged 18 to 75 years, who presented to the emergency department requiring intravenous opioid analgesia for the treatment of moderate to severe non-traumatic abdominal pain and were expected to be admitted to hospital for at least 12 hours. INTERVENTIONS: Patient controlled analgesia or nurse titrated analgesia (treatment as usual). MAIN OUTCOME MEASURES: The primary outcome was total pain experienced over the 12 hour study period, derived by standardised area under the curve (scaled from 0 to 100) of each participant's hourly pain scores, captured using a visual analogue scale. Pre-specified secondary outcomes included total morphine use, percentage of study period in moderate or severe pain, percentage of study period asleep, length of hospital stay, and satisfaction with pain management. RESULTS: 196 participants were included in the primary analyses (99 allocated to PCA and 97 to treatment as usual). Mean total pain experienced was 35.3 (SD 25.8) in the PCA group compared with 47.3 (24.7) in the treatment as usual group. The adjusted between group difference was 6.3 (95% confidence interval 0.7 to 11.9). Participants in the PCA group received significantly more morphine (mean 36.1 (SD 22.4) v 23.6 (13.1) mg; mean difference 12.3 (95% confidence interval 7.2 to 17.4) mg), spent less of the study period in moderate or severe pain (32.6% v 46.9%; mean difference 14.5% (5.6% to 23.5%)), and were more likely to be perfectly or very satisfied with the management of their pain (83% (73/88) v 66% (57/87); adjusted odds ratio 2.56 (1.25 to 5.23)) in comparison with participants in the treatment as usual group. CONCLUSIONS: Significant reductions in pain can be achieved by PCA compared with treatment as usual in patients presenting to the emergency department with non-traumatic abdominal pain. TRIAL REGISTRATION: European Clinical Trials Database EudraCT 2011-000194-31; Current Controlled Trials ISRCTN25343280.
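
    The primary outcome described above (a standardised area under the curve of hourly pain scores, scaled from 0 to 100) can be computed as in the following sketch; the hourly scores shown are hypothetical, and the exact scaling used in the trial may differ in detail.

        # Sketch of the primary outcome construction: trapezoidal area under a
        # participant's hourly visual-analogue pain scores over the 12-hour
        # period, rescaled to 0-100. The example scores are hypothetical.
        import numpy as np

        def standardised_pain_auc(hourly_scores, vas_max=100.0):
            """AUC of hourly VAS scores (one score per hour), rescaled so that
            constant maximal pain over the whole period gives 100."""
            scores = np.asarray(hourly_scores, dtype=float)
            # Trapezoidal rule with unit (1-hour) spacing between measurements.
            auc = float(np.sum((scores[:-1] + scores[1:]) / 2.0))
            max_auc = vas_max * (len(scores) - 1)   # area if pain were maximal throughout
            return 100.0 * auc / max_auc

        # A participant whose pain falls from 80 to 20 over the 12 hours.
        example = [80, 75, 70, 60, 55, 50, 45, 40, 35, 30, 25, 22, 20]
        print(round(standardised_pain_auc(example), 1))   # -> 46.4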

    Influence of future climate and cropland expansion on isoprene emissions and tropospheric ozone

    Over the 21st century, changes in CO2 levels, climate and land use are expected to alter the global distribution of vegetation, leading to changes in trace gas emissions from plants, including, importantly, the emissions of isoprene. This, combined with changes in anthropogenic emissions, has the potential to impact tropospheric ozone levels, which above a certain level are harmful to animals and vegetation. In this study we use a biogenic emissions model following the empirical parameterisation of the MEGAN model, with vegetation distributions calculated by the Sheffield Dynamic Global Vegetation Model (SDGVM), to explore a range of potential future (2095) changes in isoprene emissions caused by changes in climate (including natural land use changes), land use, and the inhibition of isoprene emissions by CO2. From the present-day (2000) value of 467 Tg C yr−1, we find that the combined impact of these factors could cause a net decrease in isoprene emissions of 259 Tg C yr−1 (55%), with individual contributions of +78 Tg C yr−1 (climate change), −190 Tg C yr−1 (land use) and −147 Tg C yr−1 (CO2 inhibition). Using these isoprene emissions and changes in anthropogenic emissions, a series of integrations is conducted with the UM-UKCA chemistry-climate model with the aim of examining changes in ozone over the 21st century. Globally, all combined future changes cause a decrease in the tropospheric ozone burden of 27 Tg (7%) from 379 Tg in the present day. At the surface, decreases in ozone of 6–10 ppb are calculated over the oceans and developed northern hemispheric regions, due to reduced NOx transport by PAN and reductions in NOx emissions in these areas, respectively. Increases of 4–6 ppb are calculated in the continental tropics due to cropland expansion in these regions, increased CO2 inhibition of isoprene emissions, and higher temperatures due to climate change. These effects outweigh the decreases in tropical ozone caused by increased tropical isoprene emissions with climate change. Our land use change scenario consists of cropland expansion, which is most pronounced in the tropics. The tropics are also where land use change causes the greatest increases in ozone. As such, there is potential for increased crop exposure to harmful levels of ozone. However, we find that these ozone increases are still not large enough to raise ozone to such damaging levels.
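
    The biogenic model mentioned above follows the MEGAN-style empirical approach, in which the isoprene flux is a canopy emission factor modulated by dimensionless activity factors; schematically (a sketch only, since the exact factors and functional forms depend on the MEGAN version used),

        F_{\mathrm{iso}} = \varepsilon \,\gamma_{T}\,\gamma_{P}\,\gamma_{\mathrm{LAI}}\,\gamma_{\mathrm{CO_2}},

    where ε is a plant-functional-type emission factor and the γ terms account for temperature, light, leaf area and the CO2 inhibition discussed above. The quoted contributions are mutually consistent: +78 − 190 − 147 = −259 Tg C yr−1, about 55% of the present-day 467 Tg C yr−1.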

    Eyewitness metamemory predicts identification performance in biased and unbiased line‐ups

    Purpose: Distinguishing accurate from inaccurate identifications is a challenging issue in the criminal justice system, especially for biased police line-ups. That is because biased line-ups undermine the diagnostic value of accuracy post-dictors such as confidence and decision time. Here, we aimed to test general and eyewitness-specific self-ratings of memory capacity as potential estimators of identification performance that are unaffected by line-up bias. Methods: Participants (N = 744) completed a metamemory assessment consisting of the Multifactorial Metamemory Questionnaire and the Eyewitness Metamemory Scale and took part in a standard eyewitness paradigm. Following the presentation of a mock-crime video, they viewed either biased or unbiased line-ups. Results: Self-ratings of discontentment with eyewitness memory ability were indicative of identification accuracy for both biased and unbiased line-ups. Participants who scored low on eyewitness metamemory factors also displayed a stronger confidence-accuracy calibration than those who scored high. Conclusions: These results suggest a promising role for self-ratings of memory capacity in the evaluation of eyewitness identifications, while also advancing theory on self-assessments for different memory systems.
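
    The confidence-accuracy calibration reported in the results is typically computed by binning identification decisions by stated confidence and comparing each bin's mean confidence with its proportion of correct identifications; a minimal sketch with hypothetical decisions is below.

        # Sketch of a confidence-accuracy calibration table: bin decisions by
        # stated confidence and compare each bin's mean confidence with its
        # observed proportion of correct identifications. Data are hypothetical.
        import numpy as np

        confidence = np.array([10, 25, 30, 45, 55, 60, 70, 80, 90, 95])  # stated confidence, %
        correct    = np.array([ 0,  0,  1,  0,  1,  1,  1,  1,  1,  1])  # 1 = accurate identification

        bin_edges = list(range(0, 101, 20))                     # 0-20, 20-40, ..., 80-100 %
        which = np.digitize(confidence, bin_edges, right=True)  # bin index for each decision
        for b in np.unique(which):
            in_bin = which == b
            print(f"{bin_edges[b-1]:3d}-{bin_edges[b]:3d}%  "
                  f"mean confidence {confidence[in_bin].mean():5.1f}  "
                  f"proportion correct {correct[in_bin].mean():.2f}")

    Good calibration means the proportion correct in each bin tracks the mean stated confidence, i.e. the points lie close to the diagonal of that comparison.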