    Why is malaria associated with poverty? Findings from a cohort study in rural Uganda

    Background: Malaria control and sustainable development are linked, but implementation of ‘multisectoral’ interventions is restricted by a limited understanding of the causal pathways between poverty and malaria. We investigated the relationships between socioeconomic position (SEP), potential determinants of SEP, and malaria in Nagongera, rural Uganda. Methods: Socioeconomic information was collected for 318 children aged six months to 10 years living in 100 households, who were followed for up to 36 months. Mosquito density was recorded using monthly light trap collections. Parasite prevalence was measured routinely every three months and malaria incidence determined by passive case detection. First, we evaluated the association between success in smallholder agriculture (the primary livelihood source) and SEP. Second, we explored socioeconomic risk factors for human biting rate (HBR), parasite prevalence and incidence of clinical malaria, and spatial clustering of socioeconomic variables. Third, we investigated the role of selected factors in mediating the association between SEP and malaria. Results: Relative agricultural success was associated with higher SEP. In turn, high SEP was associated with lower HBR (highest versus lowest wealth index tertile: Incidence Rate Ratio 0.71, 95% confidence interval (CI) 0.54–0.93, P = 0.01) and lower odds of malaria infection in children (highest versus lowest wealth index tertile: adjusted Odds Ratio 0.52, 95% CI 0.35–0.78, P = 0.001), but SEP was not associated with clinical malaria incidence. Mediation analysis suggested that part of the total effect of SEP on malaria infection risk was explained by house type (24.9%, 95% CI 15.8–58.6%) and food security (18.6%, 95% CI 11.6–48.3%); however, the assumptions of the mediation analysis may not have been fully met. Conclusion: Housing improvements and agricultural development interventions to reduce poverty merit further investigation as multisectoral interventions against malaria. Further interdisciplinary research is needed to understand fully the complex pathways between poverty and malaria and to develop strategies for sustainable malaria control
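
    As a rough illustration of the mediation step described above, the sketch below estimates the share of the SEP effect on infection explained by a candidate mediator using a difference-in-coefficients comparison of two logistic models. The abstract does not specify the exact estimator used, and the input file and column names (infected, wealth_tertile, house_type, age) are hypothetical.

```python
# Hypothetical sketch of a difference-in-coefficients mediation analysis:
# the proportion of the SEP effect on infection "explained" by a mediator is
# estimated by comparing the SEP coefficient with and without the mediator.
# Column names and the input file are illustrative, not the study's data;
# wealth_tertile is treated as a numeric (ordinal) score for simplicity.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical analysis dataset

# Total effect: SEP -> infection, adjusting for a confounder only
total = smf.logit("infected ~ wealth_tertile + age", data=df).fit()

# Direct effect: additionally adjusting for the candidate mediator
direct = smf.logit("infected ~ wealth_tertile + age + house_type", data=df).fit()

b_total = total.params["wealth_tertile"]
b_direct = direct.params["wealth_tertile"]

# Share of the total effect attributed to the mediator (difference method)
prop_mediated = (b_total - b_direct) / b_total
print(f"Proportion mediated by house type: {prop_mediated:.1%}")
```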

    Measuring Socioeconomic Inequalities in Relation to Malaria Risk: A Comparison of Metrics in Rural Uganda.

    Socioeconomic position (SEP) is an important risk factor for malaria, but there is no consensus on how to measure SEP in malaria studies. We evaluated the relative strength of four indicators of SEP in predicting malaria risk in Nagongera, Uganda. A total of 318 children resident in 100 households were followed for 36 months to measure parasite prevalence routinely every 3 months and malaria incidence by passive case detection. Household SEP was determined using: 1) two wealth indices, 2) income, 3) occupation, and 4) education. Wealth Index I (reference) included only asset ownership variables. Wealth Index II additionally included food security and house construction variables, which may directly affect malaria. In multivariate analysis, only Wealth Index II and income were associated with the human biting rate, only Wealth Indices I and II were associated with parasite prevalence, and only caregiver's education was associated with malaria incidence. This is the first evaluation of metrics beyond wealth and consumption indices for measuring the association between SEP and malaria. The wealth index still predicted malaria risk after excluding variables directly associated with malaria, but the strength of association was lower. In this setting, wealth indices, income, and education were stronger predictors of socioeconomic differences in malaria risk than occupation
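
    Asset-based wealth indices of this kind are commonly constructed from the first principal component of household asset indicators and then split into tertiles; the sketch below illustrates that general approach. The use of PCA and the asset column names are assumptions made for illustration, not the study's actual variables or method.

```python
# Minimal sketch of an asset-based wealth index: the first principal component
# of standardised household asset indicators, split into tertiles.
# PCA, the file name and the asset columns are illustrative assumptions.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

households = pd.read_csv("household_assets.csv")  # hypothetical input
asset_cols = ["owns_radio", "owns_bicycle", "owns_phone", "owns_livestock"]

X = StandardScaler().fit_transform(households[asset_cols])
households["wealth_index"] = PCA(n_components=1).fit_transform(X)[:, 0]

# Tertiles of the index are then compared against malaria outcomes
households["wealth_tertile"] = pd.qcut(
    households["wealth_index"], 3, labels=["low", "middle", "high"]
)
print(households["wealth_tertile"].value_counts())
```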

    Pain Squad+ smartphone app to support real-time pain treatment for adolescents with cancer: protocol for a randomised controlled trial.

    INTRODUCTION: Pain negatively affects the health-related quality of life (HRQL) of adolescents with cancer. The Pain Squad+ smartphone-based application (app) has been developed to provide adolescents with real-time pain self-management support. The app uses a validated pain assessment and personalised pain treatment advice with centralised decision support via a registered nurse to enable real-time pain treatment in all settings. The algorithm informing pain treatment advice is evidence-based and expert-vetted. This trial will longitudinally evaluate the impact of Pain Squad+, with or without the addition of nurse support, on adolescent health and cost outcomes. METHODS AND ANALYSIS: This will be a pragmatic, multicentre, waitlist-controlled, 3-arm parallel-group superiority randomised trial with 1:1:1 allocation enrolling 74 adolescents with cancer per arm from nine cancer centres. Participants will be aged 12 to 18 years, English-speaking and with ≥3/10 pain. Exclusion criteria are significant comorbidities, end-of-life status or enrolment in a concurrent pain study. The primary aim is to determine the effect of Pain Squad+, with and without nurse support, on pain intensity in adolescents with cancer, when compared with a waitlist control group. The secondary aims are to determine the immediate and sustained effect over time of using Pain Squad+, with and without nurse support, as per prospective outcome measurements of pain interference, HRQL, pain self-efficacy and cost. Linear mixed models with baseline scores as a covariate will be used. Qualitative interviews with adolescents from all trial arms will be conducted and analysed. ETHICS AND DISSEMINATION: This trial is approved by the Hospital for Sick Children Research Ethics Board. Results will provide data to guide adolescents with cancer and healthcare teams in treating pain. Dissemination will occur through partnerships with stakeholder groups, scientific meetings, publications, mass media releases and consumer detailing. TRIAL REGISTRATION NUMBER: NCT03632343
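
    The planned analysis (linear mixed models with baseline scores as a covariate) could look roughly like the sketch below. The outcome, covariates and grouping variable are hypothetical placeholders, not the trial's actual dataset or model specification.

```python
# Illustrative sketch of the planned analysis style: a linear mixed model of
# follow-up pain intensity with baseline score as a covariate and a random
# intercept per participant. Variable and file names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

long = pd.read_csv("trial_long.csv")  # one row per participant per visit

model = smf.mixedlm(
    "pain_intensity ~ arm + baseline_pain + visit",
    data=long,
    groups=long["participant_id"],
)
result = model.fit()
print(result.summary())
```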

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than those observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation
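
    The abstract does not describe the computer-based allocation algorithm itself, but a sex-stratified 1:1:1 assignment of donors to inter-donation intervals can be illustrated with a simple permuted-block scheme, as in the toy sketch below.

```python
# Toy sketch of sex-stratified 1:1:1 allocation using permuted blocks.
# This is only an illustration of what such an assignment can look like;
# it is not the trial's actual randomisation algorithm.
import random

ARMS = {
    "male": ["12-week", "10-week", "8-week"],
    "female": ["16-week", "14-week", "12-week"],
}

def permuted_block(sex: str, block_size: int = 6) -> list:
    """Return one shuffled block containing equal numbers of each arm."""
    arms = ARMS[sex]
    block = arms * (block_size // len(arms))
    random.shuffle(block)
    return block

def allocate(sexes: list) -> list:
    """Assign a stream of donors, drawing a new block per stratum as needed."""
    queues = {"male": [], "female": []}
    assignments = []
    for sex in sexes:
        if not queues[sex]:
            queues[sex] = permuted_block(sex)
        assignments.append(queues[sex].pop())
    return assignments

print(allocate(["male", "male", "female", "male", "female", "female"]))
```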

    The Early Positive Approaches to Support (E-PAtS) study: study protocol for a feasibility cluster randomised controlled trial of a group programme (E-PAtS) for family caregivers of young children with intellectual disability

    Background: Children with intellectual disability have an IQ < 70, associated deficits in adaptive skills and are at increased risk of having clinically concerning levels of behaviour problems. In addition, parents of children with intellectual disability are likely to report high levels of mental health and other psychological problems. The Early Positive Approaches to Support (E-PAtS) programme for family caregivers of young children (5 years and under) with intellectual and developmental disabilities is a group-based intervention which aims to enhance parental psychosocial wellbeing and service access and support positive development for children. The aim of this study is to assess the feasibility of delivering E-PAtS to family caregivers of children with intellectual disability by community parenting support service provider organisations. The study will inform a potential, definitive RCT of the effectiveness and cost-effectiveness of E-PAtS. Methods: This study is a feasibility cluster randomised controlled trial, with embedded process evaluation. Up to 2 family caregivers will be recruited from 64 families with a child (18 months to 5 years) with intellectual disability at research sites in the UK. Participating families will be allocated to intervention or control on a 1:1 basis; intervention families will be offered the E-PAtS programme immediately, continuing to receive usual practice, and control participants will be offered the opportunity to attend the E-PAtS programme at the end of the follow-up period and will continue to receive usual practice. Data will be collected at baseline, 3 months post-randomisation and 12 months post-randomisation. The primary aim is to assess feasibility via the assessment of: recruitment of service provider organisations; participant recruitment; randomisation; retention; intervention adherence; intervention fidelity; and the views of participants, intervention facilitators and service provider organisations regarding intervention delivery and study processes. The secondary aim is preliminary evaluation of a range of established outcome measures for individual family members, subsystem relationships and overall family functioning, plus additional health economic outcomes for inclusion in a future definitive trial. Discussion: The results of this study will inform a potential future definitive trial to evaluate the effectiveness and cost-effectiveness of the E-PAtS intervention to improve parental psychosocial wellbeing. Such a trial would have significant scientific impact internationally in the intellectual disability field

    Metabolism before, during and after anaesthesia in colic and healthy horses

    Background: Many colic horses are compromised due to the disease state and from hours of starvation and sometimes long trailer rides. This could influence their muscle energy reserves and affect the horses' ability to recover. The principal aim was to follow metabolic parameters before, during, and up to 7 days after anaesthesia in healthy horses and in horses undergoing abdominal surgery due to colic. Methods: 20 healthy horses given anaesthesia alone and 20 colic horses subjected to emergency abdominal surgery were anaesthetised for a mean of 228 minutes and 183 minutes, respectively. Blood for analysis of haematology, electrolytes, cortisol, creatine kinase (CK), free fatty acids (FFA), glycerol, glucose and lactate was sampled before, during, and up to 7 days after anaesthesia. Arterial and venous blood gases were obtained before, during and up to 8 hours after recovery. Gluteal muscle biopsy specimens for biochemical analysis of muscle metabolites were obtained at the start and end of anaesthesia and 1 h and 1 day after recovery. Results: Plasma cortisol, FFA, glycerol, glucose, lactate and CK were elevated and serum phosphate and potassium were lower in colic horses before anaesthesia. Muscle adenosine triphosphate (ATP) content was low in several colic horses. Anaesthesia and surgery resulted in a decrease in plasma FFA and glycerol in colic horses, whereas levels increased in healthy horses. During anaesthesia, muscle and plasma lactate and plasma phosphate increased in both groups. In the colic horses plasma lactate increased further after recovery. Plasma FFA and glycerol increased 8 h after standing in the colic horses. In both groups, plasma concentrations of CK increased and serum phosphate decreased post-anaesthesia. On day 7 most parameters were not different between groups. Colic horses lost on average 8% of their initial weight. Eleven colic horses completed the study. Conclusion: Colic horses entered anaesthesia with altered metabolism and in a negative oxygen balance. Muscle oxygenation was insufficient during anaesthesia in both groups, although to a lesser extent in the healthy horses. The post-anaesthetic period was associated with increased lipolysis and weight loss in the colic horses, indicating a negative energy balance during the first week post-operatively.

    Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy

    Background: A reliable system for grading the operative difficulty of laparoscopic cholecystectomy would standardise description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets. Methods: Patient and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series containing 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables or Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis. Results: A higher operative difficulty grade was consistently associated with worse outcomes for the patients in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was found to be most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was found to be a significant independent predictor of operative duration, conversion to open surgery, 30-day complications and 30-day reintervention (all p < 0.001). Conclusion: We have shown that an operative difficulty scale can standardise the description of operative findings by multiple grades of surgeons to facilitate audit, training assessment and research. It provides a tool for reporting operative findings, disease severity and technical difficulty and can be utilised in future research to reliably compare outcomes according to case mix and intra-operative difficulty
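
    Two of the analyses described above (Kendall's tau between grade and a dichotomous outcome, and AUROC for predictive accuracy) can be illustrated with the short sketch below. The dataset and column names are hypothetical, and the Jonckheere–Terpstra test is omitted here because it lacks a standard single-call implementation in the libraries used.

```python
# Minimal sketch of two of the described checks: Kendall's tau between
# difficulty grade and a dichotomous outcome, and the AUROC of the grade for
# predicting conversion to open surgery. File and column names are illustrative.
import pandas as pd
from scipy.stats import kendalltau
from sklearn.metrics import roc_auc_score

ops = pd.read_csv("cholecystectomy.csv")  # hypothetical dataset

tau, p_value = kendalltau(ops["nassar_grade"], ops["complication_30d"])
print(f"Kendall's tau = {tau:.3f} (p = {p_value:.4f})")

auroc = roc_auc_score(ops["converted_to_open"], ops["nassar_grade"])
print(f"AUROC for conversion to open surgery: {auroc:.3f}")
```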