27 research outputs found

    Phosphorus speciation in the organic layer of two Swedish forest soils 13–24 years after wood ash and nitrogen application

    Application of wood ash to forests can restore pools of phosphorus (P) and other nutrients, which are removed by whole-tree harvesting. Yet, the mechanisms that affect the fate of ash-P in the organic layer are less well known, and previous research on the extent to which ash application increases P solubility in the soil is contradictory. We combined synchrotron P K-edge XANES spectroscopy, µ-XRF microscopy, and chemical extractions to examine the speciation and solubility of P. We studied organic horizons of two long-term field experiments: Riddarhyttan (central Sweden), which had received 3, 6, and 9 Mg ash ha−1, and Rödålund (northern Sweden), where 3 Mg ash ha−1 had been applied alone or combined with N every three years since 2003. At the latter site, we also determined P in aboveground tree biomass. Overall, the ash application increased P in the organic layer by between 6 and 28 kg P ha−1, equivalent to 17–39 % of the initial P content in the applied ash. At Rödålund, there was 4.6 kg Ca-bound P ha−1 (9.5 %) in the ash treatment compared to 1.6 kg ha−1 in the ash + N treatment and < 0.4 kg ha−1 in the N treatment and the control. At Riddarhyttan, only the treatment with the highest ash dose had residual Ca-bound P (3.8 kg ha−1). In contrast, the ash application increased Al-bound P (p < 0.001) by up to 15.6 kg P ha−1. Moreover, the ash increased Olsen-P by up to two times. There was a strong relationship between the concentrations of Olsen-P and Al-bound P (R2 = 0.83, p < 0.001) as well as Fe-bound P (R2 = 0.74, p = 0.003), suggesting that the ash application resulted in an increased amount of relatively soluble P associated with hydroxy-Al and hydroxy-Fe compounds. Further, there was an 18 % increase in P uptake by trees in the ash treatment. By contrast, repeated N fertilization, with or without ash, reduced Olsen-P. 
The lower P extractability was concomitant with a 39 % increase in plant P uptake in the N treatment, which indicates elevated P uptake in response to higher N availability. Hence, the application of wood ash increased Al-bound P, easily available P, and P uptake. N fertilization, while also increasing tree P uptake, instead decreased easily available P and did not cause a shift in soil P speciation.

    Acquired HIV drug resistance among adults living with HIV receiving first-line antiretroviral therapy in Rwanda: a cross-sectional nationally representative survey

    BACKGROUND: We assessed the prevalence of acquired HIV drug resistance (HIVDR) and associated factors among patients receiving first-line antiretroviral therapy (ART) in Rwanda. METHODS: This cross-sectional study included 702 patients who had received first-line ART for at least 6 months and whose last viral load (VL) result was ≥1000 copies/mL. Blood plasma samples were subjected to VL testing; specimens with unsuppressed VL were genotyped to identify HIVDR-associated mutations. Data were analysed using STATA/SE. RESULTS: Median time on ART was 86.4 months (interquartile range [IQR], 44.8-130.2 months), and median CD4 count at ART initiation was 311 cells/mm³ (IQR, 197-484 cells/mm³). Of 414 (68.2%) samples with unsuppressed VL, 378 (88.3%) were genotyped. HIVDR included 347 (90.4%) with non-nucleoside reverse transcriptase inhibitor (NNRTI), 291 (75.5%) with nucleoside reverse transcriptase inhibitor (NRTI), and 13 (3.5%) with protease inhibitor (PI) resistance-associated mutations. The most common HIVDR mutations were K65R (22.7%), M184V (15.4%) and D67N (9.8%) for NRTIs and K103N (34.4%) and Y181C/I/V/YC (7%) for NNRTIs. Independent predictors of acquired HIVDR included current ART regimen of zidovudine + lamivudine + nevirapine (adjusted odds ratio [aOR], 3.333 [95% confidence interval (CI): 1.022-10.870]; p = 0.046) for NRTI resistance and current ART regimen of tenofovir + emtricitabine + nevirapine (aOR, 0.148 [95% CI: 0.028-0.779]; p = 0.025), zidovudine + lamivudine + efavirenz (aOR, 0.105 [95% CI: 0.016-0.693]; p = 0.020) and zidovudine + lamivudine + nevirapine (aOR, 0.259 [95% CI: 0.084-0.793]; p = 0.019) for NNRTI resistance. History of ever switching ART regimen was associated with NRTI resistance (aOR, 2.53 [95% CI: 1.198-5.356]; p = 0.016) and NNRTI resistance (aOR, 3.23 [95% CI: 1.435-7.278], p = 0.005). 
CONCLUSION: The prevalence of acquired HIVDR was high among patients failing to re-suppress VL and was associated with the current ART regimen and with ever having switched ART regimens. The findings of this study support the current WHO guidelines recommending that patients on an NNRTI-based regimen be switched on the basis of a single viral load test, and suggest that national HIV VL monitoring of patients receiving ART has prevented long-term treatment failure that would result in the accumulation of thymidine analogue mutations (TAMs) and potential loss of efficacy of all NRTIs used in second-line ART as the backbone in combination with either dolutegravir or boosted PIs.
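The adjusted odds ratios above come from the authors' multivariable models (fitted in STATA); as a rough illustration of how an unadjusted odds ratio and its 95% Wald confidence interval are derived from a 2×2 table, the sketch below uses hypothetical counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table.

    a: exposed with resistance,   b: exposed without resistance
    c: unexposed with resistance, d: unexposed without resistance
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the four cell counts
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 20/30 patients who ever switched regimens had NRTI
# mutations versus 30/75 who never switched (illustrative numbers only)
or_, lo, hi = odds_ratio_ci(20, 10, 30, 45)
```

An adjusted OR additionally conditions on covariates (e.g. regimen, time on ART) via logistic regression, which is why the reported aORs differ from what a raw 2×2 table would give.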

    Design and methods for a quasi-experimental pilot study to evaluate the impact of dual active ingredient insecticide-treated nets on malaria burden in five regions in sub-Saharan Africa

    Background Vector control tools have contributed significantly to a reduction in malaria burden since 2000, primarily through insecticide-treated bed nets (ITNs) and indoor residual spraying. In the face of increasing insecticide resistance in key malaria vector species, global progress in malaria control has stalled. Innovative tools, such as dual active ingredient (dual-AI) ITNs that are effective at killing insecticide-resistant mosquitoes, have recently been introduced. However, large-scale uptake has been slow for several reasons, including higher costs and limited evidence on their incremental effectiveness and cost-effectiveness. The present report describes the design of several observational studies that aim to determine the effectiveness and cost-effectiveness of dual-AI ITNs, compared to standard pyrethroid-only ITNs, at reducing malaria transmission across a variety of transmission settings. Methods Observational pilot studies are ongoing in Burkina Faso, Mozambique, Nigeria, and Rwanda, leveraging dual-AI ITN rollouts nested within the 2019 and 2020 mass distribution campaigns in each country. Enhanced surveillance in select study districts includes annual cross-sectional surveys during peak transmission seasons, monthly entomological surveillance, passive case detection using routine health facility surveillance systems, and studies on human behaviour and ITN use patterns. The analyses will compare changes in malaria transmission and disease burden over three years between districts receiving dual-AI ITNs and similar districts receiving standard pyrethroid-only ITNs. The costs of net distribution will be calculated from the provider perspective, including financial and economic costs, and a cost-effectiveness analysis will assess incremental cost-effectiveness ratios for Interceptor® G2, Royal Guard®, and piperonyl butoxide ITNs in comparison to standard pyrethroid-only ITNs, based on incidence rate ratios calculated from routine data. 
Conclusions Evidence of the effectiveness and cost-effectiveness of the dual-AI ITNs from these pilot studies will complement evidence from two contemporary cluster randomized controlled trials, one in Benin and one in Tanzania, to provide key information to malaria control programmes, policymakers, and donors to help guide decision-making and planning for local malaria control and elimination strategies. Understanding the breadth of contexts in which these dual-AI ITNs are most effective, and collecting robust information on factors influencing comparative effectiveness, could improve uptake and availability and help maximize their impact.
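The planned analysis pairs incremental distribution costs with cases averted inferred from incidence rate ratios. A minimal sketch of that arithmetic, with entirely illustrative inputs (the function names and every number below are assumptions, not values from the study):

```python
def cases_averted(baseline_incidence, irr, person_years):
    # Cases expected under standard pyrethroid-only nets minus cases expected
    # under dual-AI nets, where irr is the incidence rate ratio (dual-AI vs standard)
    return baseline_incidence * person_years * (1.0 - irr)

def icer(incremental_cost, averted):
    # Incremental cost-effectiveness ratio: extra cost per additional case averted
    return incremental_cost / averted

# Illustrative only: 0.3 cases per person-year at baseline, IRR of 0.8,
# 100,000 person-years of protection, $150,000 in extra distribution cost
averted = cases_averted(0.3, 0.8, 100_000)   # roughly 6000 cases averted
cost_per_case = icer(150_000, averted)       # roughly $25 per case averted
```

In the pilot studies the IRRs will come from routine surveillance data, and costs will cover both financial and economic components, so the real calculation has more inputs; the ratio itself is this simple division.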

    Mortality from gastrointestinal congenital anomalies at 264 hospitals in 74 low-income, middle-income, and high-income countries: a multicentre, international, prospective cohort study

    Background: Congenital anomalies are the fifth leading cause of mortality in children younger than 5 years globally. Many gastrointestinal congenital anomalies are fatal without timely access to neonatal surgical care, but few studies have been done on these conditions in low-income and middle-income countries (LMICs). We compared outcomes of the seven most common gastrointestinal congenital anomalies in low-income, middle-income, and high-income countries globally, and identified factors associated with mortality. // Methods: We did a multicentre, international prospective cohort study of patients younger than 16 years, presenting to hospital for the first time with oesophageal atresia, congenital diaphragmatic hernia, intestinal atresia, gastroschisis, exomphalos, anorectal malformation, or Hirschsprung's disease. Consecutive patients were recruited for a minimum of 1 month between October, 2018, and April, 2019. We collected data on patient demographics, clinical status, interventions, and outcomes using the REDCap platform. Patients were followed up for 30 days after primary intervention, or 30 days after admission if they did not receive an intervention. The primary outcome was all-cause, in-hospital mortality for all conditions combined and for each condition individually, stratified by country income status. We did a complete case analysis. // Findings: We included 3849 patients with 3975 study conditions (560 with oesophageal atresia, 448 with congenital diaphragmatic hernia, 681 with intestinal atresia, 453 with gastroschisis, 325 with exomphalos, 991 with anorectal malformation, and 517 with Hirschsprung's disease) from 264 hospitals (89 in high-income countries, 166 in middle-income countries, and nine in low-income countries) in 74 countries. Of the 3849 patients, 2231 (58·0%) were male. Median gestational age at birth was 38 weeks (IQR 36–39) and median bodyweight at presentation was 2·8 kg (2·3–3·3). 
Mortality among all patients was 37 (39·8%) of 93 in low-income countries, 583 (20·4%) of 2860 in middle-income countries, and 50 (5·6%) of 896 in high-income countries (p<0·0001 between all country income groups). Gastroschisis had the greatest difference in mortality between country income strata (nine [90·0%] of ten in low-income countries, 97 [31·9%] of 304 in middle-income countries, and two [1·4%] of 139 in high-income countries; p≤0·0001 between all country income groups). Factors significantly associated with higher mortality for all patients combined included country income status (low-income vs high-income countries, risk ratio 2·78 [95% CI 1·88–4·11], p<0·0001; middle-income vs high-income countries, 2·11 [1·59–2·79], p<0·0001), sepsis at presentation (1·20 [1·04–1·40], p=0·016), higher American Society of Anesthesiologists (ASA) score at primary intervention (ASA 4–5 vs ASA 1–2, 1·82 [1·40–2·35], p<0·0001; ASA 3 vs ASA 1–2, 1·58 [1·30–1·92], p<0·0001), surgical safety checklist not used (1·39 [1·02–1·90], p=0·035), and ventilation or parenteral nutrition unavailable when needed (ventilation 1·96 [1·41–2·71], p=0·0001; parenteral nutrition 1·35 [1·05–1·74], p=0·018). Administration of parenteral nutrition (0·61 [0·47–0·79], p=0·0002) and use of a peripherally inserted central catheter (0·65 [0·50–0·86], p=0·0024) or percutaneous central line (0·69 [0·48–1·00], p=0·049) were associated with lower mortality. // Interpretation: Unacceptable differences in mortality exist for gastrointestinal congenital anomalies between low-income, middle-income, and high-income countries. Improving access to quality neonatal surgical care in LMICs will be vital to achieve Sustainable Development Goal 3.2 of ending preventable deaths in neonates and children younger than 5 years by 2030.

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.

    A Probabilistic Approach to Phosphorus Speciation of Soils Using P K-edge XANES Spectroscopy with Linear Combination Fitting

    A common technique to quantitatively estimate P speciation in soil samples is to apply linear combination fitting (LCF) to normalized P K-edge X-ray absorption near-edge structure (XANES) spectra. Despite the rapid growth of such applications, the uncertainties of the fitted weights are still poorly known. Further, few studies have reported the extent to which the LCF standards represent unique end-members. Here, the covariance between 34 standards was determined and its significance for LCF was discussed. We present a probabilistic approach for refining the calculation of LCF weights based on Latin hypercube sampling of normalized XANES spectra, in which the contributions of energy calibration and normalization to fit uncertainty were considered. Many of the LCF standards, particularly within the same standard groups, were strongly correlated. This supports an approach in which the LCF standards are grouped. Moreover, adsorbed phytates and monetite were well described by other standards, which calls into question their use as end-members in LCF. Use of the probabilistic method resulted in uncertainties ranging from 2 to 11 percentage units. Uncertainties in the calibrated energy were important for the LCF weights, particularly for organic P, which changed by up to 2.7 percentage units per 0.01 eV error in energy. These results highlight the necessity of careful energy calibration and the use of frequent calibration checks. The probabilistic approach, in which at least 100 spectral variants are analyzed, improves our ability to identify the most likely P compounds present in a soil sample, and a procedure for this is suggested in the paper.
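The workflow described above can be sketched as: perturb each spectrum's energy calibration and normalization within assumed error bounds via Latin hypercube sampling, refit the LCF weights for each variant, and summarize the spread of the fitted weights. This is a simplified illustration only; the function names, error magnitudes, and the use of a non-negative least squares fit are assumptions, not the paper's exact procedure:

```python
import numpy as np
from scipy.optimize import nnls
from scipy.stats import qmc

def lcf_weights(spectrum, standards):
    # Fit non-negative weights of the standard spectra (rows of `standards`)
    # to the sample spectrum, then normalize the weights to sum to 1
    w, _ = nnls(standards.T, spectrum)
    return w / w.sum()

def probabilistic_lcf(energy, spectrum, standards, n=100,
                      e_err=0.05, norm_err=0.02, seed=0):
    # Latin hypercube sample over two error sources: energy shift (eV)
    # and a multiplicative normalization error (both bounds are assumed)
    u = qmc.LatinHypercube(d=2, seed=seed).random(n)
    shifts = (u[:, 0] - 0.5) * 2 * e_err
    scales = 1.0 + (u[:, 1] - 0.5) * 2 * norm_err
    fits = []
    for d_e, s in zip(shifts, scales):
        # Re-interpolate the spectrum onto the shifted energy axis and rescale
        perturbed = np.interp(energy, energy + d_e, spectrum) * s
        fits.append(lcf_weights(perturbed, standards))
    fits = np.asarray(fits)
    # Mean weight and its spread across the spectral variants, per standard
    return fits.mean(axis=0), fits.std(axis=0)
```

Running this on a normalized sample spectrum and a matrix of standard spectra yields a per-standard weight with an uncertainty estimate, analogous to the 2 to 11 percentage-unit range reported above.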