
    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

Background Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion This is a step toward environmentally sustainable operating environments with actionable interventions applicable to both high-income and low–middle-income countries

    Factors Governing the Erythropoietic Response to Intravenous Iron Infusion in Patients with Chronic Kidney Disease: A Retrospective Cohort Study

Background: Limited knowledge exists about factors affecting parenteral iron response. A study was conducted to determine the factors influencing the erythropoietic response to parenteral iron in iron-deficient anaemic patients whose kidney function ranged from normal through all stages of chronic kidney disease (CKD) severity. Methods: This retrospective cohort study included parenteral iron recipients who did not receive erythropoiesis-stimulating agents (ESA) between 2017 and 2019. The study cohort was derived from two groups of patients: those managed by the CKD team and patients being optimised for surgery in the pre-operative clinic. Patients were categorized based on their kidney function: patients with normal kidney function [estimated glomerular filtration rate (eGFR) ≥ 60 mL/min/1.73 m2] were compared to those with CKD stages 3–5 (eGFR < 60 mL/min/1.73 m2). Patients were further stratified by the type of iron deficiency [absolute iron deficiency (AID) versus functional iron deficiency (FID)]. The key outcome was the change in haemoglobin (∆Hb) between pre- and post-infusion haemoglobin (Hb) values. Parenteral iron response was assessed using propensity-score matching and multivariate linear regression. The impact of kidney impairment versus the nature of iron deficiency (AID vs. FID) on response was explored. Results: 732 subjects (mean age 66 ± 17 years, 56% female and 87% White) were evaluated. No significant differences were observed in the time to repeat Hb among CKD stages and FID/AID patients. The Hb rise was significantly lower with lower kidney function (non-CKD and CKD 1–2: 13 g/L; CKD 3–5: 7 g/L; p < 0.001). When groups with different degrees of renal impairment were propensity-score matched according to whether iron deficiency was due to AID or FID, the level of CKD was found not to be relevant to Hb responses [unmatched (∆Hb) 12.1 vs. 8.7 g/L; matched (∆Hb) 12.4 vs. 12.1 g/L in non-CKD and CKD 1–2 versus CKD 3–5, respectively]. 
However, a comparison of patients with AID and FID, while controlling for the degree of CKD, indicated that patients with FID exhibited a diminished Hb response regardless of their level of kidney impairment. Conclusion: The nature of iron deficiency, rather than the severity of CKD, has the stronger impact on the Hb response to intravenous iron, with an attenuated response seen in functional iron deficiency irrespective of the degree of renal impairment
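The propensity-score matching step described in the methods can be sketched in miniature as follows. This is an illustrative sketch only: the greedy 1:1 nearest-neighbour algorithm, the caliper width, and every patient record (propensity scores and ∆Hb values) are hypothetical and are not the study's actual data or pipeline.

```python
# Illustrative 1:1 nearest-neighbour propensity-score matching (greedy,
# without replacement, with a caliper), followed by a matched comparison
# of the haemoglobin response (dHb). All values are hypothetical.

def match_pairs(treated, controls, caliper=0.05):
    """Greedily pair each treated record with the nearest unused control
    whose propensity score ("ps") lies within the caliper."""
    pairs, used = [], set()
    for t in treated:
        best, best_gap = None, caliper
        for i, c in enumerate(controls):
            gap = abs(t["ps"] - c["ps"])
            if i not in used and gap <= best_gap:
                best, best_gap = i, gap
        if best is not None:
            used.add(best)
            pairs.append((t, controls[best]))
    return pairs

def mean_dhb(records):
    return sum(r["dhb"] for r in records) / len(records)

# Hypothetical cohorts: "ps" is a pre-computed propensity score,
# "dhb" the Hb change (g/L) after iron infusion.
ckd35  = [{"ps": 0.31, "dhb": 11}, {"ps": 0.52, "dhb": 13}, {"ps": 0.74, "dhb": 12}]
nonckd = [{"ps": 0.30, "dhb": 12}, {"ps": 0.55, "dhb": 14}, {"ps": 0.90, "dhb": 10}]

pairs = match_pairs(ckd35, nonckd)
matched_ckd    = [t for t, _ in pairs]
matched_nonckd = [c for _, c in pairs]
print(len(pairs), mean_dhb(matched_ckd), mean_dhb(matched_nonckd))  # → 2 12.0 13.0
```

Note how the third treated record finds no control within the caliper and is dropped: matching trades sample size for comparability between the groups being contrasted.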

    Kidney Transplant-Associated Viral Infection Rates and Outcomes in a Single-Centre Cohort

Background: Opportunistic infections remain a significant cause of morbidity and mortality after kidney transplantation. This retrospective cohort study aimed to assess the incidence and predictors of post-transplant DNA virus infections (CMV, EBV, BKV and JCV) in kidney transplant recipients (KTR) at a single tertiary centre and evaluate their impact on graft outcomes. Methods: KTR transplanted between 2000 and 2021 were evaluated. Multivariate logistic regression and Cox proportional hazards analyses were used to identify factors associated with DNA virus infections and their impact on allograft outcomes, respectively. A sub-analysis of individual viral infections was also conducted to describe the pattern, timing, interventions, and outcomes of individual infections. Results: Data from 962 recipients were evaluated (mean age 47.3 ± 15 years, 62% male, 81% White). 30% of recipients (288/962) had infection(s) with one or more of the DNA viruses. Individually, CMV, EBV, BKV and JCV were diagnosed in 13.8%, 11.3%, 8.9% and 4.4% of recipients, respectively. Factors associated with increased risk of post-transplant DNA virus infection included recipient female gender, higher number of HLA mismatches, lower baseline estimated glomerular filtration rate (eGFR), CMV-seropositive donor, maintenance with cyclosporin (rather than tacrolimus) and higher number of maintenance immunosuppressive medications. The slope of eGFR decline was steeper in recipients with a history of DNA virus infection irrespective of the virus type. Further, eGFR declined faster with an increasing number of different viral infections. Death-censored graft loss adjusted for age, gender, total HLA mismatch, baseline eGFR and acute rejection was significantly higher in recipients with a history of DNA virus infection than in those without infection (adjusted hazard ratio (aHR) 1.74, 95% CI 1.08–2.80). 
In contrast, dialysis-free survival did not differ between the two groups of recipients (aHR 1.13, 95% CI 0.88–1.47). Conclusion: Post-transplant DNA viral infection is associated with a higher risk of allograft loss. Careful management of immunosuppression and close surveillance of at-risk recipients may improve graft outcomes
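The "slope of eGFR decline" compared between groups above can be illustrated with a per-patient ordinary-least-squares slope over follow-up time. This is a conceptual sketch, not the cohort's actual analysis; the follow-up times and eGFR trajectories below are hypothetical.

```python
# Illustrative least-squares slope of eGFR (mL/min/1.73 m2) against
# follow-up time (years), the quantity compared between recipients with
# and without DNA virus infection. The trajectories are hypothetical.

def ols_slope(times, values):
    """Ordinary-least-squares slope of values against times."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

years = [0, 1, 2, 3, 4]
infected     = [55, 51, 48, 44, 40]   # hypothetical: steeper decline
no_infection = [55, 54, 52, 51, 49]   # hypothetical: slower decline

s_inf = round(ols_slope(years, infected), 2)
s_no  = round(ols_slope(years, no_infection), 2)
print(s_inf, s_no)  # → -3.7 -1.5
```

A more negative slope means faster loss of graft function; in practice such slopes are estimated with mixed-effects models that account for within-patient correlation, but the underlying quantity is this rate of change.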

    In Vivo and in Silico Assessment of Ameliorative Effects of Xylopia aethiopica on Testosterone Propionate-Induced Benign Prostatic Hyperplasia

Xylopia aethiopica (XAE) is a commonly used herbal medicine that contains rich active ingredients with a variety of biological activities. This study aimed to explore the role of XAE in the management of benign prostatic hyperplasia (BPH). Testosterone propionate-induced BPH was established in albino rats, which were then treated with different concentrations of an ethanol extract of XAE leaf. After treatment, the rats were sacrificed, and body and prostate weights were recorded. Prostate-specific antigen (PSA) and acid phosphatase (ACP) levels in blood samples were also determined. Gas chromatography–mass spectrometry was conducted to identify the active chemical compounds. Docking analysis was performed to screen these compounds by evaluating their binding affinity with two pro-BPH protein targets (cellular prostatic ACP and PSA). Our data showed the presence of 44 chemical compounds in the XAE leaf extract. Body and prostate weights, as well as PSA and ACP levels, were significantly increased after BPH induction, and these changes were significantly reversed by XAE treatment. Interestingly, PSA and ACP levels in XAE-treated groups were reduced to almost the same levels as those in the healthy control. Docking analysis identified four top-posed compounds, β-amyrin, α-amyrin, α-amyrenone, and lupenone, with binding energies to prostatic ACP of −9.8, −8.3, −8.4, and −8.6, respectively, compared with −8.3 for the standard drug finasteride. Furthermore, two-dimensional analysis revealed strong interactions through hydrogen bonding, covalent interactions, and several van der Waals forces between the lead compounds and the target proteins. Notably, recurring interactions involved the same residues (Asn-1062, Lys-1250, Lys-1059, and Phe-1060) on the protein targets and the lead compounds. 
This study is the first to reveal a role for XAE in BPH therapy and will help guide drug design based on the lead compounds discovered in this work
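The screening logic above, ranking docked compounds by binding energy against the reference drug, can be sketched as follows. The energies are those reported in the abstract; the ranking and filtering code is an illustrative sketch, not the docking software's actual workflow.

```python
# Rank docked compounds by binding energy to prostatic ACP (more negative
# means stronger predicted binding) and flag those binding at least as
# strongly as the reference drug. Energies are from the abstract; the
# selection logic itself is an illustrative sketch.

binding = {
    "beta-amyrin":     -9.8,
    "alpha-amyrin":    -8.3,
    "alpha-amyrenone": -8.4,
    "lupenone":        -8.6,
    "finasteride":     -8.3,  # reference standard drug
}

reference = binding["finasteride"]
ranked = sorted(binding, key=binding.get)  # strongest binder first
leads = [c for c in ranked if c != "finasteride" and binding[c] <= reference]
print(ranked[0], len(leads))  # → beta-amyrin 4
```

Because docking scores are only rough estimates of affinity, such rankings are used to prioritise compounds for follow-up assays rather than as definitive evidence of potency.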

    Global fertility in 204 countries and territories, 1950–2021, with forecasts to 2100: a comprehensive demographic analysis for the Global Burden of Disease Study 2021

Background: Accurate assessments of current and future fertility—including overall trends and changing population age structures across countries and regions—are essential to help plan for the profound social, economic, environmental, and geopolitical challenges that these changes will bring. Estimates and projections of fertility are necessary to inform policies involving resource and health-care needs, labour supply, education, gender equality, and family planning and support. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 produced up-to-date and comprehensive demographic assessments of key fertility indicators at global, regional, and national levels from 1950 to 2021 and forecast fertility metrics to 2100 based on a reference scenario and key policy-dependent alternative scenarios. Methods: To estimate fertility indicators from 1950 to 2021, mixed-effects regression models and spatiotemporal Gaussian process regression were used to synthesise data from 8709 country-years of vital and sample registrations, 1455 surveys and censuses, and 150 other sources, and to generate age-specific fertility rates (ASFRs) for 5-year age groups from age 10 years to 54 years. ASFRs were summed across age groups to produce estimates of total fertility rate (TFR). Livebirths were calculated by multiplying ASFR and age-specific female population, then summing across ages 10–54 years. To forecast future fertility up to 2100, our Institute for Health Metrics and Evaluation (IHME) forecasting model was based on projections of completed cohort fertility at age 50 years (CCF50; the average number of children born over time to females from a specified birth cohort), which yields more stable and accurate measures of fertility than directly modelling TFR. 
CCF50 was modelled using an ensemble approach in which three sub-models (with two, three, and four covariates variously consisting of female educational attainment, contraceptive met need, population density in habitable areas, and under-5 mortality) were given equal weights, and analyses were conducted utilising the MR-BRT (meta-regression—Bayesian, regularised, trimmed) tool. To capture time-series trends in CCF50 not explained by these covariates, we used a first-order autoregressive model on the residual term. CCF50 as a proportion of each 5-year ASFR was predicted using a linear mixed-effects model with fixed-effects covariates (female educational attainment and contraceptive met need) and random intercepts for geographical regions. Projected TFRs were then computed for each calendar year as the sum of single-year ASFRs across age groups. The reference forecast is our estimate of the most likely fertility future given the model, past fertility, forecasts of covariates, and historical relationships between covariates and fertility. We additionally produced forecasts for multiple alternative scenarios in each location: the UN Sustainable Development Goal (SDG) for education is achieved by 2030; the contraceptive met need SDG is achieved by 2030; pro-natal policies are enacted to create supportive environments for those who give birth; and the previous three scenarios combined. Uncertainty from past data inputs and model estimation was propagated throughout analyses by taking 1000 draws for past and present fertility estimates and 500 draws for future forecasts from the estimated distribution for each metric, with 95% uncertainty intervals (UIs) given as the 2·5 and 97·5 percentiles of the draws. To evaluate the forecasting performance of our model and others, we computed skill values—a metric assessing gain in forecasting accuracy—by comparing predicted versus observed ASFRs from the past 15 years (2007–21). 
A positive skill metric indicates that the model being evaluated performs better than the baseline model (here, a simplified model holding 2007 values constant in the future), and a negative metric indicates that the evaluated model performs worse than baseline. Findings: During the period from 1950 to 2021, global TFR more than halved, from 4·84 (95% UI 4·63–5·06) to 2·23 (2·09–2·38). Global annual livebirths peaked in 2016 at 142 million (95% UI 137–147), declining to 129 million (121–138) in 2021. Fertility rates declined in all countries and territories since 1950, with TFR remaining above 2·1—canonically considered replacement-level fertility—in 94 (46·1%) countries and territories in 2021. This included 44 of 46 countries in sub-Saharan Africa, which was the super-region with the largest share of livebirths in 2021 (29·2% [28·7–29·6]). 47 countries and territories in which lowest estimated fertility between 1950 and 2021 was below replacement experienced one or more subsequent years with higher fertility; only three of these locations rebounded above replacement levels. Future fertility rates were projected to continue to decline worldwide, reaching a global TFR of 1·83 (1·59–2·08) in 2050 and 1·59 (1·25–1·96) in 2100 under the reference scenario. The number of countries and territories with fertility rates remaining above replacement was forecast to be 49 (24·0%) in 2050 and only six (2·9%) in 2100, with three of these six countries included in the 2021 World Bank-defined low-income group, all located in the GBD super-region of sub-Saharan Africa. The proportion of livebirths occurring in sub-Saharan Africa was forecast to increase to more than half of the world's livebirths in 2100: 41·3% (39·6–43·1) in 2050 and 54·3% (47·1–59·5) in 2100. 
The share of livebirths was projected to decline between 2021 and 2100 in most of the six other super-regions—decreasing, for example, in south Asia from 24·8% (23·7–25·8) in 2021 to 16·7% (14·3–19·1) in 2050 and 7·1% (4·4–10·1) in 2100—but was forecast to increase modestly in the north Africa and Middle East and high-income super-regions. Forecast estimates for the alternative combined scenario suggest that meeting SDG targets for education and contraceptive met need, as well as implementing pro-natal policies, would result in global TFRs of 1·65 (1·40–1·92) in 2050 and 1·62 (1·35–1·95) in 2100. The forecasting skill metric values for the IHME model were positive across all age groups, indicating that the model performs better than the constant prediction. Interpretation: Fertility is declining globally, with rates in more than half of all countries and territories in 2021 below replacement level. Trends since 2000 show considerable heterogeneity in the steepness of declines, and only a small number of countries experienced even a slight fertility rebound after their lowest observed rate, with none reaching replacement level. Additionally, the distribution of livebirths across the globe is shifting, with a greater proportion occurring in the lowest-income countries. Future fertility rates will continue to decline worldwide and will remain low even under successful implementation of pro-natal policies. These changes will have far-reaching economic and societal consequences due to ageing populations and declining workforces in higher-income countries, combined with an increasing share of livebirths among the already poorest regions of the world. Funding: Bill & Melinda Gates Foundation
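The forecasting skill metric used to validate the model can be sketched as a fractional reduction in error relative to the hold-2007-constant baseline. This is one common formulation (1 minus the ratio of model RMSE to baseline RMSE); the exact GBD formula may differ, and the ASFR series below are hypothetical.

```python
# Illustrative forecasting "skill" score: the fractional reduction in RMSE
# of a model's predictions relative to a baseline that holds the last
# observed value constant. Positive skill means the model beats the
# baseline; zero means no improvement. Hypothetical ASFR series.
import math

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def skill(model_pred, baseline_pred, observed):
    return 1.0 - rmse(model_pred, observed) / rmse(baseline_pred, observed)

observed = [0.080, 0.075, 0.071, 0.068]   # observed ASFR from 2007 onward
baseline = [0.080] * 4                     # hold the 2007 value constant
model    = [0.080, 0.076, 0.072, 0.067]   # model forecast for the same years

s = skill(model, baseline, observed)
print(s > 0)  # → True: the model tracks the decline the baseline misses
```

Because the baseline ignores the downward trend, any model that captures even part of the decline earns positive skill, which matches the interpretation given in the methods.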