Isolation, Identification and Antibiotic Resistance Profile of Public Health Threat Enteric Bacteria from Milk and Dairy Products Retail in Abakaliki, South-East, Nigeria
Milk is manufactured into more stable dairy products of worldwide value, such as butter, cheese, ice cream, and yoghurt. Consumption of milk or dairy products contaminated with pathogens causes human gastrointestinal infection, leading to diarrheal disease and, in severe cases, hospitalization or death, especially among the elderly and children. This assessment was designed to determine the microbiological quality of milk and dairy products consumed in Abakaliki, Nigeria. Culture techniques were used to isolate enteric bacteria from retail dairy products, and the disk diffusion method was used to determine the antibiotic resistance profile of the isolates. Bacterial pathogens were characterized and identified using morphological and biochemical techniques. SPSS and the Chi-square test were used for analysis; a P-value of 0.02 indicated a significant difference between the pathogen counts. A total of 161 pathogenic bacteria were isolated from 100 dairy products: Salmonella spp. (26.1%), Escherichia coli (44.1%), and Shigella spp. (29.8%). All identified isolates were 100% susceptible to ciprofloxacin and gentamycin, and 66.7% were susceptible to ofloxacin, whereas resistance to Augmentin, ampicillin, chloramphenicol, and spectinomycin was 100%. The data obtained confirm that milk and dairy products retailed in Abakaliki pose a serious public health threat to consumers due to the presence of pathogenic bacteria. Standard and good storage conditions, as well as environmental and personnel hygiene, should be practiced to prevent contamination of milk and dairy products and ensure the safety of consumers.
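The Chi-square comparison of pathogen counts reported above can be sketched as a goodness-of-fit test. The counts below are reconstructed from the abstract's percentages of 161 isolates (26.1%, 44.1%, 29.8%); the equal-frequency null hypothesis is an illustrative assumption, not necessarily the study's exact test.

```python
# Pearson chi-square goodness-of-fit on the reported isolate counts.
# Counts reconstructed from percentages of 161 isolates:
# Salmonella 26.1% ~= 42, E. coli 44.1% ~= 71, Shigella 29.8% ~= 48.

def chi_square_stat(observed, expected):
    """Pearson chi-square statistic: sum((O - E)^2 / E)."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [42, 71, 48]            # Salmonella, E. coli, Shigella
total = sum(observed)
expected = [total / 3] * 3         # equal counts under the null (assumption)

stat = chi_square_stat(observed, expected)
critical_005_df2 = 5.991           # chi-square critical value, df = 2, alpha = 0.05
print(f"chi2 = {stat:.2f}, significant at 0.05: {stat > critical_005_df2}")
```

With these reconstructed counts the statistic exceeds the 5% critical value, consistent with the significant difference the abstract reports.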
The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance
INTRODUCTION
Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.
RATIONALE
We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).
RESULTS
Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.
CONCLUSION
Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.
Global burden of 288 causes of death and life expectancy decomposition in 204 countries and territories and 811 subnational locations, 1990–2021: a systematic analysis for the Global Burden of Disease Study 2021
BACKGROUND Regular, detailed reporting on population health by underlying cause of death is fundamental for public health decision making. Cause-specific estimates of mortality and the subsequent effects on life expectancy worldwide are valuable metrics to gauge progress in reducing mortality rates. These estimates are particularly important following large-scale mortality spikes, such as the COVID-19 pandemic. When systematically analysed, mortality rates and life expectancy allow comparisons of the consequences of causes of death globally and over time, providing a nuanced understanding of the effect of these causes on global populations. METHODS The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2021 cause-of-death analysis estimated mortality and years of life lost (YLLs) from 288 causes of death by age-sex-location-year in 204 countries and territories and 811 subnational locations for each year from 1990 until 2021. The analysis used 56 604 data sources, including data from vital registration and verbal autopsy as well as surveys, censuses, surveillance systems, and cancer registries, among others. As with previous GBD rounds, cause-specific death rates for most causes were estimated using the Cause of Death Ensemble model (a modelling tool developed for GBD to assess the out-of-sample predictive validity of different statistical models and covariate permutations and to combine those results to produce cause-specific mortality estimates), with alternative strategies adapted to model causes with insufficient data, substantial changes in reporting over the study period, or unusual epidemiology. YLLs were computed as the product of the number of deaths for each cause-age-sex-location-year and the standard life expectancy at each age. As part of the modelling process, uncertainty intervals (UIs) were generated using the 2·5th and 97·5th percentiles from a 1000-draw distribution for each metric.
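The YLL computation described above (deaths multiplied by the standard life expectancy at the age of death, summed over strata) can be sketched as follows. The death counts and life-table values are hypothetical illustrations, not GBD estimates.

```python
# YLLs = deaths x standard remaining life expectancy at the age of death,
# summed over age groups. All numbers below are hypothetical.

def years_of_life_lost(deaths_by_age, std_life_expectancy):
    """Sum deaths[a] * standard remaining life expectancy at age a."""
    return sum(deaths_by_age[a] * std_life_expectancy[a] for a in deaths_by_age)

deaths_by_age = {0: 120, 25: 40, 65: 300}             # deaths per age group (illustrative)
std_life_expectancy = {0: 88.9, 25: 64.2, 65: 24.5}   # standard life table (illustrative)

yll = years_of_life_lost(deaths_by_age, std_life_expectancy)
print(f"YLLs = {yll:,.0f}")
```

A death at a young age contributes far more YLLs than one at an older age, which is why the metric weights premature mortality heavily.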
We decomposed life expectancy by cause of death, location, and year to show cause-specific effects on life expectancy from 1990 to 2021. We also used the coefficient of variation and the fraction of population affected by 90% of deaths to highlight concentrations of mortality. Findings are reported in counts and age-standardised rates. Methodological improvements for cause-of-death estimates in GBD 2021 include the expansion of the under-5 age group to include four new age groups, enhanced methods to account for stochastic variation in sparse data, and the inclusion of COVID-19 and other pandemic-related mortality (which includes excess mortality associated with the pandemic, excluding COVID-19, lower respiratory infections, measles, malaria, and pertussis). For this analysis, 199 new country-years of vital registration cause-of-death data, 5 country-years of surveillance data, 21 country-years of verbal autopsy data, and 94 country-years of other data types were added to those used in previous GBD rounds. FINDINGS The leading causes of age-standardised deaths globally were the same in 2019 as they were in 1990; in descending order, these were ischaemic heart disease, stroke, chronic obstructive pulmonary disease, and lower respiratory infections. In 2021, however, COVID-19 replaced stroke as the second-leading age-standardised cause of death, with 94·0 deaths (95% UI 89·2-100·0) per 100 000 population. The COVID-19 pandemic shifted the rankings of the leading five causes, lowering stroke to the third-leading and chronic obstructive pulmonary disease to the fourth-leading position. In 2021, the highest age-standardised death rates from COVID-19 occurred in sub-Saharan Africa (271·0 deaths [250·1-290·7] per 100 000 population) and Latin America and the Caribbean (195·4 deaths [182·1-211·4] per 100 000 population).
The lowest age-standardised death rates from COVID-19 were in the high-income super-region (48·1 deaths [47·4-48·8] per 100 000 population) and southeast Asia, east Asia, and Oceania (23·2 deaths [16·3-37·2] per 100 000 population). Globally, life expectancy steadily improved between 1990 and 2019 for 18 of the 22 investigated causes. Decomposition of global and regional life expectancy showed the positive effect that reductions in deaths from enteric infections, lower respiratory infections, stroke, and neonatal deaths, among others, have had on survival over the study period. However, a net reduction of 1·6 years occurred in global life expectancy between 2019 and 2021, primarily due to increased death rates from COVID-19 and other pandemic-related mortality. Life expectancy was highly variable between super-regions over the study period, with southeast Asia, east Asia, and Oceania gaining 8·3 years (6·7-9·9) overall, while having the smallest reduction in life expectancy due to COVID-19 (0·4 years). The largest reduction in life expectancy due to COVID-19 occurred in Latin America and the Caribbean (3·6 years). Additionally, 53 of the 288 causes of death were highly concentrated in locations with less than 50% of the global population as of 2021, and these causes of death have become progressively more concentrated since 1990, when only 44 causes showed this pattern. The concentration phenomenon is discussed heuristically with respect to enteric and lower respiratory infections, malaria, HIV/AIDS, neonatal disorders, tuberculosis, and measles. INTERPRETATION Long-standing gains in life expectancy and reductions in many of the leading causes of death have been disrupted by the COVID-19 pandemic, the adverse effects of which were spread unevenly among populations. Despite the pandemic, there has been continued progress in combatting several notable causes of death, leading to improved global life expectancy over the study period.
Each of the seven GBD super-regions showed an overall improvement between 1990 and 2021, obscuring the negative effect of the pandemic years. Additionally, our findings regarding regional variation in the causes of death driving increases in life expectancy hold clear policy utility. Analyses of shifting mortality trends reveal that several causes, once widespread globally, are now increasingly concentrated geographically. These changes in mortality concentration, alongside further investigation of changing risks, interventions, and relevant policy, present an important opportunity to deepen our understanding of mortality-reduction strategies. Examining patterns in mortality concentration might reveal areas where successful public health interventions have been implemented. Translating these successes to locations where certain causes of death remain entrenched can inform policies that work to improve life expectancy for people everywhere. FUNDING Bill & Melinda Gates Foundation
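The mortality-concentration idea above (the fraction of the population accounting for 90% of deaths from a cause) can be sketched as follows. The abstract does not give the exact GBD definition, so ranking locations by death rate and the example numbers are assumptions for illustration.

```python
# Fraction of the total population that accounts for a given share of deaths:
# rank locations by death rate (highest first) and accumulate population
# until the coverage threshold of deaths is reached. Definition and data
# are illustrative assumptions, not the GBD implementation.

def population_fraction_for_deaths(locations, coverage=0.90):
    """locations: list of (deaths, population) tuples. Returns the
    population fraction covering `coverage` of all deaths."""
    total_deaths = sum(d for d, _ in locations)
    total_pop = sum(p for _, p in locations)
    covered_deaths = covered_pop = 0
    for deaths, pop in sorted(locations, key=lambda x: -x[0] / x[1]):
        covered_deaths += deaths
        covered_pop += pop
        if covered_deaths >= coverage * total_deaths:
            break
    return covered_pop / total_pop

# Hypothetical: deaths heavily concentrated in one of four equal-size locations.
locs = [(900, 100), (80, 100), (15, 100), (5, 100)]
frac = population_fraction_for_deaths(locs)
print(frac)  # -> 0.25
```

A small fraction (here 25% of the population covering 90% of deaths) signals a geographically concentrated cause, the pattern the study tracks over time.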
Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries
Abstract
Background
Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres.
Methods
This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries.
Results
In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable for more than 90 per cent of patients except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. In phase 4, the top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia.
Conclusion
This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.
Managing recurrent urinary tract infections in kidney transplant recipients using smartphone assisted urinalysis test
Evaluating the antibody response to SARS-COV-2 vaccination amongst kidney transplant recipients at a single nephrology centre
BACKGROUND AND OBJECTIVES: Kidney transplant recipients are highly vulnerable to the serious complications of severe acute respiratory syndrome coronavirus 2 (SARS-COV-2) infection and thus stand to benefit from vaccination. It is therefore necessary to establish the effectiveness of available vaccines, as this group of patients was not represented in the randomized trials. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: A total of 707 consecutive adult kidney transplant recipients at a single centre in the United Kingdom were evaluated. Of these, 373 who were confirmed to have received two doses of either BNT162b2 (Pfizer-BioNTech) or AZD1222 (Oxford-AstraZeneca) and subsequently had SARS-COV-2 antibody testing were included in the final analysis. Participants were excluded from the analysis if they had a previous history of SARS-COV-2 infection or were seropositive for SARS-COV-2 antibody pre-vaccination. Multivariate and propensity-score analyses were performed to identify the predictors of antibody response to SARS-COV-2 vaccines. The primary outcome was the seroconversion rate following two vaccine doses. RESULTS: 56.8% (212/373) were antibody responders and 43.2% (161/373) were non-responders. Antibody response was associated with greater estimated glomerular filtration rate (eGFR) [odds ratio (OR) for every 10 ml/min/1.73 m(2) increase, 1.40 (1.19–1.66), P<0.001], whereas non-response was associated with mycophenolic acid immunosuppression [OR, 0.02 (0.01–0.11), p<0.001] and increasing age [OR per 10-year increase, 0.61 (0.48–0.78), p<0.001]. In the propensity-score analysis of four treatment variables (vaccine type, mycophenolic acid, corticosteroid, and triple immunosuppression), only mycophenolic acid was significantly associated with vaccine response [adjusted OR by PSA, 0.17 (0.07–0.41); p<0.001]. Twenty-two SARS-COV-2 infections were recorded in our cohort following vaccination; 17 (77%) of these, with 3 deaths, occurred in the non-responder group.
No death occurred in the responder group. CONCLUSION: Vaccine response in allograft recipients after two doses of SARS-COV-2 vaccine is poor compared with the general population. Maintenance with mycophenolic acid appears to have the strongest negative impact on vaccine response.
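The abstract reports odds ratios with 95% confidence intervals. A standard Wald-type computation from a 2x2 table can be sketched as follows; the cell counts are hypothetical, not the study's data.

```python
import math

# Odds ratio with a 95% Wald confidence interval from a 2x2 table:
#               responder   non-responder
# exposed           a             b
# unexposed         c             d
# The counts used below are hypothetical illustrations.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Return (OR, lower, upper) using the Wald interval on log(OR)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

or_, lo, hi = odds_ratio_ci(40, 60, 120, 45)  # hypothetical counts
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An OR below 1 with a confidence interval excluding 1, as for mycophenolic acid in the study, indicates an association with non-response.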
Factors Governing the Erythropoietic Response to Intravenous Iron Infusion in Patients with Chronic Kidney Disease: A Retrospective Cohort Study
Background: Limited knowledge exists about the factors affecting the response to parenteral iron. A study was conducted to determine the factors influencing the erythropoietic response to parenteral iron in iron-deficient anaemic patients whose kidney function ranged from normal through all stages of chronic kidney disease (CKD) severity. Methods: This retrospective cohort study included parenteral iron recipients who did not receive erythropoiesis-stimulating agents (ESA) between 2017 and 2019. The study cohort was derived from two groups of patients: those managed by the CKD team and those being optimised for surgery in the pre-operative clinic. Patients were categorized based on their kidney function: patients with normal kidney function [estimated glomerular filtration rate (eGFR) ≥ 60 mL/min/1.73 m2] were compared with those with CKD stages 3–5 (eGFR < 60 mL/min/1.73 m2). Patients were further stratified by the type of iron deficiency [absolute iron deficiency (AID) versus functional iron deficiency (FID)]. The key outcome was the change in haemoglobin (∆Hb) between pre- and post-infusion haemoglobin (Hb) values. Parenteral iron response was assessed using propensity-score matching and multivariate linear regression, and the impact of kidney impairment versus the nature of the iron deficiency (AID vs. FID) on response was explored. Results: 732 subjects (mean age 66 ± 17 years, 56% female, and 87% White) were evaluated. No significant differences were observed in the time to repeat Hb among CKD stages or between FID and AID patients. The Hb rise was significantly lower with lower kidney function (non-CKD and CKD 1–2: 13 g/L; CKD 3–5: 7 g/L; p < 0.001). When groups with different degrees of renal impairment were propensity-score matched according to whether the iron deficiency was due to AID or FID, the level of CKD was found not to be relevant to Hb responses [unmatched (∆Hb) 12.1 vs. 8.7 g/L; matched (∆Hb) 12.4 vs. 12.1 g/L in non-CKD and CKD 1–2 versus CKD 3–5, respectively].
However, a comparison of patients with AID and FID, while controlling for the degree of CKD, indicated that patients with FID exhibited a diminished Hb response regardless of their level of kidney impairment. Conclusion: The nature of the iron deficiency, rather than the severity of CKD, has the stronger impact on the Hb response to intravenous iron, with an attenuated response seen in functional iron deficiency irrespective of the degree of renal impairment.
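Propensity-score matching, as used in the study above, typically pairs each treated subject with the control whose score is closest. A minimal greedy 1:1 nearest-neighbour sketch follows; the scores are hypothetical, and the study's actual matching model is not described in the abstract.

```python
# Greedy 1:1 nearest-neighbour matching on a scalar score (e.g. a propensity
# score): each treated subject is paired with the closest unused control.
# Scores below are hypothetical illustrations.

def greedy_match(treated, controls):
    """Return (treated_score, control_score) pairs, matching greedily."""
    available = list(controls)
    pairs = []
    for t in treated:
        best = min(available, key=lambda c: abs(c - t))  # nearest unused control
        available.remove(best)
        pairs.append((t, best))
    return pairs

pairs = greedy_match([0.30, 0.55], [0.10, 0.32, 0.60, 0.90])
print(pairs)  # -> [(0.3, 0.32), (0.55, 0.6)]
```

After matching, outcomes (here, ∆Hb) are compared within pairs, so groups with similar covariate profiles are contrasted rather than the raw, confounded cohorts.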
Starch Hydrolysis, Polyphenol Contents, and In Vitro Alpha Amylase Inhibitory Properties of Some Nigerian Foods As Affected by Cooking
The effect of cooking on starch hydrolysis, polyphenol contents, and in vitro α-amylase inhibitory properties of mushrooms (two varieties, Russula virescens and Auricularia auricula-judae), sweet potato (Ipomea batatas), and potato (Solanum tuberosum) was investigated. The total, resistant, and digestible starch contents of the raw and cooked food samples (FS) ranged from 6.4 to 64.9, 0 to 10.1, and 6.4 to 62.7 g/100 g, respectively, while their percentages of starch digestibility (DS values, expressed as percentages of total starch hydrolyzed) ranged from 45.99 to 100. Raw and boiled unpeeled potato, raw and boiled peeled potato, raw A. auricula-judae, and sweet potato showed mild to high α-amylase inhibition (over a concentration range of 10–50 mg/mL), which was lower than that of acarbose (69% inhibition of α-amylase over a concentration range of 2–10 mg/mL); in contrast, raw R. virescens, boiled A. auricula-judae, and boiled sweet potato activated α-amylase, and boiled R. virescens gave 0% inhibition. In addition, the FS contained flavonoids and phenols. The significant negative correlation (r = −0.55; P = 0.05) between the α-amylase inhibitory properties of the raw and cooked FS and their DS indicates that the α-amylase inhibitors in these FS also influenced the digestibility of their starches. Similarly, the significant positive correlation (r = 0.59; P = 0.01) between the α-amylase inhibitory properties of the raw and cooked FS and their resistant starch (RS) contents indicates that the RS constituents of these FS contributed to their α-amylase inhibitory properties. The study showed the usefulness of boiled unpeeled potato, boiled peeled potato, and raw sweet potato as functional foods for people with type 2 diabetes.
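The r-values reported above are Pearson correlation coefficients. A minimal sketch of the computation follows; the inhibition and digestibility values are hypothetical, not the study's measurements.

```python
import math

# Pearson correlation coefficient: covariance of x and y divided by the
# product of their standard deviations. Data below are hypothetical
# illustrations of an inverse inhibition/digestibility relationship.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

inhibition = [70, 60, 45, 30, 10]     # % alpha-amylase inhibition (hypothetical)
digestibility = [50, 58, 70, 85, 98]  # % starch digestibility (hypothetical)
r = pearson_r(inhibition, digestibility)
print(f"r = {r:.2f}")
```

A negative r, as in this sketch, mirrors the study's finding that samples with stronger α-amylase inhibition tended to have less digestible starch.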
Periodic Effects of Crude Oil Pollution on Some Nutrient Elements of Soils Treated Over a 90 Day Period Using Schwenkia Americana L. and Spermacoce ocymoides Burm. f.
Crude oil contamination of the environment severely impedes the soil ecosystem: through adsorption onto and assimilation by soil particles, it contributes excess carbon that may be unavailable to the microbial population, thereby constraining soil nutrients. This study investigated the effects of crude oil contamination on some soil nutrient elements. Laboratory analyses were carried out using standard methods. Compared with the values before planting, results obtained over the 90-day planting period revealed a significant decrease in the treated soils’ exchangeable calcium, exchangeable magnesium, total nitrogen, phosphorus, and potassium contents, whereas a significant increase was recorded in the sulfur content. This indicates a deficiency of these nutrients in soils phyto-remediated over a 90-day period, making it imperative that such soils be augmented with nutrients before use for agricultural and other related purposes.