28 research outputs found

    Isolation, Identification and Antibiotic Resistance Profile of Public Health Threat Enteric Bacteria from Milk and Dairy Products Retail in Abakaliki, South-East, Nigeria

    Milk and foods made from milk are manufactured into more stable dairy products of worldwide value, such as butter, cheese, ice cream, and yoghurt. Consumption of milk or dairy products contaminated by pathogens causes gastrointestinal infection, which leads to diarrheal disease in humans and to hospitalization or death in severe cases, especially among the elderly and children. This assessment was designed to determine the microbiological quality of milk and dairy products consumed in Abakaliki, Nigeria. Culture techniques were used to isolate enteric bacteria from retail dairy products, and the disk diffusion method was used to determine the antibiotic resistance profile of the isolates. Bacterial pathogens isolated were characterized and identified using morphological and biochemical techniques. SPSS and the Chi-square test were used for analysis; a P-value of 0.02 indicated a significant difference between the bacterial pathogen counts. A total of 161 pathogenic bacteria were isolated from 100 dairy products: Salmonella spp. accounted for 26.1%, Escherichia coli for 44.1%, and Shigella spp. for 29.8% of the isolates. All identified isolates were 100% susceptible to ciprofloxacin and gentamicin, and 66.7% were susceptible to ofloxacin. Resistance to Augmentin, ampicillin, chloramphenicol and spectinomycin was 100%. The data confirm that milk and dairy products retailed in Abakaliki pose a serious public health threat to consumers due to the presence of pathogenic bacteria. Standard and good storage conditions, as well as environmental and personnel hygiene, should be practiced to prevent contamination of milk and dairy products for the safety of consumers.
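    The abstract reports a Chi-square comparison of the pathogen counts (P = 0.02). The sketch below is illustrative only, not the authors' code: it applies a goodness-of-fit test to approximate counts back-calculated from the reported percentages of the 161 isolates.

```python
# Illustrative sketch, not the study's analysis: Chi-square goodness-of-fit test
# on approximate isolate counts derived from the reported percentages
# (26.1% Salmonella, 44.1% E. coli, 29.8% Shigella of 161 isolates).
from scipy.stats import chisquare

observed = [42, 71, 48]             # Salmonella spp., E. coli, Shigella spp.
expected = [sum(observed) / 3] * 3  # null hypothesis: pathogens equally frequent

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.4f}")
```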

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION: Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.

    RATIONALE: We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).

    RESULTS: Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.

    CONCLUSION: Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.
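    As an aside, the turnaround-time metric discussed above (time from sampling to sequence submission) can be computed from routine submission metadata. The snippet below is a hypothetical sketch with assumed column names and made-up dates; it is not the consortium's pipeline.

```python
# Hypothetical sketch: per-country median turnaround time from sample collection
# to sequence submission. Column names and dates are assumptions for illustration.
import pandas as pd

meta = pd.DataFrame({
    "country": ["Kenya", "Kenya", "Nigeria"],
    "collection_date": ["2021-06-01", "2021-07-10", "2021-06-20"],
    "submission_date": ["2021-06-25", "2021-08-02", "2021-07-30"],
})

meta["turnaround_days"] = (
    pd.to_datetime(meta["submission_date"]) - pd.to_datetime(meta["collection_date"])
).dt.days

print(meta.groupby("country")["turnaround_days"].median())
```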

    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable to more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.
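    The phase 3/4 logic (screen interventions against an acceptability threshold, then rank what remains) can be illustrated with a short sketch. The intervention names mirror the abstract, but the scores are placeholders, not study data.

```python
# Placeholder sketch of the co-prioritization step: keep interventions that at
# least 90% of patients find acceptable, then rank by acceptability and a
# hypothetical clinician feasibility score (values are not from the study).
interventions = [
    # (name, % patient acceptability, clinician feasibility 0-10)
    ("Introduce recycling",            97, 8.4),
    ("Reduce anaesthetic gas use",     93, 7.9),
    ("Reduce general anaesthesia",     84, 6.1),
    ("Re-sterilise single-use items",  86, 6.8),
]

acceptable = [i for i in interventions if i[1] >= 90]
ranked = sorted(acceptable, key=lambda i: (i[1], i[2]), reverse=True)

for name, acc, feas in ranked:
    print(f"{name}: acceptability {acc}%, feasibility {feas}/10")
```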

    Evaluating the antibody response to SARS-COV-2 vaccination amongst kidney transplant recipients at a single nephrology centre

    BACKGROUND AND OBJECTIVES: Kidney transplant recipients are highly vulnerable to the serious complications of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection and thus stand to benefit from vaccination. It is therefore necessary to establish the effectiveness of the available vaccines, as this group of patients was not represented in the randomized trials. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: A total of 707 consecutive adult kidney transplant recipients in a single center in the United Kingdom were evaluated. Of these, 373 who were confirmed to have received two doses of either BNT162b2 (Pfizer-BioNTech) or AZD1222 (Oxford-AstraZeneca) and subsequently had SARS-CoV-2 antibody testing were included in the final analysis. Participants were excluded if they had a previous history of SARS-CoV-2 infection or were seropositive for SARS-CoV-2 antibody pre-vaccination. Multivariate and propensity score analyses were performed to identify predictors of antibody response to SARS-CoV-2 vaccines. The primary outcome was the seroconversion rate following two vaccine doses. RESULTS: Antibody responders comprised 56.8% (212/373) of the cohort and non-responders 43.2% (161/373). Antibody response was associated with greater estimated glomerular filtration rate (eGFR) [odds ratio (OR) per 10 mL/min/1.73 m2 = 1.40 (1.19–1.66), P<0.001], whereas non-response was associated with mycophenolic acid immunosuppression [OR 0.02 (0.01–0.11), P<0.001] and increasing age [OR per 10-year increase 0.61 (0.48–0.78), P<0.001]. In the propensity-score analysis of four treatment variables (vaccine type, mycophenolic acid, corticosteroid, and triple immunosuppression), only mycophenolic acid was significantly associated with vaccine response [adjusted OR 0.17 (0.07–0.41), P<0.001]. Twenty-two SARS-CoV-2 infections were recorded in the cohort following vaccination; 17 (77%) of these, with three deaths, occurred in the non-responder group, and no deaths occurred in the responder group. CONCLUSION: Vaccine response in allograft recipients after two doses of SARS-CoV-2 vaccine is poor compared with the general population, and maintenance with mycophenolic acid appears to have the strongest negative impact on vaccine response.
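    The multivariable analysis described (odds ratios per 10 mL/min/1.73 m2 of eGFR, per 10 years of age, and for mycophenolic acid use) corresponds to a logistic regression on seroconversion. The sketch below uses synthetic data and assumed variable names purely to illustrate that model form; it is not the study code.

```python
# Minimal sketch with synthetic data: logistic regression for seroconversion on
# eGFR (per 10 mL/min/1.73 m2), age (per 10 years) and mycophenolic acid (MPA).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 373
egfr10 = rng.normal(5.0, 1.5, n)    # eGFR divided by 10
age10 = rng.normal(5.5, 1.3, n)     # age divided by 10
mpa = rng.integers(0, 2, n)         # mycophenolic acid exposure (0/1)

logit = -1.0 + 0.34 * egfr10 - 0.49 * age10 - 1.8 * mpa   # assumed effect sizes
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))             # seroconversion (0/1)

X = sm.add_constant(np.column_stack([egfr10, age10, mpa]))
fit = sm.Logit(y, X).fit(disp=0)
print(np.exp(fit.params))   # exponentiated coefficients = odds ratios
```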

    Factors Governing the Erythropoietic Response to Intravenous Iron Infusion in Patients with Chronic Kidney Disease: A Retrospective Cohort Study

    Background: Limited knowledge exists about the factors affecting the response to parenteral iron. A study was conducted to determine the factors influencing the erythropoietic response to parenteral iron in iron-deficient anaemic patients whose kidney function ranged from normal through all stages of chronic kidney disease (CKD) severity. Methods: This retrospective cohort study included parenteral iron recipients who did not receive erythropoiesis-stimulating agents (ESA) between 2017 and 2019. The study cohort was derived from two groups of patients: those managed by the CKD team and patients being optimised for surgery in the pre-operative clinic. Patients were categorized based on their kidney function: patients with normal kidney function [estimated glomerular filtration rate (eGFR) ≥ 60 mL/min/1.73 m2] were compared with those with CKD stages 3–5 (eGFR < 60 mL/min/1.73 m2). Patients were further stratified by the type of iron deficiency [absolute iron deficiency (AID) versus functional iron deficiency (FID)]. The key outcome was the change in haemoglobin (∆Hb) between pre- and post-infusion haemoglobin (Hb) values. Parenteral iron response was assessed using propensity-score matching and multivariate linear regression. The impact of kidney impairment versus the nature of iron deficiency (AID vs. FID) on response was explored. Results: A total of 732 subjects (mean age 66 ± 17 years, 56% female and 87% White) were evaluated. No significant differences were observed in the time to repeat Hb among CKD stages or between FID and AID patients. The Hb rise was significantly smaller with lower kidney function (non-CKD and CKD 1–2: 13 g/L; CKD 3–5: 7 g/L; p < 0.001). When groups with different degrees of renal impairment were propensity-score matched according to whether iron deficiency was due to AID or FID, the level of CKD was found not to be relevant to Hb responses [unmatched ∆Hb 12.1 vs. 8.7 g/L; matched ∆Hb 12.4 vs. 12.1 g/L in non-CKD and CKD 1–2 versus CKD 3–5, respectively]. However, a comparison of patients with AID and FID, while controlling for the degree of CKD, indicated that patients with FID exhibited a diminished Hb response regardless of their level of kidney impairment. Conclusion: The nature of the iron deficiency, rather than the severity of CKD, has the stronger impact on the Hb response to intravenous iron, with an attenuated response seen in functional iron deficiency irrespective of the degree of renal impairment.
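    The core comparison (FID versus AID after balancing on covariates) rests on propensity-score matching. The sketch below, using synthetic data, assumed covariates, and simple 1:1 nearest-neighbour matching, shows the general shape of such an analysis; it is not the study's actual code or matching specification.

```python
# Synthetic, illustrative sketch: propensity score for functional (FID) vs.
# absolute (AID) iron deficiency, 1:1 nearest-neighbour matching on the score,
# then comparison of the haemoglobin rise (delta_hb) in the matched groups.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 732
df = pd.DataFrame({
    "fid": rng.integers(0, 2, n),          # 1 = functional iron deficiency
    "egfr": rng.normal(55, 25, n),         # assumed covariates
    "age": rng.normal(66, 17, n),
    "delta_hb": rng.normal(10, 6, n),      # post- minus pre-infusion Hb, g/L
})

ps = LogisticRegression().fit(df[["egfr", "age"]], df["fid"])
df["score"] = ps.predict_proba(df[["egfr", "age"]])[:, 1]

fid, aid = df[df.fid == 1], df[df.fid == 0]
nn = NearestNeighbors(n_neighbors=1).fit(aid[["score"]])
_, idx = nn.kneighbors(fid[["score"]])
matched_aid = aid.iloc[idx.ravel()]

print("mean delta Hb, FID vs matched AID:",
      round(fid.delta_hb.mean(), 1), round(matched_aid.delta_hb.mean(), 1))
```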

    Starch Hydrolysis, Polyphenol Contents, and In Vitro Alpha Amylase Inhibitory Properties of Some Nigerian Foods As Affected by Cooking

    The effect of cooking on starch hydrolysis, polyphenol contents, and in vitro α-amylase inhibitory properties of mushrooms (two varieties, Russula virescens and Auricularia auricula-judae), sweet potato (Ipomoea batatas), and potato (Solanum tuberosum) was investigated. The total, resistant, and digestible starch contents of the raw and cooked food samples (FS) ranged from 6.4 to 64.9, 0 to 10.1, and 6.4 to 62.7 g/100 g, respectively, while their starch digestibility (DS, expressed as the percentage of total starch hydrolyzed) ranged from 45.99 to 100%. Raw and boiled unpeeled potato, raw and boiled peeled potato, raw A. auricula-judae, and sweet potato showed mild to high α-amylase inhibition (over a concentration range of 10–50 mg/mL), which was lower than that of acarbose (69% inhibition of α-amylase over a concentration range of 2–10 mg/mL), unlike raw R. virescens, boiled A. auricula-judae, and boiled sweet potato, which activated α-amylase, and boiled R. virescens, which gave 0% inhibition. In addition, the FS contained flavonoids and phenols. The significant negative correlation (r = −0.55; P = 0.05) between the α-amylase inhibitory properties of the raw and cooked FS and their DS indicates that the α-amylase inhibitors in these FS also influenced the digestibility of their starches. Furthermore, the significant positive correlation (r = 0.59; P = 0.01) between the α-amylase inhibitory properties of the raw and cooked FS and their resistant starch (RS) contents indicates that the RS constituents of these FS contributed to their α-amylase inhibitory properties. The study showed the usefulness of boiled unpeeled potato, boiled peeled potato, and raw sweet potato as functional foods for people with type 2 diabetes.
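    The reported correlations (r = −0.55 with DS; r = 0.59 with RS) are Pearson coefficients. The snippet below is a toy illustration with invented values, not the study data, showing how such correlations are computed.

```python
# Toy illustration (invented values, not the study data): Pearson correlations
# between alpha-amylase inhibition and starch digestibility (DS), and between
# inhibition and resistant starch (RS) content.
from scipy.stats import pearsonr

inhibition = [55, 40, 12, 0, 30, 48, 20, 35]          # % alpha-amylase inhibition
digestibility = [60, 75, 95, 100, 80, 65, 90, 78]     # DS, % of total starch hydrolyzed
resistant_starch = [8.1, 5.0, 1.2, 0.0, 3.9, 7.2, 1.8, 4.5]   # RS, g/100 g

r_ds, p_ds = pearsonr(inhibition, digestibility)
r_rs, p_rs = pearsonr(inhibition, resistant_starch)
print(f"inhibition vs DS: r = {r_ds:.2f}, p = {p_ds:.3f}")
print(f"inhibition vs RS: r = {r_rs:.2f}, p = {p_rs:.3f}")
```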

    Periodic Effects of Crude Oil Pollution on Some Nutrient Elements of Soils Treated Over a 90 Day Period Using Schwenkia Americana L. and Spermacoce ocymoides Burm. f.

    Crude oil contamination of the environment severely impedes the soil ecosystem through adsorption to and surface assimilation of soil particles, contributing excess carbon that may be unavailable to the microbial population and thereby constraining soil nutrients. This study investigated the effects of crude oil contamination on some soil nutrient elements. Laboratory analyses were carried out using standard methods. Compared with the values before planting, results obtained over the 90-day planting period revealed a significant decrease in the treated soils’ exchangeable calcium, exchangeable magnesium, total nitrogen, phosphorus and potassium contents, whereas a significant increase was recorded in the sulfur content. This indicates a deficiency of these nutrients in soils phytoremediated over a 90-day period, making it imperative that such soils be augmented with nutrients before use for agricultural and other related purposes.
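    The before/after comparison implied here (nutrient values at baseline versus after 90 days of treatment) is typically tested with a paired test. The sketch below uses made-up replicate values for total nitrogen only to show the form of such a test; it is not the study data and the study's statistical method is not stated.

```python
# Made-up values, illustrative only: paired t-test of total nitrogen before
# planting versus after the 90-day phytoremediation period.
from scipy.stats import ttest_rel

nitrogen_before = [0.21, 0.19, 0.23, 0.20, 0.22]      # total N, % (replicate plots)
nitrogen_after_90d = [0.14, 0.12, 0.15, 0.13, 0.15]

stat, p_value = ttest_rel(nitrogen_before, nitrogen_after_90d)
print(f"paired t = {stat:.2f}, p = {p_value:.4f}")
```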