
    Efficiency of Extraction of Trace Metals from Blood Samples Using Wet Digestion and Microwave Digestion Techniques

    The efficiency of extraction of trace metals using the conventional wet acid digestion method (CDM) and the microwave-induced acid digestion method (MWD) was determined by recovery experiments. The higher percentage recoveries obtained with the microwave-induced acid digestion method make it the more efficient of the two. The conventional wet acid digestion method is also time-consuming, whereas the MWD method saves time: less than four minutes were required to complete the digestion of the biological samples in this study.
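    The recovery comparison above reduces to simple arithmetic: percentage recovery is the amount of metal measured after digestion divided by the known amount spiked into the sample. A minimal sketch, using hypothetical spike-recovery values rather than figures from the study:

```python
def percent_recovery(measured_ug: float, spiked_ug: float) -> float:
    """Percentage of a spiked analyte recovered after digestion."""
    return 100.0 * measured_ug / spiked_ug

# Hypothetical spike amounts (ug of metal), not data from the study
wet_digestion = percent_recovery(measured_ug=8.7, spiked_ug=10.0)  # 87.0
microwave = percent_recovery(measured_ug=9.8, spiked_ug=10.0)      # 98.0
```

A consistently higher percentage recovery for the microwave method, as reported above, is what marks it as the more efficient extraction.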

    Essential and Non-Essential Metals Profile in Blood of Some Nigerian Pregnant Women

    In this study, the concentrations of some essential (Ca, Cu, Fe, Mg, Mn, Ni and Zn) and non-essential metals (Cd and Pb) were determined in the blood of pregnant women aged 15–45 years enrolled for antenatal care at the Obafemi Awolowo University Teaching Hospitals Complex, Ile-Ife, Nigeria. Fifty samples of whole blood were collected from the pregnant women, and twenty-five samples from non-pregnant women served as controls. Levels of essential and non-essential metals were determined by atomic absorption spectrophotometry. The analyses were performed to assess the metal body burden of the pregnant women and its health implications for them and their fetuses. Data analysis by descriptive and inferential statistics revealed that age, education and profession correlate with the levels of the metals. The mean levels of the two non-essential metals obtained in this study were lower than the recommended limits for whole blood, although the results were generally higher than those reported in studies of pregnant women elsewhere. Overall, the values obtained in this study indicate no serious metal body burden on the pregnant women. Establishing the factors that keep human exposure concentrations low is becoming critical in efforts to reduce exposures and hence the potential for adverse health effects. ©JASE

    Detection of metallo-betalactamases among Gram-negative bacterial isolates from Murtala Muhammad Specialist Hospital, Kano and Almadina Hospital, Kaduna, Nigeria

    Over the last few years, the increase in the number of multi-resistant (MR) enterobacteria has become a major clinical problem. This study investigated the occurrence and prevalence of metallo-betalactamase (MBL) production among clinical bacterial isolates from Murtala Muhammad Specialist Hospital, Kano and Al-Madina Specialist Hospital, Kaduna, Nigeria. A total of 200 clinical isolates, comprising E. coli (83), Klebsiella pneumoniae (52), Pseudomonas aeruginosa (28) and Proteus mirabilis (37), were screened phenotypically for carbapenemase in general and metallo-betalactamase in particular, using the Modified Hodge Test and the EDTA Disc Synergy Test respectively. The results showed that 67 (33.5%) of the isolates produced carbapenemase; high production occurred in 24 (35.8%) and low production in 43 (64.2%) of these. The highest prevalence of carbapenemase was found in Pseudomonas aeruginosa (38.55%), followed by E. coli (34.8%) and Proteus mirabilis (29.1%), with the lowest in Klebsiella pneumoniae (25.0%). The prevalence of MBLs in the study was 24.5%, with the highest prevalence in E. coli (31.32%), followed by Proteus mirabilis (21.6%) and Pseudomonas aeruginosa (21.2%), and the lowest in Klebsiella pneumoniae (14.3%). Most of the carbapenemase producers produced the MBL type. Urine samples showed the highest prevalence (38.3%) compared with ear swabs (12.0%). Prevalences of 67.9% and 76.9% were recorded for Murtala Muhammad Specialist Hospital, Kano and Al-Madina Hospital, Kaduna respectively. This shows that carbapenemase-mediated resistance occurred in the selected hospitals, and its uncontrolled spread may lead to treatment failure and frustration. Keywords: Metallo-betalactamase, Carbapenemase, Enterobacteriaceae, prevalence, Hospital
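    The prevalence figures quoted above are simple proportions of positive isolates over the number screened. A short sketch that reproduces the headline numbers from the counts given in the abstract:

```python
def prevalence(cases: int, total: int) -> float:
    """Prevalence expressed as a percentage of isolates screened."""
    return 100.0 * cases / total

# Counts from the abstract: 67 carbapenemase producers of 200 isolates
overall = prevalence(67, 200)        # 33.5 per cent
high_producers = prevalence(24, 67)  # 35.8 per cent of producers
low_producers = prevalence(43, 67)   # 64.2 per cent of producers
```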

    Metagenomic analysis of viruses associated with maize lethal necrosis in Kenya

    Background: Maize lethal necrosis is caused by a synergistic co-infection of Maize chlorotic mottle virus (MCMV) and a specific member of the Potyviridae, such as Sugarcane mosaic virus (SCMV), Wheat streak mosaic virus (WSMV) or Johnson grass mosaic virus (JGMV). Typical maize lethal necrosis symptoms include severe yellowing and leaf drying from the edges. In Kenya, we detected plants showing typical and atypical symptoms. Both groups of plants often tested negative for SCMV by ELISA. Methods: We used next-generation sequencing to identify viruses associated with maize lethal necrosis in Kenya through a metagenomic analysis. Symptomatic and asymptomatic leaf samples were collected from maize and sorghum in sixteen counties. Results: Complete and partial genomes were assembled for MCMV, SCMV, Maize streak virus (MSV) and Maize yellow dwarf virus-RMV (MYDV-RMV). These four viruses (MCMV, SCMV, MSV and MYDV-RMV) were found together in 30 of 68 samples. A geographic analysis showed that these viruses are widely distributed in Kenya. Phylogenetic analyses of nucleotide sequences showed that MCMV, MYDV-RMV and MSV are similar to isolates from East Africa and other parts of the world. Single nucleotide polymorphism, nucleotide and polyprotein sequence alignments identified three genetically distinct groups of SCMV in Kenya. Variation mapped to sequences at the border of NIb and the coat protein. Partial genome sequences were obtained for four other potyviruses and one polerovirus. Conclusion: Our results uncover the complexity of the maize lethal necrosis epidemic in Kenya. MCMV, SCMV, MSV and MYDV-RMV are widely distributed and infect both maize and sorghum. The SCMV population in Kenya is diverse and consists of numerous strains that are genetically distinct from isolates from other parts of the world. Several potyviruses, and possibly poleroviruses, are also involved.

    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to prospectively develop a pragmatic prognostic model to stratify patients according to their risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors into the model. Internal model validation was carried out by bootstrap validation. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
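    The c-statistic of 0·65 reported for the model is the probability that a randomly chosen patient who developed AKI received a higher predicted risk than a randomly chosen patient who did not, with ties counting half. A minimal pairwise sketch, using made-up risk scores rather than study data:

```python
from itertools import product

def c_statistic(event_scores, nonevent_scores):
    """Concordance (ROC AUC): fraction of event/non-event pairs in which
    the event case has the higher predicted risk; ties count half."""
    pairs = list(product(event_scores, nonevent_scores))
    concordant = sum(1.0 for e, n in pairs if e > n)
    ties = sum(0.5 for e, n in pairs if e == n)
    return (concordant + ties) / len(pairs)

# Hypothetical predicted AKI risks, not from the study
aki_risks = [0.30, 0.22, 0.18]
no_aki_risks = [0.10, 0.25, 0.15]
auc = c_statistic(aki_risks, no_aki_risks)
```

A bootstrap validation, as used in the study, would refit the model on resampled data and average this statistic to correct it for optimism.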

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common for emergency laparotomy than for elective surgery in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
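    The odds ratios above are adjusted estimates, but crude equivalents can be checked directly from the counts quoted in the abstract. A sketch of the unadjusted calculation for checklist use in middle-HDI versus high-HDI countries:

```python
def odds_ratio(used_a, total_a, used_b, total_b):
    """Crude odds ratio of checklist use in group A relative to group B."""
    odds_a = used_a / (total_a - used_a)
    odds_b = used_b / (total_b - used_b)
    return odds_a / odds_b

# Counts from the abstract: middle-HDI 753 of 1242, high-HDI 2455 of 2741
crude_or = odds_ratio(753, 1242, 2455, 2741)  # about 0.18, near the adjusted 0.17
```

The small gap between the crude value and the reported 0.17 reflects the adjustment for patient and disease factors in the multivariable model.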

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of the GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, differences that went beyond case mix alone.

    Toxic Metals Profiles in Hair Samples from Street Roaming Animals at Yelwa-Yauri Town, North Western Nigeria

    An assessment of toxic metal profiles in hair samples from street-roaming animals in Yelwa-Yauri town, North Western Nigeria, was carried out. Hair samples were collected from 108 animals (54 sheep and 54 goats) roaming the streets of Yelwa-Yauri town, with 45 samples from animals in the neighboring villages. Levels of heavy metals (Cd, Co, Cu, Fe, Pb and Zn) were analyzed using atomic absorption spectroscopy after microwave-induced acid digestion. The mean levels of Cd, Co, Cu, Fe, Pb and Zn were 2.09, 0.19, 43.01, 49.12, 2.05 and 54.50 µg/g in goats and 1.96, 2.30, 55.30, 61.03, 1.93 and 62.10 µg/g in sheep, respectively. Levels of essential heavy metals (Co, Cu, Fe and Zn) in sheep hair were higher than those in goats, while levels of non-essential heavy metals (Cd and Pb) were higher in goats' hair than in sheep. The Pearson correlation matrix for both goat and sheep hair reveals that Cu and Co (r = 0.030) and Fe and Cu (r = 0.011) had significant (p < 0.05) positive correlations. Lead and cadmium (r = 0.00) had zero correlation, while the remaining pairs either had significant (p < 0.05) negative correlations or were not significantly correlated, with the exception of Fe and Cu (r = 0.040), which had a significant (p < 0.05) positive correlation in sheep. Generally, levels of heavy metals in this study were lower than values reported in similar studies elsewhere, suggesting safe or non-toxic levels. Nevertheless, the higher levels of some toxic metals in the animals' hair may be cause for concern, as they may lead to disease in the animals, or in humans through the food chain.
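    The Pearson r values quoted above measure the linear association between paired metal levels and range from -1 to 1. A minimal sketch of the calculation, using hypothetical paired metal levels rather than study data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical Cu and Fe levels (ug/g) in five hair samples, not study data
cu = [43.0, 48.5, 41.2, 55.3, 50.1]
fe = [49.1, 57.0, 45.8, 61.0, 58.2]
r = pearson_r(cu, fe)
```

Significance testing at p < 0.05, as reported in the study, would additionally compare r against a critical value that depends on the sample size.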