Prevalence and Determinants of Sickness Absenteeism among Healthcare Workers in a Tertiary Hospital in Southwestern Nigeria
Introduction: Sickness absenteeism is a global problem that affects nearly all categories of workers, especially healthcare workers. This study assessed the prevalence and determinants of sickness absenteeism among healthcare workers in a tertiary hospital in Southwestern Nigeria.
Methods: An institution-based, cross-sectional study was conducted among 360 healthcare workers in a tertiary hospital in Southwestern Nigeria from October to December 2022. A pre-tested, interviewer-administered, semi-structured questionnaire was used to elicit information from the respondents, who were selected using a stratified sampling technique. Bivariate analysis and binary logistic regression were performed in SPSS version 25.0 to identify the predictors of sickness absenteeism. The significance of associations was determined at a p-value < 0.05.
Results: The mean ± SD age of the respondents was 34 ± 7.15 years. The prevalence of sickness absenteeism among the health workers was 21.0%, and the most common causes were malaria (51%), body pain (18%), and diarrhea (5%). Family obligation (AOR 2.1, 95% CI 1.20-3.53, P = 0.009) and job type (AOR 2.7, 95% CI 1.05-6.83, P = 0.038) were the only significant predictors of sickness absenteeism.
Conclusion: About one-fifth of the respondents had at least one spell of sickness absence from work due to illnesses such as malaria, diarrhea, and body pain. Stakeholders should institute preventive interventions based on the identified factors to reduce the prevalence of sickness absenteeism in this population.
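As an aside for readers who want to reproduce this kind of analysis, the sketch below shows how adjusted odds ratios (AORs) with 95% confidence intervals can be obtained from a binary logistic regression in Python. The study itself used SPSS 25.0; the file name and column names (absent, family_obligation, job_type, age) are hypothetical placeholders.

```python
# Illustrative sketch only (not the study's SPSS workflow): fitting a binary
# logistic regression and reporting AORs with 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("absenteeism_survey.csv")          # hypothetical survey data
X = sm.add_constant(df[["family_obligation", "job_type", "age"]])
model = sm.Logit(df["absent"], X).fit(disp=False)   # absent: 1 = any sickness absence

aor = np.exp(model.params)                          # adjusted odds ratios
ci = np.exp(model.conf_int())                       # 95% confidence intervals
summary = pd.DataFrame({"AOR": aor, "CI_lower": ci[0], "CI_upper": ci[1],
                        "p_value": model.pvalues})
print(summary.round(3))
```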
Adjunctive rifampicin for Staphylococcus aureus bacteraemia (ARREST): a multicentre, randomised, double-blind, placebo-controlled trial.
BACKGROUND: Staphylococcus aureus bacteraemia is a common cause of severe community-acquired and hospital-acquired infection worldwide. We tested the hypothesis that adjunctive rifampicin would reduce bacteriologically confirmed treatment failure or disease recurrence, or death, by enhancing early S aureus killing, sterilising infected foci and blood faster, and reducing the risks of dissemination and metastatic infection.
METHODS: In this multicentre, randomised, double-blind, placebo-controlled trial, adults (≥18 years) with S aureus bacteraemia who had received ≤96 h of active antibiotic therapy were recruited from 29 UK hospitals. Patients were randomly assigned (1:1) via a computer-generated sequential randomisation list to receive 2 weeks of adjunctive rifampicin (600 mg or 900 mg per day according to weight, oral or intravenous) or identical placebo, together with standard antibiotic therapy. Randomisation was stratified by centre. Patients, investigators, and those caring for the patients were masked to group allocation. The primary outcome was time to bacteriologically confirmed treatment failure or disease recurrence, or death (all-cause), from randomisation to 12 weeks, adjudicated by an independent review committee masked to treatment allocation. Analysis was by intention to treat. This trial was registered, number ISRCTN37666216, and is closed to new participants.
FINDINGS: Between Dec 10, 2012, and Oct 25, 2016, 758 eligible participants were randomly assigned: 370 to rifampicin and 388 to placebo. 485 (64%) participants had community-acquired S aureus infections, and 132 (17%) had nosocomial S aureus infections; 47 (6%) had meticillin-resistant infections. 301 (40%) participants had an initial deep infection focus. Standard antibiotics were given for a median of 29 (IQR 18-45) days; 619 (82%) participants received flucloxacillin. By week 12, 62 (17%) participants who received rifampicin versus 71 (18%) who received placebo had experienced treatment failure or disease recurrence, or died (absolute risk difference -1·4%, 95% CI -7·0 to 4·3; hazard ratio 0·96, 95% CI 0·68-1·35; p=0·81). From randomisation to 12 weeks, no evidence of a difference in serious (p=0·17) or grade 3-4 (p=0·36) adverse events was observed; however, 63 (17%) participants in the rifampicin group versus 39 (10%) in the placebo group had antibiotic or trial drug-modifying adverse events (p=0·004), and 24 (6%) versus six (2%) had drug interactions (p=0·0005).
INTERPRETATION: Adjunctive rifampicin provided no overall benefit over standard antibiotic therapy in adults with S aureus bacteraemia.
FUNDING: UK National Institute for Health Research Health Technology Assessment.
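For orientation, the short calculation below reproduces an unadjusted absolute risk difference and a normal-approximation 95% CI from the counts quoted above (62/370 versus 71/388). It is a back-of-the-envelope check, not the trial's adjudicated, adjusted analysis, so it only approximates the published estimate of -1.4% (-7.0 to 4.3).

```python
# Rough check (not the trial's analysis): unadjusted absolute risk difference
# for the primary outcome with a normal-approximation 95% CI.
from math import sqrt

f_rif, n_rif = 62, 370   # failures / participants, rifampicin group
f_pla, n_pla = 71, 388   # failures / participants, placebo group

p_rif, p_pla = f_rif / n_rif, f_pla / n_pla
rd = p_rif - p_pla                                   # risk difference
se = sqrt(p_rif * (1 - p_rif) / n_rif + p_pla * (1 - p_pla) / n_pla)
lo, hi = rd - 1.96 * se, rd + 1.96 * se

print(f"risk difference = {rd:+.1%} (95% CI {lo:+.1%} to {hi:+.1%})")
# ≈ -1.5% (95% CI -7.0% to +3.9%); the published -1.4% (-7.0 to 4.3) comes from
# the trial's adjusted analysis, so small differences are expected.
```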
Emergence and spread of two SARS-CoV-2 variants of interest in Nigeria.
Identifying the dissemination patterns and impact of a virus of economic or health importance during a pandemic is crucial, as it informs public containment policies that reduce the spread of the virus. In this study, we integrated genomic and travel data to investigate the emergence and spread of the SARS-CoV-2 B.1.1.318 and B.1.525 (Eta) variants of interest in Nigeria and the wider African region. By integrating travel data and phylogeographic reconstructions, we find that these two variants, which arose during the second wave in Nigeria, emerged from within Africa, with B.1.525 originating in Nigeria, and then spread to other parts of the world. Data from this study show how the regional connectivity of Nigeria drove the spread of these variants of interest to surrounding countries and to those connected by air traffic. Our findings demonstrate the power of genomic analysis combined with mobility and epidemiological data to identify the drivers of transmission, as bidirectional transmission within and between African nations is grossly underestimated, as seen in our import risk index estimates.
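For illustration only, the snippet below sketches one simple way an import risk index can be constructed by weighting air-travel volume from each source country by its reported incidence. This is a generic example with invented numbers and is not necessarily the index used in this study.

```python
# Generic illustration of a relative import risk index: expected infected
# travellers per source country, normalised to sum to 1. All figures are made up.
monthly_passengers = {"CountryA": 42_000, "CountryB": 15_500, "CountryC": 8_200}
cases_per_100k = {"CountryA": 12.0, "CountryB": 85.0, "CountryC": 40.0}

raw = {c: monthly_passengers[c] * cases_per_100k[c] / 100_000
       for c in monthly_passengers}                   # expected infected travellers
total = sum(raw.values())
import_risk = {c: v / total for c, v in raw.items()}  # relative index

for country, risk in sorted(import_risk.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {risk:.2f}")
```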
The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance
INTRODUCTION
Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.
RATIONALE
We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).
RESULTS
Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.
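As an illustration of the turnaround-time metric mentioned above, the sketch below summarises the days from sample collection to sequence submission per country from a sequence metadata table. The file and column names are hypothetical and do not correspond to any specific database schema.

```python
# Illustrative sketch: median sequencing turnaround time (collection to submission)
# per country from a hypothetical metadata file.
import pandas as pd

meta = pd.read_csv("sequence_metadata.csv",
                   parse_dates=["collection_date", "submission_date"])
meta["turnaround_days"] = (meta["submission_date"] - meta["collection_date"]).dt.days

summary = (meta.groupby("country")["turnaround_days"]
               .agg(["median", "count"])
               .sort_values("median"))
print(summary.head(10))   # countries with the shortest median turnaround
```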
CONCLUSION
Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.
Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries
Abstract
Background
Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres.
Methods
This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries.
Results
In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable by more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia.
Conclusion
This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.
A 5-year retrospective study of rampant dental caries among adult patients in a Nigerian Teaching Hospital
Background: Rampant caries in adults has not been the focus of much research, unlike the childhood form of the disease, so the condition is an interesting finding in an adult patient. When it occurs in children, it has been described as nursing bottle caries, baby bottle tooth decay, and, most recently, "early childhood caries".
Aim: The aim was to determine the prevalence of rampant caries among adult patients.
Materials and Methods: Cases of rampant caries were identified from the records of all patients treated during a 5-year period. Variables considered included socio-demographic data, frequency of consumption of a cariogenic diet, social habits, decayed, missing, and filled teeth (DMFT), socioeconomic status (SES), and oral hygiene (OH). Data were analyzed using Student's t-test and one-way ANOVA for continuous variables, while Fisher's exact test was used for categorical variables. The level of significance was set at P < 0.05.
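For readers unfamiliar with the categorical analysis mentioned here, the minimal sketch below runs a Fisher's exact test on a hypothetical 2x2 table (for example, low versus high SES against poor versus good OH). The counts are invented for illustration and are not the study's data.

```python
# Minimal sketch of a Fisher's exact test on a hypothetical 2x2 contingency table.
from scipy.stats import fisher_exact

table = [[9, 2],    # low SES:  poor OH, good OH (invented counts)
         [1, 5]]    # high SES: poor OH, good OH (invented counts)
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```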
Result: Less than 1% (21 of 3458) of the patients treated during the period had adult rampant caries, but only the 17 patients with complete data were analyzed. The age range of the patients was 22–61 years, with a median of 36 years. The number of teeth with open carious cavities ranged from 8 to 18, with a mean of 11.6 ± 3.3 teeth. Statistically significant associations were found between the number of open carious cavities and gender (P = 0.03) and between SES and OH (P = 0.001). Patients of low SES had the poorest OH, and the number of open carious lesions was higher in those who consumed refined sugar regularly.
Conclusion: The occurrence of rampant caries was low and was related to low socioeconomic status and regular consumption of a cariogenic diet.
Randomized clinical study comparing metallic and glass fiber post in restoration of endodontically treated teeth
Background: Post-retained crowns are indicated for endodontically treated teeth (ETT) with severely damaged coronal tissue. Metallic custom and prefabricated posts have been used over the years; however, owing to their unacceptable color, extreme rigidity, and susceptibility to corrosion, fiber posts, which are flexible, aesthetically pleasing, and have a modulus of elasticity comparable with dentin, were introduced.
Aim: To compare clinical performance of metallic and glass fiber posts in restoration of ETT.
Materials and Methods: Forty ETT requiring post-retained restorations were included. The teeth were randomly allocated into two groups: twenty were restored using a glass fiber-reinforced post (FRP) and twenty received a stainless steel parapost (PP), each in combination with a composite core build-up. Patients were reviewed at 1 and 6 months after post placement and cementation of a porcelain-fused-to-metal (PFM) crown. Data recorded included marginal gap, post retention, post fracture, root fracture, crown fracture, crown decementation, and loss of restoration. All teeth were assessed clinically and radiographically. Fisher's exact test was used for categorical variables, while the log-rank test was used for the survival analysis.
Results: One tooth in the PP group failed secondary to decementation of the PFM crown, giving an overall failure rate of 2.5%, while none in the FRP group failed. The survival rate was thus 100% for FRP and 97.5% for PP. This difference was, however, not statistically significant (log-rank test, P = 0.32).
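As a rough illustration of the survival comparison reported above, the sketch below runs a log-rank test on hypothetical follow-up data (one parapost failure among 40 restorations followed for 6 months). Under this assumed follow-up pattern it happens to reproduce a p-value close to the reported 0.32, but it is not the study's dataset.

```python
# Illustrative sketch only (hypothetical follow-up data, not the trial's records):
# comparing failure-free survival of two post types with a log-rank test.
import numpy as np
from lifelines.statistics import logrank_test

# Months of follow-up and failure indicators (1 = restoration failed, 0 = censored).
frp_months = np.full(20, 6.0)                   # all fiber posts intact at 6 months
frp_failed = np.zeros(20, dtype=int)
pp_months = np.append(np.full(19, 6.0), 3.0)    # one assumed parapost failure at 3 months
pp_failed = np.append(np.zeros(19, dtype=int), 1)

result = logrank_test(frp_months, pp_months,
                      event_observed_A=frp_failed, event_observed_B=pp_failed)
print(f"log-rank p = {result.p_value:.2f}")     # ≈ 0.32 under these assumptions
```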
Conclusion: Glass FRPs performed marginally better than the metallic posts in terms of short-term clinical performance, although the difference was not statistically significant.
An audit of post-retained crown restorations in a University teaching hospital, Nigeria: a ten-year review
Aim: To review the pattern of failure of post-retained restorations done over ten years and the associated factors.
Method: A retrospective cross-sectional study that audited the records of post-retained restorations. Data including biodata, tooth type, post type, post size, luting cement, and failure were extracted and analyzed. Statistical significance was set at a p-value ≤ 0.05.
Results: There were 210 participants (M = 106, F = 104). Stainless steel paraposts were mainly used (91.8%), with size 3 being the most frequently recorded (23.8%). Dual-cure composite was mainly used (78.1%) for post cementation. There were 27 (12.8%) cases of failure of post-retained restorations, of which post fracture combined with dislodgement of the post and crown was the most common (52%), and tooth fracture the least reported (14.8%). Post fracture alone was commoner in males (66.7%). The majority (81.5%) of the failures occurred in the parapost group, with no tooth fracture reported for the fibre posts.
Conclusion: Post and core placement is a common procedure for restoring endodontically treated teeth with reduced coronal structure, the main purpose being to retain the core and, ultimately, the restoration. Stainless steel posts were the most commonly used, and post fracture combined with dislodgement of the post and crown constituted the most prevalent failure reported.