
    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION: Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic.

    RATIONALE: We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs).

    RESULTS: Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets; as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection, each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, both between Africa and the rest of the world and within the continent itself. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world, with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants.

    CONCLUSION: Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can serve as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized, because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Analgesia nociception index as a tool to predict hypotension after spinal anaesthesia for elective caesarean section

    Arterial hypotension is the main disadvantage of spinal anaesthesia (SA) for caesarean delivery, with deleterious effects on maternal–foetal outcomes. Recently, a non-invasive device, the ‘analgesia nociception index’ (ANI), has been developed to evaluate the parasympathetic component of the autonomic nervous system. The aim of this study was to evaluate the ability of ANI to predict the risk of hypotension after SA for elective caesarean section. One hundred patients scheduled for elective caesarean delivery under SA were recruited in this prospective observational study. Haemodynamic and ANI parameters were recorded in the supine position (TB), in the sitting position (T0), after induction of SA (T1), and then every three minutes (T2, T3, Tn) until the end of surgery or until ephedrine was required. After SA, women were classified into two groups according to the occurrence of hypotension (group H, n = 80) or not (group C, n = 20). The variations of ANI between T2 and T0 were significantly higher in group H than in the control group. A decrease of 4.5 points in the instantaneous ANI value could predict maternal hypotension. ANI is a simple and effective tool for predicting the risk of SA-related hypotension.
    Impact statement
    What is already known on this subject? Arterial hypotension is the main disadvantage of spinal anaesthesia for caesarean delivery, with deleterious effects on maternal–foetal outcomes. The balance between the sympathetic and parasympathetic systems could be used to predict the onset of hypotension following spinal anaesthesia. The analgesia nociception index (ANI) is an index calculated from heart rate variability (HRV) analysis, designed originally to evaluate the antinociception/nociception balance.
    What do the results of this study add? We have shown that the analysis of HRV with ANI was a predictor of maternal hypotension after spinal anaesthesia.
    What are the implications of these findings for clinical practice and/or further research? ANI is an effective tool for predicting the risk of spinal anaesthesia-related hypotension. These findings are of potential clinical importance in the obstetrical anaesthesia setting. Further studies are required in order to implement this simple tool and optimise prophylactic measures, especially the use of vasopressors.

    A comparison between intravenous lidocaine and ketamine on acute and chronic pain after open nephrectomy: A prospective, double-blind, randomized, placebo-controlled study

    Background: Recently, there has been increasing interest in the use of analgesic adjuncts such as intravenous (IV) ketamine and lidocaine. Objectives: To compare the effects of perioperative IV lidocaine and ketamine on morphine requirements, pain scores, quality of recovery, and chronic pain after open nephrectomy. Study Design: A prospective, randomized, placebo-controlled, double-blind trial. Settings: The study was conducted in Charles Nicolle University Hospital of Tunis. Methods: Sixty patients were randomly allocated to receive IV lidocaine (bolus of 1.5 mg/kg at the induction of anesthesia followed by infusion of 1 mg/kg/h intraoperatively and for 24 h postoperatively), ketamine (bolus of 0.15 mg/kg followed by infusion of 0.1 mg/kg/h intraoperatively and for 24 h postoperatively), or an equal volume of saline (control group [CG]). Measurements: Morphine consumption, visual analog scale pain scores, time to the first passage of flatus and feces, postoperative nausea and vomiting (PONV), 6-min walk distance (6MWD) at discharge, and the incidence of chronic neuropathic pain using the “Neuropathic Pain Questionnaire” at 3 months. Results: Ketamine and lidocaine significantly reduced morphine consumption (by about 33% and 42%, respectively) and pain scores compared with the CG (P < 0.001). Lidocaine and ketamine also significantly improved bowel function in comparison to the CG (P < 0.001). Ketamine failed to reduce the incidence of PONV. The 6MWD increased significantly from a mean ± standard deviation of 27 ± 16.2 m in the CG to 82.3 ± 28 m in the lidocaine group (P < 0.001). Lidocaine, but not ketamine, significantly reduced the development of neuropathic pain at 3 months (P < 0.05). Conclusion: Ketamine and lidocaine are safe and effective adjuvants to decrease opioid consumption and control early pain. We also suggest that lidocaine infusion serves as an interesting alternative to improve functional walking capacity and prevent chronic neuropathic pain at 3 months after open nephrectomy.

    Efficacy and safety of Parecoxib for prevention of catheter-related bladder discomfort in patients undergoing transurethral resection of bladder tumor: Prospective randomised trial

    Background and Aims: Catheter-related bladder discomfort (CRBD) is the urge to void or discomfort in the suprapubic region secondary to an indwelling urinary catheter. We aimed to evaluate the safety and efficacy of a single dose of intravenous parecoxib in reducing the incidence and severity of CRBD in patients undergoing transurethral resection of bladder tumor (TURBT). Methods: Sixty-one adult patients, American Society of Anesthesiologists physical status I or II, undergoing elective TURBT under spinal anaesthesia, were randomly allocated to receive 40 mg of IV parecoxib (group P; n = 29) or an equal volume of normal saline (control group C; n = 32). CRBD was graded as none, mild, moderate, or severe. Between-group comparisons were made for the incidence and severity of CRBD, postoperative visual analog scale (VAS) pain scores, rescue analgesia requirements, and the occurrence of adverse events. Statistical analysis was done with the Mann–Whitney U test and Fisher's exact test. A P value ≤ 0.05 was considered statistically significant. Results: Parecoxib significantly reduced the incidence and severity of CRBD at 2, 4, 6, and 12 hours postoperatively compared to placebo (P < 0.05). Median pain VAS scores were lower in group P at all time points except the first hour. Rescue analgesia was given to more patients in group C (16/32, 50%) than in group P (1/29, 3.4%) (P < 0.001). None of the patients who received parecoxib experienced an adverse event. Conclusion: A single intravenous injection of parecoxib is safe and effective in decreasing the incidence and severity of CRBD in patients undergoing TURBT. Trial Registration Identifier: NCT02729935 (www.clinicaltrials.gov).

    Chemical Composition and Cytotoxic Activity of the Fractionated Trunk Bark Essential Oil from Tetraclinis articulata (Vahl) Mast. Growing in Tunisia

    The aim of the present research was to determine the chemical composition and the cytotoxic effects of Tetraclinis articulata trunk bark essential oil (HEE) obtained by steam distillation, and of five fractions obtained by normal-phase silica chromatographic separation. Chemical analysis allowed the identification of 54 known compounds. Relatively high amounts of oxygenated sesquiterpenes (44.4-70.2%) were detected, mainly consisting of caryophyllene oxide (13.1-26.6%), carotol (9.2-21.2%), 14-hydroxy-9-epi-(E)-caryophyllene (3.2-15.5%) and humulene epoxide II (2.6-7.2%). The cytotoxic activity of the essential oil and its fractions against a human mammary carcinoma cell line (MDA-MB-231) and a colorectal carcinoma cell line (SW620) was assessed. All the samples displayed moderate to weak activity compared to 5-fluorouracil. The colorectal carcinoma cell line was relatively more sensitive to the essential oil and its fractions than the breast cancer cell line, with IC50 values ranging from 25.7 to 96.5 µg/mL. In addition, the essential oil and its fraction E.2 showed cytotoxic activity against the colorectal carcinoma cell line, with IC50 values lower than 30 µg/mL. This is the first report on the chemical composition and cytotoxic activity of the trunk bark essential oil of T. articulata.

    High prevalence of gut microbiota colonization with broad-spectrum cephalosporin resistant Enterobacteriaceae in a Tunisian intensive care unit

    Healthcare-associated infections due to cefotaxime-resistant Enterobacteriaceae (CRE) have become a major public health threat, especially in intensive care units (ICUs). Although often acquired nosocomially, CRE can also be introduced initially by patients at admission. This study aimed to determine the prevalence and genetic characteristics of CRE intestinal carriage in ICU patients, to evaluate the rate of acquisition of these organisms during hospitalization, and to explore some of the associated risk factors for both carriage and acquisition. Between December 2014 and February 2015, the 63 patients admitted to the ICU of Charles Nicolle hospital were screened for rectal CRE colonization at admission and once weekly thereafter to identify acquisition. The CRE fecal carriage rate was 20.63% (13/63) at admission, and the acquisition rate was 42.85% (15/35). Overall, 35 CRE isolates were collected from 28 patients (25 Klebsiella pneumoniae, 7 Escherichia coli and 3 Enterobacter cloacae strains). Seven patients were simultaneously colonized with 2 CRE isolates. CTX-M-15 was detected in most of the CRE isolates (30/35, 88.23%). Three strains co-produced CMY-4, and 22 strains were carbapenem-resistant and co-produced a carbapenemase, OXA-48 (n=13) or NDM-1 (n=6). All isolates were multidrug resistant. Molecular typing of K. pneumoniae strains revealed 8 pulsed-field gel electrophoresis (PFGE) patterns and 4 sequence types (STs): ST101, ST147, ST429 and ST336. The E. coli isolates, however, were genetically unrelated and belonged to the A (n=2), B1 (n=2) and B2 (n=3) phylogenetic groups and to ST131 (2 strains), ST572 (2 strains), ST615 (one strain) and ST617 (one strain). Five colonized patients were infected by CRE (4 with the same strain identified from their rectal swab and 1 with a different strain). Whether imported or acquired during the ICU stay, colonization by CRE is a major risk factor for the occurrence of serious nosocomial infections. Systematic screening for CRE fecal carriage is mandatory to prevent the spread of these multidrug-resistant bacteria.