78 research outputs found

    Norovirus infections in young children in Lusaka Province, Zambia: clinical characteristics and molecular epidemiology

    Background: The burden, clinical features, and molecular epidemiology of norovirus infection in young children in southern Africa are not well defined. Methods: Using data from a health facility-based surveillance study of children <5 years in Lusaka Province, Zambia presenting with diarrhea, we assessed the burden of norovirus infection. A convenience sample of 454 stool specimens was tested for norovirus using reverse-transcriptase polymerase chain reaction (RT-PCR). RT-PCR-positive samples underwent additional nucleotide sequencing for genogroup and genotype identification. Clinical features and severity of diarrheal illnesses were compared between norovirus-positive and -negative subjects using Chi-squared and t-tests. Results: Norovirus was detected in 52/454 (11.5%) of specimens tested. Abdominal pain, fever, and vomiting were the most common presenting features in norovirus-associated illnesses. However, there were no significant differences in the clinical features of norovirus-positive compared to norovirus-negative illnesses. Of 43 isolates available for sequencing, 31 (72.1%) were genogroup II (GII) and 12 (27.9%) were genogroup I (GI). The distribution of genotypes was diverse. Conclusions: Noroviruses were detected in approximately 10% of young children with diarrhea in Lusaka Province, Zambia, with GII representing the majority of infections. These findings support the role of norovirus in symptomatic diarrheal disease in Africa. Further studies are needed to confirm these observations and to evaluate prevention strategies.
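The Chi-squared comparison mentioned in the Methods can be sketched for a single 2×2 table (symptom present/absent by norovirus status). This is a minimal illustration; the counts below are hypothetical, not taken from the study.

```python
# Pearson chi-squared statistic for a 2x2 contingency table,
# comparing a clinical feature between norovirus-positive and
# -negative children. Counts are illustrative, not study data.

def chi_squared_2x2(a, b, c, d):
    """Table layout: [[a, b], [c, d]] = [[pos+symptom, pos-no-symptom],
    [neg+symptom, neg-no-symptom]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical example: vomiting in 30/52 norovirus-positive vs
# 180/402 norovirus-negative children (total 454, as in the study).
stat = chi_squared_2x2(30, 22, 180, 222)
print(round(stat, 2))
```

The statistic would then be compared against the chi-squared distribution with 1 degree of freedom to obtain a p-value.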

    Community-facility linkage models and maternal and infant health outcomes in Malawi’s PMTCT/ART program: a cohort study

    Background: In sub-Saharan Africa, 3 community-facility linkage (CFL) models—Expert Clients, Community Health Workers (CHWs), and Mentor Mothers—have been widely implemented to support pregnant and breastfeeding women (PBFW) living with HIV and their infants to access and sustain care for prevention of mother-to-child transmission of HIV (PMTCT), yet their comparative impact under real-world conditions is poorly understood. Methods and findings: We sought to estimate the effects of CFL models on a primary outcome of maternal loss to follow-up (LTFU), and secondary outcomes of maternal longitudinal viral suppression and infant “poor outcome” (encompassing documented HIV-positive test result, LTFU, or death), in Malawi’s PMTCT/ART program. We sampled 30 of 42 high-volume health facilities (“sites”) in 5 Malawi districts for study inclusion. At each site, we reviewed medical records for all newly HIV-diagnosed PBFW entering the PMTCT program between July 1, 2016 and June 30, 2017, and, for pregnancies resulting in live births, their HIV-exposed infants, yielding 2,589 potentially eligible mother–infant pairs. Of these, 2,049 (79.1%) had an available HIV treatment record and formed the study cohort. A randomly selected subset of 817 (40.0%) cohort members underwent a field survey, consisting of a questionnaire and HIV biomarker assessment. Survey responses and biomarker results were used to impute CFL model exposure, maternal viral load, and early infant diagnosis (EID) outcomes for those missing these measures to enrich data in the larger cohort. We applied sampling weights in all statistical analyses to account for the differing proportions of facilities sampled by district. 
Of the 2,049 mother–infant pairs analyzed, 62.2% enrolled in PMTCT at a primary health center, at which time 43.7% of PBFW were ≀24 years old; 778 (38.0%) received the Expert Client model, 640 (31.2%) the CHW model, 345 (16.8%) the Mentor Mother model, 192 (9.4%) ≄2 models, and 94 (4.6%) no model. Maternal LTFU varied by model, with LTFU being more likely among Mentor Mother model recipients (adjusted hazard ratio [aHR]: 1.45; 95% confidence interval [CI]: 1.14, 1.84; p = 0.003) than among Expert Client recipients. Over 2 years from HIV diagnosis, PBFW supported by CHWs spent 14.3% (95% CI: 2.6%, 26.1%; p = 0.02) more days in an optimal state of antiretroviral therapy (ART) retention with viral suppression than women supported by Expert Clients. Infants receiving the Mentor Mother model (aHR: 1.24, 95% CI: 1.01, 1.52; p = 0.04) and ≄2 models (aHR: 1.44, 95% CI: 1.20, 1.74; p < 0.001) were more likely to undergo EID testing by age 6 months than infants supported by Expert Clients. Infants receiving the CHW and Mentor Mother models were 1.15 (95% CI: 0.80, 1.67; p = 0.44) and 0.84 (95% CI: 0.50, 1.42; p = 0.51) times as likely, respectively, to experience a poor outcome by 1 year of age as those supported by Expert Clients, though neither difference was statistically significant. Study limitations include possible residual confounding, which may lead to inaccurate conclusions about the impacts of CFL models, uncertain generalizability of findings to other settings, and missing infant medical record data that limited the precision of infant outcome measurement. Conclusions: In this descriptive study, we observed widespread reach of CFL models in Malawi, with favorable maternal outcomes in the CHW model and greater infant EID testing uptake in the Mentor Mother model. Our findings point to important differences in maternal and infant HIV outcomes by CFL model along the PMTCT continuum and suggest future opportunities to identify the key features of CFL models driving these outcome differences.
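The "days in an optimal state" outcome (ART retention with viral suppression over 2 years) can be illustrated with a toy calculation. The interval-based approach and the dates below are assumptions for illustration, not the study's actual algorithm.

```python
# Illustrative sketch: fraction of follow-up days spent in an
# "optimal state" (retained on ART and virally suppressed), computed
# from per-woman intervals. Dates and intervals are hypothetical.
from datetime import date

def percent_optimal_days(intervals, start, end):
    """intervals: list of (date_from, date_to) periods judged optimal.
    Returns the percentage of days in [start, end) covered by them."""
    total = (end - start).days
    covered = 0
    for a, b in intervals:
        lo, hi = max(a, start), min(b, end)
        if hi > lo:
            covered += (hi - lo).days
    return 100 * covered / total

# Hypothetical woman: optimal from enrolment to Jan 2017, a gap,
# then optimal again from Apr 2017 to the end of 2-year follow-up.
pct = percent_optimal_days(
    [(date(2016, 7, 1), date(2017, 1, 1)),
     (date(2017, 4, 1), date(2018, 7, 1))],
    date(2016, 7, 1), date(2018, 7, 1),
)
```

Per-woman percentages like this could then be compared between CFL models with a regression adjusting for confounders, as the study describes.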

    Investigating the physicochemical properties and pharmacokinetics of curcumin employing density functional theory and gastric protection

    The extraction, isolation, and theoretical investigation of the Curcuma xanthorrhiza (CXZ) molecule were carried out to ascertain the physicochemical properties of the compound. The plant extracts were isolated and characterized using NMR, FT-IR, and UV-Vis spectroscopy. Geometry optimization and theoretical analysis were performed within the framework of density functional theory (DFT) at the B3LYP/6-311++G(d,p) level of theory. Global reactivity descriptors were calculated at the same level of theory to assess the molecular stability and chemical reactivity of the investigated molecule. Stabilization studies were conducted to evaluate the stability of the complex; the results revealed that charge delocalization from sigma (σ) to anti-sigma (σ*) molecular orbitals contributed chiefly to the molecular stability of the studied compound. The calculated UV-Vis spectra showed that all absorption maxima occurred in the visible region (400–700 nm), in agreement with the experimental λmax. Excitation of CXZ was observed to arise from π→π* electronic transitions. Results from the topology and ADMET analyses indicate that the CXZ molecule exhibits good ADMET properties, suggesting its suitability as a potential plant-based drug.
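The quoted level of theory, B3LYP/6-311++G(d,p), maps directly onto a standard quantum-chemistry input. A minimal, hypothetical route section in Gaussian-style syntax is shown below; the abstract does not name the software package used, and the geometry block is a placeholder, so this is only an illustration of how that level of theory is typically requested.

```
%chk=cxz.chk
# B3LYP/6-311++G(d,p) Opt Freq

CXZ geometry optimization and frequency calculation (illustrative)

0 1
...atom symbols and Cartesian coordinates...
```

The `Opt Freq` keywords request a geometry optimization followed by a harmonic frequency calculation, which also confirms that the optimized structure is a true minimum.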

    Effectiveness and tolerability of Perindopril plus Amlodipine single pill combination in Nigeria: The 13 City Hypertension Study

    Background: No large-scale study has demonstrated the efficacy of single pill combination (SPC) antihypertensive medications in a black African population. We therefore evaluated the blood pressure (BP) lowering efficacy and tolerability of a Perindopril plus Amlodipine SPC in black African patients. Methods: This was a multi-centre, prospective, observational programme among hypertensive patients using different doses of Perindopril and Amlodipine. The primary endpoint was the change in mean sitting systolic and diastolic BP from baseline to 3 months. Results: 937 patients (55.7% female) were analysed; the mean age was 56.4 ± 12.7 years. Systolic and diastolic BP were significantly reduced by 17.3/9.4 mmHg, 21.1/10.8 mmHg, and 24.6/12.7 mmHg at 4, 8, and 12 weeks respectively compared to baseline (p<0.0001). Dry cough was seen in 0.64% and angioedema in 0.1% of patients. Conclusions: The Perindopril plus Amlodipine SPC provided clinically meaningful BP reductions and was well tolerated in a black African population. SAHeart 2022;19:6-1
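The primary endpoint above is a simple mean change from baseline in sitting systolic/diastolic BP. A minimal sketch with made-up paired readings (not study data):

```python
# Mean sitting BP reduction from baseline to follow-up.
# Each tuple is (systolic, diastolic) in mmHg; values are hypothetical.
from statistics import mean

baseline = [(160, 98), (154, 95), (170, 102)]
week12   = [(136, 86), (132, 84), (144, 88)]

delta_sys = mean(b[0] - f[0] for b, f in zip(baseline, week12))
delta_dia = mean(b[1] - f[1] for b, f in zip(baseline, week12))
print(f"mean reduction: {delta_sys:.1f}/{delta_dia:.1f} mmHg")
```

In the study itself, the significance of such changes was assessed against baseline (p<0.0001); a paired test on the per-patient differences is the usual approach.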

    Evasion of MAIT cell recognition by the African Salmonella Typhimurium ST313 pathovar that causes invasive disease

    Mucosal-associated invariant T (MAIT) cells are innate T lymphocytes activated by bacteria that produce vitamin B2 metabolites. Mouse models of infection have demonstrated a role for MAIT cells in antimicrobial defense. However, proposed protective roles of MAIT cells in human infections remain unproven, and clinical conditions associated with selective absence of MAIT cells have not been identified. We report that typhoidal and nontyphoidal Salmonella enterica strains activate MAIT cells. However, S. Typhimurium sequence type 313 (ST313) lineage 2 strains, which are responsible for the burden of multidrug-resistant nontyphoidal invasive disease in Africa, escape MAIT cell recognition through overexpression of ribB. This bacterial gene encodes 3,4-dihydroxy-2-butanone-4-phosphate synthase, an enzyme of the riboflavin biosynthetic pathway. The MAIT cell-specific phenotype did not extend to other innate lymphocytes. We propose that ribB overexpression is an evolved trait that facilitates evasion from immune recognition by MAIT cells and contributes to the invasive pathogenesis of S. Typhimurium ST313 lineage 2.

    Comparative validation of machine learning algorithms for surgical workflow and skill analysis with the HeiChole benchmark

    Purpose: Surgical workflow and skill analysis are key technologies for the next generation of cognitive surgical assistance systems. These systems could increase the safety of the operation through context-sensitive warnings and semi-autonomous robotic assistance, or improve training of surgeons via data-driven feedback. In surgical workflow analysis, up to 91% average precision has been reported for phase recognition on an open, single-center video dataset. In this work, we investigated the generalizability of phase recognition algorithms in a multicenter setting, including more difficult recognition tasks such as surgical action and surgical skill. Methods: To achieve this goal, a dataset of 33 laparoscopic cholecystectomy videos from three surgical centers, with a total operation time of 22 h, was created. Labels included framewise annotation of seven surgical phases with 250 phase transitions, 5514 occurrences of four surgical actions, 6980 occurrences of 21 surgical instruments from seven instrument categories, and 495 skill classifications in five skill dimensions. The dataset was used in the 2019 international Endoscopic Vision challenge sub-challenge for surgical workflow and skill analysis, in which 12 research teams trained and submitted machine learning algorithms for recognition of phase, action, instrument, and/or skill assessment. Results: F1-scores were achieved for phase recognition between 23.9% and 67.7% (n = 9 teams) and for instrument presence detection between 38.5% and 63.8% (n = 8 teams), but for action recognition only between 21.8% and 23.3% (n = 5 teams). The average absolute error for skill assessment was 0.78 (n = 1 team). Conclusion: Surgical workflow and skill analysis are promising technologies to support the surgical team, but there is still room for improvement, as shown by our comparison of machine learning algorithms. This novel HeiChole benchmark can be used for comparable evaluation and validation of future work. In future studies, it is of utmost importance to create more open, high-quality datasets to allow the development of artificial intelligence and cognitive robotics in surgery.
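The framewise F1-score used for phase recognition can be sketched as the per-phase harmonic mean of precision and recall, macro-averaged over phases. The exact averaging scheme of the challenge is an assumption here, and the label sequences are illustrative.

```python
# Framewise F1 for surgical phase recognition: each video frame has a
# ground-truth phase label and a predicted one. Sequences are toy data.

def f1_per_phase(truth, pred, phase):
    """Harmonic mean of precision and recall for one phase label."""
    tp = sum(t == phase and p == phase for t, p in zip(truth, pred))
    fp = sum(t != phase and p == phase for t, p in zip(truth, pred))
    fn = sum(t == phase and p != phase for t, p in zip(truth, pred))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Seven frames, three phases (0, 1, 2); prediction lags truth slightly.
truth = [0, 0, 1, 1, 1, 2, 2]
pred  = [0, 1, 1, 1, 2, 2, 2]
macro_f1 = sum(f1_per_phase(truth, pred, k) for k in (0, 1, 2)) / 3
```

Macro-averaging weights each phase equally, so short phases (e.g. clipping) count as much as long ones (e.g. dissection), which is why it is a common choice for imbalanced surgical phase distributions.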

    Uganda's experience in Ebola virus disease outbreak preparedness, 2018-2019.

    BACKGROUND: Since the declaration of the 10th Ebola Virus Disease (EVD) outbreak in DRC on 1 August 2018, several neighboring countries have been developing and implementing preparedness efforts to prevent cross-border transmission of EVD and to enable timely detection, investigation, and response in the event of a confirmed EVD outbreak in the country. We describe Uganda's experience in EVD preparedness. RESULTS: On 4 August 2018, the Uganda Ministry of Health (MoH) activated the Public Health Emergency Operations Centre (PHEOC) and the National Task Force (NTF) for public health emergencies to plan, guide, and coordinate EVD preparedness in the country. The NTF selected an Incident Management Team (IMT), constituting a National Rapid Response Team (NRRT), that supported activation of the District Task Forces (DTFs) and District Rapid Response Teams (DRRTs), which jointly assessed levels of preparedness in 30 designated high-risk districts representing category 1 (20 districts) and category 2 (10 districts). The MoH, with technical guidance from the World Health Organisation (WHO), led EVD preparedness activities and worked together with other ministries and partner organisations to enhance community-based surveillance systems, develop and disseminate risk communication messages, engage communities, reinforce EVD screening and infection prevention measures at Points of Entry (PoEs) and in high-risk health facilities, construct and equip EVD isolation and treatment units, and establish coordination and procurement mechanisms. CONCLUSION: As of 31 May 2019, there was no confirmed case of EVD in Uganda, which has continued to make significant and verifiable progress in EVD preparedness. There is a need to sustain these efforts, not only in EVD preparedness but across the entire spectrum of a multi-hazard framework. These efforts strengthen country capacity and compel the country to commit resources for preparedness and management of incidents at the source, while effectively cutting the costs of a "fire-fighting" approach during public health emergencies.

    Procalcitonin Is Not a Reliable Biomarker of Bacterial Coinfection in People With Coronavirus Disease 2019 Undergoing Microbiological Investigation at the Time of Hospital Admission

    Admission procalcitonin measurements and microbiology results were available for 1040 hospitalized adults with coronavirus disease 2019 (from 48 902 included in the International Severe Acute Respiratory and Emerging Infections Consortium World Health Organization Clinical Characterisation Protocol UK study). Although procalcitonin was higher in bacterial coinfection, the difference was neither clinically significant (median [IQR], 0.33 [0.11–1.70] ng/mL vs 0.24 [0.10–0.90] ng/mL) nor diagnostically useful (area under the receiver operating characteristic curve, 0.56 [95% confidence interval, .51–.60]).
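The reported AUROC of 0.56 has a useful probabilistic reading: it is the chance that a randomly chosen coinfected patient has a higher procalcitonin than a randomly chosen non-coinfected one (an AUROC of 0.5 is no better than chance). A minimal sketch of that equivalence, with illustrative values rather than study data:

```python
# AUROC via the Mann-Whitney interpretation: the probability that a
# positive case scores above a negative case, counting ties as half.
# Procalcitonin values (ng/mL) below are hypothetical, not study data.

def auroc(positives, negatives):
    wins = sum((p > n) + 0.5 * (p == n)
               for p in positives for n in negatives)
    return wins / (len(positives) * len(negatives))

pct_coinfected = [0.33, 1.70, 0.11, 0.90]   # bacterial coinfection
pct_no_coinf   = [0.24, 0.10, 0.90, 0.15]   # no coinfection
auc = auroc(pct_coinfected, pct_no_coinf)
```

For large cohorts an optimized routine (e.g. a library ROC implementation) would be used, but the pairwise definition above is what the number means.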

    Implementation of corticosteroids in treating COVID-19 in the ISARIC WHO Clinical Characterisation Protocol UK: prospective observational cohort study

    BACKGROUND: Dexamethasone was the first intervention proven to reduce mortality in patients with COVID-19 being treated in hospital. We aimed to evaluate the adoption of corticosteroids in the treatment of COVID-19 in the UK after the RECOVERY trial publication on June 16, 2020, and to identify discrepancies in care. METHODS: We did an audit of clinical implementation of corticosteroids in a prospective, observational, cohort study in 237 UK acute care hospitals between March 16, 2020, and April 14, 2021, restricted to patients aged 18 years or older with proven or high likelihood of COVID-19 who received supplementary oxygen. The primary outcome was administration of dexamethasone, prednisolone, hydrocortisone, or methylprednisolone. This study is registered with ISRCTN, ISRCTN66726260. FINDINGS: Between June 17, 2020, and April 14, 2021, 47 795 (75·2%) of 63 525 patients on supplementary oxygen received corticosteroids, a proportion higher among patients requiring critical care than in those who received ward care (11 185 [86·6%] of 12 909 vs 36 415 [72·4%] of 50 278). Patients 50 years or older were significantly less likely to receive corticosteroids than those younger than 50 years (adjusted odds ratio 0·79 [95% CI 0·70–0·89], p=0·0001, for 70–79 years; 0·52 [0·46–0·58], p<0·0001, for >80 years), independent of patient demographics and illness severity. 84 (54·2%) of 155 pregnant women received corticosteroids. Rates of corticosteroid administration increased from 27·5% in the week before June 16, 2020, to 75–80% in January, 2021. INTERPRETATION: Implementation of corticosteroids into clinical practice in the UK for patients with COVID-19 has been successful, but not universal. Patients older than 70 years, independent of illness severity, chronic neurological disease, and dementia, were less likely to receive corticosteroids than those who were younger, as were pregnant women. This could reflect appropriate clinical decision making, but the possibility of inequitable access to life-saving care should be considered. FUNDING: UK National Institute for Health Research and UK Medical Research Council.
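The odds ratios in the Findings are adjusted estimates from regression modelling. As a simpler illustration of the arithmetic behind an odds ratio, the crude (unadjusted) odds of corticosteroid receipt in critical care versus ward care can be computed directly from the counts reported above; the Woolf confidence-interval formula used here is a standard textbook method, not necessarily the paper's.

```python
# Crude odds ratio with Woolf 95% CI for a 2x2 table:
# [[exposed & outcome, exposed & no outcome],
#  [unexposed & outcome, unexposed & no outcome]].
from math import log, sqrt, exp

def odds_ratio_ci(a, b, c, d):
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = exp(log(or_) - 1.96 * se)
    hi = exp(log(or_) + 1.96 * se)
    return or_, lo, hi

# Counts from the abstract: critical care 11 185 of 12 909 received
# corticosteroids; ward care 36 415 of 50 278.
or_, lo, hi = odds_ratio_ci(11185, 12909 - 11185, 36415, 50278 - 36415)
```

This crude ratio ignores confounders such as age and illness severity, which is precisely why the study reports adjusted odds ratios instead.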

    Haematological consequences of acute uncomplicated falciparum malaria: a WorldWide Antimalarial Resistance Network pooled analysis of individual patient data

    Background: Plasmodium falciparum malaria is associated with anaemia-related morbidity, attributable to host, parasite, and drug factors. We quantified the haematological response following treatment of uncomplicated P. falciparum malaria to identify the factors associated with malarial anaemia. Methods: Individual patient data from eligible antimalarial efficacy studies of uncomplicated P. falciparum malaria, available through the WorldWide Antimalarial Resistance Network data repository prior to August 2015, were pooled using standardised methodology. The haematological response over time was quantified using a multivariable linear mixed effects model with nonlinear terms for time, and the model was then used to estimate the mean haemoglobin at the day of nadir and at day 7. Multivariable logistic regression quantified risk factors for moderately severe anaemia (haemoglobin < 7 g/dL) at day 0, day 3, and day 7, as well as a fractional fall ≄ 25% at day 3 and day 7. Results: A total of 70,226 patients, recruited into 200 studies between 1991 and 2013, were included in the analysis: 50,859 (72.4%) enrolled in Africa, 18,451 (26.3%) in Asia, and 916 (1.3%) in South America. The median haemoglobin concentration at presentation was 9.9 g/dL (range 5.0–19.7 g/dL) in Africa, 11.6 g/dL (range 5.0–20.0 g/dL) in Asia, and 12.3 g/dL (range 6.9–17.9 g/dL) in South America. Moderately severe anaemia (Hb < 7 g/dL) was present in 8.4% (4,284/50,859) of patients from Africa, 3.3% (606/18,451) from Asia, and 0.1% (1/916) from South America. The nadir haemoglobin occurred on day 2 post-treatment, with a mean fall from baseline of 0.57 g/dL in Africa and 1.13 g/dL in Asia. Independent risk factors for moderately severe anaemia on day 7, in both Africa and Asia, included moderately severe anaemia at baseline (adjusted odds ratio (AOR) = 16.10 and 23.00, respectively), young age (age < 1 year compared to ≄ 12 years: AOR = 12.81 and 6.79, respectively), high parasitaemia (AOR = 1.78 and 1.58, respectively), and delayed parasite clearance (AOR = 2.44 and 2.59, respectively). In Asia, patients treated with an artemisinin-based regimen were at significantly greater risk of moderately severe anaemia on day 7 than those treated with a non-artemisinin-based regimen (AOR = 2.06 [95% CI 1.39–3.05], p < 0.001). Conclusions: In patients with uncomplicated P. falciparum malaria, the nadir haemoglobin occurs 2 days after starting treatment. Although artemisinin-based treatments increase the rate of parasite clearance, in Asia they are associated with a greater risk of anaemia during recovery.
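The "fractional fall ≄ 25%" endpoint used in the Methods is a simple relative change in haemoglobin from day 0. A minimal sketch with hypothetical values (not patient data):

```python
# Fractional fall in haemoglobin from baseline, and the two endpoint
# flags used in the analysis. Values are illustrative, not study data.

def fractional_fall(hb_day0, hb_later):
    """Relative drop from baseline; 0.25 means a 25% fall."""
    return (hb_day0 - hb_later) / hb_day0

hb0, hb7 = 9.9, 7.2   # g/dL: baseline and day-7 haemoglobin
fall = fractional_fall(hb0, hb7)
meets_fall_endpoint = fall >= 0.25         # fractional fall >= 25%
moderately_severe = hb7 < 7.0              # Hb < 7 g/dL threshold
```

In the pooled analysis, flags like these per patient per time point feed the multivariable logistic regressions that produced the adjusted odds ratios above.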