Exploring the host factors affecting asymptomatic Plasmodium falciparum infection: insights from a rural Burkina Faso study.
BACKGROUND
Asymptomatic Plasmodium falciparum parasitaemia forms a reservoir for malaria transmission in West Africa. Certain haemoglobin variants are known to protect against severe malaria. However, data on the potential roles of haemoglobin variants and nongenetic factors in asymptomatic malaria infection are scarce and controversial. Therefore, this study investigated the associations of iron homeostasis, inflammation, nutrition, and haemoglobin mutations with parasitaemia in an asymptomatic cohort from a P. falciparum-endemic region during the high-transmission season.
METHODS
A sub-study population of 688 asymptomatic individuals (predominantly children and adolescents under 15 years, n = 516) from rural Burkina Faso previously recruited by the NOVAC trial (NCT03176719) between June and October 2017 was analysed. Parasitaemia was quantified with conventional haemocytometry. The haemoglobin genotype was determined by reverse hybridization assays targeting a selection of 21 HBA and 22 HBB mutations. Demographics, inflammatory markers (interleukins 6 and 10, hepcidin), nutritional status (mid upper-arm circumference and body mass index), and anaemia (total haemoglobin, ferritin, soluble transferrin receptor) were assessed as potential predictors through logistic regression.
RESULTS
Malaria parasites were detected in 56% of subjects. Parasitaemia was associated most strongly with malnutrition. The effect size increased with malnutrition severity (OR = 6.26, CI95: 2.45-19.4, p < 0.001). Furthermore, statistically significant associations (p < 0.05) with age, cytokines, hepcidin and heterozygous haemoglobin S were observed.
CONCLUSIONS
According to these findings, asymptomatic parasitaemia is attenuated by haemoglobin S, but not by any of the other detected genotypes. Aside from evidence of a slight iron imbalance, overall undernutrition was found to predict parasitaemia; further investigations are thus required to elucidate causality and inform intervention strategies.
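As an illustration of the analysis described in the Methods above, the sketch below fits a logistic regression of detected parasitaemia on the reported predictors using statsmodels. The file name and column names (parasitaemia, age, il6, il10, hepcidin, malnutrition_grade, hb_genotype) are hypothetical placeholders, not the study's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset; the NOVAC sub-study data are not distributed with this abstract.
df = pd.read_csv("novac_substudy.csv")

# Binary outcome: any P. falciparum parasitaemia detected by haemocytometry.
model = smf.logit(
    "parasitaemia ~ age + il6 + il10 + hepcidin "
    "+ C(malnutrition_grade) + C(hb_genotype, Treatment(reference='AA'))",
    data=df,
).fit()

# Odds ratios with 95% confidence intervals, analogous to the values reported in the Results.
summary = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_low": np.exp(model.conf_int()[0]),
    "CI_high": np.exp(model.conf_int()[1]),
    "p": model.pvalues,
})
print(summary.round(3))
```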
Evidence based post graduate training. A systematic review of reviews based on the WFME quality framework
BACKGROUND: A framework for high quality in post graduate training has been defined by the World Federation of Medical Education (WFME). The objective of this paper is to perform a systematic review of reviews to find current evidence regarding aspects of quality of post graduate training and to organise the results following the 9 areas of the WFME framework. METHODS: The systematic literature review was conducted in 2009 in the Medline Ovid, EMBASE, ERIC and RDRB databases from 1995 onward. The reviews were selected by two independent researchers and a quality appraisal was based on the SIGN tool. RESULTS: 31 reviews met the inclusion criteria. The majority of the reviews provided information about the training process (WFME area 2), the assessment of trainees (WFME area 3) and the trainees (WFME area 4). One review covered area 8, 'governance and administration'. No review was found in relation to the mission and outcomes, the evaluation of the training process and the continuous renewal (areas 1, 7 and 9 of the WFME framework, respectively). CONCLUSIONS: The majority of the reviews provided information about the training process, the assessment of trainees and the trainees. Indicators used for quality assessment of post graduate training should be based on this evidence, but further research is needed for some areas, in particular to assess the quality of the training process.
Is HIV Infection a Risk Factor for Multi-Drug Resistant Tuberculosis? A Systematic Review
BACKGROUND: Tuberculosis (TB) is an important cause of human suffering and death. Human immunodeficiency virus (HIV), multi-drug resistant TB (MDR-TB) and extensively drug-resistant tuberculosis (XDR-TB) have emerged as threats to TB control. The association between MDR-TB and HIV infection has not yet been fully investigated. We conducted a systematic review and meta-analysis to summarize the evidence on the association between HIV infection and MDR-TB. METHODS AND RESULTS: Original studies providing Mycobacterium tuberculosis resistance data stratified by HIV status were identified using MEDLINE and ISI Web of Science. Crude MDR-TB prevalence ratios were calculated and analyzed by type of TB (primary or acquired), region and study period. Heterogeneity across studies was assessed, and pooled prevalence ratios were generated where appropriate. No clear association was found between MDR-TB and HIV infection across time and geographic locations. MDR-TB prevalence ratios in the 32 eligible studies, comparing MDR-TB prevalence by HIV status, ranged from 0.21 to 41.45. Assessment by geographical region or study period did not reveal noticeable patterns. The summary prevalence ratios for acquired and primary MDR-TB were 1.17 (95% CI 0.86, 1.6) and 2.72 (95% CI 2.03, 3.66), respectively. Studies eligible for review were few considering the size of the epidemics. Most studies were not adjusted for confounders, and the heterogeneity across studies precluded the calculation of a meaningful overall summary measure. CONCLUSIONS: We could not demonstrate an overall association between MDR-TB and HIV or between acquired MDR-TB and HIV, but our results suggest that HIV infection is associated with primary MDR-TB. Future well-designed studies and surveillance in all regions of the world are needed to better clarify the relationship between HIV infection and MDR-TB.
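The abstract does not state which pooling model was used; as a rough illustration only, the sketch below computes crude MDR-TB prevalence ratios by HIV status and an inverse-variance (fixed-effect) pooled estimate on the log scale, using made-up counts for two hypothetical studies.

```python
import numpy as np
import pandas as pd

# Hypothetical per-study counts of MDR-TB cases and totals, stratified by HIV status;
# the real data sit in the 32 studies included in the review.
studies = pd.DataFrame({
    "mdr_hiv_pos": [12, 30], "n_hiv_pos": [150, 400],
    "mdr_hiv_neg": [20, 110], "n_hiv_neg": [600, 2500],
})

# Crude prevalence ratio per study (HIV-positive vs HIV-negative).
p_pos = studies["mdr_hiv_pos"] / studies["n_hiv_pos"]
p_neg = studies["mdr_hiv_neg"] / studies["n_hiv_neg"]
log_pr = np.log(p_pos / p_neg)

# Approximate variance of log(PR) and inverse-variance (fixed-effect) pooling.
var_log_pr = (1 / studies["mdr_hiv_pos"] - 1 / studies["n_hiv_pos"]
              + 1 / studies["mdr_hiv_neg"] - 1 / studies["n_hiv_neg"])
weights = 1 / var_log_pr
pooled_log_pr = (weights * log_pr).sum() / weights.sum()
se_pooled = np.sqrt(1 / weights.sum())
ci = np.exp(pooled_log_pr + np.array([-1.96, 1.96]) * se_pooled)
print(f"Pooled PR = {np.exp(pooled_log_pr):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```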
Increased erythroferrone levels in malarial anaemia.
We assessed the diagnostic potential of erythroferrone (ERFE) as a biomarker for iron homeostasis, comparing iron deficiency cases with anaemia of inflammation and controls. Dysregulation of the hepcidin axis was observed by Latour et al. in a mouse model of malarial anaemia induced by prolonged Plasmodium infection, leading to increased erythroferrone concentrations. In line with that, we found significantly higher erythroferrone levels in cases with malaria and anaemia in an African population compared to asymptomatic controls. Our findings therefore extend those of the mouse model, suggesting a dysregulation of the hepcidin axis in humans as well; this should be corroborated in prospective studies and may lay the basis for improved treatment strategies guided by ERFE concentrations in such patients.
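The statistical test behind the case-control comparison is not given in this summary; a minimal sketch of one plausible approach, a nonparametric Mann-Whitney U comparison of ERFE concentrations, is shown below using entirely hypothetical values.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical ERFE concentrations (ng/mL) in malarial-anaemia cases vs asymptomatic controls;
# invented for illustration, not study data.
cases = np.array([18.2, 25.1, 31.4, 22.8, 40.3, 27.6])
controls = np.array([5.1, 7.8, 6.4, 9.2, 4.3, 8.8])

# Two-sided nonparametric comparison of the two groups.
stat, p = mannwhitneyu(cases, controls, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4f}")
```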
Etiological and Pathogenic Factors in Congenital Diaphragmatic Hernia
Congenital diaphragmatic hernia (CDH) is a congenital anomaly associated with increased mortality and morbidity. In this article, we review the currently known etiological and pathogenic factors in CDH.
High genetic similarity between non-typhoidal Salmonella isolated from paired blood and stool samples of children in the Democratic Republic of the Congo
BACKGROUND: Non-typhoidal Salmonella (NTS) serotypes Typhimurium and Enteritidis are a major cause of bloodstream infections in children in sub-Saharan Africa but their reservoir is unknown. We compared pairs of NTS blood and stool isolates (with the same NTS serotype recovered in the same patient) for genetic similarity. METHODS: Between November 2013 and April 2017, hospital-admitted children (29 days to 14 years) with culture-confirmed NTS bloodstream infections were enrolled in a cross-sectional study at Kisantu Hospital, DR Congo. Stool cultures for Salmonella were performed on a subset of enrolled children, as well as on a control group of non-febrile hospital-admitted children. Pairs of blood and stool NTS isolates were assessed for genetic similarity by multiple-locus variable-number tandem-repeat analysis (MLVA) and genomic analysis. RESULTS: A total of 299 children with NTS grown from blood cultures (Typhimurium 68.6%, Enteritidis 30.4%, other NTS 1.0%) had a stool sample processed; in 105 (35.1%) of them NTS was detected (Typhimurium 70.5%, Enteritidis 25.7%, other NTS 3.8%). A total of 87/105 (82.9%) pairs of blood and stool NTS isolates were observed (representing 29.1% of the 299 children). Among 1598 controls, the proportion of NTS stool excretion was 2.1% (p < 0.0001). MLVA types among paired isolates were identical in 82/87 (94.3%) pairs (27.4% of the 299 children; 61/66 (92.4%) in Typhimurium and 21/21 (100%) in Enteritidis pairs). Genomic analysis confirmed high genetic similarity within 41/43 (95.3%) pairs, showing a median SNP difference of 1 (range 0-77) and 1 (range 0-4) for Typhimurium and Enteritidis pairs respectively. Typhimurium and Enteritidis isolates belonged to sequence types ST313 lineage II and ST11 respectively. CONCLUSION: Nearly 30% of children with NTS bloodstream infection showed stool excretion of an NTS isolate with high genetic similarity, adding to the evidence of humans as a potential reservoir for NTS.
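The study's actual genomics pipeline is not described in the abstract; the sketch below merely illustrates how a pairwise SNP distance between a blood and a stool isolate could be counted from a core-genome alignment, with hypothetical file and record names.

```python
# Illustrative SNP counting from an aligned core-genome FASTA; not the study's pipeline.
from Bio import SeqIO  # Biopython

# Hypothetical alignment file and isolate IDs.
records = {rec.id: str(rec.seq).upper() for rec in SeqIO.parse("core_alignment.fasta", "fasta")}
blood, stool = records["blood_isolate"], records["stool_isolate"]

# Count positions where both isolates carry an unambiguous base and the bases differ.
snp_distance = sum(
    1 for a, b in zip(blood, stool)
    if a != b and a in "ACGT" and b in "ACGT"
)
print(f"SNP distance: {snp_distance}")
```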
Value of echocardiography using knowledge-based reconstruction in determining right ventricular volumes in pulmonary sarcoidosis: comparison with cardiac magnetic resonance imaging
Right ventricular (RV) dysfunction in sarcoidosis is associated with adverse outcomes. Assessment of RV function by conventional transthoracic echocardiography (TTE) is challenging due to the complex RV geometry. Knowledge-based reconstruction (KBR) combines TTE measurements with three-dimensional coordinates to determine RV volumes. The aim of this study was to investigate the accuracy of TTE-KBR compared to the gold standard, cardiac magnetic resonance imaging (CMR), in determining RV dimensions in pulmonary sarcoidosis. Pulmonary sarcoidosis patients prospectively received same-day TTE and TTE-KBR. CMR performed within 90 days after TTE-KBR, when available, was used as the reference standard. Outcome parameters included RV end-diastolic volume (RVEDV), end-systolic volume (RVESV), stroke volume (RVSV) and ejection fraction (RVEF). In total, 281 patients underwent same-day TTE and TTE-KBR, of whom 122 received a CMR within 90 days of TTE and were included. TTE-KBR-measured RVEDV and RVESV showed strong correlations with CMR measurements (R = 0.73 and R = 0.76), while RVSV and RVEF correlated weakly (R = 0.46 and R = 0.46). Bland-Altman analyses (mean bias ± 95% limits of agreement) showed good agreement for RVEDV (ΔRVEDV(KBR−CMR): 5.67 ± 55.4 mL), while RVESV, RVSV and RVEF showed poor agreement (ΔRVESV(KBR−CMR): 21.6 ± 34.1 mL; ΔRVSV(KBR−CMR): −16.1 ± 42.9 mL; ΔRVEF(KBR−CMR): −12.9 ± 16.4%). Image quality and the time between CMR and TTE-KBR had no impact on the intermodality differences, and there was no sign of a learning curve. TTE-KBR is convenient and shows good agreement with CMR for RVEDV; however, agreement is poor for RVESV, RVSV and RVEF. TTE-KBR does not seem to provide additional value in the determination of RV dimensions in pulmonary sarcoidosis patients.
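For readers unfamiliar with the agreement statistics quoted above, the sketch below reproduces a Bland-Altman calculation (mean bias and 95% limits of agreement) and a Pearson correlation on a small set of invented paired RVEDV values; the study's raw measurements are not available here.

```python
import numpy as np

# Hypothetical paired RVEDV measurements (mL) from TTE-KBR and CMR; invented for illustration.
kbr = np.array([148.0, 120.0, 165.0, 132.0, 175.0, 110.0])
cmr = np.array([140.0, 118.0, 150.0, 135.0, 168.0, 108.0])

diff = kbr - cmr                 # KBR minus CMR, matching the Delta notation above
bias = diff.mean()               # mean bias
loa = 1.96 * diff.std(ddof=1)    # half-width of the 95% limits of agreement
r = np.corrcoef(kbr, cmr)[0, 1]  # Pearson correlation between the two modalities

print(f"bias = {bias:.1f} mL, 95% LoA = bias ± {loa:.1f} mL, R = {r:.2f}")
```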
Predictors of appropriate implantable cardiac defibrillator therapy in cardiac sarcoidosis
Background: Cardiac sarcoidosis (CS) is associated with an increased risk of sudden cardiac death. An implantable cardiac defibrillator (ICD) is recommended in a subgroup of CS patients; however, the recommendations for primary prevention differ between guidelines. The purpose of this study was to evaluate the efficacy and safety of ICDs in CS and to identify predictors of appropriate therapy. Methods: A retrospective cohort study was performed in CS patients with an ICD implanted between 2010 and 2019. The primary outcome was appropriate ICD therapy. Independent predictors were calculated using Cox proportional hazards analysis. Results: 105 patients were included. An ICD was implanted for primary prevention in 79%. During a median follow-up of 2.8 years, 34 patients (32.4%) received appropriate ICD therapy, of whom 24 (22.9%) received an appropriate shock. Three patients (2.9%) received an inappropriate shock due to atrial fibrillation. Independent predictors of appropriate therapy were prior ventricular arrhythmias (hazard ratio [HR]: 10.5 [95% confidence interval (CI): 5.0–21.9]) and right ventricular late gadolinium enhancement (LGE) (HR: 3.6 [95% CI: 1.7–7.6]). Within the primary prevention group, right ventricular LGE (HR: 5.7 [95% CI: 1.6–20.7]) was the only independent predictor of appropriate therapy. Left ventricular ejection fraction did not differ between patients with and without appropriate therapy (44.4% vs. 45.6%, p = 0.70). Conclusion: In CS patients with an ICD, a high rate of appropriate therapy and a low rate of inappropriate shocks were observed. Prior ventricular arrhythmias and right ventricular LGE were independent predictors of appropriate therapy.
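A minimal sketch of a Cox proportional-hazards screen of the kind described in the Methods above, using the lifelines package; the file and column names (time_to_therapy, appropriate_therapy, prior_va, rv_lge, lvef) are assumptions, not the study's variables.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort file with follow-up time, event indicator, and candidate predictors.
df = pd.read_csv("cs_icd_cohort.csv")

cph = CoxPHFitter()
cph.fit(
    df[["time_to_therapy", "appropriate_therapy", "prior_va", "rv_lge", "lvef"]],
    duration_col="time_to_therapy",
    event_col="appropriate_therapy",
)
cph.print_summary()  # hazard ratios (exp(coef)) with 95% confidence intervals
```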