28 research outputs found

    Insights from quantitative analysis and mathematical modelling on the proposed WHO 2030 goals for soil-transmitted helminths [version 1; peer review: 2 approved]

    Soil-transmitted helminths (STHs) are a group of parasitic worms that infect humans, causing a wide spectrum of disease, notably anaemia, growth retardation, and delayed cognitive development. The three main STHs are Ascaris lumbricoides, Trichuris trichiura and hookworm (Necator americanus and Ancylostoma duodenale). Approximately 1.5 billion people are infected with STHs worldwide. The World Health Organization goal for 2030 is morbidity control, defined as reaching <2% prevalence of medium-to-high-intensity infections in preschool-age children and school-age children (SAC). Treatment guidelines for achieving this goal have been recommended. The Neglected Tropical Diseases Modelling Consortium has developed mathematical and statistical models to quantify, predict, and evaluate the impact of control measures on STHs. These models show that the morbidity target can be achieved following current guidelines in moderate-prevalence settings (20–50% in SAC). In high-prevalence settings, semi-annual preventive chemotherapy (PC), ideally including adults or at least women of reproductive age, is required. For T. trichiura, dual therapy with albendazole and ivermectin is required. In general, stopping PC is not possible without infection resurgence, unless effective measures for improved access to water, hygiene, and sanitation have been implemented, or elimination of transmission has been achieved. Current diagnostic methods are based on egg counts in stool samples, but these are known to have poor sensitivity at low prevalence levels. A target threshold for novel, more sensitive diagnostics should be defined relative to the currently preferred diagnostic (Kato-Katz). Our analyses identify the extent of systematic non-access to treatment and the individual patterns of compliance over multiple rounds of treatment as the biggest unknowns and the main impediments to reaching the target. Moreover, the link between morbidity and infection intensity has not been fully elucidated. By providing more insights on all the above, we aim to inform discussions on the goals and treatment guidelines for STHs.
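
    The resurgence dynamics described above (infection bouncing back once preventive chemotherapy stops or between rounds) can be illustrated with a deliberately simple worm-burden model. The sketch below is only a toy illustration under assumed parameter values (R0, worm lifespan, coverage, efficacy, carrying capacity); it is not one of the NTD Modelling Consortium models referred to in the abstract.

# Toy bounce-back model for STH preventive chemotherapy (PC); all parameter values
# below are assumptions for illustration, not estimates from the abstract's models.
R0 = 2.5                      # assumed basic reproduction number
WORM_LIFESPAN_YEARS = 1.0     # assumed adult worm lifespan
COVERAGE = 0.75               # assumed fraction of the target group treated per round
EFFICACY = 0.95               # assumed fraction of worms cleared in treated hosts

def simulate(years=10.0, rounds_per_year=1, m0=10.0, dt=0.01):
    """Mean worm burden under logistic reinfection with pulsed PC rounds."""
    carrying_capacity = 50.0                   # assumed upper bound on mean burden
    growth = (R0 - 1.0) / WORM_LIFESPAN_YEARS
    m, t, next_round = m0, 0.0, 0.0
    while t < years:
        if t >= next_round:                    # a PC round removes coverage * efficacy of worms
            m *= 1.0 - COVERAGE * EFFICACY
            next_round += 1.0 / rounds_per_year
        m += dt * growth * m * (1.0 - m / carrying_capacity)   # reinfection between rounds
        t += dt
    return m

print("mean burden after 10 years, annual PC:     ", round(simulate(rounds_per_year=1), 2))
print("mean burden after 10 years, semi-annual PC:", round(simulate(rounds_per_year=2), 2))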

    Common variants in CLDN2 and MORC4 genes confer disease susceptibility in patients with chronic pancreatitis

    A recent genome-wide association study (GWAS) identified associations of variants at the X-linked CLDN2 and MORC4 locus and at the PRSS1-PRSS2 locus with chronic pancreatitis (CP) in North American patients of European ancestry. We selected 9 variants from the reported GWAS and replicated the association with CP in Indian patients by genotyping 1807 unrelated Indians of Indo-European ethnicity, including 519 patients with CP and 1288 controls. The etiology of CP was idiopathic in 83.62% and alcoholic in 16.38% of the 519 patients. Our study confirmed a significant association of two variants in the CLDN2 gene (rs4409525: OR 1.71, P = 1.38 × 10⁻⁹; rs12008279: OR 1.56, P = 1.53 × 10⁻⁴) and two variants in the MORC4 gene (rs12688220: OR 1.72, P = 9.20 × 10⁻⁹; rs6622126: OR 1.75, P = 4.04 × 10⁻⁵) in Indian patients with CP. We also found significant associations at the PRSS1-PRSS2 locus (OR 0.60, P = 9.92 × 10⁻⁶) and at SAMD12-TNFRSF11B (OR 0.49, 95% CI 0.31–0.78, P = 0.0027). A variant in the MORC4 gene (rs12688220) showed a significant interaction with alcohol (OR for the homozygous and heterozygous risk allele 14.62 and 1.51, respectively; P = 0.0068), suggesting a gene-environment interaction. A combined analysis of CLDN2 and MORC4 based on an effective risk allele score revealed a higher percentage of individuals homozygous for the risk alleles among CP cases, with a 5.09-fold increased risk in individuals carrying 7 or more effective risk alleles compared with those carrying 3 or fewer (P = 1.88 × 10⁻¹⁴). Genetic variants in the CLDN2 and MORC4 genes were thus associated with CP in Indian patients.
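
    As a side note on the statistics reported above, an odds ratio and its confidence interval for a risk-allele-score comparison (such as 7 or more versus 3 or fewer effective risk alleles) can be derived from a simple 2x2 table of cases and controls. The sketch below uses invented counts purely for illustration; it does not reproduce the study's data or its exact analysis.

# Odds ratio with a Woolf (log) 95% confidence interval from a 2x2 table.
# The counts below are hypothetical placeholders, not the study's data.
import math

def odds_ratio_ci(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls, z=1.96):
    or_value = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
    se_log_or = math.sqrt(1 / exposed_cases + 1 / exposed_controls
                          + 1 / unexposed_cases + 1 / unexposed_controls)
    lower = math.exp(math.log(or_value) - z * se_log_or)
    upper = math.exp(math.log(or_value) + z * se_log_or)
    return or_value, lower, upper

# Hypothetical counts: "exposed" = carrying 7 or more effective risk alleles,
# "unexposed" = carrying 3 or fewer.
or_value, lower, upper = odds_ratio_ci(exposed_cases=60, exposed_controls=40,
                                       unexposed_cases=80, unexposed_controls=260)
print(f"OR = {or_value:.2f}, 95% CI [{lower:.2f}, {upper:.2f}]")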

    Effects of antibiotic resistance, drug target attainment, bacterial pathogenicity and virulence, and antibiotic access and affordability on outcomes in neonatal sepsis: an international microbiology and drug evaluation prospective substudy (BARNARDS)

    Background Sepsis is a major contributor to neonatal mortality, particularly in low-income and middle-income countries (LMICs). WHO advocates ampicillin–gentamicin as first-line therapy for the management of neonatal sepsis. In the BARNARDS observational cohort study of neonatal sepsis and antimicrobial resistance in LMICs, common sepsis pathogens were characterised via whole genome sequencing (WGS), and antimicrobial resistance profiles were determined. In this substudy of BARNARDS, we aimed to assess the use and efficacy of empirical antibiotic therapies commonly used in LMICs for neonatal sepsis. Methods In BARNARDS, consenting mother–neonate dyads were enrolled at delivery or on neonatal presentation with suspected sepsis (neonates aged 0–60 days) at 12 BARNARDS clinical sites in Bangladesh, Ethiopia, India, Pakistan, Nigeria, Rwanda, and South Africa. Stillborn babies were excluded from the study. Blood samples were collected from neonates presenting with clinical signs of sepsis, and WGS and minimum inhibitory concentrations for antibiotic treatment were determined for bacterial isolates from culture-confirmed sepsis. Neonatal outcome data were collected following enrolment until 60 days of life. Antibiotic usage and neonatal outcome data were assessed. Survival analyses were adjusted for potential clinical confounding variables related to the birth and the pathogen. Additionally, resistance profiles, pharmacokinetic–pharmacodynamic probability of target attainment, and frequency of resistance (ie, resistance defined by in-vitro growth of isolates when challenged by antibiotics) were assessed. Questionnaires on health structures and antibiotic costs evaluated accessibility and affordability. Findings Between Nov 12, 2015, and Feb 1, 2018, 36 285 neonates were enrolled into the main BARNARDS study, of whom 9874 had clinically diagnosed sepsis and 5749 had available antibiotic data. The four most commonly prescribed antibiotic combinations, given to 4451 (77·42%) of 5749 neonates, were ampicillin–gentamicin, ceftazidime–amikacin, piperacillin–tazobactam–amikacin, and amoxicillin clavulanate–amikacin. This substudy assessed 476 prescriptions for 442 neonates treated with one of these antibiotic combinations and with available WGS data (all BARNARDS countries except India were represented in this subset). Multiple pathogens were isolated, totalling 457 isolates. Reported mortality was lower for neonates treated with ceftazidime–amikacin than for neonates treated with ampicillin–gentamicin (hazard ratio [adjusted for clinical variables considered potential confounders to outcomes] 0·32, 95% CI 0·14–0·72; p=0·0060). Of 390 Gram-negative isolates, 379 (97·2%) were resistant to ampicillin and 274 (70·3%) were resistant to gentamicin. Susceptibility of Gram-negative isolates to at least one antibiotic in a treatment combination was noted in 111 (28·5%) for ampicillin–gentamicin, 286 (73·3%) for amoxicillin clavulanate–amikacin, 301 (77·2%) for ceftazidime–amikacin, and 312 (80·0%) for piperacillin–tazobactam–amikacin. A probability of target attainment of 80% or more was noted in 26 (33·7% [SD 0·59]) of 78 neonates with ampicillin–gentamicin, 15 (68·0% [3·84]) of 27 with amoxicillin clavulanate–amikacin, 93 (92·7% [0·24]) of 109 with ceftazidime–amikacin, and 70 (85·3% [0·47]) of 76 with piperacillin–tazobactam–amikacin. However, antibiotic and country effects could not be distinguished. Frequency of resistance was highest for fosfomycin (78 [68·4%] of 114 isolates), followed by colistin (55 [57·3%] of 96) and gentamicin (62 [53·0%] of 117). Sites in six of the seven countries (excluding South Africa) stated that the cost of antibiotics would influence the treatment of neonatal sepsis.
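
    The pharmacokinetic–pharmacodynamic probability of target attainment mentioned above is typically estimated by Monte Carlo simulation: many virtual patients are simulated with variable pharmacokinetic parameters, and the fraction meeting a target such as free drug concentration above the MIC for a set portion of the dosing interval is counted. The sketch below is a generic, hypothetical illustration with a one-compartment model and made-up parameters; it is not the BARNARDS analysis, and the dose, PK values, and target are not neonatal dosing recommendations.

# Generic Monte Carlo probability-of-target-attainment (PTA) sketch.
# The one-compartment model, dose, PK parameters, and the 50% fT>MIC target are
# hypothetical placeholders, not the BARNARDS models or any dosing recommendation.
import math
import random

random.seed(1)

def fraction_time_above_mic(dose_mg, interval_h, clearance_l_h, volume_l, mic_mg_l):
    """Fraction of the dosing interval with drug concentration above the MIC,
    assuming an IV bolus into a one-compartment model (single-dose approximation)."""
    c0 = dose_mg / volume_l                     # peak concentration immediately after the bolus
    ke = clearance_l_h / volume_l               # first-order elimination rate constant
    if c0 <= mic_mg_l:
        return 0.0
    time_above = math.log(c0 / mic_mg_l) / ke   # time until the concentration decays to the MIC
    return min(time_above / interval_h, 1.0)

def pta(n=10_000, target_fraction=0.5, mic_mg_l=8.0):
    hits = 0
    for _ in range(n):
        clearance = random.lognormvariate(math.log(0.15), 0.4)   # L/h, assumed variability
        volume = random.lognormvariate(math.log(1.2), 0.3)       # L, assumed variability
        ft = fraction_time_above_mic(dose_mg=50, interval_h=12,
                                     clearance_l_h=clearance, volume_l=volume,
                                     mic_mg_l=mic_mg_l)
        if ft >= target_fraction:
            hits += 1
    return hits / n

print(f"PTA at MIC 8 mg/L (hypothetical regimen): {pta():.1%}")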

    Iron Deficiency in Adolescent and Young Adult German Athletes—A Retrospective Study

    Background: Iron deficiency is a common phenomenon in sports and may lead to impaired physical performance. The aim of the study was to determine the frequency of iron deficiency in competitive athletes and to discuss the resulting consequences. Methods: The data of 629 athletes (339 male, 290 female) who presented for their annual basic sports medicine examination were investigated. Taking age into account (<14 years, 15–17 years, ≥18–30 years), four groups were distinguished: (I) normal hemoglobin (Hb) and ferritin levels (≥30 ng/mL for adults and 15–18-year-olds; ≥20 ng/mL or ≥15 ng/mL, respectively, for adolescents and children), (II) prelatent iron deficiency (ID; normal Hb, low ferritin), (III) latent ID (additionally, elevated soluble transferrin receptor or decreased transferrin saturation), and (IV) manifest anemia. In addition, iron status and exercise capacity were compared across different types of sport. Results: Overall, we found iron deficiency in 10.9% of male athletes (mainly in adolescence) and 35.9% of female athletes (predominantly in adolescence and young adulthood). There were no significant differences in iron status between the different sport types, nor in maximum performance between the iron-deficiency groups. Conclusions: Adolescent and female athletes are more likely to have iron deficiency. Therapy concepts for athletes should therefore pay attention to iron-rich diets.
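
    The four iron-status groups described in the Methods can be expressed as a simple decision rule. The sketch below follows the ferritin cut-offs quoted in the abstract in simplified form; the hemoglobin cut-off and the transferrin-saturation threshold are assumed placeholders, since the abstract does not state them, and the latent-ID criterion is reduced to transferrin saturation only.

# Decision rule for the four iron-status groups described above.
# Ferritin cut-offs are taken from the abstract in simplified form; the hemoglobin
# cut-off and the transferrin-saturation threshold are assumed placeholders.
def ferritin_cutoff_ng_ml(age_years):
    if age_years >= 15:
        return 30.0          # adults and 15-18-year-olds (from the abstract)
    return 20.0              # adolescents/children (abstract gives 20 or 15 ng/mL; simplified)

def iron_status(age_years, hb_g_dl, ferritin_ng_ml, transferrin_saturation_pct,
                hb_cutoff_g_dl=12.0):            # assumed placeholder, not from the abstract
    if hb_g_dl < hb_cutoff_g_dl:
        return "IV: manifest anemia"
    if ferritin_ng_ml >= ferritin_cutoff_ng_ml(age_years):
        return "I: normal iron status"
    if transferrin_saturation_pct < 20.0:        # assumed threshold; sTfR criterion omitted
        return "III: latent iron deficiency"
    return "II: prelatent iron deficiency"

print(iron_status(age_years=17, hb_g_dl=13.5, ferritin_ng_ml=22.0, transferrin_saturation_pct=25.0))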

    Evaluating perspective and quality of life of glaucoma patients during the COVID-19 pandemic in India: Results of a telephone survey

    Purpose: The past few years have been difficult for most glaucoma patients because of the COVID-19 pandemic. Our aim was to assess patients' perspectives and the disruption of their quality of life during the COVID-19 pandemic by conducting a telephone survey among glaucoma patients. Methods: This was a cross-sectional study involving the glaucoma patients of a tertiary eye care hospital in India. Patients who had completed at least five years of follow-up before 2020 were randomly selected using a random number generator. A validated 14-item questionnaire (forward–backward translation and completed pilot analysis) was administered to the patients, who were interviewed by telephone by one of the investigators in February 2022. The interviews were audio-recorded. Statistical Package for the Social Sciences (SPSS) version 26 was used for analysis. Results: Out of 1141 patients with >5 years of follow-up, 103 were selected by randomization. Forty-six patients (44.6%) reported that glaucoma affected their daily activities. Only 12 (11.6%) admitted to being irregular with their drops. Thirty-four patients (33%) felt that their glaucoma was deteriorating, and 31 (30.1%) feared blindness. Ninety-five patients (92.7%) felt that they were safe under the care of the treating doctor. Forty-six (44.6%) of the 103 patients did not attend follow-up for six months or more. Lockdown (36.2%) and travel expenses (27.6%) were the two most common reasons for loss to follow-up. Conclusion: Nearly half of the long-term glaucoma patients were lost to follow-up during the COVID-19 pandemic. The impact of glaucoma on daily life and the fear of losing vision were notable findings of the telephone survey. This fear appeared to be mitigated by the fact that the majority still felt safe through continued contact with their treating doctor during the pandemic.