
    Impact of two interventions on timeliness and data quality of an electronic disease surveillance system in a resource limited setting (Peru): a prospective evaluation

    Background: Timely detection of outbreaks through surveillance is needed in order to prevent future pandemics. However, current surveillance systems may not be prepared to accomplish this goal, especially in resource-limited settings. As data quality and timeliness are attributes that improve outbreak detection capacity, we assessed the effect of two interventions on these attributes in Alerta, an electronic disease surveillance system of the Peruvian Navy. Methods: 40 Alerta reporting units (18 clinics and 22 ships) were included in a 12-week prospective evaluation project. After a short refresher course on the notification process, units were randomly assigned to a phone, visit, or control group. Phone group sites were called three hours before the biweekly reporting deadline if they had not yet sent their report. Visit group sites received supervision visits on weeks 4 and 8, but no phone calls. Control group sites were neither called nor visited. Timeliness and data quality were assessed as the percentage of reports sent on time and the percentage of errors per total number of reports, respectively. Results: Timeliness improved in the phone group from 64.6% to 84% in clinics (+19.4 [95% CI, +10.3 to +28.6]; p < 0.001) and from 46.9% to 77.3% on ships (+30.4 [95% CI, +16.9 to +43.8]; p < 0.001). The visit and control groups showed no significant changes in timeliness. Error rates decreased in the visit group from 7.1% to 2% in clinics (-5.1 [95% CI, -8.7 to -1.4]; p = 0.007), but only from 7.3% to 6.7% on ships (-0.6 [95% CI, -2.4 to +1.1]; p = 0.445). The phone and control groups showed no significant improvement in data quality. Conclusion: Regular phone reminders significantly improved the timeliness of reports from clinics and ships, whereas supervision visits improved data quality only among clinics. Further investigations are needed to establish the cost-effectiveness and optimal use of each of these strategies.
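The timeliness results above are reported as percentage-point differences with 95% confidence intervals. As a minimal sketch of how such an interval for the difference of two independent proportions can be computed (a standard Wald interval; the sample sizes below are illustrative placeholders, not figures from the study):

```python
from math import sqrt

def risk_difference_ci(p1: float, n1: int, p2: float, n2: int, z: float = 1.96):
    """Wald-style 95% CI for the difference of two independent proportions.

    p1, p2: observed proportions (e.g. share of reports sent on time)
    n1, n2: number of observations behind each proportion
    Returns (difference, lower bound, upper bound).
    """
    diff = p2 - p1
    # Standard error of the difference under independent binomial sampling
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Illustrative call with the clinic timeliness proportions (64.6% -> 84%)
# and assumed report counts of 100 per period:
d, lo, hi = risk_difference_ci(0.646, 100, 0.84, 100)
```

If the resulting interval excludes zero, the improvement is significant at the 5% level, which is how the study's reported intervals (e.g. +19.4 [+10.3 to +28.6]) are read.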

    Targeted Gene Panel Sequencing for Early-onset Inflammatory Bowel Disease and Chronic Diarrhea

    Background: In contrast to adult-onset inflammatory bowel disease (IBD), where many genetic loci have been shown to be involved in complex disease etiology, early-onset IBD (eoIBD) and associated syndromes can sometimes present as monogenic conditions. As a result, the clinical phenotype and ideal disease management in these patients often differ from those in adult-onset IBD. However, due to high costs and the complexity of data analysis, high-throughput screening for genetic causes has not yet become a standard part of the diagnostic work-up of eoIBD patients. Methods: We selected 28 genes of interest associated with monogenic IBD and performed targeted panel sequencing in 71 patients diagnosed with eoIBD or early-onset chronic diarrhea to detect causative variants. We compared these results to whole-exome sequencing (WES) data available for 25 of these patients. Results: Target coverage was significantly higher in the targeted gene panel approach compared with WES, whereas the cost of the panel was considerably lower (approximately 25% of WES). Disease-causing variants affecting protein function were identified in 5 patients (7%), located in genes of the IL10 signaling pathway (3), WAS (1), and DKC1 (1). The functional effects of 8 candidate variants in 5 additional patients (7%) are under further investigation. WES did not identify additional causative mutations in 25 patients. Conclusions: Targeted gene panel sequencing is a fast and effective screening method for monogenic causes of eoIBD that should be routinely established in national referral centers.

    Mosquitoes Put the Brake on Arbovirus Evolution: Experimental Evolution Reveals Slower Mutation Accumulation in Mosquito Than Vertebrate Cells

    Like other arthropod-borne viruses (arboviruses), mosquito-borne dengue virus (DENV) is maintained in an alternating cycle of replication in arthropod and vertebrate hosts. The trade-off hypothesis suggests that this alternation constrains DENV evolution because a fitness increase in one host usually diminishes fitness in the other. Moreover, the hypothesis predicts that releasing DENV from host alternation should facilitate adaptation. To test this prediction, DENV was serially passaged in either a single human cell line (Huh-7), a single mosquito cell line (C6/36), or in alternating passages between Huh-7 and C6/36 cells. After 10 passages, consensus mutations were identified and fitness was assayed by evaluating replication kinetics in both cell types as well as in a novel cell type (Vero) that was not utilized in any of the passage series. Viruses allowed to specialize in single host cell types exhibited fitness gains in the cell type in which they were passaged but fitness losses in the bypassed cell type, and most alternating passages exhibited fitness gains in both cell types. Interestingly, fitness gains were observed in the alternately passaged, cloned viruses, an observation that may be attributed to the acquisition of both host cell–specific and amphi-cell-specific adaptations, or to recovery from the fitness losses caused by the genetic bottleneck of biological cloning. Amino acid changes common to both passage series suggested convergent evolution to replication in cell culture via positive selection. However, intriguingly, mutations accumulated more rapidly in viruses passaged in Huh-7 cells than in those passaged in C6/36 cells or in alternation. These results support the hypothesis that releasing DENV from host alternation facilitates adaptation, but there is limited support for the hypothesis that such alternation necessitates a fitness trade-off. Moreover, these findings suggest that patterns of genetic evolution may differ between viruses replicating in mammalian and mosquito cells.

    Extreme events are more likely to affect the breeding success of lesser kestrels than average climate change

    Climate change is predicted to severely impact interactions between prey, predators and habitats. In Southern Europe, within the Mediterranean climate, herbaceous vegetation achieves its maximum growth in middle spring followed by a three-month dry summer, limiting prey availability for insectivorous birds. Lesser kestrels (Falco naumanni) breed in a time-window that matches the nestling-rearing period with the peak abundance of grasshoppers, and forecasted climate change may impact reproductive success through changes in prey availability and abundance. We used the Normalised Difference Vegetation Index (NDVI) as a surrogate of habitat quality and prey availability to investigate the impacts of forecasted climate change and extreme climatic events on lesser kestrel breeding performance. First, using 14 years of data from 15 colonies in Southwestern Iberia, we linked fledging success and climatic variables with NDVI, and secondly, based on these relationships and according to climatic scenarios for 2050 and 2070, forecasted NDVI and fledging success. Finally, we evaluated how fledging success was influenced by drought events since 2004. Despite predicting a decrease in vegetation greenness in lesser kestrel foraging areas during spring, we found no impacts of the predicted gradual rise in temperature and decline in precipitation on their fledging success. Notwithstanding, we found a decrease of 12% in offspring survival associated with drought events, suggesting that a higher frequency of droughts might, in the future, jeopardize the recent recovery of the European population. Here, we show that extreme events, such as droughts, can have more significant impacts on species than gradual climatic changes, especially in regions like the Mediterranean Basin, a biodiversity and climate change hotspot.

    Effect of sitagliptin on cardiovascular outcomes in type 2 diabetes

    BACKGROUND: Data are lacking on the long-term effect on cardiovascular events of adding sitagliptin, a dipeptidyl peptidase 4 inhibitor, to usual care in patients with type 2 diabetes and cardiovascular disease. METHODS: In this randomized, double-blind study, we assigned 14,671 patients to add either sitagliptin or placebo to their existing therapy. Open-label use of antihyperglycemic therapy was encouraged as required, aimed at reaching individually appropriate glycemic targets in all patients. To determine whether sitagliptin was noninferior to placebo, we used a relative risk of 1.3 as the marginal upper boundary. The primary cardiovascular outcome was a composite of cardiovascular death, nonfatal myocardial infarction, nonfatal stroke, or hospitalization for unstable angina. RESULTS: During a median follow-up of 3.0 years, there was a small difference in glycated hemoglobin levels (least-squares mean difference for sitagliptin vs. placebo, -0.29 percentage points; 95% confidence interval [CI], -0.32 to -0.27). Overall, the primary outcome occurred in 839 patients in the sitagliptin group (11.4%; 4.06 per 100 person-years) and 851 patients in the placebo group (11.6%; 4.17 per 100 person-years). Sitagliptin was noninferior to placebo for the primary composite cardiovascular outcome (hazard ratio, 0.98; 95% CI, 0.88 to 1.09; P<0.001). Rates of hospitalization for heart failure did not differ between the two groups (hazard ratio, 1.00; 95% CI, 0.83 to 1.20; P = 0.98). There were no significant between-group differences in rates of acute pancreatitis (P = 0.07) or pancreatic cancer (P = 0.32). CONCLUSIONS: Among patients with type 2 diabetes and established cardiovascular disease, adding sitagliptin to usual care did not appear to increase the risk of major adverse cardiovascular events, hospitalization for heart failure, or other adverse events.
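The noninferiority claim above rests on a prespecified margin: sitagliptin is declared noninferior when the upper bound of the hazard ratio's 95% confidence interval stays below 1.3. A minimal sketch of that decision rule (function and variable names are illustrative, not from the study):

```python
def is_noninferior(hr: float, ci_lower: float, ci_upper: float,
                   margin: float = 1.3) -> bool:
    """Noninferiority decision rule based on a confidence interval.

    The treatment is declared noninferior when the entire CI for the
    hazard ratio lies below the prespecified margin, i.e. when the
    upper bound is smaller than the margin.
    """
    assert ci_lower <= hr <= ci_upper, "point estimate must lie inside its CI"
    return ci_upper < margin

# Reported primary-outcome result: HR 0.98 (95% CI, 0.88 to 1.09)
result = is_noninferior(0.98, 0.88, 1.09)
```

Because 1.09 < 1.3, the rule is satisfied here; a CI whose upper bound crossed 1.3 (say, 0.90 to 1.35) would fail it even with a favorable point estimate.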