
    Clinically immune hosts as a refuge for drug-sensitive malaria parasites

    Background: Mutations in Plasmodium falciparum that confer resistance to first-line antimalarial drugs have spread throughout the world from a few independent foci, all located in areas that were likely characterized by low or unstable malaria transmission. One of the striking differences between areas of low or unstable malaria transmission and hyperendemic areas is the size of the population of immune individuals. However, epidemiological models of malaria transmission have generally ignored the role of immune individuals in transmission, assuming that they do not affect the fitness of the parasite. This model reconsiders the role of immunity in the dynamics of malaria transmission and its impact on the evolution of antimalarial drug resistance, under the assumption that immune individuals are infectious. Methods: The model is constructed as a two-stage susceptible-infected-susceptible (SIS) model of malaria transmission that assumes individuals build up clinical immunity over a period of years. This immunity reduces the frequency and severity of clinical symptoms, and thus their use of drugs. It also reduces an individual's level of infectiousness, but does not affect the likelihood of becoming infected. Results: Simulations found that after the introduction of resistance into a population, clinical immunity can significantly alter the fitness of the resistant parasite, and thereby limit its ability to spread from an initial host, by reducing the resistant parasite's effective reproductive number as transmission intensity increases. At high transmission levels, despite a higher basic reproductive number, R0, the effective reproductive number of the resistant parasite may fall below that of the sensitive parasite. Conclusion: These results suggest that high levels of clinical immunity create a natural ecological refuge for drug-sensitive parasites. This provides an epidemiological rationale for historical patterns of resistance emergence and suggests that future outbreaks of resistance are more likely to occur in low- or unstable-transmission settings. This finding has implications for the design of drug policies and the formulation of malaria control strategies, especially those that lower malaria transmission intensity.
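    The refuge effect can be illustrated with a back-of-the-envelope calculation. The sketch below is not the paper's published two-stage SIS model; it is a minimal Python approximation in which treatment shortens sensitive infections in non-immune hosts only, immune hosts go untreated but transmit less, and the resistant strain pays an assumed transmission cost. All parameter values are illustrative assumptions.

```python
# Minimal sketch (not the published model): how a growing immune class can
# push the resistant strain's effective reproductive number below the
# sensitive strain's. All parameters are illustrative assumptions.

def r_eff(beta, dur_untreated, dur_treated, p_treat,
          immune_frac, immune_infectiousness, fitness_cost=0.0):
    """Effective reproductive number of one parasite strain.

    Non-immune hosts are treated with probability p_treat, truncating a
    treatable infection to dur_treated days; immune hosts are assumed to
    go untreated but transmit at a reduced rate.
    """
    b = beta * (1.0 - fitness_cost)
    dur_nonimmune = p_treat * dur_treated + (1 - p_treat) * dur_untreated
    r_nonimmune = (1 - immune_frac) * b * dur_nonimmune
    r_immune = immune_frac * immune_infectiousness * b * dur_untreated
    return r_nonimmune + r_immune

# Higher transmission -> larger immune fraction -> treatment touches fewer
# infections, so the resistant strain's treatment-evasion advantage shrinks
# while its assumed fitness cost remains.
for immune_frac in (0.1, 0.5, 0.9):
    r_sens = r_eff(0.05, 200, 20, 0.8, immune_frac, 0.3)
    r_res = r_eff(0.05, 200, 200, 0.8, immune_frac, 0.3, fitness_cost=0.3)
    print(f"immune fraction {immune_frac}: R_sens={r_sens:.2f}, R_res={r_res:.2f}")
```

    With these placeholder numbers the resistant strain dominates at a 10% immune fraction (6.51 vs. 2.82) but falls below the sensitive strain at 90% (2.59 vs. 2.98), reproducing the qualitative refuge effect.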

    Trends in antibiotic resistance in coagulase-negative staphylococci in the United States, 1999 to 2012

    Coagulase-negative staphylococci (CoNS) are important bloodstream pathogens that are typically resistant to multiple antibiotics. Despite the concern about increasing resistance, there have been no recent studies describing the national prevalence of CoNS pathogens. We used national resistance data over a period of 13 years (1999 to 2012) from The Surveillance Network (TSN) to determine the prevalence of, and assess the trends in, resistance for Staphylococcus epidermidis, the most common CoNS pathogen, and all other CoNS pathogens. Over the course of the study period, S. epidermidis resistance to ciprofloxacin and clindamycin increased steadily from 58.3% to 68.4% and from 43.4% to 48.5%, respectively. Resistance to levofloxacin increased rapidly from 57.1% in 1999 to a high of 78.6% in 2005, followed by a decrease to 68.1% in 2012. Multidrug resistance for CoNS followed a similar pattern, and this rise and small decline in resistance were found to be strongly correlated with levofloxacin prescribing patterns. The resistance patterns were similar for the aggregate of CoNS pathogens. The results from our study demonstrate that antibiotic resistance in CoNS pathogens has increased significantly over the past 13 years. These results are important, as CoNS can serve as sentinels for monitoring resistance, and they play a role as reservoirs of resistance genes that can be transmitted to other pathogens. The link between the levofloxacin prescription rate and resistance levels suggests a critical role for reducing the inappropriate use of fluoroquinolones and other broad-spectrum antibiotics in health care settings and in the community to help curb the reservoir of resistance in these colonizing pathogens.
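    The prescribing-resistance link described above is, at its simplest, a correlation between two annual time series. Below is a minimal sketch of that step; the arrays are illustrative placeholders anchored only to the levofloxacin endpoint percentages quoted in the abstract (57.1%, 78.6%, 68.1%), not TSN or actual prescribing data.

```python
# Minimal sketch of a trend/correlation check between annual resistance
# and prescribing series. All values below are hypothetical placeholders.
import numpy as np
from scipy.stats import pearsonr

years = np.arange(1999, 2013)  # 1999-2012 inclusive, 14 years
# hypothetical % of S. epidermidis isolates resistant to levofloxacin
resistance = np.array([57.1, 60.0, 64.0, 68.0, 72.0, 75.0, 78.6,
                       77.0, 75.0, 74.0, 72.0, 71.0, 69.0, 68.1])
# hypothetical levofloxacin prescriptions per 1,000 population
prescribing = np.array([30, 34, 38, 42, 45, 47, 48,
                        46, 43, 41, 39, 38, 37, 36])

r, p = pearsonr(prescribing, resistance)
print(f"Pearson r = {r:.2f} (p = {p:.3g})")
```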

    The Effect of Medicaid Expansion on Utilization in Maryland Emergency Departments

    Study objective: A proposed benefit of expanding Medicaid eligibility under the Patient Protection and Affordable Care Act (ACA) was a reduction in emergency department (ED) utilization for primary care needs. Pre-ACA studies found that new Medicaid enrollees increased their ED utilization rates, but the effect on system-level ED visits was less clear. Our objective was to estimate the effect of Medicaid expansion on aggregate and individual-based ED utilization patterns within Maryland. Methods: We performed a retrospective cross-sectional study of ED utilization patterns across Maryland, using data from Maryland’s Health Services Cost Review Commission. We also analyzed utilization differences among patients who were uninsured pre-ACA (July 2012 to December 2013) and returned post-ACA (July 2014 to December 2015). Results: The total number of ED visits in Maryland decreased by 36,531 (-1.2%) between the 6 quarters pre-ACA and the 6 quarters post-ACA. Medicaid-covered ED visits increased from 23.3% to 28.9% (159,004 additional visits), whereas uninsured patient visits decreased from 16.3% to 10.4% (181,607 fewer visits). Coverage by other insurance types remained largely stable between periods. We found no significant relationship between Medicaid expansion and changes in ED volume by hospital. For patients uninsured pre-ACA who returned post-ACA, the adjusted number of visits per person during the 6 quarters was 2.38 (95% confidence interval 2.35 to 2.40) for those newly enrolled in Medicaid post-ACA, compared with 1.66 (95% confidence interval 1.64 to 1.68) for those remaining uninsured. Conclusion: There was a substantial increase in patients covered by Medicaid in the post-ACA period, but this did not significantly affect total ED volume. Returning patients newly enrolled in Medicaid visited the ED more than their uninsured counterparts; however, this cohort accounted for only a small percentage of total ED visits in Maryland.
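    The visit-rate comparison can be sketched with a simple unadjusted Poisson model, as below. The counts are hypothetical, chosen only to echo the reported point estimates (2.38 and 1.66 visits per person); the study itself reports adjusted rates from a richer model.

```python
# Minimal sketch: visits per person with a normal-approximation Poisson CI.
# Cohort sizes and visit counts are illustrative placeholders.
import math

def rate_ci(visits, persons, z=1.96):
    """Visits per person over the study window, with a 95% CI."""
    rate = visits / persons
    se = math.sqrt(visits) / persons  # Poisson SE on the count scale
    return rate, rate - z * se, rate + z * se

newly_medicaid = rate_ci(visits=238_000, persons=100_000)
still_uninsured = rate_ci(visits=166_000, persons=100_000)
print("newly Medicaid:  %.2f (%.2f to %.2f)" % newly_medicaid)
print("still uninsured: %.2f (%.2f to %.2f)" % still_uninsured)
```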

    The Frequency of Influenza and Bacterial Co-infection: A Systematic Review and Meta-Analysis.

    AIM: Co-infecting bacterial pathogens are a major cause of morbidity and mortality in influenza. However, there remains a paucity of literature on the magnitude of co-infection in influenza patients. METHOD: A systematic search of MeSH, Cochrane Library, Web of Science, SCOPUS, EMBASE, and PubMed was performed. Studies of humans in which all individuals had laboratory-confirmed influenza, and all individuals were tested for an array of common bacterial species, met the inclusion criteria. RESULTS: Twenty-seven studies including 3,215 participants met all inclusion criteria. Common etiologies were defined from a subset of eight articles. There was high heterogeneity in the results (I² = 95%), with reported co-infection rates ranging from 2% to 65%. Though only a subset of papers was responsible for the observed heterogeneity, subanalyses and meta-regression analysis found no study characteristic that was significantly associated with co-infection. The most common co-infecting species were Streptococcus pneumoniae and Staphylococcus aureus, which accounted for 35% (95% CI, 14%-56%) and 28% (95% CI, 16%-40%) of infections, respectively; a wide range of other pathogens caused the remaining infections. An assessment of bias suggested that a lack of small-study publications may have biased the results. CONCLUSIONS: The frequency of co-infection in the published studies included in this review suggests that though providers should consider possible bacterial co-infection in all patients hospitalized with influenza, they should not assume all patients are co-infected, and should be sure to properly treat underlying viral processes. Further, the high heterogeneity suggests that additional large-scale studies are needed to better understand the etiology of influenza bacterial co-infection.
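    The pooling and heterogeneity assessment described above can be sketched with a standard DerSimonian-Laird random-effects model on raw proportions, as below. The five studies are illustrative placeholders, not the 27 included studies, and the published analysis may have used a different transformation (e.g., logit) of the proportions.

```python
# Minimal sketch: DerSimonian-Laird random-effects pooling of co-infection
# proportions plus Cochran's Q and I^2. Study data are hypothetical.
import numpy as np

events = np.array([5, 40, 12, 90, 30])        # co-infected patients per study
n = np.array([100, 120, 200, 150, 300])       # influenza patients per study

p = events / n
v = p * (1 - p) / n            # variance of each raw proportion
w = 1 / v                      # fixed-effect (inverse-variance) weights

p_fixed = np.sum(w * p) / np.sum(w)
q = np.sum(w * (p - p_fixed) ** 2)            # Cochran's Q
df = len(p) - 1
i2 = max(0.0, (q - df) / q) * 100             # I^2 heterogeneity statistic
tau2 = max(0.0, (q - df) / (w.sum() - (w ** 2).sum() / w.sum()))

w_re = 1 / (v + tau2)                         # random-effects weights
p_re = np.sum(w_re * p) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"pooled proportion = {p_re:.2f} "
      f"(95% CI {p_re - 1.96 * se_re:.2f} to {p_re + 1.96 * se_re:.2f}), "
      f"I^2 = {i2:.0f}%")
```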

    Prospective strategies to delay the evolution of anti-malarial drug resistance: weighing the uncertainty

    Background: The evolution of drug resistance in malaria parasites highlights a need to identify and evaluate strategies that could extend the useful therapeutic life of anti-malarial drugs. Such strategies are deployed to best effect before resistance has emerged, under conditions of great uncertainty. Methods: Here, the emergence and spread of resistance were modelled using a hybrid framework to evaluate prospective strategies, estimate the time to drug failure, and weigh uncertainty. The waiting time to appearance was estimated as the product of low mutation rates, drug pressure, and parasite population sizes during treatment. Stochastic persistence and the waiting time to establishment were simulated as an evolving branching process. The subsequent spread of resistance was simulated in simple epidemiological models. Results: Using this framework, the waiting time to the failure of artemisinin combination therapy (ACT) for malaria was estimated, and a policy of multiple first-line therapies (MFTs) was evaluated. The models quantify the effects of reducing drug pressure in delaying appearance, reducing the chances of establishment, and slowing spread. By using two first-line therapies in a population, it is possible to reduce drug pressure while still treating the full complement of cases. Conclusions: At a global scale, because of uncertainty about the time to the emergence of ACT resistance, there was a strong case for MFTs to guard against early failure. Our study recommends developing operationally feasible strategies for implementing MFTs, such as distributing different ACTs at the clinic and for home-based care, or formulating different ACTs for children and adults.
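    The establishment step lends itself to a compact simulation. Below is a minimal sketch assuming Poisson-distributed secondary cases with mean R_eff and treating a lineage as established once it reaches an arbitrary size threshold; it is not the paper's full hybrid framework, which couples this step to the appearance and spread models.

```python
# Minimal sketch: probability that a single de-novo resistant lineage
# escapes stochastic extinction, as a branching process with Poisson
# offspring (an assumption). Threshold and trial count are arbitrary.
import numpy as np

rng = np.random.default_rng(seed=1)

def establishment_prob(r_eff, trials=10_000, threshold=200):
    """Fraction of single-lineage simulations that reach `threshold` copies."""
    established = 0
    for _ in range(trials):
        lineage = 1
        while 0 < lineage < threshold:
            lineage = rng.poisson(r_eff, size=lineage).sum()
        established += lineage >= threshold
    return established / trials

# Theory check: survival probability is 1 - q, where q is the smallest
# root of q = exp(r_eff * (q - 1)) for Poisson offspring.
for r in (1.1, 1.5, 2.0):
    print(f"R_eff={r}: P(establish) ~ {establishment_prob(r):.3f}")
```

    Even a clearly advantageous lineage (R_eff = 1.5) dies out stochastically in roughly 40% of runs, which is why reduced drug pressure can meaningfully lower the chance of establishment.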

    Optimally timing primaquine treatment to reduce Plasmodium falciparum transmission in low endemicity Thai-Myanmar border populations

    Background: Effective malaria control has successfully reduced the malaria burden in many countries, but to eliminate malaria, these countries will need to further improve their control efforts. Here, a malaria control programme was critically evaluated in a very low-endemicity Thai-Myanmar border population, where early detection and prompt treatment have substantially reduced, though not ended, Plasmodium falciparum transmission, in part because of carriage of late-maturing gametocytes that remain after treatment. To counter this effect, the WHO recommends a single oral dose of primaquine along with an effective blood schizonticide. However, while the effectiveness of primaquine as a gametocidal agent is widely documented, the mismatch between primaquine's short half-life, the long delay to gametocyte maturation, and the proper timing of primaquine administration has not been studied. Methods: Mathematical models were constructed to simulate 8 years of surveillance data, from 1999 to 2006, for seven villages along the Thai-Myanmar border. A simple model was developed to consider primaquine pharmacokinetics and pharmacodynamics, gametocyte carriage, and infectivity. Results: In these populations, transmission intensity is very low, so the P. falciparum parasite rate is strongly linked to imported malaria and to the fraction of cases not treated. Given a 3.6-day gametocyte half-life, the estimated duration of infectiousness would be reduced by 10 days for every 10-fold reduction in initial gametocyte densities. Infectiousness from mature gametocytes would last two to four weeks and sustain some transmission, depending on the initial parasite densities, but the residual mature gametocytes could be eliminated by primaquine. Because of the short half-life of primaquine (approximately eight hours), with early administration (within three days of an acute attack) primaquine would no longer be present when mature gametocytes emerged eight days after the appearance of asexual blood-stage parasites. A model of optimal timing suggests that a primaquine follow-up approximately eight days after a clinical episode could further reduce the duration of infectiousness from two to four weeks down to a few days. The prospects of malaria elimination would be substantially improved by changing the timing of primaquine administration and combining this with effective detection and management of imported malaria cases. The value of using primaquine to reduce residual gametocyte densities and to reduce malaria transmission was considered in the context of a malaria transmission model; the added benefit of the primaquine follow-up treatment would be relatively large only if a high fraction of patients (>95%) were initially treated with schizonticidal agents. Conclusion: Mathematical models have previously identified the long duration of P. falciparum asexual blood-stage infections as a critical point in maintaining malaria transmission, but infectiousness can persist for two to four weeks because of residual populations of mature gametocytes. Simulations from new models suggest that, in areas where a large fraction of malaria cases are treated, curing the asexual parasitaemia in a primary infection and curing mature gametocyte infections with an eight-day follow-up treatment of primaquine have approximately the same proportional effects on reducing the infectious period. Changing the timing of primaquine administration would, in all likelihood, interrupt transmission in this area, which has very good health systems and very low endemicity.
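    The timing argument is essentially exponential-decay arithmetic, sketched below using the half-lives and emergence delay quoted above. The dose days and the code structure are illustrative; the paper's PK/PD model is more detailed.

```python
# Minimal sketch of the timing mismatch: primaquine decays with a ~8-hour
# half-life, but mature gametocytes only emerge ~8 days after asexual
# parasites appear (and then decay with a ~3.6-day half-life).
T_HALF_PQ_DAYS = 8 / 24     # primaquine elimination half-life (~8 h)
T_HALF_GAM_DAYS = 3.6       # mature-gametocyte half-life (from the study)
GAM_EMERGENCE_DAY = 8       # gametocytes mature ~8 days after presentation

def fraction_remaining(t_elapsed_days, t_half_days):
    """Fraction of an exponentially decaying quantity left after t_elapsed."""
    return 0.5 ** (t_elapsed_days / t_half_days)

for dose_day in (0, 3, 8):
    # fraction of the primaquine dose still present when mature
    # gametocytes first emerge
    left = fraction_remaining(max(0, GAM_EMERGENCE_DAY - dose_day),
                              T_HALF_PQ_DAYS)
    print(f"dose on day {dose_day}: {left:.1e} of dose remains at emergence")
```

    With these inputs, a dose on day 0 or day 3 leaves an utterly negligible drug fraction (about 6e-8 and 3e-5) by day 8, while a day-8 follow-up coincides with gametocyte emergence, which is the intuition behind retiming the dose.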

    Cost-Effectiveness of “Golden Mustard” for Treating Vitamin A Deficiency in India

    BACKGROUND: Vitamin A deficiency (VAD) is an important nutritional problem in India, resulting in an increased risk of severe morbidity and mortality. Periodic, high-dose vitamin A supplementation is the WHO-recommended method to prevent VAD, since a single dose can compensate for reduced dietary intake or increased need over a period of several months. However, in India only 34 percent of targeted children currently receive the two doses per year, and new strategies are urgently needed. METHODOLOGY: Recent advancements in biotechnology permit alternative strategies for increasing the vitamin A content of common foods. Mustard (Brassica juncea), which is consumed widely in the form of oil by VAD populations, can be genetically modified to express high levels of beta-carotene, a precursor to vitamin A. Using estimates for consumption, we compare predicted costs and benefits of genetically modified (GM) fortification of mustard seed with high-dose vitamin A supplementation and industrial fortification of mustard oil during processing to alleviate VAD, by calculating the avertable health burden in terms of disability-adjusted life years (DALYs). PRINCIPAL FINDINGS: We found that all three interventions potentially avert significant numbers of DALYs and deaths. Expanding vitamin A supplementation to all areas was the least costly intervention, at $23-$50 per DALY averted and $1,000-$6,100 per death averted, though cost-effectiveness varied with prevailing health subcenter coverage. GM fortification could avert 5 million-6 million more DALYs and 8,000-46,000 more deaths, mainly because it would benefit the entire population and not just children. However, the costs associated with GM fortification were nearly five times those of supplementation. Industrial fortification was dominated by both GM fortification and supplementation. The cost-effectiveness ratio of each intervention decreased with the prevalence of VAD and was sensitive to the efficacy rate of averted mortality. CONCLUSIONS: Although supplementation is the least costly intervention, our findings also indicate that GM fortification could reduce the VAD disease burden to a substantially greater degree because of its wider reach. Given the difficulties in expanding supplementation to areas without health subcenters, GM fortification of mustard seed is an attractive alternative, and further exploration of this technology is warranted.
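    The comparison rests on simple cost-per-DALY arithmetic, sketched below. Every input is an invented placeholder chosen only to sit inside the ranges quoted above (supplementation within $23-$50 per DALY, GM fortification at roughly five times the cost with a wider reach); these are not the study's actual cost or burden estimates.

```python
# Minimal sketch of average and incremental cost-effectiveness ratios.
# All inputs are hypothetical placeholders, not the study's estimates.

def cost_per_daly(total_cost_usd, dalys_averted):
    """Average cost-effectiveness ratio in $ per DALY averted."""
    return total_cost_usd / dalys_averted

supp_cost, supp_dalys = 50e6, 1.5e6   # hypothetical supplementation programme
gm_cost, gm_dalys = 250e6, 7.0e6      # ~5x the cost, population-wide reach

print(f"supplementation:  ${cost_per_daly(supp_cost, supp_dalys):,.0f}/DALY")
print(f"GM fortification: ${cost_per_daly(gm_cost, gm_dalys):,.0f}/DALY")

# Incremental ratio of moving from supplementation to GM fortification:
icer = (gm_cost - supp_cost) / (gm_dalys - supp_dalys)
print(f"ICER (GM vs. supplementation): ${icer:,.0f} per additional DALY")
```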