Clinically immune hosts as a refuge for drug-sensitive malaria parasites
Background: Mutations in Plasmodium falciparum that confer resistance to first-line antimalarial drugs have spread throughout the world from a few independent foci, all located in areas that were likely characterized by low or unstable malaria transmission. One of the striking differences between areas of low or unstable malaria transmission and hyperendemic areas is the size of the population of immune individuals. However, epidemiological models of malaria transmission have generally ignored the role of immune individuals in transmission, assuming that they do not affect the fitness of the parasite. This model reconsiders the role of immunity in the dynamics of malaria transmission and its impact on the evolution of antimalarial drug resistance, under the assumption that immune individuals are infectious.
Methods: The model is constructed as a two-stage susceptible-infected-susceptible (SIS) model of malaria transmission that assumes individuals build up clinical immunity over a period of years. This immunity reduces the frequency and severity of clinical symptoms, and thus drug use. It also reduces an individual's level of infectiousness, but does not affect the likelihood of becoming infected.
Results: Simulations found that, when resistance is introduced into a population, clinical immunity can significantly alter the fitness of the resistant parasite, and thereby its ability to spread from an initial host, by reducing the effective reproductive number of the resistant parasite as transmission intensity increases. At high transmission levels, despite a higher basic reproductive number, R0, the effective reproductive number of the resistant parasite may fall below that of the sensitive parasite.
Conclusion: These results suggest that high levels of clinical immunity create a natural ecological refuge for drug-sensitive parasites. This provides an epidemiological rationale for historical patterns of resistance emergence and suggests that future outbreaks of resistance are more likely to occur in low- or unstable-transmission settings. This finding has implications for the design of drug policies and the formulation of malaria control strategies, especially those that lower malaria transmission intensity.
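The crossover between the two reproductive numbers described above can be sketched numerically. The toy calculation below is not the paper's model; it simply averages a parasite's reproductive number over non-immune hosts (who are treated, so drugs clear sensitive infections) and clinically immune hosts (who rarely treat but transmit less). All parameter values, including the 25% fitness cost of resistance, are invented for illustration.

```python
# Illustrative two-class calculation: a parasite's effective reproductive
# number averaged over non-immune hosts (treated with drugs) and clinically
# immune hosts (rarely treated, less infectious). Parameters are invented.

def r_eff(r0, immune_frac, rel_infectiousness, treated_frac, drug_efficacy,
          fitness_cost=0.0):
    """Average reproductive number across the two host classes."""
    r = r0 * (1.0 - fitness_cost)
    from_non_immune = (1.0 - immune_frac) * r * (1.0 - treated_frac * drug_efficacy)
    from_immune = immune_frac * r * rel_infectiousness  # no drug pressure here
    return from_non_immune + from_immune

for immune_frac in (0.1, 0.5, 0.9):  # low -> high transmission intensity
    sensitive = r_eff(4.0, immune_frac, 0.3, treated_frac=0.8, drug_efficacy=1.0)
    resistant = r_eff(4.0, immune_frac, 0.3, treated_frac=0.8, drug_efficacy=0.0,
                      fitness_cost=0.25)
    print(f"immune fraction {immune_frac:.0%}: "
          f"R_sensitive={sensitive:.2f}, R_resistant={resistant:.2f}")
```

With these toy numbers, the resistant strain's advantage (escaping treatment) dominates when few hosts are immune, while its fitness cost dominates when most hosts are immune and untreated, reproducing the qualitative refuge effect.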
Trends in antibiotic resistance in coagulase-negative staphylococci in the United States, 1999 to 2012
Coagulase-negative staphylococci (CoNS) are important bloodstream pathogens that are typically resistant to multiple antibiotics. Despite the concern about increasing resistance, there have been no recent studies describing the national prevalence of CoNS pathogens. We used national resistance data over a period of 13 years (1999 to 2012) from The Surveillance Network (TSN) to determine the prevalence of and assess the trends in resistance for Staphylococcus epidermidis, the most common CoNS pathogen, and all other CoNS pathogens. Over the course of the study period, S. epidermidis resistance to ciprofloxacin and clindamycin increased steadily from 58.3% to 68.4% and from 43.4% to 48.5%, respectively. Resistance to levofloxacin increased rapidly from 57.1% in 1999 to a high of 78.6% in 2005, followed by a decrease to 68.1% in 2012. Multidrug resistance for CoNS followed a similar pattern, and this rise and small decline in resistance were strongly correlated with levofloxacin prescribing patterns. The resistance patterns were similar for the aggregate of CoNS pathogens. The results from our study demonstrate that antibiotic resistance in CoNS pathogens has increased significantly over the past 13 years. These results are important, as CoNS can serve as sentinels for monitoring resistance, and they play a role as reservoirs of resistance genes that can be transmitted to other pathogens. The link between the levofloxacin prescription rate and resistance levels suggests a critical role for reducing the inappropriate use of fluoroquinolones and other broad-spectrum antibiotics in health care settings and in the community to help curb the reservoir of resistance in these colonizing pathogens.
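The reported link between levofloxacin prescribing and resistance is the kind of relationship a plain Pearson correlation captures. The sketch below uses invented yearly values shaped like the trend in the abstract, not actual TSN or prescribing data.

```python
# Pearson correlation between a yearly prescribing series and a yearly
# resistance series. Both series below are invented for illustration.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

prescriptions = [3.1, 3.8, 4.6, 4.9, 4.2, 3.5]    # scripts per 100 people (toy)
resistance = [57.1, 66.0, 74.5, 78.6, 73.0, 68.1]  # % resistant (toy)
print(f"r = {pearson_r(prescriptions, resistance):.2f}")
```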
Influenza A H1N1 Pandemic Strain Evolution – Divergence and the Potential for Antigenic Drift Variants
The emergence of a novel A(H1N1) strain in 2009 was the first influenza pandemic of the genomic age, and unprecedented surveillance of the virus provides the opportunity to better understand the evolution of influenza. We examined changes in the nucleotide coding regions and the amino acid sequences of the hemagglutinin (HA), neuraminidase (NA), and nucleoprotein (NP) segments of the A(H1N1)pdm09 strain using publicly available data. We calculated the nucleotide and amino acid Hamming distance from the vaccine strain A/California/07/2009 for each sequence. We also estimated Pepitope, a measure of antigenic diversity based on changes in the epitope regions, for each isolate. Finally, we compared our results to A(H3N2) strains collected over the same period. Our analysis found that the mean Hamming distance for the HA protein of the A(H1N1)pdm09 strain increased from 3.6 (standard deviation [SD]: 1.3) in 2009 to 11.7 (SD: 1.0) in 2013, while the mean Hamming distance in the coding region increased from 7.4 (SD: 2.2) in 2009 to 28.3 (SD: 2.1) in 2013. These trends are broadly similar to the rate of mutation in H3N2 over the same time period. However, in contrast to H3N2 strains, the rate of mutation accumulation has slowed in recent years. Our results are notable because, over the course of the study, mutation rates in H3N2 similar to those seen with A(H1N1)pdm09 led to the emergence of two antigenic drift variants. However, while there has been an H1N1 epidemic in North America this season, evidence to date indicates the vaccine is still effective, suggesting the epidemic is not due to the emergence of an antigenic drift variant. Our results suggest that more research is needed to understand how viral mutations are related to vaccine effectiveness so that future vaccine choices and development can be more predictive.
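The Hamming distance used above is simply the number of positions at which two aligned sequences differ. A minimal version, shown here with a toy peptide rather than a real HA sequence:

```python
# Hamming distance between two aligned sequences: the count of positions
# at which they differ. Sequences must be pre-aligned to equal length.
def hamming(seq_a: str, seq_b: str) -> int:
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    return sum(a != b for a, b in zip(seq_a, seq_b))

# Toy example with short peptide fragments (not real HA sequences):
print(hamming("MKAILVVLLY", "MKAILVTLLY"))  # differs at exactly one position
```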
The Effect of Medicaid Expansion on Utilization in Maryland Emergency Departments
Study objective: A proposed benefit of expanding Medicaid eligibility under the Patient Protection and Affordable Care Act (ACA) was a reduction in emergency department (ED) utilization for primary care needs. Pre-ACA studies found that new Medicaid enrollees increased their ED utilization rates, but the effect on system-level ED visits was less clear. Our objective was to estimate the effect of Medicaid expansion on aggregate and individual-based ED utilization patterns within Maryland. Methods: We performed a retrospective cross-sectional study of ED utilization patterns across Maryland, using data from Maryland’s Health Services Cost Review Commission. We also analyzed utilization differences for patients who were uninsured pre-ACA (July 2012 to December 2013) and returned post-ACA (July 2014 to December 2015). Results: The total number of ED visits in Maryland decreased by 36,531 (–1.2%) between the 6 quarters pre-ACA and the 6 quarters post-ACA. Medicaid-covered ED visits increased from 23.3% to 28.9% (159,004 additional visits), whereas uninsured patient visits decreased from 16.3% to 10.4% (181,607 fewer visits). Coverage by other insurance types remained largely stable between periods. We found no significant relationship between Medicaid expansion and changes in ED volume by hospital. For patients uninsured pre-ACA who returned post-ACA, the adjusted number of visits per person during the 6 quarters was 2.38 (95% confidence interval 2.35 to 2.40) for those newly enrolled in Medicaid post-ACA, compared with 1.66 (95% confidence interval 1.64 to 1.68) for those remaining uninsured. Conclusion: There was a substantial increase in patients covered by Medicaid in the post-ACA period, but this did not significantly affect total ED volume. Returning patients newly enrolled in Medicaid visited the ED more than their uninsured counterparts; however, this cohort accounted for only a small percentage of total ED visits in Maryland.
The Frequency of Influenza and Bacterial Co-infection: A Systematic Review and Meta-Analysis.
AIM: Co-infecting bacterial pathogens are a major cause of morbidity and mortality in influenza. However, there remains a paucity of literature on the magnitude of co-infection in influenza patients.
METHOD: A systematic search of MeSH, Cochrane Library, Web of Science, SCOPUS, EMBASE, and PubMed was performed. Studies of humans in which all individuals had laboratory confirmed influenza, and all individuals were tested for an array of common bacterial species, met inclusion criteria.
RESULTS: Twenty-seven studies including 3,215 participants met all inclusion criteria. Common etiologies were defined from a subset of eight articles. There was high heterogeneity in the results (I² = 95%), with reported co-infection rates ranging from 2% to 65%. Though only a subset of papers were responsible for the observed heterogeneity, subanalyses and meta-regression analysis found no study characteristic that was significantly associated with co-infection. The most common co-infecting species were Streptococcus pneumoniae and Staphylococcus aureus, which accounted for 35% (95% CI, 14%-56%) and 28% (95% CI, 16%-40%) of infections, respectively; a wide range of other pathogens caused the remaining infections. An assessment of bias suggested that a lack of small-study publications may have biased the results.
CONCLUSIONS: The frequency of co-infection in the studies included in this review suggests that, though providers should consider possible bacterial co-infection in all patients hospitalized with influenza, they should not assume all patients are co-infected and should ensure that the underlying viral process is properly treated. Further, the high heterogeneity suggests that additional large-scale studies are needed to better understand the etiology of influenza bacterial co-infection.
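The I² statistic quoted in the results can be derived from Cochran's Q. A compact sketch, using invented study proportions and sample sizes rather than the 27 studies actually reviewed:

```python
# Cochran's Q and the I^2 statistic used to quantify between-study
# heterogeneity in a meta-analysis. All study values below are invented.

def i_squared(effects, variances):
    """I^2: fraction of total variation in effect sizes attributable to
    between-study heterogeneity rather than chance (floored at zero)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    return max(0.0, (q - df) / q) if q > 0 else 0.0

# Hypothetical co-infection proportions p with variance p(1-p)/n:
props = [0.05, 0.20, 0.45, 0.65]
ns = [200, 150, 120, 90]
variances = [p * (1 - p) / n for p, n in zip(props, ns)]
print(f"I^2 = {i_squared(props, variances):.0%}")
```

Widely scattered proportions like these drive I² toward 100%, which is why co-infection rates ranging from 2% to 65% produce the very high heterogeneity reported.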
Stability of the Influenza Virus Hemagglutinin Protein Correlates with Evolutionary Dynamics
Protein thermodynamics are an integral determinant of viral fitness and one of the major drivers of protein evolution. Mutations in the influenza A virus (IAV) hemagglutinin (HA) protein can eliminate neutralizing antibody binding to mediate escape from preexisting antiviral immunity. Prior research on the IAV nucleoprotein suggests that protein stability may constrain seasonal IAV evolution; however, the role of stability in shaping the evolutionary dynamics of the HA protein has not been explored. We used the full coding sequence of 9,797 H1N1pdm09 HA sequences and 16,716 human seasonal H3N2 HA sequences to computationally estimate relative changes in the thermal stability of the HA protein between 2009 and 2016. Phylogenetic methods were used to characterize how stability differences impacted the evolutionary dynamics of the virus. We found that pandemic H1N1 IAV strains split into two lineages that had different relative HA protein stabilities and that later variants were descended from the higher-stability lineage. Analysis of the mutations associated with the selective sweep of the higher-stability lineage found that they were characterized by the early appearance of highly stabilizing mutations, the earliest of which was not located in a known antigenic site. Experimental evidence further suggested that H1N1 HA stability may be correlated with in vitro virus production and infection. A similar analysis of H3N2 strains found that surviving lineages were also largely descended from viruses predicted to encode more-stable HA proteins. Our results suggest that HA protein stability likely plays a significant role in the persistence of different IAV lineages. IMPORTANCE: One of the constraints on fast-evolving viruses, such as influenza virus, is protein stability, or how strongly the folded protein holds together.
Despite the importance of this protein property, there has been limited investigation of the impact of the stability of the influenza virus hemagglutinin protein, the primary antibody target of the immune system, on its evolution. Using a combination of computational estimates of stability and experiments, our analysis found that viruses with more-stable hemagglutinin proteins were associated with long-term persistence in the population. There are two potential reasons for the observed persistence. One is that more-stable proteins tolerate destabilizing mutations that less-stable proteins could not, thus increasing opportunities for immune escape. The second is that greater stability increases the fitness of the virus through increased production of infectious particles. Further research on the relative importance of these mechanisms could help inform the annual influenza vaccine composition decision process.
Prospective strategies to delay the evolution of anti-malarial drug resistance: weighing the uncertainty
Background: The evolution of drug resistance in malaria parasites highlights a need to identify and evaluate strategies that could extend the useful therapeutic life of anti-malarial drugs. Such strategies are deployed to best effect before resistance has emerged, under conditions of great uncertainty.
Methods: Here, the emergence and spread of resistance were modelled using a hybrid framework to evaluate prospective strategies, estimate the time to drug failure, and weigh uncertainty. The waiting time to appearance was estimated as the product of low mutation rates, drug pressure, and parasite population sizes during treatment. Stochastic persistence and the waiting time to establishment were simulated as an evolving branching process. The subsequent spread of resistance was simulated in simple epidemiological models.
Results: Using this framework, the waiting time to the failure of artemisinin combination therapy (ACT) for malaria was estimated, and a policy of multiple first-line therapies (MFTs) was evaluated. The models quantify the effects of reducing drug pressure in delaying appearance, reducing the chances of establishment, and slowing spread. By using two first-line therapies in a population, it is possible to reduce drug pressure while still treating the full complement of cases.
Conclusions: At a global scale, because of uncertainty about the time to the emergence of ACT resistance, there was a strong case for MFTs to guard against early failure. Our study recommends developing operationally feasible strategies for implementing MFTs, such as distributing different ACTs at the clinic and for home-based care, or formulating different ACTs for children and adults.
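The "waiting time to establishment" step can be illustrated with a textbook branching-process result rather than the paper's hybrid framework: a newly arisen resistant lineage whose secondary cases follow a Poisson(R) offspring distribution goes extinct with probability q solving q = exp(R(q − 1)), so it establishes with probability 1 − q. The values of R below are illustrative.

```python
# Establishment probability of a new resistant lineage under a Poisson(R)
# offspring branching process: solve q = exp(R*(q-1)) by fixed-point
# iteration; the lineage escapes stochastic extinction with prob. 1 - q.
import math

def establishment_prob(r, iters=200):
    """Probability a lineage with Poisson(R) offspring escapes extinction."""
    if r <= 1:
        return 0.0  # subcritical or critical lineages die out surely
    q = 0.0  # iterate toward the smaller fixed point (extinction prob.)
    for _ in range(iters):
        q = math.exp(r * (q - 1.0))
    return 1.0 - q

for r in (0.9, 1.1, 1.5, 2.0):  # illustrative effective reproductive numbers
    print(f"R={r}: P(establish) ~ {establishment_prob(r):.3f}")
```

Even a resistant lineage with R well above 1 most often fizzles out, which is why reducing drug pressure (and hence R) disproportionately cuts the chance that resistance ever establishes.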
Optimally timing primaquine treatment to reduce Plasmodium falciparum transmission in low endemicity Thai-Myanmar border populations
Background: Effective malaria control has successfully reduced the malaria burden in many countries, but to eliminate malaria, these countries will need to further improve their control efforts. Here, a malaria control programme was critically evaluated in a very low-endemicity Thai-Myanmar border population, where early detection and prompt treatment have substantially reduced, though not ended, Plasmodium falciparum transmission, in part because of late-maturing gametocytes that remain after treatment. To counter this effect, the WHO recommends a single oral dose of primaquine along with an effective blood schizonticide. However, while the effectiveness of primaquine as a gametocytocidal agent is widely documented, the mismatch between primaquine's short half-life and the long delay before gametocyte maturation, and thus the proper timing of primaquine administration, has not been studied.
Methods: Mathematical models were constructed to simulate 8-year surveillance data, between 1999 and 2006, from seven villages along the Thai-Myanmar border. A simple model was developed to consider primaquine pharmacokinetics and pharmacodynamics, gametocyte carriage, and infectivity.
Results: In these populations, transmission intensity is very low, so the P. falciparum parasite rate is strongly linked to imported malaria and to the fraction of cases not treated. Given a gametocyte half-life of 3.6 days, the estimated duration of infectiousness would be reduced by 10 days for every 10-fold reduction in initial gametocyte densities. Infectiousness from mature gametocytes would last two to four weeks and sustain some transmission, depending on the initial parasite densities, but the residual mature gametocytes could be eliminated by primaquine. Because of the short half-life of primaquine (approximately eight hours), it was immediately obvious that with early administration (within three days after an acute attack), primaquine would not be present when mature gametocytes emerged eight days after the appearance of asexual blood-stage parasites. A model of optimal timing suggests that a primaquine follow-up approximately eight days after a clinical episode could further reduce the duration of infectiousness from two to four weeks down to a few days. The prospects of malaria elimination would be substantially improved by changing the timing of primaquine administration and combining this with effective detection and management of imported malaria cases. The value of using primaquine to reduce residual gametocyte densities and malaria transmission was considered in the context of a malaria transmission model; the added benefit of the primaquine follow-up treatment would be relatively large only if a high fraction of patients (>95%) are initially treated with schizonticidal agents.
Conclusion: Mathematical models have previously identified the long duration of P. falciparum asexual blood-stage infections as a critical point in maintaining malaria transmission, but infectiousness can persist for two to four weeks because of residual populations of mature gametocytes. Simulations from new models suggest that, in areas where a large fraction of malaria cases are treated, curing the asexual parasitaemia in a primary infection and curing mature gametocyte infections with an eight-day follow-up treatment with primaquine have approximately the same proportional effects on reducing the infectious period. Changing the timing of primaquine administration would, in all likelihood, interrupt transmission in this area, which has very good health systems and very low endemicity.
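The timing mismatch at the heart of this argument is a back-of-the-envelope first-order kinetics calculation. The sketch below uses the half-lives quoted in the abstract (~8 hours for primaquine, ~3.6 days for gametocytes) with simple exponential decay; it is a consistency check, not the paper's pharmacokinetic model.

```python
# First-order decay: fraction of a quantity remaining after a given time.
import math

def fraction_remaining(half_life_hours, elapsed_hours):
    return 0.5 ** (elapsed_hours / half_life_hours)

# Primaquine (half-life ~8 h) remaining 8 days after early administration:
print(fraction_remaining(8, 8 * 24))   # effectively none left on day 8

# Gametocytes (half-life ~3.6 d): time for a 10-fold density reduction
print(3.6 * math.log2(10))             # ~12 days per 10-fold decline
```

The first result shows why a day-0 dose cannot touch gametocytes maturing around day 8; the second is on the order of the roughly 10-day-per-decade figure cited above for the duration of infectiousness.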
Cost-Effectiveness of “Golden Mustard” for Treating Vitamin A Deficiency in India
BACKGROUND: Vitamin A deficiency (VAD) is an important nutritional problem in India, resulting in an increased risk of severe morbidity and mortality. Periodic, high-dose vitamin A supplementation is the WHO-recommended method to prevent VAD, since a single dose can compensate for reduced dietary intake or increased need over a period of several months. However, in India only 34 percent of targeted children currently receive the two doses per year, and new strategies are urgently needed. METHODOLOGY: Recent advancements in biotechnology permit alternative strategies for increasing the vitamin A content of common foods. Mustard (Brassica juncea), which is consumed widely in the form of oil by VAD populations, can be genetically modified to express high levels of beta-carotene, a precursor to vitamin A. Using estimates for consumption, we compare the predicted costs and benefits of genetically modified (GM) fortification of mustard seed with those of high-dose vitamin A supplementation and industrial fortification of mustard oil during processing, calculating the avertable health burden in terms of disability-adjusted life years (DALYs). PRINCIPAL FINDINGS: We found that all three interventions potentially avert significant numbers of DALYs and deaths. Expanding vitamin A supplementation to all areas was the least costly intervention, at $50 per DALY averted and $6,100 per death averted, though cost-effectiveness varied with prevailing health subcenter coverage. GM fortification could avert 5 million to 6 million more DALYs and 8,000 to 46,000 more deaths, mainly because it would benefit the entire population and not just children. However, the costs associated with GM fortification were nearly five times those of supplementation. Industrial fortification was dominated by both GM fortification and supplementation. The cost-effectiveness ratio of each intervention decreased with the prevalence of VAD and was sensitive to the efficacy rate of averted mortality.
CONCLUSIONS: Although supplementation is the least costly intervention, our findings also indicate that GM fortification could reduce the VAD disease burden to a substantially greater degree because of its wider reach. Given the difficulties in expanding supplementation to areas without health subcenters, GM fortification of mustard seed is an attractive alternative, and further exploration of this technology is warranted.
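The dominance comparison above reduces to cost per DALY averted: an option is "dominated" when another costs less yet averts more. The figures below are placeholders chosen only to reproduce the qualitative ranking, not the paper's estimates.

```python
# Cost-effectiveness comparison by cost per DALY averted.
# All cost and DALY figures below are invented placeholders.

def cost_per_daly(cost, dalys_averted):
    return cost / dalys_averted

interventions = {
    # name: (total cost in US$, DALYs averted) -- illustrative only
    "supplementation": (50_000_000, 1_000_000),
    "gm_fortification": (250_000_000, 6_000_000),
    "industrial_fortification": (300_000_000, 4_000_000),
}

for name, (cost, dalys) in interventions.items():
    print(f"{name}: ${cost_per_daly(cost, dalys):,.0f} per DALY averted")

# With these placeholders, industrial fortification is dominated: it costs
# more than GM fortification yet averts fewer DALYs.
```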