Insecticide Control of Vector-Borne Diseases: When Is Insecticide Resistance a Problem?
Many of the most dangerous human diseases are transmitted by insect vectors. After decades of repeated insecticide use, all of these vector species have demonstrated the capacity to evolve resistance to insecticides. Insecticide resistance is generally considered to undermine control of vector-transmitted diseases because it increases the number of vectors that survive the insecticide treatment. Disease control failure, however, need not follow from vector control failure. Here, we review evidence that insecticide resistance may have an impact on the quality of vectors and, specifically, on three key determinants of parasite transmission: vector longevity, competence, and behaviour. We argue that, in some instances, insecticide resistance is likely to result in a decrease in vector longevity, a decrease in infectiousness, or in a change in behaviour, all of which will reduce the vectorial capacity of the insect. If this effect is sufficiently large, the impact of insecticide resistance on disease management may not be as detrimental as previously thought. In other instances, however, insecticide resistance may have the opposite effect, increasing the insect's vectorial capacity, which may lead to a dramatic increase in the transmission of the disease and even to a higher prevalence than in the absence of insecticides. Either way, and there may be no simple generality, the consequence of the evolution of insecticide resistance for disease ecology deserves additional attention.
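The vectorial capacity invoked in this abstract is conventionally written with the classical Ross–Macdonald expression, C = m·a²·pⁿ / (−ln p), which makes the outsized role of vector longevity (the daily survival probability p) explicit. A minimal sketch, with purely illustrative parameter values:

```python
import math

def vectorial_capacity(m, a, p, n):
    """Classical Ross-Macdonald vectorial capacity.
    m: vector density per host
    a: daily human-biting rate
    p: daily vector survival probability
    n: extrinsic incubation period (days)
    """
    # p**n is the chance a vector survives incubation; 1/(-ln p) is its
    # expected remaining (infectious) lifespan, so survival enters twice.
    return m * a**2 * p**n / (-math.log(p))

# Illustrative values: a modest drop in daily survival (0.9 -> 0.8)
# cuts capacity several-fold, which is why resistance mutations that
# shorten vector life could offset control failure.
baseline = vectorial_capacity(m=10, a=0.3, p=0.9, n=10)
reduced = vectorial_capacity(m=10, a=0.3, p=0.8, n=10)
```

This is why the abstract's three determinants (longevity, competence, behaviour) map onto p, n, and a respectively: small shifts in p dominate the other terms.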
Killing them softly: managing pathogen polymorphism and virulence in spatially variable environments
Understanding why pathogen populations are genetically variable is vital because genetic variation fuels evolution, which often hampers disease control efforts. Here I argue that classical models of evolution in spatially variable environments – specifically, models of hard and soft selection – provide a useful framework to understand the maintenance of pathogen polymorphism and the evolution of virulence. First, the similarities between models of hard and soft selection and pathogen life cycles are described, highlighting how the type and timing of pathogen control measures impose density regulation that may affect both the level of pathogen polymorphism and virulence. The article concludes with an outline of potential lines of future theoretical and experimental work.
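The hard/soft distinction the abstract leans on can be made concrete with a Levene-style two-patch invasion check (a textbook sketch, not the article's own model): under soft selection each patch contributes a fixed share of the next generation, which protects polymorphism more readily than hard selection, where contributions scale with local output.

```python
def invades_soft(w_rare, w_common, c=(0.5, 0.5)):
    # Soft selection: patch i contributes a fixed fraction c[i], so a
    # rare type grows by the weighted mean of within-patch fitness ratios.
    return sum(ci * wr / wc for ci, wr, wc in zip(c, w_rare, w_common)) > 1

def invades_hard(w_rare, w_common):
    # Hard selection: patch contributions track local productivity, so
    # invasion compares fitness summed across patches.
    return sum(w_rare) / sum(w_common) > 1

# Illustrative fitnesses: A is a patch-1 specialist, B a generalist.
wA, wB = (1.5, 0.6), (1.0, 1.0)

# Soft selection: each type invades when rare -> protected polymorphism.
soft_poly = invades_soft(wA, wB) and invades_soft(wB, wA)
# Hard selection: only A invades -> A fixes, polymorphism is lost.
hard_poly = invades_hard(wA, wB) and invades_hard(wB, wA)
```

The same numbers give opposite outcomes under the two regimes, which is the sense in which the timing of density regulation (e.g., by control measures) can decide whether pathogen variation persists.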
doi:10.1016/j.vaccine.2008.04.012
Abstract: One theory of why some pathogens are virulent (i.e., they damage their host) is that they need to extract resources from their host in order to compete for transmission to new hosts, and this resource extraction can damage the host. Here we describe our studies in malaria that test and support this idea. We go on to show that host immunity can exacerbate selection for virulence, and therefore that vaccines that reduce pathogen replication may select for more virulent pathogens, eroding the benefits of vaccination and putting the unvaccinated at greater risk. We suggest that in disease contexts where wild-type parasites can be transmitted through vaccinated hosts, evolutionary outcomes need to be considered.

E-mail address: [email protected] (M.J. Mackinnon).

An evolutionary hypothesis for pathogen virulence

Why are pathogens virulent?¹ Why would they run the risk of killing their host when, in doing so, they lose their ongoing source of transmission to new hosts? Some evolutionary biologists believe that the answer to this question will make it possible to design vaccines and other control measures that, in the event of eradication being impossible, drive the pathogen towards lower virulence. One answer to this question is that virulence is a mistake by the pathogen: an ultimately maladaptive outcome that occasionally happens when a pathogen accidentally ends up in an abnormal host environment, or when a virulent mutant has a transient competitive advantage within a host ('short-sighted' or dead-end evolution).

¹ Throughout this paper, we strictly define virulence as the fitness cost that the parasite causes the host. This may be through mortality, or through morbidity-related reductions in fertility or fecundity. We sometimes use morbidity as a surrogate measure of virulence.
Of all the explanations for virulence, the trade-off hypothesis has received the most attention, and a large body of theory has been derived from it. Yet it is poorly supported by data. Here we describe our studies in malaria parasites, the causative agents of a disease of global importance, in which we have comprehensively explored the trade-off hypothesis. We begin by summarising our experimental tests, in a laboratory mouse–malaria system, of the assumptions underlying the trade-off theory. We then ask whether the rodent data are relevant to malaria parasites in their human setting. Next, we use the trade-off theory to predict what the impact on the evolution of the pathogen's virulence might be if malaria vaccines went into widespread use. Finally, we summarise an experimental evolution study to test our prediction that enhanced immunity would select for more virulent parasites. Together, this work has led us to a deeper understanding of why malaria still kills its host despite millennia of coevolution, and of what might happen when disease control campaigns change the level of population immunity, e.g., enhance it using vaccines, or reduce it using bednets and vector control.

The trade-off hypothesis and its assumptions

Under the trade-off hypothesis, it is assumed that there are both fitness benefits and costs associated with virulence. The cost is assumed to be host death because, for most pathogens, transmission stops when the host dies. The benefits associated with virulence are assumed to be the production of more transmission forms per unit time, and/or increased persistence in a live host. However, the benefits of higher transmissibility and persistence only accrue
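The benefit-versus-cost structure described above yields intermediate optimal virulence whenever transmission gains saturate. A minimal numerical sketch, under an assumed (illustrative, not the authors') trade-off shape beta(a) = b·sqrt(a) with infection duration 1/(mu + a), for which the fitness proxy R0(a) = beta(a)/(mu + a) peaks analytically at a* = mu:

```python
import math

def r0(a, mu=0.1, b=1.0):
    # Fitness proxy under the trade-off: transmission rate beta(a) = b*sqrt(a)
    # rises with virulence a, but infection duration 1/(mu + a) shrinks.
    return b * math.sqrt(a) / (mu + a)

mu = 0.1
# Grid search confirms the interior optimum: too avirulent transmits
# poorly, too virulent kills the host too soon.
grid = [i / 10000 for i in range(1, 5001)]
a_star = max(grid, key=lambda a: r0(a, mu))  # close to mu
```

Any saturating beta(a) gives the same qualitative result; only the location of the peak depends on the assumed functional form.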
Imperfect vaccines and the evolution of pathogen virulence
Vaccines rarely provide full protection from disease. Nevertheless, partially effective (imperfect) vaccines may be used to protect both individuals and whole populations. We studied the potential impact of different types of imperfect vaccines on the evolution of pathogen virulence (induced host mortality) and the consequences for public health. Here we show that vaccines designed to reduce pathogen growth rate and/or toxicity diminish selection against virulent pathogens. The subsequent evolution leads to higher levels of intrinsic virulence and hence to more severe disease in unvaccinated individuals. This evolution can erode any population-wide benefits such that overall mortality rates are unaffected, or even increase, with the level of vaccination coverage. In contrast, infection-blocking vaccines induce no such effects, and can even select for lower virulence. These findings have policy implications for the development and use of vaccines that are not expected to provide full immunity, such as candidate vaccines for malaria.
Evolutionary Epidemiology of Drug-Resistance in Space
The spread of drug-resistant parasites erodes the efficacy of therapeutic treatments against many infectious diseases and is a major threat of the 21st century. The evolution of drug-resistance depends, among other things, on how the treatments are administered at the population level. "Resistance management" consists of finding optimal treatment strategies that both reduce the consequence of an infection at the individual host level, and limit the spread of drug-resistance in the pathogen population. Several studies have focused on the effect of mixing different treatments, or of alternating them in time. Here, we analyze another strategy, where the use of the drug varies spatially: there are places where no one receives any treatment. We find that such a spatial heterogeneity can totally prevent the rise of drug-resistance, provided that the size of treated patches is below a critical threshold. The range of parasite dispersal, the relative costs and benefits of being drug-resistant compared to being drug-sensitive, and the duration of an infection with drug-resistant parasites are the main factors determining the value of this threshold. Our analysis thus provides some general guidance regarding the optimal spatial use of drugs to prevent or limit the evolution of drug-resistance.
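The threshold logic in this abstract can be caricatured with a toy invasion condition (an illustrative sketch, not the paper's spatial model): if dispersal mixes parasites across patches, resistance pays off only when the fraction of transmission landing in treated patches, which grows with treated-patch size relative to dispersal range, outweighs the fitness cost of resistance elsewhere.

```python
def resistance_invades(f_treated, benefit, cost):
    """Toy well-mixed invasion condition (illustrative assumptions).
    f_treated: fraction of transmission events landing in treated patches
               (increases with patch size relative to parasite dispersal)
    benefit:   relative fitness gain of resistance in treated hosts
    cost:      relative fitness loss of resistance in untreated hosts
    """
    mean_fitness = f_treated * (1 + benefit) + (1 - f_treated) * (1 - cost)
    return mean_fitness > 1

def critical_patch_fraction(benefit, cost):
    # Solve f*(1 + benefit) + (1 - f*)(1 - cost) = 1 for the threshold
    # below which resistance cannot invade: f* = cost / (benefit + cost).
    return cost / (benefit + cost)
```

With, say, benefit = 0.5 and cost = 0.1, the toy threshold sits at f* = 1/6: small enough treated patches keep resistance below break-even, echoing the paper's conclusion that patch size relative to dispersal, plus the costs and benefits of resistance, set the critical scale.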
Phylogenetic Codivergence Supports Coevolution of Mimetic Heliconius Butterflies
The unpalatable and warning-patterned butterflies _Heliconius erato_ and _Heliconius melpomene_ provide the best studied example of mutualistic Müllerian mimicry, thought – but rarely demonstrated – to promote coevolution. Some of the strongest available evidence for coevolution comes from phylogenetic codivergence, the parallel divergence of ecologically associated lineages. Early evolutionary reconstructions suggested codivergence between mimetic populations of _H. erato_ and _H. melpomene_, and this was initially hailed as the most striking known case of coevolution. However, subsequent molecular phylogenetic analyses found discrepancies in phylogenetic branching patterns and timing (topological and temporal incongruence) that argued against codivergence. We present the first explicit cophylogenetic test of codivergence between mimetic populations of _H. erato_ and _H. melpomene_, and re-examine the timing of these radiations. We find statistically significant topological congruence between multilocus coalescent population phylogenies of _H. erato_ and _H. melpomene_, supporting repeated codivergence of mimetic populations. Divergence time estimates, based on a Bayesian coalescent model, suggest that the evolutionary radiations of _H. erato_ and _H. melpomene_ occurred over the same time period, and are compatible with a series of temporally congruent codivergence events. This evidence supports a history of reciprocal coevolution between Müllerian co-mimics characterised by phylogenetic codivergence and parallel phenotypic change.
Evolution of infectious bronchitis virus in the field after homologous vaccination introduction
Despite the fact that vaccine resistance has typically been considered a rare phenomenon, episodes of vaccine failure have been reported with increasing frequency in intensively raised livestock. Infectious bronchitis virus (IBV) is a widespread avian coronavirus whose control relies mainly on extensive vaccine administration. Unfortunately, the continuous emergence of new variants that escape vaccine immunity prompts the development of new vaccines. In the present work, a molecular epidemiology study was performed to evaluate the potential role of homologous vaccination in driving IBV evolution. This was undertaken by assessing IBV RNA sequences from the ORF encoding the S1 portion of the viral surface glycoprotein (S) before and after the introduction of a new live vaccine on broiler farms in northern Italy. The results of several biostatistical analyses consistently demonstrate stronger selective pressure in the post-vaccination period. Natural selection was detected essentially at sites located on the protein surface, within or near domains involved in viral attachment or related functions. This evidence strongly supports the action of vaccine-induced immunity in shaping viral evolution, potentially leading to the emergence of new vaccine-escape variants. The great plasticity of rapidly evolving RNA viruses in response to human intervention, which extends beyond the poultry industry, demands further attention given its relevance for animal and, especially, human health.