    Identifying circumstances under which high insecticide dose increases or decreases resistance selection

    Insect management strategies for agricultural crop pests must reduce selection for insecticide-resistant mutants while providing effective control of the insect pest. One management strategy that has long been advocated is the application of insecticides at the maximum permitted dose. Under some circumstances, this has been found to prevent the resistance allele frequency from increasing. However, under different circumstances, this approach may lead to rapid selection for resistance to the insecticide. To test when a high dose would be an effective resistance management strategy, we present a flexible deterministic model of a population of an insect pest of agricultural crops. The model includes several possible life-history traits, including sexual or asexual reproduction, diploid or haplodiploid genetics, and a univoltine or multivoltine life cycle, so that the high-dose strategy can be tested for many different insect pests. Using this model, we aim to identify the key characteristics of pests that make either a high dose or a low dose of insecticide optimal for resistance management. Two outputs are explored: firstly, whether the frequency of the resistance allele increases over time or remains low indefinitely; and secondly, whether lowering the dose of insecticide applied reduces or increases the rate of selection for the resistance allele. It is demonstrated that, with high immigration, resistance can be suppressed. This suppression, however, is rarely lost if the insecticide dose is reduced, and is absent altogether when individuals move from the treated population back into an untreated population. Reducing the dose of insecticide often resulted in slower development of resistance, except where the population combined a high influx of less resistant individuals into the treated population, a recessive resistance gene, and high efficacy, in which case reducing the dose of insecticide could result in faster selection for resistance.
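
    The kind of per-generation calculation such a deterministic model performs can be sketched as follows. This is an illustrative one-locus diploid example with random mating and immigration of susceptible individuals, not the paper's model, and all names and parameter values (genotype survival rates, immigration fraction) are assumptions.

```python
def next_allele_freq(p, survival, immigration):
    """One generation of selection on the resistance allele frequency p.

    survival:    dict of survival under the insecticide for genotypes
                 'RR', 'RS' and 'SS' (dose enters via these values; a
                 high dose on a recessive gene makes 'RS' survival low).
    immigration: fraction of the post-selection population replaced by
                 fully susceptible immigrants (allele frequency 0).
    """
    q = 1.0 - p
    # Hardy-Weinberg genotype frequencies before treatment
    f_rr, f_rs, f_ss = p * p, 2.0 * p * q, q * q
    # mean survival (normalising constant) after insecticide mortality
    w = f_rr * survival['RR'] + f_rs * survival['RS'] + f_ss * survival['SS']
    # frequency of the R allele among survivors
    p_after = (f_rr * survival['RR'] + 0.5 * f_rs * survival['RS']) / w
    # dilution by susceptible immigrants
    return (1.0 - immigration) * p_after

# recessive resistance under a high dose: heterozygotes are killed, so a
# large influx of susceptible immigrants can suppress the allele
p = 1e-3
for _ in range(20):
    p = next_allele_freq(p, {'RR': 0.9, 'RS': 0.05, 'SS': 0.05},
                         immigration=0.3)
```

    With `immigration=0.0` the same survival values let the allele frequency creep upward, illustrating how the balance between selection and immigration decides whether resistance is suppressed or increases.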

    Dose and number of applications that maximise fungicide effective life exemplified by Zymoseptoria tritici on wheat - a model analysis

    Two key decisions that need to be taken about a fungicide treatment programme are (i) the number of applications that should be used per crop growing season, and (ii) the dosage that should be used in each application. There are two opposing considerations: control efficacy is improved by a higher number of applications and a higher dose, whereas resistance management is improved by a lower number of applications and a lower dose. Resistance management aims to prolong the effective life of the fungicide, defined as the time between its introduction onto the market for use on the target pathogen and the moment when effective control is lost due to a build-up of fungicide resistance. Thus, the question is whether there are optimal combinations of dose rate and number of applications that both provide effective control and lead to a longer effective life. In this paper, it is shown how a range of spray programmes can be compared and optimal programmes selected. This is explored for Zymoseptoria tritici on wheat and a quinone outside inhibitor (QoI) fungicide. For this pathogen-fungicide combination, a single treatment provided effective control under the simulated disease pressure, but only if the application timing was optimal and the dose was close to the maximum permitted. Programmes with three applications were generally not optimal, as they exerted too much selection for resistance. Two-application fungicide programmes balanced effective control with reasonable flexibility of dose and application timing, and low resistance selection, leading to long effective lives of the fungicide.
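
    The trade-off can be made concrete with a toy calculation. This is a minimal sketch assuming each application multiplies the resistant:sensitive ratio by a dose-dependent selection factor; the function name, the exponential selection rule, and every parameter value are illustrative assumptions, not the paper's model.

```python
import math

def effective_life(n_sprays, dose, s_per_unit_dose=0.8, freq0=1e-4,
                   failure_freq=0.5):
    """Number of seasons of effective control before the resistant
    fraction exceeds failure_freq, assuming each spray multiplies the
    resistant:sensitive ratio by exp(s_per_unit_dose * dose)."""
    freq = freq0
    seasons = 0
    while freq < failure_freq and seasons < 100:
        ratio = freq / (1.0 - freq)
        # one season's sprays; stronger selection at higher dose
        ratio *= math.exp(n_sprays * s_per_unit_dose * dose)
        freq = ratio / (1.0 + ratio)
        seasons += 1
    return seasons

# fewer sprays at lower dose select more slowly, so effective life is longer
life_two_half = effective_life(2, 0.5)    # two sprays at half dose
life_three_full = effective_life(3, 1.0)  # three sprays at full dose
```

    A real analysis must also check that each programme controls disease well enough in every season, which is the other half of the optimization described above.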

    Tolerance of septoria leaf blotch in winter wheat

    For individual varieties, tolerance of septoria leaf blotch was quantified by the slope of the relationship between disease and yield. Variation in disease severity and the associated yield responses were provided across two sites and three seasons of field experiments. Slopes were fitted by residual maximum likelihood for two contrasting models: (i) a fixed-effects model, where no prior assumptions were made about the form of the variety slopes; and (ii) a random-effects model, where deviations in individual variety slopes away from the mean variety slope formed a normal random population with unknown variance. The analyses gave broadly similar results, but with some significant differences. The random model was considered more reliable for predicting variety performance. The effects of disease were quantified as symptom area and green canopy duration. Models of the relationship between symptom area and yield were site-specific. When site effects were not taken into account, these models had poor predictive precision. Models based on the canopy green area gave robust predictions of yield and were not site-specific. Differences in disease tolerance were detected in a comparison of 25 commercial winter wheat varieties. Tolerance was not detected directly through symptom measurements, but instead through measurements of canopy green area, which provide a measurement of the effects of disease that accounts for differences in canopy size across sites and seasons. The varieties showing greatest tolerance tended to have lower attainable yield than the intolerant varieties. The presence of the 1BL/1RS chromosome translocation, which has been reported to increase radiation use efficiency, appeared to be associated with intolerance.
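
    In its simplest form, the fixed-effects idea in (i) reduces to fitting each variety its own disease-yield slope. The sketch below uses ordinary least squares with invented numbers; the paper fits slopes by residual maximum likelihood, which this deliberately does not attempt to reproduce.

```python
def ols_slope(x, y):
    """Ordinary least squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

# hypothetical disease severity (%) and yield (t/ha) for one variety
disease = [0, 10, 20, 30]
yield_t = [10.0, 9.5, 9.0, 8.5]
slope = ols_slope(disease, yield_t)
# slope is negative (yield falls with disease); a shallower slope,
# i.e. one closer to zero, indicates a more tolerant variety
```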

    Optimal fungicide application timings for disease control are also an effective anti-resistance strategy: a case study for Zymoseptoria tritici (Mycosphaerella graminicola) on wheat

    Strategies to slow fungicide resistance evolution often advocate early “prophylactic” fungicide application and avoidance of “curative” treatments where possible. There is little evidence to support such guidance. Fungicide applications are usually timed to maximize the efficiency of disease control during the yield-forming period. This article reports mathematical modeling to explore whether earlier timings might be more beneficial for fungicide resistance management compared with the timings that are optimal for efficacy. There are two key timings for fungicide treatment of winter wheat in the United Kingdom: full emergence of leaf three (counting down the canopy) and full emergence of the flag leaf (leaf 1). These timings (referred to as T1 and T2, respectively) maximize disease control on the upper leaves of the crop canopy that are crucial to yield. A differential equation model was developed to track the dynamics of leaf emergence and senescence, epidemic growth, fungicide efficacy, and selection for a resistant strain. The model represented Zymoseptoria tritici on wheat treated twice at varying spray timings. At all fungicide doses tested, moving one or both of the two sprays earlier than the normal T1 and T2 timings reduced selection but also reduced efficacy. Despite these opposing effects, at a fungicide dose just sufficient to obtain effective control, the T1 and T2 timings optimized fungicide effective life (the number of years that effective control can be maintained). At a higher dose, earlier spray timings maximized effective life but caused some reduction in efficacy, whereas the T1 and T2 timings maximized efficacy but resulted in an effective life 1 year shorter than the maximum achievable.

    Maximizing realized yield by breeding for disease tolerance: A case study for Septoria tritici blotch

    Disease-tolerant cultivars maintain yield in the presence of disease. When disease intensity is high, they can improve a grower's net return compared to less tolerant cultivars. Many authors report a trade-off, whereby higher fully protected yields are correlated with a lower disease tolerance. We analyse the question for breeders: to what extent should they breed for tolerance when it compromises maximizing fully protected yield? Field trials with 147 progeny from five parental crosses of wheat were used to measure yield and tolerance under a range of disease intensities from Septoria tritici blotch (STB; causal organism Zymoseptoria tritici) at a range of sites and seasons. The data define the variation for these traits from which breeders can select. A simple data-driven descriptive model was used to calculate the combination of tolerance and fully protected yield that maximizes actual yield for any given level of disease, quantified by loss of healthy canopy area duration (HAD-loss). This model was combined with data on the year-to-year variability of HAD-loss in the UK to calculate the tolerance and fully protected yield that maximize the mean actual yield. We found that even when an effective fungicide treatment programme is applied, breeding for tolerance increases the mean actual yield. Some commercially available cultivars were found to have a level of tolerance that leads to yields close to the maximum yield in the presence of disease; others had a lower tolerance, leading to suboptimal yields.
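
    The optimization described above can be sketched with a toy linear model: actual yield is fully protected yield minus yield lost per unit HAD-loss, and buying more tolerance costs some fully protected yield. The linear trade-off, the tolerance scale, and all numbers below are illustrative assumptions, not the paper's fitted model.

```python
def mean_actual_yield(tolerance, had_losses, max_protected=10.0,
                      tradeoff=2.0):
    """Mean actual yield over seasons, for a cultivar with the given
    tolerance (0 = intolerant, 1 = no yield lost to disease).

    tradeoff is the fully protected yield (t/ha) given up per unit of
    tolerance; had_losses lists season-by-season HAD-loss values.
    """
    protected = max_protected - tradeoff * tolerance
    loss_rate = 1.0 - tolerance  # yield lost per unit HAD-loss
    yields = [protected - loss_rate * h for h in had_losses]
    return sum(yields) / len(yields)

# under high disease pressure, buying tolerance pays despite the trade-off
had_high = [0.5, 1.0, 3.0, 4.5]
best_high = max((t / 100 for t in range(101)),
                key=lambda t: mean_actual_yield(t, had_high))
# under low pressure the optimum shifts towards maximizing protected yield
had_low = [0.5, 1.0]
best_low = max((t / 100 for t in range(101)),
               key=lambda t: mean_actual_yield(t, had_low))
```

    The optimum depends on whether the mean HAD-loss exceeds the yield cost of tolerance, which is the same comparison the year-to-year variability data feed into above.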

    Extending the durability of cultivar resistance by limiting epidemic growth rates

    Cultivar resistance is an essential part of disease control programmes in many agricultural systems. The use of resistant cultivars applies a selection pressure on pathogen populations for the evolution of virulence, resulting in loss of disease control. Various techniques for the deployment of host resistance genes have been proposed to reduce the selection for virulence, but these are often difficult to apply in practice. We present a general technique to maintain the effectiveness of cultivar resistance, derived from classical population genetics theory: any factor that reduces the population growth rates of both the virulent and avirulent strains will reduce selection. We model the specific example of fungicide application to reduce the growth rates of virulent and avirulent strains of a pathogen, demonstrating that appropriate use of fungicides reduces selection for virulence, prolonging the effectiveness of cultivar resistance. This specific example of chemical control illustrates a general principle for the development of techniques to manage the evolution of virulence by slowing epidemic growth rates.
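
    The population-genetic principle can be stated in one line: under exponential growth, the virulent:avirulent ratio changes like exp((r_v - r_a) * t), so any measure that scales down both growth rates also scales down the selection exponent. A minimal sketch with invented growth rates (not values from the paper):

```python
import math

def freq_virulent(freq0, r_v, r_a, t):
    """Virulent-strain frequency after time t, given exponential growth
    at rates r_v (virulent) and r_a (avirulent)."""
    ratio = freq0 / (1.0 - freq0) * math.exp((r_v - r_a) * t)
    return ratio / (1.0 + ratio)

# untreated epidemic vs both growth rates halved (e.g. by a fungicide):
# halving both rates halves the difference, and hence the selection
f_untreated = freq_virulent(1e-3, 0.12, 0.08, 100.0)
f_treated = freq_virulent(1e-3, 0.06, 0.04, 100.0)
```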

    Derivation and testing of a model to predict selection for fungicide resistance

    A mathematical model was derived to predict selection for fungicide resistance in foliar pathogens of cereal crops. The model was tested against independent data from four field experiments quantifying selection for the G143A mutation conferring resistance to a quinone outside inhibitor (QoI) fungicide in powdery mildew (Blumeria graminis f.sp. hordei) on spring barley (Hordeum vulgare). Fungicide treatments with azoxystrobin differed in the total applied dose and spray number. For each treatment, we calculated the observed selection ratio as the ratio of the frequency of the resistant strain after the last spray to that before the first spray. The model accurately predicted the variation in observed selection ratios with total applied fungicide dose and number of sprays for three of the four experiments. Underprediction of selection ratios in one experiment was attributed to the particularly late epidemic onset in that experiment. When the equation representing epidemic development was modified to account for the late epidemic, predicted and observed selection ratios at that site were in close agreement. On a scatter plot of observed selection ratios against predicted selection ratios, for all four experiments, the 1:1 line explained 89-92% of the variance in the mean of observed selection ratios. To our knowledge, this is the first fungicide resistance model for plant pathogens to be rigorously tested against field data. The model can be used, with some degree of confidence, to identify anti-resistance treatment strategies which are likely to be effective and would justify the resources required for experimental testing.
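
    The observed selection ratio defined above, written out as code; the frequencies below are hypothetical, not values from the field experiments.

```python
def selection_ratio(freq_before_first_spray, freq_after_last_spray):
    """Ratio of resistant-strain frequency after the last spray to that
    before the first spray; values above 1 mean the treatment selected
    for the resistant strain."""
    return freq_after_last_spray / freq_before_first_spray

# e.g. the G143A frequency rising from 2% to 8% over a spray programme
sr = selection_ratio(0.02, 0.08)
```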