    Identifying circumstances under which high insecticide dose increases or decreases resistance selection

    Insect management strategies for agricultural crop pests must reduce selection for insecticide-resistant mutants while providing effective control of the insect pest. One management strategy that has long been advocated is the application of insecticides at the maximum permitted dose. Under some circumstances, this has been found to prevent the resistance allele frequency from increasing. However, under different circumstances, the same approach may lead to rapid selection for resistance to the insecticide. To test when a high dose would be an effective resistance management strategy, we present a flexible deterministic model of a population of an insect pest of agricultural crops. The model includes several alternative life-history traits, including sexual or asexual reproduction, diploid or haplodiploid genetics, and a univoltine or multivoltine life cycle, so that the high-dose strategy can be tested for many different insect pests. Using this model, we aim to identify the key characteristics of pests that make either a high or a low dose of insecticide optimal for resistance management. Two outputs are explored: firstly, whether the frequency of the resistance allele increases over time or remains low indefinitely; and secondly, whether lowering the dose of insecticide applied reduces or increases the rate of selection for the resistance allele. It is demonstrated that, with high immigration, resistance can be suppressed. This suppression, however, is rarely lost if the insecticide dose is reduced, and is absent altogether when individuals move from the treated population back into an untreated population. Reducing the dose of insecticide often resulted in slower development of resistance, except where the population combined a high influx of less resistant individuals into the treated population, a recessive resistance gene and a high insecticide efficacy, in which case reducing the dose could result in faster selection for resistance.
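
    As a rough illustration of the kind of deterministic recursion such a model rests on (a sketch, not the published model; the dose-response form and every parameter below are assumptions), a single-locus, diploid, random-mating population with immigration can be iterated in a few lines of Python:

        # Minimal sketch of dose-dependent selection for a resistance allele,
        # with immigration of less-resistant individuals. Illustrative only.

        def survival(dose, resist, slope=4.0):
            """Assumed dose-response: survival falls with dose, shifted by resistance."""
            return 1.0 / (1.0 + (dose / resist) ** slope)

        def next_allele_freq(q, dose, h=0.0, resist_factor=10.0, m=0.1, q_imm=0.001):
            """One generation: insecticide selection, then immigration.
            q: resistance allele frequency; h: dominance of resistance (0 = recessive);
            m: immigration fraction; q_imm: allele frequency among immigrants."""
            # Genotype frequencies under Hardy-Weinberg proportions
            f_rr, f_rs, f_ss = q * q, 2 * q * (1 - q), (1 - q) ** 2
            # Heterozygote resistance interpolated by dominance h
            w_ss = survival(dose, 1.0)
            w_rr = survival(dose, resist_factor)
            w_rs = survival(dose, 1.0 + h * (resist_factor - 1.0))
            mean_w = f_rr * w_rr + f_rs * w_rs + f_ss * w_ss
            q_sel = (f_rr * w_rr + 0.5 * f_rs * w_rs) / mean_w
            # Immigration dilutes the treated population with less-resistant individuals
            return (1 - m) * q_sel + m * q_imm

        q = 0.001
        for _ in range(50):
            q = next_allele_freq(q, dose=1.0)  # maximum permitted dose, recessive gene
        print(f"resistance allele frequency after 50 generations: {q:.4f}")

    Varying the dose, the dominance h and the immigration fraction m in such a recursion poses the same qualitative question as the paper: whether a high or a low dose slows the increase of q.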

    Dose and number of applications that maximise fungicide effective life exemplified by Zymoseptoria tritici on wheat - a model analysis

    Two key decisions need to be taken about a fungicide treatment programme: (i) the number of applications that should be used per crop growing season, and (ii) the dose that should be used in each application. There are two opposing considerations: control efficacy is improved by a higher number of applications and a higher dose, whereas resistance management is improved by a lower number of applications and a lower dose. Resistance management aims to prolong the effective life of the fungicide, defined as the time between its introduction onto the market for use on the target pathogen and the moment when effective control is lost due to a build-up of fungicide resistance. Thus, the question is whether there are combinations of dose rate and number of applications that both provide effective control and lead to a longer effective life. In this paper, it is shown how a range of spray programmes can be compared and optimal programmes selected. This is explored for Zymoseptoria tritici on wheat and a quinone outside inhibitor (QoI) fungicide. For this pathogen-fungicide combination, a single treatment provided effective control under the simulated disease pressure, but only if the application timing was optimal and the dose was close to the maximum permitted. Programmes with three applications were generally not optimal, as they exerted too much selection for resistance. Two-application programmes balanced effective control with reasonable flexibility of dose and application timing, and low resistance selection, leading to long effective lives of the fungicide.
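
    To make the notion of effective life concrete, the following deliberately simplified Python sketch (not the paper's model; the exponential selection term, the efficacy constraint and all constants are invented) ranks spray programmes by the number of seasons before resistance erodes control:

        # Compare spray programmes (number of applications x dose) by an assumed
        # "effective life": seasons until the resistant strain dominates.
        import itertools
        import math

        def effective_life(n_sprays, dose, k_select=0.8, r0=1e-5, fail_at=0.5):
            """Seasons until the resistant-strain frequency exceeds fail_at."""
            r, years = r0, 0
            while r < fail_at and years < 50:
                ratio = r / (1 - r)
                for _ in range(n_sprays):
                    # each spray multiplies the resistant:sensitive ratio
                    ratio *= math.exp(k_select * dose)
                r = ratio / (1 + ratio)
                years += 1
            return years

        def controls_disease(n_sprays, dose):
            """Assumed efficacy constraint: total seasonal dose must reach a threshold."""
            return n_sprays * dose >= 1.0

        programmes = [(n, d / 10) for n, d in itertools.product([1, 2, 3], range(1, 11))]
        viable = [(n, d, effective_life(n, d)) for n, d in programmes if controls_disease(n, d)]
        for n, d, life in sorted(viable, key=lambda t: -t[2])[:5]:
            print(f"{n} spray(s) at dose {d:.1f}: effective life {life} seasons")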

    Tolerance of septoria leaf blotch in winter wheat

    For individual varieties, tolerance of septoria leaf blotch was quantified by the slope of the relationship between disease and yield. Variation in disease severity and the associated yield responses was provided by field experiments across two sites and three seasons. Slopes were fitted by residual maximum likelihood (REML) for two contrasting models: (i) a fixed-effects model, where no prior assumptions were made about the form of the variety slopes; and (ii) a random-effects model, where deviations of individual variety slopes from the mean variety slope formed a normal random population with unknown variance. The analyses gave broadly similar results, but with some significant differences; the random model was considered more reliable for predicting variety performance. The effects of disease were quantified as symptom area and green canopy duration. Models of the relationship between symptom area and yield were site-specific, and had poor predictive precision when site effects were not taken into account. Models based on canopy green area gave robust predictions of yield and were not site-specific. Differences in disease tolerance were detected in a comparison of 25 commercial winter wheat varieties. Tolerance was not detected directly through symptom measurements, but instead through measurements of canopy green area, which quantify the effects of disease in a way that accounts for differences in canopy size across sites and seasons. The varieties showing greatest tolerance tended to have lower attainable yield than the intolerant varieties. Presence of the 1BL/1RS chromosome translocation, which has been reported to increase radiation use efficiency, appeared to be associated with intolerance.
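
    The random-effects model can be sketched with the MixedLM class in Python's statsmodels, which fits by REML by default; the synthetic data, column names and effect sizes below are invented stand-ins for the trial data, not the study's analysis:

        # Random-slopes model: each variety's disease-yield slope deviates from the
        # mean slope as a draw from a normal population with unknown variance.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        rows = []
        for v in [f"variety_{i}" for i in range(25)]:
            slope = -0.05 + rng.normal(0, 0.01)      # variety-specific tolerance slope
            for _ in range(12):                      # plots across sites and seasons
                disease = rng.uniform(0, 40)         # % symptom area
                rows.append((v, disease, 9.0 + slope * disease + rng.normal(0, 0.3)))
        data = pd.DataFrame(rows, columns=["variety", "disease", "grain_yield"])

        model = smf.mixedlm("grain_yield ~ disease", data,
                            groups=data["variety"],  # varieties define the random grouping
                            re_formula="~disease")   # random intercept and slope per variety
        fit = model.fit(reml=True)
        print(fit.summary())

        # Per-variety slope predictions rank varieties by tolerance: a more
        # negative total slope means more yield lost per unit of disease.
        for variety, effects in fit.random_effects.items():
            print(variety, fit.fe_params["disease"] + effects["disease"])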

    Fluorescent Imaging of Antigen Released by a Skin-Invading Helminth Reveals Differential Uptake and Activation Profiles by Antigen Presenting Cells

    Infection of the mammalian host by the parasitic helminth Schistosoma mansoni is accompanied by the release of excretory/secretory (ES) molecules from cercariae, which aid penetration of the skin. These ES molecules are potent stimulants of innate immune cells, leading to activation of acquired immunity. At present, however, it is not known which cells take up parasite antigen, nor its intracellular fate. Here, we develop a technique to label live infectious cercariae which permits the imaging of released antigens into macrophages (MΦ) and dendritic cells (DCs) both in vitro and in vivo. The amine-reactive tracer CFDA-SE was used to efficiently label the acetabular gland contents of cercariae, which are released upon skin penetration. These ES products, termed ‘0-3hRP’, were phagocytosed by MHC-II+ cells in a Ca²⁺- and actin-dependent manner. Imaging of a labelled cercaria as it penetrates the host skin over 2 hours reveals the progressive release of ES material. Recovery of cells from the skin shows that CFDA-SE-labelled ES was initially (3 h) taken up by Gr1+MHC-II− neutrophils, followed (24 h) by skin-derived F4/80+MHC-IIlo MΦ and CD11c+MHC-IIhi DC. Subsequently (48 h), MΦ and DC positive for CFDA-SE were detected in the skin-draining lymph nodes, reflecting the time taken for antigen-laden cells to reach sites of immune priming. Comparison of in vitro-derived MΦ and DC revealed that MΦ were slower to process 0-3hRP, released higher quantities of IL-10, and expressed a greater quantity of arginase-1 transcript. Combined, our observations on the differential uptake of cercarial ES by MΦ and DC suggest the development of a dynamic but ultimately balanced response that can potentially be pushed towards immune priming (via DC) or immune regulation (via MΦ).

    The distribution of chlorine and iodine in soil in the vicinity of lead mining and smelting operations, Bixby area, S.E. Missouri, U.S.A.

    Iodine and Cl are enriched in soils in the vicinity of the Magmont and Buick lead mines near Bixby, southeastern Missouri. The enrichments, up to 5.6 ppm I and 305 ppm Cl, are set against regional backgrounds of 1.26 ppm I and 41 ppm Cl. The area of highest I and Cl is thought to reflect a zone of base metal sulphide mineralization occurring about 400 m below the surface. Iodine and Cl are also enriched in soils immediately adjacent to a tailings pond; hence these elements appear to be leached from this source. A zone of enhanced I values (up to 2.65 ppm I) to the north of a lead smelter is superimposed on a much larger zone of lead enrichment (up to 12,000 ppm Pb) and is thought to represent I released from sulphide ores on smelting.

    The Mannose Receptor (CD206) is an important pattern recognition receptor (PRR) in the detection of the infective stage of the helminth Schistosoma mansoni and modulates IFNγ production

    In this study, infective larvae of the parasitic helminth Schistosoma mansoni were shown to contain a large number of glycosylated components specific for the Mannose Receptor (MR; CD206), an important pattern recognition receptor (PRR) of the innate immune system. MR ligands were particularly rich in excretory/secretory (E/S) material released during the transformation of cercariae into schistosomula, a process critical for infection of the host. E/S material from carboxyfluorescein diacetate succinimidyl ester (CFDA-SE)-labelled cercariae showed enhanced binding by cell lines that over-express the MR. Conversely, uptake was significantly lower by bone marrow-derived macrophages (MΦ) from MR(-/-) mice, although these cells were more activated, as judged by enhanced pro-inflammatory cytokine production and CD40 expression. After natural percutaneous infection of MR(-/-) mice with CFDA-SE-labelled parasites, there were fewer CFDA-SE(+) cells in the skin and draining lymph nodes compared with wild-type mice, implying reduced uptake and presentation of larval parasite antigen. However, antigen-specific proliferation of skin-draining lymph node cells was significantly enhanced, and they secreted markedly elevated levels of IFNγ but decreased levels of IL-4. In conclusion, we show that the MR on mononuclear phagocytic cells, which are plentiful in the skin, plays a significant role in internalising E/S material released by the invasive stages of the parasite, which in turn modulates their production of pro-inflammatory cytokines. In the absence of the MR, antigen-specific CD4(+) cells are Th1-biased, suggesting that ligation of the MR by glycosylated E/S material released by schistosome larvae modulates the production of CD4(+) cell-specific IFNγ.

    Maximizing realized yield by breeding for disease tolerance: A case study for Septoria tritici blotch

    Disease-tolerant cultivars maintain yield in the presence of disease. When disease intensity is high, they can improve a grower's net return compared with less tolerant cultivars. Many authors report a trade-off, whereby higher fully protected yields are correlated with lower disease tolerance. We analyse a question facing breeders: to what extent should they breed for tolerance when it compromises maximizing fully protected yield? Field trials with 147 progeny from five parental crosses of wheat were used to measure yield and tolerance under a range of disease intensities from Septoria tritici blotch (STB; causal organism Zymoseptoria tritici) across a range of sites and seasons. The data define the variation in these traits from which breeders can select. A simple data-driven descriptive model was used to calculate the combination of tolerance and fully protected yield that maximizes actual yield for any given level of disease, quantified by loss of healthy canopy area duration (HAD-loss). This model was combined with data on the year-to-year variability of HAD-loss in the UK to calculate the tolerance and fully protected yield that maximize the mean actual yield. We found that, even when an effective fungicide treatment programme is applied, breeding for tolerance increases the mean actual yield. Some commercially available cultivars were found to have a level of tolerance that leads to yields close to the maximum achievable in the presence of disease; others had lower tolerance, leading to suboptimal yields.
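
    A toy calculation along these lines (the linear loss model, the convex tolerance/yield trade-off and the HAD-loss distribution below are all assumptions, not the paper's fitted values) illustrates how an intermediate level of tolerance can maximize mean actual yield:

        # Choose the point on an assumed tolerance / fully-protected-yield
        # trade-off that maximizes mean actual yield over year-to-year disease.
        import numpy as np

        rng = np.random.default_rng(0)
        had_loss = rng.gamma(shape=2.0, scale=0.5, size=10_000)  # assumed yearly HAD-loss

        slopes = np.linspace(0.2, 1.0, 81)            # yield lost per unit HAD-loss
        protected = 10.0 - 2.0 * (1.0 - slopes) ** 2  # t/ha; tolerance costs protected yield

        # Actual yield in a year = fully protected yield - slope * HAD-loss
        mean_yield = [np.mean(p - s * had_loss) for p, s in zip(protected, slopes)]
        best = int(np.argmax(mean_yield))
        print(f"optimal slope {slopes[best]:.2f}, protected yield {protected[best]:.2f} t/ha, "
              f"mean actual yield {mean_yield[best]:.2f} t/ha")

    With these assumed numbers the optimum sits in the interior of the slope range: full tolerance costs too much protected yield, and no tolerance loses too much to disease in bad years.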

    Optimal fungicide application timings for disease control are also an effective anti-resistance strategy: a case study for Zymoseptoria tritici (Mycosphaerella graminicola) on wheat

    Strategies to slow fungicide resistance evolution often advocate early “prophylactic” fungicide application and avoidance of “curative” treatments where possible, but there is little evidence to support such guidance. Fungicide applications are usually timed to maximize the efficiency of disease control during the yield-forming period. This article reports mathematical modeling that explores whether earlier timings might be more beneficial for fungicide resistance management than the timings that are optimal for efficacy. There are two key timings for fungicide treatment of winter wheat in the United Kingdom: full emergence of leaf three (counting down the canopy) and full emergence of the flag leaf (leaf one). These timings (referred to as T1 and T2, respectively) maximize disease control on the upper leaves of the crop canopy, which are crucial to yield. A differential equation model was developed to track the dynamics of leaf emergence and senescence, epidemic growth, fungicide efficacy, and selection for a resistant strain; the model represented Zymoseptoria tritici on wheat treated twice at varying spray timings. At all fungicide doses tested, moving one or both of the two sprays earlier than the normal T1 and T2 timings reduced selection but also reduced efficacy. Despite these opposing effects, at a fungicide dose just sufficient to obtain effective control, the T1 and T2 timings optimized fungicide effective life (the number of years for which effective control can be maintained). At a higher dose, earlier spray timings maximized effective life but caused some reduction in efficacy, whereas the T1 and T2 timings maximized efficacy but resulted in an effective life one year shorter than the maximum achievable.
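
    The structure of such a model can be sketched as a small ODE system (a simplified stand-in for the published model; the logistic infection term, exponential fungicide decay and all rates and timings below are assumptions):

        # Two pathogen strains on a host canopy; a decaying fungicide suppresses
        # only the sensitive strain. Moving spray_times earlier reduces selection
        # (smaller resistant fraction) but can weaken control of the upper leaves.
        import numpy as np
        from scipy.integrate import solve_ivp

        beta, decay, eff = 0.1, 0.02, 0.9       # infection rate, fungicide decay, max efficacy
        spray_times, dose = [60.0, 90.0], 1.0   # stand-ins for the T1/T2 timings (days)

        def fungicide(t):
            """Dose remaining from each spray, decaying exponentially after application."""
            return sum(dose * np.exp(-decay * (t - ts)) for ts in spray_times if t >= ts)

        def rhs(t, y):
            s_inf, r_inf = y                    # sensitive and resistant infected area
            healthy = max(0.0, 1.0 - s_inf - r_inf)
            control = 1.0 - eff * min(1.0, fungicide(t))  # fungicide hits sensitive only
            return [beta * control * s_inf * healthy, beta * r_inf * healthy]

        sol = solve_ivp(rhs, (0.0, 150.0), [1e-2, 1e-5], max_step=1.0)
        s_end, r_end = sol.y[:, -1]
        print(f"end-of-season severity {s_end + r_end:.3f}, "
              f"resistant fraction {r_end / (s_end + r_end):.3%}")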