    Developing parasite control strategies in organic systems

    This report was presented at the UK Organic Research 2002 Conference. Organic farmers have taken a lead in attempting to reduce dependence on pharmaceutical control of parasites in farmed livestock. Focussing on management and nutrition, the objective of this research is to further develop control strategies that can support and increase the flexibility of clean grazing systems for sheep and cattle. The approach has been to combine on-farm epidemiological studies with replicated experiments, in order to develop and demonstrate better systems of control applicable to UK organic farms. Preliminary data from the first year's epidemiological studies are presented in this paper.

    Modelling the impacts of pasture contamination and stocking rate for the development of targeted selective treatment strategies for Ostertagia ostertagi infection in calves

    A simulation study was carried out to assess whether variation in pasture contamination or stocking rate impacts upon the optimal design of targeted selective treatment (TST) strategies. Two methods of TST implementation were considered: 1) treatment of a fixed percentage of a herd according to a given phenotypic trait, or 2) treatment of individuals that exceeded a threshold value for a given phenotypic trait. Four phenotypic traits on which to base treatment were considered: 1) average daily bodyweight gain, 2) faecal egg count, 3) plasma pepsinogen, or 4) random selection. Each implementation method (fixed percentage or threshold treatment) and determinant criterion (phenotypic trait) was assessed in terms of benefit per R (BPR), the ratio of average benefit in weight gain to change in frequency of resistance alleles R (relative to an untreated population). The impact of pasture contamination on optimal TST strategy design was investigated by setting the initial pasture contamination to 100, 200 or 500 O. ostertagi L3/kg DM herbage; stocking rate was investigated at low (3 calves/ha), conventional (5 calves/ha) and high (7 calves/ha) levels. When treating a fixed percentage of the herd, treatments according to plasma pepsinogen or random selection were identified as the most beneficial (i.e. resulted in the greatest BPR) for all levels of initial pasture contamination and all stocking rates. Conversely, when treatments were administered according to threshold values, average daily gain (ADG) was most beneficial, and was identified as the best TST strategy (i.e. resulted in the greatest overall BPR) for all levels of initial pasture contamination and all stocking rates.
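The two implementation methods and the BPR criterion described above can be sketched in a few lines. This is a minimal illustration only; the function names and data are hypothetical, not taken from the published model:

```python
import numpy as np

def select_fixed_percentage(trait, pct, lowest=True):
    """Treat a fixed percentage of the herd, ranked on a phenotypic trait."""
    n_treat = int(round(len(trait) * pct / 100.0))
    order = np.argsort(trait)            # ascending: lowest values first
    if not lowest:
        order = order[::-1]              # highest values first (e.g. FEC)
    return order[:n_treat]               # indices of animals to treat

def select_threshold(trait, threshold, below=True):
    """Treat individuals whose trait value crosses a threshold."""
    mask = trait < threshold if below else trait > threshold
    return np.flatnonzero(mask)

def benefit_per_r(gain_tst, gain_untreated, r_tst, r_untreated):
    """BPR: average weight-gain benefit per unit change in resistance-allele
    frequency R, both measured relative to an untreated population."""
    return (gain_tst - gain_untreated) / (r_tst - r_untreated)
```

For example, with ADG values `[2.0, 8.0, 5.0, 1.0]`, treating the lowest 50% selects animals 3 and 0, while a threshold of 4.0 selects animals 0 and 3.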

    Modelling the consequences of targeted selective treatment strategies on performance and emergence of anthelmintic resistance amongst grazing calves

    The development of anthelmintic resistance by helminths can be slowed by maintaining refugia on pasture or in untreated hosts. Targeted selective treatments (TST) may achieve this by treating only the individuals that would benefit most from anthelmintic, according to certain criteria. However, the consequences of TST for cattle are uncertain, mainly because alternative strategies are difficult to compare. We developed a mathematical model to compare: 1) the most ‘beneficial’ indicator for treatment selection and 2) the method of selection of calves exposed to Ostertagia ostertagi, i.e. treating a fixed percentage of the population with the lowest (or highest) indicator values versus treating individuals who exceed (or fall below) a given indicator threshold. The indicators evaluated were average daily gain (ADG), faecal egg counts (FEC), plasma pepsinogen, and combined FEC and plasma pepsinogen, versus random selection of individuals. Treatment success was assessed in terms of benefit per R (BPR), the ratio of average benefit in weight gain to change in frequency of resistance alleles R (relative to an untreated population). The optimal indicator in terms of BPR for fixed percentages of calves treated was plasma pepsinogen and the worst was ADG; in the latter case treatment was applied to some individuals who were not in need of treatment. The reverse was found when calves were treated according to threshold criteria, with ADG being the best target indicator for treatment. This was also the most beneficial strategy overall, with a significantly higher BPR value than any other strategy, but its degree of success depended on the chosen threshold of the indicator. The study shows strong support for TST, with all selective strategies showing improvements compared with whole-herd treatment at 3, 8 and 13 weeks post-turnout. The developed model appeared capable of assessing the consequences of other TST strategies on calf populations.

    A stochastic model to investigate the effects of control strategies on calves exposed to Ostertagia ostertagi

    Predicting the effectiveness of parasite control strategies requires accounting for the responses of individual hosts and the epidemiology of parasite supra- and infra-populations. The first objective was to develop a stochastic model that predicted the parasitological interactions within a group of first season grazing calves challenged by Ostertagia ostertagi, by considering phenotypic variation amongst the calves and variation in parasite infra-population. Model behaviour was assessed using variations in parasite supra-population and calf stocking rate. The model showed the initial pasture infection level to have little impact on parasitological output traits, such as worm burdens and faecal egg counts (FEC), or on the overall performance of calves, whereas increasing stocking rate had a disproportionately large effect on both parasitological and performance traits. Model predictions were compared with published data taken from experiments on common control strategies, such as reducing stocking rates, the ‘dose and move’ strategy and strategic treatment with anthelmintic at specific times. Model predictions showed reasonable agreement with observations in most cases, supporting model robustness. The stochastic model developed is flexible, with the potential to predict the consequences of other nematode control strategies, such as targeted selective treatments, on groups of grazing calves.
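The core loop of such a stochastic group model can be sketched as follows: calves ingest L3 larvae from pasture, between-calf variation enters through a lognormal susceptibility term, and pasture is re-contaminated by faecal egg output in proportion to stocking rate. All parameter values below are illustrative toy numbers, not those of the published model:

```python
import numpy as np

def simulate_grazing(n_calves=20, days=180, l3_init=200.0, stocking=5, seed=0):
    """Toy stochastic sketch of group infection dynamics with O. ostertagi.

    Returns per-calf worm burdens and final pasture contamination
    (L3/kg DM herbage). Illustrative parameters only."""
    rng = np.random.default_rng(seed)
    susceptibility = rng.lognormal(0.0, 0.5, n_calves)    # between-calf variation
    worms = np.zeros(n_calves)
    l3 = l3_init                                          # L3/kg DM herbage
    for _ in range(days):
        ingested = rng.poisson(l3 * 5.0, n_calves)        # ~5 kg DM eaten per day
        worms = 0.97 * worms + 0.05 * ingested * susceptibility
        eggs_per_ha = worms.sum() * 10.0 * stocking       # crude fecundity x density
        l3 = min(0.95 * l3 + 2e-5 * eggs_per_ha, 5000.0)  # capped recontamination
    return worms, l3
```

The cap on pasture contamination stands in for the density-dependent larval mortality a full model would represent explicitly; the point of the sketch is only the structure (individual variation feeding back into the shared supra-population).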

    Growth, feed intake and diet selection in pigs: theory and experiments

    A theory of growth and feed intake in the pig is proposed and the results of five experiments to test it are reported here. An attempt is first made to describe the potential growth of pigs, that is, growth under non-limiting conditions; the conditions needed to allow potential growth to be attained are then considered. Two ways of providing non-limiting feeding conditions are discussed: a single balanced feed and a set of feeds given as a choice. In addition, a model which predicts the voluntary feed intake of pigs is developed and tested in experiments. The results from pigs offered single feeds in the first two experiments were consistent with the predictions of the model, namely that the rate of feed intake would increase as the protein content of the feeds was decreased; the size of the increase depended on the ability of the pig to lose heat. When pigs in these experiments were offered a pair of feeds as a choice, a combination of which was non-limiting, the results suggested that this method cannot be relied upon to attain the potential growth of pigs. The diet selection results were characterised by considerable variation in the diets selected by individual pigs, and only some pigs achieved what was estimated to be their potential rate of growth. It was suggested that pigs which failed to select a non-limiting diet did not have the necessary chance to choose. Experiment 3 evaluated a simple method of ensuring that pigs are given both the necessary choice and the chance to choose. This was achieved by giving them the opportunity to sample the single feeds, which were later to be offered as a choice, alone on alternate days for a short period of six days. Subsequently, pigs given a choice between two feeds were able to select a non-limiting diet. Experiment 4 incorporated the method established previously and consisted of a rigorous investigation into the rules of diet selection. It was concluded that pigs are able to avoid excess intake of a nutrient, in this case protein, or to select the best possible diet in less favourable conditions, i.e. a choice between two limiting feeds. The last experiment consisted of an extended test of the theory that a pig will select a diet which reflects its degree of maturity, state and sex. Pigs made fat and delayed in growth in one period were subsequently given the opportunity to recover on a pair of feeds offered as a choice. The diets selected by the fat pigs satisfied their requirements for compensatory protein gain while allowing only a slow rate of lipid gain. In addition, they met the different growth and fattening requirements of the two sexes. All these findings are discussed in relation to the use of choice-feeding as an independent test of other estimates of resource requirements, as a feeding technique when the potential growth of pigs is to be observed, and as an aid to predicting feeding behaviour in pigs.
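In its simplest form, the idea that a pig can compose a non-limiting diet from a choice of two feeds reduces to linear mixing: the requirement must lie between the two feeds' nutrient contents. A hypothetical sketch, with illustrative crude-protein values:

```python
def choice_proportion(p_low, p_high, p_required):
    """Fraction of the high-protein feed an animal must select so that the
    mixed diet meets its protein requirement (simple linear mixing).
    Returns None if the requirement lies outside the feeds' range,
    i.e. both feeds are limiting and no choice can be non-limiting."""
    if not (p_low <= p_required <= p_high):
        return None
    return (p_required - p_low) / (p_high - p_low)

# e.g. feeds with 120 and 240 g CP/kg and a requirement of 180 g CP/kg
frac_high = choice_proportion(120.0, 240.0, 180.0)   # 0.5: equal parts of each
```

This also makes the 'choice between two limiting feeds' case above concrete: if the requirement exceeds both feeds' protein contents, no mixture can be non-limiting, and the pig can only select the least bad diet.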

    Automated Classification for Visual-Only Post-Mortem Inspection of Porcine Pathology


    The first step towards genetic selection for host tolerance to infectious pathogens: Obtaining the tolerance phenotype through group estimates

    Reliable phenotypes are paramount for meaningful quantification of genetic variation and for estimating individual breeding values on which genetic selection is based. In this paper we assert that genetic improvement of host tolerance to disease, although desirable, may first of all be handicapped by the difficulty of obtaining unbiased tolerance estimates at the phenotypic level. In contrast to resistance, which can be inferred from appropriate measures of within-host pathogen burden, tolerance is more difficult to quantify, as it refers to the change in performance with respect to changes in pathogen burden. For this reason, tolerance phenotypes have only been specified at the level of a group of individuals, where such phenotypes can be estimated using regression analysis. However, few studies have addressed the potential bias in these estimates resulting from confounding effects between resistance and tolerance. Using a simulation approach, we demonstrate (i) how these group tolerance estimates depend on within-group variation and co-variation in resistance, tolerance and vigour (performance in a pathogen-free environment); and (ii) how tolerance estimates are affected by changes in pathogen virulence over the time course of infection and by the timing of measurements. We found that, in order to obtain reliable group tolerance estimates, it is important to account for individual variation in vigour, if present, and to ensure that all individuals are at the same stage of infection when measurements are taken. The latter requirement makes estimation of tolerance based on cross-sectional field data challenging, as individuals become infected at different time points and the individual onset of infection is unknown. Repeated individual measurements of within-host pathogen burden and performance would not only be valuable for inferring the infection status of individuals in field conditions, but would also provide tolerance estimates that capture the entire time course of infection.
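The group-level tolerance phenotype described above is the slope of a regression of performance on pathogen burden (the reaction-norm view): the intercept captures vigour, the slope captures tolerance. A minimal sketch with simulated, hypothetical data:

```python
import numpy as np

def group_tolerance(performance, burden):
    """Group-level tolerance estimate: slope of the regression of
    performance on pathogen burden. A less negative slope indicates
    higher tolerance; the intercept corresponds to vigour."""
    slope, intercept = np.polyfit(burden, performance, 1)
    return slope

# Illustrative simulated data (hypothetical units)
rng = np.random.default_rng(1)
burden = rng.uniform(0, 10, 50)          # within-host pathogen burden
vigour = 100.0                           # performance at zero burden
true_slope = -2.0                        # true group tolerance
performance = vigour + true_slope * burden + rng.normal(0, 1, 50)
est = group_tolerance(performance, burden)   # recovers roughly -2.0
```

In this idealised case the slope is recovered cleanly because vigour is identical across individuals and everyone is measured at the same infection stage; the paper's point is precisely that individual variation in vigour, or staggered infection onset, biases this estimate.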