
    Modelling the impacts of pasture contamination and stocking rate for the development of targeted selective treatment strategies for Ostertagia ostertagi infection in calves

    A simulation study was carried out to assess whether variation in pasture contamination or stocking rate impacts upon the optimal design of targeted selective treatment (TST) strategies. Two methods of TST implementation were considered: 1) treatment of a fixed percentage of a herd according to a given phenotypic trait, or 2) treatment of individuals that exceeded a threshold value for a given phenotypic trait. Four phenotypic traits on which to base treatment were considered: 1) average daily bodyweight gain (ADG), 2) faecal egg count, 3) plasma pepsinogen, or 4) random selection. Each implementation method (fixed percentage or threshold treatment) and determinant criterion (phenotypic trait) was assessed in terms of benefit per R (BPR), the ratio of average benefit in weight gain to change in frequency of resistance alleles R (relative to an untreated population). The impact of pasture contamination on optimal TST strategy design was investigated by setting the initial pasture contamination to 100, 200 or 500 O. ostertagi L3/kg DM herbage; stocking rate was investigated at low (3 calves/ha), conventional (5 calves/ha) and high (7 calves/ha) levels. When treating a fixed percentage of the herd, treatments according to plasma pepsinogen or random selection were identified as the most beneficial (i.e. resulted in the greatest BPR) for all levels of initial pasture contamination and all stocking rates. Conversely, when treatments were administered according to threshold values, ADG was most beneficial and was identified as the best TST strategy (i.e. resulted in the greatest overall BPR) for all levels of initial pasture contamination and all stocking rates.
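    The BPR metric used to rank strategies is, in essence, a simple ratio. A minimal sketch (variable names are illustrative, not taken from the authors' model):

```python
def benefit_per_r(treated_gain, untreated_gain, treated_r_freq, untreated_r_freq):
    """Benefit per R (BPR): average weight-gain benefit divided by the change
    in resistance-allele (R) frequency, both relative to an untreated herd."""
    benefit = treated_gain - untreated_gain        # average extra weight gain (kg)
    delta_r = treated_r_freq - untreated_r_freq    # increase in R allele frequency
    return benefit / delta_r

# e.g. 5 kg extra gain at the cost of a 0.10 rise in R frequency -> BPR of 50
bpr = benefit_per_r(25.0, 20.0, 0.30, 0.20)
```

    A higher BPR means more production benefit per unit of selection pressure for resistance.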

    Modelling the consequences of targeted selective treatment strategies on performance and emergence of anthelmintic resistance amongst grazing calves

    The development of anthelmintic resistance by helminths can be slowed by maintaining refugia on pasture or in untreated hosts. Targeted selective treatments (TST) may achieve this through the treatment only of individuals that would benefit most from anthelmintic, according to certain criteria. However, the consequences of TST for cattle are uncertain, mainly due to the difficulty of comparing alternative strategies. We developed a mathematical model to compare: 1) the most ‘beneficial’ indicator for treatment selection and 2) the method of selection of calves exposed to Ostertagia ostertagi, i.e. treating a fixed percentage of the population with the lowest (or highest) indicator values versus treating individuals who exceed (or are below) a given indicator threshold. The indicators evaluated were average daily gain (ADG), faecal egg counts (FEC), plasma pepsinogen, and combined FEC and plasma pepsinogen, versus random selection of individuals. Treatment success was assessed in terms of benefit per R (BPR), the ratio of average benefit in weight gain to change in frequency of resistance alleles R (relative to an untreated population). The optimal indicator in terms of BPR for fixed percentages of calves treated was plasma pepsinogen, and the worst was ADG; in the latter case treatment was applied to some individuals who were not in need of it. The reverse was found when calves were treated according to threshold criteria, with ADG being the best target indicator for treatment. This was also the most beneficial strategy overall, with a significantly higher BPR value than any other strategy, but its degree of success depended on the chosen threshold of the indicator. The study shows strong support for TST, with all selective strategies showing improvements compared with whole-herd treatment at 3, 8 and 13 weeks post-turnout. The developed model appeared capable of assessing the consequences of other TST strategies on calf populations.
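    The two selection methods compared above can be sketched as small helper functions (an illustrative sketch of the selection logic, not the authors' implementation):

```python
def select_fixed_percentage(values, pct, lowest=True):
    """Treat the pct% of animals with the lowest (or highest) indicator values."""
    n_treat = round(len(values) * pct / 100)
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=not lowest)
    return set(order[:n_treat])

def select_threshold(values, threshold, below=True):
    """Treat every animal whose indicator falls below (or exceeds) a threshold."""
    if below:
        return {i for i, v in enumerate(values) if v < threshold}
    return {i for i, v in enumerate(values) if v > threshold}

# ADG in kg/day for four calves (made-up figures)
adg = [0.5, 0.9, 0.7, 0.4]
worst_half = select_fixed_percentage(adg, 50)   # two slowest-growing calves
under_cutoff = select_threshold(adg, 0.6)       # calves gaining < 0.6 kg/day
```

    The key difference is that the threshold method treats a variable number of animals depending on herd condition, whereas the fixed-percentage method always treats the same fraction.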

    A stochastic model to investigate the effects of control strategies on calves exposed to Ostertagia ostertagi

    Predicting the effectiveness of parasite control strategies requires accounting for the responses of individual hosts and the epidemiology of parasite supra- and infra-populations. The first objective was to develop a stochastic model that predicted the parasitological interactions within a group of first season grazing calves challenged by Ostertagia ostertagi, by considering phenotypic variation amongst the calves and variation in the parasite infra-population. Model behaviour was assessed using variations in parasite supra-population and calf stocking rate. The model showed the initial pasture infection level to have little impact on parasitological output traits, such as worm burdens and faecal egg counts (FEC), or on overall performance of calves, whereas increasing stocking rate had a disproportionately large effect on both parasitological and performance traits. Model predictions were compared with published data taken from experiments on common control strategies, such as reducing stocking rates, the ‘dose and move’ strategy and strategic treatment with anthelmintic at specific times. Model predictions showed in most cases reasonable agreement with observations, supporting model robustness. The stochastic model developed is flexible, with the potential to predict the consequences of other nematode control strategies, such as targeted selective treatments, on groups of grazing calves.

    Maintaining the validity of inference from linear mixed models in stepped-wedge cluster randomized trials under misspecified random-effects structures

    Linear mixed models are commonly used in analyzing stepped-wedge cluster randomized trials (SW-CRTs). A key consideration for analyzing an SW-CRT is accounting for the potentially complex correlation structure, which can be achieved by specifying a random-effects structure. Common random-effects structures for an SW-CRT include random intercept, random cluster-by-period, and discrete-time decay. Recently, more complex structures, such as the random intervention structure, have been proposed. In practice, specifying appropriate random effects can be challenging. Robust variance estimators (RVE) may be applied to linear mixed models to provide consistent estimators of standard errors of fixed effect parameters in the presence of random-effects misspecification. However, there has been no empirical investigation of RVEs for SW-CRTs. In this paper, we first review five RVEs (both standard and small-sample bias-corrected RVEs) that are available for linear mixed models. We then describe a comprehensive simulation study to examine the performance of these RVEs for SW-CRTs with a continuous outcome under different data generators. For each data generator, we investigate whether the use of an RVE with either the random intercept model or the random cluster-by-period model is sufficient to provide valid statistical inference for fixed effect parameters, when these working models are subject to misspecification. Our results indicate that the random intercept and random cluster-by-period models with RVEs performed similarly. The CR3 RVE estimator, coupled with the number of clusters minus two degrees of freedom correction, consistently gave the best coverage results, but could be slightly anti-conservative when the number of clusters was below 16. We summarize the implications of our results for linear mixed model analysis of SW-CRTs in practice.
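    For readers unfamiliar with the design, a standard stepped-wedge treatment layout can be sketched as follows (a hypothetical generator for illustration, not the paper's data-generating code):

```python
import numpy as np

def stepped_wedge_design(n_clusters, n_periods):
    """0/1 treatment-indicator matrix for a standard stepped-wedge layout:
    clusters (rows) cross from control to intervention in staggered steps,
    so that by the final period every cluster is treated.
    Assumes n_clusters is divisible by (n_periods - 1)."""
    steps = n_periods - 1
    per_step = n_clusters // steps
    X = np.zeros((n_clusters, n_periods), dtype=int)
    for c in range(n_clusters):
        crossover = 1 + c // per_step  # first treated period for this cluster
        X[c, crossover:] = 1
    return X

X = stepped_wedge_design(4, 5)
# period 1 is all-control, the last period all-intervention,
# and each cluster contributes both control and treated observations
```

    The staggered crossover is what induces the within-cluster, between-period correlations that the competing random-effects structures try to capture.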

    Developments in automated flexible gauging and the uncertainty associated with comparative coordinate measurement

    Traditional manufacturing uses coordinate measuring machines (CMMs) or component-specific gauging for in-process and post-process inspection. In assessing the fitness for purpose of these measuring systems, it is necessary to evaluate the uncertainty associated with CMM measurement. However, this is not straightforward since the measurement results are subject to a large range of factors, including systematic and environmental effects that are difficult to quantify. In addition, machine tool errors and thermal effects of the machine and component can have a significant impact on the comparison between on-machine measurement, in-process measurement and post-process inspection. Coordinate measurements can also be made in a gauging/comparator mode, in which measurements of a workpiece are compared with those of a calibrated master artefact; many of the difficulties associated with evaluating the measurement uncertainties are then avoided since many of the systematic effects cancel out. Therefore, the use of flexible gauging, as part of either an automated or manually-served workflow, is particularly beneficial.
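    The comparator principle described above amounts to referencing each workpiece reading to a calibrated master measured in the same setup, so that shared systematic offsets cancel (a schematic sketch; the names and figures are illustrative):

```python
def comparator_measurement(workpiece_reading, master_reading, master_calibrated):
    """Gauging/comparator mode: the workpiece result is the calibrated master
    value plus the machine's measured difference between workpiece and master.
    Any systematic offset common to both readings cancels in the subtraction."""
    return master_calibrated + (workpiece_reading - master_reading)

# A 7-micrometre machine bias affects both readings equally and drops out:
result = comparator_measurement(10.012, 10.005, 10.000)  # mm
```

    This cancellation is why comparator-mode uncertainty budgets can omit many machine-geometry terms that dominate absolute CMM measurement.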

    Evaluation of automated flexible gauge performance using experimental designs

    An essential part of assessing whether a measurement or gauging system meets its intended purpose is to estimate the measurement uncertainties. This paper employs the design of experiments (DOE) approach to implement a practical analysis of the measurement uncertainty of the Renishaw Equator automated flexible gauge. The factors of interest are measurement strategy, part location, and environmental effects. The experimental results show that the versatile gauge effectively meets its measurement capability in both discrete-point probing and scanning measuring modes within its whole measuring volume and, in particular, at high scanning speeds and under workshop conditions.

    Modelling uncertainty associated with comparative coordinate measurement through analysis of variance techniques

    Over the last few years, various techniques and metrological instruments have been proposed to achieve accurate process control on the shop floor at low cost. An efficient solution that has recently been adopted for this complex task is to perform coordinate measurement in comparator mode in order to eliminate the influence of systematic effects associated with the measurement system. In this way, more challenging parts can be inspected in the shop floor environment and higher quality products can be produced, while also enabling feedback to the production loop. This paper is concerned with the development of a statistical model for the uncertainty associated with comparative coordinate measurement through analysis of variance (ANOVA) techniques. It employs the Renishaw Equator comparative gauging system and a production part with thirteen circular features of three different diameters. An experimental design is applied to investigate the influence of two key factors and their interaction on the comparator measurement uncertainty. The factors of interest are the scanning speed and the sampling point density. In particular, three different scanning speeds and two different sampling point densities are considered. The measurand of interest is the circularity of each circular feature. The present experimental design is meant to be representative of the actual working conditions in which the automated flexible gauge is used. The Equator has been designed for high speed comparative gauging on the shop floor with possibly wide temperature variation. Therefore, two replicates are used at different temperature conditions to decouple the influence of environmental effects and thus draw more refined conclusions on statistical significance.
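    The sum-of-squares decomposition behind such a two-factor design with replicates can be sketched generically (a standard balanced two-way ANOVA partition, not the paper's analysis code; the factor labels follow the speed/density design described above):

```python
import numpy as np

def two_way_anova_ss(y):
    """Sum-of-squares decomposition for a balanced two-factor design.
    y has shape (a, b, r): a speed levels, b density levels, r replicates."""
    a, b, r = y.shape
    grand = y.mean()
    row = y.mean(axis=(1, 2))     # speed-level means
    col = y.mean(axis=(0, 2))     # density-level means
    cell = y.mean(axis=2)         # speed-by-density cell means
    ss_a = b * r * ((row - grand) ** 2).sum()
    ss_b = a * r * ((col - grand) ** 2).sum()
    ss_ab = r * ((cell - row[:, None] - col[None, :] + grand) ** 2).sum()
    ss_error = ((y - cell[:, :, None]) ** 2).sum()
    ss_total = ((y - grand) ** 2).sum()
    return {"speed": ss_a, "density": ss_b, "interaction": ss_ab,
            "error": ss_error, "total": ss_total}

# 3 speeds x 2 densities x 2 replicates, as in the design described above
rng = np.random.default_rng(0)
ss = two_way_anova_ss(rng.normal(size=(3, 2, 2)))
```

    In a balanced design the four components sum exactly to the total, which is what makes the partition into main effects, interaction, and error well defined.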

    The spatial distribution of esophageal and gastric cancer in Caspian region of Iran: An ecological analysis of diet and socio-economic influences

    Recent studies have suggested a systematic geographic pattern of esophageal cancer (EC) and gastric cancer (GC) incidence in the Caspian region of Iran. The aims of this study were to investigate the association between these cancers and the region's dietary and socioeconomic risk factors, and to map EC and GC after adjustment for the risk factors and the removal of random and geographic variations from area-specific age-standardised incidence ratios (SIRs).
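    An age-standardised incidence ratio of the kind being mapped is, in essence, observed cases divided by the cases expected under reference age-specific rates (a schematic calculation; the figures are made up):

```python
def standardised_incidence_ratio(observed, population, rates):
    """Age-standardised incidence ratio (SIR) for one district:
    observed cases divided by the cases expected if the district's
    age-group populations experienced the reference age-specific rates."""
    expected = sum(n * r for n, r in zip(population, rates))
    return observed / expected

# two age groups: 30 observed cases vs 20 expected -> 50% excess incidence
sir = standardised_incidence_ratio(30, [10000, 5000], [0.001, 0.002])
```

    An SIR above 1 marks an area with more cases than its age structure predicts; smoothing these raw ratios to remove random and geographic noise is the mapping step the abstract refers to.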

    Predicting the Fine Particle Fraction of Dry Powder Inhalers Using Artificial Neural Networks

    This document is the Accepted Manuscript version of a Published Work that appeared in final form in Journal of Pharmaceutical Sciences after peer review and technical editing by the publisher. Under embargo. Embargo end date: 9 November 2017. The version of record, Joanna Muddle, Stewart B. Kirton, Irene Parisini, Andrew Muddle, Darragh Murnane, Jogoth Ali, Marc Brown, Clive Page and Ben Forbes, ‘Predicting the Fine Particle Fraction of Dry Powder Inhalers Using Artificial Neural Networks’, Journal of Pharmaceutical Sciences, Vol 106(1): 313-321, first published online on 9 November 2016, is available online via doi: http://dx.doi.org/10.1016/j.xphs.2016.10.002 0022-3549/© 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

    Dry powder inhalers are increasingly popular for delivering drugs to the lungs for the treatment of respiratory diseases, but are complex products with multivariate performance determinants. Heuristic product development guided by in vitro aerosol performance testing is a costly and time-consuming process. This study investigated the feasibility of using artificial neural networks (ANNs) to predict fine particle fraction (FPF) based on formulation and device variables. Thirty-one ANN architectures were evaluated for their ability to predict experimentally determined FPF for a self-consistent dataset containing salmeterol xinafoate and salbutamol sulfate dry powder inhalers (237 experimental observations). Principal component analysis was used to identify inputs that significantly affected FPF. Orthogonal arrays (OAs) were used to design ANN architectures, optimized using the Taguchi method. The primary OA ANN r2 values ranged between 0.46 and 0.90, and the secondary OA increased the r2 values (0.53-0.93).
The optimum ANN (9-4-1 architecture, average r2 0.92 ± 0.02) included active pharmaceutical ingredient, formulation, and device inputs identified by principal component analysis, which reflected the recognized importance and interdependency of these factors for orally inhaled product performance. The Taguchi method was effective at identifying a successful architecture with the potential for development as a useful generic inhaler ANN model, although this would require much larger datasets and more variable inputs.

    Peer reviewed.
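    For illustration, the shape of the reported optimum 9-4-1 architecture can be sketched as a plain feedforward pass (random placeholder weights and a tanh hidden layer chosen for illustration; this is not the trained model from the paper):

```python
import numpy as np

def mlp_9_4_1(x, w1, b1, w2, b2):
    """Forward pass of a 9-4-1 network: 9 formulation/device inputs,
    one 4-node tanh hidden layer, and a single linear output
    predicting fine particle fraction (FPF)."""
    h = np.tanh(x @ w1 + b1)   # hidden layer: (n_samples, 4)
    return h @ w2 + b2         # output layer: (n_samples, 1)

rng = np.random.default_rng(42)
w1 = rng.normal(scale=0.1, size=(9, 4)); b1 = np.zeros(4)
w2 = rng.normal(scale=0.1, size=(4, 1)); b2 = np.zeros(1)

# five hypothetical formulations, each described by 9 standardized inputs
fpf_pred = mlp_9_4_1(rng.normal(size=(5, 9)), w1, b1, w2, b2)
```

    In the study the weights were fitted to the 237 experimental observations, and the architecture itself (inputs and hidden-layer size) was selected via orthogonal arrays and the Taguchi method.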

    Fluorescent IGF-II analogues for FRET-based investigations into the binding of IGF-II to the IGF-1R

    This article is licensed under a Creative Commons Attribution 3.0 Unported Licence. Material from this article can be used in other publications provided that the correct acknowledgement is given with the reproduced material.

    The interaction of IGF-II with the insulin receptor (IR) and type 1 insulin-like growth factor receptor (IGF-1R) has recently been identified as a potential therapeutic target for the treatment of cancer. Understanding the interactions of IGF-II with these receptors is required for the development of potential anticancer therapeutics. This work describes an efficient convergent synthesis of native IGF-II and two non-native IGF-II analogues with coumarin fluorescent probes incorporated at residues 19 and 28. These fluorescent analogues bind with nanomolar affinities to the IGF-1R and are suitable for use in fluorescence resonance energy transfer (FRET) studies. From these studies the F19Cou IGF-II and F28Cou IGF-II proteins were identified as good probes for investigating the binding interactions of IGF-II with the IGF-1R and its other high affinity binding partners.
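    The distance sensitivity exploited in such FRET studies follows the standard sixth-power relation between transfer efficiency and donor-acceptor separation (a textbook formula sketch; r and r0 must share the same units, and the values below are illustrative):

```python
def fret_efficiency(r, r0):
    """FRET transfer efficiency for donor-acceptor distance r and
    Foerster radius r0: E = 1 / (1 + (r/r0)^6)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# at r = r0 exactly half the excitation energy is transferred
half = fret_efficiency(5.0, 5.0)
```

    The steep sixth-power falloff is what makes FRET between probes like the coumarin labels at residues 19 and 28 and a receptor-side fluorophore a sensitive ruler for binding geometry.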