
    Development of a framework for genotyping bovine-derived Cryptosporidium parvum, using a multilocus fragment typing tool

    Background: There is a need for an integrated genotyping approach for C. parvum; no sufficiently discriminatory scheme to date has been fully validated or widely adopted by veterinary or public health researchers. Multilocus fragment typing (MLFT) can provide good differentiation and is relatively quick and cheap to perform. An MLFT tool was assessed in terms of its typeability, specificity, precision (repeatability and reproducibility), accuracy and ability to genotypically discriminate bovine-derived Cryptosporidium parvum.
    Methods: With the aim of working towards a consensus, six markers were selected for inclusion based on their successful application in previous studies: MM5, MM18, MM19, TP14, MS1 and MS9. Alleles were assigned according to the fragment sizes of the amplified repeat regions, as determined by capillary electrophoresis. In addition, a region of the gp60 gene was amplified and sequenced to determine the gp60 subtype, which was added to the allelic profiles of the six markers to give the multilocus genotype (MLG). The MLFT tool was applied to 140 C. parvum samples collected in two cross-sectional studies of UK calves, conducted in Cheshire in 2004 (principally dairy animals) and Aberdeenshire/Caithness in 2011 (beef animals).
    Results: Typeability was 84%. The primers did not amplify the tested non-parvum Cryptosporidium species that are frequently detected in cattle. In terms of repeatability, within- and between-run fragment sizes showed little variability. Between laboratories, fragment sizes differed but allele calling was reproducible. The MLFT had good discriminatory ability (Simpson's Index of Diversity, SID, of 0.92) compared to gp60 sequencing alone (SID 0.44). Some markers were more informative than others, with MS1 and MS9 proving monoallelic in the samples tested.
    Conclusions: Further inter-laboratory trials, including human-derived C. parvum samples, are now warranted to allow progress towards an integrated, standardised typing scheme that enables source attribution and helps determine the role of livestock in future outbreaks of human C. parvum infection.
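    The discriminatory ability quoted above (SID) is commonly calculated in typing studies with the Hunter-Gaston formulation of Simpson's Index of Diversity. The Python sketch below is an illustration only, not necessarily the exact computation used in the study; the example MLG strings and the function name are hypothetical.

```python
from collections import Counter

def simpsons_diversity_index(genotypes):
    """Hunter-Gaston formulation of Simpson's Index of Diversity:
    D = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1)),
    where n_j is the number of isolates of the j-th genotype and N the total typed."""
    counts = Counter(genotypes)
    n = sum(counts.values())
    if n < 2:
        raise ValueError("At least two typed isolates are required")
    return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Hypothetical multilocus genotypes: each string stands for the allelic profile
# of the six fragment-typing markers joined with the gp60 subtype.
mlgs = [
    "12-10-8-7-5-5-IIaA15G2R1",
    "12-10-8-7-5-5-IIaA15G2R1",
    "12-11-8-7-5-5-IIaA17G1R1",
    "13-10-9-7-5-5-IIaA15G2R1",
]
print(f"SID = {simpsons_diversity_index(mlgs):.2f}")  # prints SID = 0.83 for this toy set
```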

    Profiling the Responses of Soccer Substitutes: A Review of Current Literature.

    Depending upon competition regulations, the laws of soccer allow between three and an unlimited number of substitutions, made on either a permanent or rolling basis. Substitutes are typically introduced to minimise/offset the effects of fatigue, alter tactics, replace players deemed to be underperforming or injured, and/or give playing time to youth players or to squad members returning from injury. While the match-day practices of substitutes include participation in the pre-match warm-up and sporadic periods of rewarm-up activity, it is currently unclear whether these pre-entry preparations facilitate optimal match performance thereafter. Acknowledging the contextual factors that may influence substitutes' performance, this review summarises the presently available literature on soccer substitutes and makes recommendations for future research. Literature searching and screening yielded 13 studies, which have typically focused on characterising: (1) the patterns, including timing, of substitutes' introduction; (2) indices of match performance; and (3) the emotional experiences of soccer substitutes. The majority of substitutions occur after the first half has ended (i.e., at half-time or during the second half), with introduced players exceeding the second-half physical performances of those who started the match. Observations of progressive improvements in running performance as playing time increases, and findings that substitutes mostly experience negative emotions, highlight the potential inadequacies of pre-match preparations and present future research opportunities. Additional work is therefore needed to confirm these findings and to determine the efficacy of current preparation strategies, thereby providing opportunities to assess and then address substitutes' pre-pitch-entry preparations, on-field performance and emotional responses.

    The central amygdala controls learning in the lateral amygdala

    Experience-driven synaptic plasticity in the lateral amygdala is thought to underlie the formation of associations between sensory stimuli and an ensuing threat. However, how the central amygdala participates in such a learning process remains unclear. Here we show that PKC-delta-expressing central amygdala neurons are essential for the synaptic plasticity underlying learning in the lateral amygdala, as they convey information about the unconditioned stimulus to lateral amygdala neurons during fear conditioning.

    Iron intake and markers of iron status and risk of Barrett's esophagus and esophageal adenocarcinoma

    OBJECTIVE: To investigate the association of iron intake and iron status with Barrett's esophagus (BE) and esophageal adenocarcinoma (EAC).
    METHODS: In total, 220 BE patients, 224 EAC patients, and 256 frequency-matched controls completed a lifestyle and food frequency questionnaire and provided serum and toenail samples between 2002 and 2005. Using multiple logistic regression, odds ratios (OR) and 95% confidence intervals (95% CI) were calculated across quartiles of intake/status.
    RESULTS: Comparing the fourth with the first quartile, ferritin (OR 0.47; 95% CI: 0.23, 0.97) and transferrin saturation (OR 0.41; 95% CI: 0.20, 0.82) were negatively associated with BE, whilst total iron binding capacity was positively associated per 50 µg/dl increment (OR 1.47; 95% CI: 1.12, 1.92). Comparing the fourth with the first quartile, iron intake (OR 0.50; 95% CI: 0.25, 0.98), non-heme iron intake per 10 mg/day increment (OR 0.29; 95% CI: 0.08, 0.99), and toenail iron (OR 0.40; 95% CI: 0.17, 0.93) were negatively associated with EAC, whilst heme iron intake was positively associated (OR 3.11; 95% CI: 1.46, 6.61).
    PRINCIPAL CONCLUSION: In contrast to the hypothesis that increased iron intake and higher iron stores are risk factors for BE and EAC, this study suggests that higher iron intake and stores may have a protective association with BE and EAC, with the exception of heme iron intake.
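    The quartile-based odds ratios and 95% confidence intervals reported above come from multiple logistic regression. The sketch below, which assumes Python with statsmodels and uses entirely hypothetical data (the variable names and values are illustrative, not the study's), shows how an OR and 95% CI for the top versus bottom quartile of an exposure could be obtained.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: case status (1 = BE, 0 = control) and quartile of serum
# ferritin (1-4), mimicking the quartile-based comparison described above.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "case": rng.integers(0, 2, size=400),
    "ferritin_quartile": rng.integers(1, 5, size=400),
})

# Indicator variables for quartiles 2-4; quartile 1 serves as the reference group.
X = pd.get_dummies(df["ferritin_quartile"], prefix="q", drop_first=True).astype(float)
X = sm.add_constant(X)

fit = sm.Logit(df["case"], X).fit(disp=0)

# Exponentiated coefficients give odds ratios; exponentiated confidence limits
# give the 95% CI, e.g. for quartile 4 versus quartile 1.
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())
print("OR (Q4 vs Q1):", odds_ratios["q_4"], "95% CI:", conf_int.loc["q_4"].tolist())
```

    A real analysis would also include the study's matching and adjustment covariates in the model; the minimal example above only illustrates the OR/CI calculation itself.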