
    Effect of autolysis on the specificity of bovine spongiform encephalopathy rapid tests

    Abstract
    Background: Routine rapid testing for Bovine Spongiform Encephalopathy (BSE) has highlighted some problems with BSE rapid test performance, the most significant being the number of initially reactive samples and the false positive results on autolyzed tissue. This point is important for BSE active surveillance in risk populations, because tissue autolysis is often unavoidable in routine cases. A robust test suitable for use on field material is therefore needed. To date, very limited information regarding the effect of autolysis on the robustness of rapid tests has been documented; therefore, the National Reference Centre for Animal Encephalopathies (CEA) rapid test laboratory selected 450 autolyzed and negative brain stem samples from fallen stock bovines older than 24 months to assess the specificity of four tests approved for BSE active surveillance: Biorad TeSeE, Enfer TSE version 2.0, Prionics® Check LIA, and IDEXX Herd Check BSE Antigen Kit EIA. The samples were graded according to the degree of autolysis and then dissected into five portions, four of which were randomly assigned to processing by the rapid tests and one reserved for confirmatory Western blot analysis.
    Findings: The specificity of the four systems was 100% for all three grades of autolysis, while the percentage of initially reactive results was 0.00 (95% CI 0.00-0.82), 0.22 (95% CI 0.006-1.23), 0.44 (95% CI 0.05-1.60), and 0.89 (95% CI 0.24-2.26) for the Biorad TeSeE, the Prionics® Check LIA, the IDEXX Herd Check BSE, and the Enfer TSE tests, respectively. No association with the degree of autolysis could be drawn.
    Conclusions: The present study demonstrates that the four rapid tests can be considered reliable diagnostic tools regardless of tissue quality; nevertheless, the number of initially reactive samples reported for some systems must not be underestimated in routine testing. Furthermore, compliance with the reported performance can be guaranteed only when a rigorous ongoing batch quality control system is in place.
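
    The reported rates of initially reactive results correspond to 0, 1, 2, and 4 reactives out of the 450 samples, and the confidence intervals match exact (Clopper-Pearson) binomial intervals. A minimal sketch of that calculation in Python, assuming exact intervals were used (the counts are inferred from the reported point estimates):

        from scipy.stats import beta

        def clopper_pearson(k: int, n: int, alpha: float = 0.05):
            """Exact (Clopper-Pearson) two-sided binomial confidence interval."""
            lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
            upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
            return lower, upper

        # Counts inferred from the reported point estimates (n = 450 samples)
        for test, k in [("Biorad TeSeE", 0), ("Prionics Check LIA", 1),
                        ("IDEXX Herd Check BSE", 2), ("Enfer TSE", 4)]:
            lo, hi = clopper_pearson(k, 450)
            print(f"{test}: {100 * k / 450:.2f}% (95% CI {100 * lo:.2f}-{100 * hi:.2f})")

    Run as-is, this reproduces the reported intervals up to rounding, e.g. 0.00% (95% CI 0.00-0.82) for 0/450.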

    Dietary supplementation with ferric tyrosine improves zootechnical performance and reduces caecal Campylobacter spp. load in poultry

    The objective of this study was to evaluate the effect of ferric tyrosine on the reduction of Campylobacter spp. and on zootechnical performance in broilers exposed to Campylobacter spp. using a natural challenge model to simulate commercial conditions. Additionally, the minimum inhibitory concentrations (MIC) of ferric tyrosine against common enteropathogens were evaluated. On day 0, 840 healthy male day-old birds (Ross 308) were randomly allocated to 6 replicate pens of 35 birds and fed diets containing different concentrations of ferric tyrosine (0, 0.02, 0.05 and 0.2 g/kg) in mash form for 42 days. Overall, broilers fed diets containing ferric tyrosine showed significantly improved body weight at day 42 and weight gain compared to the control group. However, birds fed ferric tyrosine ate significantly more than the control birds, so significant improvements in feed conversion ratio (FCR) were not observed. Microbiological analyses of caecal samples collected on day 42 of the study showed, per gram of sample, a 2-3 log10 reduction in Campylobacter spp. and a 1 log10 reduction in Escherichia coli in the groups fed diets containing ferric tyrosine compared to the control, while the MIC assays showed no growth inhibition of the enteropathogens tested, including Salmonella enterica, indicating that ferric tyrosine does not exert direct antimicrobial activity. Collectively, these results show that birds fed ferric tyrosine grew faster and consumed more feed compared to the control birds, indicating potential benefits of faster attainment of slaughter weight with no significant reduction in feed efficiency. Moreover, ferric tyrosine significantly reduced caecal Campylobacter spp. and E. coli, indicating potential as a non-antibiotic feed additive to lower the risk of Campylobacter infections transmitted through the food chain.
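
    For context, a log10 reduction is the difference of log-transformed counts, so the reported 2-3 log10 drop in Campylobacter spp. corresponds to a 100- to 1,000-fold reduction in caecal counts. A small illustrative Python sketch, using hypothetical CFU values rather than the study's data:

        import math

        def log10_reduction(control_cfu: float, treated_cfu: float) -> float:
            """Log10 reduction in counts (e.g. CFU/g) relative to the control group."""
            return math.log10(control_cfu) - math.log10(treated_cfu)

        # Hypothetical caecal counts (CFU per gram), for illustration only
        control = 1e8  # e.g. Campylobacter spp. in control birds
        treated = 5e5  # e.g. birds fed a ferric tyrosine diet
        print(f"{log10_reduction(control, treated):.1f} log10 reduction")  # -> 2.3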

    Characterizing Nanoparticles in Biological Matrices: Tipping Points in Agglomeration State and Cellular Delivery In Vitro.

    Understanding the delivered cellular dose of nanoparticles is imperative in nanomedicine and nanosafety, yet is known to be extremely complex because of multiple interactions between nanoparticles, their environment, and the cells. Here, we use 3-D reconstruction of agglomerates preserved by cryogenic snapshot sampling and imaged by electron microscopy to quantify the "bioavailable dose" that is presented at the cell surface and formed by the process of individual nanoparticle sequestration into agglomerates in the exposure media. Critically, using 20 and 40 nm carboxylated polystyrene-latex and 16 and 85 nm silicon dioxide nanoparticles, we show that abrupt, dose-dependent "tipping points" in agglomeration state can arise, subsequently affecting cellular delivery and increasing toxicity. These changes are triggered by shifts in the ratio of total nanoparticle surface area to biomolecule abundance, with the switch to a highly agglomerated state effectively changing the test article mid-assay, challenging the dose-response paradigm for nanosafety experiments. By characterizing nanoparticle numbers per agglomerate, we show these tipping points can lead to the formation of extreme agglomeration states whereby 90% of an administered dose is contained and delivered to the cells by just the top 2% of the largest agglomerates. We thus demonstrate precise definition, description, and comparison of the nanoparticle dose formed in different experimental environments and show that this description is critical to understanding cellular delivery and toxicity. We further empirically "stress-test" the commonly used dynamic light scattering approach, establishing its limitations, and present an analysis strategy that significantly improves the usefulness of this popular nanoparticle characterization technique.
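
    The observation that the top 2% of agglomerates can carry 90% of the administered dose is, in effect, a statement about the cumulative particle count across ranked agglomerate sizes. A minimal sketch of that bookkeeping in Python, using hypothetical heavy-tailed particles-per-agglomerate data rather than the paper's measurements:

        import numpy as np

        def dose_fraction_in_top(particles_per_agglomerate, top_fraction=0.02):
            """Fraction of the total particle dose carried by the largest agglomerates."""
            counts = np.sort(np.asarray(particles_per_agglomerate))[::-1]  # largest first
            n_top = max(1, int(round(top_fraction * counts.size)))
            return counts[:n_top].sum() / counts.sum()

        # Hypothetical heavy-tailed agglomerate sizes (particles per agglomerate)
        rng = np.random.default_rng(0)
        sizes = rng.pareto(0.7, size=10_000) + 1
        print(f"Top 2% of agglomerates carry {100 * dose_fraction_in_top(sizes):.0f}% of the dose")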

    Expert opinion as 'validation' of risk assessment applied to calf welfare

    Abstract
    Background: Recently, a Risk Assessment methodology was applied to animal welfare issues in a report of the European Food Safety Authority (EFSA) on intensively housed calves.
    Methods: Because this is a new and potentially influential approach to derive conclusions on animal welfare issues, a so-called semantic-modelling type 'validation' study was conducted by asking expert scientists, who had been involved or quoted in the report, to give welfare scores for housing systems and for welfare hazards.
    Results: Kendall's coefficient of concordance among experts (n = 24) was highly significant (P < 0.001), but low (0.29 and 0.18 for housing systems and hazards respectively). Overall correlations with EFSA scores were significant only for experts with a veterinary or mixed (veterinary and applied ethological) background. Significant differences in welfare scores were found between housing systems, between hazards, and between experts with different backgrounds. For example, veterinarians gave higher overall welfare scores for housing systems than ethologists did, probably reflecting a difference in their perception of animal welfare. Systems with the lowest scores were veal calves kept individually in so-called "baby boxes" (veal crates) or in small groups, and feedlots. A suckler herd on pasture was rated as the best for calf welfare. The main hazards were related to underfeeding, inadequate colostrum intake, poor stockperson education, insufficient space, inadequate roughage, iron deficiency, inadequate ventilation, poor floor conditions and no bedding. Points for improvement of the Risk Assessment applied to animal welfare include linking information, reporting uncertainty and transparency about underlying values.
    Conclusion: The study provides novel information on expert opinion in relation to calf welfare and shows that Risk Assessment applied to animal welfare can benefit from a semantic modelling approach.
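
    Kendall's coefficient of concordance W quantifies agreement among m raters ranking n items: W = 12S / (m^2 (n^3 - n)), where S is the sum of squared deviations of the item rank sums from their mean; W is 1 for perfect agreement and near 0 for none. A minimal Python sketch, assuming no tied ranks and using hypothetical scores rather than the study's data:

        import numpy as np

        def kendalls_w(scores: np.ndarray) -> float:
            """Kendall's W for an (m raters x n items) score matrix, assuming no ties."""
            m, n = scores.shape
            ranks = scores.argsort(axis=1).argsort(axis=1) + 1  # per-rater ranks 1..n
            rank_sums = ranks.sum(axis=0)
            s = ((rank_sums - rank_sums.mean()) ** 2).sum()
            return 12 * s / (m ** 2 * (n ** 3 - n))

        # Hypothetical welfare scores: 24 experts rating 10 housing systems
        rng = np.random.default_rng(1)
        print(f"W = {kendalls_w(rng.random((24, 10))):.2f}")  # low W ~ weak concordance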

    Mycotoxin exposure and human cancer risk : a systematic review of epidemiological studies

    In recent years, there has been an increasing interest in investigating the carcinogenicity of mycotoxins in humans. This systematic review aims to provide an overview of data linking exposure to different mycotoxins with human cancer risk. Publications (2019 and earlier) of case-control or longitudinal cohort studies were identified in PubMed and EMBASE. These articles were then screened by independent reviewers and their quality was assessed according to the Newcastle-Ottawa scale. Animal, cross-sectional, and molecular studies met the criteria for exclusion. In total, 14 articles were included: 13 case-control studies and 1 longitudinal cohort study. Included articles focused on associations of mycotoxin exposure with primary liver, breast, and cervical cancer. Overall, a positive association between the consumption of aflatoxin-contaminated foods and primary liver cancer risk was verified. Two case-control studies in Africa investigated the relationship between zearalenone and its metabolites and breast cancer risk, though conflicting results were reported. Two case-control studies investigated the association between hepatocellular carcinoma and fumonisin B1 exposure, but no significant associations were observed. This systematic review incorporates several clear observations of dose-dependent associations between aflatoxins and liver cancer risk, in keeping with IARC Monograph conclusions. Only a few human epidemiological studies have investigated the associations between mycotoxin exposures and cancer risk. To close this gap, more in-depth research is needed to unravel evidence for other common mycotoxins, such as deoxynivalenol and ochratoxin A. The link between mycotoxin exposures and cancer risk has mainly been established in experimental studies and needs to be confirmed in human epidemiological studies to support evidence-based public health strategies.

    Good Laboratory Practices: Myers et al. Respond

    Reproduced with permission from Environmental Health Perspectives. DOI: 10.1289/ehp.0900884R. Myers et al. respond to a letter written by Becker et al. regarding Myers' article "Why public health agencies cannot depend on Good Laboratory Practices as a criterion for selecting data: the case of bisphenol A."

    Evaluation of a spring-powered captive bolt gun for killing kangaroo pouch young

    Context: During commercial harvesting or non-commercial kangaroo culling programs, dependent young of shot females are required to be euthanased to prevent suffering and because they would be unlikely to survive. However, the current method for killing pouch young, namely a single, forceful blow to the base of the skull, is applied inconsistently by operators and perceived by the public to be inhumane. Aims: To determine whether an alternative method for killing pouch young, namely a spring-operated captive bolt gun, is effective at causing insensibility in kangaroo pouch young. Methods: Trials of spring-operated captive bolt guns were conducted first on the heads of 15 dead kangaroo young and then on 21 live pouch young during commercial harvesting. We assessed the effectiveness at causing insensibility in live animals and the damage caused to specific brain areas. We also measured depth of bolt penetration and skull thickness. Performance characteristics (e.g. bolt velocity) of two types of spring-operated guns were also measured and compared with cartridge-powered devices. Key results: When tested on the heads of dead animals, the spring-operated captive bolt gun consistently produced a large entrance cavity and a well-defined wound tract, which extended into the cerebrum through almost the full thickness of the brain, including the brainstem. When tested on live pouch young, the captive bolt gun caused immediate insensibility in only 13 of 21 animals. This 62% success rate is significantly below the 95% minimum acceptable threshold for captive bolt devices in domestic animal abattoirs. Failure to stun was related to bolt placement, but other factors such as bolt velocity, bolt diameter and skull properties such as thickness and hardness might also have contributed. Spring-operated captive bolt guns delivered 20 times less kinetic energy than cartridge-powered devices. Conclusions: Spring-operated captive bolt guns cannot be recommended as an acceptable or humane method for stunning or killing kangaroo pouch young. Implications: Captive bolt guns have potential as a practical alternative to blunt head trauma for effective euthanasia and reducing animal (and observer) distress. However, operators must continue to use the existing prescribed killing methods until cartridge-powered captive bolt guns have been trialled as an alternative bolt-propelling method. Additional keywords: animal welfare, blunt trauma, culling, euthanasia, humaneness, kangaroo harvesting.
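
    The conclusion that a 13/21 (62%) stun rate falls significantly below the 95% abattoir threshold can be checked with an exact one-sided binomial test; a minimal Python sketch, assuming that is the comparison being made:

        from scipy.stats import binomtest

        # 13 of 21 pouch young rendered immediately insensible; threshold p0 = 0.95
        result = binomtest(k=13, n=21, p=0.95, alternative="less")
        print(f"success rate = {13 / 21:.0%}, one-sided exact p = {result.pvalue:.2g}")

    On the energy comparison, kinetic energy scales as KE = mv^2/2, so at equal bolt mass a 20-fold energy deficit implies a roughly 4.5-fold (sqrt(20)) lower bolt velocity.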

    Towards a Pathogenic Escherichia coli Detection Platform Using Multiplex SYBR® Green Real-Time PCR Methods and High Resolution Melting Analysis

    Escherichia coli comprises a group of bacteria that has raised considerable safety concerns in recent years. Five major intestinal pathogenic groups have been recognized, amongst which the verocytotoxin- or shiga-toxin-producing E. coli (VTEC or STEC, respectively; carrying stx1 and/or stx2) have received particular attention recently. Indeed, due to the high number of outbreaks related to VTEC strains, the European Food Safety Authority (EFSA) has requested the monitoring of the "top-five" serogroups (O26, O103, O111, O145 and O157) most often encountered in foodborne diseases and has addressed the need for validated VTEC detection methods. Here we report the development of a set of intercalating-dye Real-time PCR methods capable of rapidly detecting the presence of the toxin genes together with intimin (eae) in the case of VTEC, or the aggregative adherence regulator gene (aggR) in the case of the O104:H4 strain responsible for the outbreak in Germany in 2011. All reactions were optimized to perform at the same annealing temperature, permitting multiplex application in order to minimize the material needed and to allow for high-throughput analysis. In addition, High Resolution Melting (HRM) analysis allowing discrimination among strains possessing similar virulence traits was established. The development, application to food samples and flexibility in use of the methods are thoroughly discussed. Together, these Real-time PCR methods facilitate the detection of VTEC in a new, highly efficient way and could represent the basis for developing a simple pathogenic E. coli detection platform.
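
    Multiplexing several primer pairs in one SYBR Green reaction requires them to share a single annealing temperature, which in practice means matching predicted primer melting temperatures during design; HRM then separates amplicons by their distinct melting profiles. A minimal Python sketch of the Tm-matching step using Biopython's nearest-neighbour estimate, with hypothetical primer sequences (not those of the study):

        from Bio.SeqUtils import MeltingTemp as mt

        # Hypothetical primers, for illustration only (not the published sequences)
        primers = {
            "stx1_F": "ACGTTACAGCGTGTTGCAGG",
            "stx2_F": "GGCACTGTCTGAAACTGCTC",
            "eae_F": "CATTGATCAGGATTTCTGGTGATA",
        }
        for name, seq in primers.items():
            print(f"{name}: predicted Tm = {mt.Tm_NN(seq):.1f} C")

    Primer pairs whose predicted Tm values cluster within a degree or two of one another are plausible candidates for a shared annealing temperature in the multiplex format described above.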