
    Anti-microbial Use in Animals: How to Assess the Trade-offs

    Antimicrobials are widely used in preventive and curative medicine in animals. The benefits of curative use are clear: it restores sick animals to health, with a gain in human welfare. The case for preventive use of antimicrobials is less clear cut, with ongoing debate over the value of antimicrobials as growth promoters in the intensive livestock industries. The possible benefits from the use of antimicrobials need to be balanced against their cost and the increased risk of emergence of resistance due to their use in animals. The study examines the importance of animals in society and how the role and management of animals is changing, including the use of antimicrobials. It proposes an economic framework for assessing the trade-offs of anti-microbial use and examines the current level of data collection and analysis of these trade-offs. An exploratory review identifies a number of weaknesses. The frameworks applied to the economic assessment of anti-microbial use in animals are rarely consistent, which may well be due to gaps in data or the prejudices of the analysts. More careful data collection is needed to provide information on (i) the species and production systems in which antimicrobials are used, (ii) the active substances and application methods, and (iii) the dosage rates. The species covered need to include companion animals as well as farmed animals, as it is still not known how important direct versus indirect spread of resistance to humans is. In addition, research is needed on the pricing of antimicrobials used in animals to ensure that prices reflect production and marketing costs, the fixed costs of anti-microbial development and the externalities of resistance emergence. Overall, much work is needed to provide greater guidance to policy, and such work should be informed by rigorous data collection and analysis systems.

    Preliminary characterization of a Moroccan honey with a predominance of Bupleurum spinosum pollen

    Honey with Bupleurum spinosum (zandaz) as its main pollen source has not previously been the subject of detailed study. Twelve Moroccan samples of this honey were therefore subjected to melissopalynological, physicochemical and microbiological quality characterization, as well as antioxidant activity assessment. From a quality point of view, almost all samples were within the limits established by the Codex Alimentarius and/or European legislation. All samples presented a predominance of B. spinosum pollen (more than 48%). Relatively high levels of trehalose (1.3-4.0 g/100 g) and melezitose (1.5-2.8 g/100 g) were detected. These sugars, uncommon in monofloral honeys, could be used as an important factor to discriminate zandaz honey. Flavonoid content correlated positively with honey color and with melanoidin and polyphenol content, and negatively with the IC50 values for scavenging ABTS (2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid)) free radicals, while proline content correlated negatively with the IC50 values for nitric oxide scavenging activity and chelating power. These correlations support the use of antioxidant activities as important variables for PCA (principal component analysis). Together, the two components explained 70% of the variance in the data and showed a certain homogeneity across the analyzed samples independent of region, suggesting the importance of B. spinosum nectar in the resulting honey characteristics. Funding: Fundação para a Ciência e Tecnologia for Research Center [UID/BIM/04773/2013 CBMR 1334, UID/AGR/00239/2013, UID/BIA/04050/2013 (POCI-01-0145-FEDER-007569)]; ERDF through the COMPETE - Programa Operacional Competitividade e Internacionalização (POCI

    Population exposure to trace elements in the Kilembe copper mine area, Western Uganda: a pilot study

    The mining and processing of copper in Kilembe, Western Uganda, from 1956 to 1982 left over 15 Mt of tailings containing cupriferous and cobaltiferous pyrite dumped within a mountain river valley. This pilot study was conducted to assess the nature and extent of the risk to local populations from metal contamination arising from those mining activities. We determined trace element concentrations in mine tailings, soils, locally cultivated foods, house dust, drinking water and human biomarkers (toenails) using ICP-MS analysis of acid-digested samples. The results showed that the tailings, which contained higher concentrations of Co, Cu, Ni and As than world average crustal values, had eroded and contaminated local soils. Pollution load indices revealed that 51% of the agricultural soils sampled were contaminated with trace elements. Local water supplies were also contaminated: Co concentrations exceeded Wisconsin (US) thresholds in 25% of domestic water supplies and 40% of Nyamwamba river water samples. Zinc exceeded the WHO/FAO threshold of 99.4 mg kg−1 in 36% of Amaranthus vegetable samples, Cu exceeded the EC threshold of 20 mg kg−1 in 19% of Amaranthus, and Pb exceeded the WHO threshold of 0.3 mg kg−1 in 47% of Amaranthus vegetables. In bananas, 20% of samples contained Pb concentrations that exceeded the WHO/FAO recommended threshold of 0.3 mg kg−1. However, risk assessment of local foods and water based on hazard quotients (HQ values) revealed no potential health effects. The high external contamination of volunteers' toenails with some elements (even after a washing process) calls into question their use as a biomarker for metal exposure in human populations where feet are frequently exposed to soil.
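    The hazard quotient approach used in the abstract above is conventionally computed as the ratio of an estimated average daily dose to an oral reference dose, with HQ < 1 indicating no appreciable non-carcinogenic risk. The sketch below illustrates that standard calculation; all parameter values and the reference dose are illustrative assumptions, not data from the Kilembe study.

```python
# Hazard quotient (HQ) sketch: HQ = ADD / RfD, where ADD is the
# average daily dose (mg per kg body weight per day).
# All numeric values below are illustrative, not from the study.

def average_daily_dose(conc, intake_rate, exp_freq, exp_dur, body_wt, avg_time):
    """ADD = (C * IR * EF * ED) / (BW * AT), in mg/kg/day."""
    return (conc * intake_rate * exp_freq * exp_dur) / (body_wt * avg_time)

def hazard_quotient(add, rfd):
    """HQ < 1 suggests no appreciable non-carcinogenic risk."""
    return add / rfd

# Example: a child consuming a leafy vegetable containing 0.5 mg/kg of a metal.
add = average_daily_dose(
    conc=0.5,         # metal concentration in food (mg/kg)
    intake_rate=0.3,  # daily food intake (kg/day)
    exp_freq=350,     # exposure frequency (days/year)
    exp_dur=6,        # exposure duration (years)
    body_wt=15,       # body weight (kg)
    avg_time=6 * 365, # averaging time (days)
)
hq = hazard_quotient(add, rfd=0.3)  # assumed oral reference dose (mg/kg/day)
print(round(hq, 3))  # well below 1, i.e. no expected effect
```

    With these assumed inputs the HQ is about 0.03, which is how a "no potential health effects" conclusion like the one reported would be reached for a given element and exposure route.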

    Value of eight-amino-acid matches in predicting the allergenicity status of proteins: an empirical bioinformatic investigation

    The use of biotechnological techniques to introduce novel proteins into food crops (transgenic or GM crops) has motivated investigation into the properties of proteins that favor their potential to elicit allergic reactions. As part of the allergenicity assessment, bioinformatic approaches are used to compare the amino-acid sequence of a candidate protein with sequences in a database of known allergens, to predict potential cross-reactivity between novel food proteins and proteins to which people have become sensitized. Two criteria commonly used for these queries are searches for >35% identity over 80-amino-acid stretches, and searches for contiguous 8-amino-acid matches. We investigated the added value provided by the 8-amino-acid criterion over the >35%-identity-over-80-amino-acid criterion by identifying allergen pairs that met the former criterion but not the latter. We found that allergen-sequence pairs sharing only an 8-amino-acid identity, but not >35% identity over 80 amino acids, were unlikely to be cross-reactive allergens. Thus, the common search for 8-amino-acid identity between novel proteins and known allergens appears to be of little additional value in assessing the potential allergenicity of novel proteins.
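    The contiguous 8-amino-acid criterion described above amounts to a shared-k-mer search between a query protein and each known allergen. A minimal sketch follows, using toy sequences (real screening runs against curated allergen databases); the >35%-identity-over-80-amino-acid criterion additionally requires sequence alignment (e.g. with FASTA or BLAST) and is not shown:

```python
def shares_contiguous_kmer(query, allergen, k=8):
    """True if any contiguous k-amino-acid stretch of `query`
    also occurs in `allergen` (the 8-mer screening criterion)."""
    query_kmers = {query[i:i + k] for i in range(len(query) - k + 1)}
    return any(allergen[i:i + k] in query_kmers
               for i in range(len(allergen) - k + 1))

# Toy sequences for illustration only -- not real proteins.
query = "MKTAYIADQRSTLLKV"
hit = "GGGAYIADQRSTGGG"   # shares the 8-mer "AYIADQRS" with the query
miss = "WWWWWWWWWWWW"     # shares no 8-mer with the query

print(shares_contiguous_kmer(query, hit))   # True
print(shares_contiguous_kmer(query, miss))  # False
```

    A single shared 8-mer like this flags a pair as potentially cross-reactive under the criterion; the study's point is that such hits, absent broader sequence identity, rarely correspond to genuine cross-reactivity.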

    Allergic sensitization: screening methods

    Experimental in silico, in vitro, and rodent models for screening and predicting protein sensitizing potential are discussed, including whether there is evidence of new sensitizations and allergies since the introduction of genetically modified crops in 1996, the importance of linear versus conformational epitopes, and the protein families that become allergens. Some common challenges for predicting protein sensitization are addressed: (a) exposure routes; (b) frequency and dose of exposure; (c) dose-response relationships; (d) role of digestion, food processing, and the food matrix; (e) role of infection; (f) role of the gut microbiota; (g) influence of the structure and physicochemical properties of the protein; and (h) the genetic background and physiology of consumers. The consensus view is that sensitization screening models are not yet validated to definitively predict the de novo sensitizing potential of a novel protein. However, they would be extremely useful in the discovery and research phases of understanding the mechanisms of food allergy development, and may prove fruitful in providing information for the allergenicity risk assessment of future products on a case-by-case basis. These data and findings were presented at a 2012 international symposium in Prague organized by the Protein Allergenicity Technical Committee of the International Life Sciences Institute’s Health and Environmental Sciences Institute.

    Working conditions and public health risks in slaughterhouses in western Kenya

    Background: Inadequate facilities and hygiene at slaughterhouses can result in contamination of meat and occupational hazards for workers. The objectives of this study were to assess current conditions in slaughterhouses in western Kenya and the knowledge and practices of slaughterhouse workers regarding hygiene and sanitation. Methods: Between February and October 2012, all consenting slaughterhouses in the study area were recruited. A standardised questionnaire relating to facilities and practices in the slaughterhouse was administered to the foreperson at each site. A second questionnaire was used to capture individual slaughterhouse workers’ knowledge, practices and recent health events. Results: A total of 738 slaughterhouse workers from 142 slaughterhouses completed questionnaires. Many slaughterhouses had poor infrastructure: 65% (95% CI 63–67%) had a roof, cement floor and walls, 60% (95% CI 57–62%) had a toilet and 20% (95% CI 18–22%) had hand-washing facilities. The meat inspector visited 90% (95% CI 92–95%) of slaughterhouses, but antemortem inspection was practiced at only 7% (95% CI 6–8%). Nine percent (95% CI 7–10%) of slaughterhouses slaughtered sick animals. Only about half of workers wore personal protective clothing: 53% (95% CI 51–55%) wore protective coats and 49% (95% CI 46–51%) wore rubber boots. Knowledge of zoonotic disease was low, with only 31% (95% CI 29–33%) of workers aware that disease could be transmitted from animals. Conclusions: The current working conditions in slaughterhouses in western Kenya do not meet the recommendations of the Meat Control Act of Kenya. Current facilities and practices may increase occupational exposure to disease or injury, and contaminated meat may enter the consumer market. The findings of this study could enable the development of appropriate interventions to minimise public health risks. Initially, improvements need to be made to facilities and practices to improve worker safety and reduce the risk of food contamination. Simultaneously, training programmes should target workers and inspectors to improve awareness of the risks. In addition, education of health care workers should highlight the increased risks of injury and disease among slaughterhouse workers. Finally, enhanced surveillance targeting slaughterhouse workers could be used to detect disease outbreaks. This “One Health” approach to disease surveillance is likely to benefit workers, producers and consumers.

    Can we define a level of protection for allergic consumers that everyone can accept?

    Substantial progress has been made in characterising the risk associated with exposure to allergens in food. However, the absence of agreement on what level of risk is tolerable has made it difficult to set quantitative limits to manage that risk and protect allergic consumers effectively. This paper reviews scientific progress in the area, the diverse status of allergen management approaches, and the lack of common standards across different jurisdictions, including within the EU. This lack of regulation largely explains why allergic consumers find Precautionary Allergen Labelling confusing and cannot rely on it. We reviewed approaches to setting quantitative limits for a broad range of food safety hazards to identify the reasoning leading to their adoption. This revealed a diversity of approaches, from pragmatic to risk-based, but we could not find clear evidence of the process leading to decisions on risk acceptability. We propose a framework built around the criteria suggested by Murphy and Gardoni (2008) for defining tolerable risks. Applying these criteria to food allergy, we concluded that sufficient knowledge exists to implement the framework, including sufficient expertise across the whole range of stakeholders to allow opinions to be heard and respected, and a consensus to be achieved.

    The Evolution and Cultural Framing of Food Safety Management Systems – Where from and Where next?

    The aim of this paper is to review the development of food safety management systems (FSMS) from their origins in the 1950s to the present. The food safety challenges in modern food supply systems are explored, and it is argued that a more holistic approach to food safety management is needed. The narrative review highlights that, whilst the transactional elements of how FSMS are developed, validated, implemented, monitored and verified remain largely unchanged, how organizational culture frames the operation and efficacy of FSMS is becoming more important. The evolution of a wider academic and industry understanding of the influence of food safety culture (FS-culture), and of how such culture frames and enables, or conversely restricts, the efficacy of the FSMS, is crucial for consumer wellbeing. Potential research gaps worthy of further study are identified, and recommendations are given for the application of the research findings within the food industry.

    Influence of psychological factors in food risk assessment - A review

    Background: Typically, food-related risk assessments are carried out within a four-step technical framework, as detailed by the Codex Alimentarius Commission (World Health Organization/Food and Agriculture Organization of the United Nations, 2015). However, the technical framework presumes a level of ‘objective risk’ and does not take into account that risk is complex and psychologically constructed, something which is rarely acknowledged within risk analysis as a whole. It is well documented that people's perceptions of risk are based on more than mere probability of occurrence and reflect other, non-technical psychological factors (e.g., risk origin, severity, controllability, familiarity). Moreover, the basis of these risk perceptions is largely similar for experts and non-experts. Scope and approach: In this review, we consider each stage of the risk assessment process from a psychological perspective, reviewing research on non-technical factors which could affect assessments of risk and subsequent risk management decisions, with a particular focus on food safety. Key findings and conclusions: We identify 12 factors from the psychological literature which could potentially influence how risks are assessed and characterised. Drawing on insights from this research, we propose a number of recommendations to standardise approaches in risk assessment. Acknowledging and working with the subjectivity of risk is key to ensuring the efficacy of the wider risk analysis process.