    Poultry as a host for the zoonotic pathogen Campylobacter jejuni

    Campylobacteriosis is the most frequently reported foodborne gastroenteric disease and poses a serious health burden in industrialized countries. Disease in humans is mainly caused by the zoonotic pathogen Campylobacter jejuni. Because of its widespread occurrence in the environment, the epidemiology of Campylobacter remains poorly understood. It is generally accepted, however, that chickens are a natural host for Campylobacter jejuni, and for Campylobacter spp. in general, and that colonized broiler chicks are the primary vector for transmitting this pathogen to humans. Several potential sources and vectors for transmitting C. jejuni to broiler flocks have been identified. Initially, one or a few broilers become colonized, at any age from about two weeks until the end of rearing, after which the infection rapidly spreads throughout the entire flock. Such a flock generally remains colonized until slaughter, and infected birds carry a very high C. jejuni load in their gastrointestinal tract, especially the ceca. This eventually results in contaminated carcasses during processing, which can transmit this pathogen to humans. Recent genetic typing studies showed that chicken isolates can frequently be linked to human clinical cases of Campylobacter enteritis. However, despite the increasing evidence that the chicken reservoir is the number one risk factor for disease in humans, no effective strategy exists to reduce Campylobacter prevalence in poultry flocks, which can in part be explained by the incomplete understanding of the epidemiology of C. jejuni in broiler flocks. As a result, the number of human campylobacteriosis cases associated with the chicken vector remains strikingly high.

    Campylobacter control in poultry by current intervention measures ineffective: urgent need for intensified fundamental research

    Campylobacter-contaminated poultry meat is an important source of foodborne gastroenteritis and poses a serious health burden in industrialized countries. Broiler chickens are commonly regarded as a natural host for this pathogen, and infected birds carry a very high Campylobacter load in their gastrointestinal tract, especially the ceca. This results in contaminated carcasses during processing. While hygienic measures at the farm and control measures during carcass processing can somewhat reduce Campylobacter numbers on the retail product, intervention at the farm level, by reducing colonization of the ceca, should also be part of the overall control policy. This review gives an up-to-date overview of suggested on-farm control measures to reduce the prevalence and colonization of Campylobacter in poultry.

    Colonization factors of Campylobacter jejuni in the chicken gut

    Campylobacter-contaminated broiler chicken meat is an important source of foodborne gastroenteritis and poses a serious health burden in industrialized countries. Broiler chickens are commonly regarded as a natural host for this zoonotic pathogen, and infected birds carry a very high C. jejuni load in their gastrointestinal tract, especially the ceca. This eventually results in contaminated carcasses during processing. Current intervention methods fail to reduce the colonization of broiler chicks by C. jejuni, due in part to an incomplete understanding of the interaction between C. jejuni and its avian host. C. jejuni has clearly developed several survival and colonization mechanisms that account for its close adaptation to the chicken host, but how these mechanisms interact with one another to produce persistent, high-level cecal colonization remains largely obscure. A plethora of mutagenesis studies in recent years has identified several of the genes and proteins of C. jejuni involved in different aspects of this bacterium's cellular response in the chicken gut. In this review, a thorough, up-to-date overview is given of the survival mechanisms and colonization factors of C. jejuni identified to date. These factors may advance our understanding of how C. jejuni survival and colonization in chicks are mediated, as well as provide potential targets for effective subunit vaccine development.

    Microbiological risk assessment

    Microbiological risk assessment is defined by the Codex Alimentarius Commission as 'a scientifically based process consisting of the following steps: (i) hazard identification; (ii) hazard characterisation; (iii) exposure assessment; and (iv) risk characterisation'. It is one of the components of microbiological risk analysis, whose overall objective is to minimise food-borne risks to consumers. It is a complex discipline that continues to evolve, and its challenges and new opportunities were discussed during the breakout session 'Microbiological risk assessment' held at the EFSA 2nd Scientific Conference 'Shaping the Future of Food Safety, Together' (Milan, Italy, 14–16 October 2015). Discussions focussed on the estimation of the global burden of food-borne disease, the prioritisation of microbiological risks taking uncertainty into account, the challenges in risk assessment when dealing with viruses, the contribution of typing methods to risk assessment and approaches to dealing with uncertainty in risk assessment in emergency situations. It was concluded that the results of the global burden of food-borne disease study provide, for the first time, a comprehensive comparison of risks due to different hazards, and that this will be an important input to food safety strategies at the global, regional and national levels. Risk-ranking methodologies are an important tool for priority setting, but it is important to account for underestimation (e.g. due to reporting bias). Typing methods for microbial hazards inevitably affect risk assessment and can strongly influence the accuracy of source attribution studies. Because of their high genetic diversity and the limitations of current diagnostic methods, it is still challenging to obtain robust evidence for food-borne outbreaks caused by viruses, and more research is needed on the use of whole genome sequencing in this area. The lessons learnt from the recent enterohaemorrhagic Escherichia coli (EHEC) outbreak in Germany include the need for more effective and timely connections within and between institutions as responses unfold.
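
    The four Codex steps can be illustrated with a toy quantitative calculation. The sketch below chains an assumed exposure (step iii) through a single-hit exponential dose-response model (step ii) into per-serving and annual risk estimates (step iv); the pathogen choice, the dose-response parameter r, the contamination level and the serving statistics are all hypothetical placeholders, not values from any EFSA assessment.

```python
# Minimal sketch of the four Codex risk-assessment steps as a toy
# quantitative microbial risk assessment (QMRA). All parameter values
# (r, concentration, serving size) are hypothetical placeholders.
import math

# (i) Hazard identification: the pathogen chosen for the assessment.
pathogen = "Campylobacter jejuni"

# (ii) Hazard characterisation: a single-hit exponential dose-response
# model, P(ill) = 1 - exp(-r * dose), with an assumed r.
r = 0.00005  # hypothetical probability that one ingested cell causes illness

# (iii) Exposure assessment: dose per serving = concentration x mass.
concentration_cfu_per_g = 10.0  # assumed contamination level
serving_size_g = 100.0
dose = concentration_cfu_per_g * serving_size_g

# (iv) Risk characterisation: per-serving and annual risk.
p_ill_serving = 1 - math.exp(-r * dose)
servings_per_year = 50
p_ill_year = 1 - (1 - p_ill_serving) ** servings_per_year

print(f"Hazard: {pathogen}")
print(f"Risk per serving: {p_ill_serving:.4f}")
print(f"Annual risk over {servings_per_year} servings: {p_ill_year:.4f}")
```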

    The public health risk posed by Listeria monocytogenes in frozen fruit and vegetables including herbs, blanched during processing

    A multi-country outbreak of Listeria monocytogenes ST6 linked to blanched frozen vegetables (bfV) took place in the EU (2015–2018). Evidence from food-borne outbreaks shows that L. monocytogenes is the most relevant pathogen associated with bfV. The probability of illness per serving of uncooked bfV, for the elderly (65–74 years old) population, is up to 3,600 times greater than for cooked bfV and very likely lower than for any of the evaluated ready-to-eat food categories. The main factors affecting contamination and growth of L. monocytogenes in bfV during processing are the hygiene of the raw materials and process water; the hygienic conditions of the food processing environment (FPE); and the time/temperature (t/T) combinations used for storage and processing (e.g. blanching, cooling). Relevant factors after processing are the intrinsic characteristics of the bfV, the t/T combinations used for thawing and storage, and the subsequent cooking conditions, unless eaten uncooked. Analysis of the possible control options suggests that application of a complete HACCP plan is either not possible or would not further enhance food safety. Instead, specific prerequisite programmes (PRP) and operational PRP activities should be applied, such as cleaning and disinfection of the FPE, water control, t/T control, and product information and consumer awareness. The occurrence of low levels of L. monocytogenes at the end of the production process (e.g. <10 CFU/g) would be compatible with the limit of 100 CFU/g at the moment of consumption if labelling recommendations are strictly followed (i.e. 24 h at 5°C). Under reasonably foreseeable conditions of use (i.e. 48 h at 12°C), L. monocytogenes levels need to be considerably lower (not detected in 25 g). Routine monitoring programmes for L. monocytogenes should be designed following a risk-based approach and regularly revised based on trend analysis, with FPE monitoring being a key activity in the frozen vegetable industry.
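
    The compatibility argument in this abstract is essentially a growth calculation: start from the end-of-production level and ask whether growth during storage keeps the product under the 100 CFU/g limit at consumption. The sketch below runs that check with a simple log-linear growth model; the growth rates assumed for 5°C and 12°C are illustrative placeholders, not the EFSA exposure model.

```python
# Minimal sketch of the storage-growth check behind the abstract's
# numbers: does an end-of-production level stay below the 100 CFU/g
# limit at consumption? Growth rates are hypothetical placeholders.

def log10_final(log10_n0, mu_log10_per_h, hours):
    """Log-linear growth: log10 N(t) = log10 N0 + mu * t."""
    return log10_n0 + mu_log10_per_h * hours

log10_n0 = 1.0  # 10 CFU/g at the end of production
limit = 2.0     # 100 CFU/g limit at the moment of consumption

scenarios = {
    "label recommendation (24 h at 5°C)": (0.01, 24),  # assumed mu at 5°C
    "foreseeable use (48 h at 12°C)":     (0.05, 48),  # assumed mu at 12°C
}
for name, (mu, hours) in scenarios.items():
    n_final = log10_final(log10_n0, mu, hours)
    verdict = "within" if n_final <= limit else "exceeds"
    print(f"{name}: {n_final:.2f} log10 CFU/g ({verdict} the {limit} log10 limit)")
```

    With these assumed rates, the 24 h/5°C scenario stays under the limit while the 48 h/12°C scenario overshoots it, mirroring the abstract's conclusion that foreseeable-use conditions demand much lower starting levels.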

    Microbiological safety of aged meat

    The impact of dry-ageing of beef and wet-ageing of beef, pork and lamb on microbiological hazards and spoilage bacteria was examined, and current practices are described. As 'standard fresh' and wet-aged meat use similar processes, these were differentiated based on duration. In addition to a description of the different stages, data were collated on key parameters (time, temperature, pH and aw) using a literature survey and questionnaires. The microbiological hazards that may be present in all aged meats include Shiga toxin-producing Escherichia coli (STEC), Salmonella spp., Staphylococcus aureus, Listeria monocytogenes, enterotoxigenic Yersinia spp., Campylobacter spp. and Clostridium spp. Moulds, such as Aspergillus spp. and Penicillium spp., may produce mycotoxins when conditions are favourable, but this may be prevented by ensuring a meat surface temperature of −0.5 to 3.0°C, with a relative humidity (RH) of 75–85% and an airflow of 0.2–0.5 m/s for up to 35 days. The main meat spoilage bacteria include Pseudomonas spp., Lactobacillus spp., Enterococcus spp., Weissella spp., Brochothrix spp., Leuconostoc spp., Shewanella spp. and Clostridium spp. Under current practices, the ageing of meat may affect the load of microbiological hazards and spoilage bacteria compared with standard fresh meat preparation. Ageing under defined and controlled conditions can achieve the same or lower loads of microbiological hazards and spoilage bacteria than the variable log10 increases predicted during standard fresh meat preparation. An approach was used to establish the conditions of time and temperature that would achieve similar or lower levels of L. monocytogenes, Yersinia enterocolitica (pork only) and lactic acid bacteria (representing spoilage bacteria) compared with standard fresh meat. Finally, additional control activities were identified that would further assure the microbial safety of dry-aged beef, based on recommended best practice and the outputs of the equivalence assessment.
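
    The equivalence assessment can be pictured as comparing the predicted log10 increase of a target organism under an ageing regime against the increase expected in the standard fresh chain. The sketch below does this with a square-root-type (Ratkowsky) secondary model; the model parameters and both time/temperature regimes are illustrative assumptions, not the values used in the opinion.

```python
# Minimal sketch of the time/temperature equivalence idea: compare the
# predicted log10 increase under an ageing regime with that of standard
# fresh meat preparation. Model form and parameters are illustrative.

def growth_rate(temp_c, t_min_c=-1.5, b=0.015):
    """Square-root-type model: sqrt(mu) = b * (T - Tmin); mu in log10/h."""
    if temp_c <= t_min_c:
        return 0.0
    return (b * (temp_c - t_min_c)) ** 2

def log10_increase(temp_c, days):
    """Predicted log10 increase over a constant-temperature period."""
    return growth_rate(temp_c) * days * 24

aged = log10_increase(temp_c=0.5, days=21)   # assumed dry-ageing regime
fresh = log10_increase(temp_c=4.0, days=7)   # assumed standard fresh chain

print(f"Dry-aged: {aged:.2f} log10; standard fresh: {fresh:.2f} log10")
print("Equivalent or safer" if aged <= fresh else "Not equivalent")
```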

    The use of the so-called ‘superchilling’ technique for the transport of fresh fishery products

    Superchilling entails lowering the temperature of the fish to between its initial freezing point and about 1–2°C below it. The temperature of superchilled fresh fishery products (SFFP) in boxes without ice was compared to that of products subject to the currently authorised practice in boxes with ice (CFFP) under the same conditions of on-land storage and/or transport. A heat transfer model was developed and made available as a tool to identify under which initial configurations of SFFP the fish temperature, at any time of storage/transport, is lower than or equal to that of CFFP. A minimum degree of superchilling, corresponding to an ice fraction in the fish matrix of SFFP equal to or higher than the proportion of ice added per mass of fish in CFFP, will ensure with 99–100% certainty (almost certain) that the fish temperature of SFFP, and the consequent increase of relevant hazards, will be lower than or equal to that of CFFP. In practice, the degree of superchilling can be estimated from the fish temperature after superchilling and its initial freezing point, both of which are subject to uncertainty. The tool can be used as part of a 'safety-by-design' approach, with the reliability of its outcome depending on the accuracy of the input data. An evaluation of methods capable of detecting whether a previously frozen fish is commercially presented as 'superchilled' was carried out based on, amongst other criteria, their applicability to different fish species, ability to differentiate fresh fish from fish frozen at different temperatures, use as a stand-alone method, ease of use and classification performance. The methods considered 'fit for purpose' are the hydroxyacyl-coenzyme A dehydrogenase (HADH) test, the α-glucosidase test, histology, ultraviolet–visible–near-infrared (UV-VIS/NIR) spectroscopy and hyperspectral imaging. These methods would benefit from standardisation, including the establishment of threshold values or classification algorithms, to provide a practical routine test.
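
    The 'degree of superchilling' check can be sketched from the two quantities the abstract names: the fish temperature after superchilling and its initial freezing point. Below, a first-order freezing-point-depression approximation converts these into an estimated ice fraction, which is then compared with the ice-to-fish mass ratio of the conventional iced practice; the approximation, the assumed water content and all numbers are illustrative assumptions, not the EFSA heat transfer model.

```python
# Minimal sketch of the 'degree of superchilling' check: estimate the
# ice fraction in the fish from its temperature and initial freezing
# point, then compare it with the ice-to-fish mass ratio of the
# conventional iced practice. All values are illustrative assumptions.

def ice_fraction(fish_temp_c, initial_freezing_point_c, water_content=0.80):
    """First-order approximation: below the initial freezing point the
    unfrozen water fraction scales as Tf/T, so the ice formed per mass
    of fish is water_content * (1 - Tf/T)."""
    if fish_temp_c >= initial_freezing_point_c:
        return 0.0  # no ice above the initial freezing point
    return water_content * (1 - initial_freezing_point_c / fish_temp_c)

sffp_ice = ice_fraction(fish_temp_c=-2.0, initial_freezing_point_c=-1.0)
cffp_ice_per_fish = 0.30  # assumed kg of added ice per kg of fish in iced boxes

print(f"Estimated ice fraction in superchilled fish: {sffp_ice:.2f}")
print("Degree of superchilling sufficient" if sffp_ice >= cffp_ice_per_fish
      else "Insufficient superchilling")
```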

    Update and review of control options for Campylobacter in broilers at primary production

    The 2011 EFSA opinion on Campylobacter was updated using more recent scientific data. The relative risk reduction in EU human campylobacteriosis attributable to broiler meat was estimated for on-farm control options using Population Attributable Fractions (PAF) for interventions that reduce Campylobacter flock prevalence, by updating the modelling approach for interventions that reduce caecal concentrations and by reviewing the scientific literature. According to the PAF analyses calculated for six control options, the mean relative risk reductions that could be achieved by adopting each of these options individually are estimated to be substantial, but the width of the confidence intervals for all control options indicates a high degree of uncertainty in the specific risk-reduction potentials. The updated model resulted in lower estimates of impact than the model used in the previous opinion: a 3-log10 reduction in broiler caecal concentrations was estimated to reduce the relative EU risk of human campylobacteriosis attributable to broiler meat by 58%, compared to an estimate larger than 90% in the previous opinion. Expert Knowledge Elicitation was used to rank the control options, weight and integrate the different evidence streams, and assess uncertainties. The medians of the relative risk reductions of the selected control options had largely overlapping probability intervals, so the rank order is uncertain: vaccination 27% (90% probability interval (PI) 4–74%); feed and water additives 24% (90% PI 4–60%); discontinued thinning 18% (90% PI 5–65%); employing few and well-trained staff 16% (90% PI 5–45%); avoiding drinkers that allow standing water 15% (90% PI 4–53%); addition of disinfectants to drinking water 14% (90% PI 3–36%); hygienic anterooms 12% (90% PI 3–50%); designated tools per broiler house 7% (90% PI 1–18%). It is not possible to quantify the effects of combined control activities, because the evidence-derived estimates are interdependent and each carries a high level of uncertainty.
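
    The medians and 90% probability intervals quoted above are summaries of uncertainty distributions over each option's effect. The sketch below shows the mechanics of producing such a summary by Monte Carlo sampling of an uncertain effect size; the beta distribution and its parameters are arbitrary illustrations, not the elicited distributions from the opinion.

```python
# Minimal sketch of summarising an uncertain relative risk reduction as
# a median with a 90% probability interval, via Monte Carlo sampling.
# The Beta(2, 6) distribution (mean 0.25) is an arbitrary illustration.
import random
import statistics

random.seed(1)

# Uncertain risk reduction of a hypothetical control option.
samples = sorted(random.betavariate(2, 6) for _ in range(10_000))

median = statistics.median(samples)
lo = samples[int(0.05 * len(samples))]  # 5th percentile
hi = samples[int(0.95 * len(samples))]  # 95th percentile

print(f"Relative risk reduction: median {median:.0%}, 90% PI {lo:.0%}-{hi:.0%}")
```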

    Guidance on date marking and related food information: part 1 (date marking)

    A risk-based approach was developed for food business operators (FBO) to follow when deciding on the type of date marking (i.e. 'best before' date or 'use by' date), setting the shelf-life (i.e. time) and providing the related information on the label to ensure food safety. The decision on the type of date marking needs to be taken on a product-by-product basis, considering the relevant hazards, product characteristics, processing and storage conditions. Hazard identification is food-product-specific and should consider pathogenic microorganisms capable of growing in prepacked temperature-controlled foods under reasonably foreseeable conditions. The intrinsic (e.g. pH and aw), extrinsic (e.g. temperature and gas atmosphere) and implicit (e.g. interactions with competing background microbiota) factors of the food determine which pathogenic and spoilage microorganisms can grow in the food during storage until consumption. A decision tree was developed to assist FBOs in deciding the type of date marking for a certain food product. When setting the shelf-life, the FBO needs to consider reasonably foreseeable conditions of distribution, storage and use of the food. The key steps of a case-by-case procedure to determine and validate the shelf-life period are: (i) identification of the relevant pathogenic/spoilage microorganism and its initial level; (ii) characterisation of the factors of the food affecting its growth behaviour; and (iii) assessment of the growth behaviour of the pathogenic/spoilage microorganism in the food product during storage until consumption. Due to the variability between food products and consumer habits, it was not considered appropriate to present indicative time limits for food donated or marketed past the 'best before' date. Recommendations were provided relating to training activities and support, the use of 'reasonably foreseeable conditions', the collection of time–temperature data during distribution, retail and domestic storage of foods, and the development of Appropriate Levels of Protection and/or Food Safety Objectives for food–pathogen combinations.
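
    The decision tree for choosing between a 'use by' and a 'best before' date can be caricatured as a small branching function. The sketch below encodes the core question, whether relevant pathogens can grow in the food to unsafe levels under reasonably foreseeable conditions; the exact branching and inputs are simplified assumptions, not the published EFSA tree.

```python
# Minimal sketch of the date-marking decision logic: foods in which
# relevant pathogens can grow to unsafe levels under reasonably
# foreseeable storage conditions need a 'use by' date; otherwise
# quality, not safety, degrades first. Simplified, illustrative only.

def date_marking(supports_pathogen_growth: bool,
                 growth_limited_by_food_factors: bool) -> str:
    """Return the type of date mark for a prepacked, temperature-controlled food."""
    if supports_pathogen_growth and not growth_limited_by_food_factors:
        return "use by"       # safety-based date mark
    return "best before"      # quality-based date mark

print(date_marking(supports_pathogen_growth=True,
                   growth_limited_by_food_factors=False))  # -> use by
print(date_marking(supports_pathogen_growth=False,
                   growth_limited_by_food_factors=True))   # -> best before
```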