23 research outputs found

    Consumers' behavior in quantitative microbial risk assessment for pathogens in raw milk: Incorporation of the likelihood of consumption as a function of storage time and temperature

    Foodborne disease as a result of raw milk consumption is an increasing concern in Western countries. Quantitative microbial risk assessment models have been used to estimate the risk of illness due to different pathogens in raw milk. In these models, the duration and temperature of storage before consumption have a critical influence on the final outcome of the simulations and are usually described and modeled as independent distributions in the consumer-phase module. We hypothesize that this assumption can result in the computation, during simulations, of extreme scenarios that ultimately lead to an overestimation of the risk. In this study, a sensorial analysis was conducted to replicate consumers' behavior. The results of the analysis were used to establish, by means of a logistic model, the relationship between time–temperature combinations and the probability that a serving of raw milk is actually consumed. To assess our hypothesis, 2 recently published quantitative microbial risk assessment models quantifying the risks of listeriosis and salmonellosis related to the consumption of raw milk were implemented. First, the default settings described in the publications were kept; second, the likelihood of consumption as a function of the length and temperature of storage was included. When results were compared, the density of computed extreme scenarios decreased significantly in the modified model; consequently, the probability of illness and the expected number of cases per year also decreased. Reductions of 11.6% and 12.7% in the proportion of computed scenarios in which a contaminated milk serving was consumed were observed for the first and second study, respectively. Our results confirm that overlooking the time–temperature dependency may lead to a substantial overestimation of the risk. Furthermore, we provide estimates of this dependency that could easily be implemented in future quantitative microbial risk assessment models of raw milk pathogens.
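The dependency described above can be sketched as a logistic acceptance model inside a Monte Carlo consumer-phase module. The coefficients below are invented for illustration only; they are not the fitted values from the study.

```python
import math
import random

# Hypothetical logistic-model coefficients (illustrative only; the study's
# fitted values are not reproduced here).
B0, B_TIME, B_TEMP = 6.0, -0.05, -0.25

def p_consumed(hours, temp_c):
    """Probability that a raw-milk serving stored for `hours` at `temp_c`
    degrees Celsius is actually consumed (logistic model)."""
    z = B0 + B_TIME * hours + B_TEMP * temp_c
    return 1.0 / (1.0 + math.exp(-z))

random.seed(1)
# Monte Carlo: sample storage scenarios, then keep only those in which the
# serving is consumed, thinning out extreme long/warm storage combinations.
scenarios = [(random.uniform(0, 120), random.uniform(4, 25)) for _ in range(10_000)]
consumed = [s for s in scenarios if random.random() < p_consumed(*s)]
print(f"{len(consumed) / len(scenarios):.0%} of simulated servings consumed")
```

Plugging `p_consumed` into an existing risk model simply down-weights the extreme time–temperature scenarios rather than removing them outright.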

    Compassion on University Degree Programmes at a UK University: The Neuroscience of Effective Group Work

    This article is published under the Creative Commons Attribution (CC BY 4.0) licence; the full terms may be seen at http://creativecommons.org/licences/by/4.0/legalcode. Purpose: The purpose of this paper is to explore the neuroscience that underpins the psychology of compassion as a competency, and to explain why this cognitive competency is now taught and assessed on modules of different degree subjects in a UK university. Design/methodology/approach: The paper first explores recent psychology and neuroscience literature that illuminates the differences, and the relationship, between empathy and compassion for safeness building in teams. Within that, the role of oxytocin in achieving social and intellectual rewards through the exercise of cognitive flexibility, working memory and inhibitory control (Zelazo et al., 2016) is also identified. The literature findings are compared against relevant qualitative data from the university's nine years, to date, of mixed-methods action research on compassion-focused pedagogy (CfP). Findings: The concept and practice of embedding compassion as a cognitive competency into assessed university group work are illuminated and rationalised by research findings in neuroscience. Research limitations/implications: So far, fMRI research methods have not been used to investigate student subjects involved in the compassion-focused pedagogy now in use. Practical implications: The paper has implications for theory, policy and practice in relation to managing the increasing amount of group work that accompanies widening participation in higher education. Originality/value: A review of this kind, specifically for assessed student group work and its implications for student academic achievement and mental health, has not apparently been published.

    Unravelling transmission of Mycobacterium avium subspecies paratuberculosis to dairy calves: results of a lifelong longitudinal study

    Johne's disease (JD) is a chronic disease of ruminants, endemic in the UK and other countries, and responsible for large economic losses for the dairy sector. JD is caused by Mycobacterium avium subspecies paratuberculosis (MAP), which typically infects calves that remain latently infected for a long period, making early detection of infection challenging. Cow-to-calf transmission can occur in utero, via milk/colostrum or faecal-orally. Understanding the different transmission routes to calves is important in informing control recommendations. Our aim in this longitudinal study was to measure the association between the transmission routes via the dam and the environment and a calf subsequently testing serologically positive for MAP. The study population comprised 439 UK dairy calves from 6 herds enrolled between 2012 and 2013. These calves were followed up from birth until 2023. Individual calf data were captured at birth. During follow-up, individuals entering the milking herd were tested quarterly for the presence of MAP antibodies using milk ELISA. Cox regression models were used to measure the association between exposure from the dam (in utero and/or via colostrum) or from the environment (prolonged time in a dirty yard) and time to first detection of MAP infection. An association between calves born to positive dams and the probability of a MAP-positive test result remained after excluding potential MAP transmission via colostrum (hazard ratio: 2.24; 95% CI: 1.14–4.41). Calves unlikely to be infected with MAP via the in-utero or colostrum route had a 3.68 (95% CI: 1.45–9.33) times higher hazard of a positive test result when they stayed longer in a dirty calving area. The effect of the dam's infection status on transmission to calves precedes the dam's seroconversion and persists after excluding the potential role of transmission via colostrum. The association between time spent in a dirty calving area and the probability of a MAP-positive test result highlights the role of environmental contamination as a source of infection in addition to the dam. [Abstract copyright: Copyright © 2023 The Authors. Published by Elsevier B.V. All rights reserved.]
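To illustrate what a hazard ratio of 2.24 implies for cumulative risk, a minimal sketch under an assumed exponential (constant baseline hazard) survival model; the baseline hazard value is invented, and only the 2.24 ratio comes from the abstract. A Cox model makes no such parametric assumption, so this is purely a reading aid.

```python
import math

BASELINE_HAZARD = 0.10  # first positive tests per animal-year (assumed, not from the study)
HR_POSITIVE_DAM = 2.24  # hazard ratio reported in the abstract

def prob_positive_by(t_years, hazard_ratio=1.0):
    """P(first MAP-positive test by time t) under an exponential hazard model:
    1 - exp(-baseline_hazard * hazard_ratio * t)."""
    return 1.0 - math.exp(-BASELINE_HAZARD * hazard_ratio * t_years)

for t in (2, 5):
    base = prob_positive_by(t)
    exposed = prob_positive_by(t, HR_POSITIVE_DAM)
    print(f"by year {t}: baseline {base:.1%}, calves of positive dams {exposed:.1%}")
```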

    Risk-based inspection as a cost-effective strategy to reduce human exposure to cysticerci of Taenia saginata in low-prevalence settings

    The cysticercus is the larval stage of the zoonotic parasite Taenia saginata, whose life cycle involves both cattle and humans. The public health impact is considered low. The current surveillance system, based on post-mortem inspection of carcasses, has low sensitivity and leads to a considerable economic burden. Therefore, in the interests of public health and food production efficiency, this study aims to explore the potential of risk-based and cost-effective meat inspection activities for the detection and control of T. saginata cysticercus in low-prevalence settings.

    Microbiological risk ranking of foodborne pathogens and food products in scarce-data settings

    In the absence of epidemiological, microbiological or outbreak data, systematic identification of the hazards and food products posing the highest risk to consumers is challenging. It is usually in low- and middle-income countries (LMICs), where the burden of foodborne disease is highest, that data tend to be particularly scarce. In this study, we propose qualitative risk-ranking methods for pathogens and food products that can be used in settings where scarcity of data on the frequency/concentration of pathogens in foodstuffs is a barrier to the use of classical risk assessment frameworks. The approach integrates existing knowledge on foodborne pathogens, manufacturing processes and the intrinsic/extrinsic properties of food products with key context-specific information regarding the supply chain(s), the characteristics of the Food Business Operators (FBOs) and cultural habits to identify: (i) the pathogens that should be considered a “High” food safety priority and (ii) the food products posing the highest risk of consumer exposure to microbiological hazards via the oral (ingestion) route. When applied to the dairy sector of Andhra Pradesh (India) as a case study, Shiga toxin-producing E. coli, Salmonella spp., S. aureus and L. monocytogenes were identified as a “High” food safety priority across all FBOs. C. sakazakii was identified as a “High” priority for the FBOs producing infant formula/milk powder, whilst Shigella spp. and Cryptosporidium spp. were a “High” priority when considering the FBOs operating in the unregulated sector. Given the diversity of dairy products considered in the assessment, cluster analysis was used to identify products that shared similar intrinsic/extrinsic features known to drive the microbiological risk. The risk ranking was then performed by integrating the results of the cluster analysis with context-specific information.
Products manufactured/retailed by FBOs in the informal market were considered to pose a “High” risk to consumers due to a widespread lack of compliance with sanitary regulations. For dairy products produced by FBOs operating in the middle and formal end of the formal–informal spectrum, the risk of consumer exposure to microbiological hazards ranged from “Moderate” to “Extremely low” depending on the FBO and the intrinsic/extrinsic properties of the products. While providing risk estimates of lower precision than data-driven risk assessments, the proposed method maximises the value of the information that can be easily gathered in LMICs and provides informative outputs to support food safety decision-making in contexts where resources for the prevention of foodborne diseases are limited and the food system is complex.
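A qualitative ranking of this kind can be sketched as a simple category-matrix lookup. The combination rule below (mean of the two category ranks) is a generic illustration of a qualitative risk matrix, not the authors' scoring scheme.

```python
# Ordered qualitative levels, lowest to highest risk.
LEVELS = ["Extremely low", "Very low", "Low", "Moderate", "High"]

def rank(pathogen_priority: str, exposure: str) -> str:
    """Combine a pathogen-priority category with a product/FBO exposure
    category by averaging their ranks on the ordinal scale (illustrative
    rule only -- not the method used in the study)."""
    i = LEVELS.index(pathogen_priority)
    j = LEVELS.index(exposure)
    return LEVELS[round((i + j) / 2)]

print(rank("High", "High"))      # e.g. priority pathogen, informal-market product
print(rank("High", "Very low"))  # e.g. priority pathogen, formal-sector product
```

Any monotone combination rule (worst-of, weighted average, explicit lookup table) could be substituted; the point is that two ordinal inputs map to one ordinal output.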

    Update and review of control options for Campylobacter in broilers at primary production

    The 2011 EFSA opinion on Campylobacter was updated using more recent scientific data. The relative risk reduction in EU human campylobacteriosis attributable to broiler meat was estimated for on-farm control options using population attributable fractions (PAF) for interventions that reduce Campylobacter flock prevalence, updating the modelling approach for interventions that reduce caecal concentrations, and reviewing the scientific literature. According to the PAF analyses calculated for six control options, the mean relative risk reductions that could be achieved by adopting each option individually are estimated to be substantial, but the width of the confidence intervals for all control options indicates a high degree of uncertainty in the specific risk-reduction potentials. The updated model resulted in lower estimates of impact than the model used in the previous opinion: a 3-log10 reduction in broiler caecal concentrations was estimated to reduce the relative EU risk of human campylobacteriosis attributable to broiler meat by 58%, compared with an estimate larger than 90% in the previous opinion. Expert Knowledge Elicitation was used to rank control options, weight and integrate different evidence streams, and assess uncertainties. The medians of the relative risk reductions of the selected control options had largely overlapping probability intervals, so the rank order is uncertain: vaccination 27% (90% probability interval (PI) 4–74%); feed and water additives 24% (90% PI 4–60%); discontinued thinning 18% (90% PI 5–65%); employing few and well-trained staff 16% (90% PI 5–45%); avoiding drinkers that allow standing water 15% (90% PI 4–53%); addition of disinfectants to drinking water 14% (90% PI 3–36%); hygienic anterooms 12% (90% PI 3–50%); designated tools per broiler house 7% (90% PI 1–18%).
It is not possible to quantify the effects of combined control activities because the evidence-derived estimates are interdependent and there is a high level of uncertainty associated with each.
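The population attributable fraction mentioned above is conventionally computed with Levin's formula; a minimal sketch with made-up inputs (the prevalence and relative risk below are not values from the opinion):

```python
def paf(prevalence_exposed: float, relative_risk: float) -> float:
    """Levin's formula: the fraction of cases in the population attributable
    to the exposure, PAF = p(RR - 1) / (1 + p(RR - 1))."""
    p, rr = prevalence_exposed, relative_risk
    return p * (rr - 1.0) / (1.0 + p * (rr - 1.0))

# e.g. if 30% of flocks carried the exposure and it tripled the risk:
print(f"PAF = {paf(0.30, 3.0):.1%}")  # 37.5%
```

A PAF of 37.5% would mean that removing the exposure entirely could avert at most that share of cases, which is how an intervention's maximum risk-reduction potential is framed.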

    The efficacy and safety of high-pressure processing of food

    High-pressure processing (HPP) is a non-thermal treatment in which, for microbial inactivation, foods are subjected to isostatic pressures (P) of 400–600 MPa with common holding times (t) from 1.5 to 6 min. The main factors that influence the efficacy (log10 reduction of vegetative microorganisms) of HPP when applied to foodstuffs are intrinsic (e.g. water activity and pH), extrinsic (P and t) and microorganism-related (type, taxonomic unit, strain and physiological state). It was concluded that HPP of food will not present any additional microbial or chemical food safety concerns when compared to other routinely applied treatments (e.g. pasteurisation). Pathogen reductions in milk/colostrum caused by the current HPP conditions applied by the industry are lower than those achieved by the legal requirements for thermal pasteurisation. However, HPP minimum requirements (P/t combinations) could be identified to achieve specific log10 reductions of relevant hazards based on performance criteria (PC) proposed by international standard agencies (5–8 log10 reductions). The most stringent HPP conditions used industrially (600 MPa, 6 min) would achieve the above-mentioned PC, except for Staphylococcus aureus. Alkaline phosphatase (ALP), the endogenous milk enzyme that is widely used to verify adequate thermal pasteurisation of cows’ milk, is relatively pressure resistant and its use would be limited to that of an overprocessing indicator. Current data are not robust enough to support the proposal of an appropriate indicator to verify the efficacy of HPP under the current HPP conditions applied by the industry. Minimum HPP requirements to reduce Listeria monocytogenes levels by specific log10 reductions could be identified when HPP is applied to ready-to-eat (RTE) cooked meat products, but not for other types of RTE foods. These identified minimum requirements would result in the inactivation of other relevant pathogens (Salmonella and Escherichia coli) in these RTE foods to a similar or higher extent.
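The log10-reduction efficacy measure and the performance criteria (PC) it is checked against reduce to a short calculation; the microbial counts below are invented for illustration.

```python
import math

def log10_reduction(n0: float, n: float) -> float:
    """log10 reduction achieved by a treatment: log10(count before / count after)."""
    return math.log10(n0 / n)

def meets_pc(n0: float, n: float, pc: float = 5.0) -> bool:
    """Does the treatment achieve the performance criterion, e.g. a 5-log10
    reduction (the low end of the 5-8 log10 range cited above)?"""
    return log10_reduction(n0, n) >= pc

print(log10_reduction(1e7, 1e2))   # 5.0
print(meets_pc(1e7, 1e2, pc=5.0))  # True
```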
