Modelling the growth of Clostridium perfringens during the cooling of bulk meat
A dynamic predictive model was developed to describe the effects of temperature, pH and NaCl concentration on the growth of Clostridium perfringens type A. The model for the specific growth rate was based on 81 growth curves generated in our laboratory or obtained from the publicly available ComBase database. Growth curves obtained during cooling were fitted with the dynamic model of Baranyi and Roberts. This made it possible to determine the parameter value reflecting the physiological state of C. perfringens after heating profiles typically applied to bulk meat. The model with the obtained parameters provided a good description of the growth of C. perfringens in 24 heating/cooling curves generated specifically for this work (various non-isothermal treatments with a range of combinations of pH and NaCl concentration), and also for existing literature data. The dynamic model was implemented in Perfringens Predictor, a web-based application that can be accessed free of charge via www.combase.cc. It is anticipated that the use of this model and Perfringens Predictor will contribute to a reduction in the incidence of food poisoning associated with C. perfringens.
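For illustration, the sketch below integrates the standard two-equation form of the Baranyi and Roberts model along a cooling profile in Python, re-evaluating the specific growth rate from the instantaneous temperature at each time step, which is what allows a single primary model to follow arbitrary heating/cooling profiles. The secondary model, the cooling curve and all parameter values (including the initial physiological state q0) are illustrative assumptions, not the values fitted in this study or used in Perfringens Predictor.

    # Minimal sketch of the Baranyi & Roberts dynamic growth model integrated
    # along a cooling profile. The Ratkowsky-type secondary model and all
    # parameter values are illustrative placeholders, not fitted values.
    import numpy as np
    from scipy.integrate import solve_ivp

    def mu_max(T_celsius):
        # Hypothetical secondary model: growth only between ~12 and ~52 degrees C.
        T_min, T_max, b, c = 12.0, 52.0, 0.04, 0.3
        if T_celsius <= T_min or T_celsius >= T_max:
            return 0.0
        return (b * (T_celsius - T_min) * (1.0 - np.exp(c * (T_celsius - T_max)))) ** 2

    def cooling_profile(t_hours):
        # Illustrative exponential cooling of bulk meat from 50 towards 10 degrees C.
        return 10.0 + 40.0 * np.exp(-t_hours / 6.0)

    def baranyi_rhs(t, state, y_max):
        y, q = state                     # y = ln(cell concentration), q = physiological state
        mu = mu_max(cooling_profile(t))  # growth rate follows the instantaneous temperature
        dy = mu * (q / (1.0 + q)) * (1.0 - np.exp(y - y_max))
        dq = mu * q
        return [dy, dq]

    y0 = np.log(10.0 ** 1.0)             # 1 log10 CFU/g after heating (assumed)
    q0 = 0.01                            # low physiological state, reflecting heat stress (assumed)
    y_max = np.log(10.0 ** 8.0)          # assumed maximum population density
    sol = solve_ivp(baranyi_rhs, (0.0, 18.0), [y0, q0], args=(y_max,))
    print("log10 increase over an 18 h cooling profile:",
          round((sol.y[0, -1] - y0) / np.log(10.0), 2))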
Development and application of a new method for specific and sensitive enumeration of spores of nonproteolytic Clostridium botulinum types B, E, and F in foods and food materials
The highly potent botulinum neurotoxins are responsible for botulism, a severe neuroparalytic disease. Strains of nonproteolytic Clostridium botulinum form neurotoxins of types B, E, and F and are the main hazard associated with minimally heated refrigerated foods. Recent developments in quantitative microbiological risk assessment (QMRA) and food safety objectives (FSO) have made food safety more quantitative and include, as inputs, probability distributions for the contamination of food materials and foods. A new method that combines a selective enrichment culture with multiplex PCR has been developed and validated to specifically enumerate the spores of nonproteolytic C. botulinum. Key features of this new method include the following: (i) it is specific for nonproteolytic C. botulinum (and does not detect proteolytic C. botulinum), (ii) the detection limit has been determined for each food tested (using carefully structured control samples), and (iii) a low detection limit has been achieved by the use of selective enrichment and large test samples. The method has been used to enumerate spores of nonproteolytic C. botulinum in 637 samples of 19 food materials included in pasta-based minimally heated refrigerated foods and in 7 complete foods. A total of 32 samples (5 egg pastas and 27 scallops) contained spores of nonproteolytic C. botulinum type B or F. The majority of samples contained <100 spores/kg, but one sample of scallops contained 444 spores/kg. Nonproteolytic C. botulinum type E was not detected. Importantly for QMRA and FSO, the construction of probability distributions will enable the frequency of packs containing particular levels of contamination to be determined.
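For QMRA inputs of this kind, spore concentrations are often back-calculated from presence/absence results on replicate enriched test portions. The sketch below shows a generic single-dilution, Poisson-based most-probable-number calculation in Python; it is not necessarily the statistical treatment used in the study, and the counts in the example are hypothetical.

    # Single-dilution MPN-style estimate of spores/kg from presence/absence
    # enrichment results, assuming Poisson-distributed spores. Counts are hypothetical.
    import math

    def poisson_mpn(positive, total, portion_kg):
        # Maximum-likelihood spores/kg when `positive` of `total` portions,
        # each of `portion_kg` kg, are positive after selective enrichment.
        if positive == 0:
            return 0.0                   # below the detection limit of this sample set
        if positive == total:
            return float("inf")          # all portions positive: no finite upper estimate
        negative_fraction = (total - positive) / total
        return -math.log(negative_fraction) / portion_kg

    # Example: 5 positives out of 30 replicate 25 g (0.025 kg) test portions.
    estimate = poisson_mpn(positive=5, total=30, portion_kg=0.025)
    print(f"Estimated concentration: {estimate:.0f} spores/kg")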
Can Health-care Assistant Training improve the relational care of older people? (CHAT) A development and feasibility study of a complex intervention
Background: Older people account for an increasing proportion of those receiving NHS acute care. The quality of health care delivered to older people has come under increased scrutiny. Health-care assistants (HCAs) provide much of the direct care of older people in hospital. Patients' experience of care tends to be based on the relational aspects of that care, including dignity, empathy and emotional support. Objective(s): We aimed to understand the relational care training needs of HCAs caring for older people, design a relational care training intervention for HCAs and assess the feasibility of a cluster randomised controlled trial to test the new intervention against HCA training as usual (TAU). Design: (1) A telephone survey of all NHS hospital trusts in England to assess current HCA training provision, (2) focus groups of older people and carers, (3) semistructured interviews with HCAs and other care staff to establish training needs and inform intervention development and (4) a feasibility cluster randomised controlled trial. Setting: (1) All acute NHS hospital trusts in England, and (2-4) three acute NHS hospital trusts in England and the populations they serve. Participants: (1) Representatives of 113 out of the total of 161 (70.2%) NHS trusts in England took part in the telephone survey, (2) 29 older people or carer participants in three focus groups, (3) 30 HCA and 24 'other staff' interviewees and (4) 12 wards (four per trust), 112 HCAs, 92 patients during the prerandomisation period and 67 patients during the postrandomisation period. Interventions: For the feasibility trial, a training intervention (Older People's Shoes™) for HCAs developed as part of the study was compared with HCA TAU. Main outcome measures: Patient-level outcomes were the experience of emotional care and quality of life during patients' hospital stay, as measured by the Patient Evaluation of Emotional Care during Hospitalisation and the EuroQol-5 Dimensions questionnaires. HCA outcomes were empathy, as measured by the Toronto Empathy Questionnaire, and attitudes towards older people, as measured by the Age Group Evaluation and Description Inventory. Ward-level outcomes were the quality of HCA-patient interaction, as measured by the Quality of Interaction Scale. Results: (1) One-third of trust telephone survey participants reported HCA training content that we considered to be 'relational care'. Training for HCAs is variable across trusts and is focused on new recruits. The biggest challenge for HCA training is getting HCAs released from ward duties. (2) Older people and carers are aware of the pressures that ward staff are under, but good relationships with care staff determine whether or not their experience of hospital is positive. (3) HCAs have training needs related to 'difficult conversations' with patients and relatives; they have particular preferences for learning styles that are not always reflected in available training. (4) In the feasibility trial, 187 of the 192 planned ward observation sessions were completed; the response to HCA questionnaires at baseline and at 8 and 12 weeks post randomisation was 64.2%, 46.4% and 35.7%, respectively, and 57.2% of eligible patients returned completed questionnaires. Limitations: This was an intervention development and feasibility study, so no conclusions can be drawn about the clinical effectiveness or cost-effectiveness of the intervention. Conclusions: The intervention had high acceptability among nurse trainers and HCA learners.
Viability of a definitive trial is conditional on overcoming specific methodological (patient recruitment processes) and contextual (involvement of the wider ward team) challenges.
A systematic review of the clinical, public health and cost-effectiveness of rapid diagnostic tests for the detection and identification of bacterial intestinal pathogens in faeces and food
OBJECTIVES: To determine the diagnostic accuracy of tests for the rapid diagnosis of bacterial food poisoning in clinical and public health practice and to estimate the cost-effectiveness of these assays in a hypothetical population in order to inform policy on the use of these tests. DATA SOURCES: Studies evaluating the diagnostic accuracy of rapid tests were retrieved using electronic databases and by handsearching reference lists and key journals. Hospital laboratories and test manufacturers were contacted for cost data, and clinicians involved in the care of patients with food poisoning were invited to discuss the conclusions of this review using the nominal group technique. REVIEW METHODS: A systematic review of the current medical literature on assays used for the rapid diagnosis of bacterial food poisoning was carried out. Specific organisms under review were Salmonella, Campylobacter, Escherichia coli O157, Staphylococcus aureus, Clostridium perfringens and Bacillus cereus. Data extraction was undertaken using standardised data extraction forms. Where a sufficient number of studies evaluating comparable tests were identified, meta-analysis was performed. A decision analytic model was developed, using effectiveness data from the review and cost data from hospitals and manufacturers, which contributed to an assessment of the cost-effectiveness of rapid tests in a hypothetical UK population. Finally, diagnostic accuracy and cost-effectiveness results were presented to a focus group of GPs, microbiologists and consultants in communicable disease control, to assess professional opinion on the use of rapid tests in the diagnosis of food poisoning. RESULTS: Good test performance levels were observed with rapid test methods, especially for polymerase chain reaction (PCR) assays. The estimated levels of diagnostic accuracy, based on the area under the summary receiver operating characteristic curve, were very high. Indeed, although traditional culture is the natural reference test to use for comparative statistical analysis, on many occasions the rapid test outperforms culture, detecting additional 'truly' positive cases of food-borne illness. The significance of these additional positives requires further investigation. Economic modelling suggests that adoption of rapid tests in combination with routine culture is unlikely to be cost-effective; however, as the cost of rapid technologies decreases, total replacement with rapid technologies may become feasible. CONCLUSIONS: Despite the relatively poor quality of reporting of studies evaluating rapid detection methods, the reviewed evidence shows that PCR for Campylobacter, Salmonella and E. coli O157 is potentially very successful in identifying pathogens, possibly detecting more than the number currently reported using culture. Less is known about the benefits of testing for B. cereus, C. perfringens and S. aureus. Further investigation is needed into how clinical outcomes may be altered if test results are available more quickly and with greater precision than in the current practice of bacterial culture.
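As context for the summary receiver operating characteristic (SROC) analysis mentioned above, the sketch below applies the Moses-Littenberg regression, a common meta-analytic approach for diagnostic accuracy at the time of this review; the review itself may have used a different model, and the 2x2 study counts in the example are hypothetical.

    # Moses-Littenberg summary ROC meta-analysis sketch, with the AUC obtained by
    # numerical integration. The 2x2 counts (TP, FN, FP, TN) below are hypothetical.
    import numpy as np

    studies = [(90, 10, 5, 95), (47, 3, 8, 92), (60, 15, 2, 148), (35, 5, 4, 76)]

    def logit(p):
        return np.log(p / (1.0 - p))

    D, S = [], []
    for tp, fn, fp, tn in studies:
        tpr = (tp + 0.5) / (tp + fn + 1.0)   # continuity-corrected sensitivity
        fpr = (fp + 0.5) / (fp + tn + 1.0)   # continuity-corrected 1 - specificity
        D.append(logit(tpr) - logit(fpr))    # log diagnostic odds ratio
        S.append(logit(tpr) + logit(fpr))    # threshold proxy

    b, a = np.polyfit(S, D, 1)               # fit D = a + b*S

    fpr_grid = np.linspace(0.001, 0.999, 999)
    logit_tpr = (a + (1.0 + b) * logit(fpr_grid)) / (1.0 - b)
    tpr_grid = 1.0 / (1.0 + np.exp(-logit_tpr))
    auc = np.trapz(tpr_grid, fpr_grid)       # area under the summary ROC curve
    print(f"SROC AUC: {auc:.3f}")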
Validation of three rapid screening methods for detection of verotoxin-producing Escherichia coli in foods:interlaboratory study
An interlaboratory study was conducted for the validation of 3 methods for the detection of all verotoxin-producing Escherichia coli (VTEC) in foods. The methods were a multi-analyte 1-step lateral flow immunoassay (LFIA) for detection of E. coli O157 and verotoxin (VT); an enzyme-linked immunosorbent assay targeted against VT1, VT2, and VT2c (VT-ELISA); and a polymerase chain reaction (PCR) method for detection of VT genes (VT-PCR). Aliquots (25 g or 25 mL) of 4 food types (raw minced [ground] beef, unpasteurized milk, unpasteurized apple juice [cider], and salami) were individually inoculated with low numbers (<9 to 375 cells/25 g) of 6 test strains of E. coli (serogroups O26, O103, O111, O145, and O157) with differing VT-producing capabilities. Five replicates for each test strain and 5 uninoculated samples were prepared for each food type. Fourteen participating laboratories analyzed samples using the LFIA, 9 analyzed the samples by ELISA, and 9 by PCR. The LFIA for O157 and VT had a specificity (correct identification of negative samples) of 92 and 94%, respectively, and a sensitivity (correct identification of positive samples) of 94 and 55%, respectively. The VT-ELISA and VT-PCR had a specificity of 98 and 99%, respectively, and a sensitivity of 89 and 72%, respectively.
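The sensitivity and specificity figures above are pooled proportions of correctly classified inoculated and uninoculated samples across the participating laboratories. A minimal sketch of that pooling is shown below; the per-laboratory counts are hypothetical, not the raw data from this study.

    # Pool sensitivity and specificity across laboratories; counts are hypothetical.
    def pooled_rates(results):
        # results: (correct positives, inoculated, correct negatives, uninoculated) per laboratory
        cp = sum(r[0] for r in results)
        ip = sum(r[1] for r in results)
        cn = sum(r[2] for r in results)
        un = sum(r[3] for r in results)
        return 100.0 * cp / ip, 100.0 * cn / un   # (sensitivity %, specificity %)

    # Example: three laboratories, each testing 30 inoculated and 5 uninoculated samples.
    labs = [(28, 30, 5, 5), (27, 30, 4, 5), (29, 30, 5, 5)]
    sens, spec = pooled_rates(labs)
    print(f"Pooled sensitivity: {sens:.0f}%, pooled specificity: {spec:.0f}%")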