Combined Loop-Mediated Isothermal Amplification Assays for Rapid Detection and One-Step Differentiation of Campylobacter jejuni and Campylobacter coli in Meat Products
A loop-mediated isothermal amplification (LAMP) assay system was established, allowing rplD gene-based simultaneous detection of Campylobacter jejuni and Campylobacter coli in enriched meat products. Additionally, one-step differentiation of the target species on agar plates was enabled by cdtC gene- and gyrA gene-based duplex LAMP. Both the rplD and cdtC–gyrA LAMP assays amplified the target sequences in all 62 C. jejuni and 27 C. coli strains used for determining inclusivity and revealed 100% exclusivity toward 85 tested non-target species. Throughout all experiments, C. jejuni and C. coli strains were 100% distinguishable by the melting curves of cdtC and gyrA LAMP products. After 24-h enrichment, the rplD LAMP assay reliably detected initial inoculation levels of 10–100 CFU/g in artificially contaminated minced meat. Investigation of naturally contaminated meat samples revealed a diagnostic accuracy of 95% against real-time PCR and 94.1% against the standard culture method when applying the 24-h incubation period. Diagnostic sensitivity and specificity, and positive and negative predictive values were 89.8, 100, 100, and 91.2%, respectively, when measured against real-time PCR, and 89.6, 98.1, 97.7, and 91.2%, respectively, when measured against the standard culture method. After 48-h enrichment, the detection limit of the rplD LAMP assay improved to initial inoculation levels of 1–10 CFU/g in artificially contaminated minced meat. Applying the 48-h incubation period to naturally contaminated meat samples resulted in 100% concordant results between rplD LAMP, real-time PCR, and the standard culture method. The established LAMP assay system proved suitable for rapid meat sample screening. Furthermore, it constitutes a promising tool for investigating other Campylobacter sources and could therefore make a valuable contribution to protecting consumers from foodborne illness.
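The sensitivity, specificity, and predictive values quoted above follow from a standard 2×2 diagnostic table. As a minimal sketch of that arithmetic, with hypothetical counts chosen only for illustration (not the study's raw data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2-table diagnostic accuracy metrics, returned as percentages."""
    sensitivity = 100 * tp / (tp + fn)   # index-test positives among reference-positives
    specificity = 100 * tn / (tn + fp)   # index-test negatives among reference-negatives
    ppv = 100 * tp / (tp + fp)           # positive predictive value
    npv = 100 * tn / (tn + fn)           # negative predictive value
    accuracy = 100 * (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, ppv, npv, accuracy

# Hypothetical counts for illustration only:
sens, spec, ppv, npv, acc = diagnostic_metrics(tp=44, fp=0, fn=5, tn=51)
print(f"sens {sens:.1f}%, spec {spec:.1f}%, PPV {ppv:.1f}%, NPV {npv:.1f}%, accuracy {acc:.1f}%")
```

With zero false positives, specificity and PPV are both 100%, matching the pattern reported against real-time PCR.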
Improving fodder yields and nutritive value of some forage grasses as animal feeds through intercropping with Egyptian clover (Trifolium alexandrinum L.)
The present study evaluated the potential of improving the feeding value of Egyptian clover (EC), ryegrass (R), triticale (T), barley (B), and oats (O) grown in monoculture, or of Egyptian clover mixed with ryegrass (EC+R), oats (EC+O), barley (EC+B), or triticale (EC+T) at a 75:25% seeding rate, during two successive winter seasons, 2018/19 and 2019/20. Plots were harvested at 5 cm stubble height 60, 100, and 140 days after sowing. The in vitro nutritive value and ruminal fermentation of the monocultures and the EC-containing intercrops were evaluated. Green forage yield of EC was higher than that of the other plants, at about 160% of the fresh forage of T, O, or the EC+T intercrop. The highest crude protein (CP) concentration was noted in EC, while the lowest (p < 0.001) concentration was observed in T, which had the highest fiber fraction content. Ryegrass had the highest net in vitro gas production (GP), while EC+R had the lowest GP (p < 0.05). EC increased dry matter and organic matter degradability. EC and R reduced the protozoal count, while total volatile fatty acids (VFA), acetate, and propionate were increased with B and the EC+T intercrop (p < 0.05). Overall, intercropping EC with triticale or ryegrass at a 75:25% mixing rate improved fresh and dry forage yields. The legume–grass intercropping improved the protozoal count, the partitioning factor (an index of microbial protein synthesis), and the total VFA concentration.
Surveillance on A/H5N1 virus in domestic poultry and wild birds in Egypt
The endemic H5N1 high pathogenicity avian influenza virus (A/H5N1) in poultry
in Egypt continues to cause heavy losses in poultry and poses a significant
threat to human health. Here we describe results of A/H5N1 surveillance in
domestic poultry in 2009 and wild birds in 2009-2010. Tracheal and cloacal
swabs were collected from domestic poultry from 22024 commercial farms, 1435
backyards and 944 live bird markets (LBMs) as well as from 1297 wild birds
representing 28 different types of migratory birds. Viral RNA was extracted
from a mix of tracheal and cloacal swabs media. Matrix gene of avian influenza
type A virus was detected using specific real-time reverse-transcription
polymerase chain reaction (RT-qPCR) and positive samples were tested by RT-
qPCR for simultaneous detection of the H5 and N1 genes. In this surveillance,
A/H5N1 was detected in 0.1% (n = 23) of the examined commercial poultry farms,
10.5% (n = 151) of backyards and 11.4% (n = 108) of LBMs, but no wild bird
tested positive for A/H5N1. The virus was detected from domestic poultry year-
round with higher incidence in the warmer months of summer and spring
particularly in backyard birds. Outbreaks were recorded mostly in Lower Egypt
where 95.7% (n = 22), 68.9% (n = 104) and 52.8% (n = 57) of positive
commercial farms, backyards and LBMs were detected, respectively. Higher
prevalence (56%, n = 85) was reported in backyards that kept chickens and
waterfowl together in the same vicinity and in LBMs that had waterfowl (76%,
n = 82). Our findings indicated broad circulation of the endemic A/H5N1 among
poultry in 2009 in Egypt. In addition, the epidemiology of A/H5N1 has changed
over time with outbreaks occurring in the warmer months of the year. Backyard
waterfowl may play a role as a reservoir and/or source of A/H5N1 particularly
in LBMs. The virus has been established in poultry in the Nile Delta where
major metropolitan areas, dense human population and poultry stocks are
concentrated. Continuous surveillance, tracing the source of live birds in the
markets and integration of multifaceted strategies and global collaboration
are needed to control the spread of the virus in Egypt.
In vitro evaluation of sodium butyrate on the growth of three Salmonella serovars derived from pigs at a mild acidic pH value
Foodborne zoonotic diseases can be transferred into the food chain at the stage of livestock farming. As an emerging public health challenge, practicable reduction measures for Salmonella in porcine health management are constantly being investigated. This in vitro study aimed to determine the influence of six different sodium butyrate (SB) concentrations (0, 5, 10, 20, 40, and 80 mM) on the growth of three different Salmonella enterica serovars at a constant pH value of 6.0, corresponding to conditions in the pig's hindgut. S. Derby and S. Typhimurium, isolated from a pig farm, and S. Typhimurium DSM 19587, which served as control, were used. A broth microdilution assay was applied to record Salmonella growth in the presence of the different SB concentrations over six incubation periods (0, 1, 2, 4, 6, and 24 h). Results were quantified as log colony-forming units (log10 CFU/mL). After 1 h of incubation, the addition of SB showed no significant differences around the initial Salmonella dose of about 5.7 log10 between concentrations (0–80 mM, 5.26 ± 0.10–5.60 ± 0.07 log10, p > 0.05). After 6 h, Salmonella counts with SB addition were significantly lower than without SB (5–80 mM, p < 0.05): 6.78 ± 0.84–7.90 ± 0.10 log10 for 5 mM versus 7.53 ± 0.04–8.71 ± 0.22 log10 for 0 mM. Moreover, for SB concentrations of 40 and 80 mM, no difference in the range of Salmonella counts over 6 h was obtained (5.23 ± 0.11–5.38 ± 0.05 log10, p > 0.05), and minor Salmonella growth was recorded at the earliest after 24 h of incubation. Growth rates for the varying SB concentrations and incubation times were similar across the three serovars. These results suggest that increasing SB concentrations suppress Salmonella growth, at 5–20 mM over a 6 h incubation period and at 40 and 80 mM over a 24 h incubation period.
When transferring these in vitro findings to the porcine organism, it may be assumed that a Salmonella reduction can be achieved by an increased butyrate content in the chyme of the large intestine.
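The log10 CFU/mL values above come from standard plate-count arithmetic. A minimal sketch of that conversion, using the generic formula rather than the study's specific protocol (colony count, dilution factor, and plated volume are hypothetical):

```python
import math

def log10_cfu_per_ml(colonies, dilution_factor, plated_volume_ml=0.1):
    """Convert a plate colony count from a serial dilution to log10 CFU/mL.

    Standard plate-count arithmetic: CFU/mL = colonies * dilution / volume plated.
    """
    cfu_per_ml = colonies * dilution_factor / plated_volume_ml
    return math.log10(cfu_per_ml)

# e.g. 57 colonies on a 10^-4 dilution plate with 0.1 mL plated:
print(round(log10_cfu_per_ml(57, 10**4), 2))
```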
Improved shelf-life and consumer acceptance of fresh-cut and fried potato strips by an edible coating of garden cress seed mucilage
Coatings that reduce the fat content of fried food are one option for addressing both health concerns and consumer demand. Mucilage of garden cress (Lepidium sativum) seed extract (MSE) was formulated into an edible coating, with or without ascorbic acid (AA), to coat fresh-cut potato strips during cold storage (5 °C and 95% RH for 12 days) and subsequent frying. Physical attributes such as color, weight loss, and texture of potato strips coated with MSE solutions with or without AA showed that the coatings efficiently delayed browning, reduced weight loss, and maintained texture during cold storage. Moreover, MSE with AA provided the most favorable results in terms of reduced oil uptake. In addition, the total microbial count was lower for MSE-coated samples than for the control during cold storage. The MSE coating also performed well on sensory attributes, showing no off-flavors or color changes. As a result, an edible coating of garden cress mucilage could be a promising application for extending shelf-life and reducing the oil uptake of fresh-cut potato strips.
Antimicrobial resistance among migrants in Europe: a systematic review and meta-analysis
BACKGROUND: Rates of antimicrobial resistance (AMR) are rising globally and there is concern that increased migration is contributing to the burden of antibiotic resistance in Europe. However, the effect of migration on the burden of AMR in Europe has not yet been comprehensively examined. Therefore, we did a systematic review and meta-analysis to identify and synthesise data for AMR carriage or infection in migrants to Europe to examine differences in patterns of AMR across migrant groups and in different settings. METHODS: For this systematic review and meta-analysis, we searched MEDLINE, Embase, PubMed, and Scopus with no language restrictions from Jan 1, 2000, to Jan 18, 2017, for primary data from observational studies reporting antibacterial resistance in common bacterial pathogens among migrants to 21 European Union-15 and European Economic Area countries. To be eligible for inclusion, studies had to report data on carriage or infection with laboratory-confirmed antibiotic-resistant organisms in migrant populations. We extracted data from eligible studies and assessed quality using piloted, standardised forms. We did not examine drug resistance in tuberculosis and excluded articles solely reporting on this parameter. We also excluded articles in which migrant status was determined by ethnicity, country of birth of participants' parents, or was not defined, and articles in which data were not disaggregated by migrant status. Outcomes were carriage of or infection with antibiotic-resistant organisms. We used random-effects models to calculate the pooled prevalence of each outcome. The study protocol is registered with PROSPERO, number CRD42016043681. FINDINGS: We identified 2274 articles, of which 23 observational studies reporting on antibiotic resistance in 2319 migrants were included. 
The pooled prevalence of any AMR carriage or AMR infection in migrants was 25·4% (95% CI 19·1–31·8; I²=98%), including meticillin-resistant Staphylococcus aureus (7·8%, 4·8–10·7; I²=92%) and antibiotic-resistant Gram-negative bacteria (27·2%, 17·6–36·8; I²=94%). The pooled prevalence of any AMR carriage or infection was higher in refugees and asylum seekers (33·0%, 18·3–47·6; I²=98%) than in other migrant groups (6·6%, 1·8–11·3; I²=92%). The pooled prevalence of antibiotic-resistant organisms was slightly higher in high-migrant community settings (33·1%, 11·1–55·1; I²=96%) than in migrants in hospitals (24·3%, 16·1–32·6; I²=98%). We did not find evidence of high rates of transmission of AMR from migrant to host populations. INTERPRETATION: Migrants are exposed to conditions favouring the emergence of drug resistance during transit and in host countries in Europe. Increased antibiotic resistance among refugees and asylum seekers and in high-migrant community settings (such as refugee camps and detention facilities) highlights the need for improved living conditions, access to health care, and initiatives to facilitate detection of and appropriate high-quality treatment for antibiotic-resistant infections during transit and in host countries. Protocols for the prevention and control of infection and for antibiotic surveillance need to be integrated in all aspects of health care, which should be accessible for all migrant groups, and should target determinants of AMR before, during, and after migration. FUNDING: UK National Institute for Health Research Imperial Biomedical Research Centre, Imperial College Healthcare Charity, the Wellcome Trust, and UK National Institute for Health Research Health Protection Research Unit in Healthcare-associated Infections and Antimicrobial Resistance at Imperial College London.
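The pooled prevalences and I² values above come from random-effects meta-analysis. A sketch of the classic DerSimonian–Laird approach, here applied to raw proportions with binomial variances for brevity (published analyses often pool logit- or arcsine-transformed proportions; the study counts below are hypothetical):

```python
import math

def dl_pooled_prevalence(events, totals):
    """DerSimonian-Laird random-effects pooling of study prevalences."""
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]     # within-study variances
    w = [1 / vi for vi in v]                                # fixed-effect weights
    p_fe = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - p_fe) ** 2 for wi, pi in zip(w, p))  # Cochran's Q
    df = len(p) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                           # between-study variance
    w_re = [1 / (vi + tau2) for vi in v]                    # random-effects weights
    p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    i2 = 100 * max(0.0, (q - df) / q) if q > 0 else 0.0     # I^2 heterogeneity
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se), i2

# Hypothetical study counts (AMR-positive migrants / migrants tested per study):
pooled, ci, i2 = dl_pooled_prevalence([30, 12, 55, 8], [100, 80, 150, 60])
print(f"pooled prevalence {pooled:.3f} (95% CI {ci[0]:.3f}-{ci[1]:.3f}), I2 = {i2:.0f}%")
```

The between-study variance tau² widens the confidence interval relative to a fixed-effect model, which is why highly heterogeneous pooled estimates (I² near 98%) carry wide intervals.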
Defining criteria for disease activity states in systemic juvenile idiopathic arthritis based on the systemic Juvenile Arthritis Disease Activity Score
Objective
To develop and validate cutoff values in the systemic Juvenile Arthritis Disease Activity Score 10 (sJADAS10) that distinguish the states of inactive disease (ID), minimal disease activity (MiDA), moderate disease activity (MoDA), and high disease activity (HDA) in children with systemic juvenile idiopathic arthritis (sJIA), based on subjective disease state assessment by the treating pediatric rheumatologist.
Methods
The cutoff definition cohort comprised 400 patients enrolled at 30 pediatric rheumatology centers in 11 countries. Using the subjective physician rating as an external criterion, six methods were applied to identify the cutoffs: mapping, calculation of percentiles of the cumulative score distribution, Youden index, 90% specificity, maximum agreement, and ROC curve analysis. Sixty percent of the patients were assigned to the definition cohort and 40% to the validation cohort. Cutoff validation was conducted by assessing discriminative ability.
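One of the cutoff-selection methods named above, the Youden index, picks the threshold maximising sensitivity + specificity − 1 against the external criterion. A minimal sketch with hypothetical scores and physician ratings (not study data):

```python
def youden_cutoff(scores, labels):
    """Return the cutoff maximising Youden's J = sensitivity + specificity - 1.

    labels: 1 = active disease per the external criterion, 0 = inactive.
    Scores above the returned cutoff are classified as active.
    """
    best_j, best_cut = -1.0, None
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s > cut and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s <= cut and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s <= cut and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s > cut and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Hypothetical sJADAS10-like scores and binary physician ratings:
scores = [0.5, 1.2, 2.0, 2.8, 3.5, 6.0, 9.0, 12.0]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
print(youden_cutoff(scores, labels))  # best separating cutoff and its J
```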
Results
The sJADAS10 cutoffs that separated ID from MiDA, MiDA from MoDA, and MoDA from HDA were ≤2.9, ≤10, and >20.6, respectively. The cutoffs discriminated strongly among different levels of pain, between patients with or without morning stiffness, and between patients whose parents judged their disease status as remission or as persistent activity/flare, or were satisfied or not satisfied with the current illness outcome.
Conclusion
The sJADAS cutoffs revealed good metrologic properties in both the definition and validation cohorts, and are therefore suitable for use in clinical trials and routine practice.
Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study
Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world.
Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231.
Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001).
Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication.
Association of respiratory symptoms and lung function with occupation in the multinational Burden of Obstructive Lung Disease (BOLD) study
Background
Chronic obstructive pulmonary disease has been associated with exposures in the workplace. We aimed to assess the association of respiratory symptoms and lung function with occupation in the Burden of Obstructive Lung Disease study.
Methods
We analysed cross-sectional data from 28 823 adults (≥40 years) in 34 countries. We considered 11 occupations and grouped them by likelihood of exposure to organic dusts, inorganic dusts and fumes. The association of chronic cough, chronic phlegm, wheeze, dyspnoea, forced vital capacity (FVC) and forced expiratory volume in 1 s (FEV1)/FVC with occupation was assessed, per study site, using multivariable regression. These estimates were then meta-analysed. Sensitivity analyses explored differences between sexes and gross national income.
Results
Overall, working in settings with potentially high exposure to dusts or fumes was associated with respiratory symptoms but not lung function differences. The most common occupation was farming. Compared to people not working in any of the 11 considered occupations, those who were farmers for ≥20 years were more likely to have chronic cough (OR 1.52, 95% CI 1.19–1.94), wheeze (OR 1.37, 95% CI 1.16–1.63) and dyspnoea (OR 1.83, 95% CI 1.53–2.20), but not lower FVC (β=0.02 L, 95% CI −0.02–0.06 L) or lower FEV1/FVC (β=0.04%, 95% CI −0.49–0.58%). Some findings differed by sex and gross national income.
Conclusion
At a population level, the occupational exposures considered in this study do not appear to be major determinants of differences in lung function, although they are associated with more respiratory symptoms. Because not all work settings were included in this study, respiratory surveillance should still be encouraged among workers in high-risk dusty and fume-exposed jobs, especially in low- and middle-income countries.