Randomized Clinical Trial to Evaluate the Pathogenicity of Bibersteinia trehalosi in Respiratory Disease among Calves
Bibersteinia trehalosi causes respiratory disease in ruminants, particularly in wild and domestic sheep. Recently, an increased number of B. trehalosi isolates have been obtained from diagnostic samples from bovine respiratory disease cases. This study evaluated the role of B. trehalosi in bovine respiratory disease using an intra-tracheal inoculation model in calves. Thirty-six crossbred 2- to 3-month-old dairy calves were inoculated intra-tracheally with either a leukotoxin-negative B. trehalosi isolate, a leukotoxin-positive B. trehalosi isolate, Mannheimia haemolytica, a combination of leukotoxin-negative B. trehalosi and M. haemolytica, or a negative control. Calves were euthanized and necropsied on day 10 of the study. B. trehalosi–inoculated calves did not have increased lung involvement compared with control calves. Additionally, B. trehalosi was cultured only once from the lungs of inoculated calves at necropsy. Based on these findings, B. trehalosi may not be a primary pathogen of respiratory disease in cattle, and culture of B. trehalosi from diagnostic submissions should not be immediately interpreted as identifying a primary cause of respiratory disease.
Isolation and Characterization of Methicillin-Resistant Staphylococcus aureus from Pork Farms and Visiting Veterinary Students
In the last decade, livestock-associated methicillin-resistant S. aureus (LA-MRSA) has become a public health concern in many parts of the world. Sequence type 398 (ST398) has been the most commonly reported type of LA-MRSA. While many studies have focused on the long-term exposure experienced by swine workers, this study focuses on the short-term exposures experienced by veterinary students conducting diagnostic investigations. The objectives were to assess the rate of MRSA acquisition and the longevity of carriage in students exposed to pork farms, and to characterize the recovered MRSA isolates. Student nasal swabs were collected immediately before and after farm visits. Pig nasal swabs and environmental sponge samples were also collected. MRSA isolates were identified biochemically and molecularly, including spa typing and antimicrobial susceptibility testing. Thirty (30) veterinary students were enrolled and 40 pork farms were visited. MRSA was detected on 30% of the pork farms and in 22% of the students following exposure to a MRSA-positive pork farm. All students found to be MRSA-positive immediately after a farm visit were negative for MRSA within 24 hours post visit. The most common spa types recovered were t002 (79%), t034 (16%), and t548 (4%). Spa types found on pork farms closely matched those recovered from students, with few exceptions. Resistance levels to antimicrobials varied, but resistance was most commonly seen to spectinomycin, tetracyclines, and neomycin. Non-ST398 MRSA isolates were more likely to be resistant to florfenicol and neomycin, and more likely to be multidrug resistant, than ST398 MRSA isolates. These findings indicate that MRSA can be recovered from persons visiting contaminated farms. However, the duration of carriage was very brief and most likely represents contamination of the nasal passages rather than biological colonization.
The most common spa types found in this study were associated with ST5, which expands the known range of livestock-associated MRSA types.
Detection of Salmonella Enteritidis in Pooled Poultry Environmental Samples Using a Serotype-Specific Real-Time–Polymerase Chain Reaction Assay
While real-time polymerase chain reaction (RT PCR) has been used as a rapid test for detection of Salmonella Enteritidis in recent years, little research has been done to assess the feasibility of pooling poultry environmental samples with a Salmonella Enteritidis–specific RT PCR assay. Therefore, the objective of this study was to compare RT PCR Salmonella Enteritidis detection in individual and pooled (in groups of two, three, and four) poultry environmental drag swab samples to traditional culture methods. The drag swabs were collected from poultry facilities previously confirmed positive for Salmonella Enteritidis and were cultured according to National Poultry Improvement Plan guidelines. Initially, Salmonella Enteritidis–specific RT PCR assay threshold cycle cutoff values of ≤36, ≤30, and ≤28 were evaluated in comparison to culture. The average limit of detection of the RT PCR assay was 2.4 × 10³ colony-forming units (CFU)/ml, which corresponded to an average threshold cycle value of 36.6. Before enrichment, samples inoculated with concentrations from 10² to 10⁵ CFU/ml were detected by RT PCR, while after enrichment, samples inoculated from 10⁰ to 10⁵ CFU/ml were detected by RT PCR. These threshold cycle cutoff values were used in the subsequent field trial, in which Salmonella Enteritidis was cultured from 7 of 208 environmental samples (3.4%). Individual samples were 99.0%, 100%, and 100% in agreement with the RT PCR at threshold cycle (Ct) cutoff values of ≤36, ≤30, and ≤28, respectively. The agreement for pooled samples followed the same trend, with the highest agreement at Ct ≤ 28 (pool of 2 = 100.0%, pool of 3 = 100.0%, pool of 4 = 100.0%), midrange agreement at Ct ≤ 30 (pool of 2 = 99.0%, pool of 3 = 100.0%, pool of 4 = 100.0%), and lowest agreement at Ct ≤ 36 (pool of 2 = 98.1%, pool of 3 = 97.1%, pool of 4 = 98.1%).
In conclusion, regardless of the level of pooling after tetrathionate enrichment, sensitivity was very good, and results would be comparable to what would have been found with individual culture or individual RT PCR at Ct ≤ 36
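The Ct-cutoff comparisons above reduce to per-sample agreement between a culture result and a PCR call. A minimal sketch of that computation, using hypothetical sample data rather than the study's, illustrates how tightening the cutoff changes the agreement figure:

```python
# Sketch: percent agreement between culture and RT PCR calls at a given
# Ct cutoff. All sample data here are hypothetical, for illustration only.

def pcr_positive(ct, cutoff):
    """A sample is PCR-positive when its threshold cycle is at or below the cutoff."""
    return ct is not None and ct <= cutoff

def percent_agreement(samples, cutoff):
    """Percentage of samples whose RT PCR call matches the culture result."""
    matches = sum(
        1 for culture_pos, ct in samples
        if pcr_positive(ct, cutoff) == culture_pos
    )
    return 100.0 * matches / len(samples)

# Hypothetical mini data set: (culture_positive, Ct value or None if no signal)
samples = [
    (True, 27.1),   # concordant positive at every cutoff shown below
    (True, 33.0),   # positive only at the looser Ct <= 36 cutoff
    (False, None),  # concordant negative
    (False, 35.2),  # discordant at Ct <= 36, concordant at stricter cutoffs
]

print(percent_agreement(samples, 36))  # 75.0: the Ct 35.2 sample disagrees
print(percent_agreement(samples, 28))  # 75.0: the Ct 33.0 sample disagrees
```

A looser cutoff trades specificity for sensitivity, which is why the field-trial agreement was lowest at Ct ≤ 36 and highest at Ct ≤ 28.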
Data evidencing slow anaerobic digestion in emergency treatment and disposal of infectious animal carcasses
Burial of infectious and potentially infectious livestock and poultry carcasses is the most common response to an animal-health emergency. This data set summarizes a 22-week-long experiment simulating the environment found within the conventional burial trenches used worldwide for emergency disposal of animal carcasses, in some cases with a topical application of quicklime, as is required in the Republic of Korea. The data set provides rarely presented evidence of the extremely slow decay of buried animal carcasses. Besides the visual evidence of no breakdown of carcass material, i.e., carcasses (or carcass quarters and coarse cuts) still resembled the initial material at the end of the study, we present data characterizing the process. Specifically, temporal variations in digestate quality (pH, ammonia, volatile fatty acids), biogas production, and the persistence of odorous volatile organic compounds are summarized. The data provide important evidence of the undesirably slow progression of the digestion process. This failure to reach practical endpoints with anaerobic digestion provides the impetus for seeking alternative, improved disposal methods that are feasible in an emergency context, such as the aerated burial concept (Koziel et al., 2018 [1]).
Method for sampling and analysis of volatile biomarkers in process gas from aerobic digestion of poultry carcasses using time-weighted average SPME and GC–MS
A passive sampling method, using retracted solid-phase microextraction (SPME) with gas chromatography–mass spectrometry and time-weighted averaging, was developed and validated for tracking marker volatile organic compounds (VOCs) emitted during aerobic digestion of biohazardous animal tissue. The retracted SPME configuration protects the fragile fiber from buffeting by the process gas stream, requires less equipment, and is potentially more biosecure than conventional active sampling methods. VOC concentrations predicted via a model based on Fick's first law of diffusion were within 6.6–12.3% of experimentally controlled values after accounting for VOC adsorption to the SPME fiber housing. Method detection limits for five marker VOCs ranged from 0.70 to 8.44 ppbv and were statistically equivalent (p > 0.05) to those for active sorbent-tube-based sampling. A sampling time of 30 min and a fiber retraction depth of 5 mm were found to be optimal for the tissue digestion process.
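The Fick's-first-law model behind time-weighted-average retracted-SPME sampling relates the mass loaded onto the fiber to the average gas concentration over the sampling interval. A sketch of that relationship follows; the numeric inputs are illustrative assumptions, not values from the study:

```python
# Sketch of the standard TWA retracted-SPME diffusion model:
#     C = m * Z / (D * A * t)
# where m is the analyte mass adsorbed on the fiber, Z the fiber retraction
# depth (diffusion path length), D the analyte's gas-phase diffusion
# coefficient, A the needle-opening cross-section, and t the sampling time.
# All numeric values below are illustrative assumptions.

def twa_concentration(mass_ng, z_cm, d_cm2_s, area_cm2, t_s):
    """Time-weighted average gas concentration (ng/cm^3) from Fick's first law."""
    return mass_ng * z_cm / (d_cm2_s * area_cm2 * t_s)

# 5 mm retraction and 30 min sampling are the optima reported above;
# the remaining parameters are assumed for illustration.
c = twa_concentration(
    mass_ng=2.0,       # mass adsorbed on the fiber (assumed)
    z_cm=0.5,          # 5 mm retraction depth
    d_cm2_s=0.08,      # gas-phase diffusion coefficient (assumed, typical VOC)
    area_cm2=4.3e-4,   # needle-opening cross-section (assumed)
    t_s=1800.0,        # 30 min sampling time
)
print(f"{c:.1f} ng/cm^3")
```

The inverse dependence on retraction depth Z is why the depth must be fixed and validated: doubling Z halves the flux into the needle and doubles the concentration inferred from a given adsorbed mass.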
Antimicrobial Resistance Distribution Differs Among Methicillin Resistant Staphylococcus aureus Sequence Type (ST) 5 Isolates From Health Care and Agricultural Sources
Antimicrobial resistance (AMR) is an expanding public health concern and methicillin resistant Staphylococcus aureus (MRSA) is a notable example. Since the discovery of livestock associated MRSA (LA-MRSA), public health concerns have arisen surrounding the potential of LA-MRSA isolates to serve as a reservoir for AMR determinants. In this study, we compare swine associated LA-MRSA ST5 and human clinical MRSA ST5 isolates for phenotypic antimicrobial susceptibilities determined via broth microdilution and genotypic determinants of AMR using whole genome sequencing and comparative genomic analysis to identify AMR elements. Swine associated LA-MRSA ST5 isolates exhibited phenotypic resistance to fewer antibiotics than clinical MRSA ST5 isolates from humans with no swine contact. Distinct genomic AMR elements were harbored by each subgroup, with little overlap in shared AMR genes between swine associated LA-MRSA ST5 and clinical MRSA ST5 isolates. Our results demonstrate that phenotypic antimicrobial susceptibilities and genotypic determinants of AMR among swine associated LA-MRSA ST5 and clinical MRSA ST5 isolates are separate and distinct
Investigation of the Impact of Increased Dietary Insoluble Fiber through the Feeding of Distillers Dried Grains with Solubles (DDGS) on the Incidence and Severity of Brachyspira-Associated Colitis in Pigs
Diet has been implicated as a major factor affecting clinical expression of swine dysentery and Brachyspira hyodysenteriae colonization. However, the impact of diet on novel, strongly beta-hemolytic pathogenic Brachyspira spp., including "B. hampsonii," has yet to be investigated. In recent years, distillers dried grains with solubles (DDGS), a source of insoluble dietary fiber, have been increasingly included in swine diets. A randomized complete block experiment was used to examine the effect of increased dietary fiber, through the feeding of DDGS, on the incidence of Brachyspira-associated colitis in pigs. One hundred 4-week-old pigs were divided into five groups based upon inoculum (negative control, Brachyspira intermedia, Brachyspira pilosicoli, B. hyodysenteriae, or "B. hampsonii") and fed one of two diets containing either no DDGS (diet 1) or 30% DDGS (diet 2). The average days to first positive culture and days post inoculation to onset of clinical dysentery in the B. hyodysenteriae groups were significantly shorter for diet 2 than for diet 1 (P = 0.04 and P = 0.0009, respectively). A similar difference in the average days to first positive culture and days post inoculation to onset of clinical dysentery was found when comparing the "B. hampsonii" groups. In this study, pigs receiving 30% DDGS began shedding on average one day earlier and developed swine dysentery nearly twice as fast as pigs receiving 0% DDGS. Accordingly, these data suggest that reducing insoluble fiber, by reducing or eliminating DDGS in swine rations, should be considered an integral part of any effective disease elimination strategy for swine dysentery.
Lab-scale evaluation of aerated burial concept for treatment and emergency disposal of infectious animal carcasses
Nearly 55,000 outbreaks of animal disease were reported to the World Animal Health Information Database between 2005 and 2016. To suppress the spread of disease, large numbers of animal mortalities often must be disposed of quickly and are frequently buried on the farm where they were raised. While this method of emergency disposal is fast and relatively inexpensive, it can also have undesirable and lasting impacts (slow decay, concerns about groundwater contamination, pathogen re-emergence, and odor). Following the 2010 foot-and-mouth disease outbreak, the Republic of Korea's National Institute of Animal Science funded research on selected burial alternatives or modifications believed to have potential to reduce the undesirable impacts of burial. One such modification involves injecting air into the liquid degradation products (derived largely from the 60–70% water content of decomposing carcasses) in lined burial trenches. Prior to prototype development in the field, a laboratory-scale study of aerated decomposition (AeD) of poultry carcasses was conducted to quantify improvements in the time of carcass decomposition, reduction of potential groundwater pollutants in the liquid products of decomposition (since trench liners may ultimately leak), and reduction of odorous VOCs emitted during decomposition. Headspace gases were also monitored to determine the potential for using gaseous biomarkers in the aerated burial trench exhaust stream to monitor completion of the decomposition. Results of the lab-scale experiments show that the mass of chicken carcasses was reduced by 95.0 ± 0.9% within 3 months at mesophilic temperatures (vs. negligible reduction via the mesophilic anaerobic digestion typical of trench burial), with concomitant reduction of biochemical oxygen demand (BOD; 99%), volatile suspended solids (VSS; 99%), total suspended solids (TSS; 99%), and total ammonia nitrogen (TAN; 98%) in the liquid digestate. By week 7, BOD and TSS in the digestate met the U.S.
EPA standards for treated wastewater discharge to surface water. Salmonella and Staphylococcus were inactivated by the AeD process after weeks 1 and 3, respectively. Five gaseous biomarkers (pyrimidine, p-cresol, phenol, dimethyl disulfide, and dimethyl trisulfide) were identified and correlated with digestate quality. Phenol was the best predictor of TAN (R = 0.96), BOD (R = 0.92), and dissolved oxygen (DO) (R = −0.91). Phenol was also the best predictor of Salmonella (R = 0.95) and aerobe (R = 0.88) populations. P-cresol was the best predictor for anaerobes (R = 0.88). The off-gas from AeD will require biofiltration or other odor control measures for a much shorter time than that from anaerobic decomposition. The lab-scale studies indicate that AeD has the potential to make burial a faster, safer, and more environmentally friendly method for emergency disposal and treatment of infectious animal carcasses, and that this method should be further developed via prototype-scale field studies.
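The biomarker R values above are Pearson correlation coefficients between a gas-phase marker and a digestate quality measure. A minimal computation of Pearson R, using synthetic paired measurements rather than the study's data, shows the underlying arithmetic:

```python
# Sketch: Pearson correlation coefficient for paired measurements.
# The weekly phenol and TAN values below are synthetic, for illustration only.
import math

def pearson_r(xs, ys):
    """Pearson R: covariance of x and y divided by the product of their spreads."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic weekly series: phenol headspace level vs. digestate TAN,
# both declining as digestion progresses (hypothetical numbers).
phenol = [9.0, 7.5, 6.0, 4.0, 2.5, 1.0]
tan    = [950, 820, 700, 480, 300, 120]
print(round(pearson_r(phenol, tan), 3))  # close to 1: the series track each other
```

An R near ±1, as reported for phenol vs. TAN, is what makes a headspace marker usable as a proxy for digestate state without opening the trench.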
Multidrug-resistant Strains of Salmonella enterica Typhimurium, United States, 1997–1998
To evaluate multidrug-resistant strains of Salmonella enterica Typhimurium, including definitive type 104 (DT104), in the United States, we reviewed data from the National Antimicrobial Resistance Monitoring System (NARMS). In 1997–1998, 25% (703) of 2,767 serotyped Salmonella isolates received at NARMS were S. Typhimurium; antimicrobial susceptibility testing and phage typing were completed for 697. Fifty-eight percent (402) were resistant to ≥1 antimicrobial agent. Three multidrug-resistant (≥5 drugs) strains accounted for 74% (296) of all resistant isolates. Ceftriaxone resistance was present in 3% (8), and nalidixic acid resistance in 1% (4), of these multidrug-resistant strains. By phage typing, 37% (259) of S. Typhimurium isolates were DT104, 30% (209) were of undefined type, and 15% (103) were untypable. Fifty percent (202) of resistant (≥1 drug) isolates were DT104. Multidrug-resistant S. Typhimurium isolates, particularly DT104, account for a substantial proportion of S. Typhimurium isolates; ceftriaxone resistance is exhibited by some of these strains.
Multilaboratory Survey To Evaluate Salmonella Prevalence in Diarrheic and Nondiarrheic Dogs and Cats in the United States between 2012 and 2014
Eleven laboratories collaborated to determine the periodic prevalence of Salmonella in a population of dogs and cats in the United States visiting veterinary clinics. Fecal samples (2,965) solicited from 11 geographically dispersed veterinary testing laboratories were collected in 36 states between January 2012 and April 2014 and tested using a harmonized method. The overall study prevalence of Salmonella in cats (3 of 542) was <1%. The prevalence in dogs (60 of 2,422) was 2.5%. Diarrhea was present in only 55% of the positive dogs; however, 3.8% of all diarrheic dogs were positive, compared with 1.8% of nondiarrheic dogs. Salmonella-positive dogs were significantly more likely to have consumed raw food (P = 0.01), to have consumed probiotics (P = 0.002), or to have been given antibiotics (P = 0.01). Rural dogs were also more likely to be Salmonella positive than urban (P = 0.002) or suburban (P = 0.001) dogs. Among the 67 isolates, 27 unique serovars were identified, with three dogs having two serovars present. Antimicrobial susceptibility testing of 66 isolates revealed that only four were resistant to one or more antibiotics. Additional characterization of the 66 isolates was done using pulsed-field gel electrophoresis and whole-genome sequencing (WGS). Sequence data compared well with the phenotypic resistance data and were submitted to the National Center for Biotechnology Information (NCBI). This study suggests an overall decline in the prevalence of Salmonella-positive dogs and cats over recent decades and identifies consumption of raw food as a major risk factor for Salmonella infection. Of note, almost half of the Salmonella-positive animals were clinically nondiarrheic.
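The prevalence figures above are simple proportions (e.g., 60 of 2,422 dogs, or 2.5%). A sketch of the same calculation with a Wilson 95% confidence interval attached shows how much uncertainty such a proportion carries; the interval method is an assumption here, as the abstract does not state one:

```python
# Sketch: binomial proportion with a Wilson score 95% confidence interval.
# The Wilson method is assumed for illustration; the study does not specify
# how (or whether) interval estimates were computed.
import math

def wilson_ci(positives, n, z=1.96):
    """Wilson score interval for a binomial proportion at confidence level z."""
    p = positives / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# Dogs: 60 positives out of 2,422 sampled (the 2.5% reported above)
lo, hi = wilson_ci(60, 2422)
print(f"{60 / 2422:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

The Wilson interval behaves well for the small proportions seen here (unlike the simple normal approximation, which can dip below zero), which is why it is a common default for prevalence estimates.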