Assessing the effect of a single dose florfenicol treatment in feedlot cattle on the antimicrobial resistance patterns in faecal Escherichia coli
The objective of this clinical trial was to examine the effect of a single dose of florfenicol on antimicrobial resistance patterns in faecal E. coli of feedlot steers. Steers (n = 370) were purchased from two sources and housed in outdoor concrete-floored pens. Two cattle from each pen (n = 42 pens, 84 cattle) were randomly selected for faecal sampling on study days 1, 14, 28, and 42. One sampled animal from each of 21 pens was randomly selected to receive a single 39.6 mg/kg dose of florfenicol subcutaneously on study day 11. Ten lactose-positive colonies were isolated from faecal swabs and tested for resistance to 11 antimicrobials using the disk diffusion method. Zones of inhibition were grouped using cluster analysis, and clusters were ordered by increasing multiple resistance. A cumulative logistic regression model using generalized estimating equations was used to assess factors associated with increasing levels of multiple resistance. Immediately post-treatment, all isolates obtained from treated cattle belonged to multiple-resistant clusters containing chloramphenicol resistance. Though less pronounced at later samplings, resistance to chloramphenicol and other antimicrobials persisted. Antimicrobial treatment, sampling time, and animal source, as well as interactions between these variables, were important predictors of the odds of E. coli belonging to a more resistant cluster. A very clear but transitory shift towards increasingly multiple-resistant faecal E. coli in response to florfenicol treatment was observed. There was no indication of horizontal transfer of resistant E. coli between steers. The level of resistance was influenced by a complex interaction of animal source and previous management.
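As a rough illustration of the analytic approach described in this abstract, the sketch below (Python, using scipy and statsmodels) clusters simulated zone-of-inhibition profiles and then fits a cumulative (proportional-odds) logistic model with generalized estimating equations. All data, variable names, the choice of four clusters, and the covariate coding are hypothetical; the study's actual variables, cluster ordering, and interaction terms are not reproduced here.

    # Hypothetical sketch only; simulated data, not the study data.
    import numpy as np
    import pandas as pd
    from scipy.cluster.hierarchy import linkage, fcluster
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 200
    # Simulated zone-of-inhibition diameters (mm) for 11 antimicrobials.
    zones = rng.normal(20, 5, size=(n, 11))

    # Step 1: group isolates by their inhibition-zone profiles (Ward clustering);
    # in the study, clusters were then re-ordered by increasing multiple resistance.
    cluster = fcluster(linkage(zones, method="ward"), t=4, criterion="maxclust")

    df = pd.DataFrame({
        "cluster": cluster,                          # ordinal outcome (1 = least resistant)
        "treated": rng.integers(0, 2, n),            # florfenicol treatment indicator
        "day": rng.choice([1, 14, 28, 42], n),       # sampling day
        "source": rng.integers(0, 2, n),             # animal source
        "steer": rng.integers(0, 40, n),             # repeated-measures unit
    })
    # Manual dummy coding for sampling day (day 1 as reference).
    for d in (14, 28, 42):
        df[f"day{d}"] = (df["day"] == d).astype(int)

    # Step 2: cumulative logistic regression fitted with GEE, clustering on steer.
    # No explicit intercept in the formula: OrdinalGEE adds one threshold
    # intercept per cut-point of the ordinal outcome.
    model = smf.ordinal_gee(
        "cluster ~ 0 + treated + day14 + day28 + day42 + source",
        groups="steer", data=df,
        cov_struct=sm.cov_struct.GlobalOddsRatio("ordinal"))
    print(model.fit().summary())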
Nutritional strategies to combat Salmonella in mono-gastric food animal production
Nutritional strategies to minimize Salmonella in food animal production are one of the key components in producing safer food. The current European approach is a farm-to-fork strategy, in which each sector must implement measures to minimize and reduce Salmonella contamination. In the pre-harvest phase, this means that all available tools need to be used: implementation of biosecurity measures; control of Salmonella infections in animals on the farm as well as in transport and trade; optimal housing and management, including cleaning and disinfection procedures; and efforts to achieve Salmonella-free feed production. This paper describes some nutritional strategies that could be used in farm control programmes for the major mono-gastric food production animals: poultry and pigs. Initially, it is important to prevent the introduction of Salmonella onto the farm through contaminated feed; this risk is reduced through heat treatment and the use of organic acids and their salts, and formaldehyde. Microbiological sampling and monitoring for Salmonella in feed mills are required to minimize the introduction of Salmonella onto the farm via feed. In addition, feed withdrawal may create a stressful situation in animals, resulting in an increase in Salmonella shedding. Physical feed characteristics also matter: feeding coarse-ground meal to pigs can delay gastric emptying, thereby increasing the acidity of the gut and thus reducing the possible prevalence of Salmonella. Coarse-ground grains and access to litter have also been shown to decrease Salmonella shedding in poultry. The feed can also modify the gastro-intestinal microflora and influence the immune system, which can minimize Salmonella colonization and shedding. Feed additives such as organic acids, short- and medium-chain fatty acids, probiotics (including competitive exclusion cultures), prebiotics and certain specific carbohydrates such as mannan-based compounds, egg proteins, essential oils and bacteriophages have the potential to reduce Salmonella levels when added to the feed. These nutritional strategies could be evaluated and used in farm control programmes.
Methods and microbial risks associated with composting of animal carcasses in the United States
Composting is an alternative method of carcass disposal in situations where conventional methods are inadequate. With proper maintenance and monitoring, carcass composting systems can be safe and efficient, with minimal environmental impacts. Importantly, proper composting eliminates many pathogens and may reduce levels of carcass contamination with spore-forming bacteria, prions, and other pathogens.
Review on bovine respiratory syncytial virus and bovine parainfluenza: usual suspects in bovine respiratory disease: a narrative review
Bovine respiratory syncytial virus (BRSV) and bovine parainfluenza 3 virus (BPIV3) are closely related viruses and are both important pathogens within bovine respiratory disease (BRD), a major cause of morbidity and economic losses in cattle populations around the world. The two viruses share characteristics such as morphology and replication strategy with each other and with their counterparts in humans, HRSV and HPIV3. Therefore, BRSV and BPIV3 infections in cattle are considered useful animal models for HRSV and HPIV3 infections in humans. The interaction between the viruses and the different branches of the host's immune system is rather complex. Neutralizing antibodies seem to be a correlate of protection against severe disease, and cell-mediated immunity is thought to be essential for virus clearance following acute infection. On the other hand, the host's immune response considerably contributes to tissue damage in the upper respiratory tract. BRSV and BPIV3 also have similar pathobiological and epidemiological features. Therefore, combination vaccines against both viruses are very common, and a variety of traditional live attenuated and inactivated BRSV and BPIV3 vaccines are commercially available.
A field study to determine the prevalence, dairy herd management systems, and fresh cow clinical conditions associated with ketosis in western European dairy herds
The aim of this study was to determine the prevalence, major management systems, and fresh cow clinical conditions associated with ketosis in western European dairy herds. A total of 131 dairies were enrolled in Germany, France, Italy, the Netherlands, and the United Kingdom during 2011 to 2012. A milk-based test for ketones (Keto-Test; Sanwa Kagaku Kenkyusho Co. Ltd., Nagoya, Japan; distributed by Elanco Animal Health, Antwerp, Belgium) was used to screen cows between d 7 and 21 after calving, and ketosis was defined as a Keto-Test result of ≥100 µmol/L. Study cows were observed for clinical disease up to 35 d postcalving. Multivariate analysis (generalized estimating equation logistic regression) was performed to determine country, farm, management, feed, and cow factors associated with ketosis and to determine associations between ketosis and fresh cow diseases. Thirty-nine percent of the cows were classified as having ketosis. The herd average of ketosis was 43% in Germany, 53% in France, 31% in Italy, 46% in the Netherlands, and 31% in the United Kingdom. Of the 131 farms, 112 (85%) had 25% or more of their fresh cows testing positive for ketosis. Clinical ketosis was not reported on most farms, and the highest level of clinical ketosis reported was 23%. The risks of ketosis were significantly lower in Italy and the United Kingdom than in France, the Netherlands, and Germany. Larger herd size was associated with a decreased risk of ketosis. Farms that fed partially mixed rations had 1.5 times higher odds of ketosis than those that fed total mixed rations. Cows that calved in April to June had the highest odds of ketosis, about twice those of cows that calved in July to September. Cows that calved in January to March tended to have 1.5 times higher risk of ketosis than cows that calved in July to September. The odds of ketosis in parity 2 and parity 3 to 7 were significantly higher (1.5 and 2.8 times, respectively) than in parity 1, and significantly lower in parity 2 than in parity 3 to 7. Ketosis was associated with significantly higher odds of all common fresh cow conditions: metritis, mastitis, displaced abomasum, clinical ketosis, lameness, and gastrointestinal disorders. The odds of ketosis were not increased in cows that had twins or dystocia, whereas higher odds of ketosis were observed in cows with milk fever or retained placenta.
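A minimal sketch of the kind of model reported here (logistic regression fitted with generalized estimating equations, with cows clustered within herd) is shown below in Python using statsmodels. The data frame, variable names, category codings, and synthetic data are invented for illustration and do not come from the study.

    # Hypothetical sketch only; simulated data, not the study data.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "ketosis": rng.integers(0, 2, n),            # 1 if Keto-Test >= 100 umol/L
        "parity": rng.choice(["1", "2", "3-7"], n),
        "season": rng.choice(["Jan-Mar", "Apr-Jun", "Jul-Sep", "Oct-Dec"], n),
        "pmr": rng.integers(0, 2, n),                # 1 if partially mixed ration fed
        "herd": rng.integers(0, 50, n),              # clustering unit for the GEE
    })

    # Binomial GEE with exchangeable within-herd correlation.
    model = smf.gee("ketosis ~ C(parity) + C(season) + pmr",
                    groups="herd", data=df,
                    family=sm.families.Binomial(),
                    cov_struct=sm.cov_struct.Exchangeable())
    result = model.fit()
    print(np.exp(result.params))   # coefficients exponentiated to odds ratios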
Comparison of Salmonella enterica serovar distribution and antibiotic resistance patterns in wastewater at municipal water treatment plants in two California cities
Aim: To determine Salmonella enterica serovars and antibiotic resistance (ABR) in the human waste stream.
Methods and Results: Influent wastewater at municipal treatment plants in two California cities was sampled by collecting 24-h composite samples from the treatment plants on five to six occasions. Serial water quantities were filtered and cultured with a Salmonella-selective method and an oxytetracycline-supplemented Salmonella-selective method. Antibiotic susceptibilities to 12 antibiotics were determined, and the isolates were grouped based on ABR patterns. Of 983 S. enterica isolates, 102 represented unique sampling-serovar-ABR combinations. Thirty-five different serovars were identified, distributed over 17 different ABR patterns. The serovar distribution differed between the sampling sites, whereas there was no significant trend in levels of multiple ABR.
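The grouping step described above (collapsing susceptibility results into an ABR pattern and counting unique sampling-site/serovar/ABR-pattern combinations) can be sketched in a few lines of pandas. The column names, antibiotic abbreviations, and resistance calls below are hypothetical.

    # Hypothetical sketch of grouping isolates by site, serovar and ABR pattern.
    import pandas as pd

    isolates = pd.DataFrame({
        "site":    ["plant_A", "plant_A", "plant_B"],
        "serovar": ["Typhimurium", "Typhimurium", "Newport"],
        # resistance calls (R/S) for each antibiotic tested (12 in the study)
        "AMP": ["R", "R", "S"],
        "TET": ["R", "R", "R"],
        # remaining antibiotic columns omitted for brevity
    })

    abx_cols = [c for c in isolates.columns if c not in ("site", "serovar")]
    # Encode each isolate's susceptibility results as a single pattern string.
    isolates["abr_pattern"] = isolates[abx_cols].apply(
        lambda row: "".join("R" if v == "R" else "s" for v in row), axis=1)

    # Keep one isolate per unique sampling-serovar-ABR combination.
    unique_patterns = isolates.drop_duplicates(subset=["site", "serovar", "abr_pattern"])
    print(len(unique_patterns), "unique sampling-serovar-ABR combinations")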
Conclusions: Salmonella enterica was recovered with ease from small sample volumes of wastewater received by municipal water treatment plants. A large variety of serovars and ABR profiles were represented in the recovered Salmonella.
Significance and Impact of the Study: The ease of sampling and of recovering Salmonella from wastewater received by municipal treatment plants makes this a valuable sampling approach for monitoring the presence of Salmonella in the human population.
Evaluation of the effects of oral colostrum supplementation during the first fourteen days on the health and performance of preweaned calves
Increasing concerns about antimicrobial resistance have led to the development and implementation of alternatives to antimicrobial use in animal production. The objective of this clinical trial was to determine the effect of colostrum supplementation of the milk replacer ration on morbidity, mortality, feed intake, and weight gain of preweaned calves. Ninety 1-d-old calves on each of 3 commercial calf ranches were randomly allocated to 1 of 3 groups. Treatment-group calves received 10 g of supplemental immunoglobulin G (IgG) in the form of 70 g of colostrum powder in the milk replacer twice daily for 14 d. Placebo-group calves received a nutritionally equivalent supplement lacking IgG in the milk replacer twice daily for 14 d. Control calves received milk replacer without supplements twice daily. Calves were housed in individual hutches and were weighed on d 1, 28, and 60. Serum was collected on d 2 for serum IgG determination. Daily health evaluations for the first 28 d of life were performed by study personnel blinded to treatment group assignment. Observed illness was treated based on health assessment, rectal temperature, and specific calf ranch protocols. Feed consumption (milk and grain) was recorded. Calves receiving supplemental colostrum had less diarrhea and received fewer antimicrobial treatments than control and placebo calves. The results indicated that calf diarrhea was associated with low serum IgG levels and low body weight. Grain consumption and weight gain over the first 28 d of life were significantly greater in colostrum-supplemented calves than in control calves. No differences in mortality or respiratory disease incidence among groups were detected. Supplemental colostrum during the first 2 wk of life can reduce diarrheal disease in preweaned calves on calf ranches and thereby reduce the number of antimicrobial treatments needed.
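A simple way to generate the kind of allocation described above (random assignment to three groups within each ranch) is sketched below in Python. The ranch names, the assumption of exactly 30 calves per group (the abstract does not state the group sizes), and the random seed are illustrative only.

    # Hypothetical sketch of within-ranch randomization to three groups.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(42)
    groups = ["colostrum", "placebo", "control"]

    assignments = []
    for ranch in ["ranch_1", "ranch_2", "ranch_3"]:
        alloc = np.repeat(groups, 30)        # assume 30 calves per group, 90 per ranch
        rng.shuffle(alloc)                   # randomize the order of assignments
        assignments.append(pd.DataFrame({"ranch": ranch,
                                         "calf_id": range(1, 91),
                                         "group": alloc}))

    allocation = pd.concat(assignments, ignore_index=True)
    print(allocation.groupby(["ranch", "group"]).size())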