    The synergistic necrohemorrhagic action of Clostridium perfringens perfringolysin and alpha toxin in the bovine intestine and against bovine endothelial cells

    Bovine necrohemorrhagic enteritis is a major cause of mortality in veal calves. Clostridium perfringens is considered the causative agent, but there has been controversy over which toxins are responsible for the disease. Recently, it has been demonstrated that a variety of C. perfringens type A strains can induce necrohemorrhagic lesions in a calf intestinal loop assay. These results put forward alpha toxin and perfringolysin as potential causative toxins, since both are produced by all C. perfringens type A strains. The importance of perfringolysin in the pathogenesis of bovine necrohemorrhagic enteritis has not been studied before. Therefore, the objective of the current study was to evaluate the role of perfringolysin in the development of necrohemorrhagic enteritis lesions in calves and its synergism with alpha toxin. A perfringolysin-deficient mutant, an alpha toxin-deficient mutant and a perfringolysin alpha toxin double mutant were less able to induce necrosis in a calf intestinal loop assay than the wild-type strain. Only complementation with both toxins could restore the activity to that of the wild-type. In addition, perfringolysin and alpha toxin had a synergistic cytotoxic effect on bovine endothelial cells. This endothelial cell damage potentially explains why capillary hemorrhages are an initial step in the development of bovine necrohemorrhagic enteritis. Taken together, our results show that perfringolysin acts synergistically with alpha toxin in the development of necrohemorrhagic enteritis in a calf intestinal loop model, and we hypothesize that both toxins act by targeting the endothelial cells.

    Steering endogenous butyrate production in the intestinal tract of broilers as a tool to improve gut health

    The ban on antimicrobial growth promoters and efforts to reduce therapeutic antibiotic usage have led to major problems of gastrointestinal dysbiosis in livestock production in Europe. Control of dysbiosis without the use of antibiotics requires a thorough understanding of the interaction between the microbiota and the host mucosa. The gut microbiota of the healthy chicken is highly diverse, producing various metabolic end products, including gases and fermentation acids. The distal gut harbors an abundance of bacteria from the Firmicutes Clostridium clusters IV and XIVa that produce butyric acid, one of the metabolites sensed by the host as a signal. The host responds by strengthening the epithelial barrier, reducing inflammation, and increasing the production of mucins and antimicrobial peptides. Stimulating the colonization and growth of butyrate-producing bacteria may thus help optimize gut health. Various strategies are available to stimulate butyrate production in the distal gut. These include delivery of prebiotic substrates that are broken down by bacteria into smaller molecules, which are then used by butyrate producers, a concept called cross-feeding. Xylo-oligosaccharides (XOS) are such compounds, as they can be converted to lactate, which is further metabolized to butyrate. Probiotic lactic acid producers can be supplied to support the cross-feeding reactions. Direct feeding of butyrate-producing Clostridium cluster IV and XIVa strains is a future tool, provided that large-scale production of strictly anaerobic bacteria can be optimized. Current results of strategies that promote butyrate production in the gut are promising. Nevertheless, our current understanding of the intestinal ecosystem is still insufficient, and further research efforts are needed to fully exploit the capacity of these strategies.

    Reduced particle size wheat bran is butyrogenic and lowers Salmonella colonization, when added to poultry feed

    Feed additives, including prebiotics, are commonly used alternatives to antimicrobial growth promoters to improve gut health and performance in broilers. Wheat bran is a highly concentrated source of (in)soluble fiber which is partly degraded by the gut microbiota. The aim of the present study was to investigate the potential of wheat bran as such to reduce cecal colonization and shedding of Salmonella bacteria in vivo. The effect of particle size was also evaluated. Bran with a reduced average particle size of 280 μm decreased levels of cecal Salmonella colonization and shedding shortly after infection when compared to control groups and groups receiving bran with larger particle sizes. In vitro fermentation experiments revealed that bran with a smaller particle size was fermented more efficiently, with significantly higher production of butyric and propionic acid, compared to the control fermentation and fermentation of a larger fraction. Fermentation products derived from bran with an average particle size of 280 μm downregulated the expression of hilA, an important invasion-related gene of Salmonella. This downregulation was reflected in an actual lowered invasive potential when Salmonella bacteria were pretreated with the fermentation products derived from the smaller bran fraction. These data suggest that wheat bran with reduced particle size can be a suitable feed additive to help control Salmonella infections in broilers. The mechanism of action most probably relies on a more efficient fermentation of this bran fraction and the consequent increased production of short-chain fatty acids (SCFA). Among these SCFA, butyric and propionic acid are known to reduce the invasion potential of Salmonella bacteria.

    What can we learn from the past by means of very long-term follow-up after aortic valve replacement?

    Background: Studies on very long-term outcomes after aortic valve replacement are sparse. Methods: In this retrospective cohort study, long-term outcomes during 25.1 ± 2.8 years of follow-up were determined in 673 patients who underwent aortic valve replacement with or without concomitant coronary artery bypass surgery for severe aortic stenosis and/or regurgitation. Independent predictors of decreased long-term survival were determined. Cumulative incidence rates of major adverse events were assessed in patients with a mechanical versus those with a biologic prosthesis, as were rates of major bleeding events in patients with a mechanical prosthesis under versus above the age of 60. Results: Impaired left ventricular function, severe prosthesis–patient mismatch, and increased aortic cross-clamp time were independent predictors of decreased long-term survival. Left ventricular hypertrophy, a mechanical or biologic prosthesis, increased cardiopulmonary bypass time, new-onset postoperative atrial fibrillation, and the presence of symptoms did not independently predict decreased long-term survival. The risk of major bleeding events was higher in patients with a mechanical prosthesis than in those with a biologic prosthesis. Younger age (under 60 years) did not protect patients with a mechanical prosthesis against major bleeding events. Conclusions: Very long-term outcome data are invaluable for careful decision-making on aortic valve replacement.
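
    The abstract reports independent predictors of decreased long-term survival identified by multivariable analysis. As a rough, hedged illustration of that type of analysis (not the authors' actual code, data, or variable names), a Cox proportional-hazards fit could be sketched with the Python lifelines package; the file name and column names below are hypothetical.

```python
# Hedged sketch of a multivariable Cox proportional-hazards analysis of
# long-term survival after aortic valve replacement (AVR).
# File name and column names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("avr_cohort.csv")  # assumed: one row per patient

covariates = [
    "impaired_lv_function",   # 0/1
    "severe_ppm",             # 0/1, severe prosthesis-patient mismatch
    "cross_clamp_time_min",   # continuous, aortic cross-clamp time
    "mechanical_prosthesis",  # 0/1
    "new_onset_poaf",         # 0/1, new-onset postoperative atrial fibrillation
]

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "died"] + covariates],
    duration_col="followup_years",
    event_col="died",
)
cph.print_summary()  # hazard ratios with 95% CIs and p-values per predictor
```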

    Prosthesis-Patient Mismatch After Aortic Valve Replacement: Effect on Long-Term Survival

    Mean follow-up in previous studies on the effect of prosthesis-patient mismatch on long-term survival after aortic valve replacement (AVR) is confined to a maximum of one decade. This retrospective longitudinal cohort study was performed to determine the effect of prosthesis-patient mismatch on long-term survival after AVR with a mean follow-up of almost two decades. Kaplan-Meier survival analysis was used to determine long-term survival after AVR in a cohort of 673 consecutive patients, divided into 163 patients (24.2%) with prosthesis-patient mismatch (indexed effective orifice area ≤ 0.85 cm²/m²) and 510 patients (75.8%) without prosthesis-patient mismatch (indexed effective orifice area > 0.85 cm²/m²). Effective orifice area values of the prosthetic valves were retrieved from the literature or obtained from the charts of the prosthetic valve manufacturers. Cox multiple regression analysis was used to identify possible independent predictors, including prosthesis-patient mismatch, of decreased long-term survival. Median sizes of the implanted mechanical (n = 430) and biologic (n = 243) prostheses were 25 and 23 mm, respectively. Mean follow-up after AVR was 17.8 ± 1.8 years. Prosthesis-patient mismatch was not an independent predictor of decreased long-term survival (hazard ratio, 0.828; 95% confidence interval, 0.669 to 1.025; p = 0.083). Severe prosthesis-patient mismatch (indexed effective orifice area ≤ 0.65 cm²/m²), occurring in only 17 patients (2.5%), showed a nonsignificant trend toward decreased long-term survival (hazard ratio, 1.68; 95% confidence interval, 0.97 to 2.91; p = 0.066). Prosthesis-patient mismatch was not an independent predictor of decreased long-term survival after AVR.
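
    To make the classification above concrete: the indexed effective orifice area is the prosthesis effective orifice area divided by the patient's body surface area, and the ≤ 0.85 and ≤ 0.65 cm²/m² cut-offs from the abstract define mismatch and severe mismatch. A minimal Python sketch follows; labelling the 0.65-0.85 range "moderate" is conventional usage rather than wording from the abstract, and the example EOA and BSA values are hypothetical.

```python
# Minimal sketch of the prosthesis-patient mismatch (PPM) classification used
# above: indexed EOA = prosthesis effective orifice area / body surface area.
# Cut-offs (<= 0.85 and <= 0.65 cm^2/m^2) are taken from the abstract; the
# "moderate" label for the 0.65-0.85 range is conventional usage.
def classify_ppm(eoa_cm2: float, bsa_m2: float) -> str:
    """Classify prosthesis-patient mismatch from EOA (cm^2) and BSA (m^2)."""
    indexed_eoa = eoa_cm2 / bsa_m2  # cm^2/m^2
    if indexed_eoa > 0.85:
        return "no PPM"
    if indexed_eoa > 0.65:
        return "moderate PPM"
    return "severe PPM"

# Hypothetical example: EOA 1.5 cm^2 and BSA 1.9 m^2 give an indexed EOA of
# about 0.79 cm^2/m^2, i.e. mismatch that is not severe.
print(classify_ppm(1.5, 1.9))
```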

    New-onset postoperative atrial fibrillation after aortic valve replacement: Effect on long-term survival

    Objective: There is a paucity of data on long-term survival after new-onset postoperative atrial fibrillation (POAF) following cardiac surgery. Also, mean follow-up in previous studies is confined to a maximum of one decade. This retrospective, longitudinal cohort study was performed to determine the effect of new-onset POAF on long-term survival after aortic valve replacement (AVR) over a mean follow-up of almost 2 decades. Methods: Kaplan-Meier survival analysis was used to determine long-term survival after AVR, performed between January 1, 1990, and January 1, 1994, in 569 consecutive patients without a history of atrial fibrillation, divided into 241 patients (42.4%) with and 328 patients (57.6%) without new-onset POAF. New-onset POAF was included in the multivariable analysis of decreased long-term survival. After AVR, patients with new-onset POAF were treated with the aim of restoring sinus rhythm within 24 to 48 hours of onset, using medication and, when medication failed, direct-current cardioversion before discharge home. Results: Mean follow-up after AVR was 17.8 ± 1.9 years. The incidence of new-onset POAF was 42.4%. Kaplan-Meier overall cumulative survival rates at 15 years of follow-up were similar in patients with new-onset POAF versus those without: 41.5% (95% confidence interval [CI], 35.2-47.7) versus 41.3% (95% CI, 36.0-46.7), respectively. New-onset POAF was not an independent risk factor for decreased long-term survival (hazard ratio, 0.815; 95% CI, 0.663-1.001; P = .052). Conclusions: New-onset POAF after AVR does not affect long-term survival when treatment is aimed at restoring sinus rhythm before discharge home.
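
    As a hedged illustration of the kind of Kaplan-Meier comparison reported above (41.5% versus 41.3% survival at 15 years), the following Python sketch uses the lifelines package; the file name and column names are assumptions for illustration, not the study's data.

```python
# Hedged sketch of a Kaplan-Meier comparison of long-term survival in patients
# with versus without new-onset postoperative atrial fibrillation (POAF).
# File name and column names are hypothetical placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("avr_poaf_cohort.csv")  # assumed columns: followup_years, died, poaf

kmf = KaplanMeierFitter()
for has_poaf, group in df.groupby("poaf"):
    name = "new-onset POAF" if has_poaf else "no POAF"
    kmf.fit(group["followup_years"], group["died"], label=name)
    # Cumulative survival estimate at 15 years of follow-up
    print(name, float(kmf.survival_function_at_times(15.0).iloc[0]))
```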

    Reduced-particle-size wheat bran is efficiently colonized by a lactic acid-producing community and reduces levels of Enterobacteriaceae in the cecal microbiota of broilers

    In the present study, we investigated whether reducing the particle size of wheat bran affects the colonizing microbial community, using batch fermentations with cecal inocula from seven different chickens. We also investigated the effect of in-feed administration of regular wheat bran (WB; 1,690 μm) and wheat bran with reduced particle size (WB280; 280 μm) on the cecal microbial community composition of broilers. During batch fermentation, WB280 was colonized by a lactic acid-producing community (Bifidobacteriaceae and Lactobacillaceae) and by Lachnospiraceae, which contain lactic acid-consuming, butyric acid-producing species. The relative abundances of the Enterobacteriaceae decreased in the particle-associated communities of both WB and WB280 compared to the control. In addition, the community attached to wheat bran was enriched in xylan-degrading bacteria. When administered as a feed additive to broilers, WB280 significantly increased the richness of the cecal microbiota and the abundance of bacteria containing the butyryl-coenzyme A (CoA):acetate CoA-transferase gene, a key gene involved in bacterial butyrate production, while decreasing the abundance of Enterobacteriaceae family members in the ceca. Particle size reduction of wheat bran thus resulted in colonization of the bran particles by a very specific lactic acid- and butyric acid-producing community and can be used to steer toward beneficial microbial shifts. This can potentially increase resilience against pathogens and improve animal performance when the reduced-particle-size wheat bran is administered as a feed additive to broilers. IMPORTANCE Prebiotic dietary fibers are known to improve the gastrointestinal health of both humans and animals in many different ways. They can increase the bulking capacity, improve transit times, and, depending on the fiber, even stimulate the growth and activity of resident beneficial bacteria. Wheat bran is a readily available by-product of flour processing and a highly concentrated source of (in)soluble dietary fiber. The intake of fiber-rich diets has been associated with increased Firmicutes and decreased Proteobacteria numbers. Here, we show that applying only 1% of a relatively simple substrate, modified using relatively simple techniques, reduces the concentration of Enterobacteriaceae. This could imply that future intervention studies should take the particle size of dietary fibers into account.
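
    To make the relative-abundance and richness measures above concrete, a minimal Python sketch with pandas is given below; the taxon-by-sample count table, its file name, and its layout are assumptions for illustration, not the study's actual analysis pipeline.

```python
# Rough sketch: relative abundance and observed richness from a
# taxon-by-sample count table (e.g., 16S amplicon read counts).
# File name and table layout are assumptions for illustration.
import pandas as pd

counts = pd.read_csv("cecal_counts.csv", index_col=0)  # rows: taxa, columns: samples

# Relative abundance: each taxon's share of total reads per sample
rel_abund = counts.div(counts.sum(axis=0), axis=1)

# Observed richness: number of taxa detected (non-zero counts) per sample
richness = (counts > 0).sum(axis=0)

# Example: relative abundance of Enterobacteriaceae across cecal samples
print(rel_abund.loc["Enterobacteriaceae"])
print(richness)
```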