5 research outputs found

    Precursor-derived in-water peracetic acid impacts on broiler performance, gut microbiota and antimicrobial resistance genes

    Past antimicrobial misuse has led to the spread of antimicrobial resistance among pathogens, a major public health threat. Efforts to reduce the spread of antimicrobial-resistant (AMR) bacteria are in place worldwide, and finding alternatives to antimicrobials plays a pivotal role in them. Such molecules could be used as “green alternatives” to reduce the bacterial load either by targeting specific bacterial groups or, more generically, by functioning as biocides when delivered in vivo. In this study, the effect of in-water peracetic acid (PAA), generated via hydrolysis of the precursors sodium percarbonate and tetraacetylethylenediamine, was assessed as a broad-spectrum antibiotic alternative for broilers. Six equidistant PAA levels from 0 to 50 ppm were tested using four pens per treatment and four birds per pen (i.e., 16 birds per treatment and 96 in total). PAA was administered daily from d 7 to 14 of age whilst measuring performance parameters and end-point bacterial concentration (qPCR) in the crop, jejunum, and ceca, as well as 16S sequencing of the crop. PAA treatment, especially at 20, 30, and 40 ppm, increased body weight at d 14 and feed intake during PAA exposure compared with the control (P < 0.05). PAA decreased bacterial concentration in the crop only (P < 0.05), which correlated with better performance (P < 0.05). Although no differences in alpha- and beta-diversity were found, a reduction of Lactobacillus (P < 0.05) and Flectobacillus (P < 0.05) was observed in most treatments compared with the control, together with an increased abundance of the predicted 4-aminobutanoate degradation (V) pathway. Analysis of the AMR genes did not point towards any systematic differences in gene abundance due to treatment administration. Together with the rest of our observations, this could indicate that modulation of the proximal gut microbiota can improve performance. Thus, peracetic acid may be a valid antimicrobial alternative that could also positively affect performance.
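    The abstract describes a dose-response design (six equidistant PAA levels from 0 to 50 ppm, four pens per treatment, four birds per pen) with performance compared against an untreated control. The following is a minimal sketch of how such a one-way treatment comparison of d-14 body weight could be set up; the data, group means, and variances are synthetic placeholders, not the study's results, and the study itself would treat the pen as the experimental unit.

```python
# Sketch of the dose-response design described in the abstract, using
# synthetic data; body weights, means, and variances are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Six equidistant peracetic acid (PAA) levels from 0 to 50 ppm.
paa_levels_ppm = np.linspace(0, 50, 6)          # [0, 10, 20, 30, 40, 50]
pens_per_treatment, birds_per_pen = 4, 4        # 16 birds/treatment, 96 total

# Simulate d-14 body weight (g) per bird; hypothetical effect sizes.
group_means = dict(zip(paa_levels_ppm, [430, 435, 455, 460, 458, 445]))
weights = {lvl: rng.normal(group_means[lvl], 20,
                           pens_per_treatment * birds_per_pen)
           for lvl in paa_levels_ppm}

# One-way ANOVA across PAA levels (pen effects ignored in this toy sketch).
f_stat, p_val = stats.f_oneway(*weights.values())
print(f"ANOVA across PAA levels: F = {f_stat:.2f}, p = {p_val:.4f}")

# Pairwise comparison of each PAA level against the 0 ppm control.
control = weights[paa_levels_ppm[0]]
for lvl in paa_levels_ppm[1:]:
    t, p = stats.ttest_ind(weights[lvl], control)
    print(f"{lvl:>4.0f} ppm vs control: t = {t:.2f}, p = {p:.4f}")
```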

    A Bayesian reanalysis of the Standard versus Accelerated Initiation of Renal-Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial

    No full text
    Background: Timing of initiation of kidney-replacement therapy (KRT) in critically ill patients remains controversial. The Standard versus Accelerated Initiation of Renal-Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial compared two strategies of KRT initiation (accelerated versus standard) in critically ill patients with acute kidney injury and found neutral results for 90-day all-cause mortality. Probabilistic exploration of the trial endpoints may enable a greater understanding of the trial findings. We aimed to perform a reanalysis using a Bayesian framework. Methods: We performed a secondary analysis of all 2927 patients randomized in the multinational STARRT-AKI trial, conducted at 168 centers in 15 countries. The primary endpoint, 90-day all-cause mortality, was evaluated using hierarchical Bayesian logistic regression. A spectrum of priors was used, including optimistic, neutral, and pessimistic priors, along with priors informed by earlier clinical trials. Secondary endpoints (KRT-free days and hospital-free days) were assessed using zero-one inflated beta regression. Results: The posterior probability of benefit comparing an accelerated versus a standard KRT initiation strategy for the primary endpoint suggested no important difference, regardless of the prior used (absolute difference of 0.13% [95% credible interval (CrI) −3.30% to 3.40%], −0.39% [95% CrI −3.46% to 3.00%], and 0.64% [95% CrI −2.53% to 3.88%] for neutral, optimistic, and pessimistic priors, respectively). There was a very low probability that the effect size was equal to or larger than a consensus-defined minimal clinically important difference. Patients allocated to the accelerated strategy had fewer KRT-free days (median absolute difference of −3.55 days [95% CrI −6.38 to −0.48]), and the probability that the accelerated strategy was associated with more KRT-free days was 0.008. Hospital-free days were similar between strategies: the accelerated strategy had a median absolute difference of 0.48 more hospital-free days (95% CrI −1.87 to 2.72) compared with the standard strategy, and the probability that the accelerated strategy had more hospital-free days was 0.66. Conclusions: In a Bayesian reanalysis of the STARRT-AKI trial, we found a very low probability that an accelerated strategy has clinically important benefits compared with the standard strategy. Patients receiving the accelerated strategy probably have fewer days alive and KRT-free. These findings do not support the adoption of an accelerated strategy of KRT initiation.
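    The reanalysis evaluates the mortality difference under neutral, optimistic, and pessimistic priors and reports the posterior probability of benefit and of exceeding a minimal clinically important difference (MCID). Below is a minimal sketch of that logic on the log odds-ratio scale using a conjugate normal approximation rather than the trial's hierarchical logistic model; the event counts, prior standard deviations, and the MCID threshold (OR 0.80) are assumptions for illustration, not the published STARRT-AKI values.

```python
# Sketch of a Bayesian reanalysis on the log odds-ratio scale with a normal
# approximation to the likelihood; counts and priors are illustrative only.
import numpy as np
from scipy import stats

# Hypothetical 90-day mortality counts (events / n) per arm.
events_acc, n_acc = 640, 1460      # accelerated strategy (placeholder)
events_std, n_std = 635, 1460      # standard strategy (placeholder)

# Observed log odds ratio and its standard error.
log_or = np.log((events_acc / (n_acc - events_acc)) /
                (events_std / (n_std - events_std)))
se = np.sqrt(1/events_acc + 1/(n_acc - events_acc) +
             1/events_std + 1/(n_std - events_std))

# Priors on the log OR: neutral (no effect), optimistic (benefit of the
# accelerated strategy), pessimistic (harm); SD of 0.35 is an assumption.
priors = {"neutral":     (0.0,          0.35),
          "optimistic":  (np.log(0.80), 0.35),
          "pessimistic": (np.log(1.25), 0.35)}

for name, (mu0, sd0) in priors.items():
    # Conjugate normal-normal update: posterior precision is the sum of
    # prior and likelihood precisions.
    post_var = 1 / (1/sd0**2 + 1/se**2)
    post_mu = post_var * (mu0/sd0**2 + log_or/se**2)
    post = stats.norm(post_mu, np.sqrt(post_var))
    p_benefit = post.cdf(0.0)            # P(OR < 1): accelerated better
    p_mcid = post.cdf(np.log(0.80))      # P(OR < 0.80): assumed MCID
    print(f"{name:>11}: posterior OR {np.exp(post_mu):.2f} "
          f"[{np.exp(post.ppf(0.025)):.2f}, {np.exp(post.ppf(0.975)):.2f}], "
          f"P(benefit) = {p_benefit:.2f}, P(OR < 0.80) = {p_mcid:.2f}")
```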

    Initiation of continuous renal replacement therapy versus intermittent hemodialysis in critically ill patients with severe acute kidney injury: a secondary analysis of STARRT-AKI trial

    No full text
    Background: There is controversy regarding the optimal renal-replacement therapy (RRT) modality for critically ill patients with acute kidney injury (AKI). Methods: We conducted a secondary analysis of the STandard versus Accelerated Initiation of Renal-Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial to compare outcomes among patients who initiated RRT with either continuous renal replacement therapy (CRRT) or intermittent hemodialysis (IHD). We generated a propensity score for the likelihood of receiving CRRT and used inverse probability of treatment weighting with overlap weights to address baseline inter-group differences. The primary outcome was a composite of death or RRT dependence at 90 days after randomization. Results: We identified 1590 trial participants who initially received CRRT and 606 who initially received IHD. The composite outcome of death or RRT dependence at 90 days occurred in 823 (51.8%) patients who commenced CRRT and 329 (54.3%) patients who commenced IHD (unadjusted odds ratio [OR] 0.90; 95% confidence interval [CI] 0.75-1.09). After balancing baseline characteristics with overlap weighting, initial receipt of CRRT was associated with a lower risk of death or RRT dependence at 90 days compared with initial receipt of IHD (OR 0.81; 95% CI 0.66-0.99). This association was predominantly driven by a lower risk of RRT dependence at 90 days (OR 0.61; 95% CI 0.39-0.94). Conclusions: In critically ill patients with severe AKI, initiation of CRRT, as compared with IHD, was associated with a significant reduction in the composite outcome of death or RRT dependence at 90 days.
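    The analysis estimates a propensity score for receiving CRRT and then applies overlap weights, which weight treated patients by 1 − e(x) and untreated patients by e(x), to balance baseline characteristics. The sketch below illustrates that workflow on simulated data; the covariates, effect sizes, and outcome model are placeholders and do not reproduce the trial's adjustment set or results (only the combined sample size of 2196 comes from the abstract).

```python
# Sketch of propensity-score estimation with overlap weighting, using
# simulated data; covariates and effect sizes are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2196                                   # 1590 CRRT + 606 IHD patients

# Simulated baseline covariates (e.g., severity score, vasopressor use).
X = rng.normal(size=(n, 3))
# Treatment assignment (1 = CRRT) depends on covariates -> confounding.
p_treat = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1])))
treat = rng.binomial(1, p_treat)
# Binary outcome (death or RRT dependence at 90 days), illustrative model.
p_out = 1 / (1 + np.exp(-(-0.2 * treat + 0.6 * X[:, 0] + 0.3 * X[:, 2])))
y = rng.binomial(1, p_out)

# 1. Estimate the propensity score e(x) = P(CRRT | x).
ps = LogisticRegression(max_iter=1000).fit(X, treat).predict_proba(X)[:, 1]

# 2. Overlap weights: 1 - e(x) for treated, e(x) for untreated.
w = np.where(treat == 1, 1 - ps, ps)

# 3. Weighted risks and odds ratio between groups.
risk_crrt = np.average(y[treat == 1], weights=w[treat == 1])
risk_ihd = np.average(y[treat == 0], weights=w[treat == 0])
odds = lambda p: p / (1 - p)
print(f"weighted risk CRRT = {risk_crrt:.3f}, IHD = {risk_ihd:.3f}, "
      f"OR = {odds(risk_crrt) / odds(risk_ihd):.2f}")
```

    Overlap weights emphasize the region of covariate overlap between the two modality groups, which is why they are often preferred over plain inverse-probability weights when propensity scores are extreme.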

    Regional Practice Variation and Outcomes in the Standard Versus Accelerated Initiation of Renal Replacement Therapy in Acute Kidney Injury (STARRT-AKI) Trial: A Post Hoc Secondary Analysis.

    No full text
    Objectives: Among patients with severe acute kidney injury (AKI) admitted to the ICU in high-income countries, regional practice variation in fluid balance (FB) management, timing of initiation, and choice of renal replacement therapy (RRT) modality may be significant. Design: Secondary post hoc analysis of the STandard vs. Accelerated initiation of Renal Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial (ClinicalTrials.gov number NCT02568722). Setting: One hundred fifty-three ICUs in 13 countries. Patients: Altogether 2693 critically ill patients with AKI, of whom 994 were North American, 1143 European, and 556 from Australia and New Zealand (ANZ). Interventions: None. Measurements and Main Results: Total mean FB to a maximum of 14 days was +7199 mL in North America, +5641 mL in Europe, and +2211 mL in ANZ. Conclusions: Among STARRT-AKI trial centers, significant regional practice variation exists regarding FB, timing of initiation of RRT, and initial use of continuous RRT. After adjustment, such practice variation was associated with shorter ICU and hospital stays and lower 90-day mortality among ANZ patients compared with other regions.
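    The conclusions describe adjusted between-region comparisons of outcomes such as 90-day mortality. The sketch below shows the general shape of such an adjusted analysis (a logistic model with a region indicator and baseline covariates), using simulated data; the covariate set, model form, and all values other than the regional sample sizes are assumptions, not the trial's published analysis.

```python
# Sketch of an adjusted between-region comparison of 90-day mortality,
# using simulated data; covariates and model form are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
regions = ["NorthAmerica"] * 994 + ["Europe"] * 1143 + ["ANZ"] * 556
df = pd.DataFrame({
    "region": regions,
    # Rough regional mean fluid balances (litres), jittered per patient.
    "fluid_balance_l": rng.normal([7.2] * 994 + [5.6] * 1143 + [2.2] * 556, 3.0),
    "age": rng.normal(62, 12, len(regions)),
    "death_90d": rng.binomial(1, 0.42, len(regions)),
})

# Logistic regression of 90-day mortality on region, adjusted for
# illustrative baseline covariates, with ANZ as the reference region.
model = smf.logit(
    "death_90d ~ C(region, Treatment('ANZ')) + age + fluid_balance_l",
    data=df).fit(disp=False)
print(np.exp(model.params))             # odds ratios vs the ANZ reference
print(model.conf_int().apply(np.exp))   # 95% CIs on the odds-ratio scale
```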