
    Veterinary Medicine Needs New Green Antimicrobial Drugs

    Given that: (1) the worldwide consumption of antimicrobial drugs (AMDs) used in food-producing animals will increase over the coming decades; (2) the prudent use of AMDs will not suffice to stem the rise in human antimicrobial resistance (AMR) of animal origin; and (3) alternatives to AMD use are not available or not implementable, there is an urgent need to develop novel AMDs for food-producing animals. This is not for animal health reasons, but to break the link between the human and animal resistomes. In this review we establish the feasibility of developing new AMDs for veterinary medicine, termed green antibiotics, having minimal ecological impact on the animal commensal and environmental microbiomes. We first explain why animal and human commensal microbiota comprise a turnstile exchange between the human and animal resistomes. We then outline the ideal physico-chemical, pharmacokinetic and pharmacodynamic properties of a veterinary green antibiotic and conclude that such drugs can be developed through rational screening of currently used AMD classes. The ideal drug will be hydrophilic, of relatively low potency, with slow clearance and a small volume of distribution. It should be eliminated principally by the kidney as inactive metabolite(s). For oral administration, bioavailability can be enhanced by developing lipophilic pro-drugs. For parenteral administration, slow-release formulations of existing eco-friendly AMDs with a short elimination half-life can be developed. These new eco-friendly veterinary AMDs can be developed from currently used drug classes to provide alternative agents to those currently used in veterinary medicine and to mitigate animal contributions to the human AMR problem.

    N2 Gas Flushing Limits the Rise of Antibiotic-Resistant Bacteria in Bovine Raw Milk during Cold Storage

    Antibiotic resistance has been noted to be a major and increasing human health issue. Cold storage of raw milk promotes the thriving of psychrotrophic/psychrotolerant bacteria, which are well known for their ability to produce enzymes that are frequently heat stable. However, these bacteria also carry antibiotic resistance (AR) features. In places where no cold-chain facilities are available, numerous adulterants, including antibiotics, are added to raw milk despite existing recommendations. Previously, N2 gas flushing showed real potential for hindering bacterial growth in raw milk at storage temperatures ranging from 6 to 25 degrees C. Here, the ability of N2 gas (N) to tackle antibiotic-resistant bacteria was tested and compared to that of the activated lactoperoxidase system (HT) for three raw milk samples that were stored at 6 degrees C for 7 days. To that end, the mesophiles and psychrotrophs that were resistant to gentamicin (G), ceftazidime (Ce), levofloxacin (L), and trimethoprim-sulfamethoxazole (TS) were enumerated. For the log10 ratio (defined as the bacterial counts from a given condition divided by the counts of the corresponding control), a classical analysis of variance (ANOVA) was performed, followed by a mean comparison with the Ryan-Einot-Gabriel-Welsch multiple range test (REGWQ). Although the storage "time" factor was the major determinant of the recorded effects, cold storage alone or in combination with HT or with N promoted a sample-dependent response with respect to AR levels. The efficiency of N in limiting the increase in AR was highest for fresh raw milk and was judged to be equivalent to that of HT for one sample and superior to that of HT for the two other samples; moreover, compared to HT, N seemed to favor a more diverse community at 6 degrees C that was less heavily loaded with antibiotic multi-resistance features. Our results imply that N2 gas flushing could strengthen cold storage of raw milk by tackling the bacterial spoilage potential while simultaneously hindering the increase of bacteria carrying antibiotic resistance/multi-resistance features.
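The statistical workflow described above (log10 ratio of treated vs. control counts, then a one-way ANOVA) can be sketched in a few lines. This is an illustrative sketch only: the counts are hypothetical, not the study's data, and the REGWQ post-hoc test has no standard Python implementation, so only the log10-ratio and ANOVA F-statistic steps are shown.

```python
import math

def log10_ratio(treated, control):
    """log10 of (bacterial count under a condition / count of its control)."""
    return math.log10(treated / control)

def one_way_anova_f(*groups):
    """Classical one-way ANOVA F statistic: between-group vs within-group variance."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical CFU/mL pairs (treated, control) for three samples
n2_flushed = [log10_ratio(t, c) for t, c in [(1e4, 1e6), (5e3, 2e6), (2e4, 1e6)]]
lactoperoxidase = [log10_ratio(t, c) for t, c in [(8e4, 1e6), (1e5, 2e6), (6e4, 1e6)]]
f_stat = one_way_anova_f(n2_flushed, lactoperoxidase)
```

A negative log10 ratio indicates the treatment lowered counts relative to the control; the F statistic would then feed a significance test before any multiple-range comparison.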

    Surveillance of Gram-negative bacteria: impact of variation in current European laboratory reporting practice on apparent multidrug resistance prevalence in paediatric bloodstream isolates.

    This study evaluates whether estimated multidrug resistance (MDR) levels are dependent on the design of the surveillance system when using routine microbiological data. We used antimicrobial resistance data from the Antibiotic Resistance and Prescribing in European Children (ARPEC) project. The MDR status of bloodstream isolates of Escherichia coli, Klebsiella pneumoniae and Pseudomonas aeruginosa was defined using European Centre for Disease Prevention and Control (ECDC)-endorsed standardised algorithms (non-susceptible to at least one agent in three or more antibiotic classes). Assessment of MDR status was based on specified combinations of antibiotic classes reportable as part of routine surveillance activities. The agreement between MDR status and resistance to specific pathogen-antibiotic class combinations (PACCs) was assessed. Based on all available antibiotic susceptibility testing, the proportion of MDR isolates was 31% for E. coli, 30% for K. pneumoniae and 28% for P. aeruginosa. These proportions fell to 9%, 14% and 25%, respectively, when based only on the classes collected by current ECDC surveillance methods. Resistance percentages for specific PACCs were lower than MDR percentages, except for P. aeruginosa. Accordingly, MDR detection based on individual PACCs had low sensitivity for E. coli (2-41%) and K. pneumoniae (21-85%). Estimates of MDR percentages for Gram-negative bacteria are strongly influenced by the antibiotic classes reported. When a complete set of results requested by the algorithm is not available, inclusion of classes frequently tested as part of routine clinical care greatly improves the detection of MDR. Resistance to individual PACCs should not be considered reflective of MDR percentages in Enterobacteriaceae.
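The ECDC-endorsed algorithm cited above reduces to a simple counting rule: an isolate is MDR if it is non-susceptible to at least one agent in three or more antibiotic classes. A minimal sketch (the class names, the per-class boolean encoding, and the example isolate are illustrative assumptions, not the ARPEC dataset's format):

```python
def is_mdr(class_results: dict) -> bool:
    """ECDC-style MDR rule: non-susceptible (True) in >= 3 antibiotic classes.

    class_results maps an antibiotic class name to True if the isolate is
    non-susceptible to at least one tested agent in that class.
    """
    return sum(class_results.values()) >= 3

# Illustrative E. coli isolate tested against four classes (hypothetical data)
isolate = {
    "aminopenicillins": True,
    "fluoroquinolones": True,
    "third-generation cephalosporins": True,
    "carbapenems": False,
}
print(is_mdr(isolate))  # three non-susceptible classes -> True
```

The rule also makes the paper's central caveat visible: a class that is never tested or never reported cannot contribute to the count, so restricting the reportable classes can only lower, never raise, the apparent MDR proportion.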

    Developing core elements and checklist items for global hospital antimicrobial stewardship programmes: a consensus approach


    EFSA BIOHAZ Panel (EFSA Panel on Biological Hazards), 2013. Scientific Opinion on the public health hazards to be covered by inspection of meat (solipeds)

    A risk ranking process identified Trichinella spp. as the most relevant biological hazard in the context of meat inspection of domestic solipeds. Without a full and reliable soliped traceability system, it is considered that either testing all slaughtered solipeds for Trichinella spp. or applying inactivating meat treatments (heat or irradiation) should be used to maintain the current level of safety. With regard to general aspects of current meat inspection practices, the use of manual techniques during current post-mortem soliped meat inspection may increase microbial cross-contamination, and is considered to have a detrimental effect on the microbiological status of soliped carcass meat. Therefore, the use of visual-only inspection is suggested for “non-suspect” solipeds. For chemical hazards, phenylbutazone and cadmium were ranked as being of high potential concern. Monitoring programmes for chemical hazards should be more flexible and based on the risk of occurrence, taking into account Food Chain Information (FCI), covering the specific on-farm environmental conditions and individual animal treatments, and the ranking of chemical substances, which should be regularly updated and include new hazards. Sampling, testing and intervention protocols for chemical hazards should be better integrated and should focus particularly on cadmium, phenylbutazone and priority “essential substances” approved for treatment of equine animals. Implementation and enforcement of a more robust and reliable identification system throughout the European Union is needed to improve traceability of domestic solipeds. Meat inspection is recognised as a valuable tool for surveillance and monitoring of animal health and welfare conditions. If visual-only post-mortem inspection is implemented for routine slaughter, a reduction in the detection of strangles and mild cases of rhodococcosis would occur. However, this was considered unlikely to affect the overall surveillance of both diseases. Improvement of FCI and traceability was considered as not having a negative effect on animal health and welfare surveillance.

    Safety of COVID-19 vaccines administered in the EU: Should we be concerned?

    The COVID-19 pandemic has had an unprecedented and devastating impact on public health, society and economies around the world. As a result, the development of vaccines to protect individuals from symptomatic COVID-19 has represented the only feasible health tool to combat the spread of the disease. At the same time, however, the development and regulatory assessment of different vaccines has challenged pharmaceutical industries and regulatory agencies, as this process has occurred in the shortest time frame ever. So far, two mRNA and two adenovirus-vectored vaccines have received a conditional marketing authorisation in the EU and other countries. This review summarises and discusses the assessment reports of the European Medicines Agency (EMA) concerning the safety of the three vaccines currently used in the EU (Pfizer, Moderna and AstraZeneca). Particular attention has been paid to safety information from pre-clinical (animal) and clinical (phase 3 trial) studies. Overall, the most frequent adverse effects reported after the administration of these vaccines consisted of local reactions at the injection site (sore arm and erythema) followed by non-specific systemic effects (myalgia, chills, fatigue, headache, and fever), which occurred soon after vaccination and resolved shortly thereafter. Rare cases of vaccine-induced immune thrombotic thrombocytopenia have been reported for Vaxzevria. Data from long-term studies and on interaction with other vaccines, use in pregnancy/breast-feeding, use in immunocompromised subjects, and use in subjects with comorbidities or autoimmune or inflammatory disorders are still missing for these vaccines. Therefore, careful follow-up and surveillance studies for continued vaccine safety monitoring will be needed to ascertain the potential risks of such adverse events or diseases. In conclusion, the benefits and risks of current COVID-19 vaccines must be weighed against the real possibility of contracting the disease and developing complications and long-term sequelae, all on the basis of the available scientific evidence and in the absence of unmotivated biases.

    Influence of Diagnostic Method on Outcomes in Phase 3 Clinical Trials of Bezlotoxumab for the Prevention of Recurrent Clostridioides difficile Infection: A Post Hoc Analysis of MODIFY I/II

    Background: The optimum diagnostic test method for Clostridioides difficile infection (CDI) remains controversial due to variation in accuracy in identifying true CDI. This post hoc analysis examined the impact of CDI diagnostic testing methodology on efficacy outcomes in the phase 3 MODIFY I/II trials. Methods: In MODIFY I/II (NCT01241552/NCT01513239), participants received bezlotoxumab (10 mg/kg) or placebo during anti-CDI treatment for primary/recurrent CDI (rCDI). Using MODIFY I/II pooled data, initial clinical cure (ICC) and rCDI were assessed in participants diagnosed at baseline using direct detection methods (enzyme immunoassay [EIA]/cell cytotoxicity assay [CCA]) or indirect methods to determine toxin-producing ability (toxin gene polymerase chain reaction [tgPCR]/toxigenic culture). Results: Of 1554 participants who received bezlotoxumab or placebo in MODIFY I/II, 781 (50.3%) and 773 (49.7%) were diagnosed by tgPCR/toxigenic culture and toxin EIA/CCA, respectively. Participants diagnosed by toxin EIA/CCA were more likely to be inpatients, to be older, and to have severe CDI. In bezlotoxumab recipients, ICC rates were slightly higher in the toxin EIA/CCA subgroup (81.7%) vs tgPCR/toxigenic culture (78.4%). Bezlotoxumab significantly reduced the rCDI rate vs placebo in both subgroups; however, the magnitude of reduction was substantially larger in participants diagnosed by toxin EIA/CCA (relative difference, –46.6%) vs tgPCR/toxigenic culture (–29.1%). In bezlotoxumab recipients, the rCDI rate was lower in the toxin EIA/CCA subgroup (17.6%) vs tgPCR/toxigenic culture (23.6%; absolute difference, –6.0%; 95% confidence interval, –12.4 to 0.3; relative difference, –25.4%). Conclusions: Diagnostic tests that detect fecal C. difficile toxins are of fundamental importance to accurately diagnosing CDI, including in clinical trial design, ensuring that therapeutic efficacy is not underestimated.
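The absolute and relative differences quoted in the results follow directly from the two subgroup recurrence rates. As a quick arithmetic check using the rates reported in the abstract (17.6% vs 23.6% rCDI in bezlotoxumab recipients):

```python
def abs_and_rel_difference(rate_a, rate_b):
    """Absolute difference (percentage points) and relative difference (%)
    of rate_a compared with reference rate_b."""
    absolute = rate_a - rate_b
    relative = absolute / rate_b * 100
    return absolute, relative

# rCDI rates: toxin EIA/CCA subgroup vs tgPCR/toxigenic culture subgroup
absolute, relative = abs_and_rel_difference(17.6, 23.6)
print(round(absolute, 1), round(relative, 1))  # -6.0 -25.4
```

This reproduces the abstract's reported absolute difference of –6.0 percentage points and relative difference of –25.4%.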