
    Efficacy of rifampicin combination therapy for the treatment of enterococcal infections assessed in vivo using a Galleria mellonella infection model.

    Enterococci are a leading cause of healthcare-associated infection worldwide and display increasing levels of resistance to many of the commonly used antimicrobials, making treatment of their infections challenging. Combinations of antibiotics are occasionally employed to treat serious infections, allowing for the possibility of synergistic killing. The aim of this study was to evaluate the effects of different antibacterial combinations against enterococcal isolates using an in vitro approach and an in vivo Galleria mellonella infection model. Five Enterococcus faecalis and three Enterococcus faecium strains were screened with paired combinations of rifampicin, tigecycline, linezolid or vancomycin using the chequerboard dilution method. Antibacterial combinations that displayed synergy were selected for in vivo testing using a G. mellonella larvae infection model. Rifampicin was an effective antibacterial enhancer when used in combination with tigecycline or vancomycin, with minimum inhibitory concentrations (MICs) of each individual antibiotic being reduced by between two and four doubling dilutions, generating fractional inhibitory concentration index (FICI) values between 0.31 and 0.5. Synergy observed with the chequerboard screening assays was subsequently observed in vivo using the G. mellonella model, with combination treatment demonstrating superior protection of larvae post-infection in comparison with antibiotic monotherapy. In particular, rifampicin in combination with tigecycline or vancomycin significantly enhanced larval survival. Addition of rifampicin to anti-enterococcal treatment regimens warrants further investigation and may prove useful in the treatment of enterococcal infections whilst prolonging the clinically useful life of currently active antibiotics.
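The chequerboard arithmetic behind the reported FICI values can be sketched in code. The formula and interpretive breakpoints are the conventional FICI definition; the MIC values below are illustrative, not taken from the study:

```python
# Fractional inhibitory concentration index (FICI) from a chequerboard assay.
# FIC of each drug = MIC in combination / MIC alone; FICI = FIC(A) + FIC(B).

def fici(mic_a_alone, mic_a_combo, mic_b_alone, mic_b_combo):
    """FICI = FIC(A) + FIC(B)."""
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

def interpret(value):
    # Common interpretive breakpoints: synergy <= 0.5, antagonism > 4.0,
    # anything in between treated as no interaction (indifference).
    if value <= 0.5:
        return "synergy"
    if value > 4.0:
        return "antagonism"
    return "no interaction"

# Hypothetical example: each MIC falls by two doubling dilutions (4-fold),
# giving FICI = 0.25 + 0.25 = 0.5, the synergy threshold.
value = fici(mic_a_alone=8, mic_a_combo=2, mic_b_alone=4, mic_b_combo=1)
print(value, interpret(value))  # 0.5 synergy
```

A two-doubling-dilution drop in both MICs thus lands exactly at the 0.5 synergy cut-off, consistent with the 0.31–0.5 range reported above.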

    An Analysis of C. difficile Environmental Contamination During and Following Treatment for C. difficile Infection

    Background: Lower Clostridium difficile spore counts in feces from C difficile infection (CDI) patients treated with fidaxomicin versus vancomycin have been observed. We aimed to determine whether environmental contamination is lower in patients treated with fidaxomicin compared with those treated with vancomycin/metronidazole. Methods: The CDI cases were recruited at 4 UK hospitals (Leeds, Bradford, and London [2 centers]). Environmental samples (5 room sites) were taken pretreatment and at 2–3, 4–5, 6–8, and 9–12 days of treatment, end of treatment (EOT), and post-EOT. Fecal samples were collected at diagnosis and as often as produced thereafter. Swabs/feces were cultured for C difficile; percentage of C difficile-positive samples and C difficile bioburden were compared between different treatment arms at each time point. Results: Pre-EOT (n = 244), there was a significant reduction in environmental contamination (≥1 site positive) around fidaxomicin versus vancomycin/metronidazole recipients at days 4–5 (30% vs 50% recipients, P = .04) and at days 9–12 (22% vs 49%, P = .005). This trend was consistently seen at all other timepoints, but it was not statistically significant. No differences were seen between treatment groups post-EOT (n = 76). Fidaxomicin-associated fecal positivity rates and colony counts were consistently lower than those for vancomycin/metronidazole from days 4 to 5 of treatment (including post-EOT); however, the only significant difference was in positivity rate at days 9–12 (15% vs 55%, P = .03). Conclusions: There were significant reductions in C difficile recovery from both feces and the environment around fidaxomicin versus vancomycin/metronidazole recipients. Therefore, fidaxomicin treatment may lower the C difficile transmission risk by reducing excretion and environmental contamination.

    Defining young in the context of prostate cancer

    The experience of prostate cancer is for most men a major life stress, with the psychological burden of this disease falling more heavily on those who are younger. Despite this, being young as it applies to prostate cancer is not yet clearly defined, with varied chronological approaches applied. However, men’s responses to health crises are closely bound to life course and masculinities from which social roles emerge. This paper applied qualitative methodology (structured focus groups and semi-structured interviews with expert informants) using interpretative phenomenological analysis to define what it means to be young and have prostate cancer. Structured focus groups were held with 26 consumer advisors (men diagnosed with prostate cancer who provide support to other men with prostate cancer or raise community awareness) and health professionals. In addition, 15 men diagnosed with prostate cancer and in their 40s, 50s, or 60s participated in semi-structured interviews. Participants discussed the attributes that describe a young man with prostate cancer and the experience of being young and diagnosed with prostate cancer. Chronological definitions of a young man were absent or inconsistent. Masculine constructions of what it means to be a young man and life course characteristics appear more relevant to defining young as it applies to prostate cancer compared with chronological age. These findings have implications for better understanding the morbidities associated with this illness, and in designing interventions that are oriented to life course and helping young men reconstruct their identities after prostate cancer.

    Diagnosis of Aortic Graft Infection: A Case Definition by the Management of Aortic Graft Infection Collaboration (MAGIC)

    OBJECTIVE/BACKGROUND: The management of aortic graft infection (AGI) is highly complex and, in the absence of a universally accepted case definition and evidence-based guidelines, clinical approaches and outcomes vary widely. The objective was to define precise criteria for diagnosing AGI. METHODS: A process of expert review and consensus, involving formal collaboration between vascular surgeons, infection specialists, and radiologists from several English National Health Service hospital Trusts with large vascular services (Management of Aortic Graft Infection Collaboration [MAGIC]), produced the definition. RESULTS: Diagnostic criteria from three categories were classified as major or minor. It is proposed that AGI should be suspected if a single major criterion or two or more minor criteria from different categories are present. AGI is diagnosed if there is one major plus any criterion (major or minor) from another category. (i) Clinical/surgical major criteria comprise intraoperative identification of pus around a graft and situations where direct communication between the prosthesis and a nonsterile site exists, including fistulae, exposed grafts in open wounds, and deployment of an endovascular stent-graft into an infected field (e.g., mycotic aneurysm); minor criteria are localized AGI features or fever ≥38°C, where AGI is the most likely cause. (ii) Radiological major criteria comprise increasing perigraft gas volume on serial computed tomography (CT) imaging or perigraft gas or fluid (≥7 weeks and ≥3 months, respectively) postimplantation; minor criteria include other CT features or evidence from alternative imaging techniques. (iii) Laboratory major criteria comprise isolation of microorganisms from percutaneous aspirates of perigraft fluid, explanted grafts, and other intraoperative specimens; minor criteria are positive blood cultures or elevated inflammatory indices with no alternative source.
CONCLUSION: This AGI definition potentially offers a practical and consistent diagnostic standard, essential for comparing clinical management strategies, trial design, and developing evidence-based guidelines. It requires validation that is planned in a multicenter, clinical service database supported by the Vascular Society of Great Britain & Ireland.
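The suspicion/diagnosis rules in the definition above amount to a small decision procedure, sketched here. The tuple-based input format and category labels are assumptions for illustration; the classification rules follow the wording of the definition:

```python
# Sketch of the MAGIC case-definition logic. Input: (category, grade) pairs,
# where category is "clinical", "radiological" or "laboratory" and grade is
# "major" or "minor". Output: a classification string.

def classify_agi(criteria):
    """Apply the suspicion/diagnosis rules to a set of observed criteria."""
    major_cats = {cat for cat, grade in criteria if grade == "major"}
    minor_cats = {cat for cat, grade in criteria if grade == "minor"}
    all_cats = major_cats | minor_cats

    # Diagnosed: one major criterion plus any criterion (major or minor)
    # from another category.
    if major_cats and len(all_cats) >= 2:
        return "diagnosed"
    # Suspected: a single major criterion, or two or more minor criteria
    # from different categories (sets collapse same-category duplicates).
    if major_cats or len(minor_cats) >= 2:
        return "suspected"
    return "not AGI by this definition"

print(classify_agi([("clinical", "major"), ("laboratory", "minor")]))  # diagnosed
```

Note that a major plus a minor criterion from the same category only yields "suspected", since diagnosis requires supporting evidence from a second category.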

    Opportunities for antimicrobial stewardship in patients with acute bacterial skin and skin structure infections who are unsuitable for beta-lactam antibiotics: a multicenter prospective observational study

    Purpose: The objective of this prospective, observational study was to describe the treatment, severity assessment and healthcare resources required for management of patients with acute bacterial skin and skin structure infections who were unsuitable for beta-lactam antibiotic treatments. Methods: Patients were enrolled across five secondary care National Health Service hospitals. Eligible patients had a diagnosis of acute bacterial skin and skin structure infection and were considered unsuitable for beta-lactam antibiotics (e.g. confirmed/suspected methicillin-resistant Staphylococcus aureus, beta-lactam allergy). Data regarding diagnosis, severity of the infection, antibiotic treatment and patient management were collected. Results: 145 patients with acute bacterial skin and skin structure infection were included; 79% (n = 115) of patients received more than two antibiotic regimens; median length of the first antibiotic regimen was 2 days (interquartile range of 1–5); median time to switch from intravenous to oral antibiotics was 4 days (interquartile range of 3–8, n = 72/107); 25% (n = 10/40) of patients with Eron class 1 infection had systemic inflammatory response syndrome, suggesting they were misclassified. A higher proportion of patients with systemic inflammatory response syndrome received treatment in an inpatient setting, and their length of stay was prolonged in comparison with patients without systemic inflammatory response syndrome. Conclusion: There exists an urgent need for more focused antimicrobial stewardship strategies and tools for standardised clinical assessment of acute bacterial skin and skin structure infection severity in patients who are unsuitable for beta-lactam antibiotics. This will lead to optimised antimicrobial treatment strategies and ensure effective healthcare resource utilisation.
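The SIRS screen used above to flag possible Eron class 1 misclassification can be sketched with the conventional consensus criteria (at least two of four). The thresholds below are the standard textbook values, not values reported by the study:

```python
# Sketch of a SIRS screen. Thresholds are the conventional consensus
# criteria (>= 2 of 4 positive), not study-specific values.

def sirs_criteria_met(temp_c, heart_rate, resp_rate, wbc_10e9_per_l):
    """Count how many of the four classic SIRS criteria are met."""
    flags = [
        temp_c > 38.0 or temp_c < 36.0,                 # fever or hypothermia
        heart_rate > 90,                                # tachycardia (beats/min)
        resp_rate > 20,                                 # tachypnoea (breaths/min)
        wbc_10e9_per_l > 12.0 or wbc_10e9_per_l < 4.0,  # leukocytosis/leukopenia
    ]
    return sum(flags)

def has_sirs(temp_c, heart_rate, resp_rate, wbc_10e9_per_l):
    """SIRS is conventionally defined as two or more criteria met."""
    return sirs_criteria_met(temp_c, heart_rate, resp_rate, wbc_10e9_per_l) >= 2

print(has_sirs(38.6, 104, 18, 9.5))  # True (fever plus tachycardia)
```

Under the Eron scheme, class 1 patients should lack systemic toxicity, so a positive SIRS screen like this is what suggests misclassification.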

    The welfare implications of large litter size in the domestic pig II: management factors

    Increasing litter size has long been a goal of pig (Sus scrofa domesticus) breeders and producers in many countries. Whilst this has economic and environmental benefits for the pig industry, there are also implications for pig welfare. Certain management interventions are used when litter size routinely exceeds the ability of individual sows to successfully rear all the piglets (i.e. viable piglets outnumber functional teats). Such interventions include: tooth reduction; split suckling; cross-fostering; use of nurse sow systems and early weaning, including split weaning; and use of artificial rearing systems. These practices raise welfare questions for both the piglets and sow and are described and discussed in this review. In addition, possible management approaches which might mitigate health and welfare issues associated with large litters are identified. These include early intervention to provide increased care for vulnerable neonates and improvements to farrowing accommodation to mitigate negative effects, particularly for nurse sows. An important concept is that management at all stages of the reproductive cycle, not simply in the farrowing accommodation, can impact on piglet outcomes. For example, poor stock handling at earlier stages of the reproductive cycle can create fearful animals with increased likelihood of showing poor maternal behaviour. Benefits of good sow and litter management, including positive human-animal relationships, are discussed. Such practices apply to all production situations, not just those involving large litters. However, given that interventions for large litters involve increased handling of piglets and increased interaction with sows, there are likely to be even greater benefits for management of hyper-prolific herds.

    The impact of the introduction of fidaxomicin on the management of Clostridium difficile infection in seven NHS secondary care hospitals in England: a series of local service evaluations.

    Clostridium difficile infection (CDI) is associated with high mortality. Reducing incidence is a priority for patients, clinicians, the National Health Service (NHS) and Public Health England alike. In June 2012, fidaxomicin (FDX) was launched for the treatment of adults with CDI. The objective of this evaluation was to collect robust real-world data to understand the effectiveness of FDX in routine practice. In seven hospitals introducing FDX between July 2012 and July 2013, data were collected retrospectively from medical records on CDI episodes occurring 12 months before/after the introduction of FDX. All hospitalised patients aged ≥18 years with primary CDI (diarrhoea with presence of toxin A/B, without a CDI episode in the preceding 3 months) were included. Recurrence was defined as in-patient diarrhoea re-emergence requiring treatment any time within 3 months after the first episode. Each hospital had a different protocol for the use of FDX. In hospitals A and B, where FDX was used first line for all primary and recurrent episodes, the recurrence rate reduced from 10.6 % to 3.1 % and from 16.3 % to 3.1 %, with a significant difference in 28-day mortality from 18.2 % to 3.1 % (p < 0.05) and 17.3 % to 6.3 % (p < 0.05) for hospitals A and B, respectively. In hospitals using FDX in selected patients only, the changes in recurrence rates and mortality were less marked. The pattern of adoption of FDX appears to affect its impact on CDI outcome, with maximum reduction in recurrence and all-cause mortality where it is used as first-line treatment.

    In Vitro Evaluation of Enterococcus faecalis Adhesion on Various Endodontic Medicaments

    E. faecalis in endodontic infection represents a biofilm type of disease, which explains the bacterium’s resistance to various antimicrobial compounds and the subsequent failure of endodontic treatment. The purpose of this study was to compare the antimicrobial activities and bacterial adhesion kinetics in vitro of three endodontic medicaments against a clinical isolate of E. faecalis. We devised a shake culture which contained the following intracanal preparations: CPD, Endoidrox (EIX) and PulpCanalSealer (PCS); these were immersed in a liquid culture medium inoculated with the microorganism. The shake system velocity was able to prevent non-specific bacterial adhesion and simulated salivary flow. Specimens were collected daily (from both the medium and the medicaments) for 10 days; viable cells were counted by plate count, while the adhesion index AI° ([fg E. faecalis DNA]/mm²) was evaluated in the pastes after DNA extraction, by quantitative real-time PCR for the 16S rRNA gene. A partial growth inhibition, during the first 24 hours, was observed in the liquid medium and on the medicaments for EIX and subsequently for CPD (six logs). EIX showed the lowest adhesion coefficient (5 × 10² [fg DNA]/mm²) for nine days and was similar to the control. PCS showed no antimicrobial/antibiofilm properties. This showed that calcium oxide-based compounds could be active against biofilm progression and, at least in the short term (2–4 days), against E. faecalis cells growing in planktonic cultures.
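The adhesion index AI° ([fg DNA]/mm²) is a quantity-per-area calculation downstream of qPCR. The sketch below assumes a conventional log-linear standard curve for absolute quantification; the slope, intercept and specimen values are hypothetical, not taken from the study:

```python
# Sketch of deriving an adhesion index of the form [fg DNA]/mm^2 from qPCR
# data. The log-linear standard curve is the conventional absolute-
# quantification approach; slope/intercept and specimen values are
# hypothetical illustrations only.

def dna_fg_from_cq(cq, slope=-3.32, intercept=38.0):
    """Invert the standard curve Cq = slope * log10(fg) + intercept."""
    return 10 ** ((cq - intercept) / slope)

def adhesion_index(cq, specimen_area_mm2, slope=-3.32, intercept=38.0):
    """Adhered DNA per unit medicament surface area, in fg/mm^2."""
    return dna_fg_from_cq(cq, slope, intercept) / specimen_area_mm2

# Hypothetical specimen: Cq of 34.68 (~10 fg DNA) over a 20 mm^2 surface.
print(round(adhesion_index(34.68, 20.0), 3))  # 0.5
```

A slope near -3.32 corresponds to ~100% amplification efficiency, which is why it is the customary default in such sketches.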

    The welfare implications of large litter size in the domestic pig I: biological factors

    Increasing litter size has long been a goal of pig breeders and producers, and may have implications for pig (Sus scrofa domesticus) welfare. This paper reviews the scientific evidence on biological factors affecting sow and piglet welfare in relation to large litter size. It is concluded that, in a number of ways, large litter size is a risk factor for decreased animal welfare in pig production. Increased litter size is associated with increased piglet mortality, which is likely to be associated with significant negative animal welfare impacts. In surviving piglets, many of the causes of mortality can also occur in non-lethal forms that cause suffering. Intense teat competition may increase the likelihood that some piglets do not gain adequate access to milk, causing starvation in the short term and possibly long-term detriments to health. Also, increased litter size leads to more piglets with low birth weight, which is associated with a variety of negative long-term effects. Finally, increased production pressure placed on sows bearing large litters may produce health and welfare concerns for the sow. However, possible biological approaches to mitigating health and welfare issues associated with large litters are being implemented. An important mitigation strategy is genetic selection encompassing traits that promote piglet survival, vitality and growth. Sow nutrition and the minimisation of stress during gestation could also contribute to improving outcomes in terms of piglet welfare. Awareness of the possible negative welfare consequences of large litter size in pigs should lead to further active measures being taken to mitigate these effects.