
    Vancomycin-resistant enterococci: consequences for therapy and infection control

    Vancomycin-resistant enterococci (VRE) have emerged as important nosocomial pathogens, initially in the USA, but now also in Europe, where hospital outbreaks are being reported with increasing frequency, although the incidence of VRE infections remains extremely low in most European countries. The recently demonstrated in vivo transmission of vancomycin resistance from VRE to methicillin-resistant Staphylococcus aureus (MRSA) in two American patients underscores the potential danger of a coexisting reservoir of both pathogens. As MRSA is already endemic in many European hospital settings, prevention of endemicity with VRE seems relevant, but should be balanced against the costs of implementing effective strategies. The presence of a large community reservoir of VRE in Europe could hamper the feasibility of infection control strategies. Although the prevalence of colonisation amongst healthy subjects has apparently decreased since the ban on avoparcin use in agriculture, a large proportion of admitted patients are still potential sources of VRE transmission. With no risk profile available to identify these carriers, effective screening, followed by barrier precautions for carriers, seems impossible. Recent studies, however, have suggested that hospital outbreaks are almost exclusively caused by specific genogroups of VRE that can be characterised phenotypically and genotypically (e.g., co-resistance to ampicillin and the presence of the variant esp gene). Based on our own experience, we propose that VRE infection control programmes should be restricted to patients colonised with these VRE strains. If such a strain is cultured from a clinical sample, surveillance amongst contact patients is recommended, and barrier precautions should be implemented in the case of documented spread.
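    The proposed targeted policy amounts to a simple decision rule. The sketch below is only a hypothetical illustration: the class, function, and field names are ours, and the two markers used (ampicillin co-resistance and the variant esp gene) are given in the abstract merely as examples of the outbreak-associated genogroups.

```python
# Hypothetical sketch of the targeted VRE infection-control rule described above.
# Field names are illustrative and not taken from the paper.
from dataclasses import dataclass

@dataclass
class VreIsolate:
    from_clinical_sample: bool   # cultured from a clinical (not screening) sample
    ampicillin_resistant: bool   # co-resistance to ampicillin
    variant_esp_present: bool    # carries the variant esp gene

def outbreak_associated(isolate: VreIsolate) -> bool:
    """Markers the abstract cites as examples of hospital-outbreak genogroups."""
    return isolate.ampicillin_resistant and isolate.variant_esp_present

def recommended_actions(isolate: VreIsolate, spread_documented: bool) -> list[str]:
    """Control measures suggested by the abstract for an outbreak-associated isolate."""
    actions: list[str] = []
    if outbreak_associated(isolate) and isolate.from_clinical_sample:
        actions.append("surveillance cultures amongst contact patients")
        if spread_documented:
            actions.append("barrier precautions for carriers")
    return actions
```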

    Calsequestrin as a risk factor in Graves’ hyperthyroidism and Graves’ ophthalmopathy patients

    Background: The pathogenesis of Graves’ ophthalmopathy (GO) and Graves’ hyperthyroidism (GH), and the mechanisms linking the eye disease to thyroid autoimmunity, are poorly understood. Our research focuses on the role of the skeletal muscle calcium-binding protein calsequestrin (CASQ1) in the thyroid. We measured the concentration of the CASQ1 protein, correlating levels with parameters of the eye signs, CASQ1 antibody levels, and the CASQ1 gene polymorphism rs3838284. Methods: CASQ1 protein was measured by quantitative Western blotting, and protein concentrations were expressed as pmol/mg total protein by reference to CASQ1 standards. Results: Western blot analysis showed the presence of two forms of CASQ1 in the thyroid. The mean concentration of CASQ1 protein was significantly reduced in thyroid tissue from patients with Graves’ disease compared to thyroid from control subjects with multinodular goitre or thyroid cancer. Although the concentration was lower in patients with GO than in patients with GH, this difference was not significant. Reduced CASQ1 in Graves’ thyroid correlated with the homozygous genotype of the rs3838284 CASQ1 polymorphism. Conclusions: The decrease in thyroidal CASQ1 in patients with Graves’ disease compared to control subjects is unexplained, but may reflect consumption of the protein during an autoimmune reaction against CASQ1 in the thyroid.
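    As a rough illustration of how band densities can be converted to pmol CASQ1 per mg total protein by reference to purified standards, here is a minimal sketch; the standard amounts, densitometry values, and loading below are invented for illustration, since the paper's calibration details are not given in the abstract.

```python
# Minimal sketch: quantifying CASQ1 from a western blot against purified standards.
# All numbers are invented for illustration; the study's calibration is not reproduced here.
import numpy as np

# Densitometry of CASQ1 standard lanes with known amounts loaded -> linear standard curve.
standard_pmol = np.array([0.5, 1.0, 2.0, 4.0])              # pmol CASQ1 per lane (assumed)
standard_density = np.array([120.0, 230.0, 460.0, 900.0])   # arbitrary densitometry units
slope, intercept = np.polyfit(standard_pmol, standard_density, 1)

def casq1_concentration(band_density: float, total_protein_mg: float) -> float:
    """Interpolate a sample band on the standard curve; return pmol CASQ1 per mg total protein."""
    pmol = (band_density - intercept) / slope
    return pmol / total_protein_mg

# Example: a thyroid lysate lane loaded with 0.05 mg total protein
print(casq1_concentration(band_density=300.0, total_protein_mg=0.05))
```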

    Foreign adopted children are a source of methicillin-resistant Staphylococcus aureus transmission to countries with low prevalence

    We report a 13.0% prevalence of methicillin-resistant Staphylococcus aureus (MRSA) carriage in foreign adopted children, who are frequently hospitalized within the first year after arrival. Hospitalization in the country of origin and special-needs status were not significant risk factors for MRSA colonization. Healthcare workers are overrepresented among the adoptive parents. These children represent a potential source of MRSA transmission into the healthcare system.

    Ecological effects of selective decontamination on resistant gram-negative bacterial colonization.

    RATIONALE: Selective digestive tract decontamination (SDD) and selective oropharyngeal decontamination (SOD) eradicate gram-negative bacteria (GNB) from the intestinal and respiratory tracts of intensive care unit (ICU) patients, but their effect on antibiotic resistance remains controversial. OBJECTIVES: We quantified the effects of SDD and SOD on bacterial ecology in 13 ICUs that participated in a study in which SDD, SOD, or standard care was used during consecutive periods of 6 months (de Smet AM, Kluytmans JA, Cooper BS, Mascini EM, Benus RF, van der Werf TS, van der Hoeven JG, Pickkers P, Bogaers-Hofman D, van der Meer NJ, et al. N Engl J Med 2009;360:20-31). METHODS: Point prevalence surveys of rectal and respiratory samples were performed once monthly in all ICU patients (whether or not they received SOD/SDD). The effects of SDD on rectal carriage of GNB, and of SDD/SOD on respiratory tract carriage, were determined by comparing results from consecutive point prevalence surveys during the intervention (6 mo for SDD and 12 mo for SDD/SOD) with consecutive point prevalence data from the pre- and postintervention periods. MEASUREMENTS AND MAIN RESULTS: During SDD, the average proportions of patients with intestinal colonization by GNB resistant to ceftazidime, tobramycin, or ciprofloxacin were 5%, 7%, and 7%, respectively, and increased to 15%, 13%, and 13% postintervention (P < 0.05). During SDD/SOD, resistance levels in the respiratory tract were no more than 6% for all three antibiotics, but increased gradually during the intervention (for ceftazidime; P < 0.05 for trend) and rose to 10% or more for all three antibiotics postintervention (P < 0.05). CONCLUSIONS: SOD and SDD have marked effects on the bacterial ecology of an ICU, with rising ceftazidime resistance prevalence in the respiratory tract during the intervention and a considerable rebound of ceftazidime resistance in the intestinal tract after discontinuation of SDD.
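    The comparison behind these figures is essentially one of colonization proportions from repeated point prevalence surveys. The sketch below shows one way such a period-to-period comparison could be run; the counts are invented, and the study's actual statistical methods may differ.

```python
# Sketch: comparing resistance prevalence between study periods from point prevalence surveys.
# Counts are invented for illustration; the study's actual analysis may differ.
from scipy import stats

# (patients colonized with ceftazidime-resistant GNB, patients surveyed) per period -- assumed counts
during_sdd = (10, 200)         # ~5% during SDD
post_intervention = (30, 200)  # ~15% postintervention

# 2x2 table: colonized vs not colonized, during vs after the intervention
table = [
    [during_sdd[0], during_sdd[1] - during_sdd[0]],
    [post_intervention[0], post_intervention[1] - post_intervention[0]],
]
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"prevalence during SDD: {during_sdd[0] / during_sdd[1]:.1%}, "
      f"postintervention: {post_intervention[0] / post_intervention[1]:.1%}, p = {p:.3f}")
```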

    Decontamination of the digestive tract and oropharynx in ICU patients.

    BACKGROUND: Selective digestive tract decontamination (SDD) and selective oropharyngeal decontamination (SOD) are infection-prevention measures used in the treatment of some patients in intensive care, but reported effects on patient outcome are conflicting. METHODS: We evaluated the effectiveness of SDD and SOD in a crossover study using cluster randomization in 13 intensive care units (ICUs), all in The Netherlands. Patients with an expected duration of intubation of more than 48 hours or an expected ICU stay of more than 72 hours were eligible. In each ICU, three regimens (SDD, SOD, and standard care) were applied in random order over the course of 6 months. Mortality at day 28 was the primary end point. SDD consisted of 4 days of intravenous cefotaxime and topical application of tobramycin, colistin, and amphotericin B in the oropharynx and stomach. SOD consisted of oropharyngeal application only of the same antibiotics. Monthly point-prevalence studies were performed to analyze antibiotic resistance. RESULTS: A total of 5939 patients were enrolled in the study, with 1990 assigned to standard care, 1904 to SOD, and 2045 to SDD; crude mortality in the groups at day 28 was 27.5%, 26.6%, and 26.9%, respectively. In a random-effects logistic-regression model with age, sex, Acute Physiology and Chronic Health Evaluation (APACHE II) score, intubation status, and medical specialty used as covariates, odds ratios for death at day 28 in the SOD and SDD groups, as compared with the standard-care group, were 0.86 (95% confidence interval [CI], 0.74 to 0.99) and 0.83 (95% CI, 0.72 to 0.97), respectively. CONCLUSIONS: In an ICU population in which the mortality rate associated with standard care was 27.5% at day 28, the rate was reduced by an estimated 3.5 percentage points with SDD and by 2.9 percentage points with SOD. (Controlled Clinical Trials number, ISRCTN35176830.)
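    To see how the adjusted odds ratios translate into the quoted percentage-point reductions, a short back-of-the-envelope conversion can be done by treating the 27.5% standard-care mortality as the baseline risk. This is only an approximation that ignores the covariate adjustment of the trial's random-effects model.

```python
# Back-of-the-envelope: convert the adjusted odds ratios to approximate absolute risk reductions,
# using the 27.5% day-28 mortality under standard care as the baseline risk.
# This approximation ignores the covariate adjustment of the trial's random-effects model.

def risk_from_odds_ratio(baseline_risk: float, odds_ratio: float) -> float:
    """Apply an odds ratio to a baseline risk and convert the result back to a risk."""
    baseline_odds = baseline_risk / (1 - baseline_risk)
    new_odds = baseline_odds * odds_ratio
    return new_odds / (1 + new_odds)

baseline = 0.275
for label, odds_ratio in [("SOD", 0.86), ("SDD", 0.83)]:
    risk = risk_from_odds_ratio(baseline, odds_ratio)
    reduction_pp = (baseline - risk) * 100
    print(f"{label}: adjusted risk ~{risk:.1%}, reduction ~{reduction_pp:.1f} percentage points")
# Yields roughly 2.9 percentage points for SOD and 3.6 for SDD, consistent with the abstract.
```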