143 research outputs found

    Systemic inflammatory response syndrome in adult patients with nosocomial bloodstream infections due to enterococci

    BACKGROUND: Enterococci are the third leading cause of nosocomial bloodstream infection (BSI). Vancomycin-resistant enterococci (VRE) are common and pose treatment challenges; however, questions remain about VRE's pathogenicity and its direct clinical impact. This study analyzed the inflammatory response to enterococcal BSI, contrasting infections caused by vancomycin-resistant and vancomycin-susceptible isolates. METHODS: We performed a historical cohort study of 50 adults with enterococcal BSI to evaluate the associated systemic inflammatory response syndrome (SIRS) and mortality. We examined SIRS scores from 2 days prior through 14 days after the first positive blood culture. Vancomycin-resistant (n = 17) and vancomycin-susceptible (n = 33) infections were compared. Variables significant in univariate analysis were entered into a logistic regression model to determine their effect on mortality. RESULTS: 60% of BSI were caused by E. faecalis and 34% by E. faecium; 34% of the isolates were vancomycin resistant. Mean APACHE II (A2) score on the day of BSI was 16. Appropriate antimicrobials were begun within 24 hours in 52% of cases. Septic shock occurred in 62% and severe sepsis in an additional 18%. The incidence of organ failure was as follows: respiratory 42%, renal 48%, hematologic 44%, hepatic 26%. Crude mortality was 48%. Progression to septic shock was associated with death (OR 14.9, p < .001). There was no difference in A2 scores on days -2, -1, and 0 between the VRE and VSE groups. Maximal SIRS (severe sepsis, septic shock, or death) was seen on day 2 for VSE BSI vs. day 8 for VRE BSI. No significant difference was noted in the incidence of organ failure or in 7-day or overall mortality between the two groups. Univariate analysis revealed that A2 > 18 at BSI onset and respiratory, cardiovascular, renal, hematologic, and hepatic failure were associated with death, but time to appropriate therapy > 24 hours, age, and infection due to VRE were not. Multivariate analysis revealed that hematologic failure (OR 8.4, p = .025) and cardiovascular failure (OR 7.5, p = .032) independently predicted death. CONCLUSION: In patients with enterococcal BSI, (1) the incidence of septic shock and organ failure is high, (2) patients with VRE BSI are not more acutely ill prior to infection than those with VSE BSI, and (3) the development of hematologic or cardiovascular failure independently predicts death.
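A note on the statistics used in the abstract above: for a logistic regression model with a single binary predictor, the fitted coefficient is exactly the log of the familiar 2×2-table odds ratio. A minimal sketch in Python, with invented counts purely for illustration (the abstract reports only the resulting odds ratios, not the raw tables):

```python
from math import exp, log

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 outcome table:
    a = exposed & died, b = exposed & survived,
    c = unexposed & died, d = unexposed & survived."""
    return (a / b) / (c / d)

# Hypothetical counts for 50 patients (invented for illustration; the
# study reports only the fitted odds ratios, not the underlying table).
a, b, c, d = 8, 4, 6, 32
or_ = odds_ratio(a, b, c, d)   # (8/4) / (6/32) = 10.67

# In a logistic model logit(p) = b0 + b1*x with one binary predictor x,
# the maximum-likelihood coefficient b1 equals log(OR), so reported ORs
# are simply exp(b1) from the fitted model.
b1 = log(or_)
print(round(or_, 2), round(exp(b1), 2))  # → 10.67 10.67
```

This is why univariate screening on 2×2 tables and a subsequent logistic model (as in the study design above) agree exactly in the single-predictor case; the multivariate ORs differ only because the model adjusts each predictor for the others.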

    Do medical students copy the drug treatment choices of their teachers or do they think for themselves?

    PURPOSE: Although the importance of rational prescribing is generally accepted, the teaching of pharmacotherapy to undergraduate medical students is still unsatisfactory. Because clinical teachers are an important role model for medical students, it is of interest to know whether this extends to therapeutic decision-making. The aim of this study was to find out which factors contribute to the drug choices made by medical students and their teachers (general practitioners and clinical specialists). METHODS: Final-year medical students (n = 32), and general practitioners (n = 29), lung specialists (n = 26), orthopaedic surgeons (n = 24), and internists (n = 24) serving as medical teachers from all eight medical schools in the Netherlands participated in the study. They were asked to prescribe treatment (drug or otherwise) for uncomplicated (A) and complicated (B) written patient cases and to indicate which factors influenced their choice of treatment, using a list of factors reported in the literature to influence drug prescribing. RESULTS: Final-year medical students primarily based their drug choice on the factors 'effectiveness of the drugs' and 'examples from medical teachers'. In contrast, clinical teachers primarily based their drug choice on the factors 'clinical experience', 'effectiveness of the drugs', 'side effects of the drugs', 'standard treatment guidelines', and 'scientific literature'. CONCLUSIONS: Medical teachers would appear to base their drug choice mainly on clinical experience and drug-related factors, whereas final-year medical students base their drug choice mainly on examples provided by their medical teachers. It is essential that medical teachers clearly explain to their students how they arrive at a specific choice of medication since medical students tend to copy the therapeutic drug choices from their teachers, mainly because of a lack of experience. 
Presenting students with clinical therapeutic problems early during undergraduate training will not only give them a chance to gain experience in solving medical problems but will also give meaning to what they are studying, as opposed to merely reproducing what they learn or copying what they are told.

    Does vancomycin prescribing intervention affect vancomycin-resistant enterococcus infection and colonization in hospitals? A systematic review

    BACKGROUND: Vancomycin-resistant enterococcus (VRE) is a major cause of nosocomial infections in the United States and may be associated with greater morbidity, mortality, and healthcare costs than vancomycin-susceptible enterococcus. Current guidelines for the control of VRE include prudent use of vancomycin. While vancomycin exposure appears to be a risk factor for VRE acquisition in individual patients, the effect of vancomycin usage at the population level is not known. We conducted a systematic review to determine the impact of reducing vancomycin use through prescribing interventions on the prevalence and incidence of VRE colonization and infection in hospitals within the United States. METHODS: To identify relevant studies, we searched three electronic databases and hand-searched selected journals. Thirteen studies from 12 articles met our inclusion criteria. Data were extracted and summarized for study setting, design, patient characteristics, types of intervention(s), and outcome measures. The relative risk, 95% confidence interval, and p-value associated with the change in VRE acquisition pre- and post-intervention were calculated and compared. Heterogeneity in study results was formally explored by stratified analysis. RESULTS: No randomized clinical trials on this topic were found. Each of the 13 included studies used a quasi-experimental design of low hierarchy. Seven of the 13 studies reported statistically significant reductions in VRE acquisition following interventions, three studies reported no significant change, and three studies reported increases in VRE acquisition, one of which reached statistical significance. Results ranged from a reduction of 82.5% to an increase of 475%. Studies of specific wards, which included sicker patients, were more likely to report positive results than studies of an entire hospital including general inpatients (Fisher's exact test, p = 0.029). The type of intervention, endemicity status, type of study design, and duration of intervention were not found to significantly modify the results. Among the six studies that implemented vancomycin reduction strategies as the sole intervention, two of six (33%) found a significant reduction in VRE colonization and/or infection. In contrast, among studies implementing additional VRE control measures, five of seven (71%) reported a significant reduction. CONCLUSION: It was not possible to conclusively determine a role for vancomycin usage reductions in controlling VRE colonization and infection in hospitals in the United States. The effectiveness of such interventions and their sustainability remain poorly defined because of the heterogeneity and quality of the available studies. Future research using high-quality study designs and implementing vancomycin reduction as the sole intervention is needed to answer this question.
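Fisher's exact test, as used above to compare ward-level and hospital-wide studies, sums hypergeometric probabilities over all 2×2 tables with the same margins. The abstract does not give the ward/hospital split, so the table below (6 of 7 ward-level studies positive vs. 1 of 6 hospital-wide studies) is a hypothetical split chosen only to be consistent with the reported totals of 13 studies and 7 positive results:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed one."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d
    def p(x):  # P(top-left cell = x) under the hypergeometric distribution
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)
    p_obs = p(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    # small tolerance guards against float rounding when p(x) == p_obs
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs * (1 + 1e-9))

# Hypothetical split: rows = ward-level vs hospital-wide studies,
# columns = significant reduction vs not (counts assumed, not from the paper).
p_value = fisher_exact_two_sided(6, 1, 1, 5)
print(round(p_value, 3))  # → 0.029
```

With only 13 studies, the exact test is the appropriate choice here; a chi-squared approximation would be unreliable at these cell counts.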

    The inheritance of seed dormancy in Sinapis arvensis L

    Selection for dormant and non-dormant seed in Sinapis arvensis was carried to the seventh and fourteenth generations, respectively. Crosses between the dormant and non-dormant lines clearly showed both a maternal and an embryonic component of seed dormancy. A model for the number of alleles controlling dormancy was constructed and tested. The maternal component of dormancy was shown to be controlled by a single locus with two alleles, the dormant allele being dominant to the non-dormant. No clear picture of the control of the embryonic component of dormancy was found.

    Limited Vegetation Development on a Created Salt Marsh Associated with Over-Consolidated Sediments and Lack of Topographic Heterogeneity

    Restored salt marshes frequently lack the full range of plant communities present on reference marshes, with upper marsh species underrepresented. This often results from sites being too low in the tidal frame and/or poorly drained with anoxic sediments. A managed coastal realignment scheme at Abbotts Hall, Essex, UK, has oxic sediments at elevations at which upper marsh communities would be expected. However, 7 years after flooding, it continued to be dominated by pioneer communities, with substantial proportions of bare ground, so other factors must hinder vegetation development at these elevations. The poorly vegetated areas had high sediment shear strength, low water and organic carbon content, and very flat topography. These characteristics occur frequently on the upper parts of created marshes. Experimental work is required to establish causal links with the ecological differences, but other studies have also reported that reduced plant β-diversity and lower usage by fish are associated with topographic uniformity. Uniformity also leads to a very different visual appearance from natural marshes. In the upper intertidal zone, sediment deposition rates are slow, water velocities are low, and erosive forces are weak, so topographic heterogeneity cannot develop naturally, even if creeks have been excavated. Without active management, these conditions will persist indefinitely.

    Partner randomized controlled trial: study protocol and coaching intervention

    <p>Abstract</p> <p>Background</p> <p>Many children with asthma live with frequent symptoms and activity limitations, and visits for urgent care are common. Many pediatricians do not regularly meet with families to monitor asthma control, identify concerns or problems with management, or provide self-management education. Effective interventions to improve asthma care such as small group training and care redesign have been difficult to disseminate into office practice.</p> <p>Methods and design</p> <p>This paper describes the protocol for a randomized controlled trial (RCT) to evaluate a 12-month telephone-coaching program designed to support primary care management of children with persistent asthma and subsequently to improve asthma control and disease-related quality of life and reduce urgent care events for asthma care. Randomization occurred at the practice level, with eligible families within a practice having access to the coaching program or to usual care. The coaching intervention was based on the transtheoretical model of behavior change. Targeted behaviors included 1) effective use of controller medications, 2) effective use of rescue medications, and 3) monitoring to ensure optimal control. Trained lay coaches provided parents with education and support for asthma care, tailoring the information provided and the frequency of contact to the parent's readiness to change their child's day-to-day asthma management. Coaching calls varied in frequency from weekly to monthly. For each participating family, follow-up measurements were obtained at 12 and 24 months after enrollment in the study during a telephone interview.</p> <p>The primary outcomes were the mean change in 1) the child's asthma control score, 2) the parent's quality of life score, and 3) the number of urgent care events assessed at 12 and 24 months. Secondary outcomes reflected adherence to guideline recommendations by the primary care pediatricians and included the proportion of children prescribed controller medications, having maintenance care visits at least twice a year, and having an asthma action plan. Cost-effectiveness of the intervention was also measured.</p> <p>Discussion</p> <p>Twenty-two practices (66 physicians) were randomized (11 per treatment group), and 950 families with a child 3-12 years old with persistent asthma were enrolled. A description of the coaching intervention is presented.</p> <p>Trial registration</p> <p>ClinicalTrials.gov identifier <a href="http://www.clinicaltrials.gov/ct2/show/NCT00860834">NCT00860834</a>.</p>

    Low back pain status in elite and semi-elite Australian football codes: a cross-sectional survey of football (soccer), Australian rules, rugby league, rugby union and non-athletic controls

    <p>Abstract</p> <p>Background</p> <p>Our understanding of the effects of football code participation on low back pain (LBP) is limited. It is unclear whether LBP is more prevalent in athletic populations or differs between levels of competition. Thus, it was the aim of this study to document and compare the prevalence, intensity, quality and frequency of LBP between elite and semi-elite male Australian football code participants and a non-athletic group.</p> <p>Methods</p> <p>A cross-sectional survey of elite and semi-elite male Australian football code participants and a non-athletic group was performed. Participants completed a self-reported questionnaire incorporating the Quadruple Visual Analogue Scale (QVAS) and the McGill Pain Questionnaire (short form) (MPQ-SF), along with additional questions adapted from an Australian epidemiological study. Respondents were 271 elite players (mean age 23.3, range 17–39), 360 semi-elite players (mean age 23.8, range 16–46) and 148 non-athletic controls (mean age 23.9, range 18–39).</p> <p>Results</p> <p>Groups were matched for age (p = 0.42) and experienced the same age of first onset of LBP (p = 0.40). A significant linear increase in LBP from the non-athletic group to the semi-elite and elite groups was evident for both the QVAS and the MPQ-SF (p < 0.001). Elite subjects were more likely to experience more frequent LBP (daily or weekly; OR 1.77, 95% CI 1.29–2.42) and more severe LBP (discomforting or greater; OR 1.75, 95% CI 1.29–2.38).</p> <p>Conclusion</p> <p>Footballers in Australia have significantly more severe and frequent LBP than a non-athletic group, and this escalates with the level of competition.</p>

    Rapid evolution of microbe-mediated protection against pathogens in a worm host.

    Microbes can defend their host against virulent infections, but direct evidence for the adaptive origin of microbe-mediated protection is lacking. Using experimental evolution of a novel, tripartite interaction, we demonstrate that mildly pathogenic bacteria (Enterococcus faecalis) living in worms (Caenorhabditis elegans) rapidly evolved to defend their animal hosts against infection by a more virulent pathogen (Staphylococcus aureus), crossing the parasitism-mutualism continuum. Host protection evolved in all six independently selected populations in response to within-host bacterial interactions and without direct selection for host health. Microbe-mediated protection was also effective against a broad spectrum of pathogenic S. aureus isolates. Genomic analysis implied that the mechanistic basis for E. faecalis-mediated protection was increased production of antimicrobial superoxide, which was confirmed by biochemical assays. Our results indicate that microbes living within a host may make the evolutionary transition to mutualism in response to pathogen attack, and that microbiome evolution warrants consideration as a driver of infection outcome.

    Ecological networks: Pursuing the shortest path, however narrow and crooked

    Representing data as networks cuts across all sub-disciplines in ecology and evolutionary biology. Besides providing a compact representation of the interconnections between agents, network analysis allows the identification of especially important nodes, according to various metrics that often rely on the calculation of the shortest paths connecting any two nodes. While the interpretation of a shortest path is straightforward in binary, unweighted networks, whenever weights are reported the calculation can yield unexpected results. We analyzed 129 studies of ecological networks published in the last decade that use shortest paths, and discovered a methodological inaccuracy related to the edge weights used to calculate shortest paths (and related centrality measures), particularly in interaction networks. Specifically, 49% of the studies do not report sufficient information on the calculation to allow their replication, and 61% of the studies on weighted networks may contain errors in how shortest paths are calculated. Using toy models and empirical ecological data, we show how to transform the data prior to calculation and illustrate the pitfalls that need to be avoided. We conclude by proposing a five-point checklist to foster best practices in the calculation and reporting of centrality measures in ecology and evolution studies.

    The last two decades have witnessed an exponential increase in the use of graph analysis in ecological and conservation studies (see refs. 1,2 for recent introductions to network theory in ecology and evolution). Networks (graphs) represent agents as nodes linked by edges representing pairwise relationships. For instance, a food web can be represented as a network of species (nodes) and their feeding relationships (edges) 3. Similarly, the spatial dynamics of a metapopulation can be analyzed by connecting the patches of suitable habitat (nodes) with edges measuring dispersal between patches 4. Data might either simply report the presence/absence of an edge (binary, unweighted networks) or provide a strength for each edge (weighted networks). In turn, these weights can represent a variety of ecologically relevant quantities, depending on the system being described. For instance, edge weights can quantify interaction frequency (e.g., visitation networks 5), interaction strength (e.g., the per-capita effect of one species on the growth rate of another 3), carbon flow between trophic levels 6, genetic similarity 7, niche overlap (e.g., the number of shared resources between two species 8), affinity 9, dispersal probabilities (e.g., the rate at which individuals of a population move between patches 10), or the cost of dispersal between patches (e.g., resistance 11). Despite this large variety of ecological network representations, a common task is the identification of nodes of high importance, such as keystone species in a food web, patches acting as stepping stones in a dispersal network, or genes with pleiotropic effects. The identification of important nodes is typically accomplished through centrality measures 5,12. Many centrality measures have been proposed, each probing complementary aspects of node-to-node relationships 13. For instance, closeness centrality 14,15 highlights nodes that are "near" to all other nodes.
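The pitfall described above, feeding interaction strengths directly into a shortest-path algorithm that treats weights as costs, can be sketched with a minimal Dijkstra implementation. The three-node network and its weights below are invented for illustration; the key point is that strong links must be transformed into short distances (here via 1/w) before the calculation:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest-path lengths on a weighted graph.
    graph: {node: {neighbour: edge_weight_as_cost}}"""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical visitation network: weights are interaction FREQUENCIES,
# so a larger weight means a STRONGER link, not a longer path.
freq = {
    "A": {"B": 10.0, "C": 1.0},
    "B": {"A": 10.0, "C": 1.0},
    "C": {"A": 1.0, "B": 1.0},
}

# Pitfall: using frequencies as costs makes the strongest link (A-B,
# w = 10) look like the most expensive edge, so the "shortest" path
# from A to B detours through the weak links via C (cost 1 + 1 = 2).
naive = dijkstra(freq, "A")

# Correct approach: transform strengths into distances (one common
# choice is 1/w) so that strong interactions give short paths.
dist_graph = {u: {v: 1.0 / w for v, w in nbrs.items()}
              for u, nbrs in freq.items()}
transformed = dijkstra(dist_graph, "A")

print(naive["B"], transformed["B"])  # → 2.0 0.1
```

The two runs disagree on both the path and its length, which is exactly the kind of silent error the checklist in the paper is meant to catch; the same transformation question applies to any centrality measure built on shortest paths (betweenness, closeness).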

    Vesicular Stomatitis Virus-Based Ebola Vaccine Is Well-Tolerated and Protects Immunocompromised Nonhuman Primates

    Ebola virus (EBOV) is a significant human pathogen that presents a public health concern as an emerging/re-emerging virus and as a potential biological weapon. Substantial progress has been made over the last decade in developing candidate preventive vaccines that can protect nonhuman primates against EBOV. Among these prospects, a vaccine based on recombinant vesicular stomatitis virus (VSV) is particularly robust, as it can also confer protection when administered as a postexposure treatment. A concern that has been raised regarding the replication-competent VSV vectors that express EBOV glycoproteins is how these vectors would be tolerated by individuals with altered or compromised immune systems, such as patients infected with HIV. This is especially important as all EBOV outbreaks to date have occurred in areas of Central and Western Africa with high HIV incidence rates in the population. In order to address this concern, we evaluated the safety of the recombinant VSV vector expressing the Zaire ebolavirus glycoprotein (VSVΔG/ZEBOVGP) in six rhesus macaques infected with simian-human immunodeficiency virus (SHIV). All six animals showed no evidence of illness associated with the VSVΔG/ZEBOVGP vaccine, suggesting that this vaccine may be safe in immunocompromised populations. While one goal of the study was to evaluate the safety of the candidate vaccine platform, it was also of interest to determine if altered immune status would affect vaccine efficacy. The vaccine protected 4 of 6 SHIV-infected macaques from death following ZEBOV challenge. Evaluation of CD4+ T cells in all animals showed that the animals that succumbed to lethal ZEBOV challenge had the lowest CD4+ counts, suggesting that CD4+ T cells may play a role in mediating protection against ZEBOV.