
    Herbivory and time since flowering shape floral rewards and pollinator-pathogen interactions

    Herbivory can induce chemical changes throughout plant tissues, including flowers, which could affect pollinator-pathogen interactions. Pollen is highly defended compared to nectar, but no study has examined whether herbivory affects pollen chemistry. We assessed the effects of leaf herbivory on nectar and pollen alkaloids in Nicotiana tabacum, and how herbivory-induced changes in nectar and pollen affect pollinator-pathogen interactions. We damaged leaves of Nicotiana tabacum using the specialist herbivore Manduca sexta and compared nicotine and anabasine concentrations in nectar and pollen. We then pooled nectar and pollen by collection period (within and after one month of flowering), fed them in separate experiments to bumble bees (Bombus impatiens) infected with the gut pathogen Crithidia bombi, and assessed infections after seven days. We did not detect alkaloids in nectar, and leaf damage did not alter the effect of nectar on Crithidia counts. In pollen, herbivory induced higher concentrations of anabasine but not nicotine, and alkaloid concentrations rose and then fell as a function of days since flowering. Bees fed pollen from damaged plants had Crithidia counts 15 times higher than bees fed pollen from undamaged plants, but only when pollen was collected after one month of flowering, indicating that both damage and time since flowering affected interaction outcomes. Within undamaged treatments, bees fed late-collected pollen had Crithidia counts 10 times lower than bees fed early-collected pollen, also indicating the importance of time since flowering. Our results emphasize the role of herbivores in shaping pollen chemistry, with consequences for interactions between pollinators and their pathogens.

    Beyond 30 days: Does limiting the duration of surgical site infection follow-up limit detection?

    Concern over the consistency and completeness of surgical site infection (SSI) surveillance has increased due to public reporting of hospital SSI rates and imminent non-payment rules for hospitals that do not meet national benchmarks. Already, hospitals no longer receive additional payment from the Centers for Medicare & Medicaid Services (CMS) for certain infections following coronary artery bypass graft (CABG) surgery, orthopedic procedures, and bariatric surgery. One major concern is incomplete and differential post-discharge surveillance. At present, substantial variation exists in how, and whether, hospitals identify SSI events after the hospitalization in which the surgery occurred. Parameters used for SSI surveillance, such as the duration of the post-procedure surveillance window, can affect the completeness of surveillance data. Determining the optimal surveillance period involves balancing the increased case ascertainment of a longer follow-up period against the additional resources it would require. Currently, the time window for identifying potentially preventable SSIs related to events at the time of surgery is not fully standardized. The Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) requires a 365-day postoperative surveillance period for procedures involving implants and a 30-day period for non-implant procedures. In contrast, the National Surgical Quality Improvement Program (NSQIP) and the Society of Thoracic Surgeons (STS) systems employ 30-day postoperative surveillance regardless of implant use. As consensus builds towards national quality measures for hospital-specific SSI rates, it will be important to assess the frequency of events beyond the 30-day post-surgical window, both to quantify the value of various surveillance durations and, ultimately, to inform the choice of specific outcome measures.

    Enhanced surgical site infection surveillance following hysterectomy, vascular, and colorectal surgery

    Objective. To evaluate the use of inpatient pharmacy and administrative data to detect surgical site infections (SSIs) following hysterectomy and colorectal and vascular surgery. Design. Retrospective cohort study. Setting. Five hospitals affiliated with academic medical centers. Patients. Adults who underwent abdominal or vaginal hysterectomy, colorectal surgery, or vascular surgery procedures between July 1, 2003, and June 30, 2005. Methods. We reviewed the medical records of weighted, random samples drawn from 3,079 abdominal and vaginal hysterectomy, 4,748 colorectal surgery, and 3,332 vascular surgery procedures. We compared routine surveillance with screening of inpatient pharmacy data and diagnosis codes and then performed medical record review to confirm SSI status. Results. Medical records from 823 hysterectomy, 736 colorectal surgery, and 680 vascular surgery procedures were reviewed. SSI rates determined by antimicrobial- and/or diagnosis code-based screening followed by medical record review (enhanced surveillance) were substantially higher than rates determined by routine surveillance (4.3% [95% confidence interval, 3.6%–5.1%] vs 2.7% for hysterectomies, 7.1% [95% confidence interval, 6.7%–8.2%] vs 2.0% for colorectal procedures, and 2.3% [95% confidence interval, 1.9%–2.9%] vs 1.4% for vascular procedures). Enhanced surveillance had substantially higher sensitivity than routine surveillance for detecting SSI (92% vs 59% for hysterectomies, 88% vs 22% for colorectal procedures, and 72% vs 43% for vascular procedures). Medical record review confirmed SSI for 31% of hysterectomies, 20% of colorectal procedures, and 31% of vascular procedures that met the enhanced screening criteria. Conclusion. Antimicrobial- and diagnosis code-based screening may be a useful method for enhancing and streamlining SSI surveillance for a variety of surgical procedures, including those targeted by the Centers for Medicare and Medicaid Services.
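    The sensitivity and confirmation figures reported above follow from standard screening arithmetic: sensitivity is the share of all confirmed SSIs that a surveillance method detects, and the confirmation rate is the positive predictive value of the screen. A minimal sketch of that arithmetic (function name and the example counts are hypothetical, for illustration only, not from the study):

```python
def screening_metrics(true_positives: int, false_negatives: int,
                      screen_positives: int) -> tuple[float, float]:
    """Sensitivity and positive predictive value (PPV) of a screening rule.

    sensitivity = TP / (TP + FN)   -- fraction of confirmed cases detected
    PPV         = TP / screen_pos  -- fraction of screen-positives confirmed

    Illustrative helper only; not code from the study.
    """
    sensitivity = true_positives / (true_positives + false_negatives)
    ppv = true_positives / screen_positives
    return sensitivity, ppv


# Hypothetical counts: 92 confirmed SSIs detected, 8 missed, 300 records
# flagged by the antimicrobial/diagnosis-code screen.
sens, ppv = screening_metrics(92, 8, 300)
```

    With such counts, a screen can have high sensitivity (here 0.92) while only a minority of flagged records (here about 0.31) are confirmed on chart review, which is exactly the pattern the study reports for enhanced surveillance.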

    Interaction between socioeconomic deprivation and likelihood of pre‐emptive transplantation: influence of competing risks and referral characteristics – a retrospective study

    Socioeconomic deprivation (SED) influences the likelihood of pre‐emptive kidney transplantation (PET), but the mechanisms behind this are unclear. We explored the relationships between SED and patient characteristics at referral that might explain this discrepancy. A retrospective cohort study was performed. SED was measured by the Scottish Index of Multiple Deprivation (SIMD). Logistic regression evaluated predictors of PET. A competing risks survival analysis evaluated the interaction between SED and progression to end‐stage kidney disease (ESKD) and death. Of 7765 patients with follow‐up of 5.69 ± 6.52 years, 1298 developed ESKD requiring renal replacement therapy; 113 received PET, 64 of which were from live donors. Patients receiving PET were “less deprived”, with higher SIMD scores (5 ± 7 vs. 4 ± 5; P = 0.003). This appeared independent of overall comorbidity burden. SED was associated with a higher risk of death but not of ESKD. A higher SIMD decile was associated with a higher likelihood of PET (OR 1.14, 95% CI 1.06–1.23); the presence of diabetes and malignancy also reduced the likelihood of PET. SED was associated with reduced likelihood of PET after adjustment for baseline comorbidity, and this was not explained by risk of death or faster progression to ESKD. Education and outreach into transplantation should be augmented in areas with higher deprivation.

    From plant fungi to bee parasites: mycorrhizae and soil nutrients shape floral chemistry and bee pathogens

    Bee populations have experienced declines in recent years, due in part to increased disease incidence. Multiple factors influence bee-pathogen interactions, including nectar and pollen quality and secondary metabolites. However, we lack an understanding of how plants' interactions with their environment shape bee diet quality. We examined how plant interactions with the belowground environment alter floral rewards and, in turn, bee-pathogen interactions. Soil-dwelling arbuscular mycorrhizal fungi (AMF) are considered plant mutualists, although the outcome of the relationship depends on environmental conditions such as nutrient availability. In a 2 × 2 factorial design, we asked whether mycorrhizal fungi and nutrients affect concentrations of the nectar and pollen alkaloids anabasine and nicotine, previously shown to reduce infection by the gut pathogen Crithidia in the native bumblebee Bombus impatiens. To ask how these plant interactions affect this common bee pathogen, we fed pollen and nectar from our treatment plants, and from a wildflower pollen control with artificial nectar, to bees infected with Crithidia. Mycorrhizal fungi and fertilizer both influenced flowering phenology and floral chemistry. While we found no anabasine or nicotine in nectar, high fertilizer increased anabasine and nicotine in pollen. AMF decreased nicotine concentrations, and the reduction due to AMF was stronger under high- than low-nutrient conditions. AMF and nutrients also had interactive effects on bee pathogens via changes in nectar and pollen. High fertilizer reduced Crithidia cell counts relative to low fertilizer in AMF plants, but increased Crithidia in non-AMF plants. These results did not correspond with the effects of fertilizer and AMF on pollen alkaloid concentrations, suggesting that other components of pollen or nectar were affected by the treatments and shaped pathogen counts. Our results indicate that the soil biotic and abiotic environment can alter bee-pathogen interactions via changes in floral rewards, and underscore the importance of integrative studies to predict disease dynamics and ecological outcomes.

    Implementing automated surveillance for tracking Clostridium difficile infection at multiple healthcare facilities

    Automated surveillance utilizing electronically available data has been found to be accurate and to save time. An automated Clostridium difficile infection (CDI) surveillance algorithm was validated at four CDC Prevention Epicenters hospitals. Electronic surveillance was highly sensitive and specific, and showed good to excellent agreement for hospital-onset; community-onset, study facility-associated; indeterminate; and recurrent CDI.

    Multicenter study of the impact of community-onset Clostridium difficile infection on surveillance for C. difficile infection

    OBJECTIVE: To evaluate the influence of community-onset/healthcare facility-associated cases on Clostridium difficile infection (CDI) incidence and outbreak detection. DESIGN: Retrospective cohort. SETTING: Five acute-care healthcare facilities in the United States. METHODS: Positive stool C. difficile toxin assays from July 2000 through June 2006 and healthcare facility exposure information were collected. CDI cases were classified as hospital-onset (HO) if they were diagnosed >48 hours after admission, or as community-onset/healthcare facility-associated if they were diagnosed ≤48 hours after admission and the patient had recently been discharged from the healthcare facility. Four surveillance definitions were compared: HO cases only, and HO plus community-onset/healthcare facility-associated cases diagnosed within 30 (HCFA-30), 60 (HCFA-60), and 90 (HCFA-90) days after discharge from the study hospital. Monthly CDI rates were compared, and control charts were used to identify potential CDI outbreaks. RESULTS: The HCFA-30 rate was significantly higher than the HO rate at two healthcare facilities (p < 0.01). The HCFA-30 rate was not significantly different from the HCFA-60 or HCFA-90 rate at any healthcare facility. The correlations between each healthcare facility's monthly rates of HO and HCFA-30 CDI were almost perfect (range, 0.94–0.99; p < 0.001). Overall, 12 time points had a CDI rate >3 SD above the mean: 11 by the HO definition and 9 by the HCFA-30 definition, with discordant results at 4 time points (κ = 0.794, p < 0.001). CONCLUSIONS: Tracking community-onset/healthcare facility-associated cases in addition to HO cases captures significantly more CDI cases, but surveillance of HO CDI alone is sufficient to detect an outbreak.
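    The HO and HCFA case definitions above amount to a simple classification rule over two quantities: time from admission to diagnosis, and time since the last discharge from the facility. A minimal sketch of that rule (function and parameter names are hypothetical, not the study's actual code):

```python
from typing import Optional


def classify_cdi(hours_since_admission: float,
                 days_since_last_discharge: Optional[float],
                 window_days: int = 30) -> str:
    """Classify a positive C. difficile toxin assay under the surveillance
    definitions described above (illustrative sketch; names are hypothetical).

    HO:            diagnosed >48 hours after admission.
    HCFA-<window>: diagnosed <=48 hours after admission AND discharged from
                   the facility within `window_days` before diagnosis.
    CO:            community-onset with no recent facility exposure.
    """
    if hours_since_admission > 48:
        return "HO"
    if (days_since_last_discharge is not None
            and days_since_last_discharge <= window_days):
        return f"HCFA-{window_days}"
    return "CO"
```

    The HCFA-30, HCFA-60, and HCFA-90 surveillance definitions compared in the abstract correspond to running this rule with `window_days` set to 30, 60, and 90, respectively, and counting HO plus HCFA cases.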

    The Relationship Between Fractures and DXA Measures of BMD in the Distal Femur of Children and Adolescents With Cerebral Palsy or Muscular Dystrophy

    Children with limited or no ability to ambulate frequently sustain fragility fractures. Joint contractures, scoliosis, hip dysplasia, and metallic implants often prevent reliable measures of bone mineral density (BMD) in the proximal femur and lumbar spine, where BMD is commonly measured. Further, the relevance of lumbar spine BMD to fracture risk in this population is questionable. In an effort to obtain bone density measures that are both technically feasible and clinically relevant, a technique was developed involving dual-energy X-ray absorptiometry (DXA) measures of the distal femur projected in the lateral plane. The purpose of this study was to test the hypothesis that these new measures of BMD correlate with fractures in children with limited or no ability to ambulate. The relationship between distal femur BMD Z-scores and fracture history was assessed in a cross-sectional study of 619 children aged 6 to 18 years with muscular dystrophy or moderate to severe cerebral palsy compiled from eight centers. There was a strong correlation between fracture history and BMD Z-scores in the distal femur; 35% to 42% of those with BMD Z-scores less than −5 had fractured, compared with 13% to 15% of those with BMD Z-scores greater than −1. Risk ratios were 1.06 to 1.15 (95% confidence interval 1.04–1.22), meaning a 6% to 15% increased risk of fracture with each 1.0 decrease in BMD Z-score. In clinical practice, DXA measurement of BMD in the distal femur is the technique of choice for the assessment of children with impaired mobility. © 2010 American Society for Bone and Mineral Research.
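    To make the reported per-unit risk ratios concrete: if risk compounds multiplicatively with each 1.0 decrease in Z-score (an illustrative assumption, not the study's published model; the function name is hypothetical), the relative risk implied by a larger BMD deficit can be sketched as:

```python
def fracture_risk_multiplier(rr_per_unit: float, z_decrease: float) -> float:
    """Relative fracture risk after a decrease of `z_decrease` in BMD Z-score,
    assuming the per-unit risk ratio compounds multiplicatively.
    Illustrative only; not the study's statistical model.
    """
    return rr_per_unit ** z_decrease


# With a per-unit risk ratio of 1.10 (midrange of the reported 1.06-1.15),
# a 4-point drop in Z-score compounds to roughly a 46% higher fracture risk.
risk = fracture_risk_multiplier(1.10, 4)
```

    Under that assumption, the difference between a child at Z = −1 and one at Z = −5 is consistent with the severalfold difference in fracture history the study observed between those groups.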

    Clinical trial of laronidase in Hurler syndrome after hematopoietic cell transplantation.

    Background. Mucopolysaccharidosis I (MPS IH; Hurler syndrome) is a lysosomal storage disease treated with hematopoietic cell transplantation (HCT), which stabilizes cognitive deterioration but is insufficient to alleviate all somatic manifestations. Intravenous laronidase improves somatic burden in attenuated MPS I. It is unknown whether laronidase can improve somatic disease following HCT in MPS IH. The objective of this study was to evaluate the effects of laronidase on somatic outcomes of patients with MPS IH previously treated with HCT. Methods. This 2-year open-label pilot study of laronidase included ten patients (age 5-13 years) who were at least 2 years post-HCT and donor engrafted. Outcomes were assessed semi-annually and compared to historic controls. Results. The two youngest participants had a statistically significant improvement in growth compared to controls. Development of persistent high-titer anti-drug antibodies (ADA) was associated with poorer 6-min walk test (6MWT) performance; when patients with high ADA titers were excluded, there was a significant improvement in the 6MWT in the remaining seven patients. Conclusions. Laronidase seemed to improve growth in participants <8 years old, and 6MWT performance in participants without ADA. Given the small number of patients treated in this pilot study, additional study is needed before definitive conclusions can be made.