
    Cattail Invasion of Sedge/Grass Meadows in Lake Ontario: Photointerpretation Analysis of Sixteen Wetlands over Five Decades

    Photointerpretation studies were conducted to evaluate vegetation changes in wetlands of Lake Ontario and the upper St. Lawrence River associated with regulation of water levels since about 1960. The studies used photographs from 16 sites (four each from drowned river mouth, barrier beach, open embayment, and protected embayment wetlands) and spanned a period from the 1950s to 2001 at roughly decadal intervals. Meadow marsh was the most prominent vegetation type in most wetlands in the late 1950s, when water levels had declined following high lake levels in the early 1950s. Meadow marsh increased at some sites in the mid-1960s in response to low lake levels and decreased at all sites in the late 1970s following a period of high lake levels. Typha increased at nearly all sites, except wave-exposed open embayments, in the 1970s. Meadow marsh continued to decrease and Typha to increase at most sites during sustained higher lake levels through the 1980s, 1990s, and into 2001. Most vegetation changes could be correlated with lake-level changes and with life-history strategies and physiological tolerances to water depth of prominent taxa. Analyses of GIS coverages demonstrated that much of the Typha invasion was landward into meadow marsh, largely by Typha × glauca. Lesser expansion toward open water included both T. × glauca and T. angustifolia. Although many models focus on the seed bank as a key component of vegetative change in wetlands, our results suggest that canopy-dominating, moisture-requiring Typha was able to invade meadow marsh at higher elevations because sustained higher lake levels allowed it to survive and overtake sedges and grasses that can tolerate periods of drier soil conditions.
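    The GIS analysis described above quantifies how much former meadow marsh was converted to Typha between photo dates. The sketch below is a minimal, hypothetical illustration of that kind of overlay using geopandas; the file names, attribute values, and CRS assumptions are ours, not the study's.

```python
# Minimal sketch (hypothetical data): quantify landward Typha expansion by
# intersecting vegetation polygons digitized from two photo dates.
import geopandas as gpd

# Assumed inputs: one polygon layer per photo date, each with a 'veg_type'
# attribute such as 'meadow_marsh' or 'typha', in a projected CRS (metres).
cover_1950s = gpd.read_file("wetland_1950s.shp")
cover_2001 = gpd.read_file("wetland_2001.shp")

meadow_then = cover_1950s[cover_1950s["veg_type"] == "meadow_marsh"]
typha_now = cover_2001[cover_2001["veg_type"] == "typha"]

# Area that was meadow marsh in the 1950s and Typha in 2001 = landward invasion.
invaded = gpd.overlay(meadow_then, typha_now, how="intersection")
print(f"Meadow marsh converted to Typha: {invaded.geometry.area.sum() / 1e4:.1f} ha")
```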

    6-hydroxydopamine-mediated release of norepinephrine increases faecal excretion of Salmonella enterica serovar Typhimurium in pigs

    Salmonella enterica serovar Typhimurium is an animal and zoonotic pathogen of worldwide importance. In pigs, transport and social stress are associated with reactivation and spread of Salmonella Typhimurium infection. The stress-related catecholamine norepinephrine (NE) has been reported to activate growth and virulence factor expression in Salmonella; however, the extent to which NE contributes to stress-associated salmonellosis is unclear. We studied the impact of releasing NE from endogenous stores during Salmonella Typhimurium infection of pigs by administration of 6-hydroxydopamine (6-OHDA), which selectively destroys noradrenergic nerve terminals. Treatment of pigs with 6-OHDA 7 or 16 days after oral inoculation with Salmonella Typhimurium produced elevated plasma NE levels and transiently, but significantly, increased faecal excretion of the challenge strain. Oral administration of NE to Salmonella Typhimurium-infected pigs also transiently and significantly increased shedding; however, pre-culture of the bacteria with NE did not alter the outcome of infection. Salmonella has been proposed to sense and respond to NE via a homologue of the adrenergic sensor kinase QseC. A ΔqseC mutant of Salmonella Typhimurium was consistently excreted in lower numbers than the parent strain after oral inoculation of pigs, though not significantly so. 6-OHDA treatment of pigs infected with the ΔqseC mutant also increased faecal excretion of the mutant strain, albeit to a lesser extent than observed upon 6-OHDA treatment of pigs infected with the parent strain. Our data support the notion that stress-related catecholamines modulate the interaction of enteric bacterial pathogens with their hosts.

    Increased Soil Frost Versus Summer Drought as Drivers of Plant Biomass Responses To Reduced Precipitation: Results from A Globally-Coordinated Field Experiment

    Reduced precipitation treatments are often used in field experiments to explore the effects of drought on plant productivity and species composition. However, in seasonally snow-covered regions, reduced precipitation also reduces snow cover, which can increase soil frost depth, decrease minimum soil temperatures and increase soil freeze-thaw cycles. Therefore, in addition to the effects of reduced precipitation on plants via drought, freezing damage to overwintering plant tissues at or below the soil surface could further affect plant productivity and relative species abundances during the growing season. We examined the effects of both reduced rainfall (via rain-out shelters) and reduced snow cover (via snow removal) at 13 sites globally (primarily grasslands) within the framework of the International Drought Experiment, a coordinated distributed experiment. Plant cover was estimated at the species level and aboveground biomass was quantified at the functional group level. Among sites, we observed a negative correlation between the snow removal effect on minimum soil temperature and plant biomass production in the next growing season. Three sites exhibited significant rain-out shelter effects on plant productivity, but there was no correlation among sites between the rain-out shelter effect on minimum soil moisture and plant biomass. There was no interaction between snow removal and rain-out shelters for plant biomass, although these two factors only exhibited significant effects simultaneously for a single site. Overall, our results reveal that reduced snowfall, when it decreases minimum soil temperatures, can be an important component of the total effect of reduced precipitation on plant productivity.
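    The among-site result reported above is essentially a correlation between two per-site treatment effects. A minimal sketch of that calculation is given below; the site values and the sign convention (cooling expressed as a positive magnitude) are our assumptions, not the experiment's data.

```python
# Minimal sketch (invented values): correlate the per-site snow-removal effect on
# minimum soil temperature with the per-site effect on next-season biomass.
import numpy as np
from scipy.stats import pearsonr

# One value per site (placeholder numbers): how much snow removal lowered the
# minimum soil temperature, and the treatment-minus-control biomass difference.
cooling_of_min_soil_temp_c = np.array([4.1, 2.3, 0.5, 3.0, 1.2, 0.1, 2.8])
delta_biomass_g_m2 = np.array([-120.0, -60.0, -5.0, -95.0, -30.0, 4.0, -80.0])

r, p = pearsonr(cooling_of_min_soil_temp_c, delta_biomass_g_m2)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```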

    Improving the practicality of using non-aversive handling methods to reduce background stress and anxiety in laboratory mice

    Handling can stimulate stress and anxiety in laboratory animals that negatively impacts welfare and introduces a confounding factor in many areas of research. Picking up mice by the tail is a major source of handling stress that results in strong aversion to the handler, while mice familiarised with being picked up in a tunnel or cupped on the open hand show low stress and anxiety, and actively seek interaction with their handlers. Here we investigate the duration and frequency of handling required for effective familiarisation with these non-aversive handling methods, and test whether this is sufficient to prevent aversion and anxiety when animals then experience immobilisation and a mild procedure (subcutaneous injection). Very brief handling (2 s) was sufficient to familiarise mice with tunnel handling, even when experienced only during cage cleaning. Brief but more frequent handling was needed for familiarisation with cup handling, while picking up by the tail induced strong aversion even when handling was brief and infrequent. Experience of repeated immobilisation and subcutaneous injection did not reverse the positive effects of tunnel handling. Our findings demonstrate that replacing tail handling with tunnel handling during routine cage cleaning and procedures provides a major refinement with little if any cost for familiarisation.

    Non-Hodgkin's lymphoma, obesity and energy homeostasis polymorphisms

    A population-based case–control study of lymphomas in England collected height and weight details from 699 non-Hodgkin's lymphoma (NHL) cases and 914 controls. Obesity, defined as a body mass index (BMI) over 30 kg m−2 five years before diagnosis, was associated with an increased risk of NHL (OR=1.5, 95% CI 1.1–2.1). The excess was most pronounced for diffuse large B-cell lymphoma (OR=1.9, 95% CI 1.3–2.8). Genetic variants in the leptin (LEP 19G>A, LEP −2548G>A) and leptin receptor genes (LEPR 223Q>R), previously shown to modulate NHL risk, as well as a polymorphism in the energy regulatory gene adiponectin (APM1 276G>T), were investigated. Findings varied with leptin genotype, the risks being decreased with LEP 19AA (OR=0.7, 95% CI 0.5–1.0) and increased with LEP −2548GA (OR=1.3, 95% CI 1.0–1.7) and −2548AA (OR=1.4, 95% CI 1.0–1.9), particularly for follicular lymphoma. These genetic findings, which were independent of BMI, were stronger for men than for women.
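    The odds ratios quoted above are the standard case–control measure of association. The sketch below shows how an unadjusted odds ratio and its Wald 95% confidence interval are obtained from a 2×2 exposure table; the counts are invented for illustration (only the case and control totals match the study), and the published estimates were likely adjusted in a regression model.

```python
# Minimal sketch (invented counts): odds ratio and Wald 95% CI from a 2x2 table.
import math

# Hypothetical split of the 699 cases and 914 controls by obesity (BMI > 30).
obese_cases, nonobese_cases = 120, 579
obese_controls, nonobese_controls = 110, 804

odds_ratio = (obese_cases * nonobese_controls) / (obese_controls * nonobese_cases)
se_log_or = math.sqrt(1 / obese_cases + 1 / nonobese_cases
                      + 1 / obese_controls + 1 / nonobese_controls)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```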

    Effects of antiplatelet therapy on stroke risk by brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases: subgroup analyses of the RESTART randomised, open-label trial

    Background: Findings from the RESTART trial suggest that starting antiplatelet therapy might reduce the risk of recurrent symptomatic intracerebral haemorrhage compared with avoiding antiplatelet therapy. Brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases (such as cerebral microbleeds) are associated with greater risks of recurrent intracerebral haemorrhage. We did subgroup analyses of the RESTART trial to explore whether these brain imaging features modify the effects of antiplatelet therapy.

    HpARI protein secreted by a helminth parasite suppresses interleukin-33

    Infection by helminth parasites is associated with amelioration of allergic reactivity, but mechanistic insights into this association are lacking. Products secreted by the mouse parasite Heligmosomoides polygyrus suppress type 2 (allergic) immune responses through interference in the interleukin-33 (IL-33) pathway. Here, we identified H. polygyrus Alarmin Release Inhibitor (HpARI), an IL-33-suppressive 26-kDa protein containing three predicted complement control protein (CCP) modules. In vivo, recombinant HpARI abrogated IL-33, group 2 innate lymphoid cell (ILC2) and eosinophilic responses to Alternaria allergen administration, and diminished eosinophilic responses to Nippostrongylus brasiliensis, increasing parasite burden. HpARI bound directly to both mouse and human IL-33 (in the cytokine's activated state) and also to nuclear DNA via its N-terminal CCP module pair (CCP1/2), tethering active IL-33 within necrotic cells, preventing its release, and forestalling initiation of type 2 allergic responses. Thus, HpARI employs a novel molecular strategy to suppress type 2 immunity in both infection and allergy. Osbourn et al. identified HpARI, a protein secreted by a helminth parasite that is capable of suppressing allergic responses. HpARI binds to IL-33 (a critical inducer of allergy) and nuclear DNA, preventing the release of IL-33 from necrotic epithelial cells.

    Arrhythmia and Death Following Percutaneous Revascularization in Ischemic Left Ventricular Dysfunction: Prespecified Analyses From the REVIVED-BCIS2 Trial.

    BACKGROUND: Ventricular arrhythmia is an important cause of mortality in patients with ischemic left ventricular dysfunction. Revascularization with coronary artery bypass graft or percutaneous coronary intervention is often recommended for these patients before implantation of a cardiac defibrillator because it is assumed that this may reduce the incidence of fatal and potentially fatal ventricular arrhythmias, although this premise has not been evaluated in a randomized trial to date. METHODS: Patients with severe left ventricular dysfunction, extensive coronary disease, and viable myocardium were randomly assigned to receive either percutaneous coronary intervention (PCI) plus optimal medical and device therapy (OMT) or OMT alone. The composite primary outcome was all-cause death or aborted sudden death (defined as an appropriate implantable cardioverter defibrillator therapy or a resuscitated cardiac arrest) at a minimum of 24 months, analyzed as time to first event on an intention-to-treat basis. Secondary outcomes included cardiovascular death or aborted sudden death, appropriate implantable cardioverter defibrillator (ICD) therapy or sustained ventricular arrhythmia, and number of appropriate ICD therapies. RESULTS: Between August 28, 2013, and March 19, 2020, 700 patients were enrolled across 40 centers in the United Kingdom. A total of 347 patients were assigned to the PCI+OMT group and 353 to the OMT alone group. The mean age of participants was 69 years; 88% were male; 56% had hypertension; 41% had diabetes; and 53% had a clinical history of myocardial infarction. The median left ventricular ejection fraction was 28%; 53.1% had an implantable defibrillator inserted before randomization or during follow-up. All-cause death or aborted sudden death occurred in 144 patients (41.6%) in the PCI group and 142 patients (40.2%) in the OMT group (hazard ratio, 1.03 [95% CI, 0.82-1.30]; P=0.80). There was no between-group difference in the occurrence of any of the secondary outcomes. CONCLUSIONS: PCI was not associated with a reduction in all-cause mortality or aborted sudden death. In patients with ischemic cardiomyopathy, PCI is not beneficial solely for the purpose of reducing potentially fatal ventricular arrhythmias. REGISTRATION: URL: https://www.clinicaltrials.gov; Unique identifier: NCT01920048
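    The primary analysis above is a time-to-first-event comparison summarised by a hazard ratio. A minimal sketch of that kind of intention-to-treat Cox proportional-hazards fit, on a tiny invented data frame (column names are assumptions, not the trial dataset), is shown below.

```python
# Minimal sketch (invented data): time-to-first-event hazard ratio for PCI+OMT vs OMT
# using a Cox proportional-hazards model from the lifelines package.
import pandas as pd
from lifelines import CoxPHFitter

# Assumed columns: follow-up in months, event = death or aborted sudden death (1/0),
# pci = 1 for PCI plus OMT, 0 for OMT alone.
df = pd.DataFrame({
    "time_months": [24, 36, 18, 48, 30, 60, 12, 54],
    "event":       [1,  0,  1,  0,  1,  0,  1,  0],
    "pci":         [1,  1,  0,  0,  1,  0,  1,  0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="event")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```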

    Viability and Outcomes With Revascularization or Medical Therapy in Ischemic Ventricular Dysfunction: A Prespecified Secondary Analysis of the REVIVED-BCIS2 Trial.

    IMPORTANCE: In the Revascularization for Ischemic Ventricular Dysfunction (REVIVED-BCIS2) trial, percutaneous coronary intervention (PCI) did not improve outcomes for patients with ischemic left ventricular dysfunction. Whether myocardial viability testing had prognostic utility for these patients or identified a subpopulation who may benefit from PCI remained unclear. OBJECTIVE: To determine the effect of the extent of viable and nonviable myocardium on the effectiveness of PCI, prognosis, and improvement in left ventricular function. DESIGN, SETTING, AND PARTICIPANTS: Prospective open-label randomized clinical trial recruiting between August 28, 2013, and March 19, 2020, with a median follow-up of 3.4 years (IQR, 2.3-5.0 years). A total of 40 secondary and tertiary care centers in the United Kingdom were included. Of 700 randomly assigned patients, 610 were included; these had a left ventricular ejection fraction of 35% or less, extensive coronary artery disease, and evidence of viability in at least 4 myocardial segments that were dysfunctional at rest, and underwent blinded core laboratory viability characterization. Data analysis was conducted from March 31, 2022, to May 1, 2023. INTERVENTION: Percutaneous coronary intervention in addition to optimal medical therapy. MAIN OUTCOMES AND MEASURES: Blinded core laboratory analysis was performed of cardiac magnetic resonance imaging scans and dobutamine stress echocardiograms to quantify the extent of viable and nonviable myocardium, expressed as an absolute percentage of left ventricular mass. The primary outcome of this subgroup analysis was the composite of all-cause death or hospitalization for heart failure. Secondary outcomes were all-cause death, cardiovascular death, hospitalization for heart failure, and improved left ventricular function at 6 months. RESULTS: The mean (SD) age of the participants was 69.3 (9.0) years. In the PCI group, 258 (87%) were male, and in the optimal medical therapy group, 277 (88%) were male. The primary outcome occurred in 107 of 295 participants assigned to PCI and 114 of 315 participants assigned to optimal medical therapy alone. There was no interaction between the extent of viable or nonviable myocardium and the effect of PCI on the primary or any secondary outcome. Across the study population, the extent of viable myocardium was not associated with the primary outcome (hazard ratio per 10% increase, 0.98; 95% CI, 0.93-1.04) or any secondary outcome. The extent of nonviable myocardium was associated with the primary outcome (hazard ratio, 1.07; 95% CI, 1.00-1.15), all-cause death, cardiovascular death, and improvement in left ventricular function. CONCLUSIONS AND RELEVANCE: This study found that viability testing does not identify patients with ischemic cardiomyopathy who benefit from PCI. The extent of nonviable myocardium, but not the extent of viable myocardium, is associated with event-free survival and likelihood of improvement of left ventricular function. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT01920048
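    The subgroup analysis above hinges on two modelling details: scaling the viability covariate so the hazard ratio is expressed per 10% increase, and testing a treatment-by-viability interaction for effect modification. The sketch below illustrates both on simulated data (all variable names and values are assumptions, not the trial dataset).

```python
# Minimal sketch (simulated data): per-10% hazard ratio for myocardial viability and a
# PCI-by-viability interaction term in a Cox model (lifelines).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
pci = rng.integers(0, 2, n)                              # 1 = PCI + OMT, 0 = OMT alone
viable_pct = rng.uniform(10.0, 80.0, n)                  # viable myocardium, % of LV mass
time_years = np.minimum(rng.exponential(4.0, n), 5.0)    # administrative censoring at 5 years
event = (time_years < 5.0).astype(int)                   # death or heart-failure hospitalisation

df = pd.DataFrame({
    "time_years": time_years,
    "event": event,
    "pci": pci,
    "viable_per10": viable_pct / 10.0,                   # HR is then "per 10% increase"
})
df["pci_x_viable"] = df["pci"] * df["viable_per10"]      # interaction (effect modification)

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="event")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```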