109 research outputs found

    Leaf litter breakdown along an elevational gradient in Australian alpine streams

    The breakdown of allochthonous organic matter is a central step in nutrient cycling in stream ecosystems. There is concern that increased temperatures from climate change will alter the breakdown rate of organic matter, with important consequences for the ecosystem functioning of alpine streams. This study investigated the rate of leaf litter breakdown and how temperature and other factors, such as microbial and invertebrate activity, influenced it over elevational and temporal gradients. Dried leaves of Snow Gum (Eucalyptus pauciflora) and cotton strips were deployed in coarse (6 mm) and fine (50 ”m) mesh bags along an 820 m elevation gradient. Loss of leaf litter mass and of cotton tensile strength per day (k per day), fungal biomass measured as ergosterol concentration, invertebrate colonization of leaf litter, and benthic organic matter (mass and composition) were determined. Microbial and macroinvertebrate activities were equally important in leaf litter breakdown, the latter reflected in the abundance of shredder invertebrate taxa. The overall leaf litter breakdown rate and the rate of tensile strength loss in cotton strips (both k per day) were greater during warmer deployment periods and at lower elevations, with significant positive relationships between mean water temperature and both rates, and no differences between sites after accounting for the effects of temperature. Despite considerably lower amounts of benthic organic matter in streams above the tree line relative to those below, shredders were present in coarse mesh bags at all sites. Ergosterol concentration was greater on leaves in coarse mesh bags than in fine mesh bags, implying differences in the microbial communities. The importance of water temperature for the rate of leaf litter breakdown points to potential effects of climate change-induced temperature increases on ecological processes in such streams.
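
    The abstract reports breakdown as a daily rate coefficient (k per day). As a point of reference, litter-bag studies typically estimate k from a negative exponential decay model; the sketch below is illustrative only, assumes that standard model, and uses invented masses rather than data from this study.

    ```python
    import math

    # Illustrative only: daily breakdown coefficient k under the negative
    # exponential model M_t = M_0 * exp(-k * t) commonly used in litter-bag
    # studies. Masses and deployment time below are invented examples.
    def breakdown_rate_per_day(initial_mass_g, final_mass_g, days_deployed):
        return math.log(initial_mass_g / final_mass_g) / days_deployed

    # A bag losing 5.0 g -> 3.2 g of dry mass over 60 days gives k of about 0.0074 per day.
    print(breakdown_rate_per_day(5.0, 3.2, 60))
    ```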

    Case Study Comparing Effects of Microplastic Derived from Bottle Caps Collected in Two Cities on Triticum aestivum (Wheat)

    As plastic has become an integral component of daily life, microplastic has become a ubiquitous, unavoidable constituent of nearly all ecosystems. Besides monitoring the amount and distribution of microplastic in the environment, it is necessary to understand its possible direct effects, especially toxicity and how toxicity is affected by the environmental conditions where the plastic is discarded. The present study investigated how microplastic derived from high-density polyethylene bottle caps collected in two climatically different cities, i.e., Singapore (tropical rainforest climate) and Lahti, Finland (continental climate), affected the essential agricultural grain crop Triticum aestivum (L.) (wheat). Wheat seedlings were exposed for seven days to microplastic derived from these collected bottle caps, as well as from new and artificially aged caps. Morphological parameters, such as root and shoot length, and the development of oxidative stress were measured. Exposure to microplastic derived from the caps reduced seedling root and shoot lengths compared to the controls and enhanced lipid peroxidation and catalase activity. For all parameters tested, microplastic derived from the Lahti bottle caps had more severe effects than that from Singapore, and its effects were similar to those elicited by new microplastic. The Singapore microplastic may have leached its toxic substances before collection, owing to accelerated degradation under the prevailing warmer climate.

    Enchytraeus crypticus Avoid Soil Spiked with Microplastic

    Microplastics (MPs) of varying sizes are widespread pollutants in our environment. The general opinion is that the smaller the size, the more dangerous the MPs are, owing to enhanced uptake possibilities. It would be of considerable ecological significance to understand the response of biota to microplastic contamination, both physically and physiologically. Here, we report on an area choice experiment (avoidance test) using Enchytraeus crypticus, in which we mixed different amounts of high-density polyethylene microplastic particles into the soil. In all experimental scenarios, more enchytraeids moved to the unspiked sections or to sections with a lower MP concentration. Worms in contact with MP exhibited an enhanced oxidative stress status, measured as induced activity of the antioxidative enzymes catalase and glutathione S-transferase. Because plastic polymers per se are nontoxic, the exposure time employed was too short for chemicals to leach from the microplastic, and the particles used in these experiments were too large (4 mm) to be consumed by the enchytraeids, the avoidance and oxidative stress are most likely linked to altered soil properties.
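
    For readers unfamiliar with soil avoidance tests, the outcome of a two-section area choice design is commonly summarised as a net avoidance percentage. The abstract does not quote this metric, so the sketch below is an assumption about the standard calculation and uses invented counts, not the study's data.

    ```python
    # Assumed metric (not quoted in the abstract): net avoidance as a percentage
    # of the total number of animals, positive when more animals end up in the
    # unspiked (control) section of a two-section test vessel.
    def avoidance_percent(n_in_control, n_in_spiked):
        total = n_in_control + n_in_spiked
        return 100.0 * (n_in_control - n_in_spiked) / total

    # If 8 of 10 enchytraeids move to the unspiked section, avoidance is +60%.
    print(avoidance_percent(8, 2))
    ```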

    The NK Cell Response to Mouse Cytomegalovirus Infection Affects the Level and Kinetics of the Early CD8+ T-Cell Response

    Natural killer (NK) cells and CD8+ T cells play a prominent role in the clearance of mouse cytomegalovirus (MCMV) infection. The role of NK cells in modulating the CD8+ T-cell response to MCMV infection is still the subject of intensive research. To analyze the impact of NK cells on the mounting of a CD8+ T-cell response and the contribution of these cells to virus control during the first days postinfection (p.i.), we used C57BL/6 mice, in which NK cells are specifically activated through the Ly49H receptor engaged by the MCMV-encoded ligand m157. Our results indicate that the requirement for CD8+ T cells in early MCMV control inversely correlates with the engagement of Ly49H. While depletion of CD8+ T cells has only a minor effect on the early control of wild-type MCMV, CD8+ T cells are essential in the control of Δm157 virus. The frequencies of virus epitope-specific CD8+ T cells and their activation status were higher in mice infected with Δm157 virus. In addition, these mice showed elevated levels of alpha interferon (IFN-α) and several other proinflammatory cytokines as early as 1.5 days p.i. Although the numbers of conventional dendritic cells (cDCs) were reduced later during infection, particularly in Δm157-infected mice, they were not significantly affected at the peak of the cytokine response. Altogether, we concluded that increased antigen load, preservation of early cDC function, and higher levels of innate cytokines collectively account for an enhanced CD8+ T-cell response in C57BL/6 mice infected with a virus unable to activate NK cells via the Ly49H-m157 interaction.

    Reduced costs with bisoprolol treatment for heart failure - An economic analysis of the second Cardiac Insufficiency Bisoprolol Study (CIBIS-II)

    Background: Beta-blockers, used as an adjunct to diuretics, digoxin and angiotensin converting enzyme inhibitors, improve survival in chronic heart failure. We report a prospectively planned economic analysis of the cost of adjunctive beta-blocker therapy in the second Cardiac Insufficiency Bisoprolol Study (CIBIS-II). Methods: Resource utilization data (drug therapy, number of hospital admissions, length of hospital stay, ward type) were collected prospectively for all patients in CIBIS-II. These data were used to determine the additional direct costs incurred, and savings made, with bisoprolol therapy. As well as the cost of the drug, additional costs related to bisoprolol therapy were added to cover the supervision of treatment initiation and titration (four outpatient clinic/office visits). Per diem (hospital bed day) costings were carried out for France, Germany and the U.K. Diagnosis related group costings were performed for France and the U.K. Our analyses took the perspective of a third party payer in France and Germany and the National Health Service in the U.K. Results: Overall, fewer patients were hospitalized in the bisoprolol group; there were fewer hospital admissions per patient hospitalized, fewer hospital admissions overall, fewer days spent in hospital and fewer days spent in the most expensive type of ward. As a consequence, the cost of care in the bisoprolol group was 5-10% less in all three countries in the per diem analysis, even taking into account the cost of bisoprolol and the extra initiation/up-titration visits. The cost per patient treated in the placebo and bisoprolol groups was FF35 009 vs FF31 762 in France, DM11 563 vs DM10 784 in Germany and ÂŁ4987 vs ÂŁ4722 in the U.K. The diagnosis related group analysis gave similar results. Interpretation: Not only did bisoprolol increase survival and reduce hospital admissions in CIBIS-II, it also cut the cost of care in so doing. This 'win-win' situation of positive health benefits associated with cost savings is favourable from the point of view of both the patient and health care systems. These findings add further support for the use of beta-blockers in chronic heart failure.
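
    As a quick arithmetic check on the reported 5-10% saving, the per-patient costs quoted above imply relative reductions of roughly 5-9% in the three countries; the snippet below simply recomputes those percentages from the figures in the abstract.

    ```python
    # Per-patient treatment costs quoted in the abstract (placebo, bisoprolol).
    costs = {
        "France (FF)":  (35009, 31762),
        "Germany (DM)": (11563, 10784),
        "UK (GBP)":     (4987, 4722),
    }

    for country, (placebo, bisoprolol) in costs.items():
        saving = 100.0 * (placebo - bisoprolol) / placebo
        print(f"{country}: {saving:.1f}% lower cost with bisoprolol")
    # Prints roughly 9.3%, 6.7% and 5.3%, consistent with the reported 5-10% range.
    ```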

    Why Are Outcomes Different for Registry Patients Enrolled Prospectively and Retrospectively? Insights from the Global Anticoagulant Registry in the FIELD-Atrial Fibrillation (GARFIELD-AF).

    Background: Retrospective and prospective observational studies are designed to reflect real-world evidence on clinical practice, but can yield conflicting results. The GARFIELD-AF Registry includes both methods of enrolment and allows analysis of the differences in patient characteristics and outcomes that may result. Methods and Results: Patients with atrial fibrillation (AF) and ≄1 risk factor for stroke at diagnosis of AF were recruited either retrospectively (n = 5069) or prospectively (n = 5501) from 19 countries and then followed prospectively. The retrospectively enrolled cohort comprised patients with established AF (for at least 6 and up to 24 months before enrolment), who were identified retrospectively (with baseline and partial follow-up data collected from medical records) and then followed prospectively for 0-18 months (such that the total follow-up was 24 months; data collection between Dec-2009 and Oct-2010). In the prospectively enrolled cohort, patients with newly diagnosed AF (≀6 weeks after diagnosis) were recruited between Mar-2010 and Oct-2011 and were followed for 24 months after enrolment. Differences between the cohorts were observed in clinical characteristics, including type of AF, stroke prevention strategies, and event rates. More patients in the retrospectively identified cohort received vitamin K antagonists (62.1% vs 53.2%) and fewer received non-vitamin K oral anticoagulants (1.8% vs 4.2%). All-cause mortality rates per 100 person-years during the prospective follow-up (from the first study visit up to 1 year) were significantly lower in the retrospectively than in the prospectively identified cohort (3.04 [95% CI 2.51 to 3.67] vs 4.05 [95% CI 3.53 to 4.63]; p = 0.016). Conclusions: Interpretations of data from registries that aim to evaluate the characteristics and outcomes of patients with AF must take account of differences in registry design and of the recall and survivorship biases incurred with retrospective enrolment. Clinical Trial Registration: URL: http://www.clinicaltrials.gov. Unique identifier for GARFIELD-AF: NCT01090362
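
    The mortality comparison above is expressed as events per 100 person-years with 95% confidence intervals. The sketch below shows one conventional way to compute such a rate (an exact Poisson interval); it is illustrative only, and the event count and follow-up time are invented rather than taken from the registry.

    ```python
    from scipy.stats import chi2

    # Illustrative only: event rate per 100 person-years with an exact (Garwood)
    # Poisson confidence interval. Inputs below are invented placeholders.
    def rate_per_100_person_years(events, person_years, alpha=0.05):
        rate = 100.0 * events / person_years
        lower = 100.0 * chi2.ppf(alpha / 2, 2 * events) / (2 * person_years)
        upper = 100.0 * chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / (2 * person_years)
        return rate, lower, upper

    # e.g. 150 deaths over 4500 person-years -> about 3.33 (2.82 to 3.91) per 100 person-years
    print(rate_per_100_person_years(150, 4500))
    ```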

    Improved risk stratification of patients with atrial fibrillation: an integrated GARFIELD-AF tool for the prediction of mortality, stroke and bleed in patients with and without anticoagulation.

    OBJECTIVES: To provide an accurate, web-based tool for stratifying patients with atrial fibrillation to facilitate decisions on the potential benefits/risks of anticoagulation, based on mortality, stroke and bleeding risks. DESIGN: The new tool was developed, using stepwise regression, for all patients and then applied to lower risk patients. C-statistics were compared with CHA2DS2-VASc using 30-fold cross-validation to control for overfitting. External validation was undertaken in an independent dataset, the Outcome Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT-AF). PARTICIPANTS: Data from 39 898 patients enrolled in the prospective GARFIELD-AF registry provided the basis for deriving and validating an integrated risk tool to predict stroke risk, mortality and bleeding risk. RESULTS: The discriminatory value of the GARFIELD-AF risk model was superior to that of CHA2DS2-VASc for patients with or without anticoagulation. C-statistics (95% CI) for all-cause mortality, ischaemic stroke/systemic embolism and haemorrhagic stroke/major bleeding (treated patients) were 0.77 (0.76 to 0.78), 0.69 (0.67 to 0.71) and 0.66 (0.62 to 0.69), respectively, for the GARFIELD-AF risk models, and 0.66 (0.64 to 0.67), 0.64 (0.61 to 0.66) and 0.64 (0.61 to 0.68), respectively, for CHA2DS2-VASc (or HAS-BLED for bleeding). In very low to low risk patients (CHA2DS2-VASc 0 or 1 (men) and 1 or 2 (women)), the CHA2DS2-VASc and HAS-BLED (for bleeding) scores offered weak discriminatory value for mortality, stroke/systemic embolism and major bleeding. C-statistics for the GARFIELD-AF risk tool were 0.69 (0.64 to 0.75), 0.65 (0.56 to 0.73) and 0.60 (0.47 to 0.73) for each end point, respectively, versus 0.50 (0.45 to 0.55), 0.59 (0.50 to 0.67) and 0.55 (0.53 to 0.56) for CHA2DS2-VASc (or HAS-BLED for bleeding). Upon validation in the ORBIT-AF population, the GARFIELD-AF risk tool was effective for predicting 1-year all-cause mortality with both the full and the simplified model (C-statistics 0.75 (0.73 to 0.77) and 0.75 (0.73 to 0.77), respectively) and for predicting any stroke or systemic embolism over 1 year (C-statistic 0.68 (0.62 to 0.74)). CONCLUSIONS: Performance of the GARFIELD-AF risk tool was superior to CHA2DS2-VASc in predicting stroke and mortality and superior to HAS-BLED for bleeding, overall and in lower risk patients. The GARFIELD-AF tool has the potential for incorporation in routine electronic systems and, for the first time, permits simultaneous evaluation of ischaemic stroke, mortality and bleeding risks. CLINICAL TRIAL REGISTRATION: URL: http://www.clinicaltrials.gov. Unique identifier for GARFIELD-AF: NCT01090362; for ORBIT-AF: NCT01165710
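
    The model comparison above rests on cross-validated C-statistics. For a binary endpoint the C-statistic equals the area under the ROC curve, so a minimal, purely illustrative sketch of that evaluation step might look as follows. The data are random placeholders rather than GARFIELD-AF variables, and a plain logistic model stands in for the published stepwise-derived one.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Placeholder data: stand-in baseline covariates and a rare binary outcome.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 8))
    y = rng.binomial(1, 0.1, size=2000)

    # For a binary endpoint, ROC AUC is the C-statistic; cv=30 mirrors the
    # 30-fold cross-validation described in the abstract.
    model = LogisticRegression(max_iter=1000)
    c_stats = cross_val_score(model, X, y, cv=30, scoring="roc_auc")
    print(f"cross-validated C-statistic: {c_stats.mean():.2f}")
    ```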

    Two-year outcomes of patients with newly diagnosed atrial fibrillation: results from GARFIELD-AF.

    AIMS: The relationship between outcomes and time after diagnosis for patients with non-valvular atrial fibrillation (NVAF) is poorly defined, especially beyond the first year. METHODS AND RESULTS: GARFIELD-AF is an ongoing, global observational study of adults with newly diagnosed NVAF. Two-year outcomes of 17 162 patients prospectively enrolled in GARFIELD-AF were analysed in light of baseline characteristics, risk profiles for stroke/systemic embolism (SE), and antithrombotic therapy. The mean (standard deviation) age was 69.8 (11.4) years, 43.8% were women, and the mean CHA2DS2-VASc score was 3.3 (1.6); 60.8% of patients were prescribed anticoagulant therapy with/without antiplatelet (AP) therapy, 27.4% AP monotherapy, and 11.8% no antithrombotic therapy. At 2-year follow-up, all-cause mortality, stroke/SE, and major bleeding had occurred at a rate (95% confidence interval) of 3.83 (3.62; 4.05), 1.25 (1.13; 1.38), and 0.70 (0.62; 0.81) per 100 person-years, respectively. Rates for all three major events were highest during the first 4 months. Congestive heart failure, acute coronary syndromes, sudden/unwitnessed death, malignancy, respiratory failure, and infection/sepsis accounted for 65% of all known causes of death and strokes for <10%. Anticoagulant treatment was associated with a 35% lower risk of death. CONCLUSION: The most frequent of the three major outcome measures was death, whose most common causes are not known to be significantly influenced by anticoagulation. This suggests that a more comprehensive approach to the management of NVAF may be needed to improve outcome. This could include, in addition to anticoagulation, interventions targeting modifiable, cause-specific risk factors for death. CLINICAL TRIAL REGISTRATION: http://www.clinicaltrials.gov. Unique identifier: NCT01090362

    Risk profiles and one-year outcomes of patients with newly diagnosed atrial fibrillation in India: Insights from the GARFIELD-AF Registry.

    BACKGROUND: The Global Anticoagulant Registry in the FIELD-Atrial Fibrillation (GARFIELD-AF) is an ongoing prospective noninterventional registry that is providing important information on the baseline characteristics, treatment patterns, and 1-year outcomes of patients with newly diagnosed non-valvular atrial fibrillation (NVAF). This report describes data from Indian patients recruited in this registry. METHODS AND RESULTS: A total of 52,014 patients with newly diagnosed AF were enrolled globally; of these, 1388 patients were recruited from 26 sites within India (2012-2016). In India, the mean age at diagnosis of NVAF was 65.8 years. Hypertension was the most prevalent risk factor for AF, present in 68.5% of patients from India and in 76.3% of patients globally (P < 0.001). Diabetes and coronary artery disease (CAD) were present in 36.2% and 28.1% of Indian patients, compared with global prevalences of 22.2% and 21.6%, respectively (P < 0.001 for both). Antiplatelet therapy was the most common antithrombotic treatment in India. With increasing stroke risk, however, patients were more likely to receive oral anticoagulant therapy [mainly a vitamin K antagonist (VKA)], but the average international normalized ratio (INR) was lower among Indian patients [median INR 1.6 (interquartile range, IQR, 1.3-2.3) versus 2.3 (IQR 1.8-2.8); P < 0.001]. Compared with other countries, patients from India had markedly higher rates of all-cause mortality [7.68 per 100 person-years (95% confidence interval 6.32-9.35) vs 4.34 (4.16-4.53); P < 0.0001], while rates of stroke/systemic embolism and major bleeding were lower after 1 year of follow-up. CONCLUSION: Compared with previously published registries from India, the GARFIELD-AF registry describes clinical profiles and outcomes in Indian patients with AF of a different etiology. The registry data show that, compared with the rest of the world, Indian AF patients are younger and have more diabetes and CAD. Patients with a higher stroke risk are more likely to receive anticoagulation therapy with a VKA but are underdosed compared with the global average in GARFIELD-AF. CLINICAL TRIAL REGISTRATION: URL: http://www.clinicaltrials.gov. Unique identifier: NCT01090362
    • 

    corecore