
    Reproductive Biology of American Robins Following a Dutch Elm Disease Control Program

    Reproductive biology of the American robin (Turdus migratorius Linnaeus) was studied on the Iowa State University campus during the spring and early summer of 1977. Although the robin breeding population was below that reported for similar habitats elsewhere, it was appreciably larger than during the late 1960s, when a major reduction in the number of breeding robins at Iowa State followed the use of DDT. Breeding robins, as indicated by the number of nests, were more numerous in 1977 than in any year during the height of the Dutch elm disease control program (1962-70). Basic reproductive parameters such as clutch size and hatching success were similar to those of the pre-DDT era, suggesting that the robin population in 1977 had regained its pre-DDT level.

    Effects of climate and landcover change on stream discharge in the Ozark Highlands, USA

    Stream discharge from a watershed is shaped by changes in climate and landcover. These effects vary with the magnitude and interaction of the changes, and they need to be understood so that local water resource availability can be evaluated and socioeconomic development within a watershed can be pursued in a way that is sustainable given local water resources. In this study, the effects of landcover and climate change on stream discharge from the Jacks Fork River basin in the Ozark Highlands of the south-central United States were examined in three phases: site observation and data collection, model calibration and simulation, and model experiment and analysis. Major results show that climate fluctuations between wet and dry extremes produced the same change in basin discharge regardless of the basin's landcover condition. Under a fixed climate, by contrast, landcover change from a grassland basin to a fully forested basin produced only about one half of the discharge change caused by the climate variation. Furthermore, when landcover change occurred simultaneously with climate variation, the change in basin discharge was significantly amplified, exceeding the sum of the changes caused by climate variation and landcover change acting alone, a result indicating a synergistic effect of landcover and climate change on basin discharge variability.
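
    To make the synergy claim concrete, the sketch below shows how the three model experiments compare: the combined scenario's discharge change is set against the sum of the climate-only and landcover-only changes. The discharge values are hypothetical placeholders, not output from the study's hydrologic model.

```python
# Sketch of the scenario-comparison logic described above. All discharge
# values are hypothetical; the study's actual model results are not reproduced.

def scenario_changes(q_base, q_climate, q_landcover, q_both):
    """Compare mean-discharge changes across model experiments.

    q_base      -- discharge under baseline climate and baseline landcover
    q_climate   -- climate change only (e.g., a wet-to-dry shift)
    q_landcover -- landcover change only (grassland to full forest)
    q_both      -- climate and landcover change applied together
    """
    d_climate = q_climate - q_base
    d_landcover = q_landcover - q_base
    d_both = q_both - q_base
    # A nonzero synergy term means the combined change differs from the
    # sum of the individual changes, as the study reports.
    synergy = d_both - (d_climate + d_landcover)
    return d_climate, d_landcover, d_both, synergy

# Hypothetical mean annual discharges (m^3/s), for illustration only:
print(scenario_changes(q_base=10.0, q_climate=6.0, q_landcover=8.0, q_both=3.0))
```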

    Genetic mechanisms of critical illness in COVID-19.

    Host-mediated lung inflammation is present [1], and drives mortality [2], in the critical illness caused by coronavirus disease 2019 (COVID-19). Host genetic variants associated with critical illness may identify mechanistic targets for therapeutic development [3]. Here we report the results of the GenOMICC (Genetics Of Mortality In Critical Care) genome-wide association study in 2,244 critically ill patients with COVID-19 from 208 UK intensive care units. We have identified and replicated the following new genome-wide significant associations: on chromosome 12q24.13 (rs10735079, P = 1.65 × 10⁻⁸) in a gene cluster that encodes antiviral restriction enzyme activators (OAS1, OAS2 and OAS3); on chromosome 19p13.2 (rs74956615, P = 2.3 × 10⁻⁸) near the gene that encodes tyrosine kinase 2 (TYK2); on chromosome 19p13.3 (rs2109069, P = 3.98 × 10⁻¹²) within the gene that encodes dipeptidyl peptidase 9 (DPP9); and on chromosome 21q22.1 (rs2236757, P = 4.99 × 10⁻⁸) in the interferon receptor gene IFNAR2. We identified potential targets for repurposing of licensed medications: using Mendelian randomization, we found evidence that low expression of IFNAR2, or high expression of TYK2, is associated with life-threatening disease; and transcriptome-wide association in lung tissue revealed that high expression of the monocyte-macrophage chemotactic receptor CCR2 is associated with severe COVID-19. Our results identify robust genetic signals relating to key host antiviral defence mechanisms and mediators of inflammatory organ damage in COVID-19. Both mechanisms may be amenable to targeted treatment with existing drugs. However, large-scale randomized clinical trials will be essential before any change to clinical practice.
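
    As a rough illustration of the Mendelian randomization step, the sketch below computes a single-instrument Wald ratio, the simplest form of MR. The effect sizes and the first-order standard error are illustrative assumptions, not the study's published analysis.

```python
# Minimal single-instrument Wald-ratio sketch. Numbers are hypothetical.

def wald_ratio(beta_exposure, beta_outcome, se_outcome):
    """Causal-effect estimate from one genetic instrument.

    beta_exposure -- SNP effect on the exposure (e.g., IFNAR2 expression)
    beta_outcome  -- SNP effect on the outcome (log odds of critical COVID-19)
    se_outcome    -- standard error of beta_outcome
    """
    estimate = beta_outcome / beta_exposure
    # First-order delta-method SE (treats the exposure beta as fixed):
    se = abs(se_outcome / beta_exposure)
    return estimate, se

# Hypothetical effect sizes, for illustration only: a variant that raises
# expression by 0.5 SD and lowers log odds of critical illness by 0.2.
est, se = wald_ratio(beta_exposure=0.5, beta_outcome=-0.2, se_outcome=0.04)
print(f"MR estimate {est:.2f} (SE {se:.2f})")
```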

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA grade, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
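
    The derive-then-validate workflow described here can be sketched as follows, with synthetic data standing in for the CholeS and validation cohorts. The predictors and the coefficient-to-points conversion are illustrative assumptions, not the published scoring tool.

```python
# Sketch of deriving a risk score from a logistic model and validating it
# externally with ROC AUC. Data are synthetic; not the CholeS tool.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def synth_cohort(n):
    # Six synthetic predictors standing in for ASA, age, BMI, wall thickness, etc.
    X = rng.normal(size=(n, 6))
    p = 1 / (1 + np.exp(-(0.4 * X.sum(axis=1) - 1.5)))
    return X, rng.binomial(1, p)

# Derivation cohort: fit the multivariable binary logistic model.
X_deriv, y_deriv = synth_cohort(7227)
model = LogisticRegression().fit(X_deriv, y_deriv)

# Convert coefficients to integer points, a common way to build a risk score.
points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)

# External validation cohort: discrimination of the summed score via ROC AUC.
X_valid, y_valid = synth_cohort(2405)
print("validation AUC:", round(roc_auc_score(y_valid, X_valid @ points), 3))
```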

    Did smokefree legislation in England reduce exposure to secondhand smoke among nonsmoking adults? Cotinine analysis from the Health Survey for England.

    Background: On 1 July 2007, smokefree legislation was implemented in England, making virtually all enclosed public places and workplaces smokefree. Objectives: We examined trends in and predictors of secondhand smoke exposure among nonsmoking adults to determine whether exposure changed after the introduction of smokefree legislation and whether these changes varied by socioeconomic status (SES) and by household smoking status. Methods: We analyzed salivary cotinine data from the Health Survey for England that were collected in 7 of the 11 annual surveys undertaken between 1998 and 2008. We conducted multivariate regression analyses to examine secondhand smoke exposure as measured by the proportion of nonsmokers with undetectable levels of cotinine and by geometric mean cotinine. Results: Secondhand smoke exposure was higher among those exposed at home and among lower-SES groups. Exposure declined markedly from 1998 to 2008: the proportion of participants with undetectable cotinine was 2.9 times higher in the last 6 months of 2008 than in the first 6 months of 1998, and geometric mean cotinine declined by 80%. We observed a significant fall in exposure after the legislation was introduced: the odds of having undetectable cotinine were 1.5 times higher [95% confidence interval (CI): 1.3, 1.8], and geometric mean cotinine fell by 27% (95% CI: 17%, 36%) after adjusting for the prelegislative trend and potential confounders. Significant reductions were not, however, seen in those living in lower-social-class households or in homes where smoking occurs inside on most days. Conclusions: We found that the impact of England's smokefree legislation on secondhand smoke exposure was above and beyond the underlying long-term decline in exposure, demonstrating the positive effect of the legislation. Nevertheless, some population subgroups appear not to have benefited significantly from the legislation. This finding suggests that these groups should receive more support to reduce their exposure.
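
    Adjusting for the prelegislative trend amounts to a segmented (interrupted time series) regression of log cotinine on time plus a post-legislation step; the sketch below shows the idea on synthetic data. The survey timing, trend, and step size are illustrative assumptions, not the HSE estimates.

```python
# Segmented-regression sketch: legislation effect net of the secular trend.
# All cotinine values are synthetic.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1998, 2009, 0.5)            # survey half-years
post = (t >= 2007.5).astype(float)        # smokefree law: 1 July 2007
# Synthetic log geometric-mean cotinine: downward trend plus a step at the law.
log_cot = -0.15 * (t - 1998) - 0.3 * post + rng.normal(0, 0.1, t.size)

# Design matrix: intercept, pre-existing time trend, post-legislation step.
X = np.column_stack([np.ones_like(t), t - 1998, post])
beta, *_ = np.linalg.lstsq(X, log_cot, rcond=None)

# The step coefficient is the legislation effect after trend adjustment;
# exponentiating gives the proportional change in geometric mean cotinine.
print(f"post-legislation change: {100 * (np.exp(beta[2]) - 1):.0f}%")
```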

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
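
    The posterior probabilities quoted above are read directly off the bayesian model's posterior samples: the probability of harm is the share of odds-ratio samples below 1 (since OR > 1 means improvement). The sketch below uses a synthetic lognormal posterior roughly matched to the reported ACE-inhibitor result, not the trial's actual posterior.

```python
# Sketch: posterior probability of harm from odds-ratio samples.
# The posterior here is a synthetic stand-in (median OR ~0.77, CrI ~0.58-1.06).
import numpy as np

rng = np.random.default_rng(2)
log_or = rng.normal(loc=np.log(0.77), scale=0.154, size=100_000)
or_samples = np.exp(log_or)

ci = np.percentile(or_samples, [2.5, 97.5])
p_harm = (or_samples < 1).mean()   # OR < 1 means worse than control
print(f"median OR {np.median(or_samples):.2f}, "
      f"95% CrI {ci[0]:.2f}-{ci[1]:.2f}, "
      f"P(worse than control) = {p_harm:.1%}")
```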

    Suppression of Smooth Brome by Atrazine, Mowing, and Fire

    Burning and mowing were evaluated in 1989-1991 at Pipestone National Monument, Minnesota, as alternatives to atrazine for suppressing smooth brome (Bromus inermis Leyss.) and promoting seeding success of big bluestem (Andropogon gerardii Vitman). Atrazine was the only treatment that significantly reduced smooth brome tiller density (-77% in 1990; -70% in 1991) compared with unburned controls. Neither burning (-16% in 1990; -37% in 1991) nor mowing (-16% in 1990; +10% in 1991) resulted in significant reductions. Sod-seeded big bluestem failed in all treatments in both years. The failure of both chemical and non-chemical management to effect sufficient smooth brome control for big bluestem seeding success exemplifies the difficulties in restoring native and seeded prairie.

    Burn-based smooth brome management in tallgrass prairie

    The invasion and persistence of smooth brome (Bromus inermis Leyss.) is a serious problem facing managers of prairie remnants on the northern Great Plains. This study, conducted from 1988-91 at Mead, Nebraska, and from 1989-91 at Pipestone National Monument, Minnesota, consisted of three components. The first component measured changes in smooth brome tiller density and biomass and in big bluestem (Andropogon gerardii Vitman) flower culm density in mixed stands following spring burns. Burning was timed to coincide with four smooth brome growth stages and included repeated burns in consecutive years. Burning during smooth brome tiller elongation, heading, and flowering significantly reduced tiller density and biomass. Burning early, at tiller emergence, had no effect on smooth brome in years when precipitation was normal or below normal; with above-normal precipitation, however, biomass more than doubled after an early burn. Repeated burns at tiller elongation and later stages maintained low tiller density and biomass, whereas a single burn allowed full to partial recovery. The greatest increase in big bluestem flower culm density coincided with the greatest decrease in smooth brome tiller density. The second component compared differences in microenvironment between sites dominated by smooth brome and those dominated by big bluestem following early spring burning. Burning smooth brome-dominated sites during a dry period resulted in lower soil moisture and lower soil temperatures than on burned sites dominated by big bluestem. The third component compared the effectiveness of burning a pure stand of smooth brome at tiller elongation with other methods of smooth brome suppression before sod-seeding big bluestem. Atrazine was the only treatment effective in reducing smooth brome tiller density. Burning did not suppress smooth brome tiller density to the degree found in other studies where warm-season grasses were present. Sod-seeded big bluestem failed in all treatments. Results from the three components were combined to construct a burn-based model for smooth brome management.