
    Using Phenology to Unravel Differential Soil Water Use and Productivity in a Semiarid Savanna

    Savannas are water-limited ecosystems characterized by two dominant plant types: trees and an understory made up primarily of grasses. The differing phenology and root structures of these plant types complicate how savanna primary productivity responds to changes in water availability. We tested the hypothesis that productivity in savannas is controlled by the temporal and vertical distribution of soil water content (SWC) and by differences in the growing season length of understory and tree plant functional types. To quantify the relationship between tree, understory, and savanna-wide phenology and productivity, we used PhenoCam and satellite observations surrounding an eddy covariance tower at a semiarid savanna site in Arizona, USA. We distinguished between SWC at two depth intervals (shallow, 0–30 cm, and deep, 30–100 cm). We found that tree greenness increased with SWC at both depths, while understory greenness was sensitive only to the shallower SWC measurements. Onset of ecosystem dormancy, estimated from satellite observations close to the eddy covariance tower, explained more of the variability in annual gross primary productivity (GPP) than any other phenometric. Higher SWC led to an extended growing season, caused by delayed dormancy in trees, but the understory showed no evidence of delayed dormancy in wetter periods. We infer that the timing of ecosystem-scale dormancy, driven by trees, is important in understanding changes in a savanna's GPP. These findings highlight the important effects of winter rainfall and suggest that savanna GPP is conditional on the different responses of each dominant vegetation component to moisture availability.
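    A minimal sketch (with entirely hypothetical numbers) of the kind of phenometric analysis described above: regressing annual GPP on the satellite-derived day of dormancy onset. The arrays are illustrative placeholders, not the study's data.
    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical phenometrics: dormancy onset (day of year) and annual GPP
    dormancy_doy = np.array([270, 275, 280, 288, 295, 300, 310])
    annual_gpp = np.array([390, 400, 410, 450, 480, 500, 540])  # g C m-2 yr-1

    # Later dormancy (a longer growing season) should predict higher annual GPP
    res = stats.linregress(dormancy_doy, annual_gpp)
    print(f"slope = {res.slope:.2f} g C m-2 yr-1 per day, r^2 = {res.rvalue**2:.2f}")
    ```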

    A comparative study of the effects of four treatment regimes on ivermectin efficacy, body weight and pasture contamination in lambs naturally infected with gastrointestinal nematodes in Scotland

    Refugia-based drenching regimes have been widely recommended to slow the development of anthelmintic resistance, but there are few comparisons between different treatment approaches in the UK. The impact of four ivermectin treatment regimes on drug efficacy, lamb body weight and nematode contamination during a 154-day grazing season was evaluated in a consecutive five-year field study. The regimes were whole-flock treatment every 4 weeks (NST), targeted selective treatment (TST) based on individual performance, strategic whole-flock treatment at pre-determined times (SPT), and whole-flock treatment when clinical signs were apparent (MT). Mean numbers of ivermectin drenches administered per season were 4.0, 1.8, 2.0 and 1.4 for the NST, TST, SPT and MT groups, respectively. The mean anthelmintic efficacy (AE) for each treatment group was based on faecal egg count reduction post-treatment, estimated with a bootstrap-sampling-based algorithm. Mean AE was 95–98% for all groups in 2006, and mean AE (95% confidence limits) for NST declined to 62% (55%, 68%) in 2010. In comparison, AE for TST, SPT and MT in 2010 was 86% (81%, 92%), 86% (83%, 90%) and 83% (78%, 88%), respectively. Body weight in TST and SPT was similar to NST in all years (p > 0.05); however, MT lambs were lighter than NST lambs in 2006–2008 (p ≤ 0.04). Tracer lamb worm burdens were lowest in NST but were not significantly different among the other groups. Overall, both the TST and SPT regimes appeared to maintain animal performance and conserve anthelmintic efficacy compared with a neo-suppressive anthelmintic treatment regime.
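    The efficacy estimate described above lends itself to a worked sketch. Below is a minimal bootstrap estimate of faecal egg count reduction (FECR) in Python; the egg counts are hypothetical, and the study's exact resampling algorithm may differ in detail.
    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    pre = np.array([420, 310, 560, 280, 390, 450, 330, 500])  # eggs/g, pre-treatment
    post = np.array([30, 55, 80, 20, 60, 75, 40, 90])         # eggs/g, post-treatment

    def fecr(pre, post):
        # Percentage reduction in mean faecal egg count after treatment
        return 100.0 * (1.0 - post.mean() / pre.mean())

    # Resample lambs with replacement to get confidence limits for efficacy
    boot = []
    for _ in range(5000):
        idx = rng.integers(0, len(pre), len(pre))
        boot.append(fecr(pre[idx], post[idx]))

    low, high = np.percentile(boot, [2.5, 97.5])
    print(f"FECR = {fecr(pre, post):.1f}% (95% CL {low:.1f}%, {high:.1f}%)")
    ```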

    Staphylococcus aureus Bloodstream Infections in Older Adults: Clinical Outcomes and Risk Factors for In-Hospital Mortality

    In this retrospective review at the University of Michigan Health System, Ann Arbor, we assessed clinical outcomes and identified risk factors for mortality in adults aged 80 and older with Staphylococcus aureus bloodstream infection (SAB) between January 2004 and July 2008. Clinical data collected included comorbid conditions, SAB source, echocardiography results, Charlson Comorbidity Index, mortality (in-hospital and 6-month), and need for rehospitalization or chronic care after discharge. Seventy-six patients (mean age 85.5 ± 4.2) with SAB were identified. Infection sources included 14 (18.4%) vascular catheter associated, 16 (21.1%) wound related, seven (9.2%) endocarditis, five (6.6%) intravascular, and 19 (25%) of unknown source; 46 (60.5%) patients had methicillin-resistant strains. Twenty-two (28.9%) patients underwent surgery or device placement within 30 days of developing SAB; 10 of these 22 had SAB associated with surgical site infection (SSI). Twenty-two (28.9%) patients died in the hospital or were discharged to hospice care; at least 43 (56.6%) died within 6 months of presentation, and eight were lost to follow-up. An unknown source of bacteremia (odds ratio = 5.2, P = .008) was independently associated with in-hospital death. Echocardiography was not pursued in 45% of patients. Of surviving patients, 40 (74.1%) required skilled care after discharge and eight (20%) required rehospitalization. SAB was associated with high mortality in patients aged 80 and older. The observed association between SAB and SSI may direct preventive strategies such as perioperative decolonization or antimicrobial prophylaxis. Interventions to optimize clinical care practices in elderly patients with SAB are essential given the associated morbidity and mortality.
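    For illustration, a minimal sketch of how a crude odds ratio for in-hospital death by bacteremia source might be computed from a 2x2 table. The counts are hypothetical, and the paper's estimate of 5.2 came from an adjusted (multivariable) analysis, not a crude table like this one.
    ```python
    from scipy.stats import fisher_exact

    #                 [died, survived], hypothetical counts
    unknown_source = [10, 9]
    known_source = [12, 45]

    odds_ratio, p_value = fisher_exact([unknown_source, known_source])
    print(f"crude OR = {odds_ratio:.1f}, Fisher exact p = {p_value:.3f}")
    ```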

    Medical Care Capacity for Influenza Outbreaks, Los Angeles

    In December 1997, media reported hospital overcrowding and “the worst [flu epidemic] in the past two decades” in Los Angeles County (LAC). We found that rates of pneumonia and influenza deaths, hospitalizations, and claims were substantially higher for the 1997–98 influenza season than for the previous six seasons. Hours of emergency medical services (EMS) diversion (when emergency departments could not receive incoming patients) peaked during the influenza seasons studied; the number of EMS diversion hours per season also increased from the 1993–94 through the 1997–98 seasons, suggesting a decrease in medical care capacity during influenza seasons. Over the seven influenza seasons studied, the number of licensed beds decreased 12% while the LAC population increased 5%, a decline of roughly 16% in licensed beds per capita (0.88/1.05 ≈ 0.84). Our findings suggest that the capacity of health-care systems to handle patient visits during influenza seasons is diminishing.

    Disaster Risks Research and Assessment to Promote Risk Reduction and Management

    Natural hazard events lead to disasters when the events interact with exposed and vulnerable physical and social systems. Despite significant progress in the scientific understanding of the physical phenomena leading to natural hazards, as well as of vulnerability and exposure, disaster losses due to natural events show no tendency to decrease. This is associated with many factors, including increases in the populations and assets at risk and in the frequency and/or magnitude of natural events, especially those related to hydro-meteorological and climatic hazards. But essentially, disaster losses increase because some elements of the multidimensional, dynamic disaster risk system are not accounted for in risk assessments. A comprehensive, integrated system analysis and periodic assessment of disaster risks at any scale, from local to global, based on the knowledge and data/information accumulated so far, are essential scientific tools that can assist in the recognition and reduction of disaster risks. This paper reviews and synthesizes the knowledge of natural hazards, vulnerabilities, and disaster risks, and aims to highlight potential contributions of science to disaster risk reduction (DRR) in order to provide policy-makers with the knowledge necessary to assist disaster risk mitigation and disaster risk management (DRM).

    Demographic and biologic influences on survival in whites and blacks: 40 years of follow-up in the Charleston heart study

    BACKGROUND: In the United States, life expectancy is significantly lower among blacks than whites. We examined whether socioeconomic status (SES) and cardiovascular disease (CVD) risk factors may help explain this disparity. METHODS: Forty years (1961 through 2000) of all-cause mortality data were obtained on a population-based cohort of 2,283 subjects in the Charleston Heart Study (CHS). We examined the influence of SES and CVD risk factors on all-cause mortality. RESULTS: Complete data were available on 98% of the original sample (647 white men, 728 white women, 423 black men, and 443 black women). After adjusting for SES and CVD risk factors, the hazard ratios (HRs) for white ethnicity were 1.14 (0.98 to 1.32) among men and 0.90 (0.75 to 1.08) among women, indicating that mortality risk was 14% greater for white men and 10% lower for white women than for their black counterparts; however, these differences were not statistically significant. CONCLUSION: While there are marked contrasts in mortality between blacks and whites in the CHS, the differences can largely be explained by SES and CVD risk factors. A continued focus on improving and controlling cardiovascular disease risk factors may reduce ethnic disparities in survival.
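    A minimal sketch of the kind of adjusted survival analysis described above, using a Cox proportional hazards model from the lifelines library. The data frame is a tiny hypothetical stand-in for the CHS cohort, with made-up SES and blood pressure columns.
    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "years_followed": [12.0, 40.0, 8.5, 33.0, 25.0, 40.0, 17.5, 29.0],
        "died":           [1,    0,    1,   1,    1,    0,    1,    0],
        "white":          [1,    1,    0,   0,    1,    0,    1,    0],
        "ses_score":      [2,    3,    1,   2,    3,    1,    2,    2],    # SES proxy
        "systolic_bp":    [150,  128,  162, 145,  130,  158,  140,  135],  # CVD risk factor
    })

    # exp(coef) for "white" is the adjusted hazard ratio for white ethnicity
    cph = CoxPHFitter()
    cph.fit(df, duration_col="years_followed", event_col="died")
    print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
    ```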

    Utility of clinical comprehensive genomic characterization for diagnostic categorization in patients presenting with hypocellular bone marrow failure syndromes

    Bone marrow failure (BMF) related to hypoplasia of hematopoietic elements in the bone marrow is a heterogeneous clinical entity with a broad differential diagnosis that includes both inherited and acquired causes. Accurate diagnostic categorization is critical to optimal patient care, and detection of genomic variants in these patients may provide this important diagnostic and prognostic information. We performed real-time, accredited (ISO 15189) comprehensive genomic characterization, including targeted sequencing and whole exome sequencing, in 115 patients with BMF syndrome (median age 24 years, range 3 months to 81 years). In patients with clinical diagnoses of inherited BMF syndromes, acquired BMF syndromes, or clinically unclassifiable BMF, we detected variants in 52% (12/23), 53% (25/47), and 56% (25/45), respectively. Genomic characterization resulted in a change of diagnosis in 30/115 (26%) of patients, including the identification of germline causes in 3/47 and 16/45 cases with pre-test diagnoses of acquired and clinically unclassifiable BMF, respectively. The observed clinical impact of accurate diagnostic categorization included the decision to perform allogeneic stem cell transplantation, disease-specific targeted treatments, identification of at-risk family members, and influence on sibling allogeneic stem cell donor choice. Multiple novel pathogenic variants and copy number changes were identified in our cohort, including in TERT, FANCA, RPS7 and SAMD9. Whole exome sequence analysis facilitated the identification of variants in two genes not typically associated with a primary clinical manifestation of BMF, but also demonstrated reduced sensitivity for detecting low-level acquired variants. In conclusion, genomic characterization can improve the diagnostic categorization of patients presenting with hypoplastic BMF syndromes and should be routinely performed in this group of patients.
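    The detection and reclassification rates quoted above can be tallied directly; this small sketch simply recomputes the abstract's own proportions.
    ```python
    # (variants detected, patients tested) per pre-test diagnostic category
    categories = {
        "inherited BMF": (12, 23),
        "acquired BMF": (25, 47),
        "unclassifiable BMF": (25, 45),
    }

    for name, (detected, total) in categories.items():
        print(f"{name}: {detected}/{total} = {100 * detected / total:.0f}%")

    changed, cohort = 30, 115  # diagnoses changed by genomic characterization
    print(f"diagnosis changed: {changed}/{cohort} = {100 * changed / cohort:.0f}%")
    ```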