
    Where to restore ecological connectivity? Detecting barriers and quantifying restoration benefits

    Landscape connectivity is crucial for many ecological processes, including dispersal, gene flow, demographic rescue, and movement in response to climate change. As a result, governmental and non-governmental organizations are focusing efforts to map and conserve areas that facilitate movement to maintain population connectivity and promote climate adaptation. In contrast, little focus has been placed on identifying barriers—landscape features which impede movement between ecologically important areas—where restoration could most improve connectivity. Yet knowing where barriers most strongly reduce connectivity can complement traditional analyses aimed at mapping best movement routes. We introduce a novel method to detect important barriers and provide example applications. Our method uses GIS neighborhood analyses in conjunction with effective distance analyses to detect barriers that, if removed, would significantly improve connectivity. Applicable in least-cost, circuit-theoretic, and simulation modeling frameworks, the method detects both complete (impermeable) barriers and those that impede but do not completely block movement. Barrier mapping complements corridor mapping by broadening the range of connectivity conservation alternatives available to practitioners. The method can help practitioners move beyond maintaining currently important areas to restoring and enhancing connectivity through active barrier removal. It can inform decisions on trade-offs between restoration and protection; for example, purchasing an intact corridor may be substantially more costly than restoring a barrier that blocks an alternative corridor. And it extends the concept of centrality to barriers, highlighting areas that most diminish connectivity across broad networks. Identifying which modeled barriers have the greatest impact can also help prioritize error checking of land cover data and collection of field data to improve connectivity maps. Barrier detection provides a different way to view the landscape, broadening thinking about connectivity and fragmentation while increasing conservation options.
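The core scoring idea described above can be sketched in a few lines: on a resistance grid, score a candidate barrier cell by how much the least-cost (effective) distance between two habitat patches would drop if that cell were restored to low resistance. This is a toy illustration of the approach under stated assumptions, not the authors' implementation; the grid, resistance values, and function names are invented for the example.

```python
# Toy sketch of barrier scoring on a resistance grid (hypothetical values).
import heapq

def least_cost_distance(grid, start, goal):
    """Dijkstra over a 4-connected resistance grid; the cost of a move is
    the mean resistance of the two cells (a common least-cost convention)."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 0.5 * (grid[r][c] + grid[nr][nc])
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

def barrier_improvement(grid, cell, restored_resistance, start, goal):
    """Drop in effective (least-cost) distance if `cell` were restored."""
    base = least_cost_distance(grid, start, goal)
    r, c = cell
    saved = grid[r][c]
    grid[r][c] = restored_resistance  # temporarily "restore" the barrier
    restored = least_cost_distance(grid, start, goal)
    grid[r][c] = saved                # put the landscape back
    return base - restored

# A wall of high resistance (100) separates two patches of low resistance (1);
# cell (1, 2) is the candidate barrier to restore.
grid = [
    [1, 1, 100, 1, 1],
    [1, 1, 100, 1, 1],
    [1, 1, 100, 1, 1],
]
gain = barrier_improvement(grid, (1, 2), 1, start=(1, 0), goal=(1, 4))
print(gain)  # 99.0: restoring the cell cuts the least-cost distance from 103 to 4
```

In a real analysis the same improvement score would be computed in a moving window over every candidate cell, yielding a barrier-importance map that complements a corridor map.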

    From Isotopes to TK Interviews: Towards Interdisciplinary Research in Fort Resolution and the Slave River Delta, Northwest Territories

    Evolving research in Fort Resolution and the Slave River Delta, Northwest Territories, aims to improve understanding of how the natural ecosystem functions and responds to various environmental stressors, as well as to enhance the stewardship of natural resources and the capacity of local residents to respond to change. We seek to integrate approaches that span the natural and social sciences and traditional knowledge understandings of change, employing a research design developed in response to the concerns of a northern community. In doing so, we have strived for a research process that is collaborative, interdisciplinary, policy-oriented, and reflective of northern priorities. These elements characterize the new northern research paradigm increasingly promoted by various federal funding agencies, northern partners, and communities. They represent a holistic perspective in the pursuit of solutions to address complex environmental and socioeconomic concerns about impacts of climate change and resource development on northern societies. However, efforts to fulfill the objectives of this research paradigm are associated with a host of on-the-ground challenges. These challenges include (but are not restricted to) developing effective community partnerships and collaboration and documenting change through interdisciplinary approaches. Here we provide an overview of the components that comprise our interdisciplinary research program and offer an accounting of our formative experiences in confronting these challenges.

    Cost‐effectiveness of real‐world administration of tobacco pharmacotherapy in the United States Veterans Health Administration

    Background and aims: Cost-effectiveness studies in randomized clinical trials have shown that tobacco cessation pharmacotherapy is among the most cost-effective of health-care interventions. Clinical trial eligibility criteria and treatment protocols may not be followed in actual practice. This study aimed to determine whether tobacco cessation pharmacotherapy is cost-effective in real-world settings. Design: A retrospective analysis of costs and outcomes. Setting: Hospitals and clinics of the US Veterans Health Administration, USA. Participants: A total of 589 862 US veterans who screened positive for tobacco use in 2011. Intervention and comparator: Tobacco users who initiated smoking cessation pharmacotherapy in the 6 months after screening were compared with those who did not use pharmacotherapy in this period. Pharmacotherapy included nicotine replacement therapy, bupropion (if prescribed at 300 mg per day or specifically for tobacco cessation) or varenicline. Measures: Effectiveness was determined from responses to a subsequent tobacco screening conducted between 7 and 18 months after the treatment observation period. Cost of medications and prescribing health-care encounters was determined for the period between initial and follow-up tobacco use screening. Multivariate fixed-effects regression was used to assess the effect of initial treatment status on cost and outcome while controlling for differences in case-mix, with propensity weighting to adjust for confounding by indication. Findings: Thirteen per cent of participants received tobacco cessation pharmacotherapy within 6 months of initial screening. After an average of an additional 218.1 days' follow-up, those who initially received pharmacotherapy incurred $143.79 in additional treatment cost and had a 3.1% absolute increase in tobacco quit rates compared with those who were not initially treated. This represents an incremental cost-effectiveness ratio of $4705 per quit. The upper limit of the 99.9% confidence region was $5600 per quit. Without propensity adjustment, the cost-effectiveness ratio was $7144 per quit, with the upper limit of the 99.9% confidence region $9500 per quit. Conclusions: Tobacco cessation pharmacotherapy provided by the US Veterans Health Administration in 2011/12 was cost-effective in this real-world setting, with an incremental cost-effectiveness ratio of $4705 per quit.
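As a sanity check on the headline figure, an incremental cost-effectiveness ratio is simply incremental cost divided by incremental effect. The naive ratio of the unadjusted point estimates lands close to, but not exactly on, the paper's $4705 per quit, since the published figure comes from the propensity-weighted regression model:

```python
# Back-of-envelope ICER from the reported point estimates (my arithmetic,
# not the paper's adjusted model).
extra_cost = 143.79   # additional treatment cost per person, USD
extra_quits = 0.031   # 3.1% absolute increase in quit probability
icer = extra_cost / extra_quits
print(round(icer))    # 4638, i.e. roughly $4,600 per additional quit
```

The gap between this crude $4,638 and the reported $4,705 reflects the case-mix adjustment described in the Measures section.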

    Attributions of responsibility and blame for procrastination behavior

    The present study examined the relationship between procrastination, delay, blameworthiness, and moral responsibility. Undergraduate students (N = 240) were provided two scenarios in which the reason for inaction (procrastination, delay), the target (self, other), and the outcome (positive, negative) were manipulated, and students were asked to rate the moral responsibility and blameworthiness of the agent. Results indicated that individuals who procrastinated were seen as more morally responsible and blameworthy than those who experienced delay. More specifically, after a negative outcome, procrastination was associated with more moral responsibility, whereas delay was associated with less moral responsibility. After a positive outcome, individuals perceived procrastination as deserving of less moral responsibility, and delays as associated with more moral responsibility. Finally, a three-way interaction showed that participants rated procrastination that resulted in failure as deserving of responsibility when engaged in by others as opposed to oneself.

    Experimental Pharmacology in Transgenic Rodent Models of Alzheimer’s Disease

    This Mini Review discusses the merits and shortfalls of transgenic (tg) rodents modeling aspects of the human Alzheimer's disease (AD) pathology and their application to evaluate experimental therapeutics. It addresses some of the differences between mouse and rat tg models for these investigations. It relates, in a condensed fashion, the experience of our research laboratory with the application of anti-inflammatory compounds and S-adenosylmethionine (SAM) at the earliest stages of AD-like amyloid pathology in tg mice. The application of SAM was intended to revert the global brain DNA hypomethylation unleashed by the intraneuronal accumulation of amyloid-β-immunoreactive material, an intervention that restored levels of DNA methylation, including of the bace1 gene. This review also summarizes experimental pharmacology observations made in the McGill tg rat model of AD-like pathology by applying "nano-lithium" or a drug with allosteric M1 muscarinic and sigma 1 receptor agonistic properties (AF710B). Extremely low doses of lithium (up to 400 times lower than used in the clinic) had remarkable beneficial effects on lowering pathology and improving cognitive functions in tg rats. Likewise, AF710B treatment, even at advanced stages of the pathology, displayed remarkable beneficial effects. This drug, in experimental conditions, demonstrated possible "disease-modifying" properties, as pathology was frankly diminished and cognition improved after a month-long "wash-out" period. The Mini Review ends with a discussion on the predictive value of similar experimental pharmacological interventions in current rodent tg models. It comments on the validity of some of these approaches for early interventions at preclinical stages of AD, interventions which may be envisioned once definitive diagnosis of AD before clinical presentation is made possible.

    Risk factors for hospital admission with RSV bronchiolitis in England: a population-based birth cohort study.

    OBJECTIVE: To examine the timing and duration of RSV bronchiolitis hospital admission among term and preterm infants in England and to identify risk factors for bronchiolitis admission. DESIGN: A population-based birth cohort with follow-up to age 1 year, using the Hospital Episode Statistics database. SETTING: 71 hospitals across England. PARTICIPANTS: We identified 296 618 individual birth records from 2007/08 and linked to subsequent hospital admission records during the first year of life. RESULTS: In our cohort there were 7189 hospital admissions with a diagnosis of bronchiolitis, 24.2 admissions per 1000 infants under 1 year (95% CI 23.7-24.8), of which 15% (1050/7189) were born preterm (47.3 bronchiolitis admissions per 1000 preterm infants (95% CI 44.4-50.2)). The peak age group for bronchiolitis admissions was infants aged 1 month, and the median age was 120 days (IQR = 61-209 days). The median length of stay was 1 day (IQR = 0-3). The relative risk (RR) of a bronchiolitis admission was higher among infants with known risk factors for severe RSV infection, including those born preterm (RR = 1.9, 95% CI 1.8-2.0) compared with infants born at term. Other conditions also significantly increased risk of bronchiolitis admission, including Down's syndrome (RR = 2.5, 95% CI 1.7-3.7) and cerebral palsy (RR = 2.4, 95% CI 1.5-4.0). CONCLUSIONS: Most (85%) of the infants who are admitted to hospital with bronchiolitis in England are born at term, with no known predisposing risk factors for severe RSV infection, although risk of admission is higher in known risk groups. The early age of bronchiolitis admissions has important implications for the potential impact and timing of future active and passive immunisations. More research is needed to explain why babies born with Down's syndrome and cerebral palsy are also at higher risk of hospital admission with RSV bronchiolitis.

    Cardiac mTOR complex 2 preserves ventricular function in pressure-overload hypertrophy

    Mammalian target of rapamycin (mTOR), a central regulator of growth and metabolism, has tissue-specific functions depending on whether it is part of mTOR complex 1 (mTORC1) or mTORC2. We have previously shown that mTORC1 is required for adaptive cardiac hypertrophy and maintenance of function under basal and pressure-overload conditions. In the present study, we aimed to identify functions of mTORC2 in the heart. Using tamoxifen-inducible cardiomyocyte-specific gene deletion, we generated mice deficient for cardiac rapamycin-insensitive companion of mTOR (rictor), an essential and specific component of mTORC2. Under basal conditions, rictor deficiency did not affect cardiac growth and function in young mice and also had no effects in adult mice. However, transverse aortic constriction caused dysfunction in the rictor-deficient hearts, whereas function was maintained in controls after 1 week of pressure overload. Adaptive increases in cardiac weight and cardiomyocyte cross-sectional area, fibrosis, and hypertrophic and metabolic gene expression were not different between the rictor-deficient and control mice. In control mice, maintained function was associated with increased protein levels of rictor, protein kinase C (PKC)βII, and PKCδ, whereas rictor ablation abolished these increases. Rictor deletion also significantly decreased PKCε at baseline and after pressure overload. Our data suggest that reduced PKCε and the inability to increase PKCβII and PKCδ abundance are, in accordance with their known function, responsible for decreased contractile performance of the rictor-deficient hearts. Our study demonstrates that mTORC2 is implicated in maintaining contractile function of the pressure-overloaded male mouse heart.

    Food for contagion : synthesis and future directions for studying host-parasite responses to resource shifts in anthropogenic environments

    Human-provided resource subsidies for wildlife are diverse, common and have profound consequences for wildlife-pathogen interactions, as demonstrated by papers in this themed issue spanning empirical, theoretical and management perspectives from a range of study systems. Contributions cut across scales of organization, from the within-host dynamics of immune function, to population-level impacts on parasite transmission, to landscape- and regional-scale patterns of infection. In this concluding paper, we identify common threads and key findings from author contributions, including the consequences of resource subsidies for (i) host immunity; (ii) animal aggregation and contact rates; (iii) host movement and landscape-level infection patterns; and (iv) interspecific contacts and cross-species transmission. Exciting avenues for future work include studies that integrate mechanistic modelling and empirical approaches to better explore cross-scale processes, and experimental manipulations of food resources to quantify host and pathogen responses. Work is also needed to examine evolutionary responses to provisioning, and ask how diet-altered changes to the host microbiome influence infection processes. Given the massive public health and conservation implications of anthropogenic resource shifts, we end by underscoring the need for practical recommendations to manage supplemental feeding practices, limit human-wildlife conflicts over shared food resources and reduce cross-species transmission risks, including to humans. This article is part of the theme issue 'Anthropogenic resource subsidies and host-parasite dynamics in wildlife'.

    Implementation of the Tobacco Tactics intervention versus usual care in Trinity Health community hospitals

    Abstract Background Guided by the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) implementation framework, a National Institutes of Health-sponsored study compared the nurse-administered Tobacco Tactics intervention to usual care. A prior paper describes the effectiveness of the Tobacco Tactics intervention. This subsequent paper provides data describing the remaining constructs of the RE-AIM framework. Methods This pragmatic study used a mixed methods, quasi-experimental design in five Michigan community hospitals, of which three received the nurse-administered Tobacco Tactics intervention and two received usual care. Nurses and patients were surveyed pre- and post-intervention. Measures included reach (patient participation rates, characteristics, and receipt of services), adoption (nurse participation rates and characteristics), implementation (pre- to post-training changes in nurses' attitudes, delivery of services, barriers to implementation, opinions about training, documentation of services, and numbers of volunteer follow-up phone calls), and maintenance (continuation of the intervention once the study ended). Results Reach: Patient participation rates were 71.5%. Compared to no change in the control sites, there were significant pre- to post-intervention increases in self-reported receipt of print materials in the intervention hospitals (n = 1370, p < 0.001). Adoption: In the intervention hospitals, all targeted units and several non-targeted units participated; 76.0% (n = 1028) of targeted nurses and 317 additional staff participated in the training, and 92.4% were extremely or somewhat satisfied with the training. Implementation: Nurses in the intervention hospitals reported increases in providing advice to quit, counseling, medications, handouts, and DVDs (all p < 0.05) and reported decreased barriers to implementing smoking cessation services (p < 0.001). Qualitative comments were very positive ("user friendly," "streamlined," or "saves time"), although problems with showing patients the DVD and charting in the electronic medical record were noted. Maintenance: Nurses continued to provide the intervention after the study ended. Conclusions Given that nurses represent the largest group of front-line providers, this intervention, which meets Joint Commission guidelines for treating inpatient smokers, has the potential to have a wide reach and to decrease smoking, morbidity, and mortality among inpatient smokers. As we move toward more population-based interventions, the RE-AIM framework is a valuable guide for implementation. Trial registration: ClinicalTrials.gov, NCT0130921

    Climate change reduces extent of temperate drylands and intensifies drought in deep soils

    Drylands cover 40% of the global terrestrial surface and provide important ecosystem services. While drylands as a whole are expected to increase in extent and aridity in coming decades, temperature and precipitation forecasts vary by latitude and geographic region, suggesting different trajectories for tropical, subtropical, and temperate drylands. Uncertainty in the future of tropical and subtropical drylands is well constrained, whereas soil moisture and ecological droughts, which drive vegetation productivity and composition, remain poorly understood in temperate drylands. Here we show that, over the twenty-first century, temperate drylands may contract by a third, primarily converting to subtropical drylands, and that deep soil layers could be increasingly dry during the growing season. These changes imply major shifts in vegetation and ecosystem service delivery. Our results illustrate the importance of appropriate drought measures and, as a global study that focuses on temperate drylands, highlight a distinct fate for these highly populated areas.
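Dryland categories of this kind are conventionally delimited by the aridity index AI = P / PET (annual precipitation over potential evapotranspiration). The sketch below uses the standard UNEP aridity bands; note it is only illustrative, since the paper's temperate/subtropical distinction additionally depends on temperature regime, which this toy classifier omits.

```python
# Minimal dryland classifier using the standard UNEP aridity-index bands.
def aridity_class(p_mm, pet_mm):
    """Classify a site by aridity index AI = P / PET (both in mm/yr)."""
    ai = p_mm / pet_mm
    if ai < 0.05:
        return "hyper-arid"
    if ai < 0.20:
        return "arid"
    if ai < 0.50:
        return "semi-arid"
    if ai < 0.65:
        return "dry subhumid"
    return "humid (not dryland)"

# Hypothetical site: 250 mm precipitation against 1000 mm potential ET.
print(aridity_class(250, 1000))  # semi-arid
```

A projected drop in P or rise in PET that pushes AI across one of these thresholds is what the abstract describes as conversion between dryland types.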