
    Changes in North American mammal niche preferences from the late Pleistocene to the present

    The human population has grown exponentially since the last glaciation, especially across temperate areas with easy access to water, excluding mammal species from their former habitats. We therefore anticipate a change in environmental niche preferences for temperature and precipitation as growing human populations force mammal species into more extreme climates within their environmental tolerances. For our study, we collected occurrences of 59 North American mammal species from 20,000 ybp to the present. We inferred temperature and precipitation for each location using paleoclimate simulations (CCSM3). Overall, we found that mammals now live in areas that are warmer and drier on average, as mean annual temperatures rise and precipitation decreases. Their niches have changed significantly over the last 20,000 years for most climate variables except maximum average monthly temperature and minimum average monthly precipitation, which still maintain a hard limit on geographic boundaries. Our results suggest that although they avoid some climate extremes, including hot temperatures and dry climates, most mammals in our dataset adapt to new climate conditions instead of moving to new geographic areas. This could reflect high climatic niche plasticity, or geographic and anthropogenic dispersal limitations that prevent animals from migrating to new localities as human populations grow and the climate changes. Geographic models that integrate fossil and modern niche preferences and dispersal limitations will help elucidate the reasons behind the observed patterns. Moreover, understanding these patterns will help us formulate better conservation plans for the species we wish to protect.
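
    The abstract does not name a specific statistical test for the niche shifts, so the following is a minimal illustrative sketch in Python: climate values (simulated here, standing in for CCSM3-derived temperatures at occurrence points) are compared between a late-Pleistocene and a modern time bin with a Mann-Whitney U test. Both the numbers and the choice of test are assumptions for illustration.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical mean annual temperature (degrees C) at species occurrences,
# standing in for values extracted from CCSM3 paleoclimate layers.
rng = np.random.default_rng(0)
mat_pleistocene = rng.normal(8.0, 4.0, size=500)   # occurrences ~20,000 ybp
mat_modern = rng.normal(10.5, 4.0, size=500)       # modern occurrences

# Nonparametric comparison of the two occupied-climate distributions.
stat, p = mannwhitneyu(mat_pleistocene, mat_modern)
shift = np.median(mat_modern) - np.median(mat_pleistocene)
print(f"median occupied-temperature shift = {shift:.1f} C, p = {p:.2g}")
```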

    Trophically Integrated Ecometric Models as Tools for Demonstrating Spatial and Temporal Functional Changes in Mammal Communities

    We are in a modern biodiversity crisis that will restructure community compositions and ecological functions globally. Large mammals, important contributors to ecosystem function, have been affected directly by purposeful extermination and indirectly by climate and land-use changes, yet functional turnover is rarely assessed on a global scale using metrics based on functional traits. Using ecometrics, the study of functional trait distributions and functional turnover, we examine the relationship between vegetation cover and locomotor traits for artiodactyl and carnivoran communities. We show that the ability to detect a functional relationship is strengthened when locomotor traits of both primary consumers (artiodactyls, n = 157 species) and secondary consumers (carnivorans, n = 138 species) are combined into one trophically integrated ecometric model. Overall, locomotor traits of 81% of communities accurately estimate vegetation cover, establishing the advantage of trophically integrated ecometric models over single-group models (58 to 65% correct). We develop an innovative approach within the ecometrics framework, using ecometric anomalies to evaluate mismatches between model estimates and observed values and to provide more nuance for understanding relationships between functional traits and vegetation cover. We apply our integrated model to five paleontological sites to illustrate mismatches in the past and today and to demonstrate the utility of the model for paleovegetation interpretations. Observed changes in community traits and their associated vegetation across space and over time demonstrate the strong, rapid effect of environmental filtering on community traits. Ultimately, our trophically integrated ecometric model captures the cascading interactions between taxa, traits, and a changing environment.
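
    As a rough sketch of the ecometric idea (not the authors' implementation; all data simulated): community-mean locomotor traits of primary and secondary consumers define a two-dimensional trait space, each trait-space bin is assigned its most common observed vegetation class, and the anomaly is the difference between observed and estimated classes.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300  # hypothetical communities

# Simulated community-mean locomotor traits for artiodactyls (primary
# consumers) and carnivorans (secondary consumers), plus an observed
# vegetation-cover class (0 = open, 1 = mixed, 2 = closed).
art = rng.normal(0.0, 1.0, n)
carn = 0.6 * art + rng.normal(0.0, 0.8, n)
veg = np.clip(np.round((art + carn) / 2 + 1), 0, 2).astype(int)

def ecometric_estimate(x, y, labels, n_bins=8):
    """Assign each community the modal vegetation class of its trait-space bin."""
    bx = np.digitize(x, np.linspace(x.min(), x.max(), n_bins))
    by = np.digitize(y, np.linspace(y.min(), y.max(), n_bins))
    est = np.empty_like(labels)
    for i in range(len(x)):
        in_bin = (bx == bx[i]) & (by == by[i])
        est[i] = np.bincount(labels[in_bin]).argmax()
    return est

est = ecometric_estimate(art, carn, veg)
anomaly = veg - est  # nonzero = mismatch between observed and estimated cover
print(f"correctly estimated: {100 * np.mean(est == veg):.0f}% of communities")
```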

    Equilibrium responses of global net primary production and carbon storage to doubled atmospheric carbon dioxide: sensitivity to changes in vegetation nitrogen concentration

    We ran the terrestrial ecosystem model (TEM) for the globe at 0.5° resolution for atmospheric CO2 concentrations of 340 and 680 parts per million by volume (ppmv) to evaluate the sensitivity of global and regional responses of net primary production (NPP) and carbon storage under elevated CO2 to changes in vegetation nitrogen concentration. At 340 ppmv, TEM estimated global NPP of 49.0 × 10¹⁵ g (Pg) C yr⁻¹ and global total carbon storage of 1701.8 Pg C; the estimate of total carbon storage does not include the carbon content of inert soil organic matter. For the reference simulation, in which doubled atmospheric CO2 was accompanied by no change in vegetation nitrogen concentration, global NPP increased 4.1 Pg C yr⁻¹ (8.3%) and global total carbon storage increased 114.2 Pg C. To examine the sensitivity of the global responses of NPP and carbon storage to decreases in vegetation nitrogen concentration, we compared doubled-CO2 responses of the reference TEM to simulations in which vegetation nitrogen concentration was reduced without influencing decomposition dynamics ("lower N" simulations) and to simulations in which reductions in vegetation nitrogen concentration influence decomposition dynamics ("lower N+D" simulations). We conducted three lower N simulations and three lower N+D simulations, in which we reduced the nitrogen concentration of vegetation by 7.5, 15.0, and 22.5%. In the lower N simulations, the response of global NPP to doubled atmospheric CO2 increased approximately 2 Pg C yr⁻¹ for each incremental 7.5% reduction in vegetation nitrogen concentration; vegetation carbon increased approximately an additional 40 Pg C and soil carbon an additional 30 Pg C, for a total carbon storage increase of approximately 70 Pg C. In the lower N+D simulations, the responses of NPP and vegetation carbon storage were relatively insensitive to differences in the reduction of nitrogen concentration, but soil carbon storage showed a large change. The insensitivity of NPP in the lower N+D simulations occurred because potential enhancements in NPP associated with reduced vegetation nitrogen concentration were approximately offset by lower nitrogen availability associated with the decomposition dynamics of reduced litter nitrogen concentration. For each 7.5% reduction in vegetation nitrogen concentration, soil carbon increased approximately an additional 60 Pg C, while vegetation carbon storage increased by only approximately 5 Pg C. As the reduction in vegetation nitrogen concentration grows in the lower N+D simulations, more of the additional carbon storage tends to become concentrated in the north temperate-boreal region relative to the tropics. Other studies with TEM show that elevated CO2 more than offsets the effects of climate change to cause increased carbon storage. The results of this study indicate that carbon storage would be further enhanced by the influence of changes in plant nitrogen concentration on carbon assimilation and decomposition rates. Thus changes in vegetation nitrogen concentration may have important implications for the ability of the terrestrial biosphere to mitigate increases in the atmospheric concentration of CO2 and the climate changes associated with those increases.
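
    The quoted 8.3% follows directly from the two baseline numbers; a quick arithmetic check (values taken from the abstract, with the small discrepancy presumably due to rounding in the source):

```python
# Relative NPP response to doubled CO2, using the values quoted above.
base_npp = 49.0    # global NPP at 340 ppmv, Pg C per year
delta_npp = 4.1    # NPP increase at 680 ppmv, Pg C per year
print(f"{100 * delta_npp / base_npp:.1f}%")  # prints 8.4%; the abstract's 8.3%
                                             # presumably reflects unrounded values
```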

    User-centered Development of STOP (Successful Treatment for Paranoia): Material Development and Usability Testing for a Digital Therapeutic for Paranoia

    Background: Paranoia is a highly debilitating mental health condition. One novel intervention for paranoia is cognitive bias modification for paranoia (CBM-pa), which comes from a class of interventions that focus on manipulating interpretation bias. Here, we aimed to develop and evaluate new therapy content for CBM-pa for later use in a self-administered digital therapeutic for paranoia called STOP ("Successful Treatment of Paranoia"). Objective: This study aimed to (1) take a user-centered approach with input from living experts, clinicians, and academics to create and evaluate paranoia-relevant item content to be used in STOP and (2) engage with living experts and the design team from a digital health care solutions company to cocreate and pilot-test the STOP mobile app prototype. Methods: We invited 18 people with living or lived experience of paranoia to create text exemplars of personal, everyday emotionally ambiguous scenarios that could provoke paranoid thoughts. Researchers then adapted 240 suitable exemplars into corresponding intervention items in the format commonly used for CBM training and created 240 control items for the purpose of testing STOP. Each item included newly developed, visually enriching graphics to increase the engagement and realism of the basic text scenarios. All items were then evaluated for paranoia severity and readability by living experts (n=8) and clinicians (n=7), and for item length by the research team. Items were evenly distributed into six 40-item sessions based on these evaluations. Finalized items were presented in the STOP mobile app, which was co-designed with a digital health care solutions company, living experts, and the academic team; user acceptance was evaluated across 2 pilot tests involving living experts. Results: All materials reached predefined acceptable thresholds on all rating criteria (paranoia severity: intervention items ≥1, control items ≤1; readability: ≥3; and scenario length), and there was no systematic difference between the intervention and control materials overall or between individual sessions within each group. For item graphics, we also found no systematic differences in users' ratings of complexity (P=.68), attractiveness (P=.15), and interest (P=.14) between intervention and control materials. User acceptance testing found the mobile app easy to use and navigate, interactive, and helpful. Conclusions: Material development for any new digital therapeutic requires an iterative and rigorous process of testing involving multiple contributing groups. Appropriate user-centered development can create user-friendly mobile health apps, which may improve face validity and have a greater chance of being engaging and acceptable to the target end users.

    Quality of life outcomes for people with serious mental illness living in supported accommodation: Systematic review and meta-analysis

    Purpose: To conduct a systematic review and meta-analysis of quality of life (QoL) outcomes for people with serious mental illness living in three types of supported accommodation. Method: Studies describing QoL outcomes for people with serious mental illness living in supported accommodation were identified in 6 electronic databases. We applied a random-effects model to derive the meta-analytic results. Results: 13 studies from 7 countries were included, with 3276 participants receiving high support (457), supported housing (1576), or floating outreach (1243). QoL outcomes related to wellbeing, living conditions, and social functioning were compared between the supported accommodation types. Living condition outcomes were better for people living in supported housing (effect size = -0.31; CI = [-0.47; -0.16]) and floating outreach (effect size = -0.95; CI = [-1.30; -0.61]) compared to high support accommodation, with a medium effect size for living condition outcomes between supported housing and floating outreach (effect size = -0.40; CI = [-0.82; 0.03]), indicating that living conditions are better for people living in floating outreach. Social functioning outcomes were significant for people living in supported housing compared to high support (effect size = -0.37; CI = [-0.65; -0.09]), while wellbeing outcomes did not differ significantly between the three types of supported accommodation. Conclusion: There is evidence that satisfaction with living conditions differs across supported accommodation types. The results suggest a need to focus on improving social functioning and wellbeing outcomes for people with serious mental illness across supported accommodation types.
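
    The review states only that a random-effects model was used; below is a minimal sketch of one common choice, DerSimonian-Laird pooling. The study-level effects and variances are hypothetical placeholders, not values from the review.

```python
import numpy as np

# Hypothetical per-study effect sizes (standardized mean differences) and
# their sampling variances; not the actual studies from the review.
effects = np.array([-0.25, -0.50, -0.10, -0.40, -0.30])
variances = np.array([0.02, 0.05, 0.03, 0.04, 0.02])

w = 1.0 / variances                          # fixed-effect weights
fixed = np.sum(w * effects) / np.sum(w)
q = np.sum(w * (effects - fixed) ** 2)       # Cochran's Q heterogeneity statistic
df = len(effects) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)                # DerSimonian-Laird between-study variance

w_re = 1.0 / (variances + tau2)              # random-effects weights
pooled = np.sum(w_re * effects) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled effect = {pooled:.2f}, 95% CI [{pooled - 1.96 * se:.2f}, "
      f"{pooled + 1.96 * se:.2f}], tau^2 = {tau2:.3f}")
```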

    A Dual Function for Prickle in Regulating Frizzled Stability during Feedback-Dependent Amplification of Planar Polarity

    The core planar polarity pathway coordinates epithelial cell polarity during animal development, and loss of its activity gives rise to a range of defects, from aberrant morphogenetic cell movements to failure to correctly orient structures, such as hairs and cilia. The core pathway functions via a mechanism involving segregation of its protein components to opposite cell ends, where they form asymmetric intracellular complexes that couple cell-cell polarity. This segregation is a self-organizing process driven by feedback interactions between the core proteins themselves. Despite intense efforts, the molecular pathways underlying feedback have proven difficult to elucidate using conventional genetic approaches. Here we investigate core protein function during planar polarization of the Drosophila wing by combining quantitative measurements of protein dynamics with loss-of-function genetics, mosaic analysis, and temporal control of gene expression. Focusing on the key core protein Frizzled, we show that its stable junctional localization is promoted by the core proteins Strabismus, Dishevelled, Prickle, and Diego. In particular, we show that the stabilizing function of Prickle on Frizzled requires Prickle activity in neighboring cells. Conversely, Prickle in the same cell has a destabilizing effect on Frizzled. This destabilizing activity is dependent on the presence of Dishevelled and blocked in the absence of Dynamin and Rab5 activity, suggesting an endocytic mechanism. Overall, our approach reveals for the first time essential in vivo stabilizing and destabilizing interactions of the core proteins required for self-organization of planar polarity.

    Observational pain assessment in older persons with dementia in four countries: observer agreement of items and factor structure of the Pain Assessment in Impaired Cognition

    Background: Recognition of pain in people with dementia is challenging. Observational scales have been developed, but there is a need to harmonize and improve the assessment process. In the EU initiative COST Action TD1005, 36 promising items were selected from existing scales for further testing. We aimed to study the observer agreement for each item and to analyse the factor structure of the complete set. Methods: One hundred and ninety older persons with dementia were recruited in four countries (Italy, Serbia, Spain, and The Netherlands) from different types of healthcare facilities. Patients represented a convenience sample, with no pre-selection on the presence of (suspected) pain. The Pain Assessment in Impaired Cognition (PAIC, research version) item pool includes facial expressions of pain (15 items), body movements (10 items), and vocalizations (11 items). Participants were observed by health professionals in two situations: at rest and during movement. Intrarater and interrater reliability was analysed by percentage agreement. The factor structure was examined with principal component analysis with orthogonal rotation. Results: Health professionals performed observations in 40-57 patients in each country. Intrarater and interrater agreement was generally high (≥70%), although for some facial expression items agreement fell below 70%. Factor analyses showed a six-component solution; the components were named Vocal pain expression, Face anatomical descriptors, Protective body movements, Vocal defence, Tension, and Lack of affect. Conclusions: Observation of PAIC items can be done reliably in healthcare settings. Observer agreement was promising even without extensive training. Significance: In this international project, promising items from existing observational pain scales were identified and evaluated for their reliability as an alternative to pain self-report in people with dementia. Analysis of the factor structure helped to clarify the character of the items. Health professionals from four countries using four different European languages were able to rate items reliably. The results contributed to an informed reduction of items for a clinical observer scale (Pain Assessment in Impaired Cognition scale with 15 items: PAIC15).
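
    A compact sketch of the two analyses named above, run on simulated ratings (the scale points, component count, and data are placeholders): percentage agreement between two observers, and a principal component analysis followed by an orthogonal (varimax) rotation.

```python
import numpy as np
from sklearn.decomposition import PCA

def percent_agreement(rater_a, rater_b):
    """Share of observations scored identically by two observers, in percent."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    return 100.0 * np.mean(a == b)

def varimax(loadings, max_iter=100, tol=1e-6):
    """Orthogonal (varimax) rotation of a loading matrix."""
    p, k = loadings.shape
    rot = np.eye(k)
    crit = 0.0
    for _ in range(max_iter):
        lam = loadings @ rot
        u, s, vt = np.linalg.svd(
            loadings.T @ (lam**3 - lam @ np.diag((lam**2).sum(axis=0)) / p))
        rot = u @ vt
        if s.sum() < crit * (1 + tol):
            break
        crit = s.sum()
    return loadings @ rot

rng = np.random.default_rng(2)
# Simulated item scores: 190 observations x 36 candidate PAIC items, 0-3 scale.
scores = rng.integers(0, 4, size=(190, 36)).astype(float)

# Observer agreement on one item, with a second rater disagreeing on 10 cases.
rater_b = scores[:, 0].copy()
rater_b[:10] = (rater_b[:10] + 1) % 4
print(f"agreement: {percent_agreement(scores[:, 0], rater_b):.0f}%")

# Six-component PCA, then varimax rotation of the loadings.
pca = PCA(n_components=6).fit(scores)
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
rotated = varimax(loadings)
print(f"rotated loading matrix: {rotated.shape}")  # (36 items, 6 components)
```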

    Implementing nursing best practice guidelines: Impact on patient referrals

    Background: Although referring patients to community services is important for optimum continuity of care, referrals between hospital and community sectors are often problematic. Nurses are well positioned to inform patients about referral resources. The objective of this study is to describe the impact of implementing six nursing best practice guidelines (BPGs) on nurses' familiarity with patient referral resources and on their referral practices. Methods: A prospective before-and-after design was used. For each BPG topic, referral resources were identified, and information about these resources was presented at education sessions for nurses. Pre- and post-implementation questionnaires were completed by a random sample of 257 nurses at 7 hospitals, 2 home visiting nursing services, and 1 public health unit. Average response rates for the pre- and post-implementation questionnaires were 71% and 54.2%, respectively. Chart audits were completed for three BPGs (n = 421 pre- and 332 post-implementation). Post-hospital discharge patient interviews were conducted for four BPGs (n = 152 pre- and 124 post-implementation). Results: There were statistically significant increases in nurses' familiarity with resources for all BPGs, and in self-reported referrals to specific services for three guidelines. Higher rates of referrals were observed for services that were part of the organization where the nurses worked. There was almost a complete lack of referrals to Internet resources. No significant differences between pre- and post-implementation referral rates were observed in the chart documentation or in patients' reports of referrals. Conclusion: Implementing nursing BPGs that included recommendations on patient referrals produced mixed results. Nurses' familiarity with referral resources does not necessarily change their referral practices. Nurses can play a vital role in initiating and supporting appropriate patient referrals. BPGs should include specific recommendations on effective referral processes, and this information should be tailored to the community setting where implementation is taking place.
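
    The abstract does not specify which significance tests were used; as an illustrative sketch only, a pooled two-proportion z-test could compare pre- and post-implementation self-reported referral rates. The counts below are hypothetical.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled z statistic for comparing two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Hypothetical counts: nurses reporting referrals to a service before vs.
# after the BPG education sessions.
z = two_proportion_z(x1=90, n1=257, x2=130, n2=257)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests a significant change at alpha = .05
```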

    Long-lasting effects of land use history on soil fungal communities in second-growth tropical rain forests

    Our understanding of the long-lasting effects of human land use on soil fungal communities in tropical forests is limited. Yet, over 70% of all remaining tropical forests are growing in former agricultural or logged areas. We investigated the relationship among land use history, biotic and abiotic factors, and soil fungal community composition and diversity in a second-growth tropical forest in Puerto Rico. We coupled high-throughput DNA sequencing with tree community and environmental data to determine whether land use history had an effect on soil fungal community descriptors. We also investigated the biotic and abiotic factors that underlie such differences and asked whether the relative importance of biotic (tree diversity, basal tree area, and litterfall biomass) and abiotic (soil type, pH, iron, total carbon, water flow, and canopy openness) factors in structuring soil fungal communities differed according to land use history. We demonstrated long-lasting effects of land use history on soil fungal communities. At our research site, most of the explained variation in soil fungal composition (R² = 18.6%), richness (R² = 11.4%), and evenness (R² = 10%) was associated with edaphic factors. Areas previously subject to both logging and farming had a soil fungal community with lower beta diversity and greater evenness of fungal operational taxonomic units (OTUs) than areas subject to light logging. Yet, fungal richness was similar between the two areas of historical land use. Together, these results suggest that fungal communities in disturbed areas are more homogeneous and diverse than in areas subject to light logging. Edaphic factors were the most strongly correlated with soil fungal composition, especially in areas subject to light logging, where soils are more heterogeneous. High functional tree diversity in areas subject to both logging and farming led to stronger correlations between biotic factors and fungal composition than in areas subject to light logging. In contrast, fungal richness and evenness were more strongly correlated with biotic factors in areas of light logging, suggesting that these metrics might reflect long-term associations in old-growth forests. The large amount of unexplained variance in fungal composition suggests that these communities are structured by both stochastic and niche assemblage processes.
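
    The abstract reports variance explained (R²) in community composition; a common way to obtain such numbers is PERMANOVA on Bray-Curtis distances. The following is a minimal hand-rolled sketch on simulated OTU counts, not the authors' pipeline.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(3)
# Hypothetical OTU table: 20 soil samples x 100 fungal OTUs (counts).
otus = rng.poisson(5.0, size=(20, 100)).astype(float)
land_use = np.array([0] * 10 + [1] * 10)  # 0 = light logging, 1 = logged + farmed

dist = squareform(pdist(otus, metric="braycurtis"))

def permanova_f(d, groups):
    """PERMANOVA pseudo-F (Anderson 2001) from a square distance matrix."""
    n = len(groups)
    ss_total = (d**2).sum() / (2 * n)
    ss_within = 0.0
    for g in np.unique(groups):
        idx = np.flatnonzero(groups == g)
        ss_within += (d[np.ix_(idx, idx)] ** 2).sum() / (2 * len(idx))
    ss_between = ss_total - ss_within
    a = len(np.unique(groups))
    f = (ss_between / (a - 1)) / (ss_within / (n - a))
    return f, ss_between / ss_total

f_obs, r2 = permanova_f(dist, land_use)
perm_f = [permanova_f(dist, rng.permutation(land_use))[0] for _ in range(999)]
p = (1 + sum(f >= f_obs for f in perm_f)) / (1 + 999)
print(f"pseudo-F = {f_obs:.2f}, R^2 = {100 * r2:.1f}%, p = {p:.3f}")
```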