    insights into assembly processes

    Changes in species diversity following volcanic eruptions have been studied extensively, but our knowledge of functional diversity and community assembly under such conditions is very limited. Here, we study the processes following the destruction of vegetation after a volcanic eruption. Specifically, we investigate (1) the temporal patterns of taxonomic and functional alpha and beta diversity over time since a previous eruption, (2) the temporal patterns of 26 individual traits (vegetative characteristics, plant taxa ecological preferences, and regenerative characteristics), providing more detailed information on species strategies at the initial and later stages of succession, and (3) the processes driving species assembly and whether they changed over time since an eruption. We analyzed data recorded during five floristic censuses that took place between 1911 and 2011, calculated alpha and beta facets of taxonomic and functional diversity, and examined how community structure changed over time, using 26 functional characteristics selected for their ability to discern primary from later colonists, including longevity, growth form, Ellenberg's indicator values, seed production and weight, flower size and sex, pollination type, and dispersal mode. Null model analysis was used to test whether the observed functional diversity deviated from random expectations. Alpha diversity, both taxonomic and functional, increased over time after an eruption, while beta diversity did not display a clear trend. This finding indicates that mainly abiotic processes determine species assembly over time after an eruption (at least for the time span studied here), contrary to theoretical expectations. Interestingly, some aspects of diversity simultaneously indicated an effect of biotic interactions (facilitation and competition) on the assembly of species a few years after an eruption. This finding implies a legacy effect, since a high percentage of perennial species was observed in the assemblage right after the eruption, as well as an effect of the harsh environmental conditions on the assembly of the plant communities. In conclusion, our results indicate the role of legacy effects in succession (most probably through the survival of underground plant parts) and underline the importance of disturbance history in providing the context needed for understanding the effects of past events on succession.
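
    The null model test mentioned above can be illustrated with a short sketch. This is a minimal, hypothetical example rather than the authors' analysis: it assumes a presence/absence vector for one census and a standardized species-by-trait matrix, uses mean pairwise trait distance as the functional diversity metric, holds richness fixed while drawing random communities from the species pool, and summarizes the comparison as a standardized effect size.

```python
# Minimal null-model sketch (illustrative only): does observed functional
# diversity differ from random expectation? Assumes a presence/absence
# vector for one census and a species x trait matrix (toy data).
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

n_species, n_traits = 40, 5
traits = rng.normal(size=(n_species, n_traits))        # standardized trait values
present = rng.random(n_species) < 0.3                  # observed community (toy data)

def functional_diversity(mask, traits):
    """Mean pairwise trait distance among the species present."""
    sub = traits[mask]
    return pdist(sub).mean() if mask.sum() > 1 else 0.0

obs = functional_diversity(present, traits)

# Null model: keep richness fixed, draw species at random from the pool.
n_null = 999
richness = present.sum()
null = np.empty(n_null)
for i in range(n_null):
    idx = rng.choice(n_species, size=richness, replace=False)
    mask = np.zeros(n_species, dtype=bool)
    mask[idx] = True
    null[i] = functional_diversity(mask, traits)

ses = (obs - null.mean()) / null.std()                 # standardized effect size
p = (np.sum(null >= obs) + 1) / (n_null + 1)           # one-tailed p-value
print(f"observed FD={obs:.3f}  SES={ses:.2f}  p={p:.3f}")
```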

    Ecological strategies in stable and disturbed environments depend on species specialisation

    Ecological strategies are integral to understanding species survival in different environments. However, how habitat specialisation is involved in such strategies is not fully understood, particularly in heterogeneous and disturbed environments. Here, we studied the trait associations between specialisation, dispersal ability, competitiveness, reproductive investment, and survival rate in a spatially explicit metacommunity model under various disturbance rates. Although no unique trait values were associated with specialisation, trait relationships emerged that depended on environmental factors. We found strong trait associations mainly for generalist species, while specialist species exhibited a larger range of trait combinations. Trait associations were driven first by the disturbance rate and second by species' dispersal ability and generation overlap. With disturbance, low dispersal ability was strongly selected against, for specialists as well as for generalists. Our results demonstrate that habitat specialisation is critical for the emergence of trait strategies in metacommunities and that disturbance, in interaction with dispersal ability, limits not only the range of trait values but also the type of possible trait associations.
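
    As a rough illustration of the kind of model described above, the sketch below simulates a heavily simplified spatially explicit metacommunity: a ring of patches with two habitat types, one specialist and one generalist species, local dispersal, and random patch disturbance. All parameter values and the patch-occupancy formulation are invented for illustration and are far simpler than the model used in the study.

```python
# Minimal patch-occupancy sketch of a spatially explicit metacommunity with
# disturbance (illustrative only; not the model from the paper).
import numpy as np

rng = np.random.default_rng(1)
n_patches, steps, disturbance = 200, 500, 0.05
habitat = rng.integers(0, 2, n_patches)          # patch habitat type (0 or 1)

# hypothetical traits: preferred habitat (None = generalist), establishment
# probability, and dispersal radius in patches
species = {
    "specialist": {"habitat": 0, "establish": 0.9, "radius": 2},
    "generalist": {"habitat": None, "establish": 0.6, "radius": 2},
}
occupied = {name: rng.random(n_patches) < 0.1 for name in species}

def neighbours_occupied(occ, radius):
    """True where at least one occupied patch lies within `radius` (ring topology)."""
    out = np.zeros_like(occ)
    for shift in range(1, radius + 1):
        out |= np.roll(occ, shift) | np.roll(occ, -shift)
    return out

for _ in range(steps):
    for name, tr in species.items():
        occ = occupied[name]
        # colonization: a propagule arrives from a neighbouring occupied patch
        arrivals = neighbours_occupied(occ, tr["radius"]) & ~occ
        suitable = np.ones(n_patches, bool) if tr["habitat"] is None else (habitat == tr["habitat"])
        occ |= arrivals & suitable & (rng.random(n_patches) < tr["establish"])
        # disturbance: each occupied patch is cleared with the same probability
        occ &= ~(rng.random(n_patches) < disturbance)
        occupied[name] = occ

for name, occ in occupied.items():
    print(f"{name}: final occupancy {occ.mean():.2f}")
```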

    Electronic patient-reported outcomes monitoring during lung cancer chemotherapy: A nested cohort within the PRO-TECT pragmatic trial (AFT-39)

    Objectives: Patients with lung cancer have high symptom burden and diminished quality of life. Electronic patient-reported outcome (PRO) platforms deliver repeated longitudinal surveys via web or telephone to patients and alert clinicians about concerning symptoms. This study aims to determine the feasibility of electronic PRO monitoring in lung cancer patients receiving treatment in community settings. Methods: Adults receiving treatment for advanced or metastatic lung cancer at 26 community sites were invited to participate in a prospective trial of weekly electronic PRO symptom monitoring for 12 months (NCT03249090). Surveys assessing patients' satisfaction with the electronic PRO system were administered at 3 months. Descriptive statistics were generated for demographics, survey completion rates, symptom occurrence, and provider PRO alert management approaches. Pairwise relationships between symptom items were evaluated using intra-individual repeated-measures correlation coefficients. Results: Lung cancer patients (n = 118) participating in electronic PROs were older (mean 64.4 vs 61.9 years, p = 0.03), had worse performance status (p = 0.002), more comorbidities (p = 0.02), and less technology experience than patients with other cancers. Of the weekly PRO surveys delivered over 12 months, 91% were completed. Nearly all (97%) patients reported concerning (i.e., severe or worsening) symptoms during participation, with 33% of surveys including concerning symptoms. Pain was the most frequent and longest lasting symptom and was associated with reduced activity level. More than half of alerts to clinicians for concerning symptoms led to intervention. The majority (87%) would recommend electronic PRO monitoring to other lung cancer patients. Conclusions: Remote longitudinal weekly monitoring of patients with lung cancer using validated electronic PRO surveys was feasible in a multicenter, community-based pragmatic study. A high symptom burden specific to lung cancer was detected, and clinician outreach in response to alerts was frequent, suggesting that electronic PROs may be a beneficial strategy for identifying actionable symptoms and providing opportunities to optimize well-being in this population.
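
    The repeated-measures correlation mentioned in the Methods can be sketched as follows. This is an illustrative example only: the file name and column names are hypothetical, not the PRO-TECT dataset; it simply shows how an intra-individual correlation between two weekly symptom items could be computed with the rm_corr function from the pingouin package.

```python
# Sketch of an intra-individual repeated-measures correlation between two
# weekly symptom items (hypothetical data layout).
import pandas as pd
import pingouin as pg

# long-format data: one row per patient per weekly survey
df = pd.read_csv("weekly_pro_surveys.csv")   # hypothetical file
# columns assumed: patient_id, week, pain, activity  (symptom severity scores)

res = pg.rm_corr(data=df, x="pain", y="activity", subject="patient_id")
print(res[["r", "dof", "pval", "CI95%"]])
```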

    Negative perceptions of aging and decline in walking speed: A self-fulfilling prophecy

    Introduction: Walking speed is a meaningful marker of physical function in the aging population. While it is a primarily physical measure, experimental studies have shown that merely priming older adults with negative stereotypes about aging results in immediate declines in objective walking speed. What is not clear is whether this is a temporary experimental effect or whether negative aging stereotypes have detrimental effects on long-term objective health. We sought to explore the association between baseline negative perceptions of aging in the general population and objective walking speed 2 years later. Method: 4,803 participants were assessed over 2 waves of The Irish Longitudinal Study on Ageing (TILDA), a prospective, population-representative study of adults aged 50+ in the Republic of Ireland. Wave 1 measures (the Aging Perceptions Questionnaire, walking speed, and all covariates) were taken between 2009 and 2011. Wave 2 measures (a second measurement of walking speed and covariates) were collected 2 years later, between March and December 2012. Walking speed was measured as the number of seconds to complete the Timed Up-And-Go (TUG) task. Participants with a history of stroke, Parkinson's disease, or an MMSE < 18 were excluded. Results: After full adjustment for all covariates (age, gender, level of education, disability, chronic conditions, medications, global cognition, and baseline TUG), negative perceptions of aging at baseline were associated with slower TUG speed 2 years later (B = .03, 95% CI = .01 to .05, p < .01). Conclusions: Walking speed has previously been considered a consequence of physical decline, but these results highlight the direct role of psychological state in predicting an objective aging outcome. Negative perceptions about aging are a potentially modifiable risk factor for some elements of physical decline in aging.
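
    The fully adjusted model reported above can be sketched as an ordinary least-squares regression of wave 2 TUG time on baseline negative aging perceptions plus the covariates. The example below is illustrative only; the file and variable names are hypothetical placeholders, not the TILDA dataset.

```python
# Sketch of the fully adjusted regression: wave 2 TUG time on baseline
# negative aging perceptions plus covariates (hypothetical variable names).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tilda_waves.csv")   # hypothetical file, one row per participant

model = smf.ols(
    "tug_wave2 ~ neg_perceptions_wave1 + tug_wave1 + age + C(gender) "
    "+ C(education) + disability + chronic_conditions + medications + mmse",
    data=df,
).fit()

# coefficient and 95% CI for baseline negative perceptions of aging
print(model.params["neg_perceptions_wave1"])
print(model.conf_int().loc["neg_perceptions_wave1"])
```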

    Translating land cover/land use classifications to habitat taxonomies for landscape monitoring: A Mediterranean assessment

    Periodic monitoring of biodiversity changes at a landscape scale constitutes a key issue for conservation managers. Earth observation (EO) data offer a potential solution, through direct or indirect mapping of species or habitats. Most national and international programs rely on the use of land cover (LC) and/or land use (LU) classification systems. Yet these are not as clearly relatable to biodiversity as habitat classifications, and they provide less scope for monitoring. While a conversion from LC/LU classification to habitat classification can be of great utility, differences in definitions and criteria have so far limited the establishment of a unified approach for such translation between these two classification systems. Focusing on five Mediterranean Natura 2000 sites, this paper considers the scope for three of the most commonly used global LC/LU taxonomies (CORINE Land Cover, the Food and Agriculture Organization (FAO) Land Cover Classification System (LCCS), and the International Geosphere-Biosphere Programme (IGBP) classification) to be translated to habitat taxonomies. Through both quantitative analysis and expert-knowledge-based qualitative analysis of the selected taxonomies, FAO-LCCS emerges as the best candidate to cope with the complexity of habitat description: it provides a framework for integrating EO and in situ data for habitat mapping, reducing uncertainties and class overlaps and bridging the gap between the LC/LU and habitat domains for landscape monitoring, a major issue for conservation. This study also highlights the need to modify the FAO-LCCS hierarchical class description process to permit the addition of attributes based on class-specific expert knowledge, in order to select multi-temporal (seasonal) EO data and improve classification. An application of LC/LU-to-habitat mapping is provided for a coastal Natura 2000 site, with high classification accuracy as a result.
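
    The core of such a translation is a crosswalk table in which one LC/LU class may correspond to several candidate habitat classes. The sketch below is purely illustrative: the class codes and correspondences are invented placeholders, not the translation rules developed in this study; it only shows the one-to-many lookup structure such a crosswalk requires.

```python
# Minimal sketch of an LC/LU-to-habitat crosswalk (invented placeholder codes).
from typing import Dict, List

# hypothetical crosswalk: LCCS-style LC/LU codes -> candidate habitat codes
CROSSWALK: Dict[str, List[str]] = {
    "A12/A1": ["woodland_broadleaved", "woodland_mixed"],
    "A24/B3": ["grassland_dry"],
    "B27/A6": ["coastal_dune"],
}

def translate(lc_class: str) -> List[str]:
    """Return candidate habitat classes for an LC/LU class (empty if unmapped)."""
    return CROSSWALK.get(lc_class, [])

for code in ["A12/A1", "B27/A6", "Z99"]:
    print(code, "->", translate(code) or "no habitat match; needs expert attributes")
```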

    Using classification and regression tree modelling to investigate response shift patterns in dentine hypersensitivity

    BACKGROUND: Dentine hypersensitivity (DH) affects people's quality of life (QoL). However, changes in the internal meaning of QoL, known as response shift (RS), may undermine longitudinal assessment of QoL. This study aimed to describe patterns of RS in people with DH using Classification and Regression Trees (CRT) and to explore the convergent validity of CRT with the then-test and ideals approaches. METHODS: Data from an 8-week clinical trial of mouthwashes for dentine hypersensitivity (n = 75), using the Dentine Hypersensitivity Experience Questionnaire (DHEQ) as the outcome measure, were analysed. CRT was used to examine 8-week changes in DHEQ total score as a dependent variable, with clinical status for DH and each DHEQ subscale score (restrictions, coping, social, emotional, and identity) as independent variables. Recalibration was inferred when the clinical change was not consistent with the DHEQ change score, using a minimally important difference for DHEQ of 22 points. Reprioritization was inferred from changes in the relative importance of each subscale to the model over time. RESULTS: Overall, 50.7% of participants experienced a clinical improvement in their DH after treatment and 22.7% experienced an important improvement in their quality of life. Thirty-six per cent shifted their internal standards downward and 14.7% upwards, suggesting recalibration. Reprioritization occurred over time among the social and emotional impacts of DH. CONCLUSIONS: CRT was a useful method to reveal both the types and nature of RS in people with a mild health condition and demonstrated convergent validity with design-based approaches to detect RS.
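
    A CRT model of this kind can be sketched with a standard regression tree. The example below is illustrative only: the file and column names are hypothetical rather than the trial dataset, and scikit-learn's DecisionTreeRegressor stands in for the CRT implementation used in the study.

```python
# Sketch of a CRT-style analysis: 8-week change in DHEQ total score modelled
# with a regression tree, using clinical change and DHEQ subscale scores as
# predictors (hypothetical column names).
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

df = pd.read_csv("dheq_trial.csv")   # hypothetical file, one row per participant

predictors = ["clinical_change", "restrictions", "coping", "social", "emotional", "identity"]
X, y = df[predictors], df["dheq_total_change"]

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=5, random_state=0).fit(X, y)
print(export_text(tree, feature_names=predictors))   # inspect the split structure
```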

    European Red List of Habitats Part 2. Terrestrial and freshwater habitats


    Elevational Patterns of Species Richness, Range and Body Size for Spiny Frogs

    Quantifying spatial patterns of species richness is a core problem in biodiversity theory. Spiny frogs of the subfamily Painae (Anura: Dicroglossidae) are widespread, but endemic to Asia. Using spiny frog distribution and body size data and a digital elevation model data set, we explored altitudinal patterns of spiny frog richness and quantified the effect of area on the richness pattern over a large altitudinal gradient (0–5000 m a.s.l.). We also tested two hypotheses: (i) that Rapoport's altitudinal effect is valid for the Painae, and (ii) that Bergmann's clines are present in spiny frogs. The species richness of Painae across four different altitudinal band widths (100 m, 200 m, 300 m, and 400 m) showed hump-shaped patterns along the altitudinal gradient in all cases. The altitudinal changes in species richness of the Paini and Quasipaini tribes further confirmed this finding, although the peak of Quasipaini species richness occurred at lower elevations than the maximum for Paini. Area did not explain a significant amount of variation in total or Paini species richness, but it did explain variation in Quasipaini richness. Five distinct groups across the altitudinal gradient were found. Species' altitudinal ranges did not expand with an increase in the midpoints of those ranges. A significant negative correlation between body size and elevation was found. Our findings demonstrate that Rapoport's altitudinal rule is not a compulsory attribute of spiny frogs and also suggest that Bergmann's rule is not generally applicable to amphibians. The study highlights a need to explore the underlying mechanisms of species richness patterns, particularly for amphibians in macroecology.
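
    The band-richness and Bergmann's-rule calculations described above can be sketched as follows. This is an illustrative example only: the species table and its column names are hypothetical; richness per 100 m band counts every species whose elevational range overlaps the band, and the body size versus elevation relationship is summarized with a Spearman correlation on range midpoints.

```python
# Sketch of altitudinal band richness and a body size vs. elevation test
# (hypothetical species table, not the spiny-frog dataset).
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("spiny_frogs.csv")   # assumed columns: species, min_elev, max_elev, svl_mm

bands = np.arange(0, 5000, 100)       # 100 m altitudinal bands from 0 to 5000 m
richness = [((df["min_elev"] <= lo + 100) & (df["max_elev"] >= lo)).sum() for lo in bands]
print(pd.Series(richness, index=bands).idxmax(), "m band has peak richness")

df["midpoint"] = (df["min_elev"] + df["max_elev"]) / 2
rho, p = spearmanr(df["midpoint"], df["svl_mm"])
print(f"body size vs elevation: rho={rho:.2f}, p={p:.3f}")
```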