    Plant Litter Quality Affects the Accumulation Rate, Composition, and Stability of Mineral-associated Soil Organic Matter

    Mineral-associated organic matter (MAOM) is a relatively large and stable fraction of soil organic matter (SOM). Plant litters with high rates of mineralization (high-quality litters) are hypothesized to promote the accumulation of MAOM more efficiently than plant litters with low rates of mineralization (low-quality litters), because rapidly mineralizing litters maximize the synthesis of microbial products and most MAOM is microbial-derived. However, the effect of litter quality on MAOM is inconsistent. We conducted four repeated short-term incubations (46 d each) of four plant litters (alfalfa, oats, maize, and soybean) in two low-carbon subsoils (sandy loam and silty loam), with and without nutrient addition. Our short-term incubations focused on the initial stage of litter decomposition, when litter quality has a measurable effect on mineralization rates. Plant litter quality had a much greater effect on litter-C mineralization rate and MAOM-C accumulation than did soil type or nutrient addition. Soils amended with high-quality oat and alfalfa litters accumulated more MAOM-C than soils amended with low-quality maize and soybean litters; however, they also mineralized more litter-C. As a result, the accumulation of MAOM-C per unit of litter-C mineralized was lower in soils amended with high- vs. low-quality litters (0.65 vs. 1.39 g MAOM-C accumulated per g C mineralized). Cellulose and hemicellulose indices of accumulated MAOM were greater for maize and soybean than for oats and alfalfa; however, most carbohydrates in MAOM were plant-derived regardless of litter quality. At the end of the incubations, more of the accumulated MAOM-N was potentially mineralizable in soils amended with high-quality litters. Nevertheless, most of the litter-C remained as residual litter; just 12% was mineralized to CO2 and 13% was transferred to MAOM. Our results demonstrate several unexpected effects of litter quality on MAOM stabilization, including the direct stabilization of plant-derived carbohydrates.
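
    As a rough, hedged illustration of the carbon-budget arithmetic behind these figures, the sketch below (Python, with placeholder numbers rather than measured values from the study) computes the fate of added litter-C and the accumulation efficiency expressed as g MAOM-C per g C mineralized.

```python
# Minimal sketch of the litter-C budget arithmetic described above.
# All numbers are illustrative placeholders, not measured values from the study.

def litter_c_budget(litter_c_added, co2_c_mineralized, maom_c_accumulated):
    """Return the fractions of added litter-C mineralized, transferred to MAOM,
    and remaining as residual litter, plus the MAOM-C accumulation efficiency."""
    frac_mineralized = co2_c_mineralized / litter_c_added
    frac_to_maom = maom_c_accumulated / litter_c_added
    frac_residual = 1.0 - frac_mineralized - frac_to_maom
    efficiency = maom_c_accumulated / co2_c_mineralized  # g MAOM-C per g C mineralized
    return frac_mineralized, frac_to_maom, frac_residual, efficiency

# Hypothetical 100 g litter-C addition reproducing the reported proportions:
# ~12% mineralized to CO2, ~13% transferred to MAOM, the rest left as residual litter.
frac_min, frac_maom, frac_res, eff = litter_c_budget(100.0, 12.0, 13.0)
print(f"mineralized: {frac_min:.0%}, to MAOM: {frac_maom:.0%}, residual: {frac_res:.0%}")
print(f"accumulation efficiency: {eff:.2f} g MAOM-C per g C mineralized")
```

    The 0.65 vs. 1.39 contrast follows because high-quality litters increase both MAOM-C accumulation (the numerator) and litter-C mineralization (the denominator), but the denominator more strongly.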

    Shifting attention in viewer- and object-based reference frames after unilateral brain injury

    The aims of the present study were to investigate the respective roles that object- and viewer-based reference frames play in reorienting visual attention, and to assess their influence after unilateral brain injury. To do so, we studied 16 right hemisphere injured (RHI) and 13 left hemisphere injured (LHI) patients. We used a cueing design that manipulates the location of cues and targets relative to a display comprising two rectangles (i.e., objects). Unlike previous studies with patients, we presented all cues at midline rather than in the left or right visual fields. Thus, in the critical conditions in which targets were presented laterally, reorienting of attention was always from a midline cue. Performance was measured for lateralized target detection as a function of viewer-based (contra- and ipsilesional sides) and object-based (requiring reorienting within or between objects) reference frames. As expected, contralesional detection was slower than ipsilesional detection for the patients. More importantly, objects influenced target detection differently in the contralesional and ipsilesional fields. Contralesionally, reorienting to a target within the cued object took longer than reorienting to a target at the same location in the uncued object, a finding consistent with object-based neglect. Ipsilesionally, the means were in the opposite direction. Furthermore, no significant difference in object-based influences was found between the patient groups (RHI vs. LHI). These findings are discussed in the context of the reference frames used in reorienting attention for target detection.

    Teaching Intercultural Competence in Translator Training

    In this position paper we define an interculturally competent translator as one who demonstrates a high level of intercultural knowledge, skills, attitude, and flexibility throughout his or her professional engagements. We argue that, to attain this goal in translator training, intercultural competence needs to be introduced into the curriculum explicitly and in a conceptually clear manner. We provide an overview of earlier attempts at discussing the role of intercultural communication in translator training curricula and discuss the various pedagogical and practical challenges involved. We also look at some future challenges, identifying increasing societal diversity as both a source of added urgency for intercultural training and a challenge to traditional, biculturally based notions of translators’ intercultural competence, and we argue for the central role of empathy. Finally, and importantly, we introduce the contributions to the special issue.

    Epigenetic Silencing of the Circadian Clock Gene CRY1 is Associated with an Indolent Clinical Course in Chronic Lymphocytic Leukemia

    Disruption of circadian rhythm is believed to play a critical role in cancer development. Cryptochrome 1 (CRY1) is a core component of the mammalian circadian clock, and we have previously shown its deregulated expression in a subgroup of patients with chronic lymphocytic leukemia (CLL). Using real-time RT-PCR in a cohort of 76 CLL patients and 35 normal blood donors, we now demonstrate that the differential CRY1 mRNA expression in high-risk (HR) CD38+/immunoglobulin variable heavy chain gene (IgVH) unmutated patients, as compared to low-risk (LR) CD38−/IgVH mutated patients, can be attributed to down-modulation of CRY1 in LR CLL cases. Analysis of the DNA methylation profile of the CRY1 promoter in a subgroup of 57 patients revealed that CRY1 expression in LR CLL cells is silenced by aberrant promoter CpG island hypermethylation. The methylation pattern of the CRY1 promoter proved to have high prognostic impact in CLL, with aberrant promoter methylation predicting a favourable outcome. CRY1 mRNA transcript levels did not change over time in the majority of patients for whom sequential samples were available for analysis. We also compared CRY1 expression in CLL with that in other lymphoid malignancies and observed epigenetic silencing of CRY1 in a patient with B cell acute lymphoblastic leukemia (B-ALL).
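
    The abstract does not state how relative CRY1 transcript levels were quantified; a common convention for real-time RT-PCR data is the 2^-ΔΔCt method, sketched below with purely hypothetical Ct values for illustration.

```python
# Hedged sketch: relative CRY1 mRNA quantification with the 2^-ddCt convention.
# The quantification method and all Ct values here are assumptions for illustration,
# not details taken from the study.

def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_calibrator, ct_ref_calibrator):
    """Fold change of the target gene (e.g. CRY1) in a patient sample relative to a
    calibrator (e.g. normal donors), each normalized to a reference (housekeeping) gene."""
    delta_ct_sample = ct_target_sample - ct_ref_sample
    delta_ct_calibrator = ct_target_calibrator - ct_ref_calibrator
    delta_delta_ct = delta_ct_sample - delta_ct_calibrator
    return 2 ** (-delta_delta_ct)

# Hypothetical Ct values; a higher Ct means less transcript detected.
fold_change = relative_expression(ct_target_sample=28.4, ct_ref_sample=20.1,
                                  ct_target_calibrator=26.0, ct_ref_calibrator=20.0)
print(f"CRY1 expression relative to donors: {fold_change:.2f}-fold")  # < 1 indicates down-modulation
```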

    A proteomic survival predictor for COVID-19 patients in intensive care

    Global healthcare systems are challenged by the COVID-19 pandemic. There is a need to optimize the allocation of treatment and resources in intensive care, as clinically established risk assessments such as the SOFA and APACHE II scores show only limited performance in predicting the survival of severely ill COVID-19 patients. Additional tools are also needed to monitor treatment, including experimental therapies in clinical trials. Because proteomics comprehensively captures human physiology, we speculated that it, combined with new data-driven analysis strategies, could produce a new generation of prognostic discriminators. We studied two independent cohorts of patients with severe COVID-19 who required intensive care and invasive mechanical ventilation. The SOFA score, Charlson comorbidity index, and APACHE II score showed limited performance in predicting COVID-19 outcome. Instead, the quantification of 321 plasma protein groups at 349 timepoints in 50 critically ill patients receiving invasive mechanical ventilation revealed 14 proteins whose trajectories differed between survivors and non-survivors. A predictor trained on proteomic measurements obtained at the first time point at maximum treatment level (i.e. WHO grade 7), weeks before the outcome, achieved accurate classification of survivors (AUROC 0.81). We tested the established predictor on an independent validation cohort (AUROC 1.0). The majority of proteins with high relevance in the prediction model belong to the coagulation system and the complement cascade. Our study demonstrates that plasma proteomics can give rise to prognostic predictors substantially outperforming current prognostic markers in intensive care.
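
    The abstract does not disclose the model class used for the predictor. As a minimal, hedged sketch of the general workflow it describes (train on one cohort of plasma protein quantities, evaluate on an independent cohort, report AUROC), the example below uses an L1-penalized logistic regression on placeholder data; the matrices, labels, and hyperparameters are assumptions, not the authors' pipeline.

```python
# Hedged sketch (not the authors' pipeline): fit a classifier on per-patient plasma
# protein quantities and evaluate it on an independent cohort via AUROC.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical data: rows = patients sampled at WHO grade 7, columns = protein groups.
X_train = rng.normal(size=(50, 321))   # discovery cohort
y_train = np.array([0, 1] * 25)        # 1 = survivor (placeholder labels)
X_valid = rng.normal(size=(24, 321))   # independent validation cohort
y_valid = np.array([0, 1] * 12)

# L1-penalized logistic regression as a stand-in for a sparse proteomic predictor.
model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1, max_iter=1000),
)
model.fit(X_train, y_train)

auroc = roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])
print(f"validation AUROC: {auroc:.2f}")  # ~0.5 here because the placeholder data are random
```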

    Evaluating the quality of social work supervision in UK children's services: comparing self-report and independent observations

    Understanding how different forms of supervision support good social work practice and improve outcomes for people who use services is nearly impossible without reliable and valid evaluative measures. Yet how best to evaluate the quality of supervision in different contexts remains a complicated and as-yet-unsolved challenge. In this study, we observed 12 social work supervisors in a simulated supervision session offering support and guidance to an actor playing the part of an inexperienced social worker facing a casework-related crisis. A team of researchers analyzed these sessions using a customized skills-based coding framework. In addition, 19 social workers completed a questionnaire about their supervision experiences as provided by the same 12 supervisors. According to the coding framework, the supervisors demonstrated relatively modest skill levels, and we found low correlations among different skills. In contrast, according to the questionnaire data, supervisors had relatively high skill levels, and we found high correlations among different skills. The findings imply that although self-report remains the simplest way to evaluate supervision quality, other approaches are possible and may provide a different perspective. However, developing a reliable independent measure of supervision quality remains a noteworthy challenge.
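
    As a minimal, hypothetical sketch of the measurement contrast reported here (low inter-skill correlations under independent observational coding, high inter-skill correlations under self-report), the example below simulates a shared "halo" component in self-report ratings; the skill names, scales, and data are illustrative assumptions only.

```python
# Hedged simulation of why self-reported skill ratings can inter-correlate more
# strongly than observer-coded ratings. All data here are synthetic placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
skills = ["empathy", "collaboration", "questioning", "focus_on_child"]  # hypothetical labels

# Observer-coded scores for 12 supervisors: skills rated largely independently.
observed = pd.DataFrame(rng.normal(3.0, 0.8, size=(12, 4)), columns=skills)

# Self-report ratings from 19 social workers: a shared "halo" component makes the
# skills look uniformly high and strongly inter-correlated.
halo = rng.normal(4.2, 0.5, size=(19, 1))
self_report = pd.DataFrame(halo + rng.normal(0.0, 0.2, size=(19, 4)), columns=skills)

upper = np.triu_indices(len(skills), k=1)  # off-diagonal skill pairs only
print("mean inter-skill correlation (observed):  ",
      round(observed.corr().values[upper].mean(), 2))
print("mean inter-skill correlation (self-report):",
      round(self_report.corr().values[upper].mean(), 2))
```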

    Clinical and virological characteristics of hospitalised COVID-19 patients in a German tertiary care centre during the first wave of the SARS-CoV-2 pandemic: a prospective observational study

    Purpose: Adequate patient allocation is pivotal for optimal resource management in strained healthcare systems, and requires detailed knowledge of clinical and virological disease trajectories. The purpose of this work was to identify risk factors associated with the need for invasive mechanical ventilation (IMV), to analyse viral kinetics in patients with and without IMV, and to provide a comprehensive description of the clinical course. Methods: A cohort of 168 hospitalised adult COVID-19 patients enrolled in a prospective observational study at a large European tertiary care centre was analysed. Results: Forty-four per cent (71/161) of patients required invasive mechanical ventilation (IMV). Shorter duration of symptoms before admission (aOR 1.22 per day less, 95% CI 1.10-1.37, p < 0.01) and a history of hypertension (aOR 5.55, 95% CI 2.00-16.82, p < 0.01) were associated with the need for IMV. Patients on IMV had higher maximal SARS-CoV-2 concentrations, slower decline rates, and longer viral shedding than non-IMV patients (33 days, IQR 26-46.75, vs 18 days, IQR 16-46.75, p < 0.01). Median duration of hospitalisation was 9 days (IQR 6-15.5) for non-IMV and 49.5 days (IQR 36.8-82.5) for IMV patients. Conclusions: Our results identify a short duration of symptoms before admission as a risk factor for severe disease that merits further investigation, and indicate different viral load kinetics in severely affected patients. The median duration of hospitalisation of IMV patients was longer than that described for acute respiratory distress syndrome unrelated to COVID-19.
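
    Adjusted odds ratios (aOR) such as those quoted above are typically estimated with a multivariable logistic regression of the need for IMV on candidate risk factors; the hedged sketch below illustrates that calculation on hypothetical data (the data frame and covariates are assumptions, not the study dataset).

```python
# Hedged sketch of estimating adjusted odds ratios for need-for-IMV via
# multivariable logistic regression. All data here are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 161
df = pd.DataFrame({
    "imv": rng.integers(0, 2, n),            # 1 = required invasive mechanical ventilation
    "symptom_days": rng.integers(1, 15, n),  # days of symptoms before admission
    "hypertension": rng.integers(0, 2, n),   # 1 = history of hypertension
    "age": rng.normal(62, 12, n),
})

fit = smf.logit("imv ~ symptom_days + hypertension + age", data=df).fit(disp=0)
aor = np.exp(fit.params)                                             # adjusted odds ratios
ci = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})  # 95% CIs on the OR scale
print(pd.concat([aor.rename("aOR"), ci], axis=1))
```

    Note that an odds ratio reported "per day less" (as for symptom duration above) is simply the reciprocal of the per-day-more odds ratio, i.e. exp(-β) for the fitted coefficient β.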