
    The SLS-Berlin: Validation of a German Computer-Based Screening Test to Measure Reading Proficiency in Early and Late Adulthood

    Reading proficiency, i.e., the successful integration of early word-based information and its use in later processes of sentence and text comprehension, is the subject of extensive research, as is its assessment. However, screening tests for German adults across the life span are largely non-existent. The present article therefore introduces a standardized, computerized, sentence-based screening measure of reading proficiency for German adult readers, including norm data from 2,148 participants covering an age range from 16 to 88 years. The test was developed in accordance with the children's version of the Salzburger LeseScreening (SLS; Wimmer and Mayringer, 2014). The SLS-Berlin has high reliability and can easily be implemented in any research setting using the German language. We present a detailed description of the test and report the distribution of SLS-Berlin scores for the norm sample as well as for two subsamples of younger (below 60 years) and older adults (60 years and older). For all three samples, we conducted regression analyses to investigate the relationship between sentence characteristics and SLS-Berlin scores. In a second validation study, SLS-Berlin scores were compared with two (pseudo)word reading tests, a test measuring attention and processing speed, and eye movements recorded during expository text reading. Our results confirm the SLS-Berlin's sensitivity to both early word decoding and later text-related comprehension processes. The test distinguished very well between skilled and less skilled readers, and also within the group of less skilled readers, and is therefore a powerful and efficient screening test for assessing interindividual levels of reading proficiency in German adults.

    A Diffusion Model Analysis

    Effects of stimulus length on reaction times (RTs) in the lexical decision task are the topic of extensive research. While slower RTs are consistently found for longer pseudo-words, a finding termed the word length effect (WLE), some studies found no effects for words, and yet others reported faster RTs for longer words. Moreover, the WLE depends on the orthographic transparency of a language, with larger effects in more transparent orthographies. Here we investigate processes underlying the WLE in lexical decision in German-English bilinguals using a diffusion model (DM) analysis, which we compared to a linear regression approach. In the DM analysis, RT-accuracy distributions are characterized by parameters that reflect latent sub-processes, in particular evidence accumulation and decision-independent perceptual encoding, instead of typical measures such as mean RT and accuracy. The regression approach showed a decrease in RTs with length for pseudo-words, but no length effect for words. However, the DM analysis revealed that the null effect for words resulted from opposing effects of length on perceptual encoding and on the rate of evidence accumulation. Perceptual encoding times increased with length for words and pseudo-words, whereas the rate of evidence accumulation increased with length for real words but decreased for pseudo-words. A comparison between DM parameters in German and English suggested that orthographic transparency affects perceptual encoding, whereas effects of length on evidence accumulation are likely to reflect contextual information and the increase in available perceptual evidence with length. These opposing effects may account for the inconsistent findings on WLEs.
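
    The abstract does not report the exact fitting procedure, so the following is a purely illustrative sketch of how diffusion-model parameters can be recovered from summary statistics: a Python implementation of the closed-form EZ-diffusion equations (Wagenmakers et al., 2007) applied to hypothetical per-condition accuracy, RT variance, and mean RT. All names and numbers are placeholders, not the study's data or method.

    import numpy as np

    def ez_diffusion(prop_correct, rt_var, rt_mean, s=0.1):
        """Recover drift rate, boundary separation, and non-decision time from
        accuracy, correct-RT variance, and correct-RT mean (EZ-diffusion)."""
        # Edge correction: the equations are undefined at accuracies of 0, 0.5, or 1.
        pc = min(max(prop_correct, 1e-4), 1 - 1e-4)
        if abs(pc - 0.5) < 1e-4:
            pc += 1e-4
        logit = np.log(pc / (1 - pc))
        # Drift rate v: rate of evidence accumulation.
        x = logit * (logit * pc**2 - logit * pc + pc - 0.5) / rt_var
        v = np.sign(pc - 0.5) * s * x**0.25
        # Boundary separation a: response caution.
        a = s**2 * logit / v
        # Mean decision time; subtracting it from mean RT gives non-decision time Ter.
        y = -v * a / s**2
        mdt = (a / (2 * v)) * (1 - np.exp(y)) / (1 + np.exp(y))
        return v, a, rt_mean - mdt

    # Hypothetical condition (e.g., long pseudo-words); values are placeholders.
    v, a, ter = ez_diffusion(prop_correct=0.94, rt_var=0.035, rt_mean=0.72)
    print(f"drift={v:.3f}, boundary={a:.3f}, non-decision time={ter:.3f}s")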

    A Hierarchical Diffusion Model Analysis of Age Effects on Visual Word Recognition

    Reading is one of the most popular leisure activities and is routinely performed by most individuals even in old age. Successful reading enables older people to master and actively participate in everyday life and to maintain functional independence. Yet reading comprises a multitude of subprocesses and is undoubtedly one of the most complex accomplishments of the human brain. Not surprisingly, findings on age-related effects on word recognition and reading have been partly contradictory and are often confined to only one of four central reading subprocesses, i.e., sublexical, orthographic, phonological, and lexico-semantic processing. The aim of the present study was therefore to systematically investigate the impact of age on each of these subprocesses. A total of 1,807 participants (young, N = 384; old, N = 1,423) performed four decision tasks, each specifically designed to tap one of the subprocesses. To account for the behavioral heterogeneity in older adults, this subsample was split into high- and low-performing readers. Data were analyzed using a hierarchical diffusion modeling approach, which provides more information than standard response time/accuracy analyses. Taking into account correct and incorrect response times, their distributions, and accuracy data, hierarchical diffusion modeling allowed us to differentiate between age-related changes in decision threshold, non-decision time, and the speed of information uptake. We observed longer non-decision times for older adults and a more conservative decision threshold. More importantly, high-performing older readers outperformed younger adults in the speed of information uptake in orthographic and lexico-semantic processing, whereas a general age disadvantage was observed at the sublexical and phonological levels. Low-performing older readers were slowest in information uptake in all four subprocesses. Discussing these results in terms of computational models of word recognition, we propose that age-related disadvantages for older readers are caused by inefficiencies in temporal sampling and activation and/or inhibition processes.
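
    The abstract does not detail how the hierarchical diffusion model was fitted; as a rough illustration of the general approach, the sketch below uses the open-source HDDM Python toolbox to let drift rate, boundary separation, and non-decision time vary by age group and task. The file name and column names are hypothetical, and this is not the authors' analysis code.

    import hddm

    # Hypothetical long-format data: one row per trial, with columns
    # subj_idx, rt (seconds), response (1 = correct), age_group, task.
    data = hddm.load_csv("decision_tasks.csv")

    # Let drift rate (v), boundary separation (a), and non-decision time (t)
    # differ between age groups; drift rate additionally varies by task.
    model = hddm.HDDM(data, depends_on={"v": ["age_group", "task"],
                                        "a": "age_group",
                                        "t": "age_group"})
    model.find_starting_values()      # optimize a starting point for MCMC
    model.sample(5000, burn=1000)     # draw posterior samples
    model.print_stats()               # posterior summaries per parameter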

    Decision-Making in Teaching Processes and the Role of Mood: A Study with Preservice Teachers in Germany

    The internship that preservice teachers complete early in the course of their studies paves the way for their transition from the role of student to that of teacher. It gives them a first opportunity to apply theoretical knowledge and develop practical skills, especially to improve their decision-making competences in the three-part process of teaching: planning a lesson, teaching it, and reflecting on the teaching performance (PTR). The present study addresses two research questions. First, to what extent do preservice teachers perceive themselves to be more competent in PTR after their initial teaching internship? Second, to what extent does individual mood correlate with any reported improvement? A total of 592 preservice teachers participated in the study. Using latent change score modelling, we found learning gains in all three dimensions of PTR. In addition, the results show that negative mood predicts the processes of planning and reflecting following the internship, but has no effect on the actual teaching of the lesson.

    Structural gray matter features and behavioral preliterate skills predict future literacy – A machine learning approach

    When children learn to read, their neural system undergoes major changes to become responsive to print. There seem to be nuanced interindividual differences in the neurostructural anatomy of regions that later become integral parts of the reading network. These differences might affect literacy acquisition and, in some cases, might result in developmental disorders like dyslexia. Consequently, the main objective of this longitudinal study was to investigate those interindividual differences in gray matter morphology that might facilitate or hamper future reading acquisition. We used a machine learning approach to examine to what extent gray matter macrostructural features and cognitive-linguistic skills measured before formal literacy teaching could predict literacy 2 years later. Forty-two native German-speaking children underwent T1-weighted magnetic resonance imaging and psychometric testing at the end of kindergarten. They were tested again 2 years later to assess their literacy skills. A leave-one-out cross-validated machine-learning regression approach was applied to identify the best predictors of future literacy based on cognitive-linguistic preliterate behavioral skills and cortical measures in a priori selected areas of the future reading network. Future literacy was predicted with surprisingly high accuracy, predominantly based on gray matter volume in the left occipito-temporal cortex and local gyrification in the left insular, inferior frontal, and supramarginal gyri. Furthermore, phonological awareness significantly predicted future literacy. In sum, the results indicate that the brain morphology of the large-scale reading network at a preliterate age can predict how well children learn to read.
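
    The exact estimator behind the leave-one-out machine-learning regression is not specified above; as a minimal sketch of the cross-validation scheme only, the following Python example uses scikit-learn with a ridge regression on placeholder data standing in for the cortical and behavioral predictors (feature values, estimator choice, and sample layout are assumptions).

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Placeholder predictors: cortical measures plus preliterate behavioral scores
    # for 42 children; the outcome is a literacy score measured 2 years later.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(42, 10))
    y = 0.8 * X[:, 0] + rng.normal(scale=0.5, size=42)

    model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
    # Leave-one-out: each child is predicted by a model trained on the other 41.
    y_pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
    print("cross-validated prediction r =", round(np.corrcoef(y, y_pred)[0, 1], 2))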

    Barriers to integrating direct oral anticoagulants into anticoagulation clinic care: A mixed-methods study

    Background: Outpatient anticoagulation clinics were initially developed to care for patients taking vitamin K antagonists such as warfarin. There has not been a systematic evaluation of the barriers and facilitators to integrating direct oral anticoagulant (DOAC) care into outpatient anticoagulation clinics. Methods: We performed a mixed-methods study consisting of an online survey of anticoagulation clinic providers and semi-structured interviews with anticoagulation clinic leaders and managers between March and May of 2017. Interviews were transcribed and coded, exploring themes around barriers and facilitators to DOAC care within anticoagulation clinics. Survey questions pertaining to the specific themes identified in the interviews were analyzed using summary statistics. Results: Survey responses were collected from 159 unique anticoagulation clinics, and 20 semi-structured interviews were conducted. Three primary barriers to DOAC care in the anticoagulation clinic were described by the interviewees: (a) a lack of provider awareness of the ongoing monitoring and services provided by the anticoagulation clinic; (b) financial challenges to providing care to DOAC patients in an anticoagulation clinic model; and (c) clinical knowledge versus scope of care of the anticoagulation staff. These themes linked to three key areas of variation: (a) the size and hospital affiliation of the anticoagulation clinic; (b) the use of face-to-face versus telephone-based care; and (c) the use of nurses or pharmacists in the anticoagulation clinic. Conclusions: Anticoagulation clinics in the United States experience important barriers to integrating DOAC care. These barriers vary based on clinic size, the model for warfarin care, and staff credentials (nursing or pharmacy).

    Periprocedural bridging anticoagulation in patients with venous thromboembolism: A registry-based cohort study

    Background: Use of bridging anticoagulation increases a patient's bleeding risk without clear evidence of thrombotic prevention among warfarin-treated patients with atrial fibrillation. Contemporary use of bridging anticoagulation among warfarin-treated patients with venous thromboembolism (VTE) has not been studied. Methods: We identified warfarin-treated patients with VTE who temporarily stopped warfarin for a surgical procedure between 2010 and 2018 at six health systems. Using the 2012 American College of Chest Physicians guideline, we assessed use of periprocedural bridging anticoagulation based on recurrent VTE risk. Recurrent VTE risk and 30-day outcomes (bleeding, thromboembolism, emergency department visit) were each assessed using logistic regression adjusted for multiple procedures per patient. Results: During the study period, 789 warfarin-treated patients with VTE underwent 1529 procedures (median, 2; interquartile range, 1-4). Unadjusted use of bridging anticoagulation was more common in patients at high risk for VTE recurrence (99/171, 57.9%) than in patients at moderate (515/1078, 47.8%) or low risk of recurrence (134/280, 47.9%). Bridging anticoagulation use was higher in high-risk patients compared with low- or moderate-risk patients in both unadjusted (P = .013) and patient-level cluster-adjusted analyses (P = .031). Adherence to American College of Chest Physicians guidelines in high- and low-risk patients did not change during the study period (odds ratio, 0.98 per year; 95% confidence interval, 0.91-1.05). Adverse events were rare and not statistically different between the two treatment groups. Conclusions: Bridging anticoagulation was commonly overused among low-risk patients and underused among high-risk patients treated with warfarin for VTE. Adverse events were rare and did not differ between the two treatment groups.
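
    The abstract describes logistic regression adjusted for multiple procedures per patient without giving the exact specification; one common way to handle such clustering is a generalized estimating equations (GEE) logistic model, sketched below in Python with statsmodels on hypothetical procedure-level data (variable names and values are illustrative only, not the registry data).

    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical data: one row per procedure, multiple procedures per patient.
    df = pd.DataFrame({
        "bridged":    [1, 0, 1, 0, 0, 1, 1, 0, 1, 0],   # bridging anticoagulation used
        "high_risk":  [1, 1, 0, 0, 1, 0, 1, 0, 1, 0],   # high recurrent-VTE risk
        "patient_id": [1, 1, 2, 2, 3, 4, 4, 5, 6, 6],
    })

    # A GEE logistic model with an exchangeable working correlation accounts for
    # the correlation among repeated procedures within the same patient.
    model = smf.gee("bridged ~ high_risk", groups="patient_id", data=df,
                    family=sm.families.Binomial(),
                    cov_struct=sm.cov_struct.Exchangeable())
    print(model.fit().summary())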

    Creatinine monitoring patterns in the setting of direct oral anticoagulant therapy for non-valvular atrial fibrillation

    Guidelines and experts note that patients with atrial fibrillation require regular renal function monitoring to ensure safe use of direct oral anticoagulants (DOACs). Insufficient monitoring could lead to inappropriate dosing and adverse events. Our objective was to describe the frequency of insufficient creatinine monitoring among patients on DOACs and to describe clinical factors associated with insufficient monitoring. We hypothesized that renal impairment would be associated with insufficient monitoring. A retrospective cohort study was performed with data from the Michigan Anticoagulation Quality Improvement Initiative. Patients were included if they initiated DOAC therapy for stroke prevention related to atrial fibrillation, remained on therapy for ≥ 1 year, and had baseline creatinine and weight measurements. Creatinine clearance (CrCl) was calculated via the Cockcroft-Gault equation. Our outcome was the presence of insufficient creatinine monitoring, defined as < 1 creatinine level/year for patients with CrCl > 50, or < 2 creatinine levels/year for patients with CrCl ≤ 50. Multivariable analysis was done via logistic regression. The study population included 511 patients. Overall, 14.0% of patients received insufficient monitoring. Among patients with CrCl > 50, 11.5% had < 1 creatinine level/year. Among patients with CrCl ≤ 50, 27.1% received < 2 creatinine levels/year. Baseline renal dysfunction was associated with a higher likelihood of insufficient creatinine monitoring (adjusted odds ratio 3.64, 95% confidence interval 1.81-7.29). This reveals a significant gap in the monitoring of patients on DOACs; patients with renal impairment are already at higher risk for adverse events. Future studies are needed to describe the barriers to monitoring these patients and to identify how to optimally address them.
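
    The monitoring definition above depends on the Cockcroft-Gault estimate of creatinine clearance; the short Python sketch below shows that calculation and the study's sufficiency thresholds for a hypothetical patient (the function names and example values are illustrative, not study data).

    def cockcroft_gault_crcl(age_years, weight_kg, serum_creatinine_mg_dl, female):
        """Estimate creatinine clearance (mL/min) via the Cockcroft-Gault equation."""
        crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
        return crcl * 0.85 if female else crcl

    def monitoring_sufficient(crcl, creatinine_levels_per_year):
        # Study definition: >= 1 level/year if CrCl > 50, >= 2 levels/year if CrCl <= 50.
        required = 1 if crcl > 50 else 2
        return creatinine_levels_per_year >= required

    # Hypothetical patient: 78-year-old woman, 62 kg, serum creatinine 1.3 mg/dL.
    crcl = cockcroft_gault_crcl(78, 62, 1.3, female=True)
    print(f"CrCl = {crcl:.0f} mL/min, sufficient = {monitoring_sufficient(crcl, 1)}")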