To stay or go? A mixed methods study of psychiatry trainees’ intentions to leave training
This mixed methods study tested a tailored version of the job demands-resources (JD-R) model to uncover which factors contribute to psychiatry trainees' intentions to leave their training, and how. A Web-based survey measured psychiatry trainees' work conditions, well-being, occupational commitment, and intentions to leave training. The results were analyzed using structural equation modeling featuring validated constructs. Narrative interviews were analyzed using thematic analysis following the tailored JD-R model. Of 159 current London trainees who completed the questionnaire, 22.1% were thinking a lot about leaving training. Trainees with higher job demands, fewer resources, and less ability to detach from their work experienced higher burnout levels. More engaged and less burned-out trainees were more committed to their occupation and less inclined to leave training. The interviews identified that trainees' decision to leave was not linear and took time to make. Trainees found their work environment challenging and reported reduced well-being and rethinking their career paths. The JD-R model is a useful tool for understanding how medical trainees' job demands and resources need to be balanced to maintain their well-being and, in turn, how this affects their commitment to the occupation and to training.
Revisiting the D-RECT tool: Validation of an instrument measuring residents’ learning climate perceptions
Introduction: Credible evaluation of the learning climate requires valid and reliable instruments in order to inform quality improvement activities. Since its initial validation, the Dutch Residency Educational Climate Test (D-RECT) has been increasingly used to evaluate the learning climate, yet it has not been tested in its final form and at the actual level of use – the department.

Aim: Our aim was to re-investigate the internal validity and reliability of the D-RECT at the resident and department levels.

Methods: D-RECT evaluations collected during 2012–2013 were included. Internal validity was assessed using exploratory and confirmatory factor analyses. Reliability was assessed using generalizability theory.

Results: In total, 2306 evaluations and 291 departments were included. Exploratory factor analysis showed a 9-factor structure containing 35 items: teamwork, role of specialty tutor, coaching and assessment, formal education, resident peer collaboration, work is adapted to residents' competence, patient sign-out, educational atmosphere, and accessibility of supervisors. Confirmatory factor analysis indicated acceptable to good fit. Three resident evaluations were needed to assess the overall learning climate reliably, and eight to assess the subscales.

Conclusion: This study reaffirms the reliability and internal validity of the D-RECT in measuring the residency training learning climate. Ongoing evaluation of the instrument remains important.