
    Guidelines: the do’s, don’ts and don’t knows of feedback for clinical education

    Introduction: The guidelines offered in this paper aim to amalgamate the literature on formative feedback into practical Do’s, Don’ts and Don’t Knows for individual clinical supervisors and for the institutions that support clinical learning.
    Methods: The authors built consensus by an iterative process. Do’s and Don’ts were proposed based on the authors’ individual teaching experience and awareness of the literature; the amalgamated set of guidelines was then refined by all authors, and the evidence was summarized for each guideline. Don’t Knows were identified as questions important to this international group of educators which, if answered, would change practice. The criteria for inclusion of evidence were not those of a systematic review, so indicators of the strength of these recommendations were developed, combining the evidence with the authors’ consensus.
    Results: A set of 32 Do and Don’t guidelines, together with the important Don’t Knows, was compiled along with a summary of the evidence for each. These are divided into guidelines for the individual clinical supervisor giving feedback to their trainee (recommendations about both the process and the content of feedback) and guidelines for the learning culture (what elements of a learning culture support the exchange of meaningful feedback, and what elements constrain it?).
    Conclusion: Feedback is not easy to get right, but it is essential to learning in medicine, and there is a wealth of evidence supporting the Do’s and warning against the Don’ts. Further research into the critical Don’t Knows of feedback is required. A new definition is offered: helpful feedback is a supportive conversation that clarifies the trainee’s awareness of their developing competencies, enhances their self-efficacy for making progress, challenges them to set objectives for improvement, and facilitates their development of strategies to enable that improvement to occur.

    Action research: towards excellence in teaching, assessment and feedback for clinical consultation skills

    Background: Consultation skills are the core competencies required of a doctor at graduation as a practitioner. Every medical school has its own system of teaching and assessing consultation skills; these systems are generally amalgams of previous curricula and not rigorously developed. We took the opportunity presented by a new undergraduate medical curriculum to develop the consultation skills curriculum systematically, from classroom teaching to OSCE assessment and formative workplace-based assessment and feedback.
    Methods: The consultation skills curriculum and assessment system were developed by action research. Data were collected using mixed methods involving questionnaires, focus groups, participant interviews, student reflective summaries and routine monitoring of usage of an app we developed for generating feedback summaries in the clinical workplace. Participants were teachers and students at Keele University school of medicine. In addition, clinical tutors from seven other UK medical schools participated in a Delphi study of undergraduate medical consultation skills competencies.
    Results: A case study of curriculum development by action research is presented in nine published papers.
    Conclusion: This work has contributed to medical education knowledge as follows: an instrument for the assessment of consultation skills has been developed and validated, and a set of strategies for improving these consultation skills has been developed and validated. It has added to understanding about the transfer of learning from the classroom to the workplace; the impact of assessment grades on medical students’ learning and self-perception; and the value of a system of formal workplace-based assessment. Additionally, this work was one of the first applications of realist methods in medical education research, and it has developed guidance on workplace feedback for individual tutors and educational institutions.

    Can learning from workplace feedback be enhanced by reflective writing? A realist evaluation in UK undergraduate medical education

    Introduction: Doctors and medical students in the UK are currently required to provide evidence of learning by reflective writing on (among other things) feedback from colleagues. Although the theoretical value of reflecting-on-action is clear, research is still needed to know how to realise the potential of written reflection in medical education. This study arose out of efforts to improve medical student engagement with a reflective writing exercise. We used realist methodology to explain the disinclination of the majority to do written reflection on workplace feedback, and the benefits to the minority.
    Method: Realist evaluation is a suitable approach to researching complex interventions that have worked for some and not for others. Focus groups were held over a three-year period with year 3 and 4 students. Focus group transcripts were coded for context-mechanism-outcome configurations (the realist approach to analysing data) explaining students’ choice not to write a reflection, to write a ‘tick-box’ reflection, or to write for learning. A sub-set of eight students’ reflections was also analysed to ascertain evidence of learning through reflection.
    Results and Discussion: 27 students participated in four focus groups. Three summary theories emerged, showing the importance of context. Firstly, written reflection is effortful and benefits those who invest in it for intrinsic reasons, in situations when they need to think more deeply about a learning event. Secondly, following a reflective feedback discussion, writing a reflection may add little because the learning has already taken place. Thirdly, external motivation tends to result in writing a ‘tick-box’ reflection.

    Feasibility study of student telehealth interviews

    Background: The COVID-19 pandemic has led to medical students being taught remote clinical communication modalities (telephone and video). Junior students have not generally been included in this and have had less patient contact than previously. This study aimed to examine, from the junior student viewpoint, the feasibility of conducting patient telehealth interviews in both modalities.
    Methods: An electronic questionnaire was used to discover Year 1 students’ reasons for their preferred modality after they had conducted one telephone and one video interview, in pairs, with a patient volunteer. Student views on the challenges and benefits of each were also sought.
    Findings: A total of 55 students (32.7% of the cohort) responded, of whom 82% preferred video consultation, with 75.6% of those stating that being able to see their patient/partner was a key factor. About 5% preferred the telephone interview, and 13% had no preference. Telephone interviews were perceived as the more challenging (40% versus 12.7%); however, challenge did not directly correspond to lack of comfort. There were some technical/connectivity issues with both modalities, and the telephone call system was more complex to set up. Turn-taking was more difficult by telephone without visual cues.
    Discussion: This is the first study in junior medical students directly comparing real patient interviews by video and telephone. Students embraced the challenge and, although preferring video and finding telephone more challenging, valued each as an educational experience.
    Conclusions: Telehealth interviews with patients are feasible for junior students; they give needed patient exposure, provide practical insights into remote modalities, and consolidate communication skills learnt in the classroom.

    Understanding and developing procedures for video-based assessment in medical education

    Introduction: Novel uses of video aim to enhance assessment in health professionals’ education. Whilst these uses presume equivalence between video and live scoring, some research suggests that poorly understood variations could challenge validity. We aimed to understand examiners’ and students’ interaction with video whilst developing procedures to promote its optimal use.
    Methods: Using design-based research, we developed theory and procedures for video use in assessment, iteratively adapting conditions across simulated OSCE stations. We explored examiners’ and students’ perceptions using think-aloud, interviews and a focus group. Data were analysed using constructivist grounded-theory methods.
    Results: Video-based assessment produced detachment and reduced volitional control for examiners. Examiners’ ability to make valid video-based judgements was mediated by the interaction of station content and specifically selected filming parameters. Examiners displayed several judgemental tendencies which helped them manage video’s limitations but could also bias judgements in some circumstances. Students rarely found carefully placed cameras intrusive and considered filming acceptable if adequately justified.
    Discussion: Successful use of video-based assessment relies on balancing the need to ensure station-specific information adequacy, the avoidance of disruptive intrusion, and the degree of justification provided by video’s educational purpose. Video has the potential to enhance assessment validity and students’ learning when an appropriate balance is achieved.

    How well do UK assistantships equip medical students for graduate practice? Think EPAs

    The goal of better medical student preparation for clinical practice drives curricular initiatives worldwide. Learning theory underpins Entrustable Professional Activities (EPAs) as a means of safe transition to independent practice, and regulators mandate senior assistantships to improve practice readiness. It is therefore important to know whether meaningful EPAs occur in assistantships, and with what impact. Final-year students at one UK medical school kept learning logs and audio-diaries for six one-week periods during a year-long assistantship. Further data were obtained by interviewing participants while they were students and again after three months as junior doctors, combined with data from new doctors from 17 other UK schools. Realist methods explored what worked, for whom, and why. 32 medical students and 70 junior doctors participated. All assistantship students reported engaging with EPAs, but gaps exist in the types of EPAs undertaken, with the level of entrustment and frequency of access depending on the context. Engagement is enhanced by integration into the team and by a shared understanding of what constitutes legitimate activities. Improving the shared understanding between student and supervisor of what constitutes important assistantship activity may increase the amount and/or quality of EPAs achieved.

    Training healthcare professionals to be ready for practice in an era of social distancing: A realist evaluation

    Background: Programme changes due to the COVID-19 pandemic have had a variable impact on the preparation for practice of healthcare professional students. Explanations for this variability in outcomes between institutions and healthcare professions have yet to be explored. The aim of our study was to understand which elements of clinical learning under socially distanced restrictions worked, and why (or why not).
    Methods: We conducted a realist evaluation of the undergraduate healthcare programmes at one UK university in 2020-21. The initial programme theories to be tested were derived from discussions with programme leads about the changes they had implemented due to the pandemic. Study participants were students and teaching faculty. Online interview transcripts were coded to identify why the interventions in the programme had worked or not, resulting in a set of ‘context-mechanism-outcome’ (CMO) statements about each intervention. The initial programme theories were refined accordingly.
    Results and discussion: 29 students and 22 faculty members participated. 18 CMO configurations were identified relating to clinical skills learning and 25 relating to clinical placements. Clinical skills learning was successful, whether in person, remote or hybrid, when it followed the steps of demonstration, explanation, mental rehearsal, then attempt with feedback; where it did not work, there was usually a lack of observation and corrective feedback. Placements were generally highly valued despite gaps in experience, and being useful on placements was felt to be good preparation for practice. Participant explanations from junior students about the value of various modes of induction to clinical workplace activity may also be relevant post-pandemic.