3 research outputs found

    Express Team-Based Learning (eTBL): A Time-Efficient TBL Approach in Neuroradiology

    Rationale and Objectives: Team-based learning (TBL) is a student-centred, teacher-directed instructional method that promotes active learning. The application phase of TBL stimulates group discussion and critical thinking, which could be useful for learning radiology. We designed and evaluated two modified TBL sessions on computed tomography and magnetic resonance imaging diagnostics in neuroradiology. Our aim was to examine what effects engaging students in in-class team application tasks had on student learning.
    Materials and Methods: A cross-over study was conducted, including 105 third-year medical students, using two modified TBL sessions as the active learning intervention compared with two traditional lectures as a control. Student learning was assessed by results on the neuroradiology part of the end-of-year written examination. Student engagement and perceptions were assessed using the Student Self-Report of Engagement Measure and an additional four Likert-type items.
    Results: There were no statistically significant differences in student scores on the examination. Students reported high levels of engagement and were more satisfied overall with the TBL sessions than with traditional lectures. Students rated the TBL sessions higher than lectures on the ability to make difficult material comprehensible, to engage students, and to give them feedback.
    Conclusion: The modified TBL sessions halved in-class teaching time, and by omitting the readiness assurance tests, more in-class time was available for problem-solving of real clinical cases. Moreover, shorter sessions may ease implementation of TBL in the curriculum and allow for more frequent sessions. Students were more satisfied with eTBL than with lectures, and reported high levels of engagement.

    Examining the educational impact of the mini-CEX: a randomised controlled study

    Background: The purpose of this study is to evaluate the mini-Clinical Evaluation Exercise (mini-CEX) as a formative assessment tool among undergraduate medical students, in terms of student perceptions, effects on direct observation and feedback, and educational impact.
    Methods: Cluster randomised study of 38 fifth-year medical students during a 16-week clinical placement. Hospitals were randomised to provide a minimum of 8 mini-CEXs per student (intervention arm) or to continue with ad-hoc feedback (control arm). After finishing their clinical placement, students completed an Objective Structured Clinical Examination (OSCE), a written test, and a survey.
    Results: All participants in the intervention group completed the pre-planned number of assessments, and 60% found them useful during their clinical placement. Overall, there were no statistically significant differences between groups in the reported quantity or quality of direct observation and feedback. Observed mean scores were marginally higher on the OSCE and written test in the intervention group, but the differences were not statistically significant.
    Conclusions: There is considerable potential in assessing medical students during clinical placements and routine practice, but the educational impact of formative assessments remains mostly unknown. This study contributes a robust study design and may serve as a basis for future research.

    Improving assessment quality in professional higher education: Could external peer review be the answer?

    Summative assessment in professional higher education is important for student learning and for making sound decisions about advancement and certification. Despite rigorous pre-test quality assurance procedures, problematic assessment items are always discovered post-test. This article examines the implementation of external peer review of items by clinicians in a six-year undergraduate medical programme. The purpose of the article is to identify to what extent clinicians consider multiple choice items acceptable for use in examinations, and what comments they provide on items they believe should be revised or not used at all. 170 clinicians were recruited and reviewed 1353 multiple choice questions. Results showed that one out of five items reviewed was not approved. There were three main reasons for not approving items: (i) relevance of item content, (ii) accuracy of item content, and (iii) technical item-writing flaws. The article provides insight into a promising quality assurance procedure suitable for in-house examinations in professional higher education.