
    Interactive film scenes for tutor training in problem-based learning (PBL): dealing with difficult situations

    Abstract

    Background: In problem-based learning (PBL), tutors play an essential role in facilitating and efficiently structuring tutorials so that students can construct individual cognitive networks, and they have a significant impact on students' performance in subsequent assessments. The necessity of elaborate training to fulfil this complex role is undeniable. In the plethora of data on PBL, however, little attention has been paid to tutor training that promotes competence in moderating the specific difficult situations commonly encountered in PBL tutorials.

    Methods: Major interactive obstacles arising in PBL tutorials were identified from prior publications, and potential solutions were defined by an expert group. Video clips were produced addressing the tutor's role and providing exemplary solutions. These clips were embedded in a PBL tutor-training course at our medical faculty that combines PBL self-experience with a non-medical case. Trainees provided pre- and post-intervention self-efficacy ratings regarding their PBL-related knowledge, skills, and attitudes, as well as ratings of their acceptance of the video clips and the feasibility of integrating them into PBL tutor training (all items: 100 = completely agree, 0 = don't agree at all).

    Results: An interactive online tool for PBL tutor training was developed, comprising 18 video clips that highlight difficult situations in PBL tutorials and encourage trainees to develop and formulate their own intervention strategies. Subsequent sequences present potential interventions for each specific scenario, followed by a concluding discussion that addresses unresolved issues. The tool was well accepted and considered worth the time spent on it (81.62 ± 16.91; 62.94 ± 16.76). Tutors felt the videos prepared them well to respond to specific challenges in future tutorials (75.98 ± 19.46). The entire training, which comprised PBL self-experience and the video clips as integral elements, improved tutors' self-efficacy in dealing with problematic situations (pre: 36.47 ± 26.25, post: 66.99 ± 21.01; p < .0001) and significantly increased appreciation of PBL as a method (pre: 61.33 ± 24.84, post: 76.20 ± 20.12; p < .0001).

    Conclusions: The interactive tool with instructional video clips is designed to broaden the view of future PBL tutors in recognizing specific obstacles to functional group dynamics and developing individual intervention strategies. We show that the tool is well accepted and can be successfully integrated into PBL tutor training. Free access to the entire tool is provided at http://www.medizinische-fakultaet-hd.uni-heidelberg.de/fileadmin/PBLTutorTraining/player.swf.
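    The pre/post contrast reported above (self-efficacy on a 0-100 scale, p < .0001) is a within-subject comparison. The sketch below illustrates how such ratings could be compared with a paired t-test on simulated data; the abstract does not name its statistical procedure, and the cohort size, variable names, and numbers here are assumptions, not study data.

```python
# Illustrative sketch only: paired pre/post comparison of self-efficacy
# ratings on a 0-100 agreement scale, in the spirit of the result above.
# All numbers and names below are simulated assumptions, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trainees = 30  # hypothetical cohort size

# Simulated ratings (0 = don't agree at all, 100 = completely agree)
pre = np.clip(rng.normal(36, 26, n_trainees), 0, 100)
post = np.clip(pre + rng.normal(30, 15, n_trainees), 0, 100)

# Paired t-test: each trainee is rated before and after the training
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"pre:  {pre.mean():.2f} +/- {pre.std(ddof=1):.2f}")
print(f"post: {post.mean():.2f} +/- {post.std(ddof=1):.2f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.2g}")
```

    If the ratings are clearly non-normal, a non-parametric alternative such as the Wilcoxon signed-rank test (scipy.stats.wilcoxon) would serve the same purpose.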

    Exploring the validity and reliability of a questionnaire for evaluating virtual patient design with a special emphasis on fostering clinical reasoning

    Background: Virtual patients (VPs) are increasingly used to train clinical reasoning. So far, no validated evaluation instruments for VP design are available.

    Aims: We examined the validity of an instrument for assessing learners' perceptions of VP design.

    Methods: Three sources of validity evidence were examined: (i) content was examined based on the theory of clinical reasoning and by an international VP expert team; (ii) the response process was explored in think-aloud pilot studies with medical students and in content analyses of the free-text questions accompanying each item of the instrument; (iii) internal structure was assessed by exploratory factor analysis (EFA), and inter-rater reliability by generalizability analysis.

    Results: Content validity was reasonably supported by the theoretical foundation and the VP expert team. The think-aloud studies and the analysis of free-text comments supported the validity of the instrument. In the EFA, using 2547 student evaluations of a total of 78 VPs, a three-factor model showed a reasonable fit with the data. At least 200 student responses are needed to obtain a reliable evaluation of a VP on all three factors.

    Conclusion: The instrument has the potential to provide valid information about VP design, provided that many responses per VP are available.
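    The internal-structure step in this abstract (an exploratory factor analysis over several thousand item responses yielding a three-factor model) can be illustrated with a small simulation. The sketch below uses scikit-learn's FactorAnalysis on fabricated Likert-style data; it is not the authors' analysis, and the item count, factor structure, and all names are assumptions.

```python
# Illustrative sketch only: exploratory factor analysis (EFA) of simulated
# questionnaire items, mirroring the internal-structure analysis described
# above. The data, item count, and factor structure are assumptions.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n_responses, n_items, n_factors = 2500, 12, 3  # hypothetical sizes

# Simulate three latent design factors driving twelve Likert-style items
latent = rng.normal(size=(n_responses, n_factors))
true_loadings = rng.uniform(0.4, 0.9, size=(n_factors, n_items))
items = latent @ true_loadings + rng.normal(scale=0.5, size=(n_responses, n_items))

# Standardize the items, then fit a three-factor model with varimax rotation
X = StandardScaler().fit_transform(items)
efa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(X)

# Estimated loadings: rows = items, columns = factors; high values show
# which items cluster on which factor
print(np.round(efa.components_.T, 2))
```

    Estimating how many responses per VP are needed for stable scores on each factor (the generalizability analysis mentioned above) would require a variance-components model and is beyond this sketch.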