25 research outputs found

    An online spaced-education game to teach and assess medical students: a multi-institutional prospective trial

    To investigate whether a spaced-education (SE) game can be an effective means of teaching core content to medical students and a reliable and valid method of assessing their knowledge. This nine-month trial (2008-2009) enrolled students from three U.S. medical schools. The SE game consisted of 100 validated multiple-choice questions with explanations in preclinical/clinical domains. Students were e-mailed two questions daily. Adaptive game mechanics re-sent questions after three weeks if answered incorrectly or after six weeks if answered correctly. Questions expired if not answered on time (appointment dynamic). Students retired questions by answering each correctly twice consecutively (progression dynamic). Posting of relative performance fostered competition. Main outcome measures were baseline and completion scores. Seven hundred thirty-one students enrolled. Median baseline score was 53% (interquartile range [IQR] 16) and varied significantly by year (P<.001, dmax=2.08), school (P<.001, dmax=0.75), and gender (P<.001, d=0.38). Median completion score was 93% (IQR 12) and varied significantly by year (P=.001, dmax=1.12), school (P<.001, dmax=0.34), and age (P=.019, dmax=0.43). Scores did not differ significantly between years 3 and 4. Seventy percent of enrollees (513/731) requested to participate in future SE games. An SE game is an effective and well-accepted means of teaching core content and a reliable and valid method to assess student knowledge. SE games may be valuable tools to identify and remediate students who could benefit from additional educational support.
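    The adaptive mechanics described above amount to a simple spaced-repetition scheduler. The Python sketch below illustrates that logic under the abstract's stated rules (re-send after three weeks if incorrect, six weeks if correct; retire after two consecutive correct answers; two questions per day); the expiration rule (appointment dynamic) is omitted for brevity, and all names are illustrative, not taken from the study's software.

        from dataclasses import dataclass
        from datetime import date, timedelta

        # Intervals and rules from the abstract: re-send after 3 weeks if
        # answered incorrectly, 6 weeks if correctly; retire a question
        # after two consecutive correct answers; deliver two per day.
        RESEND_INCORRECT = timedelta(weeks=3)
        RESEND_CORRECT = timedelta(weeks=6)
        STREAK_TO_RETIRE = 2

        @dataclass
        class Question:
            qid: int
            next_send: date
            correct_streak: int = 0
            retired: bool = False

            def record_answer(self, answered_on: date, correct: bool) -> None:
                """Apply the adaptive game mechanics to one response."""
                if correct:
                    self.correct_streak += 1
                    if self.correct_streak >= STREAK_TO_RETIRE:
                        self.retired = True  # progression dynamic: question retires
                        return
                    self.next_send = answered_on + RESEND_CORRECT
                else:
                    self.correct_streak = 0  # streak must be consecutive
                    self.next_send = answered_on + RESEND_INCORRECT

        def questions_due(pool: list[Question], today: date, per_day: int = 2) -> list[Question]:
            """Select up to per_day unretired questions whose send date has arrived."""
            due = [q for q in pool if not q.retired and q.next_send <= today]
            return due[:per_day]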

    A randomized, controlled trial of team-based competition to increase learner participation in quality-improvement education

    Objective: Several barriers challenge resident engagement in learning quality improvement (QI). We investigated whether the incorporation of team-based game mechanics into an evidence-based online learning platform could increase resident participation in a QI curriculum.
    Design: Randomized, controlled trial.
    Setting: Tertiary-care medical center residency training programs.
    Participants: Resident physicians (n = 422) from nine training programs (anesthesia, emergency medicine, family medicine, internal medicine, ophthalmology, orthopedics, pediatrics, psychiatry, and general surgery) randomly allocated to a team competition environment (n = 200) or the control group (n = 222).
    Intervention: Specialty-based team assignment with leaderboards to foster competition, and alias assignment to de-identify individual participants.
    Main outcome measures: Participation in online learning, as measured by percentage of questions attempted (primary outcome) and additional secondary measures of engagement (e.g., response time). Changes in participation measures over time between groups were assessed within a repeated-measures ANOVA framework.
    Results: Residents in the intervention arm demonstrated greater participation than the control group. The percentage of questions attempted at least once was greater in the competition group (79% [SD ± 32] versus control, 68% [SD ± 37], P = 0.03). Median response time was faster in the competition group (P = 0.006). Differences in participation continued to increase over the duration of the intervention, as measured by average response time and cumulative percentage of questions attempted (each P < 0.001).
    Conclusions: Team competition increases resident participation in an online course delivering QI content. Medical educators should consider game mechanics to optimize participation when designing learning experiences.
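    As an illustration of the intervention arm's mechanics, the Python sketch below aggregates de-identified (aliased) participants into specialty teams and ranks the teams for a leaderboard. The field names, the sample data, and the ranking statistic (mean percentage of questions attempted) are assumptions for this example, not details reported by the trial.

        from collections import defaultdict

        # Rank specialty teams by mean percentage of questions attempted.
        # Participants appear only under aliases, never by name.
        def build_leaderboard(responses: list[dict]) -> list[tuple[str, float]]:
            by_team: dict[str, list[float]] = defaultdict(list)
            for r in responses:
                by_team[r["team"]].append(r["pct_attempted"])
            ranked = [(team, sum(v) / len(v)) for team, v in by_team.items()]
            return sorted(ranked, key=lambda t: t[1], reverse=True)

        # Hypothetical sample data for two specialty teams.
        sample = [
            {"alias": "BlueHeron", "team": "pediatrics", "pct_attempted": 82.0},
            {"alias": "RedFox", "team": "pediatrics", "pct_attempted": 76.0},
            {"alias": "GrayWolf", "team": "surgery", "pct_attempted": 71.0},
        ]
        print(build_leaderboard(sample))
        # [('pediatrics', 79.0), ('surgery', 71.0)]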