Novel Academic Tabletop 2022 (NAT22): A dynamic dice-based emergency medicine education tool
Gamification is an effective teaching tool that improves engagement and knowledge retention. Tabletop role-playing games are dynamic games that use random chance and foster player/leader partnership. To date, there are no teaching tools that mimic dynamic or unpredictable patient presentations. This style of game may work well as a tool for medical education in a simulation-based modality. In this report, we document the rules, materials, and training required to reproduce a hybrid game created to combine facets of simulation and tabletop role-playing games (TRPGs) to create a dynamic medical education tool. After testing the game for flaws and fluidity of gameplay, we plan to collect data evaluating emergency medicine residents' enjoyment and knowledge retention. In this article, we describe a novel TRPG simulation hybrid game that we hypothesize will improve learner enjoyment/engagement and have educational benefits similar to standard medical education.
Cross-cutting principles for planetary health education
Since the 2015 launch of the Rockefeller Foundation Lancet Commission on planetary health,1 an enormous groundswell of interest in planetary health education has emerged across many disciplines, institutions, and geographical regions. Advancing these global efforts in planetary health education will equip the next generation of scholars to address crucial questions in this emerging field and support the development of a community of practice. To provide a foundation for the growing interest and efforts in this field, the Planetary Health Alliance has facilitated the first attempt to create a set of principles for planetary health education that intersect education at all levels, across all scales, and in all regions of the world; that is, a set of cross-cutting principles.
Development of a lecture evaluation tool rooted in cognitive load theory: A modified Delphi study.
BACKGROUND: Didactics play a key role in medical education. There is no standardized didactic evaluation tool to assess quality and provide feedback to instructors. Cognitive load theory provides a framework for lecture evaluations. We sought to develop an evaluation tool, rooted in cognitive load theory, to assess the quality of didactic lectures. METHODS: We used a modified Delphi method to achieve expert consensus for items in a lecture evaluation tool. Nine emergency medicine educators with expertise in cognitive load participated in three modified Delphi rounds. In the first two rounds, experts rated the importance of including each item in the evaluation rubric on a 1 to 9 Likert scale, with 1 labeled as not at all important and 9 labeled as extremely important. In the third round, experts were asked to make a binary choice of whether the item should be included in the final evaluation tool. In each round, the experts were invited to provide written comments, edits, and suggested additional items. Modifications were made between rounds based on item scores and expert feedback. We calculated descriptive statistics for item scores. RESULTS: We completed three Delphi rounds, each with a 100% response rate. After Round 1, we removed one item, made major changes to two items, made minor wording changes to nine items, and modified the scale of one item. Following Round 2, we eliminated three items, made major wording changes to one item, and made minor wording changes to one item. After the third round, we made minor wording changes to two items. We also reordered and categorized items for ease of use. The final evaluation tool consisted of nine items. CONCLUSIONS: We developed a lecture assessment tool rooted in cognitive load theory specific to medical education. This tool can be applied to assess the quality of instruction and provide important feedback to speakers.
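The modified Delphi workflow described above (Likert ratings in Rounds 1-2, a binary include/exclude vote in Round 3, with descriptive statistics guiding item changes between rounds) can be sketched in code. This is a minimal illustration only: the cutoffs for retaining, revising, or removing an item, and the final vote threshold, are hypothetical assumptions, since the abstract does not report the decision rules used in the study.

```python
# Illustrative sketch of per-item scoring in a modified Delphi process.
# All thresholds (keep_at, drop_below, vote threshold) are assumed for
# demonstration and are NOT the criteria used in the study.
from statistics import mean, median

def summarize_round(ratings):
    """Descriptive statistics for one item's 1-9 importance ratings."""
    return {"mean": mean(ratings), "median": median(ratings)}

def round_decision(ratings, keep_at=7.0, drop_below=4.0):
    """Classify an item after an early rating round (hypothetical cutoffs)."""
    m = median(ratings)
    if m >= keep_at:
        return "retain"
    if m < drop_below:
        return "remove"
    return "revise"  # reword based on comments and re-rate next round

def final_inclusion(votes, threshold=0.75):
    """Round 3: binary include (1) / exclude (0) votes from the panel."""
    return sum(votes) / len(votes) >= threshold

# Example: nine experts rate one candidate item on the 1-9 scale.
item_ratings = [8, 9, 7, 8, 9, 8, 7, 9, 8]
print(summarize_round(item_ratings))  # mean ~8.11, median 8
print(round_decision(item_ratings))   # retain
print(final_inclusion([1, 1, 1, 1, 1, 1, 1, 0, 1]))  # True (8/9 voted include)
```

The interesting design choice is using the median rather than the mean for round decisions, which is common in Delphi analyses because it is robust to a single outlying rater on a small expert panel.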