Interactive Virtual Training: Implementation for Early Career Teachers to Practice Classroom Behavior Management

Abstract

Teachers who are equipped with the skills to manage and prevent disruptive behaviors increase the potential for their students to achieve academically and socially. Student success increases when prevention strategies and effective classroom behavior management (CBM) are implemented in the classroom. However, teachers with fewer than five years of experience, known as early career teachers (ECTs), are often ill-equipped to handle disruptive students. ECTs describe disruptive behaviors as a major source of stress given their limited training in CBM. As a result, ECTs report disruptive behaviors as one of the main reasons for leaving the field. Virtual training environments (VTEs), combined with advances in virtual social agents, can support the training of CBM. Although VTEs for teachers already exist, requirements to guide future research and development of similar training systems have not been defined. We propose a set of six requirements for VTEs for teachers. These requirements were established from a survey of the literature and from iterative lifecycle activities to build our own VTE for teachers. We present different evaluations of our VTE using methodologies and metrics we developed to assess whether all requirements were met. Our VTE simulates interactions with virtual animated students based on real classroom situations to help ECTs practice their CBM. We enhanced our classroom simulator to further explore two aspects of our requirements: interaction devices and emotional virtual agents. Interaction devices were explored by comparing the effects of immersive technologies on user experience (UX) measures such as presence, co-presence, engagement, and believability. We adapted our VTE, originally built for desktop computers, to be compatible with two immersive VR platforms. Results show that our VTE generates high levels of UX across all VR platforms. Furthermore, we enhanced our virtual students to display emotions through facial expressions, since current studies do not address whether emotional virtual agents provide the same level of UX across different VR platforms. We assessed the effects of VR platforms and the display of emotions on UX. Our analysis shows that facial expressions have a greater impact when using a desktop computer. We propose future work on immersive VTEs using emotional virtual agents.