Associating Facial Expressions and Upper-Body Gestures with Learning Tasks for Enhancing Intelligent Tutoring Systems
Learning involves a range of cognitive, social and emotional states. Recognizing and understanding these states in the context of learning is therefore key to designing informed interventions and addressing the needs of the individual student to provide personalized education. In this paper, we explore the automatic detection of learners’ nonverbal behaviors, including hand-over-face gestures, head and eye movements, and emotions expressed via facial expressions during learning. The proposed computer vision-based behavior monitoring method uses a low-cost webcam and can easily be integrated with modern tutoring technologies. We investigate these behaviors in depth over time in a 40-minute classroom session involving reading and problem-solving exercises. The exercises are divided into three difficulty levels: an easy, a medium and a difficult topic within the context of undergraduate computer science. We found a significant increase in head and eye movements both as time progresses and as the difficulty level increases. Hand-over-face gestures occur frequently (21.35% of the session on average), a behavior that remains unexplored in the education domain. We propose a novel deep learning approach for the automatic detection of hand-over-face gestures in images, achieving a classification accuracy of 86.87%. Hand-over-face gestures increase markedly with the difficulty level of the given exercise, and occur more frequently during problem-solving exercises (easy 23.79%, medium 19.84%, difficult 30.46%) than during reading (easy 16.20%, medium 20.06%, difficult 20.18%).
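The abstract describes a deep learning classifier for hand-over-face gestures but gives no implementation detail. As a purely illustrative stand-in (not the paper's architecture), the sketch below trains a simple logistic-regression classifier on synthetic feature vectors, standing in for features extracted from webcam frames; all data, dimensions and hyperparameters here are assumptions for demonstration only.

```python
import numpy as np

# Illustrative sketch only: the paper uses a deep network on images.
# Here, synthetic 4-dimensional "frame features" stand in for learned
# features, and logistic regression stands in for the classifier head.
rng = np.random.default_rng(0)

# Class 0 = no gesture, class 1 = hand-over-face (synthetic, separable).
X0 = rng.normal(loc=-1.0, scale=0.5, size=(100, 4))
X1 = rng.normal(loc=+1.0, scale=0.5, size=(100, 4))
X = np.vstack([X0, X1])
y = np.concatenate([np.zeros(100), np.ones(100)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Train weights with plain gradient descent on the logistic loss.
w = np.zeros(X.shape[1])
b = 0.0
lr = 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)
    w -= lr * (X.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

preds = (sigmoid(X @ w + b) > 0.5).astype(float)
accuracy = np.mean(preds == y)  # fraction of frames labeled correctly
```

In practice the feature extractor would be a convolutional network trained on labeled frames; the binary decision step at the end is the part this sketch illustrates.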
Sit still and pay attention: Using the Wii Balance-Board to detect lapses in concentration in children during psychophysical testing.
During psychophysical testing, a loss of concentration can cause observers to answer incorrectly, even when the stimulus is clearly perceptible. Such lapses limit the accuracy and speed of many psychophysical measurements. This study evaluates an automated technique for detecting lapses based on body movement (postural instability). Thirty-five children (8-11 years of age) and 34 adults performed a typical psychophysical task (orientation discrimination) while seated on a Wii Fit Balance Board: a gaming device that measures center of pressure (CoP). Incorrect responses on suprathreshold catch trials provided the "reference standard" measure of when lapses in concentration occurred. Children exhibited significantly greater variability in CoP on lapse trials, indicating that postural instability provides a feasible, real-time index of concentration. Limitations and potential applications of this method are discussed.
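The core idea above — greater CoP variability on lapse trials — suggests a simple per-trial rule: compute the variability of the CoP trace within each trial and flag trials that exceed a threshold. The sketch below is a minimal illustration of that rule; the trial traces and the threshold are invented for demonstration and are not values from the study.

```python
import statistics

# Hedged sketch: flag trials whose centre-of-pressure (CoP) variability
# exceeds a threshold as likely concentration lapses. All numbers below
# are synthetic, not data from the study.
def cop_variability(samples):
    """Sample standard deviation of CoP displacement within one trial."""
    return statistics.stdev(samples)

def flag_lapses(trials, threshold):
    """Return indices of trials whose CoP variability exceeds threshold."""
    return [i for i, s in enumerate(trials) if cop_variability(s) > threshold]

# Two steady trials and one "fidgety" trial (synthetic CoP traces, mm).
trials = [
    [0.10, 0.20, 0.10, 0.15, 0.12],
    [0.10, 0.10, 0.12, 0.11, 0.10],
    [0.10, 2.50, -1.80, 3.00, -2.20],  # large sway -> likely lapse
]
lapses = flag_lapses(trials, threshold=1.0)  # -> [2]
```

A real implementation would work on the board's raw two-axis CoP stream and calibrate the threshold per observer, but the decision step is the same.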