Layered evaluation of interactive adaptive systems: framework and formative methods
Peer reviewed | Postprint
Sensor-Free or Sensor-Full: A Comparison of Data Modalities in Multi-Channel Affect Detection
ABSTRACT Computational models that automatically detect learners' affective states are powerful tools for investigating the interplay of affect and learning. Over the past decade, affect detectors, which recognize learners' affective states at run-time using behavior logs and sensor data, have advanced substantially across a range of K-12 and postsecondary education settings. Machine learning-based affect detectors can be developed to utilize several types of data, including software logs, video/audio recordings, tutorial dialogues, and physical sensors. However, there has been limited research on how different data modalities combine and complement one another, particularly across different contexts, domains, and populations. In this paper, we describe work using the Generalized Intelligent Framework for Tutoring (GIFT) to build multi-channel affect detection models for a serious game on tactical combat casualty care. We compare the creation and predictive performance of models developed for two different data modalities: 1) software logs of learner interactions with the serious game, and 2) posture data from a Microsoft Kinect sensor. We find that interaction-based detectors outperform posture-based detectors for our population, but show high variability in predictive performance across different affective states. Notably, our posture-based detectors largely utilize predictor features drawn from the research literature, but do not replicate prior findings that these features lead to accurate detectors of learner affect.
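The modality comparison described in this abstract, training separate detectors on interaction-log features and on posture features and comparing their predictive performance, can be sketched as follows. This is a minimal illustration using synthetic placeholder data and scikit-learn, not the GIFT study's actual pipeline, features, or models.

```python
# Hypothetical sketch of a single-modality affect-detector comparison.
# All feature matrices and labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200

# Stand-ins for the two modalities: interaction-log features
# (e.g. response times, hint counts) and Kinect posture features
# (e.g. lean angle, movement magnitude).
X_logs = rng.normal(size=(n, 5))
X_posture = rng.normal(size=(n, 4))
y = rng.integers(0, 2, size=n)  # binary affect label (e.g. engaged vs. not)

def detector_auc(X, y):
    """Cross-validated AUC for a detector built from one modality."""
    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    return cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()

print("interaction-log detector AUC:", round(detector_auc(X_logs, y), 3))
print("posture detector AUC:", round(detector_auc(X_posture, y), 3))
```

With real labeled data, running the same cross-validation per affective state would expose the kind of per-state variability the abstract reports.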
Sensors Model Student Self Concept in the Classroom
Abstract. In this paper we explore findings from three experiments that use minimally invasive sensors with a web-based geometry tutor to create a user model. Minimally invasive sensor technology is mature enough to equip classrooms of up to 25 students with four sensors at the same time while using a computer-based intelligent tutoring system. The sensors, which are on each student's chair, mouse, monitor, and wrist, provide data about posture, movement, grip tension, arousal, and facially expressed mental states. This data may provide adaptive feedback to an intelligent tutoring system based on an individual student's affective states. The experiments show that when sensor data supplements a user model based on tutor logs, the model reflects a larger percentage of the students' self-concept than a user model based on the tutor logs alone. The models are further expanded to classify four ranges of emotional self-concept, including frustration, interest, confidence, and excitement, with over 78% accuracy. The emotional predictions are a first step for intelligent tutoring systems to create sensor-based personalized feedback for each student in a classroom environment. Bringing sensors to our children's schools addresses real problems of students' relationship to mathematics as they are learning the subject.
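The core claim here, that sensor features supplementing tutor-log features improve a self-concept classifier, amounts to comparing a model trained on log features alone against one trained on the concatenation of log and sensor features. A minimal sketch, again with synthetic placeholder data rather than the study's actual classroom dataset:

```python
# Hypothetical sketch: does adding sensor features to tutor-log features
# help a four-class self-concept classifier? Data below is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 150
X_tutor = rng.normal(size=(n, 6))   # stand-in tutor-log features
X_sensor = rng.normal(size=(n, 4))  # stand-in posture/grip/arousal features
# Four self-concept classes: frustration, interest, confidence, excitement.
y = rng.integers(0, 4, size=n)

def model_accuracy(X, y):
    """Cross-validated accuracy of a multinomial classifier."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X, y, cv=5).mean()

acc_logs_only = model_accuracy(X_tutor, y)
acc_combined = model_accuracy(np.hstack([X_tutor, X_sensor]), y)
print("logs only:", round(acc_logs_only, 3))
print("logs + sensors:", round(acc_combined, 3))
```

On random data the two accuracies hover near chance (0.25 for four classes); the abstract's finding is that on real classroom data the combined model wins.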