
    Robust Modeling of Epistemic Mental States

    This work identifies and advances several research challenges in relating facial features and their temporal dynamics to epistemic mental states in dyadic conversations. The epistemic states considered are Agreement, Concentration, Thoughtful, Certain, and Interest. In this paper, we perform a number of statistical analyses and simulations to identify the relationship between facial features and epistemic states. Non-linear relations are found to be more prevalent, while temporal features derived from the original facial features demonstrate a strong correlation with intensity changes. We then propose a novel prediction framework that takes facial features and their non-linear relation scores as input and predicts the different epistemic states in videos. Prediction of the epistemic states is boosted when the classification of emotion-change regions (rising, falling, or steady-state) is incorporated with the temporal features. The proposed predictive models predict the epistemic states with significantly improved accuracy: the correlation coefficient (CoERR) for Agreement is 0.827, for Concentration 0.901, for Thoughtful 0.794, for Certain 0.854, and for Interest 0.913. Comment: Accepted for publication in Multimedia Tools and Applications, Special Issue: Socio-Affective Technologie
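The abstract does not specify how the non-linear relation scores are computed. One common way to quantify a monotonic-but-non-linear relationship between a facial feature track and an epistemic-state intensity track is to compare rank (Spearman) and linear (Pearson) correlation; the sketch below is an illustration of that idea under this assumption, not the paper's actual method.

```python
# Sketch (assumed, not from the paper): a "nonlinearity score" for two
# equal-length signals. When Spearman clearly exceeds Pearson, the relation
# is monotonic but non-linear.

def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def ranks(v):
    """Rank transform (ties ignored for simplicity of the sketch)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

def nonlinearity_score(x, y):
    """Positive when the relation is stronger in rank space than linearly."""
    return abs(spearman(x, y)) - abs(pearson(x, y))
```

For a purely linear relation the score is near zero; for a convex monotonic one (e.g. `y = x**5`) it is clearly positive.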

    Do I Have Your Attention: A Large Scale Engagement Prediction Dataset and Baselines

    The degree of concentration, enthusiasm, optimism, and passion displayed by individuals while interacting with a machine is referred to as `user engagement'. Engagement comprises behavioral, cognitive, and affect-related cues. To create engagement prediction systems that work in real-world conditions, it is essential to learn from rich, diverse datasets. To this end, EngageNet, a large-scale, multi-faceted engagement-in-the-wild dataset, is proposed. Thirty-one hours of data from 127 participants, recorded under different illumination conditions, are collected. Thorough experiments explore the applicability of different features: action units, eye gaze, head pose, and MARLIN. Data from user interactions (question-answer) are analyzed to understand the relationship between effective learning and user engagement. To further validate the rich nature of the dataset, evaluation is also performed on the EngageWild dataset. The experiments demonstrate the usefulness of the proposed dataset. The code, models, and dataset link are publicly available at https://github.com/engagenet/engagenet_baselines
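The abstract lists several feature modalities (action units, gaze, head pose, MARLIN) without saying how they are combined. A common multimodal baseline is weighted late fusion of per-modality scores; the sketch below illustrates that pattern with hypothetical modality names, not the EngageNet baselines themselves.

```python
# Hypothetical late-fusion sketch: average per-modality engagement
# probabilities (modality names and weights are assumptions, not taken
# from the paper).

def fuse_engagement(scores, weights=None):
    """scores: dict modality -> probability of 'engaged' in [0, 1].
    Returns the weighted mean of the per-modality scores."""
    if weights is None:
        weights = {m: 1.0 for m in scores}  # equal weighting by default
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total
```

For example, `fuse_engagement({"au": 0.8, "gaze": 0.6, "pose": 0.7})` gives 0.7, and per-modality validation accuracy could serve as the weights.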

    Instructional Feedback III: How Do Instructor Facework Tactics and Immediacy Cues Interact to Predict Student Perceptions of Being Mentored?

    Mentoring is a trusting, developmental supervisory relationship whose success largely depends on participants' interpersonal abilities. Feedback interventions with mentees commonly present interactional challenges to maintaining that relationship, yet are integral to any teaching–learning context. In this study we examined whether and how two key, trainable teacher communication abilities—face-threat mitigation (FTM) and nonverbal immediacy—predicted students' perceptions of being mentored by a teacher. Levels of actual FTM tactics and teacher nonverbal immediacy (TNI) cues were manipulated in a video-recorded feedback intervention situation and analyzed across a 2x2 design. A factorial MANCOVA analysis of perceived mentoring detected significant multivariate main effects for FTM tactics and for TNI cues, no significant two-way interaction effect between the two interpersonal variables, and differences in how TNI and FTM each contributed to predicting mentoring's four measured dimensions. Theoretical and pedagogical implications are discussed in light of facework, approach–avoidance, feedback intervention, and leader–member exchange theories

    Unobtrusive Assessment Of Student Engagement Levels In Online Classroom Environment Using Emotion Analysis

    Measuring student engagement has emerged as a significant factor in the process of learning and a good indicator of a student's knowledge retention capacity. As synchronous online classes have become more prevalent in recent years, gauging a student's attention level is critical to validating the progress of every student in an online classroom environment. This paper details a study on profiling student attentiveness across gradients of engagement level using multiple machine learning models. Results from the highest-accuracy model and the confidence scores obtained from the cloud-based computer vision platform Amazon Rekognition were then used to statistically validate any correlation between student attentiveness and emotions. This statistical analysis helps identify the emotions that are significant in gauging various engagement levels. The study identified that emotions such as calm, happy, surprise, and fear are critical in gauging a student's attention level. These findings help in the earlier detection of students with lower attention levels, consequently helping instructors focus their support and guidance on the students in need, leading to a better online learning environment
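Amazon Rekognition's face analysis returns a per-face list of emotions with confidence scores. A minimal sketch of the downstream step the abstract describes is shown below: mapping a Rekognition-style emotion response to a coarse attention label. The mapping itself is a hypothetical illustration, not the study's actual model.

```python
# Sketch: turn a Rekognition-style FaceDetail entry (dict with an
# "Emotions" list of {"Type", "Confidence"}) into a coarse attention
# label. The emotion-to-attention mapping below is an assumption for
# illustration; the paper derives its relationship statistically.

ATTENTION_MAP = {
    "CALM": "attentive",
    "HAPPY": "attentive",
    "SURPRISED": "partially attentive",
    "FEAR": "distracted",
}

def attention_from_emotions(face_detail, min_confidence=50.0):
    """Pick the most confident emotion above a threshold and map it."""
    emotions = [e for e in face_detail["Emotions"]
                if e["Confidence"] >= min_confidence]
    if not emotions:
        return "unknown"
    top = max(emotions, key=lambda e: e["Confidence"])
    return ATTENTION_MAP.get(top["Type"], "unknown")
```

In practice the `face_detail` dict would come from one entry of the `FaceDetails` list returned by Rekognition's `detect_faces` call with `Attributes=['ALL']`.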

    Real Time Detection and Analysis of Facial Features to Measure Student Engagement with Learning Objects

    This paper describes a software application that records student engagement in an on-screen task. The application records the on-screen activity in real time and simultaneously estimates the emotional state and head pose of the learner. The head pose is used to detect when the screen is being viewed, and the emotional state provides feedback on the form of engagement. The application works without recording images of the learner. On completion of the task, the percentage of time spent viewing the screen and statistics on emotional state (neutral, happy, sad) are produced, along with a graph depicting the learner's engagement and emotional state synchronised with the screen-captured video. It is envisaged that the tool will find application in learning activity and learning object design
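The abstract's "percentage of time spent viewing the screen" derived from head pose can be sketched as a simple angular threshold over pose samples. The angle limits below are assumed values for illustration; the paper does not state its thresholds.

```python
# Sketch (thresholds assumed): decide per-sample whether the head pose
# faces the screen, then report the on-screen fraction over a session.

def looking_at_screen(yaw, pitch, yaw_limit=25.0, pitch_limit=20.0):
    """True if the head pose (in degrees) plausibly faces the screen."""
    return abs(yaw) <= yaw_limit and abs(pitch) <= pitch_limit

def screen_view_fraction(poses, **limits):
    """poses: list of (yaw, pitch) samples; returns on-screen fraction."""
    if not poses:
        return 0.0
    on = sum(1 for yaw, pitch in poses if looking_at_screen(yaw, pitch, **limits))
    return on / len(poses)
```

With uniformly sampled frames, the returned fraction is exactly the percentage-of-time statistic described, divided by 100.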

    Nonverbal Communication and Its Role in Building Rapport: A Mixed Methods Study of K-12 Teachers

    This study set out to measure the impact of nonverbal communication (NVC) teacher behaviors on student perceptions of rapport and to determine which of these behaviors were conscious. Six teachers at three grade levels were participants in the study. The NV behaviors of teachers were quantified, and their effect on student perceptions of rapport was measured by student surveys. Teachers' awareness of their NVC skills was established through an analysis of interviews. The mixed-methods convergent parallel methodology contributed to a rich collection of data that was analyzed using multiple strategies. The literature provides extensive evidence that NVC behaviors contribute to student perceptions of rapport; the evidence is particularly robust at the college level (Andersen, 1980; Finn et al., 2009; McCroskey et al., 1995). This study resulted in multiple findings. The teachers in this study shared a wide variety of NV behaviors that contributed to rapport, although with varying levels of awareness. The level of awareness did not have an impact on student perceptions of rapport, consistent with Pentland and Heibeck's (2010) study. Finally, although the study makes a contribution to future research, teachers' NV behaviors did not yield significant results when correlated with perceptions of rapport

    Using Virtual Worlds to Identify Multidimensional Student Engagement in High School Foreign Language Learning Classrooms

    Virtual world environments have evolved from object-oriented, text-based online games to complex three-dimensional immersive social spaces where the lines between reality and computer-generated content begin to blur. Educators use virtual worlds to create engaging three-dimensional learning spaces for students, but the impact of virtual worlds on multidimensional student engagement, in comparison to the traditional face-to-face counterpart, has been uncertain. Research is needed to determine the impact of virtual worlds on student engagement in comparison to the traditional face-to-face environment. This study examined the effects of virtual world and face-to-face learning environments on high school foreign language students' emotional, cognitive, and behavioral engagement, as well as their combined engagement. A two-way MANOVA was used to determine the effect of the two learning environments on combined student engagement; separate 2 x 2 analyses of covariance were used for emotional and cognitive engagement; and a t-test was used for behavioral engagement. The study did not find evidence of a difference in overall, cognitive, emotional, or behavioral engagement between the two learning environments. The findings indicate the virtual world environment is similar to the traditional face-to-face environment in terms of student engagement. School administrators and teachers can benefit from this research when determining effective means of creating highly engaging learning environments for students.
Virtual worlds can be a medium for engaging learning opportunities for students in face-to-face and virtual schools. Additional research in this area is recommended to determine the impact of virtual worlds with different student populations and subject areas

    iFocus: A Framework for Non-intrusive Assessment of Student Attention Level in Classrooms

    The process of learning is determined not merely by what the instructor teaches, but also by how the student receives that information. An attentive student will naturally be more open to obtaining knowledge than a bored or frustrated student. In recent years, tools such as skin temperature measurements and body posture calculations have been developed for determining a student's affect, or emotional state of mind. Eye-gaze data is particularly noteworthy in that it can be collected non-intrusively, while also being relatively simple to set up and use. This paper details how data obtained from such an eye tracker can be used to predict a student's attention as a measure of affect over the course of a class. In this research, an accuracy of 77% was achieved using the Extreme Gradient Boosting machine learning technique. The outcome indicates that eye gaze can indeed be used as a basis for constructing a predictive model
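Gaze-based attention models are typically trained not on raw gaze points but on summary features such as fixation counts and durations. The sketch below shows a standard dispersion-threshold (I-DT-style) fixation detector producing two such features; the thresholds and feature names are assumptions for illustration, not details from the paper.

```python
# Sketch (I-DT-style, thresholds assumed): segment gaze samples into
# fixations wherever the x-range + y-range of a growing window stays
# under a dispersion threshold, then summarize into simple features of
# the kind a boosted-tree classifier could consume.

def extract_gaze_features(points, dispersion=30.0, min_len=3):
    """points: list of (x, y) gaze samples in screen units.
    Returns a dict of simple fixation-based summary features."""
    fixations = []
    start = 0
    for end in range(1, len(points) + 1):
        window = points[start:end]
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        # Dispersion of the current window; exceeding it closes a fixation.
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion:
            if end - 1 - start >= min_len:
                fixations.append(window[:-1])
            start = end - 1  # restart window at the sample that broke it
    if len(points) - start >= min_len:
        fixations.append(points[start:])  # trailing fixation
    return {
        "n_fixations": len(fixations),
        "mean_fixation_len": (sum(len(f) for f in fixations) / len(fixations))
                             if fixations else 0.0,
    }
```

Per-window feature dicts like these, labeled with attention annotations, would then form the training table for a gradient-boosted model.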

    Multimodal Visual Sensing: Automated Estimation of Engagement

    Many modern applications of artificial intelligence involve, to some extent, an understanding of human attention, activity, intention, and competence from multimodal visual data. Nonverbal behavioral cues detected using computer vision and machine learning methods carry valuable information for understanding human behaviors, including attention and engagement. The use of such automated methods in educational settings has tremendous potential. Beneficial uses include classroom analytics to measure teaching quality and the development of interventions to improve teaching based on those analytics, as well as presentation analysis to help students deliver their messages persuasively and effectively. This dissertation presents a general framework based on multimodal visual sensing to analyze engagement and related tasks from visual modalities. While the majority of the engagement literature in affective and social computing focuses on computer-based learning and educational games, we investigate automated engagement estimation in the classroom using different nonverbal behavioral cues and develop methods to extract attentional and emotional features.
Furthermore, we validate the efficiency of the proposed approaches on real-world data collected from videotaped classes at universities and secondary schools. In addition to learning activities, we perform behavior analysis on students giving short scientific presentations, using multimodal cues including face, body, and voice features. Besides engagement and presentation competence, we approach human behavior understanding from a broader perspective by studying joint attention in groups of people, teachers' perception using an egocentric camera view and mobile eye trackers, and automated anonymization of audiovisual data in classroom studies. Educational analytics present valuable opportunities to improve learning and teaching. The work in this dissertation proposes a computational framework for estimating student engagement and presentation competence, together with supportive computer vision problems