
    Affective games: a multimodal classification system

    Affective gaming is a relatively new field of research that exploits human emotions to influence gameplay for an enhanced player experience. Changes in a player's psychology are reflected in their behaviour and physiology, so recognising such variation is a core element of affective games. Complementary sources of affect offer more reliable recognition, especially in contexts where one modality is partial or unavailable. As multimodal recognition systems, affect-aware games are subject to the practical difficulties faced by traditional trained classifiers. In addition, game-specific challenges in data collection and performance arise while attempting to sustain an acceptable level of immersion. Most existing scenarios employ sensors that restrict freedom of movement, resulting in less realistic experiences. Recent advances now offer technology that allows players to communicate more freely and naturally with the game and, furthermore, to control it without input devices. However, the affective game industry is still in its infancy and needs to catch up with the life-like level of adaptation already provided by graphics and animation.
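The complementary-modality idea in this abstract can be sketched as a late-fusion step: each modality produces per-emotion confidences, and a weighted vote degrades gracefully when one modality drops out. The modality names, labels, and weights below are illustrative, not taken from the paper.

```python
def fuse_affect_scores(modality_scores, weights):
    """Weighted late fusion over available affect modalities.

    modality_scores: dict modality name -> {emotion label: confidence in [0, 1]},
                     or None if that modality is currently unavailable.
    weights: dict modality name -> reliability weight.
    Returns (best label, normalised fused score).
    """
    totals = {}
    weight_sum = 0.0
    for name, scores in modality_scores.items():
        if scores is None:          # modality dropped out (e.g. face occluded)
            continue
        w = weights.get(name, 1.0)
        weight_sum += w
        for lbl, conf in scores.items():
            totals[lbl] = totals.get(lbl, 0.0) + w * conf
    if not totals:
        return None, 0.0
    best = max(totals, key=totals.get)
    return best, totals[best] / weight_sum

scores = {
    "face": {"joy": 0.7, "frustration": 0.2},
    "physiology": {"joy": 0.4, "frustration": 0.5},
    "posture": None,  # sensor unavailable this frame
}
label, conf = fuse_affect_scores(scores, {"face": 2.0, "physiology": 1.0})
```

Because unavailable modalities are simply skipped and the result is renormalised over the weights actually used, the same code handles the full-sensor and partial-sensor cases the abstract describes.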

    A conceptual framework for an affective tutoring system using unobtrusive affect sensing for enhanced tutoring outcomes

    PhD Thesis. Affect plays a pivotal role in influencing a student's motivation and learning achievements. The ability of expert human tutors to achieve enhanced learning outcomes is widely attributed to their ability to sense the affect of their tutees and to continually adapt their tutoring strategies in response to the dynamically changing affect throughout the tutoring session. In this thesis, I explore the feasibility of building an Affective Tutoring System (ATS) which senses the student's affect on a moment-to-moment basis using unobtrusive sensors in the context of computer programming tutoring. The novel use of keystrokes and mouse clicks for affect sensing is proposed here as they are ubiquitous and unobtrusive. I first establish the viability of using keystrokes and contextual logs for affect sensing, initially at the level of an exercise session and then on a more granular basis of 30 seconds. Subsequently, I investigate the use of multiple sensing channels, e.g. facial expressions, keystrokes, mouse clicks, contextual logs and head postures, to enhance the availability and accuracy of sensing. The results indicated that it is viable to use keystrokes for affect sensing, and that combining multiple sensor modes enhances accuracy. The sensor modes most significant for affect sensing are head postures and facial expressions; nevertheless, keystrokes make up for the periods when the former are unavailable. With the affect sensing (of both frustration and disengagement) in place, I architected and designed the ATS and conducted an experimental study and a series of focus group discussions to evaluate it. The results showed that the ATS was rated positively by participants for usability and acceptance, and that it is effective in enhancing student learning. Nanyang Polytechnic.
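The 30-second windowing over keystroke logs described above can be sketched as a small feature extractor; the specific features (typing rate, mean inter-key pause) are illustrative examples, not the thesis's actual feature set.

```python
def keystroke_features(events, window_start, window_len=30.0):
    """Simple keystroke features for one fixed-length window.

    events: iterable of (timestamp_seconds, key) pairs.
    Returns keys-per-second and mean pause between consecutive keystrokes.
    """
    times = sorted(t for t, _key in events
                   if window_start <= t < window_start + window_len)
    rate = len(times) / window_len
    if len(times) < 2:
        return {"rate": rate, "mean_pause": None}
    pauses = [b - a for a, b in zip(times, times[1:])]
    return {"rate": rate, "mean_pause": sum(pauses) / len(pauses)}

# Toy log: four keystrokes fall inside the first 30-second window.
events = [(0.0, "d"), (0.4, "e"), (1.0, "f"), (29.0, "x"), (31.0, "y")]
feats = keystroke_features(events, window_start=0.0)
```

Feature vectors like this, computed per window, are what a trained classifier would consume to emit a per-window affect label such as frustration or disengagement.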

    Approaches, applications, and challenges in physiological emotion recognition — a tutorial overview

    An automatic emotion recognition system can serve as a fundamental framework for various applications in daily life, from monitoring emotional well-being to improving the quality of life through better emotion regulation. Understanding the process of emotion manifestation becomes crucial for building emotion recognition systems. An emotional experience results in changes not only in interpersonal behavior but also in physiological responses. Physiological signals are one of the most reliable means of recognizing emotions, since individuals cannot consciously manipulate them for a long duration. These signals can be captured by medical-grade wearable devices as well as commercial smartwatches and smart bands. With the shift in research direction from the laboratory to unrestricted daily life, commercial devices have been employed ubiquitously. However, this shift has introduced several challenges, such as low data quality, dependence on subjective self-reports, unconstrained movement-related changes, and artifacts in physiological signals. This tutorial provides an overview of practical aspects of emotion recognition, such as experiment design, properties of different physiological modalities, existing datasets, suitable machine learning algorithms for physiological data, and several applications. It aims to provide the necessary psychological and physiological background through various emotion theories and the physiological manifestation of emotions, thereby laying a foundation for emotion recognition. Finally, the tutorial discusses open research directions and possible solutions.
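A typical first step in the pipeline this tutorial surveys is extracting window-level features from a physiological signal. The sketch below computes standard time-domain heart-rate-variability measures (mean HR, SDNN, RMSSD) from RR intervals; the choice of features is a common convention, not one prescribed by the tutorial.

```python
import math

def hrv_features(rr):
    """Mean heart rate (bpm), SDNN and RMSSD (ms) from RR intervals in seconds."""
    mean_rr = sum(rr) / len(rr)
    hr = 60.0 / mean_rr                                   # beats per minute
    # SDNN: standard deviation of all RR intervals.
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr) / len(rr)) * 1000
    # RMSSD: root mean square of successive RR differences.
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs)) * 1000
    return {"hr_bpm": hr, "sdnn_ms": sdnn, "rmssd_ms": rmssd}

feats = hrv_features([0.80, 0.82, 0.78, 0.85, 0.80])
```

On consumer wearables, the same features would be computed over sliding windows and fed to a classifier; the low-data-quality and movement-artifact problems the tutorial raises would show up as corrupted RR intervals before this step.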

    Tailoring Interaction. Sensing Social Signals with Textiles.

    Nonverbal behaviour is an important part of conversation and can reveal much about the nature of an interaction. It includes phenomena ranging from large-scale posture shifts to small-scale nods. Capturing these often spontaneous phenomena requires unobtrusive sensing techniques that do not interfere with the interaction. We propose an underexploited modality for sensing nonverbal behaviours: textiles. As a material in close contact with the body, they provide ubiquitous, large surfaces that make them a suitable soft interface. Although the literature on nonverbal communication focuses on upper-body movements such as gestures, observations of multi-party, seated conversations suggest that sitting postures, leg and foot movements are also systematically related to patterns of social interaction. This thesis addresses the following questions: Can the textiles surrounding us measure social engagement? Can they tell who is speaking, and who, if anyone, is listening? Furthermore, how should wearable textile sensing systems be designed, and what behavioural signals could textiles reveal? To address these questions, we have designed and manufactured bespoke chairs and trousers with integrated textile pressure sensors, which are introduced here. The designs are evaluated in three user studies that produce multi-modal datasets for the exploration of fine-grained interactional signals. Two approaches to using these bespoke textile sensors are explored. First, hand-crafted sensor patches in chair covers serve to distinguish speakers and listeners. Second, a pressure-sensitive matrix in custom-made smart trousers is developed to detect static sitting postures, dynamic bodily movement, as well as basic conversational states.
    Statistical analyses, machine learning approaches, and ethnographic methods show that by monitoring patterns of pressure change alone it is possible not only to classify postures with high accuracy, but also to identify a wide range of behaviours reliably in individuals and groups. These findings establish textiles as a novel, wearable sensing system for applications in the social sciences, and contribute towards a better understanding of nonverbal communication, especially the significance of posture shifts when seated. If chairs know who is speaking, if our trousers can capture our social engagement, what role can smart textiles have in the future of human interaction? How can we build new ways to map social ecologies and tailor interactions?
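To illustrate the representation involved, a pressure-sensitive matrix yields a 2-D frame per time step, and a posture can be assigned by comparing a frame against per-posture mean maps. This nearest-centroid sketch only demonstrates the data shape; the thesis itself uses bespoke sensors and richer statistical and learning methods.

```python
import numpy as np

def nearest_posture(frame, centroids):
    """Classify a pressure frame by the nearest class centroid.

    frame:     2-D array of pressure readings from the textile matrix.
    centroids: dict posture label -> mean pressure map of the same shape.
    """
    v = frame.ravel().astype(float)
    return min(centroids,
               key=lambda lbl: np.linalg.norm(v - centroids[lbl].ravel()))

# Toy 2x2 "sensor matrices": uniform pressure vs. weight shifted left.
centroids = {
    "upright":   np.array([[5.0, 5.0], [5.0, 5.0]]),
    "lean_left": np.array([[9.0, 1.0], [9.0, 1.0]]),
}
label = nearest_posture(np.array([[8.0, 2.0], [8.5, 1.5]]), centroids)
```

Sequences of such per-frame labels are what downstream analysis of posture shifts and conversational states would operate on.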

    Inside Out: Detecting Learners' Confusion to Improve Interactive Digital Learning Environments

    Confusion is an emotion that is likely to occur while learning complex information. This emotion can be beneficial to learners in that it can foster engagement, leading to deeper understanding. However, if learners fail to resolve confusion, its effect can be detrimental to learning. Such detrimental learning experiences are particularly concerning within digital learning environments (DLEs), where a teacher is not physically present to monitor learner engagement and adapt the learning experience accordingly. However, with better information about a learner's emotion and behavior, it is possible to improve the design of interactive DLEs (IDLEs) not only in promoting productive confusion but also in preventing overwhelming confusion. This article reviews different methodological approaches for detecting confusion, such as self-report, behavioral, and physiological measures, and discusses their implications within the theoretical framework of a zone of optimal confusion. The specificities of several methodologies and their potential application in IDLEs are discussed.

    Revealing the Hidden Structure of Affective States During Emotion Regulation in Synchronous Online Collaborative Learning

    This study aims to explore the use of advanced technologies such as artificial intelligence (AI) to reveal learners' emotion regulation. In particular, this study attempts to discover the hidden structure of affective states associated with facial expression during challenges, interactions, and strategies for emotion regulation in the context of synchronous online collaborative learning. The participants were 18 higher education students (N=18) who worked collaboratively in groups. The Hidden Markov Model (HMM) results indicated interesting transition patterns among latent states of emotion and provided insights into how learners engage in the emotion regulation process. This study demonstrates a new opportunity for theoretical and methodological advancement in the exploration of AI for researching socially shared regulation in collaborative learning.
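The HMM machinery this study relies on can be sketched with the forward algorithm, which scores an observed sequence of discrete affect observations under given transition and emission matrices. The two states and all parameter values below are invented for illustration, not the study's fitted model.

```python
import numpy as np

def forward_likelihood(obs, pi, A, B):
    """P(obs) under a discrete-emission HMM via the forward algorithm.

    obs: sequence of observation indices
    pi:  (n_states,) initial state distribution
    A:   (n_states, n_states) transitions, A[i, j] = P(state j | state i)
    B:   (n_states, n_obs) emissions, B[i, k] = P(observation k | state i)
    """
    alpha = pi * B[:, obs[0]]            # joint prob. of first obs and state
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # propagate, then weight by emission
    return alpha.sum()

pi = np.array([0.6, 0.4])                      # e.g. engaged / confused
A = np.array([[0.8, 0.2], [0.3, 0.7]])         # states tend to persist
B = np.array([[0.9, 0.1], [0.2, 0.8]])         # P(facial observation | state)
p = forward_likelihood([0, 1, 1], pi, A, B)
```

Fitting such a model to coded facial-expression sequences (e.g. with Baum-Welch) is what yields the transition patterns among latent emotional states reported in the study.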

    Associating Facial Expressions and Upper-Body Gestures with Learning Tasks for Enhancing Intelligent Tutoring Systems

    Learning involves a substantial amount of cognitive, social and emotional states. Therefore, recognizing and understanding these states in the context of learning is key to designing informed interventions and addressing the needs of the individual student to provide personalized education. In this paper, we explore the automatic detection of learners' nonverbal behaviors involving hand-over-face gestures, head and eye movements, and emotions via facial expressions during learning. The proposed computer vision-based behavior monitoring method uses a low-cost webcam and can easily be integrated with modern tutoring technologies. We investigate these behaviors in depth over time in a 40-minute classroom session involving reading and problem-solving exercises. The exercises in the sessions are divided into three categories: an easy, medium and difficult topic within the context of undergraduate computer science. We found that there is a significant increase in head and eye movements as time progresses, as well as with increasing difficulty level. We demonstrated that there is a considerable occurrence of hand-over-face gestures (on average 21.35%) during the 40-minute session, a behavior that is unexplored in the education domain. We propose a novel deep learning approach for automatic detection of hand-over-face gestures in images, with a classification accuracy of 86.87%. There is a prominent increase in hand-over-face gestures when the difficulty level of the given exercise increases. The hand-over-face gestures occur more frequently during problem-solving (easy 23.79%, medium 19.84% and difficult 30.46%) exercises in comparison to reading (easy 16.20%, medium 20.06% and difficult 20.18%).
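The per-category occurrence percentages reported above come from aggregating frame-level detector outputs over a session. A minimal sketch of that aggregation step, with invented frame data rather than the paper's detections:

```python
def occurrence_pct(detections):
    """Percentage of video frames in which the gesture detector fired.

    detections: iterable of booleans, one per frame.
    """
    detections = list(detections)
    return 100.0 * sum(detections) / len(detections)

# Toy per-frame outputs of a hand-over-face detector for two exercise types.
frames = {
    "easy":      [True] * 24 + [False] * 76,
    "difficult": [True] * 30 + [False] * 70,
}
rates = {category: occurrence_pct(flags) for category, flags in frames.items()}
```

In the paper the per-frame booleans would come from the proposed deep-learning gesture detector; here they are hard-coded purely to show the statistic.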

    Automated posture analysis for detecting learner's affective state

    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2002. Includes bibliographical references (leaves 87-94).
    As a means of improving the ability of the computer to respond in a way that facilitates a productive and enjoyable learning experience, this thesis proposes a system for the automated recognition and dynamical analysis of naturally occurring postures when a child is working in a learning-computer situation. Specifically, an experiment was conducted with 10 children between 8 and 11 years old to elicit naturally occurring behaviors during a learning-computer task. Two studies were carried out: the first reveals that 9 naturally occurring postures are frequently repeated during the children's experiment; the second shows that three teachers could reliably recognize 5 affective states (high interest, interest, low interest, taking a break and boredom). Hence, a static posture recognition system that distinguishes the set of 9 postures was built. This system senses the postures using two matrices of pressure sensors mounted on the seat and back of a chair. The matrices capture the body pressure distribution of a person sitting on the chair. Using Gaussian Mixture and feed-forward Neural Network algorithms, the system classifies the postures in real time. It achieves an overall accuracy of 87.6% when tested with children's postures that were not included in the training set. The children's posture sequences were also dynamically analyzed using a Hidden Markov Model to represent each of the 5 affective states found by the teachers. As a result, only the affective states of high interest, low interest, and taking a break were recognized, with an overall accuracy of 87% when tested with new posture sequences from children included in the training set. In contrast, when the system was tested with posture sequences from two subjects not included in the training set, it had an overall accuracy of 76%.
    by Selene Atenea Mota Toledo. S.M.
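The Gaussian-mixture classification idea used for static postures can be sketched as scoring a feature vector under per-class mixtures and choosing the highest log-likelihood class. All parameters below are toy values, not those learned in the thesis.

```python
import numpy as np

def gmm_logpdf(x, weights, means, vars_):
    """Log-density of a diagonal-covariance Gaussian mixture at x."""
    comps = []
    for w, mu, var in zip(weights, means, vars_):
        ll = -0.5 * np.sum((x - mu) ** 2 / var + np.log(2 * np.pi * var))
        comps.append(np.log(w) + ll)
    m = max(comps)                               # log-sum-exp for stability
    return m + np.log(sum(np.exp(c - m) for c in comps))

def classify(x, class_gmms):
    """Pick the posture class whose mixture gives x the highest likelihood."""
    return max(class_gmms, key=lambda c: gmm_logpdf(x, *class_gmms[c]))

# One single-component "mixture" per toy posture class over 2-D features.
class_gmms = {
    "leaning_forward": ([1.0], [np.array([2.0, 8.0])], [np.array([1.0, 1.0])]),
    "sitting_upright": ([1.0], [np.array([5.0, 5.0])], [np.array([1.0, 1.0])]),
}
posture = classify(np.array([2.5, 7.5]), class_gmms)
```

In the thesis pipeline, the per-frame posture labels produced by a stage like this feed the per-affective-state Hidden Markov Models that analyze posture sequences over time.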
