1,109 research outputs found
Multimodal Interaction in a Haptic Environment
In this paper we investigate the introduction of haptics into a multimodal tutoring environment. In this environment a haptic device is used to control a virtual piece of sterile cotton and a virtual injection needle. Speech input and output are provided to interact with a virtual tutor, available as a talking head, and a virtual patient. We introduce the haptic tasks and explain how different agents in the multi-agent system are made responsible for them. Notes are provided on the way we introduce an affective model into the tutor agent
On the Development of Adaptive and User-Centred Interactive Multimodal Interfaces
Multimodal systems have attracted increased attention in recent years, which has made possible important improvements in the technologies for the recognition, processing, and generation of multimodal information. However, there are still many issues related to multimodality which are not clear, for example, the principles that make it possible to resemble human-human multimodal communication. This chapter focuses on some of the most important challenges that researchers have recently envisioned for future multimodal interfaces. It also describes current efforts to develop intelligent, adaptive, proactive, portable and affective multimodal interfaces
Affective e-learning approaches, technology and implementation model: a systematic review
A systematic literature study including articles from 2016 to 2022 was conducted to evaluate the various approaches, technologies, and implementation models involved in measuring student engagement during learning. The review’s objective was to compile and analyze all studies that investigated how instructors can gauge students’ mental states while teaching and to assess the most effective teaching methods. Additionally, it aims to extract and assess expanded methodologies from the chosen research publications to offer suggestions and answers to researchers and practitioners. Planning, carrying out the analysis, and publishing the results have all received significant attention in the research approach. The study’s findings indicate that more needs to be done to evaluate student participation objectively and to follow students’ development for improved academic performance. Among the alternatives, physiological approaches should be given more support, deep learning implementation models and contactless technologies deserve attention from more researchers, and recommender systems should be integrated into e-learning systems. Articles on other approaches, technologies, and methodologies, on the other hand, lacked authenticity in conveying student feelings
A virtual diary companion
Chatbots and embodied conversational agents show turn-based conversation behaviour. In current research we almost always assume that each utterance of a human conversational partner should be followed by an intelligent and/or empathetic reaction of the chatbot or embodied agent. They are assumed to be alert, trying to please the user. There are other applications which have not yet received much attention and which require a more patient or relaxed attitude, waiting for the right moment to provide feedback to the human partner. Being able and willing to listen is one of the conditions for being successful. In this paper we offer some observations on listening behaviour research and introduce one of our applications, the virtual diary companion
Affective Environment for Java Programming Using Facial and EEG Recognition
We have developed an affective and intelligent learning environment that helps students to improve their Java programming skills. This environment evaluates cognitive and affective aspects of students in order to define the level of difficulty of the exercises that is most suitable for them in their current condition. The cognitive aspects are: the number of mistakes, the difficulty level of the current exercise and the time spent on the solution. The affective aspects are: the emotion acquired from a facial expression and the valence acquired from electroencephalogram signals. This environment also uses a neural network for facial recognition of basic emotions, a support vector machine to define the valence of emotion and a fuzzy inference engine to evaluate the cognitive and affective aspects
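The combination of cognitive and affective scores into a difficulty decision can be illustrated with a minimal rule-based sketch. This is not the authors' fuzzy inference engine: the function name, weights, and thresholds are all illustrative assumptions, standing in for the fuzzy rules that would map mistakes, solution time, and facial/EEG valence to the next exercise level.

```python
def next_difficulty(current_level, mistakes, time_spent_s, time_limit_s, valence):
    """Illustrative rule-based fusion of cognitive and affective signals.

    valence: affective score in [-1, 1] (negative = frustrated, positive =
    engaged), as might come from facial-expression and EEG classifiers.
    All weights and thresholds are hypothetical.
    """
    # Cognitive score in [0, 1]: fewer mistakes and faster solutions score higher
    mistake_score = max(0.0, 1.0 - 0.2 * mistakes)
    speed_score = max(0.0, 1.0 - time_spent_s / time_limit_s)
    cognitive = 0.6 * mistake_score + 0.4 * speed_score

    # Rescale affective valence from [-1, 1] to [0, 1]
    affective = (valence + 1.0) / 2.0

    combined = 0.7 * cognitive + 0.3 * affective
    if combined > 0.7:
        return current_level + 1          # doing well and engaged: harder exercise
    if combined < 0.3:
        return max(1, current_level - 1)  # struggling or frustrated: easier exercise
    return current_level                  # otherwise keep the current level
```

A real fuzzy engine would replace the fixed thresholds with membership functions and rule aggregation, but the input/output contract would look much the same.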
A conceptual framework for an affective tutoring system using unobtrusive affect sensing for enhanced tutoring outcomes
PhD Thesis. Affect plays a pivotal role in influencing the student’s motivation and learning achievements. The ability of expert human tutors to achieve enhanced learning outcomes is widely attributed to their ability to sense the affect of their tutees and to continually adapt their tutoring strategies in response to the dynamically changing affect throughout the tutoring session. In this thesis, I explore the feasibility of building an Affective Tutoring System (ATS) which senses the student’s affect on a moment-to-moment basis with the use of unobtrusive sensors in the context of computer programming tutoring. The novel use of keystrokes and mouse clicks for affect sensing is proposed here as they are ubiquitous and unobtrusive. I first establish the viability of using keystrokes and contextual logs for affect sensing, initially on a per-exercise-session level and then on a more granular basis of 30 seconds. Subsequently, I move on to investigate the use of multiple sensing channels, e.g. facial expressions, keystrokes, mouse clicks, contextual logs and head postures, to enhance the availability and accuracy of sensing. The results indicated that it is viable to use keystrokes for affect sensing. In addition, the combination of multiple sensor modes enhances the accuracy of affect sensing. From the results, the sensor modes that are most significant for affect sensing are the head-posture and facial modes. Nevertheless, keystrokes make up for the periods of unavailability of the former. With the affect sensing (both sensing of frustration and disengagement) in place, I moved on to architect and design the ATS and conducted an experimental study and a series of focus group discussions to evaluate it. The results showed that the ATS was rated positively by the participants for usability and acceptance. The ATS is also effective in enhancing the learning of the students
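The unobtrusive keystroke sensing described above rests on extracting timing features over short windows (the thesis mentions a 30-second granularity). The following sketch shows the kind of per-window keystroke-dynamics features a classifier could map to affective states such as frustration; the function name and feature set are illustrative assumptions, not the thesis's actual pipeline.

```python
from statistics import mean, stdev

def keystroke_features(key_times, window_s=30.0):
    """Compute simple keystroke-dynamics features over fixed-length windows.

    key_times: sorted list of key-press timestamps in seconds.
    Returns one feature dict per window with at least two keystrokes:
    typing rate plus inter-key latency statistics.
    """
    if not key_times:
        return []
    features = []
    t = key_times[0]
    end = key_times[-1]
    while t <= end:
        window = [k for k in key_times if t <= k < t + window_s]
        if len(window) >= 2:
            gaps = [b - a for a, b in zip(window, window[1:])]
            features.append({
                "keys_per_s": len(window) / window_s,
                "mean_latency": mean(gaps),
                "latency_sd": stdev(gaps) if len(gaps) > 1 else 0.0,
            })
        t += window_s
    return features
```

In a multi-sensor setup, such windows would be time-aligned with the facial and head-posture channels so that keystroke features can fill gaps when the camera-based modes are unavailable.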
Evaluating the Emotional State of a User Using a Webcam
In online learning it is more difficult for teachers to see how individual students behave. Students’ emotions, such as self-esteem, motivation and commitment, which are believed to be determinant in students’ performance, cannot be ignored, as affective states and learning styles are known to greatly influence students’ learning. The ability of the computer to evaluate the emotional state of the user is attracting growing attention. By evaluating the emotional state, there is an attempt to overcome the barrier between humans and non-emotional machines. Real-time emotion recognition in e-learning using webcams has been an active research area over the last decade. Improving learning through webcams and microphones offers relevant feedback based upon the learner’s facial expressions and verbalizations. The majority of current software does not work in real time: it scans the face and progressively evaluates its features. The software designed here uses neural networks in real time, which enables it to be applied in various fields of our lives and thus actively influence their quality. The face emotion recognition software was validated against annotations produced by various experts, and these expert findings were contrasted with the software results. The overall accuracy of our software, comparing the requested emotions with the recognized emotions, is 78%. Online evaluation of emotions is an appropriate technology for enhancing the quality and efficacy of e-learning by including the learner’s emotional states
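The validation step reported above, contrasting the software's recognized emotions with expert annotations to obtain an overall accuracy, can be sketched as follows. This is an illustrative evaluation helper, not the paper's actual tooling; the function name and label format are assumptions.

```python
from collections import Counter

def evaluate_recognition(predicted, expert):
    """Compare recognized emotions against expert annotations.

    predicted, expert: equal-length lists of emotion labels, one per frame
    or trial. Returns overall accuracy and a confusion Counter keyed by
    (expert_label, predicted_label) for the disagreements.
    """
    if not predicted or len(predicted) != len(expert):
        raise ValueError("label lists must be non-empty and of equal length")
    hits = sum(p == e for p, e in zip(predicted, expert))
    confusion = Counter(
        (e, p) for p, e in zip(predicted, expert) if p != e
    )
    return hits / len(predicted), confusion
```

The confusion counts show which emotions the recognizer mixes up most, which is usually more informative than the single overall-accuracy figure.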