
    Multimodal Interaction in a Haptic Environment

    In this paper we investigate the introduction of haptics in a multimodal tutoring environment. In this environment a haptic device is used to control a virtual piece of sterile cotton and a virtual injection needle. Speech input and output are provided to interact with a virtual tutor, available as a talking head, and with a virtual patient. We introduce the haptic tasks and show how different agents in the multi-agent system are made responsible for them. Notes are provided on the way we introduce an affective model into the tutor agent.
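
    The abstract describes a multi-agent architecture in which individual agents are made responsible for specific haptic tasks. As a rough illustration only, the following Python sketch routes haptic-device events to responsible agents; the agent and task names are hypothetical and not taken from the paper.

        # Minimal sketch of haptic task assignment in a multi-agent setup.
        # Agent and task names (CottonAgent, NeedleAgent) are illustrative
        # assumptions, not the paper's actual implementation.

        class Agent:
            def __init__(self, name):
                self.name = name

            def handle(self, event):
                print(f"{self.name} handles {event}")

        class HapticDispatcher:
            """Routes haptic-device events to the agent responsible for them."""
            def __init__(self):
                self.routes = {}

            def assign(self, task, agent):
                self.routes[task] = agent

            def dispatch(self, task, event):
                self.routes[task].handle(event)

        dispatcher = HapticDispatcher()
        dispatcher.assign("swab", Agent("CottonAgent"))    # sterile cotton control
        dispatcher.assign("inject", Agent("NeedleAgent"))  # injection needle control
        dispatcher.dispatch("swab", {"position": (0.1, 0.4, 0.2), "force": 0.3})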

    A virtual diary companion

    Chatbots and embodied conversational agents show turn-based conversation behaviour. In current research we almost always assume that each utterance of a human conversational partner should be followed by an intelligent and/or empathetic reaction of the chatbot or embodied agent. They are assumed to be alert, trying to please the user. Other applications, which have not yet received much attention, require a more patient or relaxed attitude: waiting for the right moment to provide feedback to the human partner. Being able and willing to listen is one of the conditions for being successful. In this paper we offer some observations on listening-behaviour research and introduce one of our applications, the virtual diary companion.
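
    To make the "patient listener" idea concrete, here is a minimal Python sketch of a companion that gives only backchannel feedback while the user speaks and responds fully only after a sustained pause. The pause threshold and the responses are assumptions for illustration, not taken from the paper.

        import time

        PAUSE_THRESHOLD = 4.0  # seconds of silence before responding (assumed value)

        class DiaryCompanion:
            def __init__(self):
                self.last_utterance_time = None

            def on_user_utterance(self, text, now=None):
                # Record the utterance; give only minimal backchannel feedback.
                self.last_utterance_time = now or time.monotonic()
                return "mm-hmm"  # backchannel, not a full conversational turn

            def maybe_respond(self, now=None):
                # Respond only once the user has been silent long enough.
                now = now or time.monotonic()
                if self.last_utterance_time is None:
                    return None
                if now - self.last_utterance_time >= PAUSE_THRESHOLD:
                    return "Would you like to tell me more about how that felt?"
                return None  # keep listening

        companion = DiaryCompanion()
        companion.on_user_utterance("Today was a difficult day at work.")
        print(companion.maybe_respond())  # None: still within the pause threshold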

    No Grice: Computers that Lie, Deceive and Conceal

    In the future our daily life interactions with other people, with computers, robots and smart environments will be recorded and interpreted by computers or by embedded intelligence in environments, furniture, robots, displays, and wearables. These sensors record our activities, our behavior, and our interactions. Fusion of such information and reasoning about it make it possible, using computational models of human behavior and activities, to provide context- and person-aware interpretations of human behavior and activities, including determination of attitudes, moods, and emotions. Sensors include cameras, microphones, eye trackers, position and proximity sensors, tactile or smell sensors, et cetera. Sensors can be embedded in an environment, but they can also move around, for example if they are part of a mobile social robot, of devices we carry around, or of devices embedded in our clothes or body.

    Our daily life behavior and daily life interactions are recorded and interpreted. How can we use such environments, and how can such environments use us? Do we always want to cooperate with these environments; do these environments always want to cooperate with us? In this paper we argue that there are many reasons why users, or rather human partners, of these environments do want to keep information about their intentions and their emotions hidden from these smart environments. On the other hand, their artificial interaction partner may have similar reasons not to give away all information it has, or to treat its human partner as an opponent rather than as someone who has to be supported by smart technology.

    This will be elaborated in this paper. We will survey examples of human-computer interactions where there is not necessarily a goal to be explicit about intentions and feelings. In subsequent sections we will look at (1) the computer as a conversational partner, (2) the computer as a butler or diary companion, (3) the computer as a teacher or a trainer acting in a virtual training environment (a serious game), (4) sports applications (that are not necessarily different from serious game or education environments), and (5) games and entertainment applications.
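
    The fusion step the abstract mentions can be illustrated with a small sketch: several normalized sensor channels are combined into a single affect estimate. The channel names and fixed weights are assumptions for illustration; real systems would use trained models rather than hand-set weights.

        def fuse_arousal(channels, weights):
            """Weighted fusion of normalized sensor scores in [0, 1]."""
            total = sum(weights[c] for c in channels)
            return sum(channels[c] * weights[c] for c in channels) / total

        readings = {
            "camera_facial_tension": 0.7,   # from facial-expression analysis
            "microphone_pitch": 0.5,        # from voice prosody
            "proximity_restlessness": 0.8,  # from position/proximity sensors
        }
        weights = {"camera_facial_tension": 0.5,
                   "microphone_pitch": 0.3,
                   "proximity_restlessness": 0.2}

        print(f"estimated arousal: {fuse_arousal(readings, weights):.2f}")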

    Human and Virtual Agents Interacting in the Virtuality Continuum

    A study of using SmartBox to embed emotion awareness through stimulation into e-learning environments

    Emotions strongly influence human behavior in individual and social situations and must be seriously considered in any human activity, such as e-Learning. Indeed, embedding emotional awareness features into virtual learning environments could offer a more authentic and challenging e-Learning experience, whether individual or collaborative. However, the lack of empirical results, together with the complexity attributed to the management of human emotions and affective data by computers, seriously limits advances in e-Learning, as it impedes the virtualization of many real-world learning situations in which emotions play a significant role. In this paper, we investigate the use of the SmartBox device for emotion measurement of distance learners during their study, as well as the development of affective strategies based on the SmartBox's stimulation capabilities. The aim is to collect emotion data from different sources in order to provide the most appropriate affective responses that positively influence distance learners' study and results and ultimately enhance the e-Learning process.
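
    The sense-and-respond loop the abstract outlines can be sketched as follows: emotion data from several sources are aggregated, and a stimulation action is chosen when negative affect crosses a threshold. The source names, threshold, and actuation string are hypothetical; the abstract does not describe the SmartBox interface itself.

        NEGATIVE_THRESHOLD = 0.6  # assumed cutoff for intervening

        def aggregate_emotion(samples):
            """Average negative-affect scores from several sources (0 = calm, 1 = stressed)."""
            return sum(samples.values()) / len(samples)

        def choose_stimulation(score):
            # Map the aggregated score to a stimulation action (hypothetical).
            if score >= NEGATIVE_THRESHOLD:
                return "dim_lights_and_play_calming_audio"
            return None  # no intervention needed

        samples = {"questionnaire": 0.70, "physiological": 0.65, "activity_log": 0.50}
        score = aggregate_emotion(samples)
        print(f"score={score:.2f}, action={choose_stimulation(score)}")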

    Virtual Reflexes

    Virtual Reality is used successfully to treat people for common phobias. A new challenge is to develop Virtual Reality Exposure Training for social skills. Virtual actors in such systems have to show appropriate social behavior, including emotions, gaze, and keeping distance. The behavior must be realistic and real-time. Current approaches consist of four steps: 1) trainee social signal detection, 2) cognitive-affective interpretation, 3) determination of the appropriate bodily responses, and 4) actuation. The "cognitive" detour of such approaches does not match the directness of human bodily reflexes and causes unrealistic responses and delays. Instead, we propose virtual reflexes as concurrent sensory-motor processes to control virtual actors. Here we present a virtual-reflexes architecture, explain how emotion and cognitive modulation are embedded in it, detail its workings, and give an example description of an aggression training application.
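
    The contrast between the four-step pipeline and concurrent reflexes can be sketched in a few lines of Python: the reflex maps a detected social signal directly to a bodily response, while the slower cognitive-affective appraisal runs in parallel and arrives later. All names and timings are illustrative assumptions, not the paper's implementation.

        import threading, queue, time

        signals = queue.Queue()

        def actuate(action):
            print(f"actuate: {action}")

        def reflex_loop():
            # Direct sensory-motor mapping: no interpretation step in between.
            while True:
                signal = signals.get()
                if signal is None:
                    break
                if signal == "trainee_steps_too_close":
                    actuate("step_back")  # immediate bodily response

        def cognitive_appraisal(signal):
            time.sleep(0.5)  # slower cognitive-affective appraisal (simulated)
            print("appraisal: trainee is aggressive; raise tension for later behavior")

        t = threading.Thread(target=reflex_loop)
        t.start()
        signals.put("trainee_steps_too_close")          # reflex fires at once
        cognitive_appraisal("trainee_steps_too_close")  # modulation arrives later
        signals.put(None)
        t.join()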

    Analyzing how emotion awareness influences students' motivation, engagement, self-regulation and learning outcome

    Considering social and emotional competence in learning, emotion awareness aims to detect the emotions that students show during their learning interactions and to make these emotions explicit to them. Being aware of their emotions, students become more conscious of their situation, which may prompt them to change their behavior. The main goal of this work is to analyze the effects of emotion awareness, supported by specific teaching strategies, on students' motivation, engagement, self-regulation and learning outcome in long-term blended collaborative learning practices. A complementary goal involves an initial study that explores the way emotion awareness affects teachers' attitude and feedback, as well as the competencies teachers need in order to achieve a positive change in students' affective and cognitive state. To this end, a quasi-experimental study was designed with high school students. The results of this study show that when students are aware of their emotions and are guided by specific teaching strategies, their learning performance improves in relation to their motivation, engagement and self-regulation. Likewise, when teachers are conscious of students' emotional state, their attitude and feedback become more effective and timely.