8 research outputs found

    Technology-Supported Storytelling (TSST) Strategy in Virtual World for Multicultural Education

    Learning culture through stories is an effective approach to multicultural education, since stories are one of the most powerful and personal ways that we learn about the world. Storytelling, the process of telling stories, is a form of communication and a universal expression of culture. With the development of technology, storytelling has emerged in diverse forms. This study explores storytelling in virtual worlds for multicultural education and devises a Technology-Supported Storytelling (TSST) strategy by examining the characteristics of virtual worlds that can be incorporated into storytelling; it then uses this strategy to teach Korean culture to students from different cultural backgrounds. With this innovative TSST strategy in a virtual world, the study aims to provide a practical guide for multicultural teaching in the digital era.

    Emotional Facial Expression Based On Action Units and Facial Muscle

    Virtual humans play vital roles in virtual reality and games. Enriching virtual humans through their expressions is one of the aspects most studied and improved by researchers. This study aims to demonstrate the combination of facial action units from the Facial Action Coding System (FACS) with a facial-muscle model to produce realistic facial expressions. The experiments succeeded in producing particular expressions, such as anger, happiness, and sadness, that convey the emotional state of the virtual human. This achievement is believed to bring full mental immersion to the virtual human and the audience. Future work will generate complex virtual-human expressions that combine physical factors such as wrinkles and fluid dynamics for tears or sweating.
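The idea of combining FACS action units into a named expression can be sketched as weighted activations that a muscle-based renderer would scale before deforming the mesh. This is a minimal illustration, not the paper's model: the AU-to-emotion table below is a simplification drawn from common FACS descriptions, and all names and weights are assumptions.

```python
# Each expression is a set of FACS action units with activation weights (0..1).
# The table is an illustrative simplification, not the paper's muscle model.
EXPRESSIONS = {
    "happy": {"AU6": 0.8, "AU12": 1.0},              # cheek raiser, lip-corner puller
    "sad":   {"AU1": 0.7, "AU4": 0.5, "AU15": 0.8},  # inner-brow raiser, brow lowerer, lip-corner depressor
    "anger": {"AU4": 1.0, "AU5": 0.6, "AU23": 0.7},  # brow lowerer, upper-lid raiser, lip tightener
}

def blend(expression: str, intensity: float) -> dict:
    """Scale an expression's action-unit weights by an overall intensity,
    as a muscle-based renderer might before deforming the face mesh."""
    return {au: round(w * intensity, 3) for au, w in EXPRESSIONS[expression].items()}
```

A half-intensity smile, for example, would halve each of the "happy" weights before they drive the corresponding muscle groups.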

    Intelligent Avatar on E-Learning using Facial Expression and Haptic

    The process of introducing emotion can be improved through a three-dimensional (3D) tutoring system. The unsolved problem is how to provide a realistic tutor (avatar) in a virtual environment. This paper proposes an approach to teaching children to understand emotional sensation through facial expression and the sense of touch (haptics). The algorithm calculates a constant factor (f) from the maximum RGB value and the magnitude force, and then associates each magnitude-force range with a particular colour. The integration process starts by rendering the facial expression and then adjusts the vibration power to the emotion value. In the experiment, around 71% of students agreed with the classification of magnitude force into emotion representations. Respondents commented that a high magnitude force creates a sensation similar to anger, while a low magnitude force feels more relaxing. Respondents also found the combination of haptics and facial expression very interactive and realistic.
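The force-to-colour mapping described above can be sketched as follows. The constant factor f relates the maximum RGB channel value to the maximum magnitude force; the maximum force value and the emotion thresholds here are illustrative assumptions, not the paper's numbers.

```python
# Hedged sketch of the force-to-colour mapping outlined in the abstract.
MAX_RGB = 255.0    # maximum RGB channel value
MAX_FORCE = 3.3    # assumed device-specific maximum magnitude force

def force_to_channel(force: float) -> int:
    """Map a haptic force magnitude to a colour-channel intensity
    using the constant factor f = MAX_RGB / MAX_FORCE."""
    f = MAX_RGB / MAX_FORCE
    return min(int(force * f), 255)

def classify_force(force: float) -> str:
    """Associate a magnitude-force range with an emotion label
    (thresholds are illustrative, not the paper's values)."""
    if force > 0.66 * MAX_FORCE:
        return "anger"     # high force was reported as anger-like
    if force < 0.33 * MAX_FORCE:
        return "relaxed"   # low force was reported as relaxing
    return "neutral"
```

A renderer would then pair the chosen colour with the matching facial expression and set the vibration power to the same emotion value.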

    Framework of controlling 3d virtual human emotional walking using BCI

    A Brain-Computer Interface (BCI) is a device that can read and acquire brain activity. The human body is controlled by brain signals, which act as its main controller; the brain translates human emotions and thoughts into brain signals expressed as mood. These signals are the key component of the electroencephalogram (EEG). Through signal processing, features representing human mood (behaviour) can be extracted, with emotion as the major feature. This paper proposes a new framework for recognizing inner human emotions from EEG signals using a BCI device as the controller. The framework proceeds in five steps: reading the brain signal, classifying it to obtain the emotion, mapping the emotion, synchronizing the animation of the 3D virtual human, and testing and evaluating the work. To the best of our knowledge, there is no existing framework for controlling a 3D virtual human in this way. Implementing the framework will enhance the control of 3D virtual humans' emotional walking in games and bring more realism. Commercial games and augmented-reality systems are possible beneficiaries of this technique. © 2015 Penerbit UTM Press. All rights reserved.
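The five steps above can be sketched as a pipeline skeleton. Every function name, the toy classification rule, and the animation parameters are assumptions for illustration; a real system would read from BCI hardware and drive a 3D engine.

```python
# Illustrative skeleton of the five-step framework; all names are assumed.

def read_signal() -> list[float]:
    """Step 1: acquire raw EEG samples from the BCI device (stubbed)."""
    return [0.2, 0.8, 0.5]

def classify_emotion(samples: list[float]) -> str:
    """Step 2: classify the signal to obtain an emotion (toy rule)."""
    return "happy" if sum(samples) / len(samples) > 0.4 else "sad"

def map_emotion(emotion: str) -> dict:
    """Step 3: map the emotion to walking-animation parameters."""
    return {"gait": "bouncy" if emotion == "happy" else "slumped"}

def synchronize_animation(params: dict) -> str:
    """Step 4: drive the 3D virtual human's walk cycle (stubbed)."""
    return f"walking with {params['gait']} gait"

def evaluate(result: str) -> bool:
    """Step 5: test and evaluate the produced animation (stubbed)."""
    return "gait" in result

# End-to-end flow: read -> classify -> map -> synchronize -> evaluate.
outcome = evaluate(synchronize_animation(map_emotion(classify_emotion(read_signal()))))
```

The point of the sketch is the data flow between the five stages, not any particular classifier or animation system.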

    Narrative design of sadness in Heavy Rain

    Aware of the problems videogames have faced in trying to elicit sadness from their players, we analysed the videogame Heavy Rain for its capability to induce sadness in players. The game was studied from two perspectives: the characters' non-verbal expressivity and the audiovisual artistic properties of the game. We found that, for the narrative design of sad interactive sequences, the game followed a three-step model: attachment, rupture, and passivity.

    Affective level design for a role-playing videogame evaluated by a brain–computer interface and machine learning methods

    Game science has become a research field that attracts industry attention due to a rich worldwide market. Understanding the player experience requires the formalization and empirical investigation of mental-state concepts such as flow and boredom, taking advantage of the objective data that psychophysiological methods like electroencephalography (EEG) can provide. This work studies affective ludology and presents two different game levels for Neverwinter Nights 2 developed with the aim of manipulating emotions; two sets of affective design guidelines are presented, with a rigorous formalization that considers the characteristics of the role-playing genre and its specific gameplay. An empirical investigation with a brain–computer interface headset was conducted: by extracting numerical data features, machine learning techniques classify the different activities of the gaming sessions (tasks and events) to verify whether their design differentiation coincides with the affective one. The observed results, also supported by subjective questionnaire data, confirm the validity of the proposed guidelines, suggesting that this evaluation methodology could be extended to other evaluation tasks.
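The evaluation pipeline described above can be illustrated in miniature: extract a numerical feature from (synthetic) EEG epochs and classify which gaming activity each epoch belongs to. A nearest-centroid classifier stands in for the paper's machine learning methods; the feature, the activity labels, and the data are all illustrative assumptions.

```python
# Toy sketch: classify gaming-session activities from EEG-like epochs.

def band_power(epoch: list[float]) -> float:
    """Toy feature: mean squared amplitude of an epoch."""
    return sum(x * x for x in epoch) / len(epoch)

def train_centroids(epochs: list[list[float]], labels: list[str]) -> dict:
    """Compute one feature centroid per activity label."""
    sums, counts = {}, {}
    for epoch, label in zip(epochs, labels):
        sums[label] = sums.get(label, 0.0) + band_power(epoch)
        counts[label] = counts.get(label, 0) + 1
    return {lab: sums[lab] / counts[lab] for lab in sums}

def classify(epoch: list[float], centroids: dict) -> str:
    """Assign the epoch to the activity with the nearest centroid."""
    p = band_power(epoch)
    return min(centroids, key=lambda lab: abs(centroids[lab] - p))

# Synthetic training epochs: low-amplitude "relaxing" vs high-amplitude "combat".
train = [[0.1, 0.2, 0.1], [0.2, 0.1, 0.2], [1.0, 1.2, 0.9], [1.1, 0.8, 1.0]]
labels = ["relaxing", "relaxing", "combat", "combat"]
centroids = train_centroids(train, labels)
```

Checking whether the predicted activity matches the level's intended affective design is the verification step the abstract refers to.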

    Character emotion experience in virtual environments

    This paper presents an emotion module from an authoring tool for interactive storytelling being developed within the European Project INSCAPE. The Atmosphere Editor (AE) is an INSCAPE software plug-in. Its aim is to help authors easily create virtual interactive scenes that are recognized as emotional, in order to contribute to higher coherence of their content and simultaneously to emphasize their communicative purposes. It works by attributing emotional meaning to the virtual environments and character classes that act on the virtual story-world. It is therefore designed to produce a semantic intervention in the story, without intending to transcend the storyteller's work. AE thus presents a taxonomy capable of sustaining the communicational optimization of interactive narratives at an emotional level. The AE intervention additionally has a possible pedagogical virtue, permitting story authors to learn about potential emotional uses of specific virtual parameters. It also permits the INSCAPE user to understand the emotional semantic canons of interactive virtual stories.
    European Project: INSCAPE IP - EU RTD IST 2004-00415