
    Designing interactive virtual environments with feedback in health applications.

    One of the most important factors influencing user experience in human-computer interaction is the user's emotional reaction. Interactive environments, including serious games, that respond to user emotions improve their effectiveness and user satisfaction. Testing and training user emotional competence is meaningful in the healthcare field, which motivated us to analyze immersive affective games driven by emotional feedback. In this dissertation, a systematic model for designing interactive environments is presented, consisting of three essential modules: affect modeling, affect recognition, and affect control. To collect data for analysis and construct these modules, a series of experiments was conducted using virtual reality (VR) to evoke user emotional reactions and to monitor those reactions through physiological data. The analysis results lead to a novel framework for designing affective games in virtual reality, covering the interaction mechanism, a graph-based structure, and user modeling. The Oculus Rift was used in the experiments to provide immersive virtual reality with affective scenarios, and a sample application was implemented as a cross-platform VR physical-training serious game for elderly people to demonstrate the essential parts of the framework. Measures of playability and effectiveness are discussed. The introduced framework can serve as a guiding principle for designing affective VR serious games. Possible healthcare applications include emotion competence training, educational software, and therapy methods.
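    The abstract describes a feedback loop of three modules (affect recognition, affect modeling, affect control) driven by physiological data. The following is a minimal sketch of such a loop, assuming hypothetical class names, signal features, and thresholds that are not taken from the dissertation; it only illustrates how the three modules could fit together.

    ```python
    # Hypothetical sketch of the three-module affect loop:
    # recognition (physiological signals -> arousal estimate),
    # modeling (track the user's state over time),
    # control (adapt the VR scenario in response).
    # All names and numbers are illustrative, not from the dissertation.
    from dataclasses import dataclass, field
    from statistics import mean


    @dataclass
    class PhysiologicalSample:
        heart_rate: float        # beats per minute
        skin_conductance: float  # microsiemens


    class AffectRecognizer:
        """Maps raw physiological data to a coarse arousal estimate in [0, 1]."""
        def estimate_arousal(self, sample: PhysiologicalSample) -> float:
            # Toy normalization; a real system would use a trained classifier.
            hr = min(max((sample.heart_rate - 60) / 60, 0.0), 1.0)
            sc = min(max(sample.skin_conductance / 20, 0.0), 1.0)
            return 0.5 * hr + 0.5 * sc


    @dataclass
    class AffectModel:
        """Keeps a short history of arousal estimates for the current user."""
        history: list = field(default_factory=list)

        def update(self, arousal: float) -> float:
            self.history.append(arousal)
            return mean(self.history[-5:])  # smoothed recent arousal


    class AffectController:
        """Chooses the next scenario node (graph-based structure) from arousal."""
        def next_scene(self, smoothed_arousal: float) -> str:
            if smoothed_arousal > 0.7:
                return "calming_scene"
            if smoothed_arousal < 0.3:
                return "stimulating_scene"
            return "continue_current_scene"


    # One iteration of the feedback loop:
    recognizer, model, controller = AffectRecognizer(), AffectModel(), AffectController()
    sample = PhysiologicalSample(heart_rate=95, skin_conductance=12.0)
    print(controller.next_scene(model.update(recognizer.estimate_arousal(sample))))
    ```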

    Energy Modeling and Architecture Exploration for Emotion Detection Systems

    Energy consumption is one of the main constraints faced by designers of communicating objects. Indeed, the number of communicating objects is increasing rapidly, ranging from consumer applications (video games, healthcare objects) to industrial applications (M2M, automatic car parking, drones). This paper presents a new high-level model for estimating the autonomy of wearable communicating objects, in particular emotion detection systems, together with a design space exploration. The innovation of this methodology is to meet the autonomy constraint of health objects while maintaining reasonable performance, defined by the emotion recognition rate. The influence of the architecture (RF protocol, type and number of sensors, ...) and its configuration on system autonomy and emotion recognition rate is studied in order to propose the most suitable system. The results show that a high recognition rate and sufficient autonomy for users can be achieved.
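    The abstract describes exploring how architecture choices (radio protocol, sensor set) trade off autonomy against recognition rate. Below is a back-of-the-envelope sketch of such an exploration; every power figure and recognition-rate value is a made-up placeholder, not a number from the paper, and serves only to show the shape of the trade-off calculation.

    ```python
    # Illustrative autonomy estimate for wearable emotion-detection configurations.
    # All constants are hypothetical placeholders, not values from the paper.
    BATTERY_MWH = 400.0  # assumed battery capacity in milliwatt-hours

    # Hypothetical average power draw (mW) per component.
    SENSOR_POWER = {"ECG": 1.5, "GSR": 0.8, "EMG": 2.0}
    RF_POWER = {"BLE": 3.0, "WiFi": 60.0}
    PROCESSING_POWER = 5.0

    # Hypothetical recognition rate as a function of the chosen sensor set.
    RECOGNITION_RATE = {
        frozenset({"GSR"}): 0.65,
        frozenset({"ECG", "GSR"}): 0.78,
        frozenset({"ECG", "GSR", "EMG"}): 0.84,
    }


    def autonomy_hours(sensors: frozenset, rf: str) -> float:
        """Battery life (hours) for a given sensor set and radio protocol."""
        total_mw = sum(SENSOR_POWER[s] for s in sensors) + RF_POWER[rf] + PROCESSING_POWER
        return BATTERY_MWH / total_mw


    # Enumerate a few configurations, as a design space exploration would.
    for sensors, rate in RECOGNITION_RATE.items():
        for rf in RF_POWER:
            print(f"{sorted(sensors)} over {rf}: "
                  f"{autonomy_hours(sensors, rf):.1f} h, recognition rate {rate:.0%}")
    ```

    Such a model lets the designer pick the configuration that meets the autonomy constraint while keeping the recognition rate acceptable, which is the trade-off the paper studies.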