
    Sketch-based virtual human modelling and animation

    Animated virtual humans created by skilled artists play a remarkable role in today’s public entertainment. However, ordinary users remain mere spectators due to a lack of the necessary expertise, equipment, and computer skills. We developed a new method and a novel sketching interface that enable anyone who can draw to “sketch out” 3D virtual humans and animation. We devised a “Stick Figure → Fleshing-out → Skin Mapping” graphical pipeline, which decomposes the complexity of figure drawing and considerably boosts modelling and animation efficiency. We developed a gesture-based method for 3D pose reconstruction from 2D stick-figure drawings, and investigated a “Creative Model-based Method” that emulates the human perception process to translate users’ 2D freehand sketches into 3D human bodies of various sizes, shapes, and fat distributions. Our current system supports character animation in various forms, including articulated figure animation, 3D mesh model animation, and 2D contour/NPR animation with personalised drawing styles. The interface also supports sketch-based crowd animation and 2D storyboarding of interactions between multiple 3D characters. A preliminary user study was conducted to inform the overall system design, and the system has been formally tested by various users on Tablet PCs. After minimal training, even a beginner can create vivid virtual humans and animate them within minutes.
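    The abstract describes a three-stage "Stick Figure → Fleshing-out → Skin Mapping" pipeline but gives no implementation detail. Below is a minimal, purely illustrative Python sketch of how such a staged data flow might be structured; the data types, field names, and placeholder reconstruction logic are assumptions, not the authors' actual method.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class StickFigure:                        # 2D sketch: joints and bones (assumed structure)
    joints_2d: List[Tuple[float, float]]
    bones: List[Tuple[int, int]]          # index pairs into joints_2d

@dataclass
class FleshedBody:                        # 3D skeleton plus body-shape parameters
    joints_3d: List[Tuple[float, float, float]]
    shape_params: Dict[str, object] = field(default_factory=dict)

@dataclass
class SkinnedModel:                       # final animatable mesh bound to the skeleton
    vertices: List[Tuple[float, float, float]]
    skeleton: FleshedBody

def reconstruct_pose(sketch: StickFigure) -> FleshedBody:
    """Lift 2D stick-figure joints to 3D (placeholder: zero depth)."""
    return FleshedBody(joints_3d=[(x, y, 0.0) for x, y in sketch.joints_2d])

def flesh_out(body: FleshedBody, shape_params: Dict[str, object]) -> FleshedBody:
    """Attach body-shape parameters inferred from the freehand contour sketch."""
    body.shape_params = dict(shape_params)
    return body

def map_skin(body: FleshedBody) -> SkinnedModel:
    """Bind a mesh to the fleshed-out skeleton (placeholder: joints as vertices)."""
    return SkinnedModel(vertices=list(body.joints_3d), skeleton=body)

# End-to-end: 2D sketch -> articulated, animatable character
sketch = StickFigure(joints_2d=[(0.0, 1.8), (0.0, 1.5), (0.0, 1.0)], bones=[(0, 1), (1, 2)])
model = map_skin(flesh_out(reconstruct_pose(sketch), {"height": 1.8, "build": "slim"}))
print(len(model.vertices), "vertices bound to skeleton")
```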

    Trends and Techniques in Visual Gaze Analysis

    Visualizing gaze data is an effective way to quickly interpret eye tracking results. This paper presents a study investigating the benefits and limitations of visual gaze analysis among eye tracking professionals and researchers. The results were used to create a tool for visual gaze analysis within a Master's project. Comment: pages 89-93, The 5th Conference on Communication by Gaze Interaction - COGAIN 2009: Gaze Interaction For Those Who Want It Most, ISBN: 978-87-643-0475-
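    For readers unfamiliar with visual gaze analysis, the sketch below shows two common gaze visualizations this line of work relies on, a scanpath and a fixation heatmap. The synthetic fixation coordinates and durations are invented placeholders, not data from the study.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic fixation data: positions in screen pixels, durations in ms (assumed)
rng = np.random.default_rng(0)
x = rng.normal(640, 150, 200)
y = rng.normal(360, 100, 200)
dur = rng.uniform(80, 400, 200)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Scanpath: fixations connected in temporal order, marker size ~ fixation duration
ax1.plot(x, y, color="lightgray", linewidth=0.8, zorder=1)
ax1.scatter(x, y, s=dur * 0.3, alpha=0.6, zorder=2)
ax1.set_title("Scanpath")
ax1.invert_yaxis()            # screen coordinates: y grows downward

# Heatmap: spatial density of fixations
ax2.hist2d(x, y, bins=40, cmap="hot")
ax2.set_title("Fixation heatmap")
ax2.invert_yaxis()

plt.tight_layout()
plt.show()
```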

    nARratives of augmented worlds

    This paper examines augmented reality (AR) as a rising form of interactive narrative that combines computer-generated elements with reality, and fictional with non-fictional objects, in the same immersive experience. Drawing on contemporary theory in narratology, we propose to view this blending of reality worlds as a metalepsis, a transgression of the boundaries between reality and fiction, and argue that authors could benefit from using existing conventions of narration to emphasize the transgressed boundaries, as is done in other media. Our contribution is three-fold: first, we analyze the inherent connections between narrative, immersion, interactivity, fictionality, and AR using narrative theory; second, we comparatively survey AR narrative works from the past 15 years based on these theoretical elements; lastly, we postulate a future for AR narratives through the perspective of the advancing technologies of both interactive narratives and AR.

    Hybrid Playground: Integrating videogame tools and strategies into children's playgrounds

    Based on the concept of the hybrid city (understood as the result of transforming current models of perceiving and experiencing the city through the integration of technological systems into public space), we propose transforming urban playgrounds into settings for interactive audiovisual play that foster physical-digital game experiences and energise collaborative relations between users. Urban playground equipment thus becomes a set of tangible audiovisual interfaces that support videogame-like play strategies.

    Gameplay experience in a gaze interaction game

    Assessing gameplay experience for gaze interaction games is a challenging task. For this study, a gaze interaction modification of Half-Life 2 was created that allowed eye tracking control. The mod was deployed during an experiment at Dreamhack 2007, where participants had to play with gaze navigation and afterwards rate their gameplay experience. The results show low tension and negative affect scores on the gameplay experience questionnaire, as well as high positive challenge, immersion, and flow ratings. The correlation between spatial presence and immersion for gaze interaction was high and warrants further investigation. It is concluded that gameplay experience can be correctly assessed with the methodology presented in this paper. Comment: pages 49-54, The 5th Conference on Communication by Gaze Interaction - COGAIN 2009: Gaze Interaction For Those Who Want It Most, ISBN: 978-87-643-0475-
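    To illustrate the kind of analysis the abstract reports (correlating spatial presence with immersion scores from a gameplay-experience questionnaire), here is a minimal Python sketch. The participant scores below are fabricated placeholders, not the study's data, and the Pearson coefficient is simply one reasonable choice of correlation measure.

```python
import numpy as np

# One row per participant: (spatial presence, immersion), e.g. mean Likert ratings (assumed)
scores = np.array([
    [4.2, 4.0],
    [3.8, 3.5],
    [4.5, 4.4],
    [2.9, 3.1],
    [3.6, 3.9],
])

presence, immersion = scores[:, 0], scores[:, 1]
r = np.corrcoef(presence, immersion)[0, 1]   # Pearson correlation coefficient
print(f"Pearson r(spatial presence, immersion) = {r:.2f}")
```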

    Tapping into effective emotional reactions via a user-driven audio design tool

    A major problem when tackling any audio design task aimed at conveying important and informative content is that the designer’s own emotions, tastes, and value system are imposed on the finished design choices, rather than reflecting those of the end user. In the past, the problem has been rooted in the tendency to use passive test subjects in rigid environments: subjects react to sounds with no means of controlling what they hear. This paper suggests a system for participatory sound design that generates results by activating test subjects and giving them significant control over the sound experience under test. The audio design tool described here, the AWESOME (Auditory Work Environment Simulation Machine) Sound Design Tool, sets out to give the end user direct influence on the design process through a simple yet innovative technical application. This web-based tool allows end users to make emotive decisions about the kinds of audio signals they find most appropriate for given situations. The results can be used both to generate general knowledge about listening experiences and, more importantly, as direct user input to actual sound design processes.
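    The participatory idea, users choosing the sounds they find most appropriate for a given situation and the aggregated choices feeding back into the design, can be sketched as a simple vote tally. The situation and sound-file names below are invented, and this is only an illustration of the workflow, not the tool's implementation.

```python
from collections import Counter, defaultdict

votes = defaultdict(Counter)   # situation -> tally of chosen sound variants

def record_choice(situation: str, chosen_sound: str) -> None:
    """Store one user's preferred sound for a given work situation."""
    votes[situation][chosen_sound] += 1

# Simulated user sessions (placeholder data)
record_choice("incoming alarm", "soft_chime.wav")
record_choice("incoming alarm", "soft_chime.wav")
record_choice("incoming alarm", "harsh_buzzer.wav")
record_choice("task complete", "rising_tone.wav")

for situation, counter in votes.items():
    sound, count = counter.most_common(1)[0]
    print(f"{situation}: preferred '{sound}' ({count} votes)")
```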

    Body, dance and screen: new audiovisual practices around dance performances

    Since the earliest practices interrelating audiovisual media and dance, much has been written about the specificity of videodance as a mode of video creation or video art, and diverse artistic explorations have emerged in which the body, musical parameters, and the sequential moving image are integrated. Although this has been happening since the 1970s, digitisation and media convergence have renewed interest in formulas for hybridising these media. Indeed, in recent theorisation on interactive stage forms, dance has not been left behind; it appears to constitute a standing body of innovation built around the concept of remediated dance: dancers and choreographers have found in technology many possibilities for creative exploration, generating a significant current of interest around the body in dance or in movement. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech

    KickSoul: A Wearable System for Feet Interactions with Digital Devices

    In this paper we present a wearable device that maps natural foot movements into inputs for digital devices. KickSoul consists of an insole with embedded sensors that tracks foot movements and triggers actions in the devices around us. We present a novel approach to using our feet as input devices in mobile situations when our hands are busy. We analyze natural foot movements and their meaning before activating an action. The paper discusses different applications for this technology as well as the implementation of our prototype.
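    As a rough illustration of mapping foot movements to device inputs in the spirit described above, the sketch below detects a "kick" from insole accelerometer samples with a simple threshold and debounce, then triggers a placeholder action. The thresholds, the sample stream, and the trigger_action stub are assumptions for illustration, not KickSoul's actual recognition pipeline.

```python
import math

KICK_THRESHOLD = 2.5   # acceleration magnitude (in g) above which we call it a kick (assumed)

def trigger_action(name: str) -> None:
    print(f"triggered: {name}")       # stand-in for sending an event to a paired device

def detect_kicks(samples):
    """samples: iterable of (ax, ay, az) accelerometer readings in g."""
    armed = True
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if armed and magnitude > KICK_THRESHOLD:
            trigger_action("dismiss notification")
            armed = False             # debounce until the foot settles again
        elif magnitude < 1.2:         # roughly at rest (gravity only)
            armed = True

# Simulated stream: rest, a kick, rest
detect_kicks([(0.0, 0.0, 1.0), (0.1, 0.2, 1.1), (1.8, 2.0, 1.5), (0.0, 0.1, 1.0)])
```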