
    Integrating internal behavioural models with external expression

    Users will believe in a virtual character more if they can empathise with it and understand what ‘makes it tick’. This is helped by making the motivations of the character, and the other processes that go towards creating its behaviour, clear to the user. This paper proposes that this can be achieved by linking the behavioural or cognitive system of the character to expressive behaviour. The idea is discussed in general and then demonstrated with an implementation that links a simulation of perception to the animation of a character’s eyes.

    An Extendable Multiagent Model for Behavioural Animation

    This paper presents a framework for visually simulating the behaviour of actors in virtual environments. In principle, environmental interaction follows a cyclic processing of perception, decision, and action. As natural life-forms perceive their environment by active sensing, our approach likewise lets the artificial actor actively sense the virtual world. This allows us to place characters in non-preprocessed dynamic virtual environments, which we call generic environments. A main aspect of our framework is the strict distinction between a behaviour pattern, which we term a model, and its instances, named characters, which use that pattern. This allows characters to share one or more behaviour models. Low-level tasks such as sensing and acting are taken over by so-called subagents, subordinate modules plugged into the character as extensions. In a demonstration we show the application of our framework: we place the same character in different environments and let it climb and descend stairs, ramps and hills autonomously. Additionally, its reactivity to moving objects is tested. In future, this approach will be applied to the simulation of an urban environment.
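The model/character separation and the perception–decision–action cycle described above can be sketched as follows. This is a minimal illustration under our own assumptions, not the paper's implementation; all class and rule names are invented.

```python
class BehaviourModel:
    """A shared behaviour pattern ("model"): maps a percept to an action."""
    def decide(self, percept):
        # toy rule: climb if an obstacle is sensed ahead, otherwise walk
        return "climb" if percept.get("obstacle_ahead") else "walk"

class Character:
    """An instance ("character") that uses a shared behaviour model."""
    def __init__(self, name, model):
        self.name = name
        self.model = model          # many characters may share one model
        self.last_action = None

    def sense(self, environment):
        # active sensing: the character queries the (non-preprocessed)
        # environment directly instead of reading precomputed annotations
        return {"obstacle_ahead": environment.get(self.name, False)}

    def step(self, environment):
        percept = self.sense(environment)              # perception
        self.last_action = self.model.decide(percept)  # decision
        return self.last_action                        # action

shared = BehaviourModel()
a = Character("a", shared)   # both characters share one behaviour model
b = Character("b", shared)
world = {"a": True, "b": False}   # 'a' faces a stair, 'b' does not
print(a.step(world), b.step(world))   # -> climb walk
```

Because the model holds no per-character state, swapping in a different `BehaviourModel` changes the behaviour of every character that shares it.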

    On the simulation of interactive non-verbal behaviour in virtual humans

    Development of virtual humans has focused mainly on two broad areas: conversational agents and computer game characters. Computer game characters have traditionally been action-oriented, focused on the game-play, while conversational agents have been focused on sensible, intelligent conversation. While virtual humans have incorporated some form of non-verbal behaviour, it has been quite limited and, more importantly, connected only loosely (if at all) to the behaviour of the real human interacting with the virtual human, owing to a lack of sensor data and of any system to respond to that data. The interactional aspect of non-verbal behaviour is highly important in human-human interaction, and previous research has demonstrated that people treat media (and therefore virtual humans) as real people, so interactive non-verbal behaviour is also important in the development of virtual humans. This paper presents the challenges in creating virtual humans that are non-verbally interactive and, drawing corollaries with the development history of control systems in robotics, presents some approaches to solving these challenges, specifically using behaviour-based systems. It shows how an order-of-magnitude improvement in the response time of virtual humans in conversation can be obtained, and that development of rapidly responding non-verbal behaviours can start with just a few behaviours, with more added without difficulty later in development.
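One common reading of the behaviour-based approach borrowed from robotics is a fixed priority stack of simple reactive behaviours, each mapping the latest sensor data directly to an output, so latency stays low and new behaviours can be appended incrementally. The sketch below follows that reading; the specific behaviours and sensor keys are invented for illustration.

```python
def gaze_aversion(sensors):
    # avert gaze while speaking under mutual gaze (a common conversational cue)
    return "look_away" if sensors.get("user_gazing") and sensors.get("speaking") else None

def mutual_gaze(sensors):
    return "look_at_user" if sensors.get("user_gazing") else None

def idle_glance(sensors):
    return "glance_around"   # default behaviour, always applicable

# highest priority first; adding a behaviour later is just a list insert
BEHAVIOURS = [gaze_aversion, mutual_gaze, idle_glance]

def react(sensors):
    """Return the output of the highest-priority behaviour that fires."""
    for behaviour in BEHAVIOURS:
        action = behaviour(sensors)
        if action is not None:
            return action
```

Because each behaviour is a direct sensor-to-action mapping with no deliberation layer, the response to new sensor data is a single pass over this list.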

    A Mimetic Strategy to Engage Voluntary Physical Activity In Interactive Entertainment

    We describe the design and implementation of a vision-based interactive entertainment system that makes use of both involuntary and voluntary control paradigms. Unintentional input to the system from a potential viewer is used to drive attention-getting output and encourage the transition to voluntary interactive behaviour. The iMime system consists of a character animation engine based on the interaction metaphor of a mime performer that simulates non-verbal communication strategies, without spoken dialogue, to capture and hold the attention of a viewer. The system was developed in the context of a project studying care of dementia sufferers. Care for a dementia sufferer can place unreasonable demands on the time and attentional resources of their caregivers or family members. Our study contributes to the eventual development of a system aimed at providing relief to dementia caregivers, while at the same time serving as a source of pleasant interactive entertainment for viewers. The work reported here is also aimed at a more general study of the design of interactive entertainment systems involving a mixture of voluntary and involuntary control. Comment: 6 pages, 7 figures, ECAG08 workshop

    Interacting Unities: An Agent-Based System

    Recently architects have been inspired by Thompson's Cartesian deformations and Waddington's flexible topological surface to work within a dynamic field characterized by forces. In this more active space of interactions, movement is the medium through which form evolves. This paper explores the interaction between pedestrians and their environment by regarding it as a process occurring between the two. It is hypothesized that the recurrent interaction between pedestrians and environment can lead to a structural coupling between those elements: every time a change occurs in one of them, as an expression of its own structural dynamics, it triggers changes in the other. An agent-based system has been developed in order to explore that interaction, in which the two interacting elements, agents (pedestrians) and environment, are autonomous units with their own sets of internal rules. The result is a landscape in which each agent locally modifies its environment, which in turn affects its movement, while the other agents respond to the new environment at a later time, indicating that stigmergy can take place in interactions with a human analogy. It is found that it is the environment's internal rules that determine the nature and extent of change.
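The stigmergic loop described above — each agent modifies its local environment, and the modified environment later biases every agent's movement — can be sketched minimally as follows. This is our illustration, not the paper's system; the one-dimensional field, the deposit amount, and the movement rule are all assumptions.

```python
def step(agents, field, deposit=1.0):
    """Move each agent one cell toward the stronger neighbouring trace,
    then deposit trace at its new position (the stigmergic write-back)."""
    width = len(field)
    for i, pos in enumerate(agents):
        left, right = (pos - 1) % width, (pos + 1) % width
        # the environment's internal state determines the move ...
        pos = left if field[left] > field[right] else right
        agents[i] = pos
        # ... and the agent in turn modifies the environment
        field[pos] += deposit
    return agents, field

agents = [2, 5]           # two pedestrians on a ring of 8 cells
field = [0.0] * 8         # the environment: a trace field, initially empty
for _ in range(4):        # recurrent interaction over several time steps
    agents, field = step(agents, field)
```

After a few steps the agents begin revisiting cells they previously marked: the coupling runs through the environment rather than through any direct agent-to-agent communication, which is the defining property of stigmergy.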

    Synopsis of an engineering solution for a painful problem: Phantom Limb Pain

    This paper is a synopsis of a recently proposed solution for treating patients who suffer from Phantom Limb Pain (PLP). The underpinning approach of this research and development project is based on an extension of “mirror box” therapy, which has had some promising results in pain reduction. An outline is provided of an immersive, individually tailored environment giving the patient a virtually realised limb presence as a means of pain reduction. The virtual 3D holographic environment is intended to provide immersive, engaging and creative environments and tasks to encourage and maintain patients’ interest, an important consideration for two of the more challenging populations under study (over-60s and war veterans). The system is hoped to reduce PLP by more than 3 points on an 11-point Visual Analogue Scale (VAS), since a reduction of less than 3 points could be attributed to distraction alone.

    A false colouring real time visual saliency algorithm for reference resolution in simulated 3-D environments

    In this paper we present a novel false colouring visual saliency algorithm and illustrate how it is used in the Situated Language Interpreter system to resolve natural language references.
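One common reading of false colouring for reference resolution is that each scene object is rendered in a unique flat "id colour", so the object under any salient pixel can be recovered with a single colour lookup rather than geometric tests. The sketch below follows that reading under our own assumptions; it is not the authors' implementation, and the palette is invented.

```python
def id_colour(object_index):
    """Encode an object index as a unique 24-bit RGB triple."""
    return (object_index >> 16 & 0xFF, object_index >> 8 & 0xFF, object_index & 0xFF)

def object_at(pixel_colour, palette):
    """Resolve a pixel's false colour back to the object it belongs to."""
    r, g, b = pixel_colour
    return palette.get((r << 16) | (g << 8) | b)

# hypothetical scene: three objects, each with its own id colour
palette = {i: name for i, name in enumerate(["floor", "red box", "blue ball"])}

# pretend the saliency map peaked on a pixel rendered with object 2's colour
most_salient_pixel = id_colour(2)
print(object_at(most_salient_pixel, palette))   # -> blue ball
```

The appeal of the scheme is that it reuses the renderer itself as the lookup structure: resolving "the blue ball" to a scene object costs one framebuffer read, independent of scene complexity.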

    Affective games: a multimodal classification system

    Affective gaming is a relatively new field of research that exploits human emotions to influence gameplay for an enhanced player experience. Changes in a player’s psychological state are reflected in their behaviour and physiology, so recognition of such variation is a core element of affective games. Complementary sources of affect offer more reliable recognition, especially in contexts where one modality is partial or unavailable. As multimodal recognition systems, affect-aware games are subject to the practical difficulties met by traditional trained classifiers. In addition, inherent game-related challenges in data collection and performance arise while attempting to sustain an acceptable level of immersion. Most existing scenarios employ sensors that offer limited freedom of movement, resulting in less realistic experiences. Recent advances now offer technology that allows players to communicate more freely and naturally with the game and, furthermore, to control it without input devices. However, the affective game industry is still in its infancy and needs to catch up with the life-like level of adaptation already provided by graphics and animation.
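The complementarity point above — recognition should degrade gracefully when one modality is partial or unavailable — is often realised with late fusion of per-modality scores. The sketch below is a minimal illustration under our own assumptions; the modalities, weights, and score scale are invented, not taken from the paper.

```python
def fuse(scores, weights):
    """Weighted average over the modalities that actually produced a score;
    missing modalities (None) are renormalised away rather than treated as zero."""
    available = {m: s for m, s in scores.items() if s is not None}
    if not available:
        return None
    total = sum(weights[m] for m in available)
    return sum(weights[m] * s for m, s in available.items()) / total

# hypothetical affect-arousal scores in [0, 1] from three modalities
weights = {"face": 0.5, "physiology": 0.3, "behaviour": 0.2}
full = fuse({"face": 0.8, "physiology": 0.6, "behaviour": 0.7}, weights)
partial = fuse({"face": None, "physiology": 0.6, "behaviour": 0.7}, weights)  # face occluded
```

Renormalising over the available modalities keeps the fused score on the same scale whether one sensor drops out or all are present, which is what makes the complementary sources "more reliable" than any single one.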

    Gaze Behavior, Believability, Likability and the iCat

    The iCat is a user-interface robot with the ability to express a range of emotions through its facial features. This paper summarizes our research into whether we can increase the believability and likability of the iCat for its human partners through the application of gaze behaviour. Gaze behaviour serves several functions during social interaction, such as mediating conversation flow, communicating emotional information, and avoiding distraction by restricting visual input. Several types of eye and head movements are necessary for realizing these functions. We designed and evaluated a gaze behaviour system for the iCat robot that implements realistic models of the major types of eye and head movements found in living beings: vergence, the vestibulo-ocular reflex, smooth pursuit movements, and gaze shifts. We discuss how these models are integrated into the software environment of the iCat and how they can be used to create complex interaction scenarios. We report on user tests and draw conclusions for future evaluation scenarios.
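Of the movement types listed above, the vestibulo-ocular reflex is the easiest to sketch: as the head rotates toward a target, the eyes counter-rotate so that the combined gaze direction stays locked on the target. The toy model below is our illustration only; the angles, gain, and head-speed limit are assumptions, not values from the paper.

```python
def vor_step(gaze_target, head_angle, head_speed=5.0):
    """Advance the head toward the target (rate-limited) and return the
    compensating eye-in-head angle so that gaze = head + eye stays on target."""
    error = gaze_target - head_angle
    head_angle += max(-head_speed, min(head_speed, error))  # slow head movement
    eye_angle = gaze_target - head_angle                    # VOR compensation
    return head_angle, eye_angle

head, target = 0.0, 20.0   # angles in degrees; a 20-degree gaze shift
for _ in range(5):
    head, eye = vor_step(target, head)
    assert abs((head + eye) - target) < 1e-9   # gaze never leaves the target
```

The eyes reach the target immediately (via the large initial `eye_angle`) while the head catches up over several steps, after which the eyes re-centre — the characteristic eye-then-head profile of a combined gaze shift.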