
    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real-time, and it lets users feel by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience in which interactions with virtual objects are mediated by hand-held input devices or hand gestures. Additionally, users are shown only a representation of their hands floating in front of the camera, seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR. Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
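
    The correspondence MS2 builds between a scanned room and its virtual counterpart, together with the skeleton-driven full-body avatar, can be pictured with a short sketch. The code below is a hypothetical illustration, not the MS2 implementation; the class names, joint names and the shared room coordinate frame are assumptions made for the example.

        # Hypothetical sketch (not MS2 code): virtual objects inherit the pose of
        # their physical counterparts, and a full-body avatar copies the poses of
        # the user's tracked skeleton joints every frame.
        from dataclasses import dataclass

        @dataclass
        class Pose:
            position: tuple   # (x, y, z) in an assumed shared room coordinate frame
            rotation: tuple   # quaternion (x, y, z, w)

        class MappedObject:
            """A virtual object placed exactly where its physical counterpart sits,
            so touching one means touching the other (passive haptics)."""
            def __init__(self, name: str, physical_pose: Pose):
                self.name = name
                self.virtual_pose = physical_pose   # identical placement by construction

        class Avatar:
            """Full-body avatar driven by a user's tracked skeleton."""
            def __init__(self, joint_names):
                self.joints = {j: Pose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0)) for j in joint_names}

            def update(self, tracked_joints: dict):
                # Copy each tracked joint pose onto the avatar rig for this frame.
                for name, pose in tracked_joints.items():
                    if name in self.joints:
                        self.joints[name] = pose

        # Usage sketch: register one physical object and apply one frame of tracking data.
        desk = MappedObject("desk", Pose((1.2, 0.0, 0.7), (0.0, 0.0, 0.0, 1.0)))
        avatar = Avatar(["head", "left_hand", "right_hand", "pelvis"])
        avatar.update({"head": Pose((0.1, 1.7, 0.0), (0.0, 0.0, 0.0, 1.0))})

    Because each virtual object inherits its pose directly from its physical counterpart, touching the virtual desk necessarily means touching the real one, which is what makes the passive haptics described above possible.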

    Movement, Action, and Situation: Presence in Virtual Environments

    Presence is commonly defined as the subjective feeling of "being there". It has mainly been conceived of as deriving from immersion, interaction, and social and narrative involvement with suitable technology. We argue that presence depends on a suitable integration of aspects relevant to an agent's movement and perception, to her actions, and to her conception of the overall situation in which she finds herself, as well as on how these aspects mesh with the possibilities for action afforded in the interaction with the virtual environment.

    Interacting Unities: An Agent-Based System

    Recently, architects have been inspired by Thompson's Cartesian deformations and Waddington's flexible topological surface to work within a dynamic field characterized by forces. In this more active space of interactions, movement is the medium through which form evolves. This paper explores the interaction between pedestrians and their environment by regarding it as a process occurring between the two. It is hypothesized that the recurrent interaction between pedestrians and environment can lead to a structural coupling between those elements: every time a change occurs in one of them, as an expression of its own structural dynamics, it triggers changes in the other. An agent-based system has been developed in order to explore that interaction, in which the two interacting elements, agents (pedestrians) and environment, are autonomous units with a set of internal rules. The result is a landscape where each agent locally modifies its environment, which in turn affects its movement, while the other agents respond to the new environment at a later time, indicating that the phenomenon of stigmergy can also arise in an analogous human setting. It is found that it is the environment's internal rules that determine the nature and extent of change.
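
    A minimal sketch makes the stigmergic loop concrete: agents deposit a trace in the cell they move to, the environment decays those traces according to its own internal rule, and later agents read the modified field when choosing their next step. The code below is an illustrative assumption, not the system described in the paper; the grid size, decay rate and preference-for-strongest-trace rule are invented for the example.

        # Hypothetical sketch (not the authors' implementation) of a stigmergy loop:
        # agents modify a grid environment, the environment evolves by its own rule,
        # and the modified environment steers the agents' later movement.
        import random

        SIZE = 20        # grid dimensions (assumed)
        DECAY = 0.95     # the environment's internal rule: trace evaporation per step
        DEPOSIT = 1.0    # trace left by an agent on the cell it moves to

        grid = [[0.0] * SIZE for _ in range(SIZE)]
        agents = [(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(10)]

        def neighbours(x, y):
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                yield (x + dx) % SIZE, (y + dy) % SIZE   # toroidal wrap for simplicity

        def step():
            global agents
            moved = []
            for x, y in agents:
                # Agent rule: usually follow the strongest neighbouring trace,
                # occasionally explore at random.
                options = list(neighbours(x, y))
                if random.random() < 0.2:
                    nx, ny = random.choice(options)
                else:
                    nx, ny = max(options, key=lambda c: grid[c[0]][c[1]])
                grid[nx][ny] += DEPOSIT      # the agent locally modifies its environment
                moved.append((nx, ny))
            # Environment rule: traces decay, limiting how long they influence others.
            for row in range(SIZE):
                for col in range(SIZE):
                    grid[row][col] *= DECAY
            agents = moved

        for _ in range(100):
            step()

    Varying DECAY alone changes how long one agent's modification keeps steering the others, which mirrors the finding that the environment's internal rules determine the nature and extent of change.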

    A situated cognition perspective on presence

    During interaction with computer-based 3-D simulations like virtual reality, users may experience a sense of involvement called presence. Presence is commonly defined as the subjective feeling of "being there". We discuss the state of the art in this innovative research area and introduce a situated cognition perspective on presence. We argue that presence depends on the proper integration of aspects relevant to an agent's movement and perception, to her actions, and to her conception of the overall situation in which she finds herself, as well as on how these aspects mesh with the possibilities for action afforded in the interaction with the artifact. We also aim at showing that studies of presence offer a test-bed for different theories of situated cognition.

    Exploring the Affective Loop

    Research in psychology and neurology shows that both body and mind are involved when experiencing emotions (Damasio 1994, Davidson et al. 2003). People are also very physical when they try to communicate their emotions. Somewhere in between being consciously and unconsciously aware of it ourselves, we produce both verbal and physical signs to make other people understand how we feel. Simultaneously, this production of signs involves us in a stronger personal experience of the emotions we express. Emotions are also communicated in the digital world, but there is little focus on users' personal as well as physical experience of emotions in the available digital media. In order to explore whether and how we can expand existing media, we have designed, implemented and evaluated eMoto, a mobile service for sending affective messages to others. With eMoto, we explicitly aim to address both cognitive and physical experiences of human emotions. By combining affective gestures for input with affective expressions that make use of colors, shapes and animations for the background of messages, the interaction "pulls" the user into an affective loop. In this thesis we define what we mean by affective loop and present a user-centered design approach expressed through four design principles inspired by previous work within Human-Computer Interaction (HCI) but adjusted to our purposes: embodiment (Dourish 2001) as a means to address how people communicate emotions in real life; flow (Csikszentmihalyi 1990) to reach a state of involvement that goes further than the current context; ambiguity of the designed expressions (Gaver et al. 2003) to allow for open-ended interpretation by the end-users instead of simplistic one-emotion one-expression pairs; and natural but designed expressions to address people's natural couplings between cognitively and physically experienced emotions. We also present results from an end-user study of eMoto indicating that subjects got both physically and emotionally involved in the interaction and that the designed "openness" and ambiguity of the expressions were appreciated and understood by our subjects. Through the user study, we identified four potential design problems that have to be tackled in order to achieve an affective loop effect: the extent to which users feel in control of the interaction, harmony and coherence between cognitive and physical expressions, timing of expressions and feedback in a communicational setting, and effects of users' personality on their emotional expressions and experiences of the interaction.
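
    The input side of such an affective loop can be sketched as a small mapping from gesture features to background parameters. The sketch below is hypothetical rather than eMoto's actual design: the two feature channels (movement energy and pressure), the colour scheme and the shape rule are assumptions chosen only to illustrate coupling physical input with colours, shapes and animations.

        # Hypothetical sketch (not eMoto's design): map two normalised gesture
        # features to the colour, animation speed and shape of a message background.
        import colorsys

        def expression_from_gesture(movement_energy: float, pressure: float) -> dict:
            """movement_energy and pressure are assumed to be normalised to 0..1."""
            movement_energy = min(max(movement_energy, 0.0), 1.0)
            pressure = min(max(pressure, 0.0), 1.0)

            # Warmer hues for firmer pressure (hue 0.6 = blue down to 0.0 = red).
            hue = 0.6 * (1.0 - pressure)
            r, g, b = colorsys.hsv_to_rgb(hue, 0.8, 0.9)

            return {
                "colour_rgb": (round(r * 255), round(g * 255), round(b * 255)),
                "animation_speed": 0.2 + 1.8 * movement_energy,   # arbitrary scale
                "shape": "spiky" if movement_energy > 0.5 else "rounded",
            }

        # A calm, light gesture versus an energetic, firm one.
        print(expression_from_gesture(0.1, 0.2))
        print(expression_from_gesture(0.9, 0.8))

    Keeping the mapping continuous, rather than a one-emotion one-expression lookup, is one way to preserve the ambiguity the design principles above call for.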

    A proposed framework of an interactive semi-virtual environment for enhanced education of children with autism spectrum disorders

    Education of people with special needs has recently been considered a key element in the field of medical education. Recent developments in the area of information and communication technologies may enable the development of collaborative interactive environments which facilitate early-stage education and provide specialists with robust tools indicating the person's autism spectrum disorder level. Towards the goal of establishing an enhanced learning environment for children with autism, this paper attempts to provide a framework for a semi-controlled real-world environment used for the daily education of an autistic person according to scenarios selected by the specialists. The proposed framework employs both real-world objects and virtual environments equipped with humanoids able to provide emotional feedback and to demonstrate empathy. Potential examples and usage scenarios for such environments are also described.

    How Do You Like Me in This: User Embodiment Preferences for Companion Agents

    We investigate the relationship between the embodiment of an artificial companion and user perception of and interaction with it. In a Wizard of Oz study, 42 users interacted with one of two embodiments, a physical robot or a virtual agent on a screen, through a role-play of secretarial tasks in an office, with the companion providing essential assistance. Findings showed that participants in both condition groups, when given the choice, would prefer to interact with the robot companion, mainly for its greater physical or social presence. Subjects also found the robot less annoying and talked to it more naturally. However, this preference for the robotic embodiment is not reflected in the users' actual rating of the companion or their interaction with it. We reflect on this contradiction and conclude that in a task-based context a user focuses much more on a companion's behaviour than on its embodiment. This underlines the feasibility of our efforts in creating companions that migrate between embodiments while maintaining a consistent identity from the user's point of view.