7,808 research outputs found

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    Full text link
    MetaSpace II (MS2) is a social Virtual Reality (VR) system in which multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and it lets users feel virtual objects by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles such as walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience in which interactions with virtual objects are mediated by hand-held input devices or hand gestures, and users are shown only a representation of their hands floating in front of the camera from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.
    Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
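
    As a minimal sketch of what driving a full-body avatar from tracked skeleton joints can look like, consider the Python fragment below. The joint names, the avatar interface and the world-alignment offset are hypothetical illustrations, not details of the MS2 implementation.

        # Minimal sketch of driving an avatar from tracked skeleton joints.
        # Joint names and the avatar interface are hypothetical; MS2's actual
        # implementation is not described at this level of detail.

        JOINTS = ["head", "torso", "l_hand", "r_hand", "l_foot", "r_foot"]

        def update_avatar(avatar, skeleton_frame, world_offset):
            """Copy each tracked joint's real-world position into the
            corresponding avatar joint, shifted by the offset that aligns
            the 3D scan of the room with the virtual world."""
            for joint in JOINTS:
                x, y, z = skeleton_frame[joint]  # metres, sensor space
                avatar.set_joint_position(joint, (x + world_offset[0],
                                                  y + world_offset[1],
                                                  z + world_offset[2]))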

    Evaluating a dancer's performance using Kinect-based skeleton tracking

    Get PDF
    In this work, we describe a novel system that automatically evaluates dance performances against a gold-standard performance and provides visual feedback to the performer in a 3D virtual environment. The system acquires the motion of a performer via Kinect-based human skeleton tracking, making the approach viable for a large range of users, including home enthusiasts. Unlike traditional gaming scenarios, where the motion of a user must be kept in sync with a pre-recorded avatar displayed on screen, the technique described in this paper targets online interactive scenarios where dance choreographies can be set, altered, practiced and refined by users. We address several areas of this application scenario. In particular, a set of appropriate signal processing and soft computing methodologies is proposed for temporally aligning dance movements from two different users and quantitatively evaluating one performance against another.
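
    Temporal alignment of two motion sequences of the kind the abstract mentions is commonly handled with dynamic time warping (DTW); the sketch below shows a standard DTW cost over per-frame joint-coordinate features. DTW is an illustrative assumption here, not necessarily the paper's chosen methodology.

        import numpy as np

        def dtw_distance(seq_a, seq_b):
            """Dynamic time warping between two motion sequences, each an
            array of shape (frames, features) such as flattened joint
            coordinates. Returns the accumulated alignment cost; lower means
            the performances are more similar once timing differences are
            warped away."""
            n, m = len(seq_a), len(seq_b)
            cost = np.full((n + 1, m + 1), np.inf)
            cost[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
                    cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                         cost[i, j - 1],      # deletion
                                         cost[i - 1, j - 1])  # match
            return cost[n, m]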

    Touching the void: exploring virtual objects through a vibrotactile glove

    Get PDF
    This paper describes a simple low-cost approach to adding an element of haptic interaction within a virtual environment. Using off-the-shelf hardware and software, we describe a simple setup that can be used to physically explore virtual objects in space. This setup comprises a prototype glove with a number of vibrating actuators that provide the haptic feedback, a Kinect camera for tracking the user's hand, and a virtual reality development environment. As a proof of concept and to test the efficiency of the system as well as its potential applications, we developed a simple application in which we created four different shapes within a virtual environment, allowing users to explore them and guess their shape through touch alone.
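
    A minimal sketch of how proximity-driven vibrotactile feedback can be computed, assuming a single tracked fingertip position and a nearest-surface-point query; the function and its parameters are illustrative assumptions, not the paper's implementation.

        def actuator_intensity(hand_pos, surface_point, max_range=0.05):
            """Map the distance between a tracked fingertip and the nearest
            point on a virtual object's surface to a vibration intensity in
            [0, 1]. Touching (distance 0) gives full intensity; beyond
            max_range (metres) the actuator stays off."""
            dx = hand_pos[0] - surface_point[0]
            dy = hand_pos[1] - surface_point[1]
            dz = hand_pos[2] - surface_point[2]
            dist = (dx * dx + dy * dy + dz * dz) ** 0.5
            return max(0.0, 1.0 - dist / max_range)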

    Remote Real-Time Collaboration Platform enabled by the Capture, Digitisation and Transfer of Human-Workpiece Interactions

    Get PDF
    In today's highly globalised manufacturing ecosystem, product design and verification activities, production and inspection processes, and technical support services are spread across global supply chains and customer networks. A platform that lets global teams collaborate with each other in real time to perform complex tasks is therefore highly desirable. This work investigates the design and development of a remote real-time collaboration platform using human motion capture technology powered by infrared depth imaging sensors borrowed from the gaming industry. The unique functionality of the proposed platform is the sharing of physical contexts during a collaboration session by exchanging not only human actions but also the effects of those actions on the task environment. This enables teams to work remotely on a common task at the same time and to get immediate feedback from each other, which is vital for collaborative design, inspection and verification tasks in the factories of the future.
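
    One plausible way to exchange both actions and their effects is to serialise each update as a skeleton frame plus the resulting workpiece pose changes. The message schema below is a hypothetical sketch, not the platform's actual wire format.

        import json
        import time

        def encode_update(user_id, joints, moved_objects):
            """Serialise one collaboration update: the user's tracked joints
            plus the resulting changes to workpiece poses, so that remote
            peers see both the action and its effect on the task
            environment. Field names are illustrative only."""
            return json.dumps({
                "user": user_id,
                "timestamp": time.time(),
                "skeleton": {name: list(pos) for name, pos in joints.items()},
                "objects": [{"id": oid, "pose": list(pose)}
                            for oid, pose in moved_objects.items()],
            })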

    Environment capturing with Microsoft Kinect

    Get PDF

    Natural User Interface for Education in Virtual Environments

    Get PDF
    Education and self-improvement are key features of human behavior. However, learning in the physical world is not always desirable or achievable; that is how simulators came to be. In some domains, purely virtual simulators can be created in place of physical ones. In this research we present a novel environment for learning that uses a natural user interface. We humans are not designed to operate and manipulate objects via a keyboard, mouse or controller. Our natural way of interaction and communication is through our actuators (hands and feet) and our sensors (hearing, vision, touch, smell and taste). That is why it makes more sense to use sensors that can track our skeletal movements, estimate our pose, and interpret our gestures. After acquiring and processing the desired natural input, a system can analyze those gestures and translate them into movement signals.
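
    As a toy illustration of interpreting gestures from skeletal tracking, the sketch below detects a raised hand from joint positions. The joint names follow common Kinect SDK conventions but are assumptions here, and a real system would smooth the signal over several frames.

        def detect_raised_hand(skeleton, margin=0.10):
            """Toy gesture test on a tracked skeleton: report which hand,
            if any, is raised at least `margin` metres above the head.
            `skeleton` maps joint names to (x, y, z) positions in metres."""
            head_y = skeleton["head"][1]
            if skeleton["hand_right"][1] > head_y + margin:
                return "right"
            if skeleton["hand_left"][1] > head_y + margin:
                return "left"
            return None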

    A Conceptual Framework for Motion Based Music Applications

    Get PDF
    Imaginary projections are the core of the framework for motion-based music applications presented in this paper. Their design depends on the space covered by the motion tracking device, but also on the musical feature involved in the application. They are a very powerful tool because they make it possible not only to project the image of a traditional acoustic instrument into the virtual environment, but also to express any spatially defined abstract concept. The system pipeline starts from the musical content and, through a geometrical interpretation, arrives at its projection in the physical space. Three case studies involving different motion tracking devices and different musical concepts are analyzed. The three examined applications have been programmed and already tested by the authors. They aim, respectively, at expressive musical interaction (Disembodied Voices), tonal music knowledge (Harmonic Walk) and 20th-century music composition (Hand Composer).
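
    A minimal sketch of one possible imaginary projection, assuming a hand tracked along a horizontal axis is quantised onto one octave of a major scale; the volume bounds and MIDI mapping are illustrative assumptions, not details drawn from the three applications.

        # Sketch of an "imaginary projection": quantise a tracked hand's
        # horizontal position within the sensor's working volume onto the
        # degrees of a major scale. Bounds and mapping are assumptions.

        MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets from root

        def hand_x_to_midi(x, x_min=-1.0, x_max=1.0, root=60):
            """Project x (metres, sensor space) onto one octave of a major
            scale rooted at `root` (MIDI note 60 = middle C)."""
            t = min(max((x - x_min) / (x_max - x_min), 0.0), 1.0)
            degree = min(int(t * len(MAJOR_SCALE)), len(MAJOR_SCALE) - 1)
            return root + MAJOR_SCALE[degree]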