The $10 Million ANA Avatar XPRIZE Competition Advanced Immersive Telepresence Systems
The $10M ANA Avatar XPRIZE aimed to create avatar systems that can transport
human presence to remote locations in real time. The participants of this
multi-year competition developed robotic systems that allow operators to see,
hear, and interact with a remote environment in a way that feels as if they are
truly there. In turn, people in the remote environment were given the
impression that the operator was present inside the avatar robot. At the
competition finals, held in November 2022 in Long Beach, CA, USA, the avatar
systems were evaluated on their support for remotely interacting with humans,
exploring new environments, and employing specialized skills. This article
describes the competition stages with tasks and evaluation procedures, reports
the results, presents the winning teams' approaches, and discusses lessons
learned. Comment: Extended version of article accepted for the competitions column.
Movement, Action, and Situation: Presence in Virtual Environments
Presence is commonly defined as the subjective feeling of "being there". It has been mainly conceived of as deriving from immersion, interaction, and social and narrative involvement with suitable technology. We argue that presence depends on a suitable integration of aspects relevant to an agent's movement and perception, to her actions, and to her conception of the overall situation in which she finds herself, as well as on how these aspects mesh with the possibilities for action afforded in the interaction with the virtual environment.
Robust Immersive Telepresence and Mobile Telemanipulation: NimbRo wins ANA Avatar XPRIZE Finals
Robotic avatar systems promise to bridge distances and reduce the need for
travel. We present the updated NimbRo avatar system, winner of the $5M grand
prize at the international ANA Avatar XPRIZE competition, which required
participants to build intuitive and immersive robotic telepresence systems that
could be operated by briefly trained operators. We describe key improvements
for the finals, compared to the system used in the semifinals: To operate
without a power- and communications tether, we integrated a battery and a
robust redundant wireless communication system. Video and audio data are
compressed using low-latency HEVC and Opus codecs. We propose a new locomotion
control device with tunable resistance force. To increase flexibility, the
robot's upper-body height can be adjusted by the operator. We describe
essential monitoring and robustness tools which enabled the success at the
competition. Finally, we analyze our performance at the competition finals and
discuss lessons learned. Comment: M. Schwarz and C. Lenz contributed equally.
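The locomotion control device with tunable resistance force can be illustrated with a small sketch. The mapping, parameter names, and numbers below are hypothetical, not NimbRo's actual implementation; the idea is only that a single resistance setting both stiffens the pedal physically and flattens the velocity command curve:

```python
def pedal_to_velocity(displacement: float,
                      resistance: float,
                      v_max: float = 1.0,
                      deadband: float = 0.05) -> float:
    """Map operator pedal displacement (0..1) to a base velocity command.

    `resistance` is a hypothetical tuning parameter: raising it lets a
    cautious operator trade top speed for finer control. A small
    deadband near the rest position prevents unintended drift.
    """
    if displacement < deadband:
        return 0.0
    # Rescale the usable pedal range to 0..1 past the deadband.
    effective = (displacement - deadband) / (1.0 - deadband)
    # Higher resistance flattens the response curve.
    return v_max * effective / (1.0 + resistance)
```

With `resistance=0.0` a fully pressed pedal commands `v_max`; with `resistance=1.0` the same press commands half of it.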
Augmented Reality and Its Application
Augmented Reality (AR) is a discipline that includes the interactive experience of a real-world environment, in which real-world objects and elements are enhanced using computer perceptual information. It has many potential applications in education, medicine, and engineering, among other fields. This book explores these potential uses, presenting case studies and investigations of AR for vocational training, emergency response, interior design, architecture, and much more.
The Reality of Virtual Environments: WPE II Paper
Recent advances in computer technology have now made it possible to create and display three-dimensional virtual environments for real-time exploration and interaction by a user. This paper surveys some of the research done in this field at such places as NASA's Ames Research Center, MIT's Media Laboratory, the University of North Carolina at Chapel Hill, and the University of New Brunswick. Limitations to the reality of these simulations will be examined, focusing on input and output devices, computational complexity, and tactile and visual feedback.
Transmission of Tactile Roughness through Master-slave Systems
In this study, a tactile-roughness transmission system applicable to master-slave systems with a communication time delay is developed. The master-side system constructs a local model of target objects placed in the slave-side environment. The tactile feedback presented to the operator at the master side is produced by combining the physical properties of the target objects in the local model with the kinetic information of the operator. The time delay between the operator's motion and the tactile feedback is cancelled because the stimuli are synchronized with the exploratory motions. The proposed system is applied to the transmission of tactile roughness. The tactile stimuli presented to the operator are vibratory stimuli whose amplitude and frequency are controlled. These stimuli are locally synthesized by combining the surface wavelength of the target objects and the operator's hand velocity. Using the developed tactile-roughness transmission system, an experiment on transmitting the perceived roughness of grating scales was conducted. The roughness perceived by the operators was found to correlate highly with the roughness of the scales in the slave-side environment, with a coefficient of 0.83.
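The local synthesis described above can be sketched in a few lines, assuming the vibration frequency is the exploratory hand velocity divided by the grating's spatial wavelength; the function name and the proportional amplitude model are illustrative assumptions, not the paper's exact formulation:

```python
def vibration_stimulus(hand_velocity_mm_s: float,
                       wavelength_mm: float,
                       groove_depth_mm: float):
    """Synthesize vibrotactile stimulus parameters on the master side.

    Because the grating wavelength comes from the local model and the
    velocity from the operator's own motion, no round trip to the
    slave side is needed, which is what cancels the time delay.
    """
    frequency_hz = hand_velocity_mm_s / wavelength_mm  # spatial -> temporal
    amplitude = groove_depth_mm  # assumed proportional to groove depth
    return frequency_hz, amplitude

# Exploring a 2 mm grating at 100 mm/s yields a 50 Hz vibration.
f, a = vibration_stimulus(100.0, 2.0, 0.5)
```

The key property is that the stimulus frequency tracks the operator's motion instantaneously, so faster strokes over the same surface feel like a finer, higher-frequency texture, as they do in direct touch.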
Electrotactile feedback applications for hand and arm interactions: A systematic review, meta-analysis, and future directions
Haptic feedback is critical in a broad range of
human-machine/computer-interaction applications. However, the high cost and low
portability/wearability of haptic devices remain unresolved issues, severely
limiting the adoption of this otherwise promising technology. Electrotactile
interfaces have the advantage of being more portable and wearable due to their
reduced actuators' size, as well as their lower power consumption and
manufacturing cost. The applications of electrotactile feedback have been
explored in human-computer interaction and human-machine-interaction for
facilitating hand-based interactions in applications such as prosthetics,
virtual reality, robotic teleoperation, surface haptics, portable devices, and
rehabilitation. This paper presents a technological overview of electrotactile
feedback, as well as a systematic review and meta-analysis of its applications for
hand-based interactions. We discuss the different electrotactile systems
according to the type of application. We also present a quantitative aggregation
of the findings to offer a high-level overview of the state of the art and
suggest future directions. Electrotactile feedback systems
showed increased portability/wearability, and they were successful in rendering
and/or augmenting most tactile sensations, eliciting perceptual processes, and
improving performance in many scenarios. However, knowledge gaps (e.g.,
embodiment) as well as technical (e.g., recurrent calibration, electrode
durability) and methodological (e.g., sample size) limitations were identified,
which should be addressed in future studies. Comment: 18 pages, 1 table, 8 figures,
under review in IEEE Transactions on Haptics.
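As one illustration of the kind of quantitative aggregation such a meta-analysis performs, a minimal fixed-effect inverse-variance pooling could look like the sketch below; the study numbers are made up for illustration, not results from the review:

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling of per-study effect sizes.

    Each study is weighted by the inverse of its variance, so precise
    studies dominate; the pooled standard error is 1/sqrt(sum of weights).
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    standard_error = math.sqrt(1.0 / sum(weights))
    return pooled, standard_error

# Three hypothetical studies of electrotactile feedback on task performance.
effect, se = pooled_effect([0.4, 0.6, 0.5], [0.04, 0.09, 0.01])
```

A real meta-analysis would typically also test heterogeneity and fall back to a random-effects model when studies differ substantially, as the devices and tasks in this literature do.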
MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple
users can not only see and hear but also interact with each other, grasp and
manipulate objects, walk around in space, and get tactile feedback. MS2 allows
walking in physical space by tracking each user's skeleton in real-time and
allows users to feel by employing passive haptics, i.e., when users touch or
manipulate an object in the virtual world, they simultaneously also touch or
manipulate a corresponding object in the physical world. To enable these
elements in VR, MS2 creates a correspondence in spatial layout and object
placement by building the virtual world on top of a 3D scan of the real world.
Through the association between the real and virtual world, users are able to
walk freely while wearing a head-mounted device, avoid obstacles like walls and
furniture, and interact with people and objects. Most current virtual reality
(VR) environments are designed for a single user experience where interactions
with virtual objects are mediated by hand-held input devices or hand gestures.
Additionally, users are only shown a representation of their hands in VR
floating in front of the camera as seen from a first-person perspective. We
believe that representing each user as a full-body avatar controlled by the
natural movements of the person in the real world (see Figure 1d) can greatly
enhance believability and a user's sense of immersion in VR. Comment: 10 pages, 9 figures. Video:
http://living.media.mit.edu/projects/metaspace-ii
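The real-to-virtual correspondence that makes passive haptics work reduces, in the simplest case, to one rigid transform from tracked room coordinates into the scanned virtual scene. A minimal sketch, assuming a yaw-plus-translation alignment (names and model are illustrative, not MS2's actual code):

```python
import math

def real_to_virtual(p, yaw_rad, offset):
    """Map a tracked real-world position into the virtual scene.

    Because the virtual world is built on top of a 3D scan of the real
    room, a single rigid transform (here: yaw rotation about the
    vertical axis plus a translation) keeps every physical object
    aligned with its virtual counterpart, so touching the virtual
    object means touching the physical one.
    """
    x, y, z = p
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return (c * x - s * z + offset[0],
            y + offset[1],
            s * x + c * z + offset[2])

# With identity rotation and zero offset, real and virtual coincide.
p_virtual = real_to_virtual((1.0, 1.5, 2.0), 0.0, (0.0, 0.0, 0.0))
```

Once this transform is calibrated, every skeleton joint streamed from the trackers can be pushed through it each frame to drive the full-body avatar.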
Towards a wearable interface for immersive telepresence in robotics
In this paper we present an architecture for the study of telepresence, immersion, and human-robot interaction. The architecture is built around a wearable interface that provides the human user with visual, audio, and tactile feedback from a remote location. We have chosen to interface the system with the iCub humanoid robot, as it mimics many human sensory modalities, including vision (with gaze control) and tactile feedback, which offers a richly immersive experience for the human user. Our wearable interface allows human participants to observe and explore a remote location, while also being able to communicate verbally with others located in the remote environment. Our approach has been tested from a variety of distances, including university and business premises, and over wired, wireless, and Internet-based connections, using data compression to maintain the quality of the experience for the user. Initial testing has shown the wearable interface to be a robust system for immersive teleoperation, with a myriad of potential applications, particularly in social networking, gaming, and entertainment.