198 research outputs found

    Object Manipulation in Virtual Reality Under Increasing Levels of Translational Gain

    Room-scale Virtual Reality (VR) has become an affordable consumer reality, with applications ranging from entertainment to productivity. However, the limited physical space available for room-scale VR in the typical home or office poses a significant problem. To solve this, physical spaces can be extended by amplifying the mapping of physical to virtual movement (translational gain). Although amplified movement has been used since the earliest days of VR, little is known about how it influences reach-based interactions with virtual objects, now a standard feature of consumer VR. Consequently, this paper explores the picking and placing of virtual objects in VR for the first time, with translational gains between 1x (a one-to-one mapping of a 3.5m*3.5m virtual space to a physical space of the same size) and 3x (a 10.5m*10.5m virtual space mapped to a 3.5m*3.5m physical space). Results show that reaching accuracy is maintained up to 2x gain; going beyond this diminishes accuracy and increases simulator sickness and perceived workload. We suggest that gain levels of 1.5x to 1.75x can be used without compromising the usability of a VR task, significantly expanding the bounds of interactive room-scale VR.
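    The gain mapping described above amounts to scaling a user's tracked displacement about a calibrated origin. A minimal sketch (an illustration of the concept, not the authors' implementation; the function and variable names are assumed):

    ```python
    def apply_translational_gain(physical_pos, origin, gain):
        """Map a tracked physical position to a virtual position by
        amplifying horizontal displacement from a calibration origin.
        physical_pos, origin: (x, y, z) tuples in metres, y = height.
        Height is passed through 1:1, since gain is typically applied
        only to movement across the floor plane."""
        px, py, pz = physical_pos
        ox, oy, oz = origin
        return (ox + gain * (px - ox),   # amplified x displacement
                py,                      # height unamplified
                oz + gain * (pz - oz))   # amplified z displacement

    # With 3x gain, walking 1.75 m from the centre of a 3.5 m tracked
    # space reaches 5.25 m virtually, i.e. the edge of a 10.5 m space.
    virtual = apply_translational_gain((1.75, 1.6, 0.0), (0.0, 1.6, 0.0), 3.0)
    ```

    At 1x gain the mapping is the identity, matching the paper's one-to-one baseline condition.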

    Sketched Reality: Sketching Bi-Directional Interactions Between Virtual and Physical Worlds with AR and Actuated Tangible UI

    This paper introduces Sketched Reality, an approach that combines AR sketching and actuated tangible user interfaces (TUIs) for bi-directional sketching interaction. Bi-directional sketching enables virtual sketches and physical objects to "affect" each other through physical actuation and digital computation. In existing AR sketching, the relationship between the virtual and physical worlds is only one-directional: while physical interaction can affect virtual sketches, virtual sketches have no return effect on physical objects or the environment. In contrast, bi-directional sketching interaction allows seamless coupling between sketches and actuated TUIs. In this paper, we employ tabletop-size small robots (Sony Toio) and an iPad-based AR sketching tool to demonstrate the concept. In our system, virtual sketches drawn and simulated on an iPad (e.g., lines, walls, pendulums, and springs) can move, actuate, collide with, and constrain physical Toio robots, as if the virtual sketches and physical objects existed in the same space, through seamless coupling between AR and robot motion. This paper contributes a set of novel interactions and a design space of bi-directional AR sketching. We demonstrate a series of potential applications, such as tangible physics education, explorable mechanisms, tangible gaming for children, and in-situ robot programming via sketching. Comment: UIST 202

    Integration of Kinesthetic and Tactile Display: A Modular Design Concept

    This paper describes the systematic design of a modular setup for several integrated kinesthetic and cutaneous (tactile) display configurations. The proposed modular integration of a kinesthetic display and several tactile displays in a serial configuration provides a versatile experimental setup for exploring the integration of the kinesthetic and tactile modalities of human perception. The kinesthetic base display is a hyper-redundant device and sufficiently powerful to carry each of the compact tactile displays. In addition to a detailed description of the partly novel displays, a series of preliminary evaluation experiments is presented.

    Virtual Community Heritage: An Immersive Approach to Community Heritage

    Our relationship with cultural heritage has been transformed by digital technologies. Opportunities have emerged to preserve and access cultural heritage material while engaging audiences at both regional and global levels. The accessibility of technology has enabled audiences to participate in the digital heritage curation process. Participatory practices and co-production methodologies have created new relationships between museums and communities, as communities are engaged to become active participants in the co-design and co-creation of heritage material. Audiences today are more interested in experiences than in services, and museums and heritage organisations have the potential to entertain while providing engaging experiences beyond their physical walls. Mixed reality is an emerging method of engagement that allows enhanced interaction beyond traditional 3D visualisation models into fully immersive worlds. There is potential to transport audiences to past worlds, enhancing their experience and understanding of cultural heritage.

    Preliminary design of a multi-touch ultrasonic tactile stimulator

    This paper presents a method to control ultrasonic waves on a beam, enabling multi-touch ultrasonic tactile stimulation at two points, so as to give sensations to two fingers from two piezoelectric transducers. A multi-modal approach and a vector control method are used to regulate the vibration amplitude in order to modulate the friction coefficient with the fingers. An analytical model is presented, with experimental validation. Finally, a psychophysical experiment shows that multi-touch ultrasonic tactile stimulation is possible. This work was carried out within the framework of the StimTac project of IRCICA (institut de recherche sur les composants logiciels et matériel pour la communication avancée) and the Mint project of Inria.
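    The idea that vibration amplitude modulates friction can be sketched with a simplified first-order model of squeeze-film friction reduction (an assumed linear approximation for illustration only, not the paper's vector control method; the coefficient `k` and the cap are invented here):

    ```python
    def effective_friction(mu_static, amplitude_um, k=0.4, max_reduction=0.95):
        """Approximate squeeze-film friction reduction on an ultrasonic
        plate: higher vibration amplitude lowers the effective
        finger-surface friction coefficient.
        amplitude_um: peak vibration amplitude in micrometres.
        k: assumed friction reduction per micrometre (illustrative).
        The reduction is capped so friction never reaches zero."""
        reduction = min(k * amplitude_um, max_reduction)
        return mu_static * (1.0 - reduction)

    # At rest (0 um) friction is unchanged; at larger amplitudes the
    # finger feels a progressively more slippery surface.
    baseline = effective_friction(1.0, 0.0)   # unchanged friction
    slippery = effective_friction(1.0, 2.0)   # strongly reduced
    ```

    Modulating the amplitude over time (and independently at each of the two control points) is what produces distinct tactile sensations under each finger.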

    A Universal Volumetric Haptic Actuation Platform

    In this paper, we report a method of implementing a universal volumetric haptic actuation platform that can be adapted to fit a wide variety of visual displays with flat surfaces. The platform aims to enable the simulation of the 3D features of input interfaces. This goal is achieved using four readily available stepper motors in a diagonal cross configuration, with which we can quickly change the position of a surface in a manner that renders these volumetric features. In our research, we use a Microsoft Surface Go tablet placed on the haptic actuation platform to replicate the exploratory features of virtual keyboard keycaps displayed on the touchscreen. We asked seven participants to explore the surface of a virtual keypad comprising 12 keycaps. As a second task, random key positions were announced one at a time, which the participant was expected to locate. These experiments were used to understand how, and with what fidelity, volumetric feedback can improve performance (detection time, track length, and error rate) in locating specific keycaps with haptic feedback and in the absence of visual feedback. Participants completed the tasks successfully (p < 0.05), and their ability to feel convex keycaps was confirmed by their subjective comments. Peer reviewed.

    The IT potential of haptics: Touch access for people with disabilities

    In his licentiate thesis, Calle Sjöström sums up his own and Certec's experience from almost five years' work on haptic interfaces for people with disabilities. The haptic technologies tested have great potential for future development but need refinement.

    Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces

    This paper contributes to a taxonomy of augmented reality and robotics based on a survey of 460 research papers. Augmented and mixed reality (AR/MR) have emerged as new ways to enhance human-robot interaction (HRI) and robotic interfaces (e.g., actuated and shape-changing interfaces). Recently, an increasing number of studies in HCI, HRI, and robotics have demonstrated how AR enables better interactions between people and robots. However, research often remains focused on individual explorations, and key design strategies and research questions are rarely analyzed systematically. In this paper, we synthesize and categorize this research field along the following dimensions: 1) approaches to augmenting reality; 2) characteristics of robots; 3) purposes and benefits; 4) classification of presented information; 5) design components and strategies for visual augmentation; 6) interaction techniques and modalities; 7) application domains; and 8) evaluation strategies. We formulate key challenges and opportunities to guide and inform future research in AR and robotics.

    Capturing tactile properties of real surfaces for haptic reproduction

    Tactile feedback from an object’s surface enables us to discern its material properties and affordances. This understanding is used in digital fabrication processes by creating objects with high-resolution surface variations to influence a user’s tactile perception. As the design of such surface haptics commonly relies on knowledge from real-life experiences, it is unclear how to adapt this information for digital design methods. In this work, we investigate replicating the haptics of real materials. Using an existing process for capturing an object’s microgeometry, we digitize and reproduce the stable surface information of a set of 15 fabric samples. In a psychophysical experiment, we evaluate the tactile qualities of our set of original samples and their replicas. Our results show that direct reproduction of surface variations can influence different psychophysical dimensions of the tactile perception of surface textures. While the fabrication process did not preserve all properties, our approach shows that replicating surface microgeometries benefits fabrication methods in terms of haptic perception by covering a large range of tactile variations. Moreover, by changing the surface structure of a single fabricated material, its perceived material can be influenced. We conclude by proposing strategies for capturing and reproducing digitized textures so that they better resemble the perceived haptics of the originals.

    Virtual Reality Application of Interactive Interior: The Development Study for Computer and Android Smartphones

    This research describes the process, obstacles, and results of building a virtual reality application for an interactive interior using Unity 3D. We propose a VR application that supports interaction and VR views between users and the digital environment on computers and Android smartphones. The study has three stages: input, process, and output. A C# script was developed for user interaction with the digital environment. The user can interact with digital objects in a small interior room, for example by moving around, changing furniture colors, and switching the light on or off. The research successfully created a virtual reality application using Unity 3D for computers and Android smartphones. The study describes the steps of making virtual reality applications, the problems encountered during the process, and a more stable method for developing the interactive user features. Moreover, the research opens further opportunities for VR application development focused on other factors such as immersion, imagination, and insight.