
    Vibratory tactile display for textures

    We have developed a tactile display that delivers vibratory stimuli to a fingertip in contact with a vibrating tactor matrix. The display depicts tactile surface textures while the user explores a virtual object surface. A piezoelectric actuator drives each tactor in accordance with both the finger movement and the surface texture being traced. Spatiotemporal display control schemes were examined for presenting the fundamental surface texture elements. The temporal duration of the vibratory stimulus was experimentally optimized to simulate the adaptation process of cutaneous sensation. The duration selected for presenting a single line edge agreed with the time threshold of tactile sensation. Spatial stimulus disposition schemes were then discussed for representing other edge shapes. As an alternative to amplitude control, a method of augmented duration at the edge was investigated. The spatial resolution of the display was measured for lines presented both perpendicular and parallel to the finger axis. Discrimination of texture density was also measured on random-dot textures.
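    The movement-dependent control scheme described above can be sketched as follows, assuming a simple 1-D texture of line edges: a tactor fires a fixed-duration pulse when the finger crosses an edge. The edge positions, pulse duration, and all names are illustrative, not the paper's values.

```python
PULSE_MS = 30.0  # hypothetical stimulus duration; the paper optimizes this experimentally

def edges_crossed(prev_x, curr_x, edges):
    """Edges of the virtual texture traversed by the finger in one tracking step."""
    lo, hi = sorted((prev_x, curr_x))
    return [e for e in edges if lo < e <= hi]

def tactor_amplitude(t_since_crossing_ms):
    """Fixed-duration vibratory pulse: full amplitude while the pulse is active."""
    return 1.0 if t_since_crossing_ms < PULSE_MS else 0.0
```

    Driving each step of the finger-tracking loop through `edges_crossed` yields the spatiotemporal pattern: where the pulse fires depends on finger position, and how long it lasts is fixed by the optimized duration.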

    Switching Torque Converter: Concept and Preliminary Implementation


    Effects of self-avatar cast shadow and foot vibration on telepresence, virtual walking experience, and cybersickness from omnidirectional movie

    Human locomotion is most naturally achieved through walking, which is good for both mental and physical health. To provide a virtual walking experience to seated users, a system utilizing foot vibrations and simulated optical flow was developed. The current study sought to augment this system and examine the effect of an avatar's cast shadow and foot vibrations on the virtual walking experience and cybersickness. The omnidirectional movie and the avatar's walking animation were synchronized, with the cast shadow reflecting the avatar's movement on the ground. Twenty participants were exposed to virtual walking in six conditions (with/without foot vibrations and no/short/long shadow) and were asked to rate their sense of telepresence, walking experience, and occurrences of cybersickness. Our findings indicate that the synchronized foot vibrations enhanced telepresence as well as self-motion, walking, and leg-action sensations, while also reducing instances of nausea and disorientation sickness. The avatar's cast shadow was found to improve telepresence and leg-action sensation, but had no impact on self-motion and walking sensation. These results suggest that observation of the self-body cast shadow does not directly improve walking sensation, but is effective in enhancing telepresence and leg-action sensation, while foot vibrations are effective in improving telepresence and the walking experience and in reducing instances of cybersickness.

    High-Frequency Cybersickness Prediction Using Deep Learning Techniques With Eye-Related Indices

    Cybersickness is a growing concern in the field of virtual reality (VR). It is characterized by symptoms such as headache, sweating, disorientation, and nausea. These symptoms can considerably hinder the users' immersive experience in VR environments, leading to a pressing need for effective solutions to combat cybersickness. In this study, we aim to tackle cybersickness by presenting a novel high-frequency approach for detecting the timing at which users experience cybersickness. Our approach uses 1-, 5-, or 10-s time-series eye-related indices processed by deep learning algorithms to predict cybersickness severity. In five-fold cross-validation, we achieved 71.09% accuracy in classifying four classes of cybersickness severity when individuals were not distinguished. Furthermore, with individualized cross-validation, we achieved an accuracy of up to approximately 80%. Our approach outperforms other cybersickness prediction studies as it provides the highest frequency in predicting cybersickness. It is anticipated that our approach will be valuable not only for immediate evaluation by researchers investigating cybersickness mitigation but also for early detection and notification of users experiencing cybersickness symptoms. By predicting cybersickness, our approach has the potential to promote the future advancement of VR technology.
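    The abstract describes feeding 1-, 5-, or 10-s windows of eye-related time series to deep-learning classifiers. A minimal sketch of the windowing step is shown below; the sampling rate, window/hop lengths, and function name are illustrative assumptions, and the actual eye-tracking features and network architectures are not reproduced here.

```python
import numpy as np

def make_windows(signal, window_s, hop_s, rate_hz):
    """Slice a (samples, channels) eye-index signal into fixed-length,
    overlapping windows suitable as classifier inputs."""
    win = int(window_s * rate_hz)
    hop = int(hop_s * rate_hz)
    return np.stack([signal[i:i + win]
                     for i in range(0, len(signal) - win + 1, hop)])
```

    Each resulting window would then be labeled with the cybersickness severity reported at its end time, producing the four-class training set the abstract mentions.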

    Resolving Object References in Multimodal Dialogues for Immersive Virtual Environments

    Pfeiffer T, Latoschik ME. Resolving Object References in Multimodal Dialogues for Immersive Virtual Environments. In: Ikei Y, Göbel M, Chen J, eds. Proceedings of the IEEE Virtual Reality 2004. 2004: 35-42.
    This paper describes the underlying concepts and the technical implementation of a system for resolving multimodal references in Virtual Reality (VR). In this system the temporal and semantic relations intrinsic to referential utterances are expressed as a constraint satisfaction problem, where the propositional value of each referential unit during a multimodal dialogue incrementally updates the active set of constraints. As the system is based on findings of human cognition research, it also takes into account, e.g., constraints implicitly assumed by human communicators. The implementation takes VR-related real-time and immersive conditions into account and adapts its architecture to well-known scene-graph-based design patterns by introducing a so-called reference resolution engine. In both the conceptual work and the implementation, special care has been taken to allow further refinement and modification of the underlying resolving processes at a high level.
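    The constraint-satisfaction idea can be sketched minimally: each referential unit in the dialogue (a word, a pointing gesture) contributes a constraint that incrementally narrows the active set of candidate scene objects. This is an illustrative reconstruction under assumed data shapes, not the paper's reference resolution engine.

```python
def resolve_reference(candidates, constraints):
    """Incrementally filter candidate referents: each constraint is a
    predicate over scene objects, applied as the dialogue unfolds."""
    active = list(candidates)
    for constraint in constraints:
        active = [obj for obj in active if constraint(obj)]
    return active
```

    For the utterance "the red cube", the color word and the shape word would each add one predicate, and resolution succeeds when exactly one object remains in the active set.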

    Vehicle-ride sensation sharing system with stereoscopic 3D visual perception and vibro-vestibular feedback for immersive remote collaboration

    In this study, using a personal vehicle (i.e., a Segway) and a wheelchair-type motion display, we proposed a vehicle-ride sensation sharing system that enables a local rider to collaborate immersively with a remote driver. The local rider sitting in the motion display receives both 3D visual perception and vibro-vestibular sensation. The remote driver side of the system was developed by attaching two 360-degree cameras and a stabilizer to the Segway to capture stereoscopic 3D images and send them to each eye of a head-mounted display worn by the local rider. By modifying a conventional wheelchair with a simple, lightweight mechanism for actuation and vibration driven by two DC motors, we developed a prototype vibro-vestibular display for local riders. We then investigated the effectiveness of the vibro-vestibular wheelchair. The results showed that making the acceleration/deceleration of the wheelchair proportional to that of the visual cue significantly reduced virtual reality (VR) sickness by approximately 54% and increased the sense of riding a vehicle by approximately 2.25 times. Moreover, we conducted a demo experience at SIGGRAPH ASIA 2019 for 3 days, and 89 participants completed a questionnaire for system validation. The results suggested that vibro-vestibular feedback by the wheelchair is important for remote collaboration that uses a mobile vehicle.
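    The proportional coupling between the visual cue and the wheelchair motion described above can be sketched as a gain plus a saturation limit; both numeric values below are assumed for illustration, not taken from the study.

```python
def wheelchair_accel(visual_accel, gain=0.5, max_accel=1.0):
    """Map the visual cue's acceleration (m/s^2) to a wheelchair
    acceleration command, proportional up to a safety cap."""
    a = gain * visual_accel
    return max(-max_accel, min(max_accel, a))
```

    The cap keeps the motion display within the safe range of the DC-motor actuation regardless of how aggressively the remote vehicle accelerates.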