9 research outputs found

    Evaluation of the softness and its impression of visual stimuli in VR space

    To examine the softness and impression of visual objects in VR (Virtual Reality) space, impressions of visual stimuli were measured on a seven-point subjective rating scale while varying the stimuli's deformation resistance, shape, and color. The deformation-resistance value expresses how far an object deforms and returns to its original shape when touched in VR space: a lower value produces a larger deformation, like pudding, and a higher value a smaller deformation, like thick rubber. Three conditions were used: a lower value, a higher value, and no deformation. The stimuli took three shapes (sphere, cube, pyramid), their colors were selected from five colors (red, green, green, gray, and white), and each color was rendered with two material finishes (matte and metallic). Ten participants subjectively evaluated the softness and impression of each stimulus. The results show that the evaluation shifts from soft to hard as the deformation-resistance value increases for all stimuli in VR space. This suggests that the degree of deformation and return to the original shape can convey the softness of objects touched in VR space even though the user does not touch them physically. The relationship between the softness of the stimuli and the impressions they evoke in VR space is also discussed.
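
    The deformation-resistance parameter described in this abstract can be pictured with a minimal sketch: a hypothetical mapping from contact depth and resistance value to how far the surface visibly yields and how quickly it springs back. The function names and the exponential-recovery model below are illustrative assumptions, not the authors' implementation.

```python
import math

def visual_deformation(contact_depth: float, resistance: float) -> float:
    """Illustrative mapping: lower resistance -> larger visible dent (pudding),
    higher resistance -> smaller dent (thick rubber)."""
    if math.isinf(resistance):        # stands in for the "no-deformation" condition
        return 0.0
    return contact_depth / (1.0 + resistance)

def spring_back(deformation: float, resistance: float, dt: float) -> float:
    """Assumed exponential return toward the original shape after release."""
    return deformation * math.exp(-resistance * dt)

# Example: the same 1 cm press dents a low-resistance object far more than a
# high-resistance one, and the high-resistance one springs back faster.
for r in (0.5, 5.0, math.inf):
    d = visual_deformation(0.01, r)
    print(r, d, spring_back(d, r, 0.1) if d else 0.0)
```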

    Head Mounted Display Interaction Evaluation: Manipulating Virtual Objects in Augmented Reality

    Augmented Reality (AR) is getting close to real use cases, which is driving the creation of innovative applications and unprecedented growth in the consumer availability of Head-Mounted Display (HMD) devices. However, there is currently a lack of guidelines, common form factors and standard interaction paradigms across devices, with the result that each HMD manufacturer has created its own specifications. This paper presents the first experimental evaluation of two AR HMDs and their interaction paradigms: the HoloLens v1 (metaphoric interaction) and the Meta2 (isomorphic interaction). We report on precision, interactivity and usability metrics in an object-manipulation task-based user study. Twenty participants took part, and significant differences were found between the devices' interaction paradigms for move tasks, where the isomorphic mapped interaction outperformed the metaphoric mapped interaction in both time to completion and accuracy, while the opposite held for the resize task. From an interaction perspective, the isomorphic mapped interaction (using the Meta2) was perceived as more natural and usable, with a significantly higher usability score and a significantly lower task-load index. However, when task accuracy and time to completion are key, mixed interaction paradigms need to be considered.
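
    The distinction between the two interaction paradigms can be sketched in a few lines: an isomorphic mapping attaches the grabbed object to the tracked hand one-to-one, while a metaphoric mapping drives the object indirectly from a recognised gesture. The structure below is a simplified assumption for illustration, not the devices' actual SDK behaviour; all names and the gain value are made up.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def isomorphic_update(hand: Vec3, grab_offset: Vec3) -> Vec3:
    """Isomorphic mapping: the grabbed object follows the tracked hand 1:1."""
    return Vec3(hand.x + grab_offset.x, hand.y + grab_offset.y, hand.z + grab_offset.z)

def metaphoric_update(obj: Vec3, hand_delta: Vec3, gesture_held: bool, gain: float = 2.0) -> Vec3:
    """Metaphoric mapping (simplified assumption): while a recognised gesture such
    as an air-tap is held, hand displacement is applied to the object indirectly,
    scaled by a gain, rather than the object being attached rigidly to the hand."""
    if not gesture_held:
        return obj
    return Vec3(obj.x + gain * hand_delta.x,
                obj.y + gain * hand_delta.y,
                obj.z + gain * hand_delta.z)
```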

    Usability Analysis of an off-the-shelf Hand Posture Estimation Sensor for Freehand Physical Interaction in Egocentric Mixed Reality

    This paper explores freehand physical interaction in egocentric Mixed Reality through a usability study of a hand posture estimation sensor. We report on precision, interactivity and usability metrics in a task-based user study, examining the importance of additional visual cues during interaction. A total of 750 interactions were recorded from 30 participants performing 5 different interaction tasks (Move; Rotate: pitch (Y axis) and yaw (Z axis); Uniform scale: enlarge and shrink). Additional visual cues resulted in a shorter average time to interact; however, no consistent statistical differences were found between groups for performance and precision. The group with additional visual cues gave the system an average System Usability Scale (SUS) score of 72.33 (SD = 16.24), while the other group scored 68.0 (SD = 18.68). Overall, additional visual cues led to the system being perceived as more usable, even though the two conditions had limited effect on precision and interactivity metrics.
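
    The SUS figures quoted above come from the standard ten-item questionnaire. As a reference point, the standard scoring rule is sketched below; the example responses are made up.

```python
def sus_score(responses):
    """System Usability Scale: ten 1-5 Likert responses -> a 0-100 score.

    Odd-numbered items are positively worded (contribution = response - 1);
    even-numbered items are negatively worded (contribution = 5 - response).
    The summed contributions are multiplied by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    total = sum((r - 1) if i % 2 == 0 else (5 - r) for i, r in enumerate(responses))
    return total * 2.5

# Example: a mildly positive questionnaire lands a little above the
# commonly cited average of ~68.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```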

    Too Hot to Handle: An Evaluation of the Effect of Thermal Visual Representation on User Grasping Interaction in Virtual Reality

    The influence of interaction fidelity and rendering quality on perceived user experience has been widely explored in Virtual Reality (VR). However, differences in the interaction choices triggered by these rendering cues have not yet been explored. We present a study analysing the effect of thermal visual cues and contextual information on how 50 participants approach grasping and moving a virtual mug. The study comprises 3 different temperature cues (baseline empty, hot and cold) and 4 contextual representations, all embedded in a VR scenario. We evaluate 2 different hand representations (abstract and human) to assess grasp metrics. Results show that the temperature cues influenced grasp location: the mug handle was predominantly grasped, with a smaller grasp aperture, in the hot condition, while the body and top of the mug were preferred in the baseline and cold conditions.
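
    Grasp aperture, one of the grasp metrics above, is commonly computed as the Euclidean distance between the tracked thumb tip and index fingertip. A minimal sketch under that assumption follows; the joint positions are made up, and this is not the authors' analysis code.

```python
import math

def grasp_aperture(thumb_tip, index_tip):
    """Euclidean thumb-index distance in metres, a common grasp-aperture measure."""
    return math.dist(thumb_tip, index_tip)

# Example: roughly a 6 cm aperture while approaching the mug (made-up coordinates).
print(grasp_aperture((0.02, 1.10, 0.30), (0.05, 1.14, 0.27)))
```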

    Pseudo-haptics survey: Human-computer interaction in extended reality & teleoperation

    Pseudo-haptic techniques are becoming increasingly popular in human-computer interaction. They replicate haptic sensations by leveraging primarily visual feedback rather than mechanical actuators, bridging the gap between the real and virtual worlds by exploiting the brain's ability to integrate visual and haptic information. Among their many advantages, pseudo-haptic techniques are cost-effective, portable, and flexible: they eliminate the need to attach haptic devices to the body, which can be heavy and large and require considerable power and maintenance. Recent research has focused on applying these techniques to extended reality and mid-air interactions. To better understand the potential of pseudo-haptic techniques, the authors developed a novel taxonomy encompassing tactile feedback, kinesthetic feedback, and combined categories in multimodal approaches, ground not covered by previous surveys. This survey highlights multimodal strategies and potential avenues for future studies, particularly regarding the integration of these techniques into extended reality and collaborative virtual environments.
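
    One widely used pseudo-haptic mechanism, manipulating the control/display (C/D) ratio so that the displayed motion lags the real motion, illustrates the idea of replicating haptic sensations with visual feedback alone. The sketch below is a generic illustration of that mechanism, not a method taken from the survey; the names and values are made up.

```python
def displayed_position(real_pos, anchor, cd_ratio):
    """Pseudo-haptic C/D-ratio mapping: displayed motion = real motion * cd_ratio.

    A cd_ratio below 1 slows the visible hand or cursor while an object is held,
    which users tend to perceive as added weight or resistance; 1.0 is neutral.
    """
    return tuple(a + (r - a) * cd_ratio for r, a in zip(real_pos, anchor))

# Example: the real hand moved 20 cm upward, but the rendered hand is shown
# moving only 12 cm (cd_ratio = 0.6), suggesting a heavier object.
print(displayed_position((0.0, 0.20, 0.0), (0.0, 0.0, 0.0), 0.6))
```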

    All Hands on Deck: Choosing Virtual End Effector Representations to Improve Near Field Object Manipulation Interactions in Extended Reality

    Extended reality, or XR, is the umbrella term rapidly gaining traction to collectively describe Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) technologies. Together, these technologies extend the reality we experience, either by creating a fully immersive experience as in VR or by blending the virtual and real worlds as in AR and MR. The sustained success of XR in the workplace largely hinges on its ability to facilitate efficient user interactions. As when interacting with objects in the real world, users in XR typically interact with virtual elements such as objects, menus, windows, and information that together form the overall experience. Most of these interactions involve near-field object manipulation, for which users are generally provided with visual representations of themselves, also called self-avatars. Representations that involve only the distal entity are called end-effector representations, and they shape how users perceive XR experiences. Through a series of investigations, this dissertation evaluates the effects of virtual end-effector representations on near-field object retrieval interactions in XR settings. Based on studies conducted in virtual, augmented, and mixed reality, implications for the virtual representation of end effectors are discussed, and inferences are drawn for the future of near-field interaction in XR. This body of research aids technologists and designers by providing the details needed to tailor the right end-effector representation to improve near-field interactions, collectively building knowledge that informs the future of interactions in XR.

    Natural freehand grasping of virtual objects for augmented reality

    Grasping is a primary form of interaction with the surrounding world, and is an intuitive interaction technique by nature due to the highly complex structure of the human hand. Translating this versatile interaction technique to Augmented Reality (AR) can give interaction designers more opportunities to implement more intuitive and realistic AR applications. The work presented in this thesis uses quantifiable measures to evaluate the accuracy and usability of natural grasping of virtual objects in AR environments, and presents methods for improving this natural form of interaction. Following a review of physical grasping parameters and current methods of mediating grasping interactions in AR, a comprehensive analysis of natural freehand grasping of virtual objects in AR is presented to assess the accuracy, usability and transferability of this natural form of grasping to AR environments. The analysis spans four independent user studies (120 participants in total, 30 per study, and 5760 grasping tasks), in which natural freehand grasping performance is assessed for a range of virtual object sizes, positions and types in terms of grasping accuracy, task completion time and overall system usability. Findings from the first user study highlighted two key problems for natural grasping in AR: inaccurate depth estimation and inaccurate size estimation of virtual objects. Following the quantification of these errors, three methods for mitigating user errors and assisting users during natural grasping were presented and analysed: dual view visual feedback, drop shadows, and additional visual feedback when adding user-based tolerances during interaction tasks. Dual view visual feedback significantly improved user depth estimation, but also significantly increased task completion time. Drop shadows provided an alternative, and more usable, solution, significantly improving depth estimation, task completion time and the overall usability of natural grasping. User-based tolerances negated the fundamental problem of inaccurate size estimation of virtual objects by enabling users to perform natural grasping without needing to be highly accurate in their grasping performance, thus providing evidence that natural grasping can be usable in task-based AR environments. Finally, recommendations for enabling and further improving natural grasping interaction in AR environments are provided, along with guidelines for translating this form of natural grasping to other AR environments and user interfaces.
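
    The user-based tolerances described above can be pictured as accepting a grasp when the fingertips land within a margin of the virtual surface rather than exactly on it. The sphere target, names and values below are illustrative assumptions, not the thesis implementation.

```python
import math

def grasp_accepted(finger_tips, centre, radius, tolerance):
    """Accept a grasp when every tracked fingertip lies within `tolerance` metres
    of the sphere's surface, rather than requiring exact contact (assumed rule)."""
    return all(abs(math.dist(tip, centre) - radius) <= tolerance for tip in finger_tips)

# Example: a 5 mm tolerance forgives a slightly over- or under-estimated object size.
tips = [(0.031, 0.0, 0.0), (-0.033, 0.0, 0.0)]
print(grasp_accepted(tips, (0.0, 0.0, 0.0), radius=0.03, tolerance=0.005))  # True
```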

    KEER2022

    Half title: KEER2022. Diversities. Resource description: 25 July 202

    Grasping a virtual object with a bare hand

    No full text