
    Multi-point STM: Effects of Drawing Speed and Number of Focal Points on Users’ Responses using Ultrasonic Mid-Air Haptics

    Spatiotemporal modulation (STM) is used to render tactile patterns with ultrasound arrays. Previous research explored only the effects of single-point STM parameters, such as drawing speed (Vd). Here we explore the effects of multi-point STM on both perceptual (intensity) and emotional (valence/arousal) responses. This introduces a new control parameter for STM, the number of focal points (Nfp), on top of the conventional STM parameter (Vd). Our results from a study with 30 participants showed a negative effect of Nfp on perceived intensity and arousal, but no significant effects on valence. We also found that the effects of Vd still aligned with prior single-point results, even when different Nfp were used, suggesting that effects observed for single-point STM also apply to multi-point STM. We finally derive recommendations, such as using single-point STM to produce stimuli with higher intensity and/or arousal, or using multi-point STM for milder and more relaxing (less arousing) experiences.
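    As a rough illustration of how Nfp adds to Vd as a control parameter, the sketch below computes the instantaneous positions of Nfp focal points evenly spaced along a circular STM path traversed at drawing speed Vd. The circular geometry, 2 cm radius, and function name are assumptions for illustration, not the study's actual rendering code.

```python
import math

def stm_focal_points(t, v_d, n_fp, radius=0.02):
    """Positions (x, y) in metres of n_fp focal points evenly spaced
    along a circular path of the given radius, with each point
    travelling at drawing speed v_d (m/s).

    The circular pattern geometry is an assumption; the study varies
    Vd and Nfp over some rendered tactile pattern."""
    circumference = 2 * math.pi * radius
    # Phase travelled by the first point after t seconds at speed v_d.
    base_phase = (v_d * t / circumference) * 2 * math.pi
    points = []
    for k in range(n_fp):
        # Remaining points are offset evenly around the circle.
        phase = base_phase + 2 * math.pi * k / n_fp
        points.append((radius * math.cos(phase), radius * math.sin(phase)))
    return points

# Example: 4 focal points on a 2 cm circle at Vd = 5 m/s, sampled at t = 0.
pts = stm_focal_points(0.0, 5.0, 4)
```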

    It Sounds Cool: Exploring Sonification of Mid-Air Haptic Textures Exploration on Texture Judgments, Body Perception, and Motor Behaviour

    Ultrasonic mid-air haptic technology allows for the perceptual rendering of textured surfaces onto the user's hand. Unlike real textured surfaces, however, mid-air haptic feedback lacks the implicit multisensory cues needed to reliably infer a texture's attributes (e.g., its roughness). In this paper, we combined mid-air haptic textures with congruent sound feedback to investigate how sonification could influence people's (1) explicit judgment of the texture attributes, (2) explicit sensations of their own hand, and (3) implicit motor behavior during haptic exploration. Our results showed that audio cues (presented alone or combined with haptics) influenced participants' judgment of the texture attributes (roughness, hardness, moisture and viscosity), produced some hand sensations (the feeling of having a smoother, softer, looser, more flexible, colder, wetter and more natural hand), and changed participants' speed (moving faster or slower) while exploring the texture. We then conducted a principal component analysis to better understand and visualize these results, and conclude with a short discussion on how audio-haptic associations can be used to create embodied experiences in emerging application scenarios in the metaverse.
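    The principal component analysis step can be sketched as follows. The ratings matrix here is randomly generated placeholder data standing in for per-participant judgment scales; none of the values come from the study.

```python
import numpy as np

# Hypothetical ratings matrix: rows = participants, columns = judgment
# scales (e.g., roughness, hardness, moisture, viscosity). Placeholder data.
rng = np.random.default_rng(0)
ratings = rng.normal(size=(30, 4))

# PCA via SVD on the mean-centred data.
centred = ratings - ratings.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)

explained = S**2 / np.sum(S**2)   # fraction of variance per component
scores = centred @ Vt.T           # participant scores on each component
```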

    Systematic literature review of hand gestures used in human computer interaction interfaces

    Gestures, widely accepted as humans' natural mode of interaction with their surroundings, have been considered for use in human-computer interfaces since the early 1980s. They have been explored and implemented, with a range of success and maturity levels, in a variety of fields, facilitated by a multitude of technologies. Underpinning gesture theory, however, focuses on gestures performed simultaneously with speech, and the majority of gesture-based interfaces are supported by other modes of interaction. This article reports the results of a systematic review undertaken to identify characteristics of touchless/in-air hand gestures used in interaction interfaces. 148 articles reporting on gesture-based interaction interfaces were reviewed, identified through searching engineering and science databases (Engineering Village, ProQuest, Science Direct, Scopus and Web of Science). The goal of the review was to map the field of gesture-based interfaces, investigate the patterns in gesture use, and identify common combinations of gestures for different combinations of applications and technologies. From the review, the community seems disparate, with little evidence of building upon prior work, and a fundamental framework of gesture-based interaction is not evident. However, the findings can help inform future developments and provide valuable information about the benefits and drawbacks of different approaches. It was further found that the nature and appropriateness of gestures used was not a primary factor in gesture elicitation when designing gesture-based systems, and that ease of technology implementation often took precedence.

    TG2: simulating haptic impact in immersive systems

    In this work we explore the haptic pendulum, a device originally developed in our VIS (Visualization, Interaction and Simulation) Lab to generate the sensation of holding different weights in VR. The goal of the present project was to redesign and assess the device as a means to provide force and mobility to VR haptics. It is known in the research community that force feedback device designs prioritize a grounded construction, where the ground offers the inertial support needed to produce force. This is not ideal for VR, where free motion is desirable. While previous works offer some alternatives by attaching devices to body parts or using propellers, none of them proposed a handheld controller with 2 degrees-of-freedom mass displacement. Our pendulum device consists of a mass driven by two servo motors over the surface of an imaginary hemisphere on top of the user's hand. The motor and mass assembly is fixed to a standard VR controller (HTC Vive) held by the user, who interacts in the virtual environment. The pendulum was rewired from the original weight-perception configuration to convey directional impulses instead of weights. While weights are stable forces towards the floor, directional impulses are instantaneous forces in controlled directions. We then designed and conducted an experiment with users to assess how the impulse stimuli are perceived, testing three dimensions of the system's capabilities: the ability to convey different directions, different intensities, and sequences of impulses. Results show that directions can be identified, although not precisely, that the intensities tested are mostly well identified, and that sequences of impulses are correctly perceived even with sub-second time intervals between impulses.
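    The two-servo hemisphere geometry described above can be sketched by mapping two servo angles to a Cartesian mass position. The angle convention, arm length, and function name are assumptions for illustration, not the authors' exact mechanical design.

```python
import math

def mass_position(azimuth_deg, elevation_deg, arm_length=0.05):
    """Cartesian position (x, y, z) in metres of the pendulum mass on an
    imaginary hemisphere of radius arm_length above the controller,
    given two servo angles in degrees.

    Spherical convention assumed: azimuth around the vertical axis,
    elevation measured up from the controller's plane."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = arm_length * math.cos(el) * math.cos(az)
    y = arm_length * math.cos(el) * math.sin(az)
    z = arm_length * math.sin(el)  # el in [0, 90] keeps the mass on the upper hemisphere
    return (x, y, z)

# At 90 degrees elevation the mass sits directly above the controller.
x, y, z = mass_position(0.0, 90.0)
```

Rapidly commanding the servos between two such positions displaces the mass, and the reaction on the handle is what the user feels as a directional impulse.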

    The Perception/Action loop: A Study on the Bandwidth of Human Perception and on Natural Human Computer Interaction for Immersive Virtual Reality Applications

    Virtual Reality (VR) is an innovative technology which, in the last decade, has had widespread success, mainly thanks to the release of low-cost devices, which have contributed to the diversification of its domains of application. In particular, the current work mainly focuses on the general mechanisms underlying the perception/action loop in VR, in order to improve the design and implementation of applications for training and simulation in immersive VR, especially in the context of Industry 4.0 and the medical field. On the one hand, we want to understand how humans gather and process all the information presented in a virtual environment, through the evaluation of the visual system's bandwidth. On the other hand, since the interface has to be a sort of transparent layer allowing trainees to accomplish a task without directing any cognitive effort at the interaction itself, we compare two state-of-the-art solutions for selection and manipulation tasks: a touch-based one, the HTC Vive controllers, and a touchless vision-based one, the Leap Motion. To this aim we have developed ad hoc frameworks and methodologies. The software frameworks consist of VR scenarios in which the experimenter can choose the modality of interaction and the headset to be used and set experimental parameters, guaranteeing experiment repeatability and controlled conditions. The methodology includes the evaluation of performance, user experience and preferences, considering both quantitative and qualitative metrics derived from the collection and analysis of heterogeneous data, such as physiological and inertial sensor measurements, timing, and self-assessment questionnaires. In general, VR has been found to be a powerful tool able to simulate specific situations in a realistic and involving way, eliciting the user's sense of presence without causing severe cybersickness, at least when interaction is limited to the peripersonal and near-action space.
Moreover, when designing a VR application, it is possible to manipulate its features in order to trigger, or avoid triggering, specific emotions and voluntarily create potentially stressful or relaxing situations. Considering the ability of trainees to perceive and process information presented in an immersive virtual environment, results show that, when people are given enough time to build a gist of the scene, they are able to recognize a change with 0.75 accuracy when up to 8 elements are in the scene. For interaction, instead, when selection and manipulation tasks do not require fine movements, the controllers and the Leap Motion ensure comparable performance; whereas, when tasks are complex, the first solution turns out to be more stable and efficient, also because the visual and audio feedback provided as a substitute for haptics does not substantially improve performance in the touchless case.

    Feasibility and effect of low-cost haptics on user immersion in virtual environments

    Since the late 1990s, research into Immersion, Presence and Interactivity in the context of digital media has been steadily evolving into an exciting area of experimentation, fuelled by advances in the visual, audio and tracking capabilities of Virtual Reality (VR) equipment. Thanks to these improvements, studies into the effectiveness of this equipment in producing an immersive experience are now possible. This is most commonly achieved by measuring the perceived level of Presence experienced by participants in virtual environments: the higher the sense of Presence created, the more effective a VR system is deemed to be. However, due to the current limitations of haptic interaction methods, investigation into the role that touch plays in generating this sense of Presence is somewhat restricted. Following a structured process of design and research work, this project presents a new approach to creating haptic interaction by deploying a Haptic Prototyping Toolkit that enables Passive Haptic Interactions in virtual environments. The findings of this work provide the foundations for future research into the development of interaction methods of this type.

    Measuring tactile sensitivity and mixed-reality-assisted exercise for carpal tunnel syndrome by ultrasound mid-air haptics

    Introduction: Carpal tunnel syndrome (CTS) is the most common nerve entrapment neuropathy, which causes numbness and pain in the thumb, the index and middle fingers, and the radial side of the ring finger. Regular hand exercises may improve the symptoms and prevent carpal tunnel surgery. This study applied a novel ultrasonic stimulation method to test tactile sensitivity in CTS, and also a mixed-reality-assisted (MR-assisted) exercise program which measured hand movements and provided haptic feedback for rehabilitation.
    Methods: Twenty patients with mild unilateral CTS took part in the experiments. A mid-air haptics device (Ultrahaptics STRATOS Explore) was used to apply amplitude-modulated ultrasound waves (carrier frequency: 40 kHz) onto the skin to create tactile stimulation mechanically. Participants performed a two-alternative forced-choice task for measuring tactile thresholds at a 250-Hz modulation frequency. They were tested at the index fingers and the thenar eminences of both hands. Additionally, 15 CTS patients used an MR-assisted program to do hand exercises with haptic feedback. Exercise performance was assessed by calculating errors between target and actual hand configurations. The System Usability Scale (SUS) was adopted to verify the practical usability of the program.
    Results: Thresholds at the thenar eminences of the affected and healthy hands were not significantly different. While the thresholds at the healthy index fingers could be measured, those of the affected fingers were all higher than the stimulation level produced by the maximum output of the ultrasound device. In the exercise program, a significant positive correlation (ρ = 0.89, p < 0.001) was found between the performance scores and the SUS scores, which were above the criterion value established in the literature.
    Discussion: The results show that thenar tactile sensitivity is not affected in mild CTS, as expected from the palmar cutaneous branch of the median nerve (PCBm), but index finger thresholds are likely to be higher. Overall, this study suggests that mid-air haptics, with certain improvements, may be used as a preliminary test in the clinical setting. Moreover, the device is promising for developing gamified rehabilitation programs and for the treatment follow-up of CTS.
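    Two-alternative forced-choice thresholds like those above are commonly estimated with an adaptive staircase. The sketch below simulates a minimal 2-down/1-up procedure, which converges near the ~70.7%-correct level; the observer model, step size, and parameters are illustrative assumptions, not the study's actual protocol.

```python
import random

def two_down_one_up(true_threshold, start=1.0, step=0.1, reversals=8):
    """Minimal 2-down/1-up staircase for a 2AFC task: two consecutive
    correct responses lower the stimulus level, one incorrect response
    raises it. The threshold estimate is the mean level at reversals."""
    random.seed(1)  # deterministic simulated observer
    level, correct_streak, direction, rev_levels = start, 0, 0, []
    while len(rev_levels) < reversals:
        # Simulated observer: always correct above threshold,
        # otherwise guessing (50% correct in 2AFC).
        correct = level >= true_threshold or random.random() < 0.5
        if correct:
            correct_streak += 1
            if correct_streak == 2:      # two correct in a row: step down
                correct_streak = 0
                if direction == +1:      # reversal: was going up
                    rev_levels.append(level)
                direction = -1
                level = max(level - step, 0.0)
        else:
            correct_streak = 0
            if direction == -1:          # reversal: was going down
                rev_levels.append(level)
            direction = +1
            level += step

    return sum(rev_levels) / len(rev_levels)

# Estimate a simulated threshold of 0.5 (arbitrary stimulus units).
estimate = two_down_one_up(0.5)
```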

    Mid-Air Gestural Interaction with a Large Fogscreen

    Projected walk-through fogscreens have been created, but there is little research evaluating interaction performance with fogscreens. The present study investigated mid-air hand gestures for interaction with a large fogscreen. Participants (N = 20) selected objects from a fogscreen using tapping and dwell-based gestural techniques, with and without vibrotactile/haptic feedback. In terms of Fitts' law, the throughput was about 1.4 bps to 2.6 bps, suggesting that gestural interaction with a large fogscreen is a suitable and effective input method. Our results also suggest that tapping without haptic feedback has good performance and potential for interaction with a fogscreen, and that tactile feedback is not necessary for effective mid-air interaction. These findings have implications for the design of gestural interfaces suitable for interaction with fogscreens.
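    The throughput figure quoted above can be reproduced in outline with the Shannon formulation of Fitts' index of difficulty. The example distance, target width, and movement time below are made up for illustration, not taken from the fogscreen study.

```python
import math

def throughput_bps(distance, width, movement_time):
    """Fitts' law throughput in bits/s: the index of difficulty
    ID = log2(D/W + 1) (Shannon formulation) divided by the movement
    time in seconds."""
    index_of_difficulty = math.log2(distance / width + 1)
    return index_of_difficulty / movement_time

# A 40 cm movement to an 8 cm target completed in 1.2 s:
tp = throughput_bps(0.40, 0.08, 1.2)  # ID = log2(6) ≈ 2.58 bits → ≈ 2.15 bps
```

A value around 2.15 bps falls inside the 1.4-2.6 bps range the abstract reports for gestural fogscreen interaction.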