Mid-Air Haptics for Control Interfaces
Control interfaces and interactions based on touch-less gesture tracking devices have become a prevalent research topic in both industry and academia. Touch-less devices offer a unique immediacy of interaction that makes them ideal for applications where direct contact with a physical controller is not desirable. On the other hand, these controllers inherently lack active or passive haptic feedback to inform users about the results of their interaction. Mid-air haptic interfaces, such as those using focused ultrasound waves, can close the feedback loop and provide new tools for the design of touch-less, un-instrumented control interactions. The goal of this workshop is to bring together the growing mid-air haptic research community to identify and discuss future challenges in control interfaces and their application in AR/VR, automotive, music, robotics, and teleoperation.
Sampled data systems passivity and discrete port-Hamiltonian systems
In this paper, we present a novel approach, first presented in [1], [2], [3], to the interconnection of a continuous-time and a discrete-time physical system. This is done in a way which preserves passivity of the coupled system independently of the sampling time T. This strategy can be used both in the field of telemanipulation, for the implementation of a passive master/slave system on a digital transmission line with varying time delays and possible loss of packets (e.g., the Internet), and in the field of haptics, where the virtual environment should 'feel' like a physically equivalent system.
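The passivity requirement can be stated as the standard energy-balance inequality; the following is a minimal sketch in generic port-Hamiltonian notation (the symbols H, u, y, and k are assumptions for illustration, not taken from the paper):

```latex
% Continuous-time passivity of a system with storage (Hamiltonian) H
% and power-conjugated input/output pair (u, y):
\dot{H}(t) \le u(t)^{\top} y(t)
% Discrete-time counterpart that the sampled-data interconnection
% should preserve for every sampling time T:
H_{k+1} - H_k \le T \, u_k^{\top} y_k
```

In words: over any interval, the stored energy never grows by more than the energy supplied through the port, which is the property the paper's coupling scheme maintains regardless of T.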
Vibrotactile sensitivity in active touch: effect of pressing force
An experiment was conducted to study the effects of force produced by active touch on vibrotactile perceptual thresholds. The task consisted of pressing the fingertip against a flat rigid surface that provided either sinusoidal or broadband vibration. Three force levels were considered, ranging from light touch to hard press. Finger contact areas were measured during the experiment, showing positive correlation with the respective applied forces. Significant effects on thresholds were found for vibration type and force level. Moreover, possibly due to the concurrent effect of large (unconstrained) finger contact areas, active pressing forces, and long-duration stimuli, the measured perceptual thresholds are considerably lower than those previously reported in the literature.
Personalising Vibrotactile Displays through Perceptual Sensitivity Adjustment
Haptic displays are commonly limited to transmitting a discrete set of tactile motifs. In this paper, we explore the transmission of real-valued information through vibrotactile displays. We simulate spatial continuity with three perceptual models commonly used to create phantom sensations: the linear, logarithmic, and power models. We show that these generic models lead to limited decoding precision, and propose a method for model personalisation that adjusts to idiosyncratic and spatial variations in perceptual sensitivity. We evaluate this approach using two haptic display layouts: circular, worn around the wrist and the upper arm, and straight, worn along the forearm. Results of a user study measuring continuous-value decoding precision show that users were able to decode continuous values with relatively high accuracy (4.4% mean error), that circular layouts performed particularly well, and that personalisation through sensitivity adjustment increased decoding precision.
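The three phantom-sensation models named in the abstract can be understood as amplitude-interpolation rules for a pair of adjacent actuators. Below is a minimal sketch, assuming a normalized phantom position `beta` in [0, 1] and a target amplitude `a`; the function name, the exponent `p`, and the Fechner-style floor `a0` used for the logarithmic variant are illustrative assumptions, not the paper's exact formulation:

```python
def phantom_amplitudes(beta, a, model="linear", p=2.0, a0=0.01):
    """Split target amplitude `a` between two adjacent actuators so that
    a phantom sensation appears at normalized position `beta`
    (0 = first actuator, 1 = second). Illustrative sketch only."""
    if not 0.0 <= beta <= 1.0:
        raise ValueError("beta must lie in [0, 1]")
    if model == "linear":
        # Amplitudes interpolate linearly: a1 + a2 == a.
        return (1.0 - beta) * a, beta * a
    if model == "power":
        # Energy-style model: a1**p + a2**p == a**p
        # (p = 2 is the common "energy summation" variant).
        return (1.0 - beta) ** (1.0 / p) * a, beta ** (1.0 / p) * a
    if model == "log":
        # Fechner-style variant (assumption): perceived magnitude is
        # proportional to log(a / a0), so interpolate in the log domain
        # above a small floor amplitude a0 (effectively "off").
        return a0 * (a / a0) ** (1.0 - beta), a0 * (a / a0) ** beta
    raise ValueError(f"unknown model: {model}")
```

For example, with the linear model and `beta = 0.25`, a target amplitude of 2.0 is split into 1.5 and 0.5; the power model keeps total delivered energy constant instead, which better matches perceived intensity for some users, and sensitivity personalisation would replace these fixed curves with per-user, per-site mappings.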
MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system in which multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and allows users to feel by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures. Additionally, users are only shown a representation of their hands in VR, floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.

Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii