Haptic Experience and the Design of Drawing Interfaces
Haptic feedback has the potential to enhance users' sense of being engaged and creative in their artwork. Current work on providing haptic feedback in computer-based drawing applications has focused mainly on the realism of the haptic sensation rather than the users' experience of that sensation in the context of their creative work. We present a study that focuses on user experience of three haptic drawing interfaces. These interfaces were based on two different haptic metaphors, one of which mimicked familiar drawing tools (such as pen, pencil or crayon on smooth or rough paper) and the other of which drew on abstract descriptors of haptic experience (roughness, stickiness, scratchiness and smoothness). It was found that users valued having control over the haptic sensation; that each metaphor was preferred by approximately half of the participants; and that the real-world metaphor interface was considered more helpful than the abstract one, whereas the abstract interface was considered to better support creativity. This suggests that future interfaces for artistic work should have user-modifiable interaction styles for controlling the haptic sensation.
Real-time hybrid cutting with dynamic fluid visualization for virtual surgery
It is widely accepted that a reform in medical teaching must be made to meet today's high-volume training requirements. Virtual simulation offers a potential method of providing such training, and some current medical training simulations integrate haptic and visual feedback to enhance procedure learning. The purpose of this project is to explore the capability of Virtual Reality (VR) technology to develop a training simulator for surgical cutting and bleeding in general surgery.
HAPTIC VISUALIZATION USING VISUAL TEXTURE INFORMATION
Haptics enables users to interact with and manipulate virtual objects. Although haptics research has influenced many areas, the integration of computer haptics into computer vision, especially content-based image retrieval (CBIR), remains limited. The purpose of this research is to design and validate a haptic texture search framework that allows texture retrieval to be performed not just visually but also haptically. This research therefore addresses the gap between the computer haptics and CBIR fields.
In this research, the focus is on cloth textures. The design of the proposed framework involves a haptic texture rendering algorithm and a query algorithm. The framework integrates computer haptics and CBIR, where haptic texture rendering is performed based on extracted cloth data. For query purposes, the data are characterized and texture similarity is calculated. Wavelet decomposition is used to extract information from the texture data. In the search process, data are retrieved based on their distribution.
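The abstract above mentions wavelet decomposition for extracting texture features and a similarity measure for retrieval, but does not specify the wavelet family or distance function. As an illustrative sketch only (Haar wavelet, per-band energy features and Euclidean distance are assumptions, not details from the thesis), the feature-extraction and matching steps might look like:

```python
import numpy as np

def haar_features(signal, levels=3):
    """Decompose a 1-D texture profile with a Haar wavelet and return
    the mean energy of each sub-band as a feature vector.
    Signal length must be divisible by 2**levels."""
    coeffs = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        # Pairwise sums give the approximation band, differences the detail band.
        approx, detail = ((approx[0::2] + approx[1::2]) / np.sqrt(2),
                          (approx[0::2] - approx[1::2]) / np.sqrt(2))
        coeffs.append(detail)
    coeffs.append(approx)
    # One feature per sub-band: its mean energy.
    return np.array([np.mean(c ** 2) for c in coeffs])

def texture_distance(f1, f2):
    # Euclidean distance between feature vectors; smaller means more similar.
    return float(np.linalg.norm(f1 - f2))
```

In a retrieval setting, each database texture would be reduced to such a feature vector offline, and a query texture would be matched by ranking database entries by `texture_distance`.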
The experiments to validate the framework have shown that haptic texture rendering can be performed with techniques based on either a simple waveform or visual texture information. During the rendering process, unstable forces were generated due to limitations of the device. In the query process, accuracy is determined by the number of feature vector elements, the data extraction method, and the similarity measurement algorithm. User testing to validate the framework showed that users' perception of haptic feedback differs depending on the type of rendering algorithm. A simple rendering algorithm, i.e. a sine wave, produces more stable force feedback, yet lacks surface detail compared to the visual texture information approach.
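The abstract contrasts a simple sine-wave rendering algorithm with one driven by visual texture information. As a rough sketch of the sine-wave variant (the function shape and all parameter values here are assumptions, not taken from the thesis), a sinusoidal ripple can modulate a spring-penalty contact force so the user feels a periodic texture:

```python
import math

def texture_force(depth, lateral_pos, stiffness=500.0,
                  amplitude=0.3, spatial_freq=200.0):
    """Sine-wave haptic texture rendering sketch (illustrative parameters).

    depth        penetration depth into the surface (m); <= 0 means no contact
    lateral_pos  probe position along the surface (m)
    Returns the normal force magnitude (N): a spring penalty force
    modulated by a sinusoidal ripple felt as texture."""
    if depth <= 0.0:
        return 0.0                      # probe not in contact with the surface
    base = stiffness * depth            # penalty (spring) contact force
    ripple = 1.0 + amplitude * math.sin(2.0 * math.pi * spatial_freq * lateral_pos)
    return max(base * ripple, 0.0)      # never pull the probe into the surface
```

A function like this would be evaluated once per haptic frame (typically around 1 kHz) with the device's current probe position.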
Combining Texture-Derived Vibrotactile Feedback, Concatenative Synthesis and Photogrammetry for Virtual Reality Rendering
(Abstract to follow)
Modelling of surface identifying characteristics using Fourier series
Texture and small-scale surface details are widely recognised as playing an important role in the haptic identification of objects. In order to simulate realistic textures in haptic virtual environments, it has become increasingly necessary to identify a robust technique for modelling of surface profiles. This paper describes a method whereby Fourier series spectral analysis is employed in order to describe the measured surface profiles of several characteristic surfaces. The results presented suggest that a bandlimited Fourier series can be used to provide a realistic approximation to surface amplitude profiles.
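As an illustration of the bandlimited Fourier series idea (the paper's measured surfaces and coefficients are not reproduced here; this is a generic sketch), a surface height profile can be reconstructed from a small number of harmonics, with coefficients estimated from a sampled profile via the FFT:

```python
import numpy as np

def fourier_profile(x, coeffs, period):
    """Evaluate a bandlimited Fourier series surface profile:
    z(x) = a0/2 + sum_n [ a_n cos(2*pi*n*x/period) + b_n sin(2*pi*n*x/period) ].
    `coeffs` is a list of (a_n, b_n) pairs for n = 0..N (b_0 is ignored)."""
    x = np.asarray(x, dtype=float)
    a0, _ = coeffs[0]
    z = np.full_like(x, a0 / 2.0)
    for n, (a_n, b_n) in enumerate(coeffs[1:], start=1):
        w = 2.0 * np.pi * n / period
        z += a_n * np.cos(w * x) + b_n * np.sin(w * x)
    return z

def fit_fourier(profile, n_harmonics):
    """Estimate (a_n, b_n) from a uniformly sampled profile via the FFT,
    assuming the samples cover exactly one period."""
    profile = np.asarray(profile, dtype=float)
    N = len(profile)
    spec = np.fft.rfft(profile) / N
    coeffs = [(2.0 * spec[0].real, 0.0)]          # a_0 (DC term)
    for n in range(1, n_harmonics + 1):
        coeffs.append((2.0 * spec[n].real, -2.0 * spec[n].imag))
    return coeffs
```

Truncating the series at a few harmonics gives the "bandlimited" approximation the abstract refers to: fine roughness beyond the cutoff is discarded while the dominant surface waviness is preserved.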
MetaSpace II: Object and full-body tracking for interaction and navigation in social VR
MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple
users can not only see and hear but also interact with each other, grasp and
manipulate objects, walk around in space, and get tactile feedback. MS2 allows
walking in physical space by tracking each user's skeleton in real-time and
allows users to feel objects by employing passive haptics, i.e., when users touch or
manipulate an object in the virtual world, they simultaneously also touch or
manipulate a corresponding object in the physical world. To enable these
elements in VR, MS2 creates a correspondence in spatial layout and object
placement by building the virtual world on top of a 3D scan of the real world.
Through the association between the real and virtual world, users are able to
walk freely while wearing a head-mounted device, avoid obstacles like walls and
furniture, and interact with people and objects. Most current virtual reality
(VR) environments are designed for a single user experience where interactions
with virtual objects are mediated by hand-held input devices or hand gestures.
Additionally, users are only shown a representation of their hands in VR, floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.

Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii
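The MS2 abstract says the virtual world is built "on top of a 3D scan of the real world" to create a spatial correspondence, but does not detail the registration step. A standard way to rigidly align corresponding 3-D points (here a hypothetical sketch, not the authors' method) is the Kabsch algorithm:

```python
import numpy as np

def kabsch_align(scan_pts, virtual_pts):
    """Rigid registration of corresponding 3-D point sets (Kabsch algorithm).
    Returns (R, t) such that R @ virtual_point + t approximates the matching
    scan point. Illustrative only; MS2's actual alignment step is not given."""
    P = np.asarray(virtual_pts, dtype=float)   # points in the virtual layout
    Q = np.asarray(scan_pts, dtype=float)      # matching points in the room scan
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)    # centroids
    H = (P - cP).T @ (Q - cQ)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

With such a transform, every virtual object can be placed so that its physical counterpart from the scan occupies the same real-world location, which is what makes the passive-haptics touch correspondence possible.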