
    A Novel Haptic Texture Display Based on Image Processing


    Automatic Filter Design for Synthesis of Haptic Textures from Recorded Acceleration Data

    Sliding a probe over a textured surface generates a rich collection of vibrations that one can easily use to create a mental model of the surface. Haptic virtual environments attempt to mimic these real interactions, but common haptic rendering techniques typically fail to reproduce the sensations that are encountered during texture exploration. Past approaches have focused on building a representation of textures using a priori ideas about surface properties. Instead, this paper describes a process of synthesizing probe-surface interactions from data recorded from real interactions. We explain how to apply the mathematical principles of Linear Predictive Coding (LPC) to develop a discrete transfer function that represents the acceleration response under specific probe-surface interaction conditions. We then use this predictive transfer function to generate unique acceleration signals of arbitrary length. In order to move between transfer functions from different probe-surface interaction conditions, we develop a method for interpolating the variables involved in the texture synthesis process. Finally, we compare the results of this process with real recorded acceleration signals, and we show that the two correlate strongly in the frequency domain.
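
    As a rough illustration of the idea, the sketch below fits an all-pole LPC model to a recorded acceleration signal via the autocorrelation (Levinson-Durbin) method and then synthesizes an acceleration signal of arbitrary length by driving the resulting transfer function with white noise. This is a minimal sketch of the general technique under assumed parameters (model order, stand-in recording), not the paper's implementation.

```python
# Minimal LPC texture-synthesis sketch (illustrative; the model order and
# the stand-in "recording" below are assumptions, not the paper's values).
import numpy as np
from scipy.signal import lfilter

def fit_lpc(accel, order=30):
    """Fit all-pole LPC coefficients with the autocorrelation method."""
    x = accel - np.mean(accel)
    # Autocorrelation at lags 0..order.
    r = np.correlate(x, x, mode="full")[len(x) - 1 : len(x) + order]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):               # Levinson-Durbin recursion
        k = -(r[i] + np.dot(a[1:i], r[i - 1 : 0 : -1])) / err
        a[1 : i + 1] = a[1 : i + 1] + k * a[i - 1 :: -1]
        err *= 1.0 - k * k
    return a, err                               # A(z) denominator, residual power

def synthesize(a, err, n_samples, seed=None):
    """Generate a new signal by filtering white noise through 1/A(z)."""
    noise = np.random.default_rng(seed).standard_normal(n_samples)
    return lfilter([1.0], a, np.sqrt(err) * noise)

# Usage with a stand-in recording; a real signal would come from an
# accelerometer mounted on the probe.
recorded = np.random.default_rng(0).standard_normal(5000)
a, err = fit_lpc(recorded, order=20)
virtual = synthesize(a, err, n_samples=20000, seed=1)
```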

    Refined Methods for Creating Realistic Haptic Virtual Textures from Tool-Mediated Contact Acceleration Data

    Dragging a tool across a textured object creates rich high-frequency vibrations that distinctly convey the physical interaction between the tool tip and the object surface. Varying one's scanning speed and normal force alters these vibrations, but it does not change the perceived identity of the tool or the surface. Previous research developed a promising data-driven approach to embedding this natural complexity in a haptic virtual environment: the approach centers on recording and modeling the tool contact accelerations that occur during real texture interactions at a limited set of force-speed combinations. This paper aims to optimize these prior methods of texture modeling and rendering to improve system performance and enable potentially higher levels of haptic realism. The key elements of our approach are drawn from time series analysis, speech processing, and discrete-time control. We represent each recorded texture vibration with a low-order auto-regressive moving-average (ARMA) model, and we optimize this set of models for a specific tool-surface pairing (plastic stylus and textured ABS plastic) using metrics that depend on spectral match, final prediction error, and model order. For rendering, we stably resample the texture models at the desired output rate, and we derive a new texture model at each time step using bilinear interpolation on the line spectral frequencies of the resampled models adjacent to the user's current force and speed. These refined processes enable our TexturePad system to generate a stable and spectrally accurate vibration waveform in real time, moving us closer to the goal of virtual textures that are indistinguishable from their real counterparts.
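
    The rendering-time interpolation can be sketched roughly as follows: the line spectral frequencies (LSFs) of the four stored models surrounding the user's current force and speed are blended with standard bilinear weights. The grid layout (`lsf_grid`, `forces`, `speeds`) and the helper name are assumptions for illustration; conversion between ARMA coefficients and LSFs is assumed to happen elsewhere.

```python
# Sketch of bilinear interpolation over a force-speed grid of texture
# models; lsf_grid[i, j] is a hypothetical LSF vector for the model
# recorded at force forces[i] and speed speeds[j].
import numpy as np

def interpolate_lsf(lsf_grid, forces, speeds, f, v):
    """Blend the LSF vectors of the four models enclosing (f, v)."""
    i = int(np.clip(np.searchsorted(forces, f) - 1, 0, len(forces) - 2))
    j = int(np.clip(np.searchsorted(speeds, v) - 1, 0, len(speeds) - 2))
    # Normalized position of (f, v) inside the enclosing grid cell.
    u = (f - forces[i]) / (forces[i + 1] - forces[i])
    w = (v - speeds[j]) / (speeds[j + 1] - speeds[j])
    return ((1 - u) * (1 - w) * lsf_grid[i, j]
            + u * (1 - w) * lsf_grid[i + 1, j]
            + (1 - u) * w * lsf_grid[i, j + 1]
            + u * w * lsf_grid[i + 1, j + 1])
```

    Interpolating LSFs rather than raw filter coefficients is what keeps every intermediate model stable: any ordered LSF vector in (0, π) maps back to a stable all-pole filter, so the blended model remains safe to render.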

    Haptic Display of Realistic Tool Contact via Dynamically Compensated Control of a Dedicated Actuator

    High-frequency contact accelerations convey important information that the vast majority of haptic interfaces cannot render. Building on prior work, we present an approach to haptic interface design that uses a dedicated linear voice coil actuator and a dynamic system model to allow the user to feel these signals. This approach was tested in a bilateral teleoperation experiment where a user explored three textured surfaces under three different acceleration control architectures: none, constant gain, and dynamic compensation. The controllers that use the dedicated actuator vastly outperform traditional position-position control at conveying realistic contact accelerations. Analysis of root mean square error, linear regression, and discrete Fourier transforms of the acceleration data also indicates a slight performance benefit for dynamic compensation over constant gain.
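
    To make the contrast between the two acceleration architectures concrete, here is a minimal sketch: constant gain scales the acceleration command uniformly, while dynamic compensation pre-filters it through an inverse model of the actuator's dynamics so the rendered vibration matches the recorded one across frequency. The coefficients shown are placeholders, not the paper's identified model.

```python
# Sketch of the two acceleration-feedback drive schemes; the inverse-model
# coefficients below are placeholders, not an identified actuator model.
import numpy as np
from scipy.signal import lfilter

def constant_gain_command(desired_accel, gain=1.5):
    """Constant gain: scale the acceleration command uniformly."""
    return gain * desired_accel

def compensated_command(desired_accel, b_inv, a_inv):
    """Dynamic compensation: pre-filter the command through a stable
    discrete-time inverse of the actuator's identified dynamics."""
    return lfilter(b_inv, a_inv, desired_accel)

# Example with placeholder inverse-model coefficients.
desired = np.random.default_rng(0).standard_normal(1000)
cmd_gain = constant_gain_command(desired)
cmd_comp = compensated_command(desired, b_inv=[1.0, -0.9], a_inv=[1.0, -0.3])
```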

    High Frequency Acceleration Feedback Significantly Increases the Realism of Haptically Rendered Textured Surfaces

    Almost every physical interaction generates high-frequency vibrations, especially if one of the objects is a rigid tool. Previous haptics research has hinted that the inclusion or exclusion of these signals plays a key role in the realism of haptically rendered surface textures, but this connection has not been formally investigated until now. This paper presents a human subject study that compares the performance of a variety of surface rendering algorithms for a master-slave teleoperation system; each controller provides the user with a different combination of position and acceleration feedback, and subjects compared the renderings with direct tool-mediated exploration of the real surface. We use analysis of variance to examine quantitative performance metrics and qualitative realism ratings across subjects. The results of this study show that algorithms that include high-frequency acceleration feedback in combination with position feedback achieve significantly higher realism ratings than traditional position feedback alone. Furthermore, we present a frequency-domain metric for quantifying a controller's acceleration feedback performance; given a constant surface stiffness, the median of this metric across subjects was found to have a significant positive correlation with median realism rating.
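
    A frequency-domain acceleration metric of this general kind might look like the sketch below, which scores a controller by the RMS difference between the magnitude spectra of real and rendered accelerations within a fixed band. It is illustrative only; the paper's exact metric, sampling rate, and band limits are not reproduced here.

```python
# Illustrative spectral-error measure between real and rendered
# accelerations (sampling rate and band limits are assumptions).
import numpy as np

def spectral_error(real_accel, rendered_accel, fs=10000.0, f_lo=20.0, f_hi=1000.0):
    """RMS difference between the magnitude spectra of the real and
    rendered acceleration signals inside a fixed frequency band."""
    n = min(len(real_accel), len(rendered_accel))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    real_mag = np.abs(np.fft.rfft(real_accel[:n]))
    rend_mag = np.abs(np.fft.rfft(rendered_accel[:n]))
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return np.sqrt(np.mean((real_mag[band] - rend_mag[band]) ** 2))
```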

    Haptography: capturing the feel of real objects to enable authentic haptic rendering (invited paper)

    Haptic interfaces are designed to allow humans to touch virtual objects as though they were real. Unfortunately, virtual surface models currently require extensive hand tuning and do not feel authentic, which limits the usefulness and applicability of such systems. The proposed approach of haptography seeks to address this deficiency by basing models on haptic data recorded from real interactions between a human and a target object. The studio haptographer uses a fully instrumented stylus to tap, press, and stroke an item in a controlled environment while a computer system records positions, orientations, velocities, accelerations, and forces. The point-and-touch haptographer carries a simply instrumented stylus around during daily life, using it to capture interesting haptic properties of items in the real world. Recorded data is distilled into a haptograph, the haptic impression of the object or surface patch, including properties such as local shape, stiffness, friction, and texture. Finally, the feel of the probed object is recreated via a haptic interface by accounting for the device's natural dynamics and focusing on the feedback of high-frequency accelerations.
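
    As a loose illustration, a haptograph could be represented as a record like the sketch below; the field names, types, and units are assumptions for illustration, not the authors' actual data format.

```python
# Hypothetical haptograph record (field names and units are assumptions).
from dataclasses import dataclass
import numpy as np

@dataclass
class Haptograph:
    """Distilled haptic impression of an object or surface patch."""
    local_shape: np.ndarray    # e.g., a height map of the patch
    stiffness: float           # N/m, estimated from pressing interactions
    friction: float            # friction coefficient from stroking
    texture_model: np.ndarray  # e.g., vibration-model coefficients per condition
```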

    Haptic Experience and the Design of Drawing Interfaces

    Haptic feedback has the potential to enhance users' sense of being engaged and creative in their artwork. Current work on providing haptic feedback in computer-based drawing applications has focused mainly on the realism of the haptic sensation rather than the users' experience of that sensation in the context of their creative work. We present a study that focuses on user experience of three haptic drawing interfaces. These interfaces were based on two different haptic metaphors, one of which mimicked familiar drawing tools (such as pen, pencil, or crayon on smooth or rough paper) and the other of which drew on abstract descriptors of haptic experience (roughness, stickiness, scratchiness, and smoothness). It was found that users valued having control over the haptic sensation; that each metaphor was preferred by approximately half of the participants; and that the real-world metaphor interface was considered more helpful than the abstract one, whereas the abstract interface was considered to better support creativity. This suggests that future interfaces for artistic work should have user-modifiable interaction styles for controlling the haptic sensation.

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time and lets users feel virtual objects through passive haptics: when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures, and users are shown only a representation of their hands floating in front of the camera from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the person's natural movements in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR. (10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii)

    Hierarchical tactile sensation integration from prosthetic fingertips enables multi-texture surface recognition

    Multifunctional flexible tactile sensors could be useful to improve the control of prosthetic hands. To that end, highly stretchable liquid metal tactile sensors (LMSs) were designed, manufactured via photolithography, and incorporated into the fingertips of a prosthetic hand. Three novel contributions were made with the LMSs. First, individual fingertips were used to distinguish between different speeds of sliding contact with different surfaces. Second, differences in surface textures were reliably detected during sliding contact. Third, the capacity for hierarchical tactile sensor integration was demonstrated by using four LMS signals simultaneously to distinguish between ten complex multi-textured surfaces. Four different machine learning algorithms were compared for their classification capabilities: K-nearest neighbor (KNN), support vector machine (SVM), random forest (RF), and neural network (NN). The time-frequency features of the LMSs were extracted to train and test the machine learning algorithms. The NN generally performed best at speed and texture detection with a single finger and achieved 99.2 ± 0.8% accuracy in distinguishing between ten different multi-textured surfaces using four LMSs from four fingers simultaneously. The capability for hierarchical multi-finger tactile sensation integration could be useful to provide a higher level of intelligence for artificial hands.
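
    The classification stage could be sketched as follows: spectrogram-based time-frequency features from the four fingertip signals feed a K-nearest-neighbor classifier (shown here with scikit-learn). The synthetic trial data, sampling rate, and feature choice are placeholders for illustration, not the paper's pipeline.

```python
# Sketch of multi-finger texture classification from time-frequency
# features; the random "trials" stand in for real four-sensor recordings.
import numpy as np
from scipy.signal import spectrogram
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def time_frequency_features(trial, fs=1000.0):
    """Average spectrogram power per frequency bin, concatenated
    across the four fingertip sensor signals."""
    feats = []
    for sig in trial:                       # one row per fingertip sensor
        _, _, sxx = spectrogram(sig, fs=fs, nperseg=128)
        feats.append(sxx.mean(axis=1))
    return np.concatenate(feats)

rng = np.random.default_rng(0)
trials = [rng.standard_normal((4, 2048)) for _ in range(60)]  # placeholder data
labels = rng.integers(0, 10, size=60)                         # ten surface classes

X = np.array([time_frequency_features(t) for t in trials])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```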