
    Dimensional Reduction of High-Frequency Accelerations for Haptic Rendering

    Haptics research has seen several recent efforts at understanding and recreating real vibrations to improve the quality of haptic feedback in both virtual environments and teleoperation. To simplify the modeling process and enable the use of single-axis actuators, these previous efforts have used just one axis of a three-dimensional vibration signal, even though the main vibration mechanoreceptors in the hand are known to detect vibrations in all directions. Furthermore, the fact that these mechanoreceptors are largely insensitive to the direction of high-frequency vibrations points to the existence of a transformation that can reduce three-dimensional high-frequency vibration signals to a one-dimensional signal without appreciable perceptual degradation. After formalizing the requirements for this transformation, this paper describes and compares several candidate methods of varying degrees of sophistication, culminating in a novel frequency-domain solution that performs very well on our chosen metrics.
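
    As a concrete illustration, the following is a minimal Python sketch of one frequency-domain reduction in this spirit: it preserves each frequency bin's combined three-axis energy and borrows the phase of the axis sum to retain temporal structure. The function name and test signal are ours, and the paper's actual transformation may differ in detail.

        import numpy as np

        def reduce_3d_vibration(ax, ay, az):
            # Collapse a 3-axis vibration into one axis in the frequency
            # domain: keep the combined per-bin energy of all three axes
            # and use the phase of their sum to keep transients aligned.
            Ax, Ay, Az = np.fft.rfft(ax), np.fft.rfft(ay), np.fft.rfft(az)
            mag = np.sqrt(np.abs(Ax)**2 + np.abs(Ay)**2 + np.abs(Az)**2)
            phase = np.angle(Ax + Ay + Az)
            return np.fft.irfft(mag * np.exp(1j * phase), n=len(ax))

        # Example: a 0.1 s, 10 kHz signal with unequal energy per axis.
        t = np.arange(0, 0.1, 1e-4)
        x1d = reduce_3d_vibration(np.sin(2*np.pi*250*t),
                                  0.5*np.sin(2*np.pi*250*t + 1.0),
                                  0.1*np.random.randn(t.size))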

    Relaying the High-Frequency Contents of Tactile Feedback to Robotic Prosthesis Users: Design, Filtering, Implementation, and Validation

    It is known that high-frequency tactile information conveys useful cues for discriminating important contact properties for manipulation, such as first contact and roughness. Despite this, no practical system implementing a modality-matching paradigm has been developed so far to convey this information to users of upper-limb prostheses. The main obstacle to this implementation is the presence of unwanted vibrations generated by the artificial limb's mechanics, which are not related to any haptic exploration task. In this letter, we describe the design of a digital system that can record accelerations from the fingers of an artificial hand and reproduce them on the user's skin through voice-coil actuators. Particular attention has been devoted to the design of the filter needed to cancel those vibrations measured by the sensors that do not convey information about meaningful contact events. The performance of the newly designed filter is also compared with the state of the art. Exploratory experiments with prosthesis users have identified some applications where this kind of feedback could lead to sensory-motor performance enhancement. Results show that the proposed system improves the perception of object-salient features such as first-contact events, roughness, and shape.
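
    The letter's filter design is more involved than can be shown here; as a rough sketch of the idea, assume the mechanism-induced vibrations concentrate at known frequencies (the sampling rate, band edges, and 120/240 Hz motor lines below are all hypothetical) and remove them while keeping the band where contact cues live.

        import numpy as np
        from scipy import signal

        FS = 2000  # Hz, assumed accelerometer sampling rate

        # Band-pass to the high-frequency range where contact cues live;
        # the cutoffs are illustrative, not the letter's actual design.
        bp_sos = signal.butter(4, [50, 800], btype='bandpass',
                               fs=FS, output='sos')

        def clean_tactile(accel, motor_freqs=(120, 240)):
            # Keep contact-related content, then notch out the known
            # (here: hypothetical) motor/mechanism vibration lines.
            y = signal.sosfilt(bp_sos, accel)
            for f0 in motor_freqs:
                b, a = signal.iirnotch(f0, Q=30, fs=FS)
                y = signal.lfilter(b, a, y)
            return y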

    Refined Methods for Creating Realistic Haptic Virtual Textures from Tool-Mediated Contact Acceleration Data

    Dragging a tool across a textured object creates rich high-frequency vibrations that distinctly convey the physical interaction between the tool tip and the object surface. Varying one’s scanning speed and normal force alters these vibrations, but it does not change the perceived identity of the tool or the surface. Previous research developed a promising data-driven approach to embedding this natural complexity in a haptic virtual environment: the approach centers on recording and modeling the tool contact accelerations that occur during real texture interactions at a limited set of force-speed combinations. This paper aims to optimize these prior methods of texture modeling and rendering to improve system performance and enable potentially higher levels of haptic realism. The key elements of our approach are drawn from time series analysis, speech processing, and discrete-time control. We represent each recorded texture vibration with a low-order auto-regressive moving-average (ARMA) model, and we optimize this set of models for a specific tool-surface pairing (plastic stylus and textured ABS plastic) using metrics that depend on spectral match, final prediction error, and model order. For rendering, we stably resample the texture models at the desired output rate, and we derive a new texture model at each time step using bilinear interpolation on the line spectral frequencies of the resampled models adjacent to the user’s current force and speed. These refined processes enable our TexturePad system to generate a stable and spectrally accurate vibration waveform in real time, moving us closer to the goal of virtual textures that are indistinguishable from their real counterparts.
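
    A compressed sketch of the rendering-side idea: weight the four stored models that surround the user's current force and speed, blend them, and drive the blend with white noise. The paper interpolates line spectral frequencies precisely because direct coefficient interpolation, used below for brevity, does not guarantee stability; the grid values, model order, and gains are hypothetical.

        import numpy as np
        from scipy.signal import lfilter

        def bilinear_weights(f, v, f0, f1, v0, v1):
            # Bilinear weights of the four texture models surrounding
            # the current normal force f and scanning speed v.
            tf, tv = (f - f0) / (f1 - f0), (v - v0) / (v1 - v0)
            return np.array([(1-tf)*(1-tv), tf*(1-tv), (1-tf)*tv, tf*tv])

        def render_step(models, gains, w, state, rng):
            # One output sample: blend neighboring AR models (a[0] = 1)
            # and excite the blend with white noise. NOTE: the paper
            # blends line spectral frequencies to guarantee stability.
            a = sum(wi * ai for wi, ai in zip(w, models))
            g = np.sqrt(sum(wi * gi**2 for wi, gi in zip(w, gains)))
            y, state = lfilter([1.0], a, g * rng.standard_normal(1),
                               zi=state)
            return y[0], state

        rng = np.random.default_rng(0)
        models = [np.r_[1.0, -0.2*rng.random(4)] for _ in range(4)]
        gains = [0.10, 0.12, 0.20, 0.25]
        w = bilinear_weights(1.5, 80.0, f0=1.0, f1=2.0, v0=50.0, v1=100.0)
        y, state = render_step(models, gains, w, np.zeros(4), rng)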

    Haptic Transparency and Interaction Force Control for a Lower-Limb Exoskeleton

    Controlling the interaction forces between a human and an exoskeleton is crucial for providing transparency or adjusting assistance or resistance levels. However, it is an open problem to control the interaction forces of lower-limb exoskeletons designed for unrestricted overground walking. For these types of exoskeletons, it is challenging to implement force/torque sensors at every contact between the user and the exoskeleton for direct force measurement. Moreover, it is important to compensate for the exoskeleton's whole-body gravitational and dynamic forces, especially for heavy lower-limb exoskeletons. Previous works either simplified the dynamic model by treating the legs as independent double pendulums, or they did not close the loop with interaction force feedback. The proposed whole-exoskeleton closed-loop compensation (WECC) method calculates the interaction torques during the complete gait cycle by using whole-body dynamics and joint torque measurements on a hip-knee exoskeleton. Furthermore, it uses a constrained optimization scheme to track desired interaction torques in a closed loop while considering physical and safety constraints. We evaluated the haptic transparency and dynamic interaction torque tracking of WECC control on three subjects. We also compared the performance of WECC with a controller based on a simplified dynamic model and a passive version of the exoskeleton. The WECC controller results in a consistently low absolute interaction torque error during the whole gait cycle for both zero and nonzero desired interaction torques. In contrast, the simplified controller yields poor performance in tracking desired interaction torques during the stance phase.
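
    In schematic form, the method has two ingredients: an interaction-torque estimate from whole-body inverse dynamics plus joint torque measurements, and a constrained tracking law. The sketch below is our own simplification; proportional action with clipping stands in for the paper's constrained optimization, and M, c, g denote the modeled mass matrix, Coriolis/centrifugal, and gravity terms.

        import numpy as np

        def interaction_torque(tau_meas, M, c, g, ddq):
            # Interaction torque = measured joint torque minus the torque
            # the whole-body model attributes to the exoskeleton's own
            # dynamics (a schematic of the idea, not the exact equations).
            return tau_meas - (M @ ddq + c + g)

        def command_torque(tau_int, tau_int_des, tau_prev,
                           kp=5.0, tau_max=60.0, dtau_max=5.0):
            # Track the desired interaction torque in closed loop, then
            # enforce magnitude and rate limits by clipping; the paper
            # instead solves a constrained optimization at each step.
            tau = tau_prev + kp * (tau_int_des - tau_int)
            tau = np.clip(tau, tau_prev - dtau_max, tau_prev + dtau_max)
            return np.clip(tau, -tau_max, tau_max)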

    Spectral Subtraction of Robot Motion Noise for Improved Event Detection in Tactile Acceleration Signals

    New robots for teleoperation and autonomous manipulation are increasingly being equipped with high-bandwidth accelerometers for measuring the transient vibrational cues that occur during contact with objects. Unfortunately, the robot's own internal mechanisms often generate significant high-frequency accelerations, which we term ego-vibrations. This paper presents an approach to characterizing and removing these signals from acceleration measurements. We adapt the audio processing technique of spectral subtraction over short time windows to remove the noise that is estimated to occur at the robot's present joint velocities. Implementation for the wrist roll and gripper joints on a Willow Garage PR2 robot demonstrates that spectral subtraction significantly increases signal-to-noise ratio, which should improve vibrotactile event detection in both teleoperation and autonomous robotics.
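
    The core operation adapts directly from audio processing. A minimal sketch, assuming a per-bin noise magnitude profile noise_mag has been tabulated offline as a function of the robot's joint velocities (sampling rate and window length are illustrative):

        import numpy as np
        from scipy.signal import stft, istft

        FS = 3000  # Hz, assumed accelerometer sampling rate

        def subtract_ego_vibration(accel, noise_mag, fs=FS, nperseg=256):
            # Short-time spectral subtraction: remove the estimated
            # magnitude of the robot's own vibrations from each window,
            # floor at zero, and resynthesize with the original phase.
            # noise_mag holds one magnitude per STFT frequency bin.
            f, t, X = stft(accel, fs=fs, nperseg=nperseg)
            mag = np.maximum(np.abs(X) - noise_mag[:, None], 0.0)
            _, y = istft(mag * np.exp(1j * np.angle(X)),
                         fs=fs, nperseg=nperseg)
            return y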

    A novel tactile display for softness and texture rendering in tele-operation tasks

    Softness and high-frequency texture information represent fundamental haptic properties for everyday activities and tactile exploration of the environment. While several displays have been produced to convey either softness or high-frequency information, there is little or no evidence of systems that are able to reproduce both of these properties in an integrated fashion. This aspect is especially crucial in medical tele-operated procedures, where roughness and stiffness of human tissues are both important to correctly identify pathologies through palpation (e.g., in tele-dermatology). This work presents a fabric yielding display (FYD-pad), a fabric-based tactile display for softness and texture rendering. The system exploits the control of two motors both to modify the stretching state of the elastic fabric for softness rendering and to convey texture information on the basis of accelerometer-based data. At the same time, the measurement of the contact area can be used to control remote or virtual robots. In this paper, we discuss the architecture of FYD-pad and the techniques used for softness and texture reproduction as well as for synthesizing probe-surface interactions from real data. Tele-operation examples and preliminary experiments with humans are reported, which show the effectiveness of the device in delivering both softness and texture information.
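
    As a purely hypothetical sketch of the command structure such a device implies, one can superimpose a quasi-static stretch term (softness) and a small high-frequency term derived from accelerometer data (texture); the actual FYD-pad control law is the one described in the paper.

        import numpy as np

        def fyd_motor_commands(stretch_setpoint, texture_signal, gain=0.05):
            # Hypothetical mixing for a two-motor fabric display: the
            # slow stretch term sets the rendered softness, and a scaled
            # vibration term replays the recorded texture on top of it.
            vib = gain * np.asarray(texture_signal)
            return stretch_setpoint + vib, stretch_setpoint + vib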

    Automatic Filter Design for Synthesis of Haptic Textures from Recorded Acceleration Data

    Sliding a probe over a textured surface generates a rich collection of vibrations that one can easily use to create a mental model of the surface. Haptic virtual environments attempt to mimic these real interactions, but common haptic rendering techniques typically fail to reproduce the sensations that are encountered during texture exploration. Past approaches have focused on building a representation of textures using a priori ideas about surface properties. Instead, this paper describes a process of synthesizing probe-surface interactions from data recorded from real interactions. We explain how to apply the mathematical principles of Linear Predictive Coding (LPC) to develop a discrete transfer function that represents the acceleration response under specific probe-surface interaction conditions. We then use this predictive transfer function to generate unique acceleration signals of arbitrary length. In order to move between transfer functions from different probe-surface interaction conditions, we develop a method for interpolating the variables involved in the texture synthesis process. Finally, we compare the results of this process with real recorded acceleration signals, and we show that the two correlate strongly in the frequency domain.
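
    The heart of the pipeline, sketched in Python: estimate LPC coefficients from a recorded acceleration via the autocorrelation normal equations, then drive the resulting all-pole filter with white noise to synthesize a signal of arbitrary length. The model order and the stand-in "recording" are illustrative.

        import numpy as np
        from scipy.linalg import solve_toeplitz
        from scipy.signal import lfilter

        def lpc_fit(x, order=8):
            # Solve the autocorrelation normal equations for the LPC
            # coefficients and the white-noise excitation gain.
            r = np.correlate(x, x, 'full')[len(x)-1:len(x)+order]
            a = solve_toeplitz((r[:-1], r[:-1]), r[1:])
            gain = np.sqrt(r[0] - a @ r[1:])
            return np.r_[1.0, -a], gain   # denominator polynomial, gain

        def lpc_synthesize(a, gain, n, rng):
            # Arbitrary-length texture: white noise through the all-pole
            # predictive transfer function identified above.
            return lfilter([gain], a, rng.standard_normal(n))

        rng = np.random.default_rng(0)
        recorded = lfilter([1.0], [1.0, -0.9],
                           rng.standard_normal(5000))  # stand-in recording
        a, g = lpc_fit(recorded)
        synthetic = lpc_synthesize(a, g, 5000, rng)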

    Challenges and Opportunities for Designing Tactile Codecs from Audio Codecs

    Haptic communication allows physical interaction over long distances and greatly complements conventional means of communication, such as audio and video. However, whilst standardized codecs for video and audio are well established, there is a lack of standardized codecs for haptics. This causes vendor lock-in, which greatly limits scalability, increases cost, and prevents advanced usage scenarios with multiple sensors/actuators and multiple users. The aim of this paper is to introduce a new approach for understanding and encoding tactile signals, i.e., the sense of touch, in haptic interactions. Inspired by various audio codecs, we develop a similar methodology for tactile codecs. Notably, we demonstrate that tactile and audio signals are similar in both the time and frequency domains, thereby allowing audio coding techniques to be adapted to tactile codecs with appropriate adjustments. We also present the differences between audio and tactile signals that should be considered in future designs. Moreover, in order to evaluate the performance of a tactile codec, we propose a potential direction for designing an objective quality metric that complements haptic mean opinion scores (h-MOS). This, we hope, will open the door to designing and assessing tactile codecs.
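
    To make the audio-to-tactile analogy concrete, here is a toy transform coder of the kind the paper's comparison suggests: frame DCT, retention of the strongest coefficients, uniform quantization. Frame size, kept-coefficient count, and quantizer step are arbitrary choices, not a proposed codec.

        import numpy as np
        from scipy.fft import dct, idct

        def encode_block(x, keep=16):
            # Transform-code one frame of a tactile signal: DCT, retain
            # the 'keep' largest coefficients, quantize them coarsely.
            X = dct(x, norm='ortho')
            idx = np.argsort(np.abs(X))[-keep:]
            q = np.round(X[idx] * 64) / 64     # uniform quantizer
            return idx, q, len(x)

        def decode_block(idx, q, n):
            X = np.zeros(n)
            X[idx] = q
            return idct(X, norm='ortho')

        frame = np.sin(2*np.pi*0.07*np.arange(256)) + 0.05*np.random.randn(256)
        rebuilt = decode_block(*encode_block(frame))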