27 research outputs found

    Contact geometry and mechanics predict friction forces during tactile surface exploration

    When we touch an object, complex frictional forces are produced, aiding us in perceiving surface features that help to identify the object at hand, and also facilitating grasping and manipulation. However, even during controlled tactile exploration, sliding friction forces fluctuate greatly, and it is unclear how they relate to the surface topography or the mechanics of contact with the finger. We investigated the sliding contact between the finger and different relief surfaces using high-speed video and force measurements. Informed by these experiments, we developed a friction force model that accounts for surface shape and contact mechanical effects and is able to predict sliding friction forces for different surfaces and exploration speeds. We also observed that local regions of disconnection between the finger and surface develop near high relief features, due to the stiffness of the finger tissues. Every tested surface had regions that were never contacted by the finger; we refer to these as "tactile blind spots". The results elucidate friction force production during tactile exploration, may aid efforts to connect sensory and motor function of the hand to properties of touched objects, and provide crucial knowledge to inform the rendering of realistic experiences of touch contact in virtual reality.
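
    The abstract describes a model that predicts sliding friction from surface shape and contact mechanics. As a purely illustrative sketch (not the authors' published model), the snippet below combines an adhesive term, using a Hertz-like power-law relation between normal load and gross contact area, with a term driven by the local surface slope; all functional forms and parameter values are assumptions.

    ```python
    # Illustrative fingertip friction sketch: adhesion (shear strength * area)
    # plus a geometric contribution from the local surface slope. Not the
    # authors' model; parameters are assumed, order-of-magnitude values.

    def contact_area(normal_force, k=1.1e-4):
        """Gross contact area (m^2) as a power law of normal load (N),
        a Hertz-like assumption often used for fingertip contact."""
        return k * normal_force ** (2.0 / 3.0)

    def friction_force(normal_force, surface_slope, tau=6e3, mu_geo=1.0):
        """Sliding friction (N): interfacial shear strength tau (Pa) times
        contact area, plus a slope-dependent term for relief features."""
        adhesion = tau * contact_area(normal_force)
        geometry = mu_geo * normal_force * surface_slope
        return adhesion + geometry

    # Example: 0.5 N normal load while crossing a ridge with a 10% local slope.
    print(friction_force(0.5, 0.10))
    ```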

    The development of a new artificial model of a finger for assessing transmitted vibrations.

    Prolonged exposure of the hand to tool-induced vibrations is associated with the occurrence of conditions such as vibration white finger. This study involves the development of a new artificial model that approximates both the loading and the vibration behaviour of the human finger. The layered system uses polypropylene "bones", encased in a cylinder of low-modulus, room-temperature-curing silicone gel (to replicate the subcutaneous tissues), with an outer layer of latex (to replicate the dermis and epidermis). A manufacturing protocol was developed, and dynamic mechanical analysis was carried out on a range of gels in order to select those with mechanical properties closest to the human finger. The load-deflection behaviour under quasi-static loading was obtained using an indenter. The indentation measurements were then compared with a set of validation data obtained from human participant testing under the same conditions. A 2-D FE model of the finger was also used to assess vibration responses using existing parameters for a human finger and those obtained from the tested materials. Vibration analysis was conducted under swept sinusoidal excitations ranging from 10 to 400 Hz whilst the FE finger model was pressed 6 mm toward the handle. The results were found to compare well. This synthetic test-bed and protocol can now be used in future experiments for assessing finger-transmitted vibrations. For instance, it can aid in assessing anti-vibration glove materials without the need for human subjects and provide consistent control of test parameters such as grip force.
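
    The vibration analysis described above drives an FE finger model with a swept sine from 10 to 400 Hz. As a hedged illustration of the kind of response being compared (not the study's 2-D FE model), the sketch below reduces the finger to a single mass-spring-damper and evaluates its base-excitation transmissibility over the same band; the lumped parameter values are assumptions.

    ```python
    # Lumped single-DOF stand-in for a finger pressed onto a vibrating handle.
    # Computes |X_finger / X_handle| for base excitation over 10-400 Hz.
    # The mass, stiffness and damping values are illustrative assumptions.
    import numpy as np

    m, k, c = 0.005, 2.0e4, 3.0   # kg, N/m, N*s/m (assumed lumped values)

    def transmissibility(freq_hz):
        """Magnitude of the base-excitation transfer function
        (k + j*c*w) / (k - m*w^2 + j*c*w) for a mass-spring-damper."""
        w = 2.0 * np.pi * np.asarray(freq_hz, dtype=float)
        return np.abs((k + 1j * c * w) / (k - m * w**2 + 1j * c * w))

    freqs = np.linspace(10.0, 400.0, 40)   # swept-sine band used in the study
    for f, t in zip(freqs[::10], transmissibility(freqs)[::10]):
        print(f"{f:6.1f} Hz -> transmissibility {t:5.2f}")
    ```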

    Multi-physics modelling and experimental validation of electrovibration based haptic devices

    Electrovibration tactile displays exploit the polarisation of the finger pad caused by an insulated plate supplied with a high voltage. This results in an electrostatic attraction that can be used to modulate the user's perception of an essentially flat surface and induce texture sensations. Two analytical models of electrovibration, based on the parallel-plate capacitor assumption, are considered and assessed by comparison with experimental results published in the literature. In addition, an experimental setup was developed to measure the electrostatic force between the finger pad and a high-voltage-supplied plate in a static, out-of-contact state in order to support the use of the parallel-plate capacitor model. The development, validation, and application of a computational framework for modelling tactile scenarios on real surfaces and on virtual surfaces rendered by the electrovibration technique are presented. The framework incorporates a fully parametric model of the materials and geometry of the finger pad and of the virtual and real surfaces, and can serve as a tool for virtual prototyping and haptic rendering in electrovibration tactile displays. This is achieved by controlling the applied voltage signal so as to guarantee similar lateral force cues on real and simulated surfaces.
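
    Since the abstract leans on the parallel-plate capacitor assumption, a one-line estimate of the electrostatic attraction follows directly from F = eps0 * eps_r * A * V^2 / (2 * d^2). The sketch below evaluates this expression; the dielectric thickness, relative permittivity, and contact area are illustrative assumptions, not values from the paper.

    ```python
    # Parallel-plate capacitor estimate of electrovibration attraction force.
    # All geometric and material parameters below are assumed for illustration.
    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def electrostatic_force(voltage, area=1.0e-4, gap=2.0e-5, eps_r=3.0):
        """Attractive force (N) between finger pad and an insulated plate,
        modelling the interface as a single-dielectric parallel-plate
        capacitor with plate area `area` (m^2) and dielectric gap `gap` (m)."""
        return EPS0 * eps_r * area * voltage**2 / (2.0 * gap**2)

    # Example: 200 V applied to the plate.
    print(f"{electrostatic_force(200.0) * 1e3:.1f} mN")
    ```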

    Role of mechanics in tactile sensing of shape

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 1995. Includes bibliographical references (leaves 199-205). By Kiran Dandekar. Ph.D.

    Tactile sensing of shape: biomechanics of contact investigated using imaging and modeling

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2006. Includes bibliographical references (leaves 123-131). The overall goal of this research effort is to improve the understanding of the biomechanics of skin as it pertains to the human tactile sense. During touch, mechanoreceptors beneath the skin surface are mechanically loaded due to physical contact of the skin with an object and respond with a series of neural impulses. This neural population response is decoded by the central nervous system to result in tactile perception of properties such as the shape, surface texture, and softness of the object. The particular approach taken in this research is to develop a realistic model of the human fingertip based on empirical measurements of in vivo geometric and material properties of the skin layers, so that the mechanical response of the fingertip skin to contact with objects of different shapes can be investigated, to help identify the relevant mechanism that triggers the mechanoreceptors in tactile encoding of object shape. To obtain geometric data on the ridged skin surface and the layers underneath, together with their deformation patterns, optical coherence tomography (OCT) was used to image human fingertips in vivo, free of load as well as when loaded with rigid indenters of different shapes. The images of undeformed and deformed finger pads were obtained, processed, and used for biomechanically validating the fingertip model. To obtain material properties of the skin layers, axial strain imaging using high-frequency ultrasound backscatter microscopy (UBM) was utilized in experiments on human fingertips in vivo to estimate the ratio of stiffnesses of the epidermis and dermis. By utilizing the data from the OCT and UBM experiments, a multilayered three-dimensional finite element model of the human fingertip, composed of the ridged fingerpad skin surface as well as the papillary interface between the epidermis and dermis, was developed. The model was used to simulate static indentation of the fingertip by rigid objects of different shapes and to compute stress and strain measures, such as strain energy density (SED) and maximum compressive or tensile strain (MCS, MTS), which have been previously proposed as the relevant stimuli that trigger mechanoreceptor responses. The results showed that the intricate geometry of the skin layers and the inhomogeneous material properties around the locations of the SA-I and RA mechanoreceptors caused significant differences in the spatial distribution of the candidate relevant stimuli, compared with other locations at the same depths or with the predictions of previous homogeneous models of the fingertip. The distributions of the SED at the locations of SA-I mechanoreceptors and of the MCS/MTS at the locations of RA mechanoreceptors under indentation by objects of different shapes were obtained to serve as predictions to be tested in future biomechanical and neurophysiological experiments. By Wan-Chen Wu. Ph.D.
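
    The abstract names strain energy density (SED) and maximum compressive/tensile strain (MCS, MTS) as candidate stimuli computed from the finite element model. As a small, self-contained illustration (with made-up tensors rather than thesis results), the sketch below shows how these measures can be evaluated from stress and strain at a mechanoreceptor location.

    ```python
    # Candidate mechanoreceptor stimulus measures from FE output at a point:
    # SED = 0.5 * (sigma : epsilon); MTS/MCS from the principal strains.
    # The stress and strain tensors below are illustrative, not thesis data.
    import numpy as np

    def strain_energy_density(stress, strain):
        """0.5 * double contraction of stress and strain (linear elasticity)."""
        return 0.5 * np.tensordot(stress, strain)

    def principal_strain_extremes(strain):
        """(max tensile, max compressive) principal strains."""
        eigvals = np.linalg.eigvalsh(strain)
        return eigvals.max(), eigvals.min()

    stress = np.diag([-3.0e3, -1.0e3, -0.5e3])   # Pa, illustrative
    strain = np.diag([-0.02, -0.005, -0.002])    # dimensionless, illustrative
    print("SED (J/m^3):", strain_energy_density(stress, strain))
    print("MTS, MCS:", principal_strain_extremes(strain))
    ```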

    Design and analysis of fingernail sensors for measurement of fingertip touch force and finger posture

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2002. Includes bibliographical references (leaves 142-148). A new type of wearable sensor for detecting fingertip touch force and finger posture is presented. Unlike traditional electronic gloves, in which sensors are embedded along the finger and on the fingerpads, this new device does not constrict finger motion and allows the fingers to directly contact the environment without obstructing the human's natural haptic senses. The fingertip touch force and finger posture are detected by measuring changes in the coloration of the fingernail; hence, the sensor is mounted on the fingernail and does not interfere with bending or touching actions. Specifically, the fingernail is instrumented with miniature light-emitting diodes (LEDs) and photodetectors in order to measure changes in the reflection intensity when the fingertip is pressed against a surface or when the finger is bent. The changes in intensity are then used to determine changes in the blood volume under the fingernail, a technique termed "reflectance photoplethysmography." By arranging the LEDs and photodetectors in a spatial array, the two-dimensional pattern of blood volume can be measured and used to predict the touch force and posture. This thesis first underscores the role of the fingernail sensor as a means of indirectly detecting fingertip touch force and finger posture by measuring the internal state of the finger. The desired functionality and the principles of photoplethysmography are used to create a set of design goals and guidelines for such a sensor. A working miniaturized prototype nail sensor is designed, built, tested, and analyzed. Based on fingertip anatomy and photographic evidence, mechanical and hemodynamic models are created in order to understand the mechanism of the blood volume change at multiple locations within the fingernail bed. These models are verified through experiment and simulation. Next, data-driven mathematical models, or filters, are designed to comprehensively predict normal touching forces, shear touching forces, and finger bending based on readings from the sensor. A method to experimentally calibrate the filters is designed, implemented, and validated. Using these filters, the sensors are capable of predicting forces to within 0.5 N RMS error and posture angle to within 10 degrees RMS error. The performance of the filters is analyzed, compared, and used to suggest design guidelines for the next generation of sensors. Finally, applications to human-machine interfaces are discussed and tested, and potential impacts of this work on the fields of virtual reality and robotics are proposed. By Stephen A. Mascaro. Ph.D.
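
    The thesis describes data-driven filters that map the photodetector intensity pattern to normal force, shear force, and posture angle, calibrated experimentally. As a hedged sketch of that idea (not the actual filters), the snippet below fits a linear least-squares calibration on synthetic data; the array size, bias term, and linear model choice are assumptions.

    ```python
    # Least-squares calibration filter: photodetector intensities -> outputs
    # (normal force, shear force, posture angle). Synthetic data, for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    n_detectors, n_samples = 8, 200

    # Synthetic calibration set: intensity readings X and reference outputs Y.
    true_W = rng.normal(size=(n_detectors + 1, 3))
    X = rng.normal(size=(n_samples, n_detectors))
    X1 = np.hstack([X, np.ones((n_samples, 1))])          # add bias column
    Y = X1 @ true_W + 0.05 * rng.normal(size=(n_samples, 3))

    # Fit the calibration filter by ordinary least squares.
    W, *_ = np.linalg.lstsq(X1, Y, rcond=None)

    def predict(intensities):
        """One intensity reading -> (normal force, shear force, angle)."""
        return np.append(intensities, 1.0) @ W

    print(predict(X[0]), "vs reference", Y[0])
    ```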

    Modeling of frictional forces during bare-finger interactions with solid surfaces

    Touching an object with our fingers yields frictional forces that allow us to perceive and explore its texture, shape, and other features, facilitating grasping and manipulation. While the relevance of dynamic frictional forces to sensory and motor function in the hand is well established, the way that they reflect the shape, features, and composition of touched objects is poorly understood. Haptic displays (electronic interfaces for stimulating the sense of touch) often aim to elicit the perceptual experience of touching real surfaces by delivering forces to the fingers that mimic those felt during real touch. However, the design and applications of such displays have been limited by the lack of knowledge about what forces are felt during real touch interactions. This represents a major gap in current knowledge about tactile function and haptic engineering, and this dissertation addresses several aspects of it. The goal of this research was to measure, characterize, and model the frictional forces produced by a bare finger sliding over surfaces of multiple shapes. The major contributions of this work are (1) the design and development of a sensing system for capturing fingertip motion and forces during tactile exploration of real surfaces; (2) the measurement and characterization of contact forces and the deformation of finger tissues during sliding over relief surfaces; (3) the development of a low-order model of frictional force production based on surface specifications; and (4) the analysis and modeling of contact geometry, interfacial mechanics, and their effects on frictional force production during tactile exploration of relief surfaces. This research aims to guide the design of algorithms for the haptic rendering of surface texture and shape. Such algorithms can be used to enhance human-machine interfaces, such as touch-screen displays, by (1) enabling users to feel surface characteristics that are also presented visually; (2) facilitating interaction with these devices; and (3) reducing the need for visual input when interacting with them. Ph.D., Electrical Engineering -- Drexel University, 201
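
    One intended application named above is haptic rendering of texture and shape from a surface specification. As an illustrative sketch of such a rendering rule (not the dissertation's algorithm), the snippet below commands a lateral force composed of a Coulomb sliding term plus a term driven by the gradient of a virtual height map; the sinusoidal texture and all gains are assumptions.

    ```python
    # Illustrative lateral-force rendering rule for a virtual textured surface:
    # Coulomb friction plus a height-map-gradient texture term. All parameters
    # and the sinusoidal height map are assumptions for illustration.
    import numpy as np

    def lateral_force(x, v, normal_force, mu=0.6, amp=2.0e-4, wavelength=2.0e-3):
        """Lateral force (N) at finger position x (m) and velocity v (m/s)."""
        slope = amp * (2 * np.pi / wavelength) * np.cos(2 * np.pi * x / wavelength)
        coulomb = mu * normal_force * np.sign(v)   # baseline sliding friction
        texture = normal_force * slope             # local height-map gradient
        return coulomb + texture

    # Example: finger at x = 0.5 mm sliding at 50 mm/s under 0.4 N normal load.
    print(lateral_force(0.0005, 0.05, 0.4))
    ```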

    The Presentation and Perception of Virtual Textures through a Haptic Matrix Display Device

    Dynamic, refreshable tactile displays offer a method of displaying graphical information to people who are blind or visually impaired. Texture, which is already used as an effective way to present graphical information in physical tactile diagrams, conceivably constitutes the best way to present graphics through a tactile display. This thesis presents the design of a new low-cost haptic matrix display device capable of displaying graphical information through virtual textures. The perception of virtual textures through the display is examined in three main experiments. The first two experiments examine the perception of square-wave gratings through the device. The final experiment examines the effect of texture adaptation when using the device and compares it with exploration using a handheld probe and the bare finger. The results show that haptic matrix displays can be used to display graphical information through texture, and they offer guidelines for the production of such textures.
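
    The grating experiments above display square-wave textures through a pin-matrix device. As a small illustration of how such a stimulus can be generated (the display resolution and grating parameters are assumptions, not the thesis hardware), the sketch below builds a binary pin pattern for a square-wave grating.

    ```python
    # Generate a square-wave grating as a binary (raised/lowered) pin pattern
    # for a refreshable matrix display. Resolution and period are assumed.
    import numpy as np

    def square_wave_grating(rows=32, cols=48, period_px=8, phase_px=0):
        """Binary pin heights: vertical bars, each period_px // 2 pins wide."""
        col_idx = (np.arange(cols) + phase_px) % period_px
        bars = (col_idx < period_px // 2).astype(int)
        return np.tile(bars, (rows, 1))

    pattern = square_wave_grating()
    for row in pattern[:4]:                 # preview a few rows as '#' and '.'
        print("".join("#" if p else "." for p in row))
    ```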

    Virtual environments for medical training: graphic and haptic simulation of tool-tissue interactions

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2004. Includes bibliographical references (leaves 122-127). For more than 2,500 years, surgical teaching has been based on the so-called "see one, do one, teach one" paradigm, in which the surgical trainee learns by operating on patients under close supervision of peers and superiors. However, higher demands on the quality of patient care and rising malpractice costs have made it increasingly risky to train on patients. Minimally invasive surgery, in particular, has made it more difficult for an instructor to demonstrate the required manual skills. It has been recognized that, similar to flight simulators for pilots, virtual reality (VR) based surgical simulators promise a safer and more comprehensive way to train the manual skills of medical personnel in general and surgeons in particular. One of the major challenges in the development of VR-based surgical trainers is the real-time and realistic simulation of interactions between surgical instruments and biological tissues. It involves multi-disciplinary research areas, including soft tissue mechanical behavior, tool-tissue contact mechanics, computer haptics, computer graphics, and robotics, integrated into VR-based training systems. The research described in this thesis addresses many of the problems of simulating tool-tissue interactions in medical virtual environments. First, two kinds of physically based real-time soft tissue models - the local deformation model and the hybrid deformation model - were developed to compute interaction forces and visual deformation fields that provide real-time feedback to the user. Second, a system to measure the in vivo mechanical properties of soft tissues was designed, and eleven sets of animal experiments were performed to measure the in vivo and in vitro biomechanical properties of porcine intra-abdominal organs. Viscoelastic tissue parameters were then extracted by matching finite element model predictions with the empirical data. Finally, the tissue parameters were combined with geometric organ models segmented from the Visible Human Dataset and integrated into a minimally invasive surgical simulation system consisting of haptic interface devices inside a mannequin and a graphic display. This system was used to demonstrate deformation and cutting of the esophagus, where the user can haptically interact with the virtual soft tissues and see the corresponding organ deformation on the visual display at the same time. By Jung Kim. Ph.D.
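
    The abstract refers to physically based, real-time soft tissue models that return interaction forces to the user. As a hedged, much-simplified illustration (not the thesis' local or hybrid deformation models), the sketch below returns a tool-tissue reaction force from a stiffening spring in parallel with a damper, the kind of law that extracted viscoelastic parameters could feed; the force law and values are assumptions.

    ```python
    # Simplified tool-tissue reaction force: stiffening elastic term plus a
    # rate-dependent (viscous) term, evaluated per haptic update. The force
    # law and all parameter values are illustrative assumptions.
    def tissue_reaction_force(depth, velocity, k1=150.0, k2=4.0e4, c=2.0):
        """Reaction force (N) for tool indentation depth (m), velocity (m/s)."""
        if depth <= 0.0:                       # tool not in contact
            return 0.0
        elastic = k1 * depth + k2 * depth**2   # stiffening elastic response
        viscous = c * velocity                 # viscoelastic rate dependence
        return elastic + viscous

    # Example: 5 mm indentation advancing at 10 mm/s.
    print(tissue_reaction_force(0.005, 0.01))
    ```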