176 research outputs found

    Contact geometry and mechanics predict friction forces during tactile surface exploration

    When we touch an object, complex frictional forces are produced, aiding us in perceiving surface features that help to identify the object at hand, and also facilitating grasping and manipulation. However, even during controlled tactile exploration, sliding friction forces fluctuate greatly, and it is unclear how they relate to the surface topography or mechanics of contact with the finger. We investigated the sliding contact between the finger and different relief surfaces, using high-speed video and force measurements. Informed by these experiments, we developed a friction force model that accounts for surface shape and contact mechanical effects, and is able to predict sliding friction forces for different surfaces and exploration speeds. We also observed that local regions of disconnection between the finger and surface develop near high relief features, due to the stiffness of the finger tissues. Every tested surface had regions that were never contacted by the finger; we refer to these as "tactile blind spots". The results elucidate friction force production during tactile exploration, may aid efforts to connect sensory and motor function of the hand to properties of touched objects, and provide crucial knowledge to inform the rendering of realistic experiences of touch contact in virtual reality.
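    As an illustration of how surface geometry can enter such a friction prediction, the sketch below implements a generic Coulomb-plus-slope estimate of tangential force along a relief profile. This is not the model developed in the paper; the friction coefficient, the sinusoidal profile, and the small-slope approximation are illustrative assumptions only.

        # Hypothetical sketch: a generic Coulomb-plus-slope friction estimate for a
        # finger sliding over a 1D relief profile h(x). Not the paper's model; mu,
        # the profile, and the small-slope approximation are illustrative choices.
        import numpy as np

        def sliding_friction(x, h, normal_force, mu=0.8):
            """Estimate tangential force along a relief profile.

            x            : positions along the sliding direction (m)
            h            : surface height at each position (m)
            normal_force : normal load applied by the finger (N)
            mu           : assumed interfacial friction coefficient
            """
            slope = np.gradient(h, x)            # local surface gradient dh/dx
            interfacial = mu * normal_force      # classic Coulomb term
            geometric = normal_force * slope     # normal load projected by the local slope
            return interfacial + geometric       # total tangential force (N)

        # Example: a sinusoidal relief surface explored under a 0.5 N normal load
        x = np.linspace(0, 0.05, 500)                  # 5 cm stroke
        h = 0.5e-3 * np.sin(2 * np.pi * x / 5e-3)      # 0.5 mm amplitude, 5 mm period
        f_t = sliding_friction(x, h, normal_force=0.5)
        print(f_t.min(), f_t.max())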

    Modeling of frictional forces during bare-finger interactions with solid surfaces

    Touching an object with our fingers yields frictional forces that allow us to perceive and explore its texture, shape, and other features, facilitating grasping and manipulation. While the relevance of dynamic frictional forces to sensory and motor function in the hand is well established, the way that they reflect the shape, features, and composition of touched objects is poorly understood. Haptic displays (electronic interfaces for stimulating the sense of touch) often aim to elicit the perceptual experience of touching real surfaces by delivering forces to the fingers that mimic those felt when touching real surfaces. However, the design and applications of such displays have been limited by the lack of knowledge about what forces are felt during real touch interactions. This represents a major gap in current knowledge about tactile function and haptic engineering; this dissertation addresses several aspects of that gap. The goal of this research was to measure, characterize, and model frictional forces produced by a bare finger sliding over surfaces of multiple shapes. The major contributions of this work are (1) the design and development of a sensing system for capturing fingertip motion and forces during tactile exploration of real surfaces; (2) measurement and characterization of contact forces and the deformation of finger tissues during sliding over relief surfaces; (3) the development of a low-order model of frictional force production based on surface specifications; (4) the analysis and modeling of contact geometry, interfacial mechanics, and their effects on frictional force production during tactile exploration of relief surfaces. This research aims to guide the design of algorithms for the haptic rendering of surface textures and shape. Such algorithms can be used to enhance human-machine interfaces, such as touch-screen displays, by (1) enabling users to feel surface characteristics also presented visually; (2) facilitating interaction with these devices; and (3) reducing the need for visual input to interact with them. Ph.D., Electrical Engineering -- Drexel University, 201

    The virtual haptic back: A simulation for training in palpatory diagnosis

    Background: Models and simulations are finding increased roles in medical education. The Virtual Haptic Back (VHB) is a virtual reality simulation of the mechanical properties of the human back designed as an aid to teaching clinical palpatory diagnosis.
    Methods: Eighty-nine first year medical students of the Ohio University College of Osteopathic Medicine carried out six 15-minute practice sessions with the VHB, plus tests before and after the sessions, in order to monitor progress in identifying regions of simulated abnormal tissue compliance. Students palpated with two digits, fingers or thumbs, by placing them in gimbaled thimbles at the ends of PHANToM 3.0® haptic interface arms. The interface simulated the contours and compliance of the back surface by the action of electric motors. The motors limited the compression of the virtual tissues induced by the palpating fingers by generating counterforces. Users could see the position of their fingers with respect to the back on a video monitor just behind the plane of the haptic back. The abnormal region varied randomly among 12 locations between trials. During the practice sessions student users received immediate feedback following each trial, indicating either a correct choice or the actual location of the abnormality if an incorrect choice had been made. This allowed the user to feel the actual abnormality before going on to the next trial. Changes in accuracy, speed and Weber fraction across practice sessions were analyzed using a repeated measures analysis of variance.
    Results: Students improved in accuracy and speed of diagnosis with practice. The smallest difference in simulated tissue compliance users were able to detect improved from 28% (SD = 9.5%) to 14% (SD = 4.4%) during the practice sessions, while average detection time decreased from 39 (SD = 19.8) to 17 (SD = 11.7) seconds. When asked in anonymous evaluation questionnaires if they judged the VHB practice to be helpful to them in the clinical palpation and manual medicine laboratory, 41% said yes, 51% said maybe, and 8% said no.
    Conclusion: The VHB has potential value as a teaching aid for students in the initial phases of learning palpatory diagnosis.
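    For reference, the Weber fraction reported in the Results is the just-noticeable difference in compliance relative to the baseline compliance (a standard definition; the baseline compliance value itself is not stated in the abstract):

        W = \frac{\Delta C}{C}, \qquad W_\text{pre-practice} = 0.28 \longrightarrow W_\text{post-practice} = 0.14

    so the smallest detectable compliance difference roughly halved over the six practice sessions.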

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They were organized in topical sections as follows: haptic science; haptic technology; and haptic applications.

    Multisensory integration: does haptics improve tumour delineation?

    The ability to use touch in addition to vision when searching for anomalies and differences in texture is well known to be beneficial to human perception in general. The aim of this thesis is to evaluate the potential benefit of using a haptic signal in conjunction with visual images to improve detection and delineation of tumours in medical imaging data. One of the key issues with tumour delineation in the field today is the inter-clinician variance in delineating tumours for diagnostics and treatment, where even clinicians who have similar sensitivity and precision levels tend to delineate widely different underlying shapes. Through three experiments we investigate whether the ability to touch a medical image improves tumour delineation. In the first experiment, we show that combined visuohaptic cues significantly improve performance for signal detection of a 2D Gaussian embedded in a noisy background. In the second experiment, we found that the relative dissimilarity of different images per modality did not systematically decrease precision in a two-alternative forced choice (2AFC) slant discrimination task, in a spatially coaligned visuohaptic rig. In the third and final experiment, we found that observers are significantly better at delineating generated ‘tumours’ in synthetic ‘medical images’ when the haptic representation of the image is present compared to drawing on a flat surface, in a spatially coaligned visuohaptic rig.
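    To make the first experiment's stimulus concrete, the sketch below generates the kind of target described: a 2D Gaussian "signal" embedded in a noisy background, together with a noise-only image for comparison. The image size, blob amplitude and width, and noise level are illustrative assumptions, not parameters reported in the thesis.

        # Hypothetical sketch of a 2D Gaussian signal embedded in Gaussian noise,
        # as used in visual/visuohaptic signal-detection tasks. All parameter
        # values here are illustrative assumptions.
        import numpy as np

        def gaussian_in_noise(size=128, amplitude=0.5, sigma=8.0, noise_sd=1.0, rng=None):
            """Return a signal-plus-noise image and a noise-only image."""
            rng = np.random.default_rng() if rng is None else rng
            y, x = np.mgrid[0:size, 0:size]
            cx = cy = size / 2
            blob = amplitude * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
            signal_image = blob + rng.normal(0.0, noise_sd, (size, size))
            noise_image = rng.normal(0.0, noise_sd, (size, size))
            return signal_image, noise_image

        signal_image, noise_image = gaussian_in_noise()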

    Tactile Arrays for Virtual Textures

    This thesis describes the development of three new tactile stimulators for active touch, i.e. devices to deliver virtual touch stimuli to the fingertip in response to exploratory movements by the user. All three stimulators are designed to provide spatiotemporal patterns of mechanical input to the skin via an array of contactors, each under individual computer control. Drive mechanisms are based on piezoelectric bimorphs in a cantilever geometry. The first of these is a 25-contactor array (5 × 5 contactors at 2 mm spacing). It is a rugged design with a compact drive system and is capable of producing strong stimuli when running from low-voltage supplies. Combined with a PC mouse, it can be used for active exploration tasks. Pilot studies were performed which demonstrated that subjects could successfully use the device for discrimination of line orientation, simple shape identification and line following tasks. A 24-contactor stimulator (6 × 4 contactors at 2 mm spacing) with improved bandwidth was then developed. This features control electronics designed to transmit arbitrary waveforms to each channel (generated on-the-fly, in real time) and software for rapid development of experiments. It is built around a graphics tablet, giving high-precision position capability over a large 2D workspace. Experiments using two-component stimuli (components at 40 Hz and 320 Hz) indicate that spectral balance within active stimuli is discriminable independent of overall intensity, and that the spatial variation (texture) within the target is easier to detect at 320 Hz than at 40 Hz. The third system developed (again 6 × 4 contactors at 2 mm spacing) was a lightweight modular stimulator developed for fingertip and thumb grasping tasks; furthermore, it was integrated with force-feedback on each digit and a complex graphical display, forming a multi-modal Virtual Reality device for the display of virtual textiles. It is capable of broadband stimulation with real-time generated outputs derived from a physical model of the fabric surface. In an evaluation study, virtual textiles generated from physical measurements of real textiles were ranked in categories reflecting key mechanical and textural properties. The results were compared with a similar study performed on the real fabrics from which the virtual textiles had been derived. There was good agreement between the ratings of the virtual textiles and the real textiles, indicating that the virtual textiles are a good representation of the real textiles and that the system is delivering appropriate cues to the user.
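    The two-component stimuli described above can be pictured with a simple sketch like the one below, which mixes 40 Hz and 320 Hz components with an adjustable spectral balance while holding the overall peak intensity fixed. The sample rate, duration, and normalisation scheme are illustrative assumptions, not details of the actual drive electronics.

        # Hypothetical sketch of a two-component drive waveform (40 Hz + 320 Hz)
        # with independent control of spectral balance and overall intensity.
        # Sample rate, duration, and normalisation are illustrative assumptions.
        import numpy as np

        def two_component_stimulus(balance, intensity=1.0, duration=0.5, fs=8000):
            """balance in [0, 1]: 0 = all 40 Hz, 1 = all 320 Hz; intensity sets the peak."""
            t = np.arange(0, duration, 1.0 / fs)
            low = np.sin(2 * np.pi * 40 * t)          # 40 Hz component
            high = np.sin(2 * np.pi * 320 * t)        # 320 Hz component
            wave = (1.0 - balance) * low + balance * high
            return intensity * wave / np.max(np.abs(wave))   # keep peak amplitude = intensity

        drive = two_component_stimulus(balance=0.3, intensity=0.8)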

    Digital Fabrication Approaches for the Design and Development of Shape-Changing Displays

    Interactive shape-changing displays enable dynamic representations of data and information through physically reconfigurable geometry. The actuated physical deformations of these displays can be utilised in a wide range of new application areas, such as dynamic landscape and topographical modelling, architectural design, physical telepresence and object manipulation. Traditionally, shape-changing displays have a high development cost in mechanical complexity, technical skills and time/finances required for fabrication. There is still a limited number of robust shape-changing displays that go beyond one-off prototypes. Specifically, there is limited focus on low-cost/accessible design and development approaches involving digital fabrication (e.g. 3D printing). To address this challenge, this thesis presents accessible digital fabrication approaches that support the development of shape-changing displays with a range of application examples, such as physical terrain modelling and interior design artefacts. Both laser cutting and 3D printing methods have been explored to ensure generalisability and accessibility for a range of potential users. The first design-led content generation explorations show that novice users, from the general public, can successfully design and present their own application ideas using the physical animation features of the display. By engaging with domain experts in designing shape-changing content to represent data specific to their work domains, the thesis was able to demonstrate the utility of shape-changing displays beyond novel systems and describe practical use-case scenarios and applications through rapid prototyping methods. This thesis then demonstrates new ways of designing and building shape-changing displays that go beyond current implementation examples (e.g. pin arrays and continuous-surface shape-changing displays). To achieve this, the thesis demonstrates how laser cutting and 3D printing can be utilised to rapidly fabricate deformable surfaces for shape-changing displays with embedded electronics. The thesis concludes with a discussion of research implications and future directions for this work.

    Haptic and Audio-visual Stimuli: Enhancing Experiences and Interaction
