7,721 research outputs found

    Haptic Glove and Platform with Gestural Control For Neuromorphic Tactile Sensory Feedback In Medical Telepresence

    Advancements in the study of the human sense of touch are fueling the field of haptics. This is paving the way for augmenting sensory perception during object palpation in tele-surgery and for reproducing the sensed information through tactile feedback. Here, we present a novel tele-palpation apparatus that enables the user to detect nodules of distinct stiffness buried in an ad-hoc polymeric phantom. The contact force measured by the platform was encoded using a neuromorphic model and reproduced on the index fingertip of a remote user through a haptic glove embedding a piezoelectric disk. We assessed the effectiveness of this feedback in allowing nodule identification under two experimental conditions of real-time telepresence: In Line of Sight (ILS), where the platform was placed within the user's visible range, and the more demanding Not In Line of Sight (NILS), where the platform and the user were 50 km apart. We found that the percentage of identification was higher for stiffer inclusions than for softer ones (an average of 74% within the duration of the task) in both telepresence conditions evaluated. These promising results call for further exploration of tactile augmentation technology for telepresence in medical interventions.
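    The abstract does not specify which neuromorphic model was used, so the sketch below illustrates the general encoding idea with a simple leaky integrate-and-fire (LIF) neuron that converts a sampled contact-force trace into spike times, which could in turn gate drive pulses to a piezoelectric disk. All parameter values and function names are illustrative assumptions, not values from the paper.

import numpy as np

def lif_encode(force, dt=1e-3, tau=0.02, gain=150.0, v_thresh=1.0, v_reset=0.0):
    """Encode a sampled force trace [N] as spike times [s] with an LIF neuron."""
    v, spike_times = 0.0, []
    for i, f in enumerate(force):
        v += dt * (-v / tau + gain * f)   # leaky integration of force-driven input
        if v >= v_thresh:                 # threshold crossing -> emit a spike
            spike_times.append(i * dt)
            v = v_reset                   # reset the membrane potential
    return np.asarray(spike_times)

# A stiff inclusion yields a steeper force ramp during palpation than a soft one,
# and hence a denser spike train delivered to the fingertip actuator.
t = np.arange(0.0, 1.0, 1e-3)
stiff_spikes = lif_encode(2.0 * t)   # steep force ramp (stiff nodule)
soft_spikes = lif_encode(0.5 * t)    # shallow force ramp (soft nodule)
print(len(stiff_spikes), "spikes (stiff) vs", len(soft_spikes), "spikes (soft)")

    In this toy setup, a stiffer inclusion produces a steeper force ramp and therefore a higher spike rate, which is the kind of difference the remote user would feel through the glove.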

    Distributed functions of detection and discrimination of vibrotactile stimuli in the hierarchical human somatosensory system

    According to the hierarchical view of the human somatosensory network, somatic sensory information is relayed from the thalamus to the primary somatosensory cortex (SI) and then distributed to adjacent cortical regions to perform further perceptual and cognitive functions. Although a number of neuroimaging studies have examined neuronal activity correlated with tactile stimuli, comparatively less attention has been devoted to understanding how vibrotactile stimulus information is processed in the hierarchical somatosensory cortical network. To explore the hierarchical perspective of tactile information processing, we studied two cases: (a) discrimination between the locations of finger stimulation; and (b) detection of stimulation against no stimulation on individual fingers, using both standard general linear model (GLM) and searchlight multi-voxel pattern analysis (MVPA) techniques. Both cases were studied on the same data set, obtained from a passive vibrotactile stimulation experiment. Our results showed that vibrotactile stimulus locations on the fingers could be discriminated from human functional magnetic resonance imaging (fMRI) measurements. In particular, in case (a) we observed activity in the contralateral posterior parietal cortex (PPC) and supramarginal gyrus (SMG) but not in SI, while in case (b) we found significant cortical activations in SI but not in PPC or SMG. These discrepant observations suggest functional specialization with regard to vibrotactile stimulus locations and, in particular, hierarchical information processing in the human somatosensory cortical areas. Our findings further support the general understanding that SI is the main sensory receptive area for the sense of touch, and that adjacent cortical regions (i.e., PPC and SMG) are in charge of a higher level of processing and may thus contribute most to the successful classification of stimulated finger locations.
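    As a rough illustration of the decoding step, the sketch below runs a cross-validated linear support vector machine on trial-wise voxel patterns to classify which finger was stimulated. The study used a whole-brain searchlight; here a single region-of-interest pattern matrix with synthetic data is assumed, so the numbers are placeholders rather than results from the experiment.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n_trials_per_finger, n_fingers, n_voxels = 40, 4, 200

# Synthetic trial-wise activation patterns: a weak finger-specific signal plus noise.
signal = rng.normal(size=(n_fingers, n_voxels))
X = np.vstack([
    rng.normal(size=(n_trials_per_finger, n_voxels)) + 0.3 * signal[f]
    for f in range(n_fingers)
])
y = np.repeat(np.arange(n_fingers), n_trials_per_finger)   # stimulated-finger labels

# Cross-validated classification accuracy; accuracy above chance (1/n_fingers)
# indicates that the region's pattern carries information about stimulus location.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(SVC(kernel="linear"), X, y, cv=cv)
print(f"decoding accuracy {scores.mean():.2f} vs chance {1 / n_fingers:.2f}")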

    Push to know! -- Visuo-Tactile based Active Object Parameter Inference with Dual Differentiable Filtering

    For robotic systems to interact with objects in dynamic environments, it is essential to perceive the physical properties of the objects, such as shape, friction coefficient, mass, center of mass, and inertia. This not only eases the selection of manipulation actions but also ensures that the task is performed as desired. However, estimating the physical properties of objects, especially novel ones, is a challenging problem with either vision or tactile sensing. In this work, we propose a novel framework to estimate key object parameters through non-prehensile manipulation using vision and tactile sensing. Our active dual differentiable filtering (ADDF) approach, part of this framework, learns the object-robot interaction during non-prehensile object pushes to infer the object's parameters. The proposed method enables the robotic system to employ vision and tactile information to interactively explore a novel object via non-prehensile pushing. The novel N-step active formulation within the differentiable filter facilitates efficient learning of the object-robot interaction model and, during inference, the selection of the next best exploratory push actions (where to push and how to push). We extensively evaluated our framework in simulated and real-robot scenarios, yielding superior performance to the state-of-the-art baseline.
    Comment: 8 pages. Accepted at IROS 202
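    The abstract does not detail the ADDF equations, so the sketch below substitutes a hand-written one-step push model and an EKF-style update for the learned, differentiable filter, purely to illustrate the loop the abstract describes: update the belief over object parameters after each push, then choose the next push expected to shrink that belief the most. The model, noise values, and function names are all assumptions for illustration.

import numpy as np

def predict_slide(theta, push_force):
    """Toy push model (assumed): sliding distance from mass and friction."""
    mass, mu = theta
    g, dt = 9.81, 0.5                        # gravity, push duration [s]
    accel = max(push_force / mass - mu * g, 0.0)
    return 0.5 * accel * dt ** 2             # distance the object slides [m]

def jacobian(theta, push_force, eps=1e-5):
    """Numerical Jacobian of the push model w.r.t. the parameters (1x2)."""
    return np.array([[(predict_slide(theta + d, push_force)
                       - predict_slide(theta - d, push_force)) / (2 * eps)
                      for d in np.eye(2) * eps]])

def ekf_update(theta, P, push_force, observed_slide, R=1e-4):
    """EKF-style measurement update of the parameter belief (theta, P)."""
    H = jacobian(theta, push_force)
    K = P @ H.T / (H @ P @ H.T + R)                      # Kalman gain (2x1)
    theta = theta + (K * (observed_slide - predict_slide(theta, push_force))).ravel()
    P = (np.eye(2) - K @ H) @ P
    return theta, P

def next_best_push(theta, P, candidates, R=1e-4):
    """Choose the push whose expected posterior covariance trace is smallest."""
    def expected_trace(f):
        H = jacobian(theta, f)
        K = P @ H.T / (H @ P @ H.T + R)
        return np.trace((np.eye(2) - K @ H) @ P)
    return min(candidates, key=expected_trace)

# Belief over (mass [kg], friction coefficient); the "true" object is hidden.
rng = np.random.default_rng(1)
theta, P = np.array([1.0, 0.5]), np.diag([0.25, 0.05])
true_theta = np.array([1.6, 0.35])
for _ in range(5):
    f = next_best_push(theta, P, candidates=[2.0, 5.0, 10.0, 20.0])   # how hard to push
    z = predict_slide(true_theta, f) + rng.normal(0.0, 1e-3)          # noisy observed slide
    theta, P = ekf_update(theta, P, f, z)
print("estimated (mass, friction):", np.round(theta, 2))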

    Doctor of Philosophy

    Virtual environments provide a consistent and relatively inexpensive method of training individuals. They often include haptic feedback in the form of forces applied to a manipulandum or thimble to provide a more immersive and educational experience. However, the limited haptic feedback provided in these systems tends to be restrictive and frustrating to use. Providing tactile feedback in addition to this kinesthetic feedback can enhance the user's ability to manipulate and interact with virtual objects while providing a greater level of immersion. This dissertation advances the state of the art by providing a better understanding of tactile feedback and by advancing combined tactile-kinesthetic systems. The tactile feedback described in this dissertation is provided by a finger-mounted device called the contact location display (CLD). Rather than displaying the entire contact surface, the device feeds back information only about the center of contact between the user's finger and a virtual surface. In prior work, the CLD used specialized two-dimensional environments to provide smooth tactile feedback. Using polygonal environments would greatly enhance the device's usefulness; however, the surface discontinuities created by the facets of these models are rendered through the CLD, regardless of traditional force-shading algorithms. To address this issue, a haptic shading algorithm was developed to provide smooth tactile and kinesthetic interaction with general polygonal models. Two experiments were used to evaluate the shading algorithm. To better understand the design requirements of tactile devices, three separate experiments were run to evaluate the perception thresholds for cue localization, backlash, and system delay. These experiments establish quantitative design criteria for tactile devices, which can serve as the maximum (i.e., most demanding) device specifications for tactile-kinesthetic haptic systems in which the user experiences tactile feedback as a function of his or her limb motions. Lastly, a revision of the CLD was constructed and evaluated. By taking the newly established design criteria into account, the CLD device became smaller and lighter while providing a full two-degree-of-freedom workspace that covers the bottom hemisphere of the finger. Two simple manipulation experiments were used to evaluate the new CLD device.
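    The abstract contrasts the new haptic shading algorithm with traditional force shading. As background, the sketch below shows the normal-interpolation step at the heart of traditional force shading: per-vertex normals are blended with barycentric weights so the rendered normal varies smoothly across a facet, analogous to Phong shading in graphics. This is not the dissertation's algorithm, which additionally smooths the tactile (contact-location) feedback; the data and names here are illustrative.

import numpy as np

def barycentric(p, a, b, c):
    """Barycentric coordinates of point p in triangle (a, b, c)."""
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    return np.array([1.0 - v - w, v, w])

def shaded_normal(p, verts, vert_normals):
    """Smoothly interpolated surface normal at contact point p on a triangle."""
    u, v, w = barycentric(p, *verts)
    n = u * vert_normals[0] + v * vert_normals[1] + w * vert_normals[2]
    return n / np.linalg.norm(n)

# One facet of a coarse sphere approximation: compare the flat facet normal
# with the interpolated (shaded) normal at a contact point inside the facet.
verts = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
vert_normals = [v / np.linalg.norm(v) for v in verts]   # true sphere normals at the vertices
contact = np.array([0.5, 0.3, 0.2])                     # contact point on the facet
print("facet normal  :", np.round(np.ones(3) / np.sqrt(3), 3))
print("shaded normal :", np.round(shaded_normal(contact, verts, vert_normals), 3))

    Because the shaded normal changes continuously as the contact point moves across facet boundaries, the force (and, in the dissertation's extension, the contact-location cue) no longer jumps at polygon edges.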