
    Haptic Glove and Platform with Gestural Control For Neuromorphic Tactile Sensory Feedback In Medical Telepresence

    Advancements in the study of the human sense of touch are fueling the field of haptics. This is paving the way for augmenting sensory perception during object palpation in tele-surgery and reproducing the sensed information through tactile feedback. Here, we present a novel tele-palpation apparatus that enables the user to detect nodules of distinct stiffness values buried in an ad-hoc polymeric phantom. The contact force measured by the platform was encoded using a neuromorphic model and reproduced on the index fingertip of a remote user through a haptic glove embedding a piezoelectric disk. We assessed the effectiveness of this feedback in allowing nodule identification under two experimental conditions of real-time telepresence: In Line of Sight (ILS), where the platform was placed in the visible range of the user, and the more demanding Not In Line of Sight (NILS), with the platform and the user being 50 km apart. We found that the percentage of correct identifications was higher for stiffer inclusions than for softer ones (average of 74% within the duration of the task) in both telepresence conditions. These promising results call for further exploration of tactile augmentation technology for telepresence in medical interventions.
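
    The force-to-spike encoding described above can be illustrated with a leaky integrate-and-fire (LIF) neuron driven by the measured contact force. The following is a minimal sketch under assumed parameter values, not the authors' exact neuromorphic model; the force traces are hypothetical.

```python
import numpy as np

def force_to_spikes(force, dt=1e-3, tau=0.02, gain=50.0, threshold=0.3):
    """Convert a sampled contact-force trace (N) into a binary spike train."""
    v = 0.0
    spikes = np.zeros(len(force), dtype=bool)
    for i, f in enumerate(force):
        # Leaky integration of the force-driven input current.
        v += dt * (-v / tau + gain * f)
        if v >= threshold:       # threshold crossing -> emit a spike
            spikes[i] = True
            v = 0.0              # reset the membrane potential
    return spikes

# A stiffer nodule yields a larger indentation force and hence a denser
# spike train driving the piezoelectric disk on the fingertip.
t = np.arange(0.0, 1.0, 1e-3)
soft_force = 0.5 + 0.1 * np.sin(2 * np.pi * 2 * t)    # hypothetical traces (N)
stiff_force = 1.5 + 0.1 * np.sin(2 * np.pi * 2 * t)
print(force_to_spikes(soft_force).sum(), force_to_spikes(stiff_force).sum())
```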

    A meta-learning algorithm for respiratory flow prediction from FBG-based wearables in unrestrained conditions

    The continuous monitoring of an individual's breathing can be an instrument for the assessment and enhancement of human wellness. Specific respiratory features are unique markers of the deterioration of a health condition, the onset of a disease, fatigue and stressful circumstances. The early and reliable prediction of high-risk situations can result in the implementation of appropriate intervention strategies that might be lifesaving. Hence, smart wearables for the monitoring of continuous breathing have recently been attracting the interest of many researchers and companies. However, most of the existing approaches do not provide comprehensive respiratory information. For this reason, a meta-learning algorithm based on LSTM neural networks for inferring the respiratory flow from a wearable system embedding FBG sensors and inertial units is herein proposed. Several conventional machine learning approaches were also implemented for comparison. The meta-learning algorithm turned out to be the most accurate in predicting respiratory flow when new subjects were considered. Furthermore, the memory capability of the LSTM model proved advantageous for capturing relevant aspects of the breathing pattern. The algorithms were tested under different conditions, both static and dynamic, and with more unobtrusive device configurations. The meta-learning results demonstrated that a short one-time calibration may provide subject-specific models that predict the respiratory flow with high accuracy, even when the number of sensors is reduced. Flow RMS errors on the test set ranged from 22.03 L/min, when the minimum number of sensors was considered, to 9.97 L/min for the complete setting (target flow range: 69.231 ± 21.477 L/min). The correlation coefficient r between the target and the predicted flow changed accordingly, being higher (r = 0.9) for the most comprehensive and heterogeneous wearable device configuration. Similar results were achieved even with simpler settings that included the thoracic sensors (r ranging from 0.84 to 0.88; test flow RMSE = 10.99 L/min when exclusively using the thoracic FBGs). The further estimation of respiratory parameters, i.e., rate and volume, with low errors across different breathing behaviors and postures proved the potential of such an approach. These findings lay the foundation for the implementation of reliable custom solutions and more sophisticated artificial intelligence-based algorithms for daily-life health-related applications.
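
    A minimal sketch of the kind of LSTM regressor the abstract refers to, assuming PyTorch is available: the channel count, window length, architecture and data are illustrative, and the meta-learning/calibration stage is omitted. RMSE and Pearson's r are computed as in the reported evaluation.

```python
import torch
import torch.nn as nn

class FlowLSTM(nn.Module):
    """Map windows of wearable-sensor channels to a respiratory-flow value."""
    def __init__(self, n_channels=12, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)       # predicted flow in L/min

    def forward(self, x):                      # x: (batch, time, channels)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)

model = FlowLSTM()
x = torch.randn(8, 100, 12)                    # hypothetical sensor windows
y = 69.0 + 20.0 * torch.randn(8)               # hypothetical target flow (L/min)
pred = model(x)

rmse = torch.sqrt(torch.mean((pred - y) ** 2))        # RMSE in L/min
r = torch.corrcoef(torch.stack([pred, y]))[0, 1]      # Pearson's r
print(f"RMSE: {rmse.item():.2f} L/min, r: {r.item():.2f}")
```

    In the paper's meta-learning setting, a model of this kind would first be trained across subjects and then adapted to a new subject with a short one-time calibration recording.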

    Neuromorphic haptic glove and platform with gestural control for tactile sensory feedback in medical telepresence applications

    This paper presents a tactile telepresence system employed for the localization of stiff inclusions embedded in a soft matrix. The system delivers neuromorphic spike-based haptic feedback, encoding object stiffness, to the human fingertip. For the evaluation of the developed system, a customized silicone phantom was fabricated by inserting 12 inclusions made of 4 different polymers (3 replicas for each material). These inclusions, all of the same shape, were encapsulated in a softer silicone matrix in randomized positions. The experimental setup comprised two main blocks. The first sub-setup included an optical sensor for tracking human hand movements and a piezoelectric disk, inserted into a glove at the level of the index fingertip, to deliver tactile feedback. The second sub-setup was a 3-axis Cartesian motorized sensing platform that explored the silicone phantom through a spherical indenter mechanically linked to a load cell. The movements of the platform were driven by the acquired hand gestures of the user. The normal force exerted during active sliding was converted into temporal patterns of spikes through a neuronal model and delivered to the fingertip via the vibrotactile glove. Inclusions were detected through modulations of these spike patterns generated during the experimental trials. Results suggest that the presented system allows the recognition of the stiffness variation between the encapsulated inclusions and the surrounding matrix. As expected, stiffer inclusions were more frequently discriminated than softer ones, with about 70% of stiffer inclusions being identified in the proposed task. Future work will address the investigation of a larger set of materials in order to evaluate a finer distribution of stiffness values.
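
    As a rough illustration of the detection step, the sketch below flags an inclusion wherever the local spike rate rises above the baseline rate produced by the softer surrounding matrix; the window length and threshold factor are hypothetical, not values from the paper.

```python
import numpy as np

def detect_inclusions(spikes, dt=1e-3, window=0.1, factor=1.5):
    """Mark samples where the local spike rate exceeds `factor` times the
    median (baseline) rate produced by the soft matrix."""
    win = int(window / dt)
    kernel = np.ones(win) / (win * dt)          # converts counts to spikes/s
    rate = np.convolve(spikes.astype(float), kernel, mode="same")
    baseline = np.median(rate)
    return rate > factor * baseline

# Usage idea: feed the spike train produced while the indenter slides over the
# phantom; samples flagged True correspond to candidate inclusion locations.
```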

    Terrain recognition using neuromorphic haptic feedback

    Recent years have witnessed relevant advancements in the quality of life of people with lower-limb amputations thanks to technological developments in prosthetics. However, prostheses providing information about the foot-ground interaction, in particular about irregularities in terrain structure, are still missing from the market. Lacking tactile feedback from the foot surface, subjects might step onto uneven terrain without noticing, increasing the risk of falling. Here, this issue is addressed by evaluating, in intact subjects, a biomimetic unilateral vibrotactile haptic feedback conveying information about discrete gait events and terrain features based on the readings of an integrated insole. After briefly experiencing both even and uneven terrains, subjects discriminated them with an accuracy of 87.5%, relying solely on the replay of the vibrotactile feedback. Via a machine learning approach, we estimated that the subjects achieved such performance by exploiting a temporal resolution of 45 ms. This work is a step forward in enabling lower-limb amputees to appreciate floor conditions while walking, allowing them to adapt their gait and promoting more confident use of the artificial limb.
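
    The machine-learning analysis of the replayed feedback can be sketched as a window-based classification problem. The code below is a hypothetical illustration (synthetic data, simple hand-crafted features, logistic regression), not the paper's actual pipeline; varying the window length mimics probing the temporal resolution at which the feedback remains discriminable.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def make_windows(signal, labels, fs=1000, win_ms=45):
    """Cut a 1-D feedback signal into fixed-length windows and compute two
    simple per-window features (mean absolute amplitude and variance)."""
    win = int(fs * win_ms / 1000)
    n = len(signal) // win
    seg = signal[: n * win].reshape(n, win)
    X = np.column_stack([np.abs(seg).mean(axis=1), seg.var(axis=1)])
    y = labels[: n * win].reshape(n, win).mean(axis=1).round()
    return X, y

# Synthetic stand-in for the replayed feedback: uneven terrain adds extra
# vibration bursts on top of the baseline gait-event cues.
t = np.arange(0.0, 60.0, 1e-3)
terrain = (t % 20 >= 10).astype(float)          # alternating even/uneven
signal = rng.normal(0, 0.1, t.size) + terrain * rng.normal(0, 0.4, t.size)

for win_ms in (15, 45, 150):
    X, y = make_windows(signal, terrain, win_ms=win_ms)
    acc = cross_val_score(LogisticRegression(), X, y, cv=5).mean()
    print(f"{win_ms} ms windows -> cross-validated accuracy {acc:.2f}")
```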
