2 research outputs found

    Finding a Viable Neural Network Architecture for Use with Upper Limb Prosthetics

    This paper attempts to answer the question of whether it is possible to produce a simple, quick, and accurate neural network for use in upper-limb prosthetics. Through the implementation of convolutional and artificial neural networks and feature extraction on electromyographic data, different possible architectures are examined with regard to processing time, complexity, and accuracy. The most accurate architecture is found to be a multi-entry categorical cross-entropy convolutional neural network, which reaches 100% accuracy; however, it is also the slowest method, requiring 9 minutes to run. The next best method is a single-entry binary cross-entropy convolutional neural network, which reaches an accuracy of about 95% in as little as 5 minutes. While these run times are high for this research, they are still considerably faster than those reported in some previous studies. These methods show promise for popularizing machine learning algorithms in commercial prosthetics, where they remain uncommon.
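    To make the comparison concrete, the following is a minimal Python (Keras) sketch of a 1-D convolutional network for classifying windows of electromyographic data with categorical cross-entropy, in the spirit of the architectures compared above. The layer sizes, window length, channel count, and class count are assumptions for illustration only, not the paper's actual architecture or data.

# Illustrative sketch, not the paper's architecture: a minimal 1-D CNN for
# classifying windows of surface-EMG data with categorical cross-entropy.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CHANNELS = 8      # assumed number of EMG electrodes
WINDOW_SAMPLES = 200  # assumed samples per EMG window
NUM_GESTURES = 6      # assumed number of gesture classes

model = models.Sequential([
    layers.Input(shape=(WINDOW_SAMPLES, NUM_CHANNELS)),
    layers.Conv1D(32, kernel_size=5, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_GESTURES, activation="softmax"),
])

# Categorical cross-entropy for multi-class gesture labels; replacing the
# final layer with a single sigmoid unit and the loss with binary
# cross-entropy gives a two-class variant like the one discussed above.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy arrays standing in for preprocessed EMG windows and gesture labels.
X = np.random.randn(1000, WINDOW_SAMPLES, NUM_CHANNELS).astype("float32")
y = np.random.randint(0, NUM_GESTURES, size=1000)
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)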

    Evoked Somatosensory Feedback for Closed-Loop Control of Prosthetic Hand

    Somatosensory feedback, such as tactile and proprioceptive feedback, is essential to our daily sensorimotor tasks. A lack of sensory information limits meaningful human-machine interactions. Different somatosensory feedback strategies have been developed in recent years. Non-invasive sensory substitution approaches often evoke sensations that are unintuitive, requiring extensive sensory training. Alternatively, invasive neural stimulation can elicit intuitive percepts that are readily interpretable by prosthetic hand users; however, the invasive nature of the procedure limits wide clinical application. To overcome these issues, we developed a multimodal sensory feedback approach that delivers tactile and proprioceptive information non-invasively. We used a skin-surface nerve stimulation array to target afferent fibers in the peripheral nerves, which can elicit intuitive tactile feedback at the fingertips, and a vibrotactile array to deliver proprioceptive percepts encoding kinematic information of prosthetic joints. First, we evaluated whether the peripheral nerve stimulation technique could be used for the recognition of object properties. Evoked tactile sensations were modulated using forces recorded by a sensorized prosthesis not actively controlled by the users. We demonstrated that the elicited tactile sensations at the fingertips can enable recognition of object shape and surface topology. Second, we evaluated how evoked tactile feedback can be integrated into the functional use of a prosthetic hand. We quantified the benefits of tactile feedback under different myoelectric control strategies while participants performed an object manipulation task, and showed an improved task success rate and reduced muscle activation effort when tactile feedback was provided. Finally, we investigated whether multimodal (tactile and proprioceptive) feedback can enable the recognition of more complex object properties during active control of a prosthetic hand. We found that integrated tactile and proprioceptive feedback allowed simultaneous recognition of multiple object properties (size and stiffness) in individuals with and without an arm amputation. Overall, this work demonstrates that artificially evoked somatosensory feedback can be used effectively to improve the closed-loop control of prostheses. These outcomes highlight the critical role of somatosensory feedback during human-machine interactions, which can enhance the functional utility of prosthetic devices and promote user experience and confidence.
    Doctor of Philosophy
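    As a rough analogue of the multimodal encoding described above, the following Python sketch maps prosthesis sensor readings to feedback commands: fingertip force to a nerve-stimulation amplitude (tactile) and a joint angle to a vibrotactile intensity (proprioceptive). All function names, ranges, and the linear mappings are hypothetical assumptions for illustration; the actual stimulation hardware and encoding schemes used in the work are not reproduced here.

# Illustrative sketch only: one step of a closed-loop feedback encoding,
# with assumed sensor ranges and simple linear mappings.

def encode_tactile(force_n: float,
                   max_force_n: float = 10.0,
                   max_current_ma: float = 3.0) -> float:
    """Map fingertip force (N) to a nerve-stimulation amplitude (mA)."""
    force_n = max(0.0, min(force_n, max_force_n))
    return (force_n / max_force_n) * max_current_ma

def encode_proprioceptive(joint_angle_deg: float,
                          max_angle_deg: float = 90.0) -> float:
    """Map a prosthetic joint angle (deg) to vibrotactile intensity (0-1)."""
    joint_angle_deg = max(0.0, min(joint_angle_deg, max_angle_deg))
    return joint_angle_deg / max_angle_deg

# Example closed-loop step: read (hypothetical) sensors, compute commands.
sensor_readings = {"fingertip_force_n": 4.2, "mcp_angle_deg": 55.0}
stim_amplitude = encode_tactile(sensor_readings["fingertip_force_n"])
vibro_intensity = encode_proprioceptive(sensor_readings["mcp_angle_deg"])
print(f"stimulation amplitude: {stim_amplitude:.2f} mA, "
      f"vibrotactile intensity: {vibro_intensity:.2f}")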