
    Neuromorphic hardware for somatosensory neuroprostheses

    In individuals with sensory-motor impairments, missing limb functions can be restored using neuroprosthetic devices that directly interface with the nervous system. However, restoring the natural tactile experience through electrical neural stimulation requires complex encoding strategies, and current strategies are limited by bandwidth constraints in how effectively they convey tactile sensations. Neuromorphic technology, which mimics the natural behavior of neurons and synapses, holds promise for replicating the encoding of natural touch and could inform the design of neurostimulation. In this perspective, we propose that incorporating neuromorphic technologies into neuroprostheses could be an effective approach for developing more natural human-machine interfaces, potentially leading to advances in device performance, acceptability, and embeddability. We also highlight ongoing challenges and the actions required to facilitate the future integration of these advanced technologies.
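    As an illustration of the encoding idea, the sketch below converts a fingertip pressure signal into a spike train with a leaky integrate-and-fire (LIF) neuron model; such spike times could, in principle, parameterize the timing of stimulation pulses. This is not the paper's method, and the time constant, gain, and threshold are illustrative assumptions.

```python
# Hedged sketch: LIF encoding of a pressure signal into spike times.
# Parameters (tau, gain, threshold) are illustrative, not from the paper.
import numpy as np

def lif_encode(pressure, dt=1e-3, tau=0.02, threshold=1.0, gain=100.0):
    """Return spike times (s) of an LIF neuron driven by a normalized pressure trace."""
    v = 0.0
    spike_times = []
    for i, p in enumerate(pressure):
        # Leaky integration of the pressure input.
        v += dt * (-v / tau + gain * p)
        if v >= threshold:          # fire and reset
            spike_times.append(i * dt)
            v = 0.0
    return np.array(spike_times)

# Example: a 0.5 s pressure ramp sampled at 1 kHz.
t = np.arange(0, 0.5, 1e-3)
pressure = np.clip(t / 0.5, 0.0, 1.0)
spikes = lif_encode(pressure)
print(f"{len(spikes)} spikes, first few at {spikes[:3]} s")
```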

    A Sustainable & Biologically Inspired Prosthetic Hand for Healthcare

    Many people worldwide are affected by amputation, and upper limb amputations require high-cost prosthetic devices to provide significant motor recovery. We propose a sustainable design and control scheme for a new anthropomorphic prosthetic hand: all components are modular and exchangeable, and they can be assembled by non-expert users. The phalanges and articulations of the fingers and the palm are manufactured via 3D printing in Acrylonitrile Butadiene Styrene (ABS) or Polylactic Acid (PLA). The design is optimized to provide human-like motion and grasping taxonomy through linear actuators and flexion tendon mechanisms embedded within the palm. Open-source HardWare (HW) and SoftWare (SW) units for ElectroMyography (EMG) input and control can be combined with a user-friendly, intuitive Graphical User Interface (GUI) to enable amputees to operate the prosthesis. To reduce the environmental impact of the device's life cycle, material and energy consumption were minimized through simple design and manufacturing, high dexterity, open-source HW and SW, low-cost components, and an anthropomorphic design.
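    A minimal sketch of the kind of open-source EMG processing chain such a hand might use: rectify and low-pass filter a single EMG channel to obtain an envelope, then map the envelope proportionally to a linear-actuator flexion command. The sampling rate, filter settings, and 8-bit command range are illustrative assumptions rather than values from the paper.

```python
# Hedged sketch: EMG envelope extraction and proportional flexion command.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # Hz, assumed EMG sampling rate

def emg_to_flexion(emg, rest_level=0.05, max_level=0.6):
    """Map a raw EMG trace (arbitrary units) to a 0-255 actuator command."""
    rectified = np.abs(emg - np.mean(emg))            # remove offset, rectify
    b, a = butter(2, 5 / (FS / 2))                    # ~5 Hz low-pass envelope
    envelope = filtfilt(b, a, rectified)
    activation = np.clip((envelope - rest_level) / (max_level - rest_level), 0, 1)
    return (activation * 255).astype(np.uint8)        # proportional flexion command

# Example with synthetic EMG: quiet baseline followed by a contraction burst.
rng = np.random.default_rng(0)
emg = np.concatenate([0.05 * rng.standard_normal(1000),
                      0.50 * rng.standard_normal(1000)])
print(emg_to_flexion(emg)[::500])   # low commands at rest, high during the burst
```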

    Soft Biomimetic Finger with Tactile Sensing and Sensory Feedback Capabilities

    The compliant nature of soft fingers allows humans to manipulate objects safely and dexterously in unstructured environments. A soft prosthetic finger design with tactile sensing capabilities for texture discrimination and subsequent sensory stimulation has the potential to create a more natural experience for an amputee. In this work, a pneumatically actuated soft biomimetic finger is integrated with a textile neuromorphic tactile sensor array for a texture discrimination task. The tactile sensor outputs were converted into neuromorphic spike trains, which emulate the firing pattern of biological mechanoreceptors. Spike-based features from each taxel compressed the information and were then used as inputs for a support vector machine (SVM) classifier to differentiate the textures. Our soft biomimetic finger with neuromorphic encoding achieved an average overall classification accuracy of 99.57% over sixteen independent parameter combinations when tested on thirteen standardized textured surfaces; the sixteen combinations paired four flexion angles of the soft finger with four palpation speeds. To aid in the perception of more natural objects and their manipulation, subjects were provided with transcutaneous electrical nerve stimulation (TENS) to convey a subset of four textures with varied textural information. Three able-bodied subjects successfully distinguished two or three textures with the applied stimuli. This work paves the way toward a more human-like prosthesis: a soft biomimetic finger with texture discrimination capabilities, implemented with neuromorphic techniques, that also provides sensory feedback. Texture feedback has the potential to enhance the user experience when interacting with the surroundings, and this work showed that an inexpensive soft biomimetic finger combined with a flexible tactile sensor array can potentially help users better perceive their environment.
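    The sketch below illustrates, under assumptions that differ from the authors' setup, the general pipeline described above: each taxel signal is converted into spikes with a simple send-on-change (delta) encoder, spike counts serve as compressed features, and an SVM classifies the texture. The encoder threshold and the synthetic palpation data are stand-ins for the real sensor recordings.

```python
# Hedged sketch: delta spike encoding per taxel + SVM texture classification.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def delta_spike_count(signal, threshold=0.1):
    """Count spikes emitted whenever the signal moves more than `threshold` from its last reference."""
    count, reference = 0, signal[0]
    for s in signal[1:]:
        if abs(s - reference) > threshold:
            count += 1
            reference = s
    return count

def features(trial):
    """One spike-count feature per taxel for a palpation trial shaped (taxels, samples)."""
    return np.array([delta_spike_count(taxel) for taxel in trial])

# Synthetic stand-in: 2 textures x 50 trials, 4 taxels, 200 samples each.
rng = np.random.default_rng(0)
X, y = [], []
for label, roughness in enumerate([0.05, 0.2]):
    for _ in range(50):
        trial = np.cumsum(roughness * rng.standard_normal((4, 200)), axis=1)
        X.append(features(trial))
        y.append(label)
X, y = np.array(X), np.array(y)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(Xtr, ytr)
print("texture classification accuracy:", clf.score(Xte, yte))
```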

    The implications of embodiment for behavior and cognition: animal and robotic case studies

    In this paper, we will argue that if we want to understand the function of the brain (or the control in the case of robots), we must understand how the brain is embedded into the physical system and how the organism interacts with the real world. While embodiment has often been used in its trivial meaning, i.e. 'intelligence requires a body', the concept has deeper and more important implications, concerned with the relation between physical and information (neural, control) processes. A number of case studies are presented to illustrate the concept. They involve animals and robots and are concentrated around locomotion, grasping, and visual perception. A theoretical scheme that can be used to embed the diverse case studies will be presented. Finally, we will establish a link between low-level sensory-motor processes and cognition, present an embodied view on categorization, and propose the concepts of 'body schema' and 'forward models' as a natural extension of the embodied approach toward first representations. Comment: Book chapter in W. Tschacher & C. Bergomi, eds., 'The Implications of Embodiment: Cognition and Communication', Exeter: Imprint Academic, pp. 31-5.
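    To make the 'forward model' notion concrete, the toy sketch below predicts the sensory consequence of a motor command for a one-dimensional limb and adapts its internal parameter from the prediction error, which is one way a 'body schema' could be kept calibrated. The dynamics, gains, and noise levels are illustrative assumptions, not taken from the chapter.

```python
# Hedged sketch: a toy forward model that learns a 1-D limb's response gain.
import numpy as np

class ForwardModel:
    def __init__(self, gain_estimate=1.0):
        self.gain = gain_estimate  # internal estimate of the limb's response to a command

    def predict(self, position, command):
        # Predicted next position: the expected sensory consequence of the command.
        return position + self.gain * command

    def update(self, predicted, observed, command, learning_rate=0.1):
        # Adapt the internal model from the sensory prediction error.
        error = observed - predicted
        self.gain += learning_rate * error * command
        return error

# The 'real' limb responds with gain 1.5; the model starts at 1.0 and adapts.
rng = np.random.default_rng(0)
model, position = ForwardModel(), 0.0
for _ in range(200):
    command = rng.uniform(-1, 1)
    predicted = model.predict(position, command)
    position = position + 1.5 * command + 0.01 * rng.standard_normal()
    model.update(predicted, position, command)
print("learned gain estimate:", round(model.gain, 2))   # approaches 1.5
```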

    Hierarchical tactile sensation integration from prosthetic fingertips enables multi-texture surface recognition†

    Multifunctional flexible tactile sensors could be useful to improve the control of prosthetic hands. To that end, highly stretchable liquid metal tactile sensors (LMS) were designed, manufactured via photolithography, and incorporated into the fingertips of a prosthetic hand. Three novel contributions were made with the LMS. First, individual fingertips were used to distinguish between different speeds of sliding contact with different surfaces. Second, differences in surface textures were reliably detected during sliding contact. Third, the capacity for hierarchical tactile sensor integration was demonstrated by using four LMS signals simultaneously to distinguish between ten complex multi-textured surfaces. Four machine learning algorithms were compared for their classification performance: K-nearest neighbor (KNN), support vector machine (SVM), random forest (RF), and neural network (NN). Time-frequency features of the LMS signals were extracted to train and test the algorithms. The NN generally performed best at speed and texture detection with a single finger and achieved 99.2 ± 0.8% accuracy in distinguishing the ten multi-textured surfaces using four LMS signals from four fingers simultaneously. This capability for hierarchical multi-finger tactile sensation integration could provide a higher level of intelligence for artificial hands.
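    The sketch below mirrors that pipeline in outline, using synthetic signals in place of the liquid metal sensor recordings: time-frequency (spectrogram) features are extracted from four fingertip channels and the four classifiers named above (KNN, SVM, RF, NN) are compared. The sampling rate, window length, and test data are illustrative assumptions.

```python
# Hedged sketch: time-frequency features from four channels + classifier comparison.
import numpy as np
from scipy.signal import spectrogram
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

FS = 500  # Hz, assumed sampling rate

def tf_features(trial):
    """Mean spectral power per frequency bin, concatenated over the 4 channels."""
    feats = []
    for channel in trial:
        _, _, Sxx = spectrogram(channel, fs=FS, nperseg=64)
        feats.extend(Sxx.mean(axis=1))          # average power in each frequency bin
    return np.array(feats)

# Synthetic stand-in: two 'surfaces' that vibrate the sensors at different frequencies.
rng = np.random.default_rng(0)
t = np.arange(0, 1, 1 / FS)
X, y = [], []
for label, freq in enumerate([20, 60]):
    for _ in range(40):
        trial = np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal((4, t.size))
        X.append(tf_features(trial))
        y.append(label)
X, y = np.array(X), np.array(y)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("KNN", KNeighborsClassifier()), ("SVM", SVC()),
                  ("RF", RandomForestClassifier(random_state=0)),
                  ("NN", MLPClassifier(max_iter=2000, random_state=0))]:
    print(name, clf.fit(Xtr, ytr).score(Xte, yte))
```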

    A Biomimetic Approach to Controlling Restorative Robotics

    Movement is the only way a person can interact with the world around them. When trauma to the neuromuscular system disrupts the control of movement, quality of life suffers. To restore limb functionality, active robotic interventions and/or rehabilitation are required. Unfortunately, the primary obstacle to a person's recovery is the limited robustness of human-machine interfaces: current systems use control approaches that require the person to learn how the system works instead of the system working intuitively with the person. My research goal is to design intuitive control mechanisms based on biological processes, termed the biomimetic approach, and I have applied this control scheme to restorative robotics problems in upper and lower limb control. Operating an advanced active prosthetic hand is a two-pronged problem: actuating a high-dimensional mechanism and controlling it with an intuitive interface. Our approach addresses both by going from muscle activity, recorded as electromyography (EMG), to limb kinematics calculated through dynamic simulation of a musculoskeletal model. This control is more intuitive because the user simply attempts to move their hand naturally and the prosthetic hand performs that movement. The key to this approach was validating simulated muscle paths against experimental measurements, and against anatomical constraints where data are missing. After validation, the simulated muscle paths and forces are used to decipher the intended movement, and the prosthetic hand is driven to match it. This approach required minimal training to give an amputee the ability to control prosthetic hand movements such as grasping. A more intuitive controller has the potential to improve how people interact with and use their prosthetic hands. Similarly, rehabilitation of the locomotor system in people with damaged motor pathways or missing limbs requires appropriate interventions. The problem of decoding human motor intent during treadmill walking can also be solved with a biomimetic approach. Estimated limb speed is essential for this approach according to the theoretical input-output computation performed by spinal central pattern generators (CPGs), the neural circuitry responsible for autonomous control of stepping. The system used the locomotor phases, swing and stance, to estimate leg speeds and enable self-paced walking as well as steering in virtual reality (VR) with congruent visual flow. The unique advantage of this system over the previous state of the art is independent leg speed control, which is required for multidirectional movement in VR. This system has the potential to contribute to VR gait rehabilitation techniques. Creating biologically inspired controllers has the potential to improve restorative robotics and give people a better opportunity to recover lost functionality after injury.
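    The self-paced walking component can be sketched as follows, under assumptions that are not the dissertation's implementation: during stance the foot translates backward relative to the body at roughly the walking speed, so each leg's speed is estimated from its fore-aft foot displacement over the stance phase; the two per-leg estimates are averaged to set belt speed, and their difference can drive steering in VR. Units, timing, and gains are illustrative.

```python
# Hedged sketch: per-leg speed estimates from stance-phase foot motion
# drive self-paced treadmill speed and VR steering.

def stance_speed(foot_positions, timestamps):
    """Estimate leg speed (m/s) from fore-aft foot positions (m) during one stance phase."""
    displacement = foot_positions[0] - foot_positions[-1]   # foot travels backward under the body
    duration = timestamps[-1] - timestamps[0]
    return displacement / duration if duration > 0 else 0.0

def treadmill_command(left_speed, right_speed, gain=1.0):
    """Self-paced belt speed (m/s) from the average of the per-leg estimates."""
    return gain * 0.5 * (left_speed + right_speed)

def steering_command(left_speed, right_speed, gain=1.0):
    """VR turn rate from the left-right speed asymmetry (independent leg speeds)."""
    return gain * (left_speed - right_speed)

# Example: left stance covers 0.6 m in 0.6 s; right stance covers 0.54 m in 0.6 s.
left = stance_speed([0.30, 0.10, -0.10, -0.30], [0.0, 0.2, 0.4, 0.6])
right = stance_speed([0.28, 0.00, -0.26], [0.0, 0.3, 0.6])
print(left, right, treadmill_command(left, right), steering_command(left, right))
```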