1,786 research outputs found

    Real-time classification of multi-modal sensory data for prosthetic hand control


    Embedded Electronic Systems for Electronic Skin Applications

    Advances in sensor devices are opening new solutions for many applications, including prosthetics and robotics. Endowing an upper-limb prosthesis with tactile sensors (an electronic/sensitive skin) makes it possible to provide tactile sensory feedback to amputees. In this scenario, the prosthetic device is equipped with a tactile sensing system that allows the user to receive tactile feedback about objects and contact surfaces. The sensing system must therefore be embedded and wearable, since the sensors should cover wide areas of the prosthesis. However, embedding such a system raises a set of challenges in terms of power consumption, data processing, real-time response, and design scalability (an e-skin may include a large number of tactile sensors). A tactile sensing system consists of: (i) a tactile sensor array, (ii) an interface electronics circuit, (iii) an embedded processing unit, and (iv) a communication interface to transmit the tactile data.

    The objective of this thesis is to develop an efficient embedded tactile sensing system targeting e-skin applications (e.g., prosthetics) by: 1) developing a low-power, miniaturized interface electronics circuit that operates in real time; 2) proposing an efficient algorithm for embedded tactile data processing, since processing drives both system latency and power consumption; and 3) implementing an efficient communication channel/interface suited to the large amount of data generated by a large number of sensors. Most interface electronics for tactile sensing proposed in the literature combine signal conditioning with commercial data acquisition (DAQ) devices. However, these devices are bulky (PC-based) and thus unsuitable for portable prosthetics in terms of size, power consumption, and scalability. Regarding tactile data processing, some works have exploited machine learning methods to extract meaningful information from tactile data. However, embedding these algorithms is challenging because 1) the large amount of data to be processed significantly affects real-time functionality, and 2) the complex processing tasks impose a burden in terms of power consumption. Moreover, the literature lacks studies addressing data transfer in tactile sensing systems, even though a large number of sensors poses challenges for communication bandwidth and reliability.

    Therefore, this thesis pursues three approaches. 1) Developing low-power, miniaturized Interface Electronics (IE) capable of interfacing with and acquiring signals from a large number of tactile sensors in real time. We developed a portable IE system, based on a low-power ARM microcontroller and a DDC232 A/D converter, that handles an array of 32 tactile sensors. When touch is applied to the sensors, the IE acquires and pre-processes the sensor signals at low power, achieving a battery lifetime of about 22 hours. We assessed the functionality of the IE through electrical and electromechanical characterization experiments, monitoring its response with PVDF-based piezoelectric sensors; the results validate the correct functionality of the proposed system. In addition, we implemented filtering methods on the IE that reduce the effect of noise in the system. Finally, we evaluated the IE by integrating it into a tactile sensory feedback system, demonstrating effective delivery of tactile data to the user. The proposed system surpasses comparable state-of-the-art solutions by handling a higher number of input channels while maintaining real-time operation.
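The abstract does not say which filtering methods run on the IE. As a minimal, hypothetical sketch of channel-wise noise reduction across a 32-sensor array, a moving-average filter could look like the following; the window length, sampling setup, and NumPy simulation are assumptions for illustration, not details from the thesis.

```python
import numpy as np

NUM_CHANNELS = 32   # one channel per tactile sensor, matching the 32-sensor array
WINDOW = 8          # moving-average length: an assumption, not from the thesis

def moving_average_filter(frames: np.ndarray, window: int = WINDOW) -> np.ndarray:
    """Smooth each sensor channel with a causal moving average.

    frames: (n_samples, NUM_CHANNELS) array of raw ADC readings.
    Returns an array of the same shape with reduced high-frequency noise.
    """
    kernel = np.ones(window) / window
    # Filter each of the 32 channels independently.
    return np.apply_along_axis(
        lambda ch: np.convolve(ch, kernel, mode="same"), axis=0, arr=frames
    )

# Example: simulated noisy readings from the 32-sensor array.
rng = np.random.default_rng(0)
raw = rng.normal(0.0, 1.0, size=(1000, NUM_CHANNELS))
clean = moving_average_filter(raw)
```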
    2) Optimizing and implementing a tensor-based machine learning algorithm for touch modality classification on an embedded Zynq System-on-Chip (SoC). The algorithm is based on a Support Vector Machine classifier that discriminates among three input touch modality classes: "brushing", "rolling", and "sliding". We introduced an efficient algorithm that minimizes the hardware implementation complexity in terms of the number of operations and the memory storage, both of which directly affect time latency and power consumption. With respect to the original algorithm, the proposed approach, implemented on the Zynq SoC, reduced the number of operations per inference from 545 M-ops to 18 M-ops and the memory storage from 52.2 KB to 1.7 KB. Moreover, the proposed method speeds up inference by a factor of 43.7 at a cost of only a 2% loss in accuracy, enabling the algorithm to run on an embedded processing unit and to extract tactile information in real time.

    3) Implementing a robust and efficient data transfer channel to move the aggregated data at a high transmission rate and low power consumption. Here we proposed and demonstrated a tactile sensory feedback system based on an optical communication link for prosthetic applications. The optical link features low power consumption and a wide transmission bandwidth, which makes the feedback system suitable for a large number of tactile sensors; the low-power transmission is due to the UWB-based optical modulation employed. We implemented a system prototype consisting of digital transmitter and receiver boards and acquisition circuits interfacing 32 piezoelectric sensors. We then evaluated the system by measuring, processing, and transmitting the data of all 32 piezoelectric sensors over the optical link at a 100 Mbps data rate and a communication energy cost of 50 pJ/bit. Experimental results validated the functionality and demonstrated the real-time operation of the proposed sensory feedback system.
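The abstract reports the complexity reduction of approach 2 but not the decomposition itself. As a rough illustration of why shrinking the effective model cuts both operations and memory (using random Fourier features as a stand-in technique, not the thesis's tensorial method; all sizes and parameters below are invented), compare exact RBF-kernel SVM inference, whose cost scales with the number of support vectors, against a low-rank feature approximation:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_sv, r, gamma = 96, 2000, 64, 0.1   # illustrative sizes, not thesis values

def svm_decide_exact(x, sv, alpha, bias):
    """Exact RBF-kernel SVM decision: cost scales with the number of support vectors."""
    sq_dists = np.sum((sv - x) ** 2, axis=1)
    return alpha @ np.exp(-gamma * sq_dists) + bias

def svm_decide_rff(x, omega, phase, w, bias):
    """Random-Fourier-feature approximation: cost scales with rank r << n_sv,
    and only an (r x d) projection plus r weights are stored instead of all
    support vectors."""
    z = np.sqrt(2.0 / len(phase)) * np.cos(omega @ x + phase)
    return w @ z + bias

# Random stand-in parameters; a real model would be trained.
x = rng.normal(size=d)
sv, alpha = rng.normal(size=(n_sv, d)), rng.normal(size=n_sv)
omega = rng.normal(size=(r, d)) * np.sqrt(2.0 * gamma)   # RFF frequencies for an RBF kernel
phase = rng.uniform(0.0, 2.0 * np.pi, size=r)
w = rng.normal(size=r)

print(svm_decide_exact(x, sv, alpha, 0.0))      # O(n_sv * d) multiply-accumulates
print(svm_decide_rff(x, omega, phase, w, 0.0))  # O(r * d): ~30x fewer here
```

With these toy sizes the per-inference cost drops by roughly 30x, the same order of magnitude as the 545 M-ops to 18 M-ops reduction reported in the abstract.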

    Gaussian process autoregression for simultaneous proportional multi-modal prosthetic control with natural hand kinematics

    Matching the dexterity, versatility, and robustness of the human hand is still an unachieved goal in bionics, robotics, and neural engineering. A major limitation for hand prosthetics lies in the challenge of reliably decoding user intention from muscle signals when controlling complex robotic hands. Most commercially available prosthetic hands use muscle-related signals to decode a finite number of predefined motions, and some offer proportional control of open/close movements of the whole hand. Here, in contrast, we aim to offer users flexible control of the individual joints of their artificial hand. We propose a novel framework for decoding neural information that enables a user to independently control 11 joints of the hand in a continuous manner, much like we control our natural hands. Toward this end, we instructed six able-bodied subjects to perform everyday object manipulation tasks combining both dynamic, free movements (e.g., grasping) and isometric force tasks (e.g., squeezing). We recorded the electromyographic and mechanomyographic activities of five extrinsic muscles of the hand in the forearm, while simultaneously monitoring 11 joints of the hand and fingers using a sensorized data glove. Instead of learning just a direct mapping from current muscle activity to intended hand movement, we formulated a novel autoregressive approach that combines the context of previous hand movements with instantaneous muscle activity to predict future hand movements. Specifically, we evaluated a linear vector autoregressive moving-average model with exogenous inputs and a novel Gaussian process (GP) autoregressive framework to learn the continuous mapping from hand joint dynamics and muscle activity to intended hand movement. Our GP approach achieves high levels of performance (RMSE of 8°/s and ρ = 0.79). Crucially, we use a small set of sensors to control a larger set of independently actuated degrees of freedom of a hand. This novel undersensored control is enabled by the nonlinear autoregressive continuous mapping between muscle activity and joint angles: because the system evaluates muscle signals in the context of previous natural hand movements, it can resolve ambiguities in situations where the muscle signals alone cannot determine the correct action. GP autoregression is particularly powerful because it not only makes context-based predictions but also represents the uncertainty associated with them, enabling the novel notion of risk-based control in neuroprosthetics. Our results suggest that GP autoregressive approaches with exogenous inputs lend themselves to natural, intuitive, and continuous control in neurotechnology, with a particular focus on prosthetic restoration of natural limb function, where high dexterity is required for complex movements.
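As a toy sketch of the autoregressive formulation described above, a one-step-ahead Gaussian process regression can map the previous joint state plus current muscle activity to the next joint state and report predictive uncertainty. The synthetic data, feature layout, and kernel choice below are assumptions; the paper's VARMAX and GP models are richer than this.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
T, n_joints, n_muscles = 200, 11, 5   # 11 joints and 5 muscles as in the paper

# Synthetic stand-ins for glove-tracked joint angles and muscle activity.
joints = np.cumsum(rng.normal(0.0, 0.5, size=(T, n_joints)), axis=0)
emg = rng.normal(0.0, 1.0, size=(T, n_muscles))

X = np.hstack([joints[:-1], emg[:-1]])   # context: previous joints + muscle activity
y = joints[1:]                           # target: next joint configuration

gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(), normalize_y=True
)
gp.fit(X[:150], y[:150])

# The predictive std quantifies uncertainty, the basis for risk-based control.
mean, std = gp.predict(X[150:], return_std=True)
```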

    Touching on elements for a non-invasive sensory feedback system for use in a prosthetic hand

    Hand amputation results in the loss of motor and sensory functions, impacting activities of daily life and quality of life. Commercially available prosthetic hands restore the motor function but lack sensory feedback, which is crucial for receiving real-time information about the state of the prosthesis when interacting with the external environment. In the absence of this sensory feedback, the amputee must rely on visual and auditory cues to operate the prosthetic hand, which can be mentally demanding. This thesis revolves around finding potential solutions that contribute to an intuitive, non-invasive sensory feedback system that could be cognitively less burdensome and enhance the sense of embodiment (the feeling that an artificial limb belongs to one's own body), increasing acceptance of wearing a prosthesis.

    A sensory feedback system contains sensors that detect signals applied to the prosthesis. These signals are encoded via signal processing so that actuators on the skin deliver a sensation resembling the detected one. Implementing commercial sensors in a prosthetic finger is challenging: because of the prosthetic finger's curvature, and because some prosthetic hands use a covering rubber glove, the sensor response would be inaccurate. This thesis shows that a pneumatic touch sensor integrated into a rubber glove eliminates these errors. The sensor provides a consistent reading independent of the incident angle of the stimulus, with a sensitivity of 0.82 kPa/N, a hysteresis error of 2.39±0.17%, and a linearity error of 2.95±0.40%.

    For intuitive tactile stimulation, it has been suggested that the feedback stimulus should be modality-matched, so as to provide a sensation that can easily be associated with the real touch on the prosthetic hand, e.g., pressure on the prosthetic finger should produce pressure on the residual limb. A stimulus should also be spatially matched (e.g., in position, size, and shape). Electrotactile stimulation can provide varied sensations because it has several adjustable parameters, which makes it a good candidate for texture discrimination. A microphone can detect texture-elicited vibrations, which are processed and presented on the skin by varying, e.g., the median frequency of the electrical stimulation. Participants in a study using such electrotactile feedback showed a median accuracy of 85% in differentiating between four textures.

    During active exploration, electrotactile and vibrotactile feedback provide spatially matched stimulation, offering continuous feedback even when the sensation is displaced or distributed over a larger area. When commonly used stimulation modalities were evaluated using the Rubber Hand Illusion, those that resemble the intended sensation provided a more vivid illusion of ownership of the rubber hand.

    For potentially more intuitive sensory feedback, the stimulation can be somatotopically matched, so that the stimulus is experienced as applied to a site corresponding to the missing hand. This is possible for amputees who experience referred sensation on their residual stump; however, not all amputees do. Nonetheless, after a structured training period, it is possible to learn to associate touch with specific fingers, and the effect persisted after two weeks. This was evaluated on participants with intact limbs, so it remains to be evaluated for amputees.

    In conclusion, this thesis offers suggestions for sensory feedback systems in future prosthetic hands that (1) reduce system complexity and (2) enhance the sense of body ownership, and thereby the overall sense of embodiment, as a complement to an intuitive control system.
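The thesis maps texture-elicited vibrations picked up by a microphone to stimulation parameters such as the median frequency. A minimal sketch of extracting the median frequency of a recorded vibration follows; the sampling rate, test signal, and any mapping to stimulation parameters are assumptions for illustration, not values from the thesis.

```python
import numpy as np

def median_frequency(signal: np.ndarray, fs: float) -> float:
    """Frequency that splits the power spectrum into two halves of equal energy."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    cumulative = np.cumsum(spectrum)
    idx = np.searchsorted(cumulative, cumulative[-1] / 2.0)
    return freqs[idx]

# Example: a texture-elicited vibration dominated by a 180 Hz component.
fs = 8000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
vibration = (np.sin(2 * np.pi * 180 * t)
             + 0.3 * np.random.default_rng(0).normal(size=t.size))
f_med = median_frequency(vibration, fs)   # would drive a stimulation parameter
```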

    Designing Prosthetic Hands With Embodied Intelligence: The KIT Prosthetic Hands

    Hand prostheses should provide functional replacements of lost hands. Yet current prosthetic hands are often neither intuitive to control nor easy for amputees to use. Commercially available prostheses are usually controlled via EMG signals triggered by the user to perform grasping tasks. Such EMG-based control requires long training and depends heavily on the robustness of the EMG signals. Our goal is to develop prosthetic hands with semi-autonomous grasping abilities that lead to more intuitive control by the user. In this paper, we present the development of prosthetic hands that enable such abilities as first results toward this goal. The developed prostheses provide intelligent mechatronics, including adaptive actuation, multi-modal sensing, and on-board computing resources, to enable autonomous and intuitive control. The hands are scalable in size and based on an underactuated mechanism that allows grasps to adapt to the shape of arbitrary objects. They integrate a multi-modal sensor system including a camera and, in the newest version, a distance sensor and an IMU. A resource-aware embedded system for in-hand processing of sensory data and control is included in the palm of each hand. We describe the design of the newest version of the hands: the female hand prosthesis weighs 377 g, provides a grasping force of 40.5 N, and has a closing time of 0.73 s. We evaluate the mechatronics of the hand and its grasping abilities using the YCB Gripper Assessment Protocol as well as a task-oriented protocol that assesses hand performance in activities of daily living. Further, we show by example the suitability of the multi-modal sensor system for sensory-based, semi-autonomous grasping in daily-life activities. The evaluation demonstrates the merit of the hand concept and of its sensor and in-hand computing systems.
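The abstract describes sensor-driven, semi-autonomous grasping but not a specific control policy. As a purely hypothetical sketch of how a distance reading and a force estimate could gate a user-initiated grasp, consider the state machine below; all states, thresholds, and signals are invented for illustration and are not KIT hand parameters.

```python
from enum import Enum, auto

class GraspState(Enum):
    IDLE = auto()
    CLOSING = auto()
    HOLDING = auto()

# Illustrative thresholds, not values from the paper.
APPROACH_THRESHOLD_MM = 50.0
CONTACT_FORCE_N = 5.0

def step(state: GraspState, distance_mm: float, grip_force_n: float,
         user_intent: bool) -> GraspState:
    """One control tick of a sensor-triggered, semi-autonomous grasp.

    The user signals intent (e.g., via EMG); the hand decides *when* to close
    based on the distance sensor and stops closing once contact force builds up.
    """
    if state is GraspState.IDLE and user_intent and distance_mm < APPROACH_THRESHOLD_MM:
        return GraspState.CLOSING
    if state is GraspState.CLOSING and grip_force_n >= CONTACT_FORCE_N:
        return GraspState.HOLDING
    if state is GraspState.HOLDING and not user_intent:
        return GraspState.IDLE
    return state
```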

    Multimodal human hand motion sensing and analysis - a review


    A Review of Non-Invasive Haptic Feedback stimulation Techniques for Upper Extremity Prostheses

    A sense of touch is essential for amputees to reintegrate into their social and work lives. The next generation of prostheses should be designed to effectively convey tactile information between the artificial limb and the amputee. This work reviews non-invasive haptic feedback stimulation techniques for conveying tactile information from the prosthetic hand to the amputee's brain. Various types of actuators used in previous studies to stimulate the patient's residual limb, across different types of prostheses, are reviewed in terms of functionality, effectiveness, wearability, and comfort. Non-invasive hybrid feedback stimulation was found to yield the best stimulus identification rates among users of haptic prostheses. It can be concluded that integrating a hybrid haptic feedback stimulation system into upper-limb prostheses improves their acceptance among users.