
    Advancing Medical Technology for Motor Impairment Rehabilitation: Tools, Protocols, and Devices

    Excellent motor control skills are necessary to live a high-quality life. Activities such as walking, getting dressed, and feeding yourself may seem mundane, but injuries to the neuromuscular system can render these tasks difficult or even impossible to accomplish without assistance. Statistics indicate that well over 100 million people are affected by diseases or injuries, such as stroke, Parkinson's Disease, Multiple Sclerosis, Cerebral Palsy, peripheral nerve injury, spinal cord injury, and amputation, that negatively impact their motor abilities. This wide array of injuries presents a challenge to the medical field, as optimal treatment paradigms are often difficult to implement due to a lack of appropriate assessment tools, the inability of people to access the appropriate medical centers for treatment, or outright gaps in the technology for treating the underlying impairments causing the disability. Addressing each of these challenges will improve the treatment of movement impairments, provide more customized and continuous treatment to a larger number of patients, and advance rehabilitative and assistive device technology. In my research, the key approach was to develop tools to assess and treat upper extremity movement impairment. In Chapter 2.1, I challenged a common biomechanical modeling technique of the forearm. Comparing joint torque values through inverse dynamics simulation between two modeling platforms, I discovered that representing the forearm as a single cylindrical body was unable to capture the inertial parameters of a physiological forearm, which is made up of two segments, the radius and ulna. I split the forearm segment into a proximal and a distal segment, with the rationale that the inertial parameters of the proximal segment could be tuned to those of the ulna and the inertial parameters of the distal segment could be tuned to those of the radius.
Results showed a marked increase in joint torque calculation accuracy for those degrees of freedom that are affected by the inertial parameters of the radius and ulna. In Chapter 2.2, an inverse kinematic upper extremity model was developed for calculating joint angles from experimental motion capture data, with the rationale that this would create an easy-to-use tool for clinicians and researchers to process their data. The results show accurate angle calculations when compared to algebraic solutions. Together, these chapters provide easy-to-use models and tools for processing movement assessment data. In Chapter 3.1, I developed a protocol to collect high-quality movement data in a virtual reality task that is used to assess hand function as part of a Box and Block Test. The goal of this chapter is to suggest a method that not only collects quality data in a research setting but can also be adapted for telehealth and at-home movement assessment and rehabilitation. Results indicate that the data collected with this protocol are of high quality, and the virtual nature of this approach can make it a useful tool for continuous, data-driven care in the clinic or at home. In Chapter 3.2, I developed a high-density electromyography device for collecting motor unit action potentials of the arm. Traditional surface electromyography is limited in its ability to obtain signals from deep muscles and can also be time consuming to place selectively over the appropriate muscles. With this high-density approach, muscle coverage is increased, placement time is decreased, and deep muscle activity can potentially be collected. Furthermore, the high-density electromyography device is built as a precursor to a high-density electromyography-electrical stimulation device for functional electrical stimulation. The customizable nature of the prototype in Chapter 3.2 allows for the implementation of both recording and stimulating electrodes.
Furthermore, signal results show that the electromyography data obtained from the device are of high quality and are correlated with gold-standard surface electromyography sensors. One key factor in a device that can record and then stimulate based on the recorded signals is an accurate movement intent decoder. High-quality movement decoders have been designed for closed-loop device controllers in the past, but they still struggle when the user interacts with objects of varying weight, due to underlying alterations in muscle signals. In Chapter 4, I investigate this phenomenon with an experiment in which participants perform a Box and Block Task with objects of three different weights: 0 kg, 0.02 kg, and 0.1 kg. Electromyography signals from each participant's right arm were collected, and co-contraction levels between antagonistic muscles were analyzed to uncover alterations in muscle forces and joint dynamics. Results indicated contraction differences between the conditions and also between movement stages (contraction levels before grabbing the block vs. after touching the block) for each condition. This work builds a foundation for incorporating object weight estimates into closed-loop electromyography device movement decoders. Overall, we believe the chapters in this thesis provide a basis for increasing the availability of movement assessment tools, increasing access to effective movement assessment and rehabilitation, and advancing the medical device and technology field.
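The co-contraction analysis described in this abstract can be sketched as follows. This is a hypothetical illustration only: the thesis does not specify its formulation, so a Falconer-Winter-style co-contraction index over smoothed, rectified EMG envelopes is assumed here, and the `window` parameter and helper names are invented for the sketch.

```python
def moving_average(signal, window):
    """Smooth a rectified EMG signal with a simple moving-average envelope."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

def cocontraction_index(emg_agonist, emg_antagonist, window=5):
    """Return a co-contraction index in percent:
    2 * common (overlapping) activity / total activity * 100.
    Assumes equal-length raw EMG sequences for an antagonistic muscle pair."""
    env_a = moving_average([abs(x) for x in emg_agonist], window)
    env_b = moving_average([abs(x) for x in emg_antagonist], window)
    common = sum(min(a, b) for a, b in zip(env_a, env_b))
    total = sum(env_a) + sum(env_b)
    return 200.0 * common / total if total else 0.0
```

Computed separately over pre-grasp and post-touch windows, an index like this would expose the stage-wise contraction differences the abstract reports.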

    Human-centered Electric Prosthetic (HELP) Hand

    Through a partnership with the Indian non-profit Bhagwan Mahaveer Viklang Sahayata Samiti, we designed a functional, robust, and low-cost electrically powered prosthetic hand that communicates with unilateral, transradial, urban Indian amputees through a biointerface. The device uses compliant tendon actuation, a small linear servo, and a wearable garment outfitted with flex sensors to produce a device that, once placed inside a prosthetic glove, is anthropomorphic in both look and feel. The prosthesis was developed such that future groups can design for manufacturing and distribution in India.

    An Assessment of Single-Channel EMG Sensing for Gestural Input

    Wearable devices of all kinds are becoming increasingly popular. One problem that plagues wearable devices, however, is how to interact with them. In this paper we construct a prototype electromyography (EMG) sensing device that captures a single channel of EMG sensor data corresponding to user gestures. We also implement a machine learning pipeline to recognize gestural input received via our prototype sensing device. Our goal is to assess the feasibility of using a BITalino EMG sensor to recognize gestural input on a mobile health (mHealth) wearable device known as Amulet. We conduct three experiments in which we use the EMG sensor to collect gestural input data from (1) the wrist, (2) the forearm, and (3) the bicep. Our results show that a single-channel EMG sensor located near the wrist may be a viable approach to reliably recognizing simple gestures without mistaking them for common daily activities such as drinking from a cup, walking, or talking while moving one's arms.
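A recognition pipeline like the one this abstract describes typically starts by windowing the EMG stream and extracting time-domain features. The sketch below is an assumption, not the paper's actual pipeline: the feature set (RMS, zero crossings, waveform length) is a common choice in the EMG literature, and the nearest-centroid classifier stands in for whatever model the authors trained.

```python
import math

def emg_features(window):
    """Extract three classic time-domain features from one EMG window."""
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    zero_crossings = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    waveform_length = sum(abs(b - a) for a, b in zip(window, window[1:]))
    return (rms, zero_crossings, waveform_length)

def classify(features, centroids):
    """Nearest-centroid gesture label; centroids maps label -> feature tuple."""
    def sq_dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(centroids, key=lambda label: sq_dist(features, centroids[label]))
```

Rejecting windows whose features fall far from every gesture centroid is one way such a pipeline avoids confusing daily-activity motion with deliberate gestures.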

    A myographic-based HCI solution proposal for upper limb amputees

    Conference on ENTERprise Information Systems / International Conference on Project MANagement / Conference on Health and Social Care Information Systems and Technologies, CENTERIS / ProjMAN / HCist 2016, October 5-7, 2016. Interaction plays a fundamental role, as it sets bridges between humans and computers. However, people with disabilities are prevented from using computers by ordinary means due to physical or intellectual impairments. Thus, the human-computer interaction (HCI) research area has been developing solutions to improve the technological accessibility of impaired people, by enhancing computers and similar devices with the means necessary to attend to different disabilities, thereby contributing to reducing digital exclusion. Within the aforementioned scope, this paper presents an interaction solution for upper limb amputees, supported by a myographic gesture-control device named Myo. This device is an emergent wearable technology that consists of a muscle-sensitive bracelet. It transmits myographic and inertial data that can be converted into actions for interaction purposes (e.g., clicking or moving a mouse cursor). Although it is a gesture-control armband, Myo can also be used on the legs, as was ascertained through some preliminary tests with users. Both data types (myographic and inertial) continue to be transmitted and are available to be converted into gestures. A general architecture, a use case diagram, and the specification of the two main functional modules are presented. These will guide the future implementation of the proposed Myo-based HCI solution, which is intended to be a solid contribution to the interaction between upper limb amputees and computers.

    A virtual hand assessment system for efficient outcome measures of hand rehabilitation

    Previously held under moratorium from 1st December 2016 until 1st December 2021. Hand rehabilitation is an extremely complex and critical process in the medical rehabilitation field. This is mainly due to the highly articulated functionality of the hand. Recent research has focused on employing new technologies, such as robotics and system control, in order to improve the precision and efficiency of the standard clinical methods used in hand rehabilitation. However, the designs of these devices were either oriented toward a particular hand injury or heavily dependent on subjective assessment techniques to evaluate progress. These limitations reduce the efficiency of hand rehabilitation devices by providing less effective results for restoring the lost functionalities of dysfunctional hands. In this project, a novel technological solution and efficient hand assessment system is produced that can objectively measure the restoration outcome and dynamically evaluate its performance. The proposed system uses a sensing data glove to measure the multiple ranges of motion of the hand joints, and a Virtual Reality system to provide an illustrative and safe visual assistance environment that can self-adjust to the subject's performance. The system application implements an original finger performance measurement method for analysing the various hand functionalities. This is achieved by extracting multiple features of the hand digits' motions, such as speed, consistency of finger movements, and stability during hold positions. Furthermore, an advanced data glove calibration method was developed and implemented in order to accurately manipulate the virtual hand model and calculate the hand kinematic movements in compliance with the biomechanical structure of the hand. The experimental studies were performed on a controlled group of 10 healthy subjects (aged 25 to 42 years).
The results showed intra-subject reliability between the trials (average cross-correlation ρ = 0.7) and inter-subject repeatability across the subjects' performance (p > 0.01 for the session with real objects, with a few departures in some of the virtual reality sessions). In addition, the finger performance values were found to be very efficient in detecting the multiple elements of the fingers' performance, including the load effect on the forearm. Moreover, the electromyography measurements in the virtual reality sessions showed high sensitivity in detecting the tremor effect (the mean power frequency difference on the right extensor digitorum muscle is 176 Hz). Also, the finger performance values for the virtual reality sessions have the same average distance as the real-life sessions (RSQ = 0.07). The system, besides offering an efficient and quantitative evaluation of hand performance, was proven compatible with different hand rehabilitation techniques, where it can outline the primarily affected parts in the hand dysfunction. It can also be easily adjusted to comply with the subject's specifications and clinical hand assessment procedures to autonomously detect the classification task events and analyse them with high reliability. The developed system is also adaptable to disciplines other than hand rehabilitation, such as ergonomic studies, hand robot control, brain-computer interfaces, and various fields involving hand control.
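The intra-subject reliability figure quoted in this abstract (average cross-correlation ρ = 0.7) can be illustrated with a minimal sketch. This assumes a zero-lag normalized cross-correlation (equivalent to Pearson correlation) between two equal-length trial trajectories; the thesis may well use a lagged or otherwise different variant.

```python
import math

def zero_lag_xcorr(trial_a, trial_b):
    """Normalized cross-correlation at zero lag between two equal-length
    trials (e.g. joint-angle trajectories); returns a value in [-1, 1]."""
    n = len(trial_a)
    mean_a = sum(trial_a) / n
    mean_b = sum(trial_b) / n
    num = sum((a - mean_a) * (b - mean_b) for a, b in zip(trial_a, trial_b))
    den = math.sqrt(
        sum((a - mean_a) ** 2 for a in trial_a)
        * sum((b - mean_b) ** 2 for b in trial_b)
    )
    return num / den if den else 0.0
```

Averaging this value over all trial pairs for one subject gives a per-subject reliability score of the kind reported above.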

    Neuromotor Control of the Hand During Smartphone Manipulation

    The primary focus of this dissertation was to understand the motor control strategy used by our neuromuscular system for the multi-layered motor tasks involved in smartphone manipulation. To understand this control strategy, we recorded the kinematics and multi-muscle activation patterns of the right limb during smartphone manipulation, including grasping with/out tapping, movement conditions (MCOND), and arm heights. In the first study (chapter 2), we examined the neuromuscular control strategy of the upper limb during grasping with/out tapping executed with a smartphone by evaluating muscle-activation patterns of the upper limb during different movement conditions (MCOND). There was a change in muscle activity across MCOND and segments. We concluded that our neuromuscular system generates a motor strategy that allows smartphone manipulation involving grasping and tapping while maintaining MCOND, by generating continuous and distinct multi-muscle activation patterns in the upper limb muscles. In the second study (chapter 3), we examined the muscle activity of the upper limb when the smartphone was manipulated at two arm heights, shoulder and abdomen, to understand the influence of arm height on the neuromuscular control strategy of the upper limb. Some muscles showed a significant effect for ABD, while others showed a significant effect for SHD. We concluded that the motor control strategy was influenced by arm height, as there were changes in the shoulder and elbow joint angles along with the muscular activity of the upper limb. Further, the shoulder position helped in holding the head upright, while the abdomen position reduced the moment arm and moment, and ultimately the muscle loading, compared to the shoulder. Overall, our neuromuscular system generates motor commands by activating a multi-muscle activation pattern in the upper limb, which depends upon task demands such as grasping with/out tapping, MCOND, and arm heights. Similarly, our neuromuscular system does not appear to increase muscle activation when there is a combined effect of MCOND and arm heights. Instead, it utilizes a simple control strategy that selects the appropriate muscles and activates them based on the levels of MCOND and arm heights.