Synergy-based Hand Pose Sensing: Reconstruction Enhancement
Low-cost sensing gloves for hand posture reconstruction provide measurements that are limited in several respects. They are generated through an imperfectly known model, are subject to noise, and may be fewer in number than the Degrees of Freedom (DoFs) of the hand. Under these conditions, direct reconstruction of the hand posture is an ill-posed problem, and performance can be very poor. This paper examines the problem of estimating the posture of a human hand using (low-cost) sensing gloves, and how to improve their performance by exploiting knowledge of how humans most frequently use their hands. To increase the accuracy of pose reconstruction without modifying the glove hardware, hence basically at no extra cost, we propose to collect, organize, and exploit information on the probabilistic distribution of human hand poses in common tasks. We discuss how a database of such a priori information can be built, represented in a hierarchy of correlation patterns or postural synergies, and fused with glove data in a consistent way, so as to provide a good hand pose reconstruction in spite of insufficient and inaccurate sensing data. Simulations and experiments on a low-cost glove are reported which demonstrate the effectiveness of the proposed techniques.
Comment: Submitted to the International Journal of Robotics Research (2012)
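The fusion described in this abstract, a prior over hand poses combined with noisy, underdetermined glove measurements, can be sketched as a minimum-variance (MAP) estimate. All dimensions, matrices, and data below are synthetic assumptions for illustration, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 15 hand DoFs measured by only 5 glove sensors.
n_dof, n_meas = 15, 5

# Prior over hand poses, standing in for a synergy database: mean pose
# mu_p and covariance P_p (random symmetric positive definite here).
mu_p = rng.normal(size=n_dof)
G = rng.normal(size=(n_dof, n_dof))
P_p = G @ G.T + np.eye(n_dof)

H = rng.normal(size=(n_meas, n_dof))   # imperfectly known measurement model
R = 0.1 * np.eye(n_meas)               # sensor noise covariance

# Simulated ground-truth pose and a noisy glove reading.
x_true = rng.multivariate_normal(mu_p, P_p)
y = H @ x_true + rng.multivariate_normal(np.zeros(n_meas), R)

# MAP / minimum-variance fusion of the prior with the measurements.
K = P_p @ H.T @ np.linalg.inv(H @ P_p @ H.T + R)   # gain
x_hat = mu_p + K @ (y - H @ mu_p)
P_post = (np.eye(n_dof) - K @ H) @ P_p             # posterior covariance
```

Even though the measurements alone cannot determine all 15 DoFs, the posterior covariance is strictly smaller than the prior's, which is the sense in which the a priori pose database improves reconstruction.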
Advancing the Functionality and Wearability of Robotic Hand Orthoses Towards Activities of Daily Living in Stroke Patients
Post-stroke rehabilitation is effective when a large number of motor repetitions are provided to patients. However, conventional physical therapy and traditional desktop-size robot-aided rehabilitation do not provide a sufficient number of repetitions due to cost and logistical barriers. Our vision is to realize a wearable and functional hand orthosis that could be used outside of controlled, clinical settings, thus allowing for more training repetitions. Furthermore, if such a device can prove effective for Activities of Daily Living (ADLs) while actively worn, this can incentivize patients to increase its use, further enhancing rehabilitative effects. However, in order to provide such clinical benefits, the device must be completely wearable, without obtrusive features, and intuitive to control even for non-experts. In this thesis, we thus focus on wearability, functionality, and intuitive intent detection technology for a novel hand robot, and assess its performance when used both as a rehabilitative device and as an assistive tool.
A fully wearable device must deliver meaningful manipulation capability in a small and lightweight package. In this context, we investigate the capability of single-actuator devices to assist whole-hand movement patterns through a network of exotendons. Our prototypes combine a single linear actuator (mounted on a forearm splint) with a network of exotendons (routed on the surface of a soft glove). We investigate two possible tendon network configurations: one that produces full finger extension (overcoming flexor spasticity), and one that combines proximal flexion with distal extension at each finger. In experiments with stroke survivors, we measure the force levels needed to overcome various levels of spasticity and to open the hand for grasping using the first of these configurations, and qualitatively demonstrate the ability to execute fingertip grasps using the second. Our results support the feasibility of developing future wearable devices able to assist a range of manipulation tasks.
In order to further improve the wearability of the device, we propose two designs that provide effective force transmission by increasing moment arms around finger joints. We evaluate the designs with geometric models and experiments using a 3D-printed artificial finger to characterize the force and joint angle profiles of the suggested structures. We also perform clinical tests with stroke patients to demonstrate the feasibility of the designs. The testing supports the hypothesis that the proposed designs elicit extension of the digits in patients with spasticity more efficiently than existing baselines. With the suggested transmission designs, the device can deliver sufficient extension force even when users have increased muscle tone due to fatigue.
The vision of an orthotic device used for ADLs can only be realized if the patients are able to operate the device themselves. However, the field generally lacks effective methods by which the user can operate the device: such controls must be effective, intuitive, and robust to the wide range of possible impairment patterns. The variety of upper limb impairment patterns encountered in stroke patients means that a single sensing modality, such as electromyography, might not be sufficient to enable controls for a broad range of users. To address this significant gap, we introduce a multimodal sensing and interaction paradigm for an active hand orthosis. In our proof-of-concept implementation, EMG is complemented by other sensing modalities, such as finger bend and contact pressure sensors. We propose multimodal interaction methods that utilize this sensory data as input, and show that they can enable tasks for stroke survivors who exhibit different impairment patterns.
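A multimodal interaction scheme of the kind described above might, in its simplest form, require agreement between sensing channels before triggering the orthosis. The signal names, thresholds, and two-of-three voting rule below are purely hypothetical illustrations, not the thesis's actual control method:

```python
def detect_intent(emg_rms, bend_mean, pressure_max,
                  emg_thresh=0.3, bend_thresh=0.5, pressure_thresh=0.8):
    """Toy two-of-three vote across modalities: any single noisy channel
    can misfire, so require agreement before commanding the orthosis.
    All thresholds are hypothetical, assumed-normalized values."""
    votes = [emg_rms > emg_thresh,
             bend_mean > bend_thresh,
             pressure_max > pressure_thresh]
    return "close" if sum(votes) >= 2 else "open"
```

The appeal of such a rule is that a spurious EMG burst or a bent sensor alone cannot drive the hand; two independent modalities must agree.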
We then assess the performance of the robotic orthosis in two possible roles: as a therapeutic tool that facilitates device-mediated hand exercises to recover neuromuscular function, and as an assistive device for use in everyday activities to aid functional use of the hand. Eleven chronic stroke patients (> 2 years post-stroke) with moderate muscle tone (Modified Ashworth Scale ≤ 2 in the upper extremity) engage in a month-long training protocol using the orthosis. Individuals are evaluated using standardized outcome measures, both with and without orthosis assistance. The results highlight the potential for wearable and user-driven robotic hand orthoses to extend the use and training of the affected upper limb after stroke.
The advances proposed in this thesis have the potential to enable robot-aided hand rehabilitation during daily activities (as opposed to isolated hand exercises with limited upper limb engagement) and over extended periods of time, even in a patient's home environment. Numerous challenges must still be overcome in order to achieve this vision, related to design (compact devices with easier donning/doffing), control (robust yet intuitive intent inferral), and effectiveness (improved functionality across a wider range of metrics). However, if these challenges can be addressed, wearable robotic devices have the potential to greatly extend the use and training of the affected upper limb after stroke, and to help improve the quality of life for a large patient population.
Smart Fabric sensors for foot motion monitoring
Smart fabrics, or fabrics that have the characteristics of sensors, are a wide and emerging field of study. This thesis summarizes an investigation into the development of fabric sensors for use in sensorized socks that can gather real-time information about the foot, such as gait features. Conventional technologies usually provide only 2D information about the foot. Sensorized socks provide angular data, in which foot angles are correlated with the sensor output, enabling 3D monitoring of foot position. Current angle-detection mechanisms are mainly heavy and cumbersome; the sensorized socks are not only portable but also non-invasive to the wearer. The incorporation of wireless features into the sensorized socks enables remote monitoring of the foot.
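The correlation of sensor output with foot angle suggests a simple least-squares calibration. The readings and reference angles below are made-up illustrative values, not measurements from the sensorized socks:

```python
import numpy as np

# Illustrative calibration pairs: normalized fabric-sensor output versus
# reference foot angle in degrees (hypothetical values, not real data).
readings = np.array([1.00, 1.12, 1.25, 1.41, 1.55])
angles = np.array([0.0, 15.0, 30.0, 45.0, 60.0])

# First-order least-squares fit: angle ≈ slope * reading + intercept.
slope, intercept = np.polyfit(readings, angles, 1)

def reading_to_angle(r):
    """Map a raw sensor reading to an estimated foot angle in degrees."""
    return slope * r + intercept
```

In practice such a fit would be performed per sensor and per wearer, since fabric sensors drift with wear and placement.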
Haptics Rendering and Applications
There has been significant progress in haptic technologies, but the incorporation of haptics into virtual environments is still in its infancy. A wide range of human activities, including communication, education, art, entertainment, commerce and science, would change forever if we learned how to capture, manipulate and reproduce haptic sensory stimuli that are nearly indistinguishable from reality. For the field to move forward, many commercial and technological barriers need to be overcome. By rendering how objects feel through haptic technology, we communicate information through a physically based language that has never been explored before. Due to constant improvement in haptics technology and increasing levels of research into and development of haptics-related algorithms, protocols and devices, there is a belief that haptics technology has a promising future.
Design and Development of a Human Gesture Recognition System in Tridimensional Interactive Virtual Environment
This thesis describes the design and development of a recognition system for human gestures. The main goal of this work is to demonstrate the possibility of extracting enough information, both semantic and quantitative, from human action to perform complex tasks in a virtual environment. To manage the complexity and variability of gestures, adaptive systems are exploited, both in building a codebook (by unsupervised neural networks) and in recognizing the sequence of symbols describing a gesture (by Hidden Markov Models).
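The codebook-plus-HMM pipeline described in this abstract can be sketched in a few lines: nearest-centroid quantization stands in for the unsupervised neural network, and a scaled forward algorithm scores symbol sequences under competing gesture models. All centroids, model parameters, and frames below are toy assumptions:

```python
import numpy as np

# Toy codebook standing in for the unsupervised neural network: each
# continuous feature frame is quantized to its nearest-centroid symbol.
codebook = np.array([[0.0, 0.0],
                     [1.0, 1.0],
                     [0.0, 1.0]])

def quantize(features):
    """Map each feature frame (row) to the index of its nearest centroid."""
    d = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    return d.argmin(axis=1)

def forward_loglik(obs, pi, A, B):
    """Scaled forward algorithm: log-likelihood of a discrete symbol
    sequence under an HMM (pi: initial, A: transition, B: emission)."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik

# Two toy gesture models sharing initial and transition distributions;
# recognition picks the model with the highest sequence likelihood.
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1],
              [0.1, 0.9]])
B_up   = np.array([[0.1, 0.8, 0.1],
                   [0.1, 0.8, 0.1]])   # mostly emits symbol 1
B_down = np.array([[0.8, 0.1, 0.1],
                   [0.8, 0.1, 0.1]])   # mostly emits symbol 0

frames = np.array([[0.9, 1.1], [1.0, 0.9], [1.1, 1.0]])
obs = quantize(frames)   # frames near [1, 1] quantize to symbol 1
```

A sequence of symbol-1 frames scores higher under the model that emits symbol 1, which is the essence of the recognition step.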
On the Role of Haptic Synergies in Modelling the Sense of Touch and in Designing Artificial Haptic Systems
This thesis aims at defining strategies to reduce the complexity of haptic information, with minimum loss of information, in order to design more effective haptic interfaces and artificial systems. Nowadays, haptic device design can be complex, and the artificial reproduction of the full spectrum of haptic information is a daunting task that is far from being achieved. The central idea of this work is to simplify this information by exploiting the concept of synergies, which was developed to describe the covariation patterns in multi-digit movements and forces in common motor tasks. Here I extend and exploit it also in the perceptual domain, to find projections between the heterogeneous information manifold generated by the mechanics of touch and what can actually be perceived by humans. In this manner, design trade-offs between cost, feasibility and quality of the rendered perception can be identified.
With this as motivation, referring to cutaneous sensing, I discuss the development of a fabric-based softness display inspired by the ``Contact Area Spread Rate'' hypothesis, as well as the characterization of an air-jet lump display method for Robot-assisted Minimally Invasive Surgery. Considering kinaesthesia, I analyze the problem of hand posture estimation from the noisy and limited measurements provided by low-cost hand pose sensing devices. By using information about how humans most frequently use their hands, system performance is enhanced and optimal system design is enabled.
Finally, an integrated device is proposed in which a conventional kinaesthetic haptic display is combined with a cutaneous softness display, showing that the fidelity with which softness is artificially rendered increases.
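The synergy idea underlying this thesis is commonly operationalized as principal component analysis of recorded hand postures. The following sketch, on synthetic data constructed to lie near a low-dimensional subspace (all sizes and values are illustrative assumptions), shows how a few components can capture most postural variance:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic hand-posture matrix: rows are recorded grasps, columns are
# joint angles, built to lie near a 3-dimensional synergy subspace.
n_grasps, n_joints, n_syn = 200, 15, 3
W = rng.normal(size=(n_syn, n_joints))        # hypothetical synergy basis
scores = rng.normal(size=(n_grasps, n_syn))
X = scores @ W + 0.01 * rng.normal(size=(n_grasps, n_joints))

# PCA via SVD of the centered data; the leading right singular vectors
# play the role of the postural synergies.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = (S ** 2) / (S ** 2).sum()
coverage = explained[:n_syn].sum()   # variance captured by 3 synergies
```

When real grasp data has this structure, keeping only the first few synergies gives exactly the reduced representation the thesis exploits for both sensing and display design.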
A virtual hand assessment system for efficient outcome measures of hand rehabilitation
Previously held under moratorium from 1st December 2016 until 1st December 2021.
Hand rehabilitation is an extremely complex and critical process in the medical rehabilitation field, mainly due to the high articulation of hand functionality. Recent research has focused on employing new technologies, such as robotics and system control, to improve the precision and efficiency of the standard clinical methods used in hand rehabilitation. However, the designs of these devices were either oriented toward a particular hand injury or heavily dependent on subjective assessment techniques to evaluate progress. These limitations reduce the efficiency of hand rehabilitation devices, providing less effective results for restoring the lost functionalities of dysfunctional hands. In this project, a novel and efficient hand assessment system is produced that can objectively measure the restoration outcome and dynamically evaluate its performance. The proposed system uses a data glove to measure the multiple ranges of motion of the hand joints, and a Virtual Reality system to provide an illustrative and safe visual assistance environment that can self-adjust to the subject's performance. The system implements an original finger performance measurement method for analysing the various hand functionalities, achieved by extracting multiple features of the hand digits' motions, such as speed, consistency of finger movements, and stability during hold positions. Furthermore, an advanced data glove calibration method was developed and implemented in order to accurately manipulate the virtual hand model and calculate the hand kinematic movements in compliance with the biomechanical structure of the hand. The experimental studies were performed on a controlled group of 10 healthy subjects (25 to 42 years of age).
The results showed intra-subject reliability between trials (average cross-correlation ρ = 0.7) and inter-subject repeatability across the subjects' performance (p > 0.01 for the session with real objects, with few departures in some of the virtual reality sessions). In addition, the finger performance values were found to be very efficient in detecting the multiple elements of the fingers' performance, including the load effect on the forearm. Moreover, the electromyography measurements in the virtual reality sessions showed high sensitivity in detecting the tremor effect (a mean power frequency difference of 176 Hz on the right extensor digitorum muscle). Also, the finger performance values for the virtual reality sessions have the same average distance as the real-life sessions (RSQ = 0.07). Besides offering an efficient and quantitative evaluation of hand performance, the system was proven compatible with different hand rehabilitation techniques, where it can outline the primarily affected parts of the hand dysfunction. It can also be easily adjusted to comply with the subject's specifications and clinical hand assessment procedures, to autonomously detect the classification task events and analyse them with high reliability. The developed system is also adaptable to disciplines other than hand rehabilitation, such as ergonomic studies, hand robot control, brain-computer interfaces, and other fields involving hand control.
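The intra-subject reliability figure reported above (cross-correlation ρ = 0.7) can be illustrated with a plain Pearson correlation between two repetitions of a trial. The signals below are synthetic stand-ins for recorded finger trajectories, not the study's data:

```python
import numpy as np

def trial_reliability(trial_a, trial_b):
    """Pearson correlation between two repetitions of a finger-motion
    trial, used here as a toy intra-subject reliability metric."""
    a = trial_a - trial_a.mean()
    b = trial_b - trial_b.mean()
    return float((a @ b) / np.sqrt((a @ a) * (b @ b)))

# Synthetic joint-angle trajectories for two repetitions of one trial:
# the second repeat adds a small low-frequency deviation.
t = np.linspace(0.0, 1.0, 100)
trial_1 = np.sin(2 * np.pi * t)
trial_2 = np.sin(2 * np.pi * t) + 0.1 * np.cos(7.0 * t)
rho = trial_reliability(trial_1, trial_2)
```

Two near-identical repetitions yield a ρ close to 1; larger trial-to-trial variation pulls it down toward values like the 0.7 average reported.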
Design and Evaluation of a Contact-Free Interface for Minimally Invasive Robotics Assisted Surgery
Robotic-assisted minimally invasive surgery (RAMIS) is becoming increasingly common for many surgical procedures. These minimally invasive techniques offer the benefit of reduced patient recovery time, mortality and scarring compared to traditional open surgery. Teleoperated procedures have the added advantages of increased visualization and enhanced accuracy for the surgeon, through tremor filtering and the scaling down of hand motions. There are, however, still limitations in these techniques preventing the widespread growth of the technology. In RAMIS, the surgeon is limited in their movement by the operating console or master device, and the cost of robotic surgery is often too high to justify for many procedures. Sterility issues arise as well, as the surgeon must be in contact with the master device, preventing a smooth transition between traditional and robotic modes of surgery.
This thesis outlines the design and analysis of a novel method of interaction with the da Vinci Surgical Robot. Using the da Vinci Research Kit (DVRK), an open-source research platform for the da Vinci robot, an interface was developed for controlling the robotic arms with the Leap Motion Controller. This small device uses infrared LEDs and two cameras to detect the 3D positions of the hand and fingers. This hand data is mapped to the da Vinci surgical tools in real time, providing the surgeon with an intuitive method of controlling the instruments. An analysis of the tracking workspace is provided to give a solution to occlusion issues; multiple sensors are fused together in order to increase the range of trackable motion over a single sensor. Additional work involves replacing the current viewing screen with a virtual reality (VR) headset (Oculus Rift), to provide the surgeon with a stereoscopic 3D view of the surgical site without the need for a large monitor. The headset also provides the user with a more intuitive and natural method of positioning the camera during surgery, using the natural motions of the head. The large master console of the da Vinci system has been replaced with an inexpensive vision-based tracking system and a VR headset, allowing the surgeon to operate the da Vinci Surgical Robot with more natural movements. A preliminary evaluation of the system is provided, with recommendations for future work.
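Mapping tracked hand positions to tool positions with motion scaling and tremor filtering, as described above, can be sketched with an exponential moving average as a stand-in smoother. The scale factor and filter constant below are arbitrary illustrative values, not the thesis's parameters or the DVRK's actual teleoperation pipeline:

```python
import numpy as np

def map_hand_to_tool(hand_positions, scale=0.2, alpha=0.2):
    """Map a sequence of tracked 3D hand positions to tool positions:
    smooth with an exponential moving average (a simple stand-in for
    tremor filtering), then scale the motion down."""
    hand_positions = np.asarray(hand_positions, dtype=float)
    tool = np.empty_like(hand_positions)
    filtered = hand_positions[0].copy()
    tool[0] = scale * filtered
    for i in range(1, len(hand_positions)):
        filtered = alpha * hand_positions[i] + (1 - alpha) * filtered
        tool[i] = scale * filtered
    return tool
```

A steady hand maps to a steady, scaled tool position, while high-frequency jitter is attenuated by the filter before the scaling is applied.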