    Robust Hand Motion Capture and Physics-Based Control for Grasping in Real Time

    Hand motion capture technologies are being explored to meet high demand in fields such as video games, virtual reality, sign language recognition, human-computer interaction, and robotics. However, existing systems suffer from several limitations: they are high-cost (expensive capture devices), intrusive (additional wearable sensors or complex configurations), and restrictive (limited motion variety and restricted capture space). This dissertation focuses on algorithms and applications for a hand motion capture system that is low-cost, non-intrusive, low-restriction, high-accuracy, and robust. More specifically, we develop a real-time, fully automatic hand tracking system using a low-cost depth camera. We first introduce an efficient shape-indexed cascaded pose regressor that directly estimates 3D hand poses from depth images. A unique property of our hand pose regressor is that it utilizes a low-dimensional parametric hand geometric model to learn 3D shape-indexed features robust to variations in hand shape, viewpoint, and pose. We further introduce a hybrid tracking scheme that effectively complements our hand pose regressor with model-based hand tracking. In addition, we develop a rapid 3D hand shape modeling method that uses a small number of depth images to accurately construct a subject-specific skinned mesh model for hand tracking. This step not only automates the whole tracking system but also improves the robustness and accuracy of model-based tracking and hand pose regression. We also propose a physically realistic human grasping synthesis method capable of grasping a wide variety of objects. Given an object to be grasped, our method computes the required controls (e.g., forces and torques) that advance the simulation to achieve realistic grasping. Our method combines the power of data-driven synthesis and physics-based grasping control: we first introduce a data-driven method that synthesizes a realistic grasping motion from large sets of prerecorded grasping motion data, and then transform the synthesized kinematic motion into a physically realistic one using our online physics-based motion control method. In addition, we provide a performance interface that allows the user to act out grasps in front of a depth camera to control a virtual object.
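
    As an illustration of the cascaded pose regression idea described above, the following Python sketch refines a pose estimate stage by stage from features sampled at pose-indexed locations in a depth image. The stage count, pose dimension, feature sampling, and linear stage regressors are illustrative assumptions standing in for quantities that would be learned from annotated data; this is not the dissertation's actual implementation.

        import numpy as np

        # Illustrative sketch of a shape-indexed cascaded pose regressor: starting
        # from a mean pose, each stage samples depth features at locations indexed
        # by the current pose estimate and predicts a pose update. All parameters
        # below are placeholders for quantities learned offline from training data.

        NUM_STAGES = 6
        POSE_DIM = 26       # global transform + joint angles (assumed layout)
        NUM_FEATURES = 400  # pose-indexed depth features per stage (assumed)

        def sample_shape_indexed_features(depth_image, pose, offsets):
            """Sample depth values at offsets attached to the current pose estimate.

            In the full method the offsets live on a low-dimensional parametric
            hand model, which makes them robust to hand shape and viewpoint;
            here they are simply shifted by the pose's 2D translation as a toy.
            """
            h, w = depth_image.shape
            centers = (pose[:2] + offsets) % [w, h]
            u, v = centers[:, 0].astype(int), centers[:, 1].astype(int)
            return depth_image[v, u]

        def cascaded_regression(depth_image, mean_pose, stages):
            pose = mean_pose.copy()
            for weights, offsets in stages:       # one (W, offsets) pair per stage
                feats = sample_shape_indexed_features(depth_image, pose, offsets)
                pose = pose + weights @ feats     # linear stage regressor
            return pose

        # Toy usage with random placeholders standing in for learned parameters.
        rng = np.random.default_rng(0)
        depth = rng.uniform(0.3, 1.0, size=(240, 320))
        stages = [(rng.normal(scale=1e-3, size=(POSE_DIM, NUM_FEATURES)),
                   rng.uniform(-20, 20, size=(NUM_FEATURES, 2)))
                  for _ in range(NUM_STAGES)]
        estimated_pose = cascaded_regression(depth, np.zeros(POSE_DIM), stages)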

    Simplified Hand Configuration for Object Manipulation

    This work focuses on obtaining realistic human hand models suitable for manipulation tasks. First, a 24 DOF kinematic model of the human hand is defined, based on the human skeleton. Intra-finger and inter-finger constraints are included to improve the realism of the movements. Second, two simplified hand descriptions (9 and 6 DOF) are developed according to the predefined constraints. These simplified models introduce some error when reconstructing the hand posture; these errors are computed with respect to the 24 DOF model and evaluated across hand gestures. Finally, criteria are defined for selecting the hand description best suited to the features of the manipulation task.
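
    A minimal sketch of how a reduced finger description can be expanded back to full joint angles through a coupling constraint, and how the reconstruction error against the full model can be measured. The DIP = 2/3 * PIP coupling and the 4-DOF-per-finger layout are common assumptions used here for illustration, not the specific constraints or simplifications of the paper.

        import numpy as np

        # Illustrative sketch: expand a reduced finger description to its full
        # joint angles using an assumed intra-finger coupling constraint, then
        # score the reconstruction against the full reference posture.

        def expand_finger(mcp_abduction, mcp_flexion, pip_flexion):
            """Map 3 reduced parameters to the 4 DOF of a single finger."""
            dip_flexion = (2.0 / 3.0) * pip_flexion   # assumed coupling constraint
            return np.array([mcp_abduction, mcp_flexion, pip_flexion, dip_flexion])

        def reconstruction_error(full_pose, reconstructed_pose):
            """RMS joint-angle error of a reduced model against the full reference."""
            return float(np.sqrt(np.mean((full_pose - reconstructed_pose) ** 2)))

        # Toy usage for one finger, angles in radians.
        reference = np.array([0.05, 0.8, 0.9, 0.6])   # "ground-truth" 4 DOF posture
        reconstructed = expand_finger(0.05, 0.8, 0.9)
        print(reconstruction_error(reference, reconstructed))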

    A Biomechanical Model for the Development of Myoelectric Hand Prosthesis Control Systems

    Advanced myoelectric hand prostheses aim to reproduce as much of the human hand's functionality as possible. Development of the control system of such a prosthesis is strongly tied to its mechanical design; the control system requires accurate information on the prosthesis' structure and the surrounding environment, which can make development difficult without a finalized mechanical prototype. This paper presents a new framework for the development of electromyographic hand control systems, consisting of a prosthesis model based on the biomechanical structure of the human hand. The model's dynamic structure uses an ellipsoidal representation of the phalanges. Other features include underactuation in the fingers and thumb, modeled with bond graphs, and a viscoelastic contact model. The model's functions are demonstrated through the execution of lateral and tripod grasps and evaluated with regard to joint dynamics and applied forces. Finally, extensions are suggested through which the model could also support mechanical design and patient training.
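
    The viscoelastic contact model mentioned above can be illustrated with a generic spring-damper (Kelvin-Voigt) normal force; the paper's actual formulation, parameter values, and ellipsoidal phalanx geometry are not reproduced here.

        # Illustrative spring-damper (Kelvin-Voigt) normal contact force, a generic
        # form of viscoelastic contact; the stiffness and damping values below are
        # arbitrary placeholders.

        def viscoelastic_contact_force(penetration, penetration_rate,
                                       stiffness=2000.0, damping=5.0):
            """Normal force for a given penetration depth (m) and rate (m/s)."""
            if penetration <= 0.0:          # bodies are not in contact
                return 0.0
            force = stiffness * penetration + damping * penetration_rate
            return max(force, 0.0)          # contact can push but never pull

        # Toy usage: 1 mm penetration, surfaces approaching at 10 mm/s.
        print(viscoelastic_contact_force(0.001, 0.01))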

    Improving grasping forces during the manipulation of unknown objects

    Many of the solutions proposed for the object manipulation problem rely on knowledge of the object's features. The approach proposed in this paper provides a simple geometrical method for securely manipulating an unknown object based only on tactile and kinematic information. The tactile and kinematic data obtained during manipulation are used to recognize the object's shape (at least its local curvature), allowing the grasping forces to be improved when this information is added to the manipulation strategy. The approach has been fully implemented and tested using the Schunk Dexterous Hand (SDH2). Experimental results are presented to illustrate its efficiency.
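
    To illustrate the kind of geometric reasoning described, the sketch below estimates the local curvature of an object from a few contact points via a least-squares circle fit and scales the grasping force accordingly. The circle fit and the force-adjustment rule are illustrative assumptions, not the strategy implemented on the SDH2.

        import numpy as np

        # Illustrative curvature-aware grasping: fit a circle to 2D contact points
        # to estimate local curvature, then heuristically scale the grasp force.

        def fit_local_curvature(points):
            """Algebraic circle fit to 2D contact points; returns 1 / radius."""
            x, y = points[:, 0], points[:, 1]
            A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
            b = x ** 2 + y ** 2
            cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
            return 1.0 / np.sqrt(c + cx ** 2 + cy ** 2)

        def grasp_force(curvature, base_force=2.0, gain=0.05):
            """Heuristic: higher local curvature -> slightly larger normal force."""
            return base_force * (1.0 + gain * curvature)

        # Toy usage: contact points sampled from a cylinder of radius 0.04 m.
        theta = np.linspace(0.2, 0.8, 5)
        contacts = 0.04 * np.column_stack([np.cos(theta), np.sin(theta)])
        print(grasp_force(fit_local_curvature(contacts)))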

    Methods and Tools for Objective Assessment of Psychomotor Skills in Laparoscopic Surgery

    Training and assessment paradigms for laparoscopic surgical skills are evolving from traditional mentor–trainee tutorship towards structured, more objective, and safer programs. Accreditation of surgeons requires reaching a consensus on the metrics and tasks used to assess their psychomotor skills. Ongoing development of tracking systems and software solutions has allowed novel training and assessment means in laparoscopy to expand. The current challenge is to adapt and include these systems within training programs, and to exploit their possibilities for evaluation purposes. This paper describes the state of the art in research on measuring and assessing psychomotor laparoscopic skills. It gives an overview of tracking systems as well as of the metrics and the advanced statistical and machine learning techniques employed for evaluation purposes. The latter have the potential to be used as an aid in deciding on surgical competence level, an important aspect for the accreditation of surgeons in particular and patient safety in general. The prospects of these methods and tools make them complementary means for the assessment of surgical motor skills, especially in the early stages of training. Successful examples such as the Fundamentals of Laparoscopic Surgery should help drive a paradigm change towards structured curricula based on objective parameters, which may improve the accreditation of new surgeons as well as optimize their already overloaded training schedules.
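
    As an example of the motion-analysis parameters such tracking systems compute from instrument-tip trajectories, the sketch below evaluates path length and a jerk-based smoothness measure, both commonly used in objective laparoscopic skill assessment. They are generic examples rather than the specific metric set or classifiers discussed in the paper.

        import numpy as np

        # Illustrative motion-analysis parameters for objective skill assessment:
        # total path length and a jerk-based smoothness measure computed from
        # sampled instrument-tip positions.

        def path_length(positions):
            """Total distance travelled by the instrument tip."""
            return float(np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1)))

        def mean_squared_jerk(positions, dt):
            """Average squared jerk; lower values indicate smoother motion."""
            jerk = np.diff(positions, n=3, axis=0) / dt ** 3
            return float(np.mean(np.sum(jerk ** 2, axis=1)))

        # Toy usage: a 5-second trajectory sampled at 30 Hz.
        t = np.linspace(0.0, 5.0, 150)
        trajectory = np.column_stack([np.sin(t), np.cos(t), 0.1 * t])
        print(path_length(trajectory), mean_squared_jerk(trajectory, t[1] - t[0]))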