
    Planning dextrous robot hand grasps from range data, using preshapes and digit trajectories

    Dextrous robot hands have many degrees of freedom. This enables the manipulation of objects between the digits of the hand but makes grasp planning substantially more complex than for parallel-jaw grippers. Much of the work that addresses grasp planning for dextrous hands concentrates on the selection of contact sites to optimise stability criteria and ignores the kinematics of the hand. In more complete systems, the paradigm of preshaping has emerged as dominant. However, the criteria for the formation and placement of the preshapes have not been adequately examined, and the usefulness of these systems is therefore limited to grasping simple objects for which preshapes can be formed using coarse heuristics.

    In this thesis a grasp metric based on stability and kinematic feasibility is introduced. The preshaping paradigm is extended to include consideration of the trajectories that the digits take during closure from preshape to final grasp. The resulting grasp family is dependent upon task requirements and is designed for a set of "ideal" object-hand configurations. The grasp family couples the degrees of freedom of the dextrous hand in an anthropomorphic manner; the resulting reduction in freedom makes grasp planning less complex. Grasp families are fitted to real objects by optimisation of the grasp metric; this corresponds to bringing the real object-hand configuration as close to the ideal as possible. First, the preshape aperture, which defines the positions of the fingertips in the preshape, is found by optimisation of an approximation to the grasp metric (which makes simplifying assumptions about the digit trajectories and hand kinematics). Second, the full preshape kinematics and digit closure trajectories are calculated to optimise the full grasp metric.

    Grasps are planned on object models built from laser striper range data from two viewpoints. A surface description of the object is used to prune the space of possible contact sites and to allow the accurate estimation of normals, which the grasp metric requires in order to estimate the amount of friction needed. A voxel description, built by ray-casting, is used to check for collisions between the object and the robot hand using an approximation to the Euclidean distance transform.

    Results are shown in simulation for a 3-digit hand model, designed to resemble a simplified human hand in terms of its size and functionality. The method extends naturally to any dextrous hand with a single thumb opposing multiple fingers, and several different hand models that could be used are described. Grasps are planned on a wide variety of curved and polyhedral objects.
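    The collision-checking step described above lends itself to a compact illustration. The sketch below is not the thesis code; it assumes a voxel grid built from object surface points and uses SciPy's chamfer distance transform as a stand-in for the "approximation to the Euclidean distance transform", showing how free-space clearance can be precomputed once and then queried for sampled points on the hand model.

```python
# Hypothetical sketch (not the thesis implementation): collision checking between
# a hand model and a voxelised object using a chamfer (approximate Euclidean)
# distance transform over the free space.
import numpy as np
from scipy.ndimage import distance_transform_cdt

def build_voxel_grid(points, origin, voxel_size, shape):
    """Mark voxels occupied by object surface points (e.g. from ray-cast range data)."""
    grid = np.zeros(shape, dtype=bool)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    valid = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
    grid[tuple(idx[valid].T)] = True
    return grid

def clearance_map(occupancy, voxel_size):
    """Approximate distance (in metres) from each free voxel to the nearest occupied voxel."""
    return distance_transform_cdt(~occupancy, metric='chessboard') * voxel_size

def hand_collides(hand_points, clearance, origin, voxel_size, min_clearance=0.005):
    """True if any sampled hand point comes closer than min_clearance (illustrative threshold)."""
    idx = np.floor((hand_points - origin) / voxel_size).astype(int)
    idx = np.clip(idx, 0, np.array(clearance.shape) - 1)
    return np.any(clearance[tuple(idx.T)] < min_clearance)
```

    Precomputing the clearance map makes each candidate preshape or closure trajectory cheap to test, since every collision query reduces to a table lookup per sampled hand point.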

    Visual perception and grasping for the extravehicular activity robot

    The development of an approach to the visual perception of object surface information using laser range data in support of robotic grasping is discussed. This is a very important problem area, in that a robot such as the EVAR must be able to formulate a grasping strategy on the basis of its knowledge of the surface structure of the object. A description of the problem domain is given, as well as a formulation of an algorithm which derives an object surface description adequate to support robotic grasping. The algorithm is based upon concepts of differential geometry, namely Gaussian and mean curvature.
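    As a rough illustration of the differential-geometric quantities mentioned above (not the authors' code), the sketch below computes Gaussian and mean curvature for range data resampled onto a regular grid z(x, y), treating the surface as a Monge patch; the grid spacings dx and dy are assumed inputs.

```python
# Illustrative sketch: Gaussian (K) and mean (H) curvature of a range image
# z(x, y) treated as a Monge patch, assuming a regular grid with spacings dx, dy.
import numpy as np

def surface_curvatures(z, dx=1.0, dy=1.0):
    zy, zx = np.gradient(z, dy, dx)        # axis 0 is y, axis 1 is x
    zxy, zxx = np.gradient(zx, dy, dx)     # second derivatives of z
    zyy, _ = np.gradient(zy, dy, dx)
    denom = 1.0 + zx**2 + zy**2
    K = (zxx * zyy - zxy**2) / denom**2
    H = ((1 + zx**2) * zyy - 2 * zx * zy * zxy + (1 + zy**2) * zxx) / (2 * denom**1.5)
    return K, H
```

    The signs of K and H then classify each surface point (elliptic, hyperbolic, parabolic, planar), which is the kind of surface description a grasp planner can exploit.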

    The SmartHand transradial prosthesis

    Background: Prosthetic components and control interfaces for upper limb amputees have barely changed in the past 40 years. Many transradial prostheses have been developed, but most of them would be inadequate if and when a large-bandwidth human-machine interface for control and perception becomes available, owing to their limited (or nonexistent) sensorization or limited dexterity. SmartHand tackles this issue: it is meant to be clinically tested on amputees using different neuro-interfaces in order to investigate their effectiveness. This paper presents the design and bench evaluation of the SmartHand.
    Methods: The SmartHand design is bio-inspired in terms of its physical appearance, kinematics, sensorization, and multilevel control system. Underactuated fingers and differential mechanisms were designed and exploited in order to fit all mechatronic components within the size and weight of a natural human hand. Its sensory system was designed with the aim of delivering significant afferent information to the user through adequate interfaces.
    Results: SmartHand is a five-fingered, self-contained robotic hand with 16 degrees of freedom actuated by 4 motors. It integrates a bio-inspired sensory system composed of 40 proprioceptive and exteroceptive sensors and a customized embedded controller, both employed for implementing automatic grasp control and for potentially delivering sensory feedback to the amputee. It is able to perform everyday grasps, count, and independently point the index finger. Its weight (530 g) and speed (closing time: 1.5 seconds) are comparable to current commercial prostheses. It is able to lift a 10 kg suitcase, and slippage tests showed that under particular friction and geometric conditions the hand can stably grasp cylindrical objects of up to 3.6 kg.
    Conclusions: Due to its unique embedded features and human-like size, the SmartHand holds promise to be experimentally fitted on transradial amputees and employed as a bi-directional instrument for investigating, during realistic experiments, different interfaces, control and feedback strategies in neuro-engineering studies.
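    As a purely hypothetical sketch of the automatic grasp control mentioned in the Results (the SmartHand firmware is not reproduced here, and the `hand` interface below is invented for illustration), a slip-triggered grip loop might look like this:

```python
# Hypothetical illustration only (not the SmartHand controller): a minimal
# automatic-grasp loop that tightens the grip whenever the slip sensors fire.
def automatic_grasp_loop(hand, f_init=1.0, f_step=0.5, f_max=20.0):
    """hand is an assumed interface exposing read_slip_sensors(), set_grip_force(), holding()."""
    force = f_init
    hand.set_grip_force(force)
    while hand.holding():
        if hand.read_slip_sensors():            # any tactile channel reports micro-slip
            force = min(force + f_step, f_max)  # increment grip force, capped at f_max
            hand.set_grip_force(force)
```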

    Autonomous vision-guided bi-manual grasping and manipulation

    This paper describes the implementation, demonstration and evaluation of a variety of autonomous, vision-guided manipulation capabilities, using a dual-arm Baxter robot. Initially, symmetric coordinated bi-manual manipulation based on a kinematic tracking algorithm was implemented on the robot to enable a master-slave manipulation system. We demonstrate the efficacy of this approach with a human-robot collaboration experiment in which a human operator moves the master arm along arbitrary trajectories and the slave arm automatically follows it while maintaining a constant relative pose between the two end-effectors. This concept was then extended to perform dual-arm manipulation without human intervention. To this end, an image-based visual servoing scheme was developed to control the motion of the arms and position them at the desired grasp locations. We then combine this with a dynamic position controller to move the grasped object with both arms along a prescribed trajectory. The presented approach has been validated by performing numerous symmetric and asymmetric bi-manual manipulations under different conditions. Our experiments demonstrated an 80% success rate in the symmetric dual-arm manipulation tasks and a 73% success rate in the asymmetric dual-arm manipulation tasks.
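    For readers unfamiliar with image-based visual servoing, the textbook control law (camera twist v = -λ L⁺ (s - s*)) is sketched below for point features in normalised image coordinates; this is the generic scheme, not necessarily the exact controller or feature set used in the paper.

```python
# Generic IBVS step: stack per-feature interaction matrices and compute the
# 6-DOF camera velocity that drives the feature error to zero.
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix for one normalised point feature (x, y) at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1 + y**2, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, lam=0.5):
    """features, desired: lists of (x, y); depths: list of Z estimates; returns camera twist."""
    L = np.vstack([interaction_matrix(x, y, Z) for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).ravel()
    return -lam * np.linalg.pinv(L) @ error
```

    In a bi-manual setting, one common arrangement is for each arm to run its own servo loop on the feature error seen by its camera until the desired grasp location is reached; the paper's exact camera configuration is not assumed here.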

    A Continuous Grasp Representation for the Imitation Learning of Grasps on Humanoid Robots

    Models and methods are presented which enable a humanoid robot to learn reusable, adaptive grasping skills. Mechanisms and principles in human grasp behavior are studied, and the findings are used to develop a grasp representation capable of retaining specific motion characteristics and of adapting to different objects and tasks. Based on this representation, a framework is proposed which enables the robot to observe human grasping, learn grasp representations, and infer executable grasping actions.

    Robotic Grasping: A Generic Neural Network Architecture


    Synergy-Based Human Grasp Representations and Semi-Autonomous Control of Prosthetic Hands

    Secure and stable grasping with humanoid robot hands is a major challenge. This dissertation therefore addresses the derivation of grasping strategies for robot hands from observations of human grasping, with a focus on the entire grasping process. This comprises, on the one hand, the hand and finger trajectories during the grasping process and, on the other hand, the contact points and the force profile between hand and object from first contact to a statically stable grasp. Nonlinear postural synergies and force synergies of human grasps are presented which allow the generation of human-like grasp postures and grasp forces. Furthermore, synergy primitives are developed as an adaptable representation of human grasping motions. The described grasping strategies, learned from humans, are applied to the control of robotic prosthetic hands. Within a semi-autonomous control scheme, human-like grasping motions are proposed as appropriate to the situation and supervised by the prosthesis user.
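    A heavily simplified, linear stand-in for the postural synergies described above (the dissertation itself uses nonlinear synergies) can be sketched with PCA over recorded joint-angle vectors; the function names below are illustrative only.

```python
# Illustrative only: linear postural synergies via PCA on recorded grasp postures.
import numpy as np

def postural_synergies(joint_angles, n_synergies=3):
    """joint_angles: (n_grasps, n_joints) matrix of recorded hand postures."""
    mean = joint_angles.mean(axis=0)
    _, _, vt = np.linalg.svd(joint_angles - mean, full_matrices=False)
    return mean, vt[:n_synergies]            # mean posture and synergy basis

def synthesize_posture(mean, synergies, weights):
    """Generate a hand posture from a few synergy activation weights."""
    return mean + np.asarray(weights) @ synergies
```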

    Structured manifolds for motion production and segmentation : a structured Kernel Regression approach

    Steffen JF. Structured manifolds for motion production and segmentation: a structured Kernel Regression approach. Bielefeld (Germany): Bielefeld University; 2010.

    An intelligent, free-flying robot

    The ground-based demonstration of the extensive extravehicular activity (EVA) Retriever, a voice-supervised, intelligent, free-flying robot, is designed to evaluate the capability to retrieve objects (astronauts, equipment, and tools) which have accidentally separated from the Space Station. The major objective of the EVA Retriever Project is to design, develop, and evaluate an integrated robotic hardware and on-board software system which autonomously: (1) performs system activation and check-out; (2) searches for and acquires the target; (3) plans and executes a rendezvous while continuously tracking the target; (4) avoids stationary and moving obstacles; (5) reaches for and grapples the target; (6) returns to transfer the object; and (7) returns to base.
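    As a purely illustrative sketch (not NASA flight software), the seven autonomous capabilities listed above can be read as a mission sequence; in the real system, target tracking and obstacle avoidance run continuously rather than as discrete steps, so the sequencing below is a simplification.

```python
# Illustrative only: sequencing the seven capabilities as mission phases.
# In practice, tracking and obstacle avoidance run concurrently with rendezvous.
from enum import Enum, auto

class Phase(Enum):
    CHECKOUT = auto()
    SEARCH = auto()
    RENDEZVOUS = auto()       # includes continuous target tracking
    AVOID_OBSTACLES = auto()  # in reality interleaved with rendezvous
    GRAPPLE = auto()
    TRANSFER = auto()
    RETURN_TO_BASE = auto()

def run_mission(execute_phase):
    """execute_phase(phase) -> bool; stop at the first phase that fails."""
    for phase in Phase:
        if not execute_phase(phase):
            return False
    return True
```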