3 research outputs found
Hand-eye coordination for grasping moving objects
Most robotic grasping tasks assume a stationary or fixed object. In this paper, we explore the requirements for grasping a moving object. This task requires proper coordination between at least three separate subsystems: dynamic vision sensing, real-time arm control, and grasp control. As with humans, our system first visually tracks the object's 3-D position. Because the object is in motion, this must be done in a dynamic manner to coordinate the motion of the robotic arm as it tracks the object. The dynamic vision system is used to feed a real-time arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: 1) filtering and prediction, and 2) kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. We present three different strategies for intercepting the object, along with results from the tracking algorithm.
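The filtering-and-prediction step described above can be sketched with a simple alpha-beta filter: smooth the noisy vision measurements of the object's 3-D position and maintain a velocity estimate that lets the arm controller predict ahead. This is only an illustrative sketch under a constant-velocity assumption; the gains, sample rate, and trajectory here are hypothetical, not the paper's actual parameters.

```python
import numpy as np

def alpha_beta_step(pos_est, vel_est, measurement, dt, alpha=0.85, beta=0.6):
    """One update of an alpha-beta filter on a 3-D position track.

    Predict the object's position one sample ahead from the previous
    estimate, then correct position and velocity with the new vision
    measurement. The smoothed state can be extrapolated to feed the
    arm controller.
    """
    # Predict forward one sample interval.
    pos_pred = pos_est + vel_est * dt
    # Innovation: how far the measurement deviates from the prediction.
    residual = measurement - pos_pred
    # Correct position and velocity with the filter gains.
    pos_new = pos_pred + alpha * residual
    vel_new = vel_est + (beta / dt) * residual
    return pos_new, vel_new

# Track a point moving at constant velocity (1, 0, 0.5) m/s, sampled at 60 Hz.
dt = 1.0 / 60.0
pos, vel = np.zeros(3), np.zeros(3)
for k in range(1, 200):
    true_pos = np.array([1.0, 0.0, 0.5]) * k * dt
    pos, vel = alpha_beta_step(pos, vel, true_pos, dt)
```

With roughly critically damped gains the velocity estimate converges to the true object velocity within a few dozen samples, after which the predicted position leads the measurements with negligible lag.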
Automated tracking and grasping of a moving object with a robotic hand-eye system
An attempt to achieve a high level of interaction between a real-time vision system capable of tracking moving objects in 3-D and a robot arm with gripper that can be used to pick up a moving object is described. The interplay of hand-eye coordination in dynamic grasping tasks, such as grasping parts on a moving conveyor, assembling articulated parts, or grasping from a mobile robotic system, is explored. The goal is to build an integrated sensing and actuation system that can operate in dynamic as opposed to static environments. The resulting system addresses three distinct problems in using robotic hand-eye coordination for grasping moving objects: fast computation of 3-D motion parameters from vision, predictive control of a moving robotic arm to track a moving object, and interception and grasping. The system operates at approximately human arm movement rates. Experimental results are presented in which a moving model train is tracked, stably grasped, and picked up by the system. The algorithms developed to relate sensing to actuation are quite general and applicable to a variety of complex robotic tasks.
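The interception problem mentioned above can be illustrated with a small geometric model: given the filter's constant-velocity estimate of the object and a bound on how fast the gripper can travel, solve for the earliest meeting time. This is a hypothetical interception model for illustration, not the paper's actual controller, and `arm_speed` is an assumed scalar speed bound rather than a real arm dynamics model.

```python
import numpy as np

def intercept_time(p_obj, v_obj, p_arm, arm_speed):
    """Earliest time at which a gripper moving at constant speed can
    meet an object moving with constant velocity.

    Solves |p_obj + v_obj*t - p_arm| = arm_speed * t for the smallest
    positive t; returns None if interception is impossible.
    """
    d = p_obj - p_arm
    # Quadratic in t: (|v|^2 - s^2) t^2 + 2 (d.v) t + |d|^2 = 0.
    a = v_obj @ v_obj - arm_speed ** 2
    b = 2.0 * (d @ v_obj)
    c = d @ d
    if abs(a) < 1e-12:                 # object and arm speeds are equal
        return -c / b if b < 0 else None
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    roots = sorted([(-b - np.sqrt(disc)) / (2 * a),
                    (-b + np.sqrt(disc)) / (2 * a)])
    times = [t for t in roots if t > 0]
    return times[0] if times else None

# Object 2 m away, crossing at 1 m/s; gripper can move at 2 m/s.
p_obj, v_obj = np.array([2.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
t = intercept_time(p_obj, v_obj, np.zeros(3), arm_speed=2.0)
intercept_point = p_obj + v_obj * t
```

The arm is then commanded toward `intercept_point`, with the filter re-running the computation as new vision measurements arrive.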
Trajectory filtering and prediction for automated tracking and grasping of a moving object
The authors explore the requirements for grasping a moving object. This task requires proper coordination between at least three separate subsystems: real-time vision sensing, trajectory-planning/arm-control, and grasp planning. As with humans, the system first visually tracks the object's 3D position. Because the object is in motion, this must be done in real-time to coordinate the motion of the robotic arm as it tracks the object. The vision system is used to feed an arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: filtering and prediction, and kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. Experimental results are presented in which a moving model train was tracked, stably grasped, and picked up by the system.
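The second step of the arm control algorithm, kinematic transformation computation, maps the predicted Cartesian target into joint angles. As a minimal sketch, here is the closed-form inverse kinematics of a planar two-link arm; a real 6-DOF manipulator would need its full kinematic solution, so the link lengths and planar geometry here are illustrative assumptions only.

```python
import numpy as np

def ik_2link(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm.

    Returns (shoulder, elbow) joint angles in radians for one elbow
    configuration, or raises ValueError if (x, y) is out of reach.
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = np.arccos(c2)
    # Shoulder angle: target bearing minus the wedge made by link 2.
    theta1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(theta2),
                                           l1 + l2 * np.cos(theta2))
    return theta1, theta2

# Verify by forward kinematics: two unit links reaching (1, 1).
t1, t2 = ik_2link(1.0, 1.0, 1.0, 1.0)
fx = np.cos(t1) + np.cos(t1 + t2)
fy = np.sin(t1) + np.sin(t1 + t2)
```

In a tracking loop this transformation is recomputed every control cycle on the filter's predicted target, so the joint setpoints continuously lead the moving object.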