150 research outputs found
Real-time visual servoing
A real-time tracking algorithm, used in conjunction with a predictive filter to allow real-time visual servoing of a robotic arm tracking a moving object, is described. The system consists of two calibrated (but unregistered) cameras that provide images to a real-time, pipeline-parallel optic-flow algorithm that can robustly compute optic flow and calculate the 3-D position of a moving object at approximately 5-Hz rates. These 3-D positions of the moving object serve as input to a predictive kinematic control algorithm that uses an α-β-γ filter to update the position of a robotic arm tracking the moving object. Experimental results are presented for the tracking of a moving model train in a variety of different trajectories.
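The α-β-γ filter mentioned in the abstract smooths noisy position fixes while estimating velocity and acceleration, which is what enables prediction ahead of the ~5 Hz measurement rate. A minimal single-axis sketch is below; the constant-acceleration motion model and the gain values are illustrative assumptions, not the paper's actual tuning.

```python
class AlphaBetaGammaFilter:
    """Tracks position, velocity, and acceleration from noisy position fixes."""

    def __init__(self, alpha, beta, gamma, dt):
        self.alpha, self.beta, self.gamma, self.dt = alpha, beta, gamma, dt
        self.x = self.v = self.a = 0.0

    def update(self, z):
        dt = self.dt
        # Predict one step ahead under a constant-acceleration model.
        x_pred = self.x + self.v * dt + 0.5 * self.a * dt * dt
        v_pred = self.v + self.a * dt
        # Correct each state with a gain-weighted measurement residual.
        r = z - x_pred
        self.x = x_pred + self.alpha * r
        self.v = v_pred + (self.beta / dt) * r
        self.a = self.a + (2.0 * self.gamma / (dt * dt)) * r
        return self.x, self.v, self.a

    def predict(self, horizon):
        # Extrapolate, e.g. to the arm controller's next command time.
        return self.x + self.v * horizon + 0.5 * self.a * horizon ** 2
```

At a 5 Hz measurement rate one would run one axis of this filter per coordinate with `dt = 0.2` and use `predict()` to bridge the latency between vision updates and arm commands.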
Weighted feature selection criteria for visual servoing of a telerobot
Because of the continually changing environment of a space station, visual feedback is a vital element of a telerobotic system. A real-time visual servoing system would allow a telerobot to track and manipulate randomly moving objects. Methodologies for the automatic selection of image features to be used to visually control the relative position between an eye-in-hand telerobot and a known object are devised. A weighted criteria function with both image recognition and control components is used to select the combination of image features which provides the best control. Simulation and experimental results of a PUMA robot arm visually tracking a randomly moving carburetor gasket with a visual update time of 70 milliseconds are discussed.
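The weighted criteria function described above can be sketched as an exhaustive search over candidate feature combinations, scoring each set on both how reliably its features can be recognized and how well they serve control, then picking the maximizer. The feature names, scores, and weights below are illustrative placeholders, not the paper's actual criteria.

```python
from itertools import combinations


def select_features(features, recognition, control, w_r=0.5, w_c=0.5, k=3):
    """Return the k-feature combination maximizing the weighted criterion.

    recognition/control map each feature name to a score in [0, 1];
    w_r and w_c weight the recognition and control components.
    """
    best_set, best_score = None, float("-inf")
    for subset in combinations(features, k):
        score = (w_r * sum(recognition[f] for f in subset)
                 + w_c * sum(control[f] for f in subset))
        if score > best_score:
            best_set, best_score = subset, score
    return best_set, best_score
```

Exhaustive enumeration is feasible here because the candidate feature pool for a known object is small; a greedy selection would be the natural fallback for larger pools.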
Visual Registration and Navigation using Planar Features
This paper addresses the problem of registering the hexapedal robot RHex, relative to a known set of beacons, by real-time visual servoing. A suitably constructed navigation function represents the task, in the sense that for a completely actuated machine in the horizontal plane, the gradient dynamics guarantee convergence to the visually cued goal without ever losing sight of the beacons that define it. Since the horizontal-plane behavior of RHex can be represented as a unicycle, feeding back the navigation function gradient avoids loss of beacons, but does not yield an asymptotically stable goal. We address new problems arising from the configuration of the beacons and present preliminary experimental results that illustrate the discrepancies between the idealized and physical robot actuation capabilities.
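The unicycle issue the abstract raises comes from the nonholonomic constraint: the robot can only drive along its heading, so the gradient must be projected onto that heading and a separate steering term must turn the body. The sketch below uses a plain quadratic potential and hand-picked gains as stand-ins for the paper's navigation function and controller; it only illustrates the projection idea.

```python
import math


def grad_phi(x, y, goal=(0.0, 0.0)):
    """Gradient of a simple quadratic potential phi = |p - goal|^2 / 2
    (a placeholder for a true navigation function with obstacle terms)."""
    return x - goal[0], y - goal[1]


def unicycle_step(x, y, theta, dt=0.05, k_v=1.0, k_w=2.0):
    """One Euler step of gradient feedback through unicycle kinematics."""
    gx, gy = grad_phi(x, y)
    # Forward speed: negative gradient component along the current heading.
    v = -k_v * (gx * math.cos(theta) + gy * math.sin(theta))
    # Steering: turn toward the negative gradient direction (angle wrapped).
    desired = math.atan2(-gy, -gx)
    err = math.atan2(math.sin(desired - theta), math.cos(desired - theta))
    w = k_w * err
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + w * dt)
```

Note that the potential is nonincreasing along these dynamics (its time derivative is proportional to minus the squared projected gradient), which is the sense in which position converges even though the heading at the goal is not asymptotically stabilized.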
Real-time Stereo Visual Servoing for Rose Pruning with Robotic Arm
The paper presents a working pipeline which integrates hardware and software in an automated robotic rose cutter. To the best of our knowledge, this is the first robot able to prune rose bushes in a natural environment. Unlike similar approaches such as tree stem cutting, the proposed method does not require scanning the full plant, placing multiple cameras around the bush, or assuming that a stem does not move. It relies on a single stereo camera mounted on the end-effector of the robot and real-time visual servoing to navigate to the desired cutting location on the stem. The evaluation of the whole pipeline shows good performance in a garden with unconstrained conditions, where finding and approaching a specific location on a stem is challenging due to occlusions caused by other stems and dynamic changes caused by the wind.
Design and Implementation of Vision System, Behavior Strategy and Control Circuit for Middle Size Soccer Robot (I)
Project number: NSC92-2213-E032-023. Research period: 2003/08–2004/07. Research budget: 618,000. Sponsor: National Science Council, Executive Yuan
Hand-eye coordination for grasping moving objects
Most robotic grasping tasks assume a stationary or fixed object. In this paper, we explore the requirements for grasping a moving object. This task requires proper coordination between at least three separate subsystems: dynamic vision sensing, real-time arm control, and grasp control. As with humans, our system first visually tracks the object's 3-D position. Because the object is in motion, this must be done in a dynamic manner to coordinate the motion of the robotic arm as it tracks the object. The dynamic vision system is used to feed a real-time arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: 1) filtering and prediction, and 2) kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. We present three different strategies for intercepting the object and results from the tracking algorithm.
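The "filtering and prediction" step above can be reduced to its core idea: estimate the object's velocity from recent tracked fixes and extrapolate forward over the arm's reaction latency, so the gripper is commanded to where the object will be rather than where it was. The finite-difference estimator and the latency value below are illustrative assumptions, not the paper's actual interception strategies.

```python
def estimate_velocity(positions, dt):
    """Finite-difference velocity from the last two 3-D position fixes."""
    (x0, y0, z0), (x1, y1, z1) = positions[-2], positions[-1]
    return ((x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt)


def intercept_point(positions, dt, latency):
    """Extrapolate the newest fix forward by the arm's latency to get
    a target the gripper can reach in time."""
    vx, vy, vz = estimate_velocity(positions, dt)
    x, y, z = positions[-1]
    return (x + vx * latency, y + vy * latency, z + vz * latency)
```

In practice the raw finite difference would be replaced by the output of a smoothing filter (such as the α-β-γ filter used in related tracking work), since differencing amplifies measurement noise.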