A Novel Uncalibrated Visual Servoing Controller Based on Model-Free Adaptive Control Method with Neural Network
Nowadays, as the application scenarios of robotic arms continue to expand,
nonspecialists increasingly come into contact with robotic arms. In terms of
robotic arm visual servoing, however, traditional Position-Based Visual
Servoing (PBVS) requires extensive calibration work, which is difficult for
nonspecialists to cope with. Uncalibrated Image-Based Visual Servoing (UIBVS)
frees people from this tedious calibration work. This work applies a
model-free adaptive control (MFAC) method in which the controller parameters
are updated in real time, better suppressing changes in the system and
environment. An artificial neural network is applied in the design of both
the controller and the estimator of the hand-eye relationship; the network is
updated from the system's input and output information in the MFAC method.
Inspired by the "predictive model" and "receding horizon" of Model Predictive
Control (MPC) and introducing similar structures into our algorithm, we
realize uncalibrated visual servoing for both stationary targets and moving
trajectories. Simulated experiments with a robotic manipulator are carried
out to validate the proposed algorithm.
Comment: 16 pages, 8 figures
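The core of any uncalibrated scheme is an online estimate of the hand-eye (image) Jacobian. The abstract's MFAC/neural-network estimator is not reproduced here; as a minimal stand-in, a classical Broyden rank-one secant update illustrates the same idea of learning the hand-eye relationship purely from input/output data (the simulated linear "camera" `J_true` and all names are illustrative assumptions, not the paper's method):

```python
import numpy as np

def broyden_update(J, dq, de, eps=1e-9):
    """Rank-one secant update of the estimated image Jacobian.
    J: current estimate; dq: last joint increment; de: observed change in
    the image-feature error caused by that increment."""
    return J + np.outer(de - J @ dq, dq) / (float(dq @ dq) + eps)

# Simulated uncalibrated servo loop: the "camera" is a linear map J_true
# that the controller never sees; the estimate starts from an identity guess.
J_true = np.array([[1.0, 0.2], [0.1, 0.8]])
J_est = np.eye(2)
q = np.zeros(2)
target = np.array([1.0, -0.5])
err = J_true @ q - target
for _ in range(50):
    dq = -0.5 * np.linalg.pinv(J_est) @ err      # damped pseudo-inverse control
    new_err = J_true @ (q + dq) - target
    J_est = broyden_update(J_est, dq, new_err - err)
    q, err = q + dq, new_err
print(np.linalg.norm(err))  # error shrinks toward zero without any calibration
```

The secant update only corrects the Jacobian along directions the robot actually moved, which is exactly why such schemes need no prior calibration of camera or kinematics.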
Robot eye-hand coordination learning by watching human demonstrations: a task function approximation approach
We present a robot eye-hand coordination learning method that can directly
learn a visual task specification by watching human demonstrations. The task
specification is represented as a task function, which is learned using
inverse reinforcement learning (IRL) by inferring differential rewards
between state changes. The learned task function is then used as continuous
feedback in an uncalibrated visual servoing (UVS) controller designed for the
execution phase. Our proposed method can learn directly from raw videos,
which removes the need for hand-engineered task specification. It also
provides task interpretability by directly approximating the task function.
Moreover, benefiting from the use of a traditional UVS controller, our
training process is efficient and the learned policy is independent of any
particular robot platform. Various experiments were designed to show that,
for a task with a given number of degrees of freedom, our method can adapt to
task and environment variations in target positions, backgrounds,
illumination, and occlusions without retraining.
Comment: Accepted in ICRA 201
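The key structural idea is that a task function, zero exactly at task completion, can serve as the error signal of a servo loop. The IRL-learned function itself is not available here, so the sketch below substitutes a hand-crafted linear stand-in (the `goal` values and all names are assumptions) purely to show how such a function plugs into an execution-phase feedback loop:

```python
import numpy as np

def task_function(features, goal=np.array([0.4, -0.3])):
    """Stand-in for the learned task function: zero exactly when the task is
    done. (The paper learns this from raw video via IRL; the linear form and
    the goal here are purely illustrative.)"""
    return features - goal

def numerical_task_jacobian(f, feat, h=1e-6):
    """Finite-difference Jacobian of the task function w.r.t. image features,
    giving the feedback direction for the servo loop."""
    e0 = f(feat)
    J = np.zeros((e0.size, feat.size))
    for i in range(feat.size):
        d = np.zeros_like(feat)
        d[i] = h
        J[:, i] = (f(feat + d) - e0) / h
    return J

# Execution phase: drive the task function to zero, using its value as
# continuous feedback (features respond directly to the command for brevity).
feat = np.array([1.0, 1.0])
for _ in range(60):
    e = task_function(feat)
    J = numerical_task_jacobian(task_function, feat)
    feat = feat - 0.3 * np.linalg.pinv(J) @ e
print(task_function(feat))  # ≈ [0, 0]: task completed
```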
Image based visual servoing using bitangent points applied to planar shape alignment
We present visual servoing strategies based on bitangents for aligning planar shapes. To acquire the bitangents, we use the convex hull of a curve. Bitangent points are employed in the construction of a feature vector used in visual control. Experimental results obtained on a 7-DOF Mitsubishi PA10 robot verify the proposed method.
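The abstract does not detail the feature construction; the sketch below shows only the geometric core, extracting bitangent segments of a concave planar contour from its convex hull. A hull edge whose endpoints are non-adjacent on the closed curve touches the curve at both ends but bridges a concavity, i.e. it is a bitangent. The monotone-chain hull and the L-shaped example contour are my own choices, not taken from the paper:

```python
def convex_hull(points):
    """Andrew's monotone chain; returns indices of hull vertices in CCW order."""
    idx = sorted(range(len(points)), key=lambda i: points[i])
    def cross(o, a, b):
        (ox, oy), (ax, ay), (bx, by) = points[o], points[a], points[b]
        return (ax - ox) * (by - oy) - (ay - oy) * (bx - ox)
    lower, upper = [], []
    for i in idx:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], i) <= 0:
            lower.pop()
        lower.append(i)
    for i in reversed(idx):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], i) <= 0:
            upper.pop()
        upper.append(i)
    return lower[:-1] + upper[:-1]

def bitangent_segments(points):
    """Hull edges whose endpoints are non-adjacent on the closed curve:
    each such edge bridges a concavity, i.e. it is a bitangent of the shape."""
    n, segs = len(points), []
    hull = convex_hull(points)
    for a, b in zip(hull, hull[1:] + hull[:1]):
        if (b - a) % n not in (1, n - 1):
            segs.append((a, b))
    return segs

# Concave L-shaped contour: exactly one concavity, hence one bitangent,
# joining vertices 2 and 4 across the notch.
shape = [(0.0, 0.0), (2.0, 0.0), (2.0, 1.0), (1.0, 1.0), (1.0, 2.0), (0.0, 2.0)]
print(bitangent_segments(shape))  # [(2, 4)]
```

For a convex contour the list is empty, which matches the intuition that bitangents only exist where the shape has concavities.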
Sim2Real View Invariant Visual Servoing by Recurrent Control
Humans are remarkably proficient at controlling their limbs and tools from a
wide range of viewpoints and angles, even in the presence of optical
distortions. In robotics, this ability is referred to as visual servoing:
moving a tool or end-point to a desired location using primarily visual
feedback. In this paper, we study how viewpoint-invariant visual servoing
skills can be learned automatically in a robotic manipulation scenario. To this
end, we train a deep recurrent controller that can automatically determine
which actions move the end-point of a robotic arm to a desired object. The
problem that must be solved by this controller is fundamentally ambiguous:
under severe variation in viewpoint, it may be impossible to determine the
actions in a single feedforward operation. Instead, our visual servoing system
must use its memory of past movements to understand how the actions affect the
robot motion from the current viewpoint, correcting mistakes and gradually
moving closer to the target. This ability is in stark contrast to most visual
servoing methods, which either assume known dynamics or require a calibration
phase. We show how we can learn this recurrent controller using simulated data
and a reinforcement learning objective. We then describe how the resulting
model can be transferred to a real-world robot by disentangling perception from
control and only adapting the visual layers. The adapted model can servo to
previously unseen objects from novel viewpoints on a real-world Kuka IIWA
robotic arm. For supplementary videos, see:
https://fsadeghi.github.io/Sim2RealViewInvariantServo
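The paper's deep recurrent controller is far beyond a snippet, but the principle it states, using memory of how past actions moved the end-point to resolve the viewpoint ambiguity, can be caricatured with a least-squares "memory" in place of the learned recurrence. The unknown rotation standing in for viewpoint variation, and every name below, are assumptions made purely for illustration:

```python
import numpy as np

def observed_motion(action, theta=1.1):
    """Stand-in for viewpoint variation: the robot action is seen in the
    image through an unknown rotation the controller is never told about."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]]) @ action

def servo_with_memory(err0, steps=30, gain=0.6):
    """Probe, remember how each action moved the end-point, fit the
    action-to-motion map from that memory, then servo with it. A deliberately
    tiny stand-in for the paper's learned recurrent policy."""
    past_actions = [np.array([1e-2, 0.0]), np.array([0.0, 1e-2])]
    past_motions = [observed_motion(a) for a in past_actions]
    M = np.linalg.lstsq(np.array(past_actions), np.array(past_motions),
                        rcond=None)[0].T          # fitted action -> motion map
    err = err0.astype(float).copy()
    for _ in range(steps):
        action = -gain * np.linalg.solve(M, err)  # correct using the model
        err = err + observed_motion(action)
    return float(np.linalg.norm(err))

print(servo_with_memory(np.array([0.3, -0.2])))  # -> essentially zero
```

A single feedforward policy cannot resolve the rotation from one observation; here, as in the paper's framing, it is the remembered action-effect pairs that make the problem well-posed.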
Uncalibrated Dynamic Mechanical System Controller
An apparatus and method are provided for enabling an uncalibrated, model-independent controller for a mechanical system using a dynamic quasi-Newton algorithm which incorporates velocity components of any moving system parameter(s). In the preferred embodiment, tracking of a moving target by a robot having multiple degrees of freedom is achieved using uncalibrated, model-independent visual servo control, defined as using visual feedback to control a robot's servomotors without a precisely calibrated kinematic robot model or camera model. A processor updates a Jacobian, and a controller provides control signals such that the robot's end effector is directed to a desired location relative to a target on a workpiece.
Georgia Tech Research Corporation
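The distinctive ingredient is the velocity term in the quasi-Newton update: the error change caused by target motion is subtracted before the remainder is attributed to the Jacobian. A minimal sketch of that dynamic Broyden-style update follows; the simulated linear system, the known target velocity, and all names are assumptions for illustration, not the patented implementation:

```python
import numpy as np

def dynamic_broyden(J, dq, de, de_dt, dt, eps=1e-9):
    """Dynamic quasi-Newton Jacobian update: the term de_dt*dt removes the
    error change caused by target motion, so only the robot-induced change
    is attributed to the hand-eye Jacobian."""
    resid = de - J @ dq - de_dt * dt
    return J + np.outer(resid, dq) / (float(dq @ dq) + eps)

# Track a target drifting at constant velocity v with an uncalibrated Jacobian.
J_true = np.array([[1.0, 0.3], [-0.2, 0.9]])
v, dt = np.array([0.01, -0.005]), 1.0
J_est, q = np.eye(2), np.zeros(2)
target = np.array([0.5, 0.5])
err = J_true @ q - target
for _ in range(80):
    de_dt = -v                                    # target motion shifts the error
    dq = -np.linalg.pinv(J_est) @ (0.5 * err + de_dt * dt)  # feedback + feedforward
    target = target + v * dt
    new_err = J_true @ (q + dq) - target
    J_est = dynamic_broyden(J_est, dq, new_err - err, de_dt, dt)
    q, err = q + dq, new_err
print(np.linalg.norm(err))  # small residual tracking error
```

Without the `de_dt * dt` correction, a plain secant update would blame target motion on the Jacobian and the estimate would drift while tracking.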
Positioning and trajectory following tasks in microsystems using model free visual servoing
In this paper, we explore model-free visual servoing algorithms by
experimentally evaluating their performance on various tasks carried out
on a microassembly workstation developed in our lab. Model-free, or
so-called uncalibrated, visual servoing needs neither the system
calibration (microscope-camera-micromanipulator) nor a model of the
observed scene, and it is robust to parameter changes and disturbances.
We tested its performance in point-to-point positioning and in various
trajectory following tasks. Experimental results validate the utility of
model-free visual servoing in microassembly tasks.
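Neither the workstation nor the microscope imaging is modeled here; a toy linear "camera" `J_true` and a circular reference path (both assumptions of mine) sketch what trajectory following with an online-estimated secant Jacobian looks like, in contrast to the fixed-goal positioning case:

```python
import numpy as np

def follow_trajectory(J_true, steps=200, gain=0.8):
    """Model-free trajectory following: the hand-eye Jacobian is estimated
    online with a Broyden secant update; no calibration of the (simulated)
    microscope-camera-manipulator chain is ever used."""
    J_est, q, errs = np.eye(2), np.zeros(2), []
    for k in range(steps):
        t = 2 * np.pi * k / steps
        ref = np.array([np.cos(t), np.sin(t)])   # circular reference path
        feat = J_true @ q                        # stand-in for measured features
        e = feat - ref
        errs.append(float(np.linalg.norm(e)))
        dq = -gain * np.linalg.pinv(J_est) @ e
        de = J_true @ (q + dq) - feat
        J_est += np.outer(de - J_est @ dq, dq) / (float(dq @ dq) + 1e-9)
        q = q + dq
    return errs

errs = follow_trajectory(np.array([[0.9, 0.2], [-0.1, 1.1]]))
print(errs[0], errs[-1])  # tracking error drops and stays small
```

The residual error is a lag proportional to how far the reference moves per control step, which is why such schemes behave well on the slow trajectories typical of microassembly.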