308 research outputs found
Image Based Visual Servoing Using Trajectory Planning and Augmented Visual Servoing Controller
Robots and automated manufacturing machinery have become an inseparable part of modern industry. However, robotic systems are generally limited to operating in highly structured environments. Although sensors such as laser trackers, indoor GPS, and 3D metrology and tracking systems are used for positioning and tracking in manufacturing and assembly tasks, these devices are tightly constrained by the working environment and the speed of operation, and they are generally very expensive. Integrating vision sensors with robotic systems, and visual servoing more generally, allows robots to work in unstructured spaces by producing non-contact measurements of the working area. However, projecting the 3D world onto the camera's 2D image plane causes the loss of one dimension of data. This gives rise to the core challenges of vision-based control. Moreover, the nonlinearities and complex structure of a manipulator robot make the problem more challenging. This project aims to develop new, reliable visual servoing methods suitable for real robotic tasks.
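The depth loss mentioned above can be seen directly in the perspective projection equations: a 3D point (X, Y, Z) maps to normalized image coordinates (X/Z, Y/Z), so every point on a ray through the optical center lands on the same image point. A minimal sketch (the function name is illustrative, not from the thesis):

```python
def project(X, Y, Z):
    """Perspective projection to normalized image coordinates.
    The depth Z is folded into the division, so it cannot be
    recovered from the 2D point (x, y) alone."""
    return X / Z, Y / Z

# Two distinct 3D points on the same viewing ray give the same image point:
project(1.0, 1.0, 2.0)   # (0.5, 0.5)
project(2.0, 2.0, 4.0)   # (0.5, 0.5)
```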
The main contributions of this project are in two parts: the visual servoing controller and the trajectory planning algorithm. In the first part, a new image-based visual servoing controller called Augmented Image-Based Visual Servoing (AIBVS) is presented. A proportional-derivative (PD) controller is developed that generates acceleration as the controlling command of the robot. The stability of the controller is analyzed using Lyapunov theory. The developed controller has been tested on a 6-DOF Denso robot. Experimental results on point features and image moment features demonstrate the performance of the proposed AIBVS: a damped response can be achieved using a PD controller with acceleration output, and smoother feature and robot trajectories are observed compared to those of conventional IBVS controllers. The controller is then applied to catching a moving object.
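The key idea, a PD law whose output is a camera acceleration screw rather than the velocity screw of classical IBVS, can be sketched as follows. The interaction matrix for point features is the standard one; the gains, the constant-depth assumption, and the function names are illustrative assumptions, not the thesis's implementation:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Standard interaction (image Jacobian) matrix for one point feature
    (x, y) at depth Z, relating feature velocity to the camera velocity screw."""
    return np.array([
        [-1/Z,    0, x/Z,      x*y, -(1 + x**2),  y],
        [   0, -1/Z, y/Z, 1 + y**2,        -x*y, -x],
    ])

def aibvs_command(features, desired, d_error, Z, Kp=1.0, Kd=0.5):
    """Sketch of a PD law in the spirit of AIBVS: a camera acceleration
    screw computed from the image-space error and its derivative."""
    L = np.vstack([interaction_matrix(x, y, Z) for x, y in features])
    e = (np.asarray(features) - np.asarray(desired)).ravel()
    # PD in image space, mapped to camera space via the pseudoinverse of L
    return -np.linalg.pinv(L) @ (Kp * e + Kd * d_error)
```

With three point features the stacked matrix has six rows, so the pseudoinverse resolves all six degrees of freedom of the camera; at the desired configuration the command vanishes.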
Visual servoing controllers have shown difficulty in stabilizing the system in global space. Hence, in the second part of the project, a trajectory planning algorithm is developed to achieve global stability. The trajectory is planned by parameterizing the camera's velocity screw with time-based profiles, and the parameters of the velocity profile are determined such that it guides the robot to its desired position. This is done by minimizing the error between the initial and desired features. The method provides a reliable path for the robot that respects all robotic constraints. The developed algorithm is tested on a Denso robot. The results show that the trajectory planning algorithm can perform visual servoing tasks that are unstable when attempted with visual servoing controllers alone.
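One way to realize the time-based parameterization described above is to give each component of the velocity screw a simple profile with a free parameter, then solve for the parameters from the required camera displacement. The parabolic profile below (zero velocity at both endpoints, for a smooth start and stop) is my assumption for illustration; the thesis may use a different family of profiles:

```python
import numpy as np

def plan_profile(displacement, T=2.0, n=100):
    """Sketch: parameterize each velocity-screw component as
    v_i(t) = a_i * t * (T - t), and pick a_i so that the integral of
    v_i over [0, T] equals the required displacement in that component."""
    # Since the integral of t*(T - t) over [0, T] is T**3 / 6:
    a = 6.0 * np.asarray(displacement, dtype=float) / T**3
    t = np.linspace(0.0, T, n)
    return t, np.outer(t * (T - t), a)   # (n, 6) velocity screw samples
```

The profile starts and ends at zero velocity, and numerically integrating it recovers the commanded displacement.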
Docking Haptics: Extending the Reach of Haptics by Dynamic Combinations of Grounded and Worn Devices
Grounded haptic devices can provide a variety of forces but have limited
working volumes. Wearable haptic devices operate over a large volume but are
relatively restricted in the types of stimuli they can generate. We propose the
concept of docking haptics, in which different types of haptic devices are
dynamically docked at run time. This creates a hybrid system, where the
potential feedback depends on the user's location. We show a prototype docking
haptic workspace, combining a grounded six degree-of-freedom force feedback arm
with a hand exoskeleton. We are able to create the sensation of weight on the
hand when it is within reach of the grounded device, but away from the grounded
device, hand-referenced force feedback is still available. A user study
demonstrates that users can successfully discriminate weight when using docking
haptics, but not with the exoskeleton alone. Such hybrid systems would be able
to change configuration further, for example docking two grounded devices to a
hand in order to deliver twice the force, or extend the working volume. We
suggest that the docking haptics concept can thus extend the practical utility
of haptics in user interfaces.
Autonomous Robots for Active Removal of Orbital Debris
This paper presents a vision guidance and control method for autonomous
robotic capture and stabilization of orbital objects in a time-critical manner.
The method takes into account various operational and physical constraints,
including ensuring a smooth capture, handling line-of-sight (LOS) obstructions
of the target, and staying within the acceleration, force, and torque limits of
the robot. Our approach involves the development of an optimal control
framework for an eye-to-hand visual servoing method, which integrates two
sequential sub-maneuvers: a pre-capturing maneuver and a post-capturing
maneuver, aimed at achieving the shortest possible capture time. Integrating
both control strategies enables a seamless transition between them, allowing
for real-time switching to the appropriate control system. Moreover, both
controllers are adaptively tuned through vision feedback to account for the
unknown dynamics of the target. The integrated estimation and control
architecture also facilitates fault detection and recovery of the visual
feedback in situations where the feedback is temporarily obstructed. The
experimental results demonstrate the successful execution of pre- and
post-capturing operations on a tumbling and drifting target, despite multiple
operational constraints.
Guidance and Control of a Planar Robot Manipulator Used in an Assembly Line
To achieve higher productivity and lower cost, robot manipulators have been employed in assembly processes over recent decades, alongside other application areas such as transportation, welding, mounting, and quality control. As a new application in this field, this study deals with controlling the synchronized motion of a planar robot manipulator and a moving belt. The synchronization is maintained according to a guidance law that leads the robot manipulator to place selected components into specific slots on the moving belt without interrupting the assembly process. In this scheme, the manipulator is controlled using a PI (proportional plus integral) control law. Computer simulations based on the engagement geometry between the robot manipulator and the moving belt verify that the pick-and-place task can be accomplished successfully under different operating conditions.
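The PI law mentioned above can be sketched in a few lines. The gains, the 1-DOF velocity-controlled plant, and the belt speed in the toy simulation are my assumptions for illustration, not the paper's setup:

```python
class PIController:
    """Discrete PI law: u = Kp*e + Ki * (running integral of e)."""
    def __init__(self, Kp, Ki, dt):
        self.Kp, self.Ki, self.dt = Kp, Ki, dt
        self.integral = 0.0

    def update(self, error):
        self.integral += error * self.dt
        return self.Kp * error + self.Ki * self.integral


def simulate_tracking(Kp=4.0, Ki=4.0, dt=0.01, steps=1000, v_belt=0.1):
    """Toy 1-DOF example (assumed plant): a velocity-controlled end
    effector tracks a slot moving along the belt at constant speed."""
    pi = PIController(Kp, Ki, dt)
    x = 0.0
    for k in range(steps):
        slot = v_belt * k * dt           # current slot position on the belt
        x += pi.update(slot - x) * dt    # integrate the velocity command
    return slot - x                      # residual tracking error
```

The integral term removes the steady-state lag that a purely proportional law would leave when following the constant-speed belt, which is why a PI structure suits this synchronization task.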
Control de un convoy robótico mediante planificación de rutas y estrategias de orientación (Control of a Robotic Convoy Through Path Planning and Orientation Strategies)
This paper presents an overview of control methods implemented in robotic convoy systems, or cooperative systems on mobile platforms, covering path planning, orientation, environment perception, route tracking, and control systems that involve the measurement, analysis, and interpretation of different variables for later implementation. A review was made of research articles in bibliographic indexes and databases on control methods used in convoy systems, in order to show advances, trends, and application methods.
Computational intelligence approaches to robotics, automation, and control [Volume guest editors]
No abstract available