
    Autonomous Robots for Active Removal of Orbital Debris

    This paper presents a vision guidance and control method for autonomous robotic capture and stabilization of orbital objects in a time-critical manner. The method takes into account various operational and physical constraints, including ensuring a smooth capture, handling line-of-sight (LOS) obstructions of the target, and staying within the acceleration, force, and torque limits of the robot. Our approach develops an optimal control framework for an eye-to-hand visual servoing method that integrates two sequential sub-maneuvers, a pre-capture maneuver and a post-capture maneuver, aimed at achieving the shortest possible capture time. Integrating both control strategies enables a seamless transition between them, allowing real-time switching to the appropriate control system. Moreover, both controllers are adaptively tuned through vision feedback to account for the unknown dynamics of the target. The integrated estimation and control architecture also facilitates fault detection and recovery of the visual feedback when the feedback is temporarily obstructed. The experimental results demonstrate the successful execution of pre- and post-capture operations on a tumbling and drifting target, despite multiple operational constraints.
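The mode-switching logic described in this abstract, pre-capture servoing, hand-off to a post-capture stabilizer at contact, and a hold-and-recover mode when the visual feedback is obstructed, might look like the following minimal sketch (the state names and switching conditions are assumptions for illustration, not the paper's implementation):

```python
def select_controller(mode, vision_ok, contact_made):
    """Return the active control mode for the capture sequence.

    Pre-capture visual servoing runs until contact; the post-capture
    stabilizer takes over afterwards. If the visual feedback is lost
    before contact, hold the last estimate until the target reappears.
    """
    if mode == "post_capture":
        return "post_capture"        # never switch back after capture
    if not vision_ok:
        return "hold_and_recover"    # fault detected: coast on the estimate
    if contact_made:
        return "post_capture"        # seamless hand-off at capture
    return "pre_capture"
```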

    Image Based Visual Servoing Using Trajectory Planning and Augmented Visual Servoing Controller

    Robots and automated manufacturing machinery have become an inseparable part of modern industry. However, robotic systems are generally limited to operating in highly structured environments. Although sensors such as laser trackers, indoor GPS, and 3D metrology and tracking systems are used for positioning and tracking in manufacturing and assembly tasks, these devices are tightly constrained by the working environment and the speed of operation, and they are generally very expensive. Integrating vision sensors with robotic systems, that is, visual servoing, therefore allows robots to work in unstructured spaces by producing non-contact measurements of the working area. However, projecting the 3D space onto a 2D image plane, which happens in the camera, loses one dimension of data. This is the source of the main challenges in vision-based control. Moreover, the nonlinearities and complex structure of a manipulator robot make the problem more challenging. This project aims to develop new, reliable visual servoing methods suitable for real robotic tasks. The main contributions of this project are in two parts: the visual servoing controller and the trajectory planning algorithm. In the first part, a new image-based visual servoing controller called Augmented Image Based Visual Servoing (AIBVS) is presented. A proportional-derivative (PD) controller is developed to generate acceleration as the controlling command of the robot. The stability analysis of the controller is conducted using Lyapunov theory. The developed controller has been tested on a 6-DOF Denso robot. The experimental results on point features and image moment features demonstrate the performance of the proposed AIBVS. They show that a damped response can be achieved using a PD controller with acceleration output, and that smoother feature and robot trajectories are obtained compared to those of conventional IBVS controllers.
    This controller is later applied to a moving-object catching task. Visual servoing controllers have shown difficulty in stabilizing the system in global space; hence, in the second part of the project, a trajectory planning algorithm is developed to achieve the global stability of the system. The trajectory planning is carried out by parameterizing the camera's velocity screw using time-based profiles. The parameters of the velocity profile are then determined such that the profile guides the robot to its desired position, by minimizing the error between the initial and desired features. This method provides a reliable path for the robot that respects all robotic constraints. The developed algorithm is tested on a Denso robot. The results show that the trajectory planning algorithm is able to perform visual servoing tasks that are unstable when performed with visual servoing controllers.
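The acceleration-output PD law at the heart of AIBVS can be sketched in a few lines (an illustrative sketch, not the authors' implementation; the gains and the feature-error interface are assumed):

```python
def aibvs_pd_command(error, error_prev, dt, kp=1.0, kd=0.5):
    """PD law with acceleration output: a = -Kp*e - Kd*de/dt.

    error, error_prev: current and previous image-feature error vectors
    (measured minus desired feature values), as plain lists.
    Returns the commanded acceleration, one component per feature error.
    The derivative term is what yields the damped response reported above.
    """
    return [-kp * e - kd * ((e - ep) / dt)
            for e, ep in zip(error, error_prev)]
```

A shrinking error (here 2.2 to 2.0 over one step) produces a command that keeps driving the feature toward its goal while the derivative term damps the approach.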

    Control of a robotic convoy using path planning and orientation strategies

    This paper presents an overview of control methods implemented in robotic convoy systems and cooperative systems on mobile platforms, which can be used for path planning, orientation, environment perception, route tracking, and control schemes involving the measurement, analysis, and interpretation of different variables for subsequent implementation. A review of research articles in bibliographic indexes and databases on control methods used in convoy systems was carried out to highlight progress, trends, and application methods.

    UAS stealth: target pursuit at constant distance using a bio-inspired motion camouflage guidance law

    The aim of this study is to derive a guidance law by which an Unmanned Aerial System (UAS) can pursue a moving target at a constant distance while concealing its own motion. We derive a closed-form solution for the trajectory of the UAS by imposing two key constraints: (1) the shadower moves in such a way as to be perceived as a stationary object by the shadowee, and (2) the distance between the shadower and the shadowee is kept constant. Additionally, the theory presented in this paper considers constraints on the maximum achievable speed and acceleration of the shadower. Our theory is tested through Matlab simulations, which validate the camouflage strategy in both 2D and 3D conditions. Furthermore, experiments using a realistic vision-based implementation are conducted in a virtual environment, where the results demonstrate that even with noisy state information it is possible to remain well camouflaged using the Constant Distance Motion Camouflage (CDMC) technique.
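The two constraints combine into a simple geometric condition: if the relative position vector from shadowee to shadower stays fixed, the shadowee perceives neither bearing change nor range change. A feedback form of that idea, with the paper's speed limit applied, can be sketched as follows (the paper derives a closed-form trajectory; this proportional-correction law and its parameter names are assumptions for illustration):

```python
def cdmc_velocity(shadower_pos, target_pos, target_vel,
                  desired_offset, k=1.0, v_max=5.0):
    """Velocity command keeping the shadower at a fixed offset from the target.

    A constant relative position vector means the target sees no optic
    flow from the shadower (apparent stillness) at constant distance.
    """
    # Error between the desired fixed offset and the current offset.
    offset = [s - t for s, t in zip(shadower_pos, target_pos)]
    correction = [k * (d - o) for d, o in zip(desired_offset, offset)]
    # Track the target's velocity plus a proportional correction.
    cmd = [tv + c for tv, c in zip(target_vel, correction)]
    # Respect the maximum achievable speed constraint.
    speed = sum(c * c for c in cmd) ** 0.5
    if speed > v_max:
        cmd = [c * v_max / speed for c in cmd]
    return cmd
```

When the offset is already correct, the command reduces to matching the target's velocity, which is exactly the condition for a constant relative position.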

    Search Methods for Mobile Manipulator Performance Measurement

    Mobile manipulators are a potential solution to the increasing need for additional flexibility and mobility in industrial robotics applications. However, they tend to lack the accuracy and precision achieved by fixed manipulators, especially in scenarios where both the manipulator and the autonomous vehicle move simultaneously. This thesis analyzes the problem of dynamically evaluating the positioning error of mobile manipulators. In particular, it investigates the use of Bayesian methods to predict the position of the end-effector in the presence of uncertainty propagated from the mobile platform. Simulations and real-world experiments are carried out to test the proposed method against a deterministic approach. These experiments are performed on two mobile manipulators (a proof-of-concept research platform and an industrial mobile manipulator) using ROS and Gazebo. The precision of the mobile manipulator is evaluated through its ability to intercept retroreflective markers using a photoelectric sensor attached to the end-effector. Compared to the deterministic search approach, we observed improved interception capability with comparable search times, thereby enabling effective performance measurement of the mobile manipulator.
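The core Bayesian step, fusing a prior on the end-effector position (with variance propagated from the platform's pose uncertainty) with a sensor observation, can be sketched as a one-dimensional Gaussian update (illustrative only; the variable names and noise values are assumed, not taken from the thesis):

```python
def bayes_update(prior_mean, prior_var, meas, meas_var):
    """Fuse a Gaussian prior with a Gaussian measurement (1-D Kalman update).

    The gain weights the measurement by the relative confidence of the
    prior; the posterior variance is always smaller than the prior's,
    which is what narrows the search region around each marker.
    """
    gain = prior_var / (prior_var + meas_var)
    post_mean = prior_mean + gain * (meas - prior_mean)
    post_var = (1.0 - gain) * prior_var
    return post_mean, post_var
```

With equal prior and measurement variance, the posterior mean lands halfway between the two and the uncertainty halves, so each observation tightens the predicted marker position.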

    Vision systems for autonomous aircraft guidance
