
    Generation of dynamic motion for anthropomorphic systems under prioritized equality and inequality constraints

    In this paper, we propose a solution to compute full-dynamic motions for a humanoid robot, accounting for various kinds of constraints such as dynamic balance or joint limits. As a first step, we propose a unification of task-based control schemes, in inverse kinematics or inverse dynamics. Based on this unification, we generalize the cascade of quadratic programs that was previously developed for inverse kinematics only. We then apply the solution to generate, in simulation, whole-body motions for a humanoid robot in unilateral contact with the ground, while ensuring dynamic balance on a non-horizontal surface.
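    The equality-task layer underlying such a prioritized cascade can be sketched as a damped nullspace recursion: each task is solved only in the nullspace of the higher-priority ones. This is a simplified stand-in for the paper's cascade of quadratic programs (all names and the damping value are illustrative, and inequality constraints — the part the QP cascade actually adds — are omitted here):

```python
import numpy as np

def prioritized_ik(jacobians, errors, damping=1e-6):
    """Solve a stack of equality tasks J_i * dq = e_i in strict
    priority order: each task is resolved only in the nullspace of
    all higher-priority tasks."""
    n = jacobians[0].shape[1]
    dq = np.zeros(n)
    P = np.eye(n)                       # nullspace projector so far
    for J, e in zip(jacobians, errors):
        JP = J @ P
        # damped pseudo-inverse for numerical robustness near singularities
        JP_pinv = JP.T @ np.linalg.inv(JP @ JP.T + damping * np.eye(J.shape[0]))
        dq = dq + JP_pinv @ (e - J @ dq)  # correct without disturbing higher tasks
        P = P - JP_pinv @ JP             # shrink the remaining nullspace
    return dq
```

With two compatible tasks on a 3-DoF system, the result satisfies both; when tasks conflict, the lower-priority one is only satisfied as far as the remaining nullspace allows.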

    ForceSight: Text-Guided Mobile Manipulation with Visual-Force Goals

    We present ForceSight, a system for text-guided mobile manipulation that predicts visual-force goals using a deep neural network. Given a single RGBD image combined with a text prompt, ForceSight determines a target end-effector pose in the camera frame (kinematic goal) and the associated forces (force goal). Together, these two components form a visual-force goal. Prior work has demonstrated that deep models outputting human-interpretable kinematic goals can enable dexterous manipulation by real robots. Forces are critical to manipulation, yet have typically been relegated to lower-level execution in these systems. When deployed on a mobile manipulator equipped with an eye-in-hand RGBD camera, ForceSight performed tasks such as precision grasps, drawer opening, and object handovers with an 81% success rate in unseen environments with object instances that differed significantly from the training data. In a separate experiment, relying exclusively on visual servoing and ignoring force goals dropped the success rate from 90% to 45%, demonstrating that force goals can significantly enhance performance. The appendix, videos, code, and trained models are available at https://force-sight.github.io/
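    The core idea of a visual-force goal — pairing a kinematic target with a force target and requiring both to be met — can be sketched as follows. The structure, field names, and tolerances here are illustrative assumptions, not ForceSight's actual interface:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class VisualForceGoal:
    position: np.ndarray   # target end-effector position in the camera frame (m)
    grip_force: float      # associated target grip force (N)

def goal_reached(goal, ee_position, measured_force,
                 pos_tol=0.01, force_tol=2.0):
    """Advance to the next goal only when BOTH the kinematic and the
    force component are satisfied; ignoring the force term reduces
    this to plain visual servoing."""
    return (np.linalg.norm(goal.position - ee_position) < pos_tol
            and abs(goal.grip_force - measured_force) < force_tol)
```

Gating execution on the force component is what distinguishes this from the purely visual baseline the paper compares against.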

    Visual Control with Adaptive Dynamical Compensation for 3D Target Tracking by Mobile Manipulators

    In this paper, an image-based dynamic visual feedback control for mobile manipulators is presented to solve the target tracking problem in the 3D workspace. The whole controller is designed as two cascaded subsystems: a minimum-norm visual kinematic controller that fulfils the 3D target tracking objective, and an adaptive controller that compensates for the dynamics of the mobile manipulator. Both the kinematic and the adaptive controller are designed to prevent command saturation. Robot commands are defined in terms of reference velocities. Stability and robustness are proved using Lyapunov's method. Finally, experimental results are presented to confirm the effectiveness of the proposed visual feedback controller.
    Authors: Víctor Andaluz, Ricardo Oscar Carelli Albarracin, Lucio Rafael Salinas, Juan Marcos Toibero, Flavio Roberti (Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan; CONICET, Argentina).

    Robust control for a wheeled mobile robot to track a predefined trajectory in the presence of unknown wheel slips

    In this paper, a robust controller for a nonholonomic wheeled mobile robot (WMR) is proposed for tracking a predefined trajectory in the presence of unknown wheel slips, bounded external disturbances, and model uncertainties. The control system consists of two closed loops: the outer loop controls the kinematics, the inner loop controls the dynamics, and the output of the kinematic controller serves as the input to the inner (dynamic) loop. Two robust techniques are used to ensure robustness: one in the kinematic controller, to compensate for the harmful effects of the unknown wheel slips, and one in the dynamic controller, to overcome the model uncertainties and bounded external disturbances. With the proposed controller, the tracking errors converge asymptotically to zero, a property guaranteed by Lyapunov theory and LaSalle's invariance principle. Computer simulation results confirm the validity and efficiency of the proposed controller.
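    The outer (kinematic) loop of such a two-loop scheme typically outputs reference velocities from the pose tracking error. As an illustration only — the paper's actual slip-compensating law is not given in the abstract — here is the classic Kanayama-style tracking law for a unicycle-type WMR, with illustrative gains:

```python
import numpy as np

def kinematic_control(e, v_ref, w_ref, kx=1.0, ky=4.0, kth=2.0):
    """Outer-loop kinematic tracking law for a unicycle-type WMR.

    e = (ex, ey, eth) is the tracking error expressed in the robot
    frame; the returned (v, w) are the reference linear and angular
    velocities fed to the inner dynamic loop.
    """
    ex, ey, eth = e
    v = v_ref * np.cos(eth) + kx * ex
    w = w_ref + v_ref * (ky * ey + kth * np.sin(eth))
    return v, w
```

With zero tracking error the law simply passes the reference velocities through, which is the expected steady-state behavior.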

    Challenges and Solutions for Autonomous Robotic Mobile Manipulation for Outdoor Sample Collection

    In refinery, petrochemical, and chemical plants, process technicians collect uncontaminated samples to be analyzed in the quality-control laboratory around the clock and in all weather. This traditionally manual operation not only exposes the process technicians to hazardous chemicals, but also imposes an economic burden on management. Recent developments in mobile manipulation provide an opportunity to fully automate sample collection. This paper reviews the challenges of sample collection, in terms of navigation of the mobile platform and manipulation of the robotic arm, from four aspects: mobile robot positioning/attitude using the global navigation satellite system (GNSS); vision-based navigation and visual servoing; robotic manipulation; and mobile robot path planning and control. The paper further proposes solutions to these challenges and points out the main directions of development in mobile manipulation.

    Visual servoing of a five-bar linkage mechanism

    This document is the written product of the graduation project Visual Servoing of a Five-Bar Linkage Mechanism. The project ventures into a control method with visual feedback known as visual servoing. The document summarizes the theory used to carry out the project and reviews how others have approached the method. It presents the project's aims, the importance of its realization, a detailed description of how it was carried out, including experiments and obstacles, and the results obtained. It also explains how this work is of use and what can be built on it, and lists the books, articles, and works consulted, which in turn provide a large quantity of further references and information. Includes bibliographic references.

    Vision-based Global Path Planning and Trajectory Generation for Robotic Applications in Hazardous Environments

    The aim of this study is to find an efficient global path planning algorithm and trajectory generation method using a vision system. Path planning is part of the more general navigation function of mobile robots, which consists of establishing an obstacle-free path from the initial pose to the target pose in the robot workspace.
    In this thesis, special emphasis is placed on robotic applications in industrial and scientific infrastructure environments that are hazardous and inaccessible to humans, such as nuclear power plants, ITER, and the CERN LHC tunnel. Nuclear radiation can cause deadly damage to the human body, yet we depend on nuclear energy to meet our great demand for energy resources. The research and development of automatic transfer robots and manipulators for nuclear environments is therefore regarded as a key technology by many countries. Robotic applications in radiation environments minimize the danger of radiation exposure to humans; however, the robots themselves are also vulnerable to radiation. Mobility and maneuverability in such environments are essential to task success, so an efficient obstacle-free path and trajectory generation method is necessary for finding a safe path with maximum bounded velocities. High-degree-of-freedom manipulators and maneuverable mobile robots with steerable wheels, such as non-holonomic omni-directional mobile robots, are suitable for inspection and maintenance tasks where the camera is the only source of visual feedback.
    In this thesis, a novel vision-based path planning method is presented that combines the artificial potential field, visual servoing concepts, and CAD-based recognition to address the problem of path and trajectory planning. Unlike most conventional trajectory planning methods, which treat the robot as a single point, the entire shape of the mobile robot is considered by taking all of the robot's desired points into account to avoid obstacles. The vision-based algorithm generates synchronized trajectories for all of the wheels of an omni-directional mobile robot. It provides the robot's kinematic variables to plan maximum allowable velocities so that at least one of the actuators is always working at maximum velocity. The advantage of the generated synchronized trajectories is that slippage and misalignment in translational and rotational motion are avoided. The proposed method is further developed into a new vision-based path coordination method for multiple mobile robots with independently steerable wheels, avoiding mutual collisions as well as stationary obstacles. The results of this research have been published and propose a new solution for path and trajectory generation in hazardous environments inaccessible to humans, where one camera is the only source of visual feedback.
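    The artificial potential field idea at the core of the method can be sketched in 2D for a point robot: an attractive force pulls toward the goal, a repulsive force pushes away from obstacles within an influence radius, and the path follows the negative gradient. Gains, radii, and step sizes below are illustrative, and the full-shape, multi-wheel treatment of the thesis is not reproduced:

```python
import numpy as np

def potential_force(q, goal, obstacles, k_att=1.0, k_rep=1.0, rho0=1.0):
    """Net force at position q: attraction to the goal plus repulsion
    from each obstacle closer than the influence radius rho0."""
    f = k_att * (goal - q)
    for obs in obstacles:
        d = np.linalg.norm(q - obs)
        if 1e-9 < d < rho0:
            f += k_rep * (1.0 / d - 1.0 / rho0) * (1.0 / d**2) * (q - obs) / d
    return f

def plan(start, goal, obstacles, step=0.05, iters=500, tol=0.1):
    """Follow the field with fixed-length steps until near the goal."""
    q = np.asarray(start, dtype=float)
    goal = np.asarray(goal, dtype=float)
    path = [q.copy()]
    for _ in range(iters):
        f = potential_force(q, goal, obstacles)
        q = q + step * f / (np.linalg.norm(f) + 1e-9)  # unit-step descent
        path.append(q.copy())
        if np.linalg.norm(goal - q) < tol:
            break
    return np.array(path)
```

The well-known weakness of plain potential fields — local minima — is one reason the thesis combines them with visual servoing and CAD-based recognition rather than using them alone.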

    High-precision grasping and placing for mobile robots

    This work presents a manipulation system for handling multiple labware items in life-science laboratories using H20 mobile robots. The H20 robot is equipped with a Kinect V2 sensor to identify and estimate the position of the required labware on the workbench. Local feature recognition based on the SURF algorithm is used, and the recognition process is performed both for the labware to be grasped and for the workbench holder. Different grippers and labware containers are designed to manipulate labware of different weights and to realize safe transportation.
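    After SURF descriptors are extracted, recognition reduces to matching query descriptors against a stored model, typically with a nearest-neighbor search and Lowe's ratio test. The matching step can be sketched with plain vectors standing in for SURF descriptors (the descriptor extraction itself, e.g. via OpenCV's nonfree module, is not shown):

```python
import numpy as np

def match_descriptors(desc_query, desc_train, ratio=0.75):
    """Ratio-test matching: accept a match only if the nearest train
    descriptor is clearly closer than the second nearest.
    desc_train must contain at least two descriptors."""
    matches = []
    for i, d in enumerate(desc_query):
        dists = np.linalg.norm(desc_train - d, axis=1)
        j, k = np.argsort(dists)[:2]
        if dists[j] < ratio * dists[k]:
            matches.append((i, int(j)))   # (query index, train index)
    return matches
```

The ratio test discards ambiguous matches, which matters here because labware items and holders can look locally similar.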