
    Visual servoing for path reaching with nonholonomic robots

    We present two visual servoing controllers (pose-based and image-based) enabling mobile robots with a fixed pinhole camera to reach and follow a continuous path drawn on the ground. The first contribution is a theoretical and experimental comparison between pose-based and image-based techniques for a nonholonomic robot task. Second, our controllers are appropriate not only for path following but also for path reaching, a problem that has rarely been tackled in the past. Third, in contrast with most works, which require the geometric model of the path, only two path features are necessary in our image-based scheme and three in the pose-based scheme. For both controllers, a convergence analysis is carried out, and the performance is validated by simulations and outdoor experiments on a car-like robot.
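    For context, the image-based scheme above builds on the classical IBVS law, in which the velocity command is computed from the error between current and desired image features through the pseudo-inverse of the interaction matrix. The sketch below is a generic illustration of that law, not the paper's controller; the two features, the 2x2 interaction matrix, and the gain are assumed values.

```python
import numpy as np

# Classical IBVS law (generic illustration, not the paper's controller):
# drive the image-feature error e = s - s_star to zero through the
# pseudo-inverse of the interaction (image Jacobian) matrix L.
def ibvs_velocity(s, s_star, L, gain=0.5):
    """Velocity command v = -gain * pinv(L) @ (s - s_star)."""
    e = np.asarray(s, dtype=float) - np.asarray(s_star, dtype=float)
    return -gain * np.linalg.pinv(L) @ e

# Hypothetical example with two path features (e.g. lateral offset and
# tangent angle of the path in the image) mapped to the unicycle inputs
# (linear velocity, angular velocity) through an assumed 2x2 matrix.
L = np.array([[1.0, 0.2],
              [0.0, 1.0]])
v_cmd, w_cmd = ibvs_velocity(s=[0.15, 0.05], s_star=[0.0, 0.0], L=L)
print(f"v = {v_cmd:.3f} m/s, omega = {w_cmd:.3f} rad/s")
```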

    From Optimal Synthesis to Optimal Visual Servoing for Autonomous Vehicles

    This thesis focuses on the characterization of optimal (shortest) paths to a desired position for a robot with unicycle kinematics and an on-board camera with limited Field-Of-View (FOV), which must keep a given feature in sight. In particular, I provide a complete optimal synthesis for the problem, i.e., a language of optimal control words and a global partition of the motion plane induced by shortest paths, such that a word in the optimal language is uniquely associated with a region and completely describes the shortest path from any starting point in that region to the goal point. I also provide a generalization to arbitrary FOVs, including cases in which the direction of motion is not an axis of symmetry of the FOV, or is not even contained in the FOV. Based on the shortest-path synthesis, feedback control laws are then defined for any point of the motion plane, exploiting geometric properties of the synthesis itself. Moreover, a proof of stability of the controlled system is given in a slightly generalized setting, namely stability on a manifold. Finally, simulation results are reported to demonstrate the effectiveness of the proposed technique.
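    For reference, the kinematic model underlying this kind of FOV-constrained synthesis is the standard unicycle observing a fixed landmark. The formulation below is the textbook version with a symmetric FOV cone of half-angle phi centered on the direction of motion (the thesis also covers non-symmetric cases); the notation is illustrative rather than the thesis' own.

```latex
% Unicycle kinematics (state: position (x, y), heading theta;
% inputs: forward speed v, turning rate omega).
\begin{align}
  \dot{x} &= v\cos\theta, &
  \dot{y} &= v\sin\theta, &
  \dot{\theta} &= \omega .
\end{align}
% FOV constraint for a landmark placed at the origin: the bearing to the
% landmark, measured from the camera axis (here aligned with the heading),
% must stay within the half-angle phi of the field of view.
\begin{equation}
  \bigl|\operatorname{atan2}(-y,\,-x) - \theta\bigr| \le \phi .
\end{equation}
```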

    Kinematically-Decoupled Impedance Control for Fast Object Visual Servoing and Grasping on Quadruped Manipulators

    We propose a control pipeline for SAG (Searching, Approaching, and Grasping) of objects, based on a decoupled arm kinematic chain and impedance control, which integrates image-based visual servoing (IBVS). The kinematic decoupling allows for fast end-effector motions and recovery, which leads to robust visual servoing. The whole approach and pipeline can be generalized to any mobile platform (wheeled or tracked vehicles), but is most suitable for dynamically moving quadruped manipulators thanks to their reactivity against disturbances. The compliance of the impedance controller makes the robot safer for interactions with humans and the environment. We demonstrate the performance and robustness of the proposed approach with various experiments on our 140 kg HyQReal quadruped robot equipped with a 7-DoF manipulator arm. The experiments consider dynamic locomotion, tracking under external disturbances, and fast motions of the target object. Comment: Accepted as a contributed paper at the 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023).
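    As a rough illustration of how an impedance controller can host a visual-servoing reference, the sketch below implements a generic Cartesian spring-damper whose set-point is advanced by the IBVS velocity command. The gains, control period, and frames are assumed values; this is not the HyQReal pipeline.

```python
import numpy as np

# Generic Cartesian impedance law (illustrative, not the paper's pipeline):
# the end-effector behaves as a spring-damper around a reference that is
# advanced by the velocity produced by the IBVS loop.
def impedance_wrench(x, x_ref, xd, xd_ref, K, D):
    """Virtual spring-damper force F = K (x_ref - x) + D (xd_ref - xd)."""
    return K @ (x_ref - x) + D @ (xd_ref - xd)

K = np.diag([400.0, 400.0, 400.0])   # assumed stiffness [N/m]
D = np.diag([40.0, 40.0, 40.0])      # assumed damping   [N*s/m]
dt = 0.002                           # assumed control period [s]

x = np.zeros(3)                      # measured end-effector position
xd = np.zeros(3)                     # measured end-effector velocity
x_ref = np.zeros(3)                  # impedance set-point
v_ibvs = np.array([0.05, 0.0, 0.0])  # velocity command from the IBVS loop

# The IBVS output only moves the set-point, so visual tracking remains
# compliant: contact forces deflect the virtual spring instead of
# fighting a stiff position loop.
x_ref = x_ref + v_ibvs * dt
F = impedance_wrench(x, x_ref, xd, v_ibvs, K, D)
print(F)
```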

    Visual servoing based mobile robot navigation able to deal with complete target loss

    This paper combines reactive collision avoidance methods with image-based visual servoing control for mobile robot navigation in an indoor environment. The proposed strategy allows the mobile robot to reach a desired position, described by a natural visual target, among unknown obstacles. While the robot avoids the obstacles, the camera may lose its target, causing visual servoing to fail. We propose a strategy to deal with the loss of visual features by taking advantage of odometric data. Obstacles are detected by the laser range finder and their boundaries are modeled using B-spline curves. We validate our strategy in real experiments of indoor mobile robot navigation in the presence of obstacles.
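    One simple way to bridge a complete target loss with odometry, in the spirit of the strategy above (illustrative sketch only; the frames, poses, and planar SE(2) assumption are mine, not the paper's): memorize the target in the odometry frame at the last detection and servo on the odometry-predicted target until vision recovers.

```python
import numpy as np

# Illustrative sketch (not the paper's exact scheme): when obstacle
# avoidance pushes the target out of view, keep the target expressed in
# the odometry frame and predict it in the current robot frame from
# odometry alone until the camera re-detects it.
def se2(x, y, theta):
    """Homogeneous transform of a planar pose."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0., 0., 1.]])

# At the last detection: robot pose from odometry and target position
# measured in the robot frame (assumed values, homogeneous coordinates).
T_odom_robot_at_loss = se2(1.0, 0.5, 0.3)
p_target_robot = np.array([2.0, -0.4, 1.0])
p_target_odom = T_odom_robot_at_loss @ p_target_robot

def predicted_target(T_odom_robot_now):
    """Target in the current robot frame, computed from odometry only."""
    return np.linalg.inv(T_odom_robot_now) @ p_target_odom

# While no detection is available, feed predicted_target(...) to the
# controller in place of the missing visual features.
print(predicted_target(se2(1.4, 0.6, 0.35)))
```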