
    Collaborative localization for autonomous robots in structured environments

    No full text
    A complete approach to the visual simultaneous localization and mapping (SLAM) problem is presented in this work. The approach exploits the enhanced capabilities of a system in which a human and a robot collaborate on surveying/exploratory tasks. The human wears a smart headwear device equipped with an inertial measurement unit and a camera, Hv. This camera acts as a secondary sensor and provides data to the robot camera, Rv, which performs the mapping tasks. The data from the human-worn camera are used to produce real-time depth estimates of landmarks whenever its field of view overlaps with that of Rv. These measurements are fully integrated into the EKF-SLAM methodology. Experiments with real captured data validate the proposed approach.
    Peer Reviewed. Postprint (published version).
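The abstract describes fusing depth measurements of landmarks, obtained from the secondary human-worn camera Hv, into an EKF-SLAM filter. Since the full text is unavailable, the sketch below only illustrates the generic EKF step involved: a scalar range (depth) measurement of one landmark is folded into the state estimate via a standard Kalman update. The function name, state layout, and all parameters are illustrative assumptions, not the paper's notation.

```python
import numpy as np

# Hedged sketch: EKF update fusing one scalar depth measurement of a landmark,
# as a secondary camera might provide when its field of view overlaps the
# mapping camera's. State layout and names are assumptions for illustration.

def ekf_depth_update(x, P, z_depth, cam_pos, R_depth, lm_idx):
    """Fuse a depth measurement of landmark `lm_idx`.

    x        : state vector; landmark lm_idx occupies 3 slots starting at 3*lm_idx
    P        : state covariance matrix
    z_depth  : measured distance from the secondary camera to the landmark
    cam_pos  : 3-D position of the secondary (human-worn) camera
    R_depth  : scalar measurement-noise variance
    """
    i = 3 * lm_idx
    lm = x[i:i + 3]
    diff = lm - cam_pos
    pred = np.linalg.norm(diff)          # predicted depth h(x)
    H = np.zeros((1, x.size))
    H[0, i:i + 3] = diff / pred          # Jacobian of h w.r.t. the landmark
    S = H @ P @ H.T + R_depth            # innovation covariance (1x1)
    K = P @ H.T / S                      # Kalman gain
    x_new = x + (K * (z_depth - pred)).ravel()
    P_new = (np.eye(x.size) - K @ H) @ P
    return x_new, P_new

# Toy usage: one landmark, camera at the origin, a slightly larger measured depth.
x = np.array([1.0, 2.0, 2.0])            # landmark estimate, predicted depth = 3.0
P = np.eye(3) * 0.5
cam = np.zeros(3)
x_new, P_new = ekf_depth_update(x, P, z_depth=3.2, cam_pos=cam,
                                R_depth=0.01, lm_idx=0)
```

The update pulls the landmark estimate outward along the camera-to-landmark direction toward the measured depth, and shrinks the covariance along that direction, which is the standard effect of a range observation in EKF-SLAM.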